
Report of the E-CAM workshop “Improving the accuracy of ab-initio methods for materials”

Title: Improving the accuracy of ab-initio predictions for materials
Location: CECAM-FR-MOSER
Webpage with list of participants, schedule and slides of presentations: http://www.cecam.org/workshop-0-1643.html
Dates: September 17, 2018 to September 20, 2018
Organizers: Dario Alfè, Michele Casula, David Ceperley, Carlo Pierleoni

State of the art
Improving the accuracy of ab-initio methods for materials means to devise a global strategy which integrates several approaches to provide a robust, controlled and reasonably fast methodology to predict properties of materials from first principle. Kohn-Sham DFT is the present workhorse in the field but its phenomenological character, induced by the approximations in the exchange-correlation functional, limit its transferability and reliability.
A change of paradigm is required to bring ab-initio methods to a predictive level. The accuracy of the exchange-correlation (XC) functional in DFT should be assessed against more fundamental theories and not, as is often done today, against experiments, because the comparison with experiments is often indirect and can be misleading. The emerging, more fundamental method for materials is Quantum Monte Carlo (QMC), because of: 1) its favourable scaling with system size compared with other quantum chemistry methods; 2) its variational character, which defines an accuracy scale and allows the results to be improved systematically. However, since QMC is much more demanding in terms of computer resources, and more intricate, than DFT, a combined approach is still desirable in which QMC is used to benchmark DFT approximations for specific systems before the production study is carried out with DFT.
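The variational character mentioned above can be made concrete with a toy variational Monte Carlo (VMC) calculation. The sketch below is purely illustrative and not taken from the workshop: it assumes a one-dimensional harmonic oscillator (exact ground-state energy 0.5) and a Gaussian trial wavefunction exp(-alpha x^2), so every Monte Carlo energy estimate lies above the exact value, and improving the trial wavefunction (tuning alpha towards 0.5) systematically lowers it.

```python
# Toy VMC illustration of the variational principle (illustrative values only).
# H = -1/2 d^2/dx^2 + 1/2 x^2, exact ground-state energy 0.5;
# trial wavefunction psi_alpha(x) = exp(-alpha * x^2).
import numpy as np

rng = np.random.default_rng(0)

def vmc_energy(alpha, n_samples=200_000):
    """Monte Carlo estimate of <psi|H|psi>/<psi|psi> and its statistical error."""
    # |psi_alpha|^2 is a Gaussian with standard deviation 1/(2*sqrt(alpha)),
    # so it can be sampled directly (no Metropolis walk needed for this toy case).
    x = rng.normal(0.0, 1.0 / (2.0 * np.sqrt(alpha)), size=n_samples)
    # Local energy E_L(x) = -(1/2) psi''/psi + (1/2) x^2 for this trial function.
    e_loc = alpha + x**2 * (0.5 - 2.0 * alpha**2)
    return e_loc.mean(), e_loc.std(ddof=1) / np.sqrt(n_samples)

for alpha in (0.3, 0.5, 0.8):
    energy, error = vmc_energy(alpha)
    print(f"alpha={alpha:.1f}  E={energy:.4f} +/- {error:.4f}  (exact: 0.5000)")
```

The same logic carries over to real QMC: the statistical error shrinks with the square root of the number of samples, and the remaining variational bias is controlled by the quality of the trial wavefunction, which is what defines the accuracy scale referred to above.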
A different aspect of accuracy is related to size effects: the relevant phenomena often occur at length and time scales beyond those accessible to first-principles methods. In these cases effective force-field methods can be employed, and machine learning can be used to extract those force fields from training sets provided by ab-initio calculations. At present, DFT-based training sets are used; improving their accuracy will improve the ultimate accuracy at all scales.
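As an illustration of this workflow, the sketch below fits a surrogate potential to a handful of "ab-initio" reference energies. Everything in it is a hypothetical stand-in: the reference function is a Lennard-Jones dimer curve in place of actual DFT or QMC data, and the single descriptor is just the interatomic distance, whereas real machine-learned force fields use many-body descriptors and are trained on energies and forces for thousands of configurations.

```python
# Minimal sketch of fitting a machine-learned potential to ab-initio reference data.
# The reference_energy function is a synthetic stand-in for an expensive DFT or QMC
# calculation; descriptors and parameter values are illustrative only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def reference_energy(r):
    """Stand-in for an ab-initio dimer energy (Lennard-Jones, reduced units)."""
    return 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)

# Training set: a handful of "ab-initio" points, as one could afford with QMC.
r_train = np.linspace(0.95, 2.5, 12).reshape(-1, 1)
e_train = reference_energy(r_train).ravel()

# Gaussian-process regression interpolates the data and reports its own
# uncertainty, which indicates where additional reference points are needed.
kernel = RBF(length_scale=0.3) + WhiteKernel(noise_level=1e-6)
model = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(r_train, e_train)

r_test = np.linspace(1.0, 2.4, 8).reshape(-1, 1)
e_pred, e_std = model.predict(r_test, return_std=True)
for r, e, s, e_ref in zip(r_test.ravel(), e_pred, e_std, reference_energy(r_test).ravel()):
    print(f"r={r:.2f}  E_ML={e:+.3f} +/- {s:.3f}  E_ref={e_ref:+.3f}")
```

Whatever the regression scheme, the accuracy of the resulting force field can never exceed that of the reference data, which is why improving the accuracy of the ab-initio training set improves the ultimate accuracy at all scales.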
This change of paradigm requires building a community of people with different areas of expertise working in an integrated fashion. This has been the main aim of the workshop.

Major outcomes
The following is a partial list of the topics discussed at the workshop and of their importance for developing the field of computational materials science from first principles.
1) The importance of computational benchmarks to assess the accuracy of different methods and to feed machine-learning and neural-network schemes with reliable data;
2) The need for a common database, and for a common language across different codes and different computational approaches;
3) The promising capability of neural-network methods for developing new correlated wave functions;
4) The cross-fertilizing combination of computational schemes in a multi-scale environment: from elemental interactions described at a very high level by expensive approaches, to the generation of effective potentials that retain the accuracy of the high-level methods at a much lower cost;
5) Recent progress in quantum Monte Carlo to further improve the accuracy of the calculations by taking alternative routes: transcorrelated Hamiltonians, multideterminant expansions, and Pfaffian wave functions.
Limitations:
The lack of a common environment in which to develop multi-scale approaches for the prediction of material properties. This workshop was one of the first occasions on which such needs were discussed and possible solutions explored.
Open questions:
How to make the codes ready for the next generation of high-performance computing (HPC)? A fundamental limitation to the future expansion of HPC is the need to reduce the energy cost per unit of computation, which requires new technologies. Some of these technologies are based on accelerators, such as GPUs, which in many cases require a complete rewriting of legacy scientific codes. This is a serious problem for the scientific community, requiring open and frank discussion, including a rethink of how the development of computer code is recognised as a major scientific endeavour.
How to develop a meaningful materials science database (which gathers both experimental and theoretical results)?
How to develop a common platform to merge different methods in a multi-scale spirit?

Community needs
The computational materials science community has grown tremendously in size over the past few decades. The drive for this expansion has been the development of ever more user-friendly computer codes, mainly based on density functional theory (DFT). Indeed, Web of Science now reports tens of thousands of papers per year based on DFT. By contrast, the quantum Monte Carlo (QMC) method, normally much more accurate than DFT, appears in only hundreds of papers per year, because of its much higher cost and because of the intricacies of the method that make it more difficult to use. The number of workshops on QMC, and of schools in which QMC is taught, is also only a fraction of those on DFT. We believe that the community is now at a turning point where the extra accuracy offered by QMC is not only desirable but very much needed if serious progress is to be achieved in the computational design of new materials with bespoke properties. Only a few QMC codes (not even a handful) are currently supported by a serious development effort, and the community desperately needs more formal recognition for code development in order to attract the best people to this endeavour. We focus in particular on QMC because we believe it is the natural method for exploiting to the full the forecast expansion in computer power over the next 10-20 years; this is also a crucial point in time for that expansion, as new architectures require a complete rethinking of computational approaches. A series of CECAM workshops may help to draw attention to these points.

Funding
Typical funding channels for the activities discussed at the meeting could be the Psi-k community and national funding schemes. In addition, since the ultimate goal of these activities is to design new materials entirely from first principles, it should be possible to target and persuade specific industries involved in the synthesis of new materials, including for example energy materials such as new batteries and new hydrogen-storage materials. Industry funding could be sought by offering industry staff a limited number of places at the workshops and charging a registration fee.

Will these developments bring societal benefits?
The potential benefits of developing and handling computational tools able to predict material properties with a high level of reliability are numerous and of tremendous societal impact. During our workshop, a clear-cut example was given by Xenophon Krokidis, who talked about the software developed at Scienomics, used to design and test new compounds in silico. Such tools allow a company to accelerate the R&D stage of its projects, cut the resources spent on checking the functionalities of a given material, and significantly shorten the "trial and error" time. According to Krokidis's experience, the budget and time reductions yielded by the use of reliable materials-science software in R&D are estimated to be about one order of magnitude, a remarkable saving.
Thus, the effort of bringing together the three communities (ab initio quantum Monte Carlo, density functional theory, and machine learning) is definitely worthwhile in view of improving the accuracy of ab initio predictions for materials.

SCIENTIFIC REPORT ON THE PSI-K WORKSHOP: “ATOMIC SCALE MATERIALS MICROSCOPY: THEORY MEETS EXPERIMENT”

Psi-k workshop on
“Atomic scale materials microscopy: theory meets experiment”
National Railway Museum, York (UK)
26-28 June 2017

Summary:

Atomic scale materials characterization is now one of the major drivers of technological innovation in areas such as nanoelectronics, catalysis, medicine, clean energy generation and energy storage. This can in large part be attributed to advances in electron and scanning probe microscopies, which are now able to provide atomically resolved structural, chemical and electronic characterization of a wide range of functional materials. However, the types of systems relevant to applications, which include surfaces, interfaces, nanocrystals and two-dimensional materials, are complex, and interpreting experimental images and spectra is often extremely challenging. On the other hand, parallel advances in theoretical approaches mean that theory can often offer invaluable guidance. These approaches include first principles methods for structure prediction, simulation of scanning probe and electron microscopy images, and prediction of various spectroscopic signatures (e.g. EELS and STS). Some of the most impressive examples of this kind of research in recent years have combined complementary theoretical and experimental approaches in a synergistic way to unravel the complex structure of materials. This type of integrated approach is increasingly being recognised as critical to advanced materials research and development by both industry and research funders.

It was in this context that the Psi-k workshop “Atomic scale materials microscopy: theory meets experiment” was held between the 26th and 28th of June 2017 at the National Railway Museum in York (UK). The scientific focus was on the application and development of first principles methods that, in synergy with advanced microscopy techniques (e.g. TEM, EELS, STM, AFM), can help to unravel the structure and properties of materials at the atomic scale. Open to both experts and newcomers, the workshop aimed to provide a rounded overview of emerging methods and challenges in the field, and an opportunity for in-depth discussion and exchange of ideas.

Richard Martin’s ‘Electronic Structure’


The study of the electronic structure of materials is at a momentous stage, with the emergence of new computational methods and theoretical approaches. Many properties of materials can now be determined directly from the fundamental equations for the electrons, providing insights into critical problems in physics, chemistry, and materials science. This book provides a unified exposition of the basic theory and methods of electronic structure, together with instructive examples of practical computational methods and real-world applications. Appropriate for both graduate students and practising scientists, this book describes the approach most widely used today, density functional theory, with emphasis upon understanding the ideas, practical methods and limitations. Many references are provided to original papers, pertinent reviews, and widely available books. Included in each chapter is a short list of the most relevant references and a set of exercises that reveal salient points and challenge the reader.

Highly recommended!