Category Archives: Reports

Report on the Psi-k/CECAM Research Conference “Ab initio Spin-Orbitronics”

September 25, 2017 – September 29, 2017
Hotel Promenade, Montesilvano, Pescara (Italy)

Silvia Picozzi (Consiglio Nazionale delle Ricerche CNR-SPIN, Italy)
Stefan Blügel (Forschungszentrum Jülich, Germany)
Ingrid Mertig (Martin Luther University Halle, Germany)

The main purpose of this Psi-k/CECAM research conference (with about 110 participants) was to highlight very recent theoretical and computational developments related to the interplay of spin-orbit interaction with electronic structure, magnetism and transport, as well as its link to strongly correlated materials and ultrafast currents in diverse materials. We focused on discussing spin-orbit coupling (SOC) as a means of engendering fundamentally novel physical phenomena in exotic systems. The conference therefore spanned several research dimensions, ranging from materials (in the form of bulk compounds, surfaces and interfaces, thin films and heterostructures) to functionalities (associated with topology, spin-momentum locking, valley degrees of freedom, skyrmions, coupling to electric currents via Berry phases, etc.) to method developments (dynamical processes in out-of-equilibrium quantum matter, Berry-phase physics, etc.). A brainstorming session on concepts and ideas in a little-understood phenomenon, orbital magnetization, was carried out under the guidance of Prof. Ivo Souza. While the main focus was on ab initio simulations, a few leading experimental scientists were invited (Prof. Stuart Parkin, Prof. Claudia Felser, etc.), and a strong interface to many-body physics treated on the basis of realistic model Hamiltonians was included.

Read the full report here.

Report on CECAM Workshop: “Emerging Technologies in Scientific Data Visualisation”.

Title: Emerging Technologies in Scientific Data Visualisation
Location: CECAM-IT-SISSA-SNS Node, in Scuola Normale Superiore (Pisa, Italy)
Webpage with list of participants, schedule and abstracts of presentations:

Dates: April 4, 2018 to April 6, 2018

Stefano de Gironcoli (International School for Advanced Studies (SISSA) and CNR-DEMOCRITOS IOM, Trieste, Italy)

Emine Kucukbenli (SISSA, Trieste, Italy)

Giordano Mancini (Scuola Normale Superiore, Pisa, Italy)

Monica Sanna (Scuola Normale Superiore, Pisa, Italy)

State of the art:

Visualisation allows us to tap into the high-bandwidth cognitive hierarchies of our brains and to process high densities of information at once. In the field of atomistic and molecular simulations, it is a key element of research: we use ball-and-stick figures to represent simulation scenarios, and graphs to recognize or communicate parametric relationships of equations. The “Big Data” trend has given rise to several projects with vast data output, and many data-driven approaches are being introduced. For instance, a new EU Centre of Excellence, “NOMAD”, has been established to collect, store and regularize data to build a materials encyclopedia.

Visual analytics is also making its way into materials simulations beyond traditional uses. Two notable examples are i) a successful crystal structure prediction study using a data-clustering method supported by visual analytics,[1] and ii) a time-aggregated 2D heat-map method that reduces the time needed to explore the inner tunnels of proteins. Nevertheless, visual analytics beyond XY plots or ball-and-stick representations is still an emerging field, and several aspects are yet to be identified and discussed among the different communities.
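To make the time-aggregation idea concrete, the following is a minimal, self-contained sketch (our own illustration, not the published protein-tunnel method): positions from all time frames of a trajectory are binned into a single 2D occupancy grid, which can then be rendered as a heat map, here crudely as ASCII for a quick preview.

```python
from collections import Counter

def aggregate_heatmap(trajectory, nx=8, ny=4):
    """Bin (x, y) samples from all time frames into one 2D occupancy grid."""
    xs = [p[0] for p in trajectory]
    ys = [p[1] for p in trajectory]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    counts = Counter()
    for x, y in trajectory:
        i = min(int((x - x0) / (x1 - x0) * nx), nx - 1)
        j = min(int((y - y0) / (y1 - y0) * ny), ny - 1)
        counts[(i, j)] += 1
    return counts

def render_ascii(counts, nx=8, ny=4):
    """Map occupancy onto a coarse character ramp for a terminal preview."""
    ramp = " .:*#"
    peak = max(counts.values())
    rows = []
    for j in range(ny - 1, -1, -1):  # draw the top row first
        rows.append("".join(ramp[counts[(i, j)] * (len(ramp) - 1) // peak]
                            for i in range(nx)))
    return "\n".join(rows)
```

In a real analysis the same aggregation would feed a proper colour-mapped image; the point is that the time dimension is collapsed before visualisation, so one static picture summarises the whole trajectory.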

Some of the open questions of the state of the art that the workshop addressed are:

-Data Producers: What are the emerging visualization needs for Big Data, and how do they differ from scaled-up versions of existing tools?

-Data Analysts: How to enhance current analysis tools or create new ones with visualization? What visual analytics techniques, representations and mapping methods can we borrow from other fields now that the molecular simulations can produce a variety of data other than molecular representations?

-Technologists: How can we better use the developing technologies such as Virtual Reality, haptic feedback mechanisms, graphical artificial neural networks, and computer vision to reveal patterns and relationships that were previously not exposed to visualization at all?

[1] Stuart Card, J.D.

Major outcomes:

Prior to the beginning of the workshop, we organizers doubted whether we could hold the interest of contributors and attendees from so many different research areas (about half of the participants came from communities outside the CECAM “core business” of atomistic and molecular simulations). As the workshop began, we realised that the same doubts were also held by many invited speakers.

However, from the first talk onwards these doubts vanished and remained a non-issue for all three days. All contributions were followed with keen interest, and many questions were raised by the audience after every talk. This is a point for further consideration, as it suggests that an event drawing from both inside and outside our community can be an opportunity for fruitful cross-pollination of ideas. These positive aspects could, of course, be outweighed by constraints on the scope and depth of the talks, but feedback collected during the workshop did not highlight this issue, and many attendees stated that the talks were a source of (unexpected) inspiration for their research. Another general point worth mentioning concerns the basic visualization techniques discussed particularly on the first day: according to their feedback, these constituted learning material for many participants, even experienced ones. Perhaps a different event (such as a school, or part of one) blending visualization and machine learning may be worthy of consideration among future CECAM events.

The workshop was an opportunity to discuss the following topics, which participants perceived as very important:

  1. Data visualization for complex data sets: the importance of selecting the most appropriate metaphor for conveying information to the reader from different sources and with different means, and of correctly selecting elements such as glyphs and colour maps. The adaptation of these choices to different contexts (e.g. research, dissemination) was also stressed. Direct volume rendering and switching between representations as the number of represented objects increases or decreases were cited as “must have” features for modern visualization applications.
  2. Creation of intelligent workflow systems, and development and testing of scientific software. The large computational infrastructures available in Europe make it possible to address high-performance-computing (HPC) based investigations in materials science and soft matter. We strongly need shared resources across Europe (and beyond), and more importantly environments providing a flexible and customisable integration of such resources. Sharing and validation of the large amounts of data generated have yet to be exploited to the fullest extent. The same holds for scientific software development, testing and versioning. Projects such as NOMAD constitute a model for such efforts.
  3. Immersive virtual reality (IVR) for science. The potential benefits and disadvantages of IVR in scientific applications were debated. On one hand, some participants showed IVR environments at different degrees of maturity, demonstrating that IVR makes it possible to integrate very large and complex data sets (especially in molecular medicine and genomics) and even to collaborate within these environments. On the other hand, some attendees argued that IVR may be too dispersive and distracting for users, and that effort should instead be focused on scalable 2D applications. Alternatives to IVR (such as multiple display walls) were also presented.
  4. Machine learning and molecular dynamics. The accuracy and flexibility of MD force fields (FF) trained using neural networks were shown and discussed as one approach that will have a deep impact on molecular simulations. The central role of unsupervised learning methods, such as clustering algorithms and dimensionality-reduction methods, was also demonstrated to be crucial both for large-scale simulation studies and for developing effective visualization methods.
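As a minimal illustration of the unsupervised-learning step mentioned in point 4 (our own sketch, not code from any presented work), plain k-means clustering can group simulation snapshots or descriptor vectors into a handful of representative states, which is often the first step toward an effective visualization:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: assign each point to its nearest centroid,
    then recompute centroids; repeat for a fixed number of iterations."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k),
                      key=lambda c: sum((a - b) ** 2
                                        for a, b in zip(p, centroids[c])))
            clusters[idx].append(p)
        # keep the old centroid if a cluster happens to be empty
        centroids = [tuple(sum(dim) / len(cl) for dim in zip(*cl)) if cl
                     else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters
```

In practice one would run this on low-dimensional projections of the trajectory (e.g. after dimensionality reduction) and colour the visualization by cluster label.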

Community needs

One need highlighted for this type of event was to increase the time allotted for open and informal discussion relative to the presentation of talks. The demonstrations of IVR were appreciated, but there was perhaps an imbalance between the focus given to virtual reality in the demonstrations and in the talks. Other “hands-on” sessions should have been considered in the schedule.

From a general point of view, inviting experts to show how to tap into new, emerging technologies proved fruitful, and it should be done again in the future, even if it is not straightforward to organize such events without veering too far from the spirit of CECAM workshops. The workshop certainly highlighted a big gap between what specialists, whether from the visualization world or from Big Data and analytics, consider a solid standard that should be in widespread use, and what many (young) participants normally use in their studies. This gap may perhaps be overcome by making more powerful middleware available (see point 2 above), but also by organizing more mixed-type events such as this one.


Typical channels to organize these future conferences would be CECAM and Psi-k. However, for future events a greater commitment from private firms working in data analytics and/or visualization should be sought, to maintain the multidisciplinarity of the workshops. We obtained one sponsorship from a hardware vendor, but our approaches to other companies failed, not for lack of interest (e.g. one speaker came from the Unicredit bank research team) but for lack of time: big companies (or public organizations) not already aware of the type of research carried out by the CECAM community may need many bureaucratic steps before agreeing to a sponsorship, even when the economic commitment is not very significant to them (as was precisely the case for Unicredit).

Will these developments bring societal benefits?

Computational approaches are becoming central in every scientific discipline; the open research policies set in H2020 [1] push researchers to adopt data-intensive approaches to remain competitive. Increasing the awareness of the CECAM community, through this type of event, of the new possibilities created by the Big Data trend therefore has the potential to increase the quality and quantity of research from the community and, as a consequence, to bring societal benefits in all the areas where atomistic and molecular simulation may have an impact, such as nanomedicine or new materials. It may also constitute a chance for scientists with a different background to get in touch with CECAM and the type of research its associates do. The development and deployment of powerful and engaging visualization and virtual-reality technology has the potential to provide a wider audience with highly effective ICT tools for educational purposes and science outreach in the molecular sciences, in agreement with the H2020 guidelines on open education.[2]



Report: Workshop on Electronic Structure Theory with Numeric Atom-Centered Basis Functions 2018, July 9-11, Munich

Participants of the workshop attending one of the seminars.

This workshop, held July 9 to 11, 2018, focused on methods that leverage localized, numeric atom-centered orbital (NAO) basis functions, a choice upon which a number of the strongest available electronic structure developments are founded. The workshop brought together key players from the FHI-aims code and related European and international efforts to highlight, discuss, and advance the state of the art of NAO-based modeling of molecules and materials based on the first principles of quantum mechanics. The workshop spanned three days and featured 23 invited talks, covering:

  • development of community-based, shared infrastructure projects for electronic structure theory (Garcia, Larsen, Pouillon),
  • benchmarking efforts to assess and improve the accuracy of approximations used in electronic structure theory (Al-Hamdani, Goedecker, Liu),
  • applications of density functional perturbation theory (Laasner, Raimbault, Shang),
  • automation of workflow via machine learning and “big data” efforts (Ghiringhelli, Hoja),
  • scalability towards large systems and exascale computational resources (Huhn, Scheurer, Yu),
  • numerical algorithms and new methods for NAO-based electronic structure theory (Hermann, Ringe, Rossi), and
  • extensions beyond standard Kohn-Sham DFT (Golze, Havu, Michelitsch, Oberhofer, Ren)

Continue reading Report: Workshop on Electronic Structure Theory with Numeric Atom-Centered Basis Functions 2018, July 9-11, Munich

Report: CECAM/Psi-k Workshop Bremen on Reliable and quantitative prediction of defect properties in Ga-based semiconductors

CECAM Report

Program GASC

Organizers: Thomas Frauenheim (Bremen), Peter Déak (Bremen), Klaus Irmscher (Berlin), Susanne Siebentritt (Luxembourg),  Joel B. Varley (Livermore, California)

Venue: University of Bremen, Bremen Center for Computational Materials Science (BCCMS), Germany, 8th until 12th of October 2018

Sponsors: University of Bremen (BCCMS), Psi-k, DFG

Defect engineering in micro/optoelectronics and in photovoltaics has profited immensely from electronic structure calculations. In the past two decades, local and semi-local approximations of density functional theory were the workhorses of theoretical studies, but by now it has become clear that they do not allow a sufficiently accurate and reliable prediction of defect properties in wide-band-gap materials. While ab initio methods for calculating the total energy are struggling with the system sizes necessary for defect modeling, semi-empirical methods using various corrections, or hybrid functionals, are being applied for the purpose. While the theoretical background of these methods and their relation to each other is by now more or less understood, the transferability of the semi-empirical parameters and the overall predictive power are still unclear. Progress requires further systematic testing and comparison of the various methods, as well as validation against experiments. For that, accurate measurement data on defects are needed for a set of materials which are structurally or compositionally related. Gallium-based semiconductors, like GaN, Ga2O3 and CuInxGa1-xSe(S)2 chalcogenides (CIGS), offer a good opportunity for testing theory and are also interesting experimentally due to their versatile applications.

While there are still open defect-related questions in the much-studied blue-LED material GaN, very few defects have been positively identified as yet in CIGS solar cell materials, while research on the potential power semiconductor and UV-transparent electrode Ga2O3 has barely started. Following the successful workshops on Gallium Oxide and Related Materials, held in Kyoto (Japan) in 2015 and in Parma (Italy) in 2017, as well as several workshops on chalcogenide photovoltaic materials, e.g., Symposium V at the E-MRS Spring meeting 2016, the CECAM workshop in Bremen 2018 focused on bringing together experimentalists interested in gallium oxide and Ga-based chalcogenides with theorists who are active in the field. A friendly and stimulating environment facilitated discussions, adding impetus both to the development of practically applicable theoretical methods and to progress in the defect engineering of these materials.

Workshop Report:”Modern Approaches to Coupling Scales in Materials Simulation”

Our workshop was held from the 2nd to the 4th of July at the Hotel Jäger von Fall, which sits on a peninsula on the Sylvensteinsee, an artificial lake in the Bavarian foothills of the Alps near the Austrian border. This remote location was chosen deliberately to allow participants to concentrate fully on the scientific program. Transportation from and to the nearest train station was provided by a shuttle bus courtesy of the Technical University of Munich (TUM), driven by some of the participating TUM members. For the most part the weather was very inviting, inspiring a small number of participants to take a refreshing swim during the breaks; it turned out that the water was very clear but also very, very cold.

The two evenings of the workshop were filled with a poster session on Monday, held outdoors due to the lovely weather, and a conference dinner, again outdoors, on Tuesday.

The event was supported by Psi-k, the German science foundation (DFG), the international graduate school of science and engineering (IGSSE), as well as the Technical University of Munich.

Workshop Report: “Theoretical methods in molecular spintronics” (TMSpin)

The workshop “Theoretical methods in molecular spintronics” (TMSpin) was held at the Materials Physics Center of the University of the Basque Country in Donostia-San Sebastian from the 17th to the 20th of September 2018. The workshop welcomed 31 invited speakers and several postgraduate students presenting posters. The event was co-sponsored by Psi-k and the Donostia International Physics Centre (DIPC).

Molecular spintronics is the study of spin-related phenomena in molecules and atoms and their possible applications for the next generation of data storage and processing devices as well as for the implementation of quantum computers. Electronic structure theory has played a prominent role in molecular spintronics. The comparison of theory and experiments has demonstrated the importance of first-principles calculations, which go beyond model representations of molecular devices as simple “quantum dots” or effective spin Hamiltonians. Nonetheless, standard electronic structure methods based on Density Functional Theory often fail in describing molecular spintronic systems even at a qualitative level. This is because most magnetic phenomena are manifestations of correlation effects, which become extreme at the single molecule scale and which are not captured by standard implementations and approximations of DFT. The goal of TMSpin was to address the question:

“What electronic structure theory to use for molecular spintronics?”

The workshop gathered theoretical physicists and quantum chemists with different areas of expertise. On the one hand, there were researchers who have provided important contributions to the advancement of molecular spintronics since its inception. They were asked to give an overview of the field and, moreover, to highlight the open questions that cannot yet be addressed by theory. On the other hand, the workshop gathered some of the leading researchers in theory and code development, who presented the most recent fundamental and numerical advances for a number of methods. The organizers promoted an intense discussion to understand whether such methods can already be employed in molecular spintronics. Continue reading Workshop Report: “Theoretical methods in molecular spintronics” (TMSpin)

Report of the E-CAM workshop “Improving the accuracy of ab-initio methods for materials”

Title: Improving the accuracy of ab-initio predictions for materials
Webpage with list of participants, schedule and slides of presentations:
Dates: September 17, 2018 to September 20, 2018
Organizers: Dario Alfè, Michele Casula, David Ceperley, Carlo Pierleoni

State of the art
Improving the accuracy of ab initio methods for materials means devising a global strategy that integrates several approaches to provide a robust, controlled and reasonably fast methodology for predicting the properties of materials from first principles. Kohn-Sham DFT is the present workhorse in the field, but its phenomenological character, induced by the approximations in the exchange-correlation functional, limits its transferability and reliability.
A change of paradigm is required to bring ab initio methods to a predictive level. The accuracy of the XC functional in DFT should be assessed against more fundamental theories and not, as is often done today, against experiments, because the comparison with experiments is often indirect and could be misleading. The emerging, more fundamental method for materials is Quantum Monte Carlo (QMC), because of 1) its favourable scaling with system size with respect to other quantum chemistry methods, and 2) its variational character, which defines an accuracy scale and allows one to progressively improve the results. However, since QMC is much more demanding in terms of computer resources, and more intricate, than DFT, a combined approach is still desirable, in which QMC is used to benchmark DFT approximations for specific systems before performing the production study with DFT.
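The variational character mentioned above can be illustrated with a toy variational Monte Carlo calculation (our own textbook-style sketch, far from a production QMC code): for the 1D harmonic oscillator H = -1/2 d²/dx² + x²/2 and trial wavefunction ψ(x) = exp(-αx²), the local energy is E_L(x) = α + x²(1/2 - 2α²), and the Metropolis average of E_L is guaranteed to lie at or above the exact ground-state energy 1/2, with equality at α = 1/2.

```python
import math
import random

def vmc_energy(alpha, nsamples=20000, seed=1):
    """Variational Monte Carlo for H = -1/2 d^2/dx^2 + x^2/2 with trial
    psi(x) = exp(-alpha x^2).  Metropolis sampling of |psi|^2; the local
    energy E_L(x) = alpha + x^2 (1/2 - 2 alpha^2) is averaged."""
    rng = random.Random(seed)
    x, total = 0.0, 0.0
    for _ in range(nsamples):
        xt = x + rng.uniform(-1.0, 1.0)  # trial move
        # acceptance ratio |psi(xt)|^2 / |psi(x)|^2
        if rng.random() < math.exp(-2.0 * alpha * (xt * xt - x * x)):
            x = xt
        total += alpha + x * x * (0.5 - 2.0 * alpha * alpha)
    return total / nsamples
```

Scanning α and taking the minimum of the estimated energy recovers the exact result here; in real QMC the same principle gives a built-in accuracy scale against which DFT functionals can be benchmarked.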
A different aspect of accuracy is related to size effects: often the relevant phenomena occur at length and time scales beyond those approachable by first-principles methods. In these cases, effective force-field methods can be employed. Machine learning methods can be used to extract those force fields from training sets provided by ab initio calculations. Presently, DFT-based training sets are used; improving their accuracy will improve the ultimate accuracy at all scales.
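In its simplest form, extracting a force field from an ab initio training set is a fitting problem. The sketch below (an illustrative assumption on our part, not a method discussed at the workshop) fits a two-parameter Lennard-Jones-like pair potential E(r) = c12/r¹² - c6/r⁶ to energy samples by linear least squares, solving the 2x2 normal equations in closed form:

```python
def fit_pair_potential(rs, energies):
    """Least-squares fit of E(r) = c12/r^12 - c6/r^6 to (r, E) samples,
    e.g. dimer energies from ab initio calculations.  The model is linear
    in (c12, c6), so the 2x2 normal equations have a closed-form solution."""
    f1 = [r ** -12 for r in rs]        # basis function for c12
    f2 = [-(r ** -6) for r in rs]      # basis function for c6
    a11 = sum(x * x for x in f1)
    a12 = sum(x * y for x, y in zip(f1, f2))
    a22 = sum(y * y for y in f2)
    b1 = sum(x * e for x, e in zip(f1, energies))
    b2 = sum(y * e for y, e in zip(f2, energies))
    det = a11 * a22 - a12 * a12
    c12 = (b1 * a22 - b2 * a12) / det
    c6 = (a11 * b2 - a12 * b1) / det
    return c12, c6
```

Modern neural-network potentials replace this fixed functional form with flexible, many-body models, but the principle is the same: the accuracy of the resulting force field is bounded by the accuracy of the ab initio training data.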
This change of paradigm requires building a community of people with different areas of expertise working in an integrated fashion. This has been the main aim of the workshop.

Continue reading Report of the E-CAM workshop “Improving the accuracy of ab-initio methods for materials”

Report: MSSC2018 – Ab initio Modelling in Solid State Chemistry

The Department of Chemistry and the Thomas Young Centre at Imperial College London and the Computational Materials Science Group of the Science and Technology Facilities Council (STFC), in collaboration with the Theoretical Chemistry Group of the University of Torino, organised the 2018 MSSC Summer School on the “ab initio modelling of crystalline and defective solids with the CRYSTAL code”.

CRYSTAL is a general-purpose program for the study of periodic solids. It uses a local basis set comprised of Gaussian-type functions and can be used to perform calculations at the Hartree-Fock and density functional levels of theory, including global and range-separated hybrid functionals (e.g. B3LYP, HSE06) as well as double hybrids. Analytical first derivatives with respect to the nuclear coordinates and cell parameters, and analytical derivatives, up to fourth order, with respect to an applied electric field (CPHF/CPKS), are available.

The school provided an overview of the underlying theory and fundamental issues affecting use of the code, with particular emphasis on practical issues in obtaining reliable data efficiently using modern computer hardware. The capabilities of CRYSTAL were illustrated with hands-on tutorials organized in the afternoon sessions.

All information about the school can be found on this website:

Read the full workshop report here: MSSC2018_Psi-k_report


Report: MSSC2017 – Ab initio Modelling in Solid State Chemistry

The Department of Chemistry and the Thomas Young Centre at Imperial College London and the Computational Materials Science Group of the Science and Technology Facilities Council (STFC), in collaboration with the Theoretical Chemistry Group of the University of Torino, organised the 2017 MSSC Summer School on the “ab initio modelling of crystalline and defective solids with the CRYSTAL code”.

The school provided an overview of the underlying theory and fundamental issues affecting use of the CRYSTAL code, with particular emphasis on practical issues in obtaining reliable data efficiently using modern computer hardware.

The capabilities of CRYSTAL were illustrated with hands-on tutorials organized in the afternoon sessions.

All information about the school can be found on this website:

Read the full workshop report here: MSSC2017_Psi-k_report

Modern Approaches to Coupling Scales in Materials Simulation

Hotel Jäger von Fall, Lenggries, Bavaria, Germany
Organizers: Harald Oberhofer, Johannes Margraf

Multi-scale simulation approaches rely on a hierarchy of increasingly accurate and highly resolved methods to capture the different time and length scales relevant to a process of interest. Traditionally, this might involve coupling classical molecular dynamics with electronic structure calculations (QM/MM), or embedding a quantum mechanical system in a point-charge or continuum environment. In this context, the models comprising the individual layers of the multi-scale hierarchy are often unrelated. For instance, the empirical potential and DFT method in a QM/MM simulation are independently defined at the beginning of the simulation. Enormous advances in electronic structure algorithms and hardware now allow first-principles calculations to be carried out on a truly massive scale. This leads to a novel perspective on multi-scale models: electronic structure data can be generated in high enough quality and quantity to allow the application of coarse-graining and machine learning techniques. Instead of defining separate physical models at different scales, the electronic structure method directly informs the next layer of the multi-scale hierarchy. The goal of this workshop was to bridge the gap between traditional, layered multi-scale techniques and the more direct coarse-graining and machine learning approaches to the simulation of extended systems, thereby bringing together researchers working on QM/MM or other embedding techniques with those who apply coarse graining and interpolation to electronic structure data in different contexts (e.g. potential energy surfaces, electronic properties, charge transport, rate constants in catalysis) and with different methods (neural networks, Gaussian process regression, kernel ridge regression, splining, etc.).

Read the full report here.