Deep learning techniques, a game-changer for quantum chemistry
Quantum chemistry is a field that dives into the behaviour of atoms and molecules at their most fundamental level, using the principles of quantum mechanics to understand how electrons interact within these systems. For researchers, one of the biggest challenges is studying systems where electrons are strongly correlated—meaning their movements are highly interdependent, like dancers in a tightly choreographed routine. These systems, found in materials like high-temperature superconductors or large molecules like fullerenes, are notoriously difficult to model because traditional methods require immense computational power that grows exponentially with the number of electrons. A recent study¹ offers a groundbreaking approach to this problem, blending quantum chemistry with techniques borrowed from deep learning to make these calculations faster and more accessible.

The challenge of strongly correlated systems
Imagine trying to predict the behaviour of a crowd where every person’s movement influences everyone else’s. In quantum chemistry, strongly correlated electron systems are like that crowd. Traditional methods, such as those based on wave functions, describe the full quantum state of every electron, but they become impractical for large systems with many electrons. These calculations demand so much computing power that even supercomputers struggle with systems beyond a few dozen electrons. Another approach, called density functional theory, simplifies the problem by focusing on electron density rather than individual electron paths, but it often fails to capture the complex interactions in strongly correlated systems accurately.
This is where natural orbital functional (NOF) theory comes in. NOF theory is a middle ground, offering a way to study electron interactions with reasonable accuracy without the overwhelming computational cost of wave-function methods. It uses mathematical objects called natural orbitals and their associated occupation numbers—think of these as a way to describe how electrons are distributed across different energy states in a molecule or material. While NOF theory is promising, it has a catch: optimizing these orbitals and numbers to find the lowest energy state of a system is a slow process, especially for large systems. This bottleneck has limited its use to smaller molecules or materials until now.
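For readers comfortable with a little formalism, the natural orbitals and occupation numbers mentioned above come from a standard textbook construction (not something specific to this paper): they are the eigenfunctions and eigenvalues of the one-particle reduced density matrix,

```latex
% Spectral decomposition of the one-particle reduced density matrix
% (spin-orbital convention, so each occupation lies between 0 and 1
% and the occupations sum to the number of electrons N):
\gamma(\mathbf{x},\mathbf{x}') \;=\; \sum_i n_i\,\phi_i(\mathbf{x})\,\phi_i^{*}(\mathbf{x}'),
\qquad 0 \le n_i \le 1, \qquad \sum_i n_i = N .
```

A natural orbital functional then writes the electronic energy directly in terms of these two ingredients, $E[\{n_i\},\{\phi_i\}]$, so finding the ground state means minimizing that energy over both the orbitals and their occupations.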
Borrowing from deep learning
The researchers found inspiration in an unexpected place: deep learning. In the world of artificial intelligence, deep learning models, like those powering image recognition or language processing, rely on optimization techniques to fine-tune millions of parameters efficiently. One such technique, called adaptive moment estimation (or Adam), helps neural networks learn by adjusting how they update their parameters based on past gradient calculations, balancing speed and stability. The team realized that the process of optimizing natural orbitals in NOF calculations shares similarities with training a neural network: both involve iteratively refining a complex system to minimize an error or energy function.
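To make the idea concrete, here is a minimal, self-contained sketch of the Adam update rule applied to a toy two-parameter "energy". This is not the authors' code, and the quadratic function being minimized is invented purely for illustration; the point is how Adam keeps running averages of past gradients so that each step is informed by earlier ones.

```python
import numpy as np

def adam_minimize(grad, x0, steps=2000, lr=0.05,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    """Minimize a function with the Adam update rule.

    grad: function returning the gradient at x.
    m averages past gradients (momentum); v averages their
    squares (per-parameter step-size scaling).
    """
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)   # bias correction for early steps
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Toy "energy": a quadratic bowl with its minimum at (1, -2).
grad_E = lambda x: 2.0 * (x - np.array([1.0, -2.0]))
x_min = adam_minimize(grad_E, x0=[0.0, 0.0])
```

In a real NOF calculation the parameters would describe the orbitals and the "gradient" would come from the energy functional, but the bookkeeping of past steps is the same.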
In their new approach, the researchers adapted the Adam method to optimize the natural orbitals. Instead of recalculating everything from scratch at each step, as older methods did, their technique uses information from previous steps to guide the optimization, much like how a neural network learns from past data. This makes the process faster and more efficient, allowing the team to tackle systems with hundreds or even thousands of electrons—something previously out of reach for NOF calculations. They also alternated between optimizing the orbitals and their occupation numbers separately, which further streamlined the process by focusing computational effort where it was most needed.
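The alternating strategy can be sketched with another toy example, again assuming an invented two-block "energy" rather than the paper's actual orbital/occupation equations: one block of variables is frozen while the other is refined, and the two passes are repeated until the energy settles.

```python
import numpy as np

def alternating_minimize(E, a0, b0, sweeps=20, lr=0.2, inner=25):
    """Alternate gradient-descent passes over two parameter blocks.

    E: scalar energy function E(a, b). Each sweep freezes one
    block while refining the other, a toy analogue of optimizing
    orbitals and occupation numbers in turn.
    """
    a, b = float(a0), float(b0)
    h = 1e-6  # finite-difference step for the toy gradients
    for _ in range(sweeps):
        for _ in range(inner):            # refine block a; b frozen
            g = (E(a + h, b) - E(a - h, b)) / (2 * h)
            a -= lr * g
        for _ in range(inner):            # refine block b; a frozen
            g = (E(a, b + h) - E(a, b - h)) / (2 * h)
            b -= lr * g
    return a, b

# Toy coupled "energy" with its minimum at a = b = 3.
E = lambda a, b: (a - b) ** 2 + (b - 3.0) ** 2
a, b = alternating_minimize(E, a0=0.0, b0=0.0)
```

Because the two blocks are coupled, neither pass alone finds the minimum; it is the repeated alternation that drives both toward the solution, which is why the authors interleave the two optimizations rather than running either one to completion.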
Testing the method on real-world problems
To show the power of their approach, the researchers applied it to three challenging cases. First, they studied a massive cluster of 1,000 hydrogen atoms arranged in a cube, simulating how it transitions from a metallic state, where electrons flow freely, to an insulating state, where electrons are localized. This process, known as a metal-to-insulator transition, is crucial for understanding materials like Mott insulators, which have potential applications in electronics. Their method accurately captured the energy changes and electron localization as the atoms were pulled apart, marking the largest NOF calculation ever performed.
Next, they examined fullerenes, which are soccer-ball-shaped carbon molecules like C₆₀ (commonly known as a buckyball). These molecules are tricky because they can exhibit both static correlation (where electrons are shared across multiple states) and dynamic correlation (where electrons avoid each other due to repulsion). The new method provided insights into how electrons are distributed in these molecules, revealing subtle differences between C₆₀ and the less common C₃₆, which could help chemists design new carbon-based materials.
Finally, the team investigated linear acenes, a series of molecules made of fused carbon rings, to study the energy difference between their singlet and triplet states—a key property for applications in organic electronics. As the number of rings increases, the electron interactions shift from dynamic to static correlation, making accurate predictions challenging. The researchers’ method, particularly a modified version called GNOFm, produced results that closely matched experimental data, offering a more reliable way to predict these energy gaps.
A game-changer for quantum chemistry
By making NOF calculations faster and applicable to larger systems, this new approach opens the door to studying complex materials and molecules that were previously too computationally demanding. For example, understanding strongly correlated systems could lead to breakthroughs in designing superconductors that work at higher temperatures or developing new catalysts for chemical reactions. The use of deep learning-inspired techniques also highlights the growing synergy between artificial intelligence and physical sciences, showing how tools from one field can solve problems in another.
Moreover, the method’s efficiency—scaling to systems with thousands of electrons—means researchers can now perform all-electron calculations without relying on approximations that limit accuracy. This could lead to more precise simulations of real-world materials, from battery components to drug molecules. The researchers also suggest that their approach could be extended beyond electrons to systems involving other particles, like bosons, which could have implications for fields like quantum computing.
Looking ahead
The study is not without challenges. The researchers note that their modified GNOF functional still needs refinement to fully align with experimental results in some cases, and further work is needed to explore other deep learning optimization techniques. However, the potential is clear. By combining the physical insights of NOF theory with the computational efficiency of deep learning, this work paves the way for tackling some of the most complex problems in quantum chemistry. It’s a reminder that innovation often comes from unexpected connections—here, between the abstract world of quantum mechanics and the practical tools of machine learning. As these techniques evolve, they could transform our ability to understand and design the materials that shape our world.
Author: César Tomé López is a science writer and the editor of Mapping Ignorance
Disclaimer: Parts of this article may have been copied verbatim or almost verbatim from the referenced research paper/s.
References
- Juan Felipe Huan Lew-Yee, Jorge M. del Campo, and Mario Piris (2025) Advancing Natural Orbital Functional Calculations through Deep Learning-Inspired Techniques for Large-Scale Strongly Correlated Electron Systems. Phys. Rev. Lett. 134, 206401. doi: 10.1103/PhysRevLett.134.206401