AI Technology: Deep Learning
Understanding and simulating strongly correlated quantum systems is a central challenge in modern physics, with applications ranging from high-temperature superconductivity to quantum computing.
While Neural Quantum States (NQS), which encode the many-body wavefunction in a neural network such as a transformer, have revolutionized the field thanks to their high accuracy, their computational cost remains high because a separate model must be trained for every point in the phase diagram.
This project aims to research, train and release Foundational Quantum States (FQS): general-purpose transformer wavefunctions capable of representing quantum states across the entire phase diagram of complex quantum many-body problems. By leveraging Vision Transformers and Graph Attention Networks, we will build models that learn global representations of quantum systems, significantly reducing the computational cost of phase-diagram exploration.
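To make the core idea concrete, the short JAX sketch below (an illustrative toy, not the project's code) trains a single log-amplitude network that receives the transverse field h as an extra input, so one set of parameters covers several points of a small transverse-field Ising chain at once. A tiny multilayer perceptron and exact enumeration of the Hilbert space stand in for the transformer architectures and Monte Carlo sampling that a realistic NQS calculation would use; every name in the snippet is hypothetical.

# Toy "foundational" wavefunction: one model log_psi(sigma; h) trained jointly
# over several transverse fields h, instead of one model per field value.
import itertools
import jax
import jax.numpy as jnp
import numpy as np

N = 6          # chain length (2**N = 64 basis states, small enough to enumerate)
J = 1.0        # Ising coupling

# Dense TFIM Hamiltonian  H(h) = -J sum_i s^z_i s^z_{i+1} - h sum_i s^x_i
sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.diag([1.0, -1.0])
I2 = np.eye(2)

def kron_chain(op, site):
    mats = [op if i == site else I2 for i in range(N)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def tfim_hamiltonian(h):
    H = np.zeros((2**N, 2**N))
    for i in range(N):
        H -= J * kron_chain(sz, i) @ kron_chain(sz, (i + 1) % N)
        H -= h * kron_chain(sx, i)
    return jnp.array(H)

# All spin configurations as +/-1 vectors, in the same basis order as the kron products.
configs = jnp.array([[1 - 2 * b for b in bits]
                     for bits in itertools.product([0, 1], repeat=N)], dtype=jnp.float32)

# A tiny conditioned ansatz: log psi(sigma; h) from an MLP acting on [sigma, h].
def init_params(key, width=32):
    k1, k2 = jax.random.split(key)
    return {"W1": 0.1 * jax.random.normal(k1, (N + 1, width)),
            "b1": jnp.zeros(width),
            "W2": 0.1 * jax.random.normal(k2, (width,))}

def log_psi(params, sigma, h):
    x = jnp.concatenate([sigma, jnp.array([h])])   # the coupling is an input feature
    hidden = jnp.tanh(x @ params["W1"] + params["b1"])
    return hidden @ params["W2"]                   # real log-amplitude (sign-free toy)

def energy(params, h, H):
    logs = jax.vmap(lambda s: log_psi(params, s, h))(configs)
    psi = jnp.exp(logs - logs.max())               # unnormalised amplitudes, stabilised
    return (psi @ (H @ psi)) / (psi @ psi)         # variational energy <H>

# Train ONE model jointly over several field values; the conventional NQS workflow
# would rerun the whole optimisation from scratch for each h.
fields = [0.5, 1.0, 1.5]
hams = {h: tfim_hamiltonian(h) for h in fields}

def loss(params):
    return sum(energy(params, h, hams[h]) for h in fields) / len(fields)

params = init_params(jax.random.PRNGKey(0))
grad_fn = jax.jit(jax.value_and_grad(loss))
for step in range(500):
    _, grads = grad_fn(params)
    params = jax.tree_util.tree_map(lambda p, g: p - 0.05 * g, params, grads)

for h in fields:
    print(f"h={h:.1f}  variational E = {float(energy(params, h, hams[h])):.4f}")

The design choice illustrated here is the one the project proposes at scale: instead of re-optimising a fresh wavefunction for each Hamiltonian, the couplings are fed to the model as inputs, so exploring the phase diagram reduces to evaluating a single trained model.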
The project will release three families of FQS models, each targeting a different class of correlated quantum systems, providing an open-source toolset for the broader physics community. This paradigm shift from instance-specific training to generalizable foundational models paves the way for breakthroughs in quantum materials science, electronic structure calculations, and quantum technology design.
Filippo Vicentini, École polytechnique, France