Orb is Orbital Materials' AI-based universal interatomic potential, designed for simulating advanced materials at scale. It achieves state-of-the-art speed and accuracy relative to other AI-based interatomic potentials. Orb models can be used directly for accurate energy estimation and geometry/cell optimization of crystalline materials, and they are fast enough to drive molecular dynamics or Monte Carlo simulations directly.
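In practice, Orb plugs into standard simulation tooling. The sketch below shows energy evaluation and geometry optimization via ASE; the orb_models import paths and model constructor are assumptions based on our public repository and may differ between releases:

```python
# Illustrative sketch: the orb_models import paths and model name below
# are assumptions and may differ between releases - check the GitHub repo.
from ase.build import bulk
from ase.optimize import FIRE

from orb_models.forcefield import pretrained
from orb_models.forcefield.calculator import ORBCalculator

# Load a pretrained Orb forcefield and wrap it as an ASE calculator.
orbff = pretrained.orb_v1(device="cpu")  # assumed constructor name
atoms = bulk("Cu", "fcc", a=3.58, cubic=True)
atoms.calc = ORBCalculator(orbff, device="cpu")

# Direct energy estimation, then geometry optimization.
print("Initial energy (eV):", atoms.get_potential_energy())
FIRE(atoms).run(fmax=0.05)
print("Relaxed energy (eV):", atoms.get_potential_energy())
```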
Orb uses an attention-augmented Graph Network-based Simulator (GNS), a type of Message Passing Neural Network (MPNN).
MPNNs operate on graphs and have an iterative message passing phase, in which the latent representation of each node is updated by aggregating messages passed along the edges from its neighboring nodes. In physical terms, early iterations of message passing capture local atomic interactions, which are hierarchically reused and composed in later iterations to model larger chemical structures.
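Schematically, a single round of message passing looks like the following; this is a generic MPNN layer in PyTorch to illustrate the pattern, not Orb's actual implementation (module names like message_mlp are purely illustrative):

```python
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    """A generic MPNN layer: illustrates the pattern, not Orb's architecture."""

    def __init__(self, dim: int):
        super().__init__()
        self.message_mlp = nn.Sequential(
            nn.Linear(3 * dim, dim), nn.SiLU(), nn.Linear(dim, dim)
        )
        self.update_mlp = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.SiLU(), nn.Linear(dim, dim)
        )

    def forward(self, nodes, edges, senders, receivers):
        # Build a message for every edge from its endpoint nodes and edge features.
        msgs = self.message_mlp(
            torch.cat([nodes[senders], nodes[receivers], edges], dim=-1)
        )
        # Aggregate incoming messages at each receiving node (sum aggregation).
        agg = torch.zeros_like(nodes).index_add_(0, receivers, msgs)
        # Update each node's latent state from its aggregated messages.
        return nodes + self.update_mlp(torch.cat([nodes, agg], dim=-1))
```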
The base GNS backbone undergoes two stages of training. First, we train a denoising diffusion model, which learns to remove random noise added to atomic positions. This has close parallels to learning an effective forcefield and is a very effective technique for unsupervised pretraining of foundation models designed to be used as neural network potentials.
Figure 1: An example diffusion process. Our LINUS model is trained to reverse noise added to the atomic positions of molecular and crystal systems. We find that this makes a very effective base model for a range of downstream applications.
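In pseudocode, the denoising objective looks roughly like this; a minimal sketch assuming Gaussian noise on Cartesian coordinates and a model that regresses the added noise (the actual noise schedule and loss weighting we use are not described here):

```python
import torch

def denoising_pretrain_step(model, positions, atomic_numbers, cell, sigma=0.3):
    """One denoising pretraining step: a minimal sketch, not the exact
    LINUS objective (noise scale and weighting here are assumptions)."""
    # Corrupt the atomic positions with Gaussian noise.
    noise = sigma * torch.randn_like(positions)
    noisy_positions = positions + noise

    # The model predicts the added noise; a denoising direction of this
    # kind closely parallels an effective force on each atom.
    predicted_noise = model(atomic_numbers, noisy_positions, cell)

    # Regress the prediction onto the true noise.
    return torch.mean((predicted_noise - noise) ** 2)
```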
Second, we finetune these pretrained backbones on a mix of publicly available datasets commonly used for training forcefield models (MPTrj and Alexandria). All models are finetuned to predict per-atom forces, system energy, and unit cell stress. The test sets used for evaluation are excluded from training.
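A sketch of such a multi-task finetuning objective is below; the loss weights and the use of squared-error terms are illustrative assumptions, not our published training settings:

```python
import torch

def finetune_loss(pred, target, w_energy=1.0, w_forces=1.0, w_stress=0.1):
    """Combined finetuning loss over energy, per-atom forces and cell stress.
    The weights and squared-error form are illustrative assumptions."""
    energy_loss = torch.mean((pred["energy"] - target["energy"]) ** 2)
    force_loss = torch.mean((pred["forces"] - target["forces"]) ** 2)
    stress_loss = torch.mean((pred["stress"] - target["stress"]) ** 2)
    return w_energy * energy_loss + w_forces * force_loss + w_stress * stress_loss
```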
Our pretraining dataset consists of 3D molecular and crystallographic data from a variety of sources. Our main focus is collecting and normalizing the structures produced by computational optimization procedures (trajectories from molecular dynamics, GCMC and DFT). For each structure we store the 3D atomic data (atomic numbers, 3D Cartesian coordinates, unit cell parameters, forces and stresses) as well as dataset-specific metadata.
These datasets are used in two ways: for pretraining our foundation model on the 3D structure alone, and for finetuning the resulting foundation model to predict specific molecular properties, such as band gaps or energies.
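Concretely, each normalized structure can be thought of as a record like the following (the field names are illustrative, not our internal schema):

```python
from __future__ import annotations

from dataclasses import dataclass, field
import numpy as np

@dataclass
class AtomicStructure:
    """One normalized training example; field names are illustrative."""
    atomic_numbers: np.ndarray          # (n_atoms,) integer atomic numbers
    positions: np.ndarray               # (n_atoms, 3) Cartesian coords, Angstrom
    cell: np.ndarray                    # (3, 3) unit cell vectors
    forces: np.ndarray | None = None    # (n_atoms, 3) per-atom forces, eV/Angstrom
    stress: np.ndarray | None = None    # (3, 3) unit cell stress
    metadata: dict = field(default_factory=dict)  # dataset-specific provenance
```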
Figure 2: LINUS training data size. Our base model architecture is more scalable than many equivariant architectures, allowing us to include systems of up to 5,000 atoms in pretraining.
Our Orb models achieve the best performance on the Matbench Discovery benchmark. Additionally, even when constrained to using only the MPTrj dataset for both pretraining and finetuning (a strict requirement for inclusion on Matbench Discovery's main results page), we outperform all other models by a substantial margin, demonstrating the effectiveness of our pretraining regime and model design in addition to the scale of our data collection efforts.
Figure 3: Matbench Discovery F1 Results. Matbench Discovery is a task designed to simulate a high-throughput discovery campaign for new stable inorganic crystals. It requires both structural relaxation and accurate energy estimation.
Figure 4: Speed comparison with MACE, a popular open-source model. Speed comparisons with MatterSim and GNoME are not available because those models are proprietary.
The Orb forcefields are publicly available on GitHub under a permissive community license. If you have an interesting use case for Orb, please get in touch or open an issue on GitHub. We are working on a technical report which will be available soon - watch this space!
Commercial and licensing inquiries - please contact dan@orbitalmaterials.com
Questions? Get in touch at info@orbitalmaterials.com