PET

This repository contains an implementation of the Point Edge Transformer (PET), an interatomic machine-learning potential that achieves state-of-the-art accuracy on several datasets; see [1] for details. PET is a graph neural network in which each message-passing layer is an arbitrarily deep transformer. The repository also contains a proof-of-principle implementation of the Equivariant Coordinate System Ensemble (ECSE).
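The idea behind a coordinate-system ensemble can be illustrated in miniature: averaging a non-invariant model over a group of rotated coordinate frames yields a prediction that is invariant under that group. The sketch below is a toy illustration of this averaging principle only, not the ECSE algorithm from [1]; all names in it (``toy_model``, ``symmetrized``) are hypothetical.

```python
import math

def toy_model(points):
    # A deliberately non-invariant "model": depends on raw x and y coordinates.
    return sum(x * x - 0.5 * y for x, y in points)

def rotate(points, angle):
    # Rotate all 2-D points by the given angle.
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def symmetrized(model, points, n=4):
    # Average the model over an ensemble of n rotated coordinate systems;
    # the result is exactly invariant under rotations in that finite group.
    angles = [2 * math.pi * k / n for k in range(n)]
    return sum(model(rotate(points, a)) for a in angles) / n

pts = [(1.0, 0.2), (-0.3, 0.7)]
raw_a = toy_model(pts)
raw_b = toy_model(rotate(pts, math.pi / 2))   # raw model: changes under rotation
sym_a = symmetrized(toy_model, pts)
sym_b = symmetrized(toy_model, rotate(pts, math.pi / 2))  # symmetrized: unchanged
```

The actual ECSE construction in [1] achieves smooth, exact symmetrization for deep models on point clouds; this toy only shows why averaging over coordinate systems restores invariance.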

Installation

Run ``pip install .`` from the repository root.

After installation, the following command-line scripts are available: ``pet_train``, ``pet_run``, and ``pet_run_sp``.

See the documentation for more details.

Ecosystem

Ecosystem overview

LAMMPS, i-PI, and ASE.MD are molecular simulation engines.

MTM provides a single interface for models such as PET, so that they can be used immediately in multiple simulation engines. This is currently implemented for LAMMPS and ASE.MD, with plans to extend to additional engines in the future.
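A single-interface layer of this kind is essentially the adapter pattern: each engine adapter translates its engine's calling convention into one shared model API. The sketch below is purely illustrative; the class and method names (``PotentialModel``, ``ASEAdapter``, ``LAMMPSAdapter``) are hypothetical and do not reflect the actual MTM code.

```python
class PotentialModel:
    """Hypothetical model exposing one predict() call (stands in for PET)."""

    def predict(self, positions):
        # Dummy pairwise energy; a real model would evaluate a neural network.
        e = 0.0
        for i in range(len(positions)):
            for j in range(i + 1, len(positions)):
                d = sum((a - b) ** 2 for a, b in zip(positions[i], positions[j])) ** 0.5
                e += 1.0 / (d + 1.0)
        return e

class ASEAdapter:
    """Adapts the shared model API to an ASE-style calling convention."""

    def __init__(self, model):
        self.model = model

    def get_potential_energy(self, positions):
        return self.model.predict(positions)

class LAMMPSAdapter:
    """Adapts the same model API to a LAMMPS-style compute call."""

    def __init__(self, model):
        self.model = model

    def compute(self, coords):
        return self.model.predict(coords)
```

Because both adapters delegate to the same model object, any model implementing the shared interface works in every supported engine without modification.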

All MD interfaces are currently under development and are not yet stable; the ``pet_train`` and ``pet_run`` scripts are semi-stable.

"sp" stands for Symmetrization Protocol and refers to ECSE.

MLIP stands for Machine Learning Interatomic Potential. Fitting is supported for energies, forces, or both (using forces is always recommended when they are available). The ``pet_train_general_target`` script is designed for fitting multidimensional targets such as the electronic density of states (eDOS). Optionally, a target may be atomic, meaning it is assigned to each atom rather than to the entire atomic configuration. Derivatives are not supported by this script.
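The value of fitting forces alongside energies comes from the physics: forces are the negative gradient of the energy, so they give extra supervision per structure. A minimal sketch of such a combined loss on a toy one-dimensional potential (all function names here are illustrative, not part of this package):

```python
def energy(x):
    # Toy 1-D potential standing in for a model's energy prediction.
    return (x - 1.0) ** 2

def force(x, h=1e-5):
    # Force is the negative derivative of the energy (central difference here;
    # a real MLIP would use automatic differentiation).
    return -(energy(x + h) - energy(x - h)) / (2 * h)

def loss(x, e_ref, f_ref, w_e=1.0, w_f=1.0):
    # Weighted sum of squared energy and force errors, the usual shape of
    # a combined energy+force fitting objective.
    return w_e * (energy(x) - e_ref) ** 2 + w_f * (force(x) - f_ref) ** 2
```

The weights ``w_e`` and ``w_f`` balance the two terms; when reference forces are available, the force term typically dominates the number of training signals, which is why using them is recommended.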

Documentation

Documentation can be found here.

Tests

Run ``cd tests && pytest .``

References

[1] Sergey Pozdnyakov and Michele Ceriotti. Smooth, exact rotational symmetrization for deep learning on point clouds. In Thirty-seventh Conference on Neural Information Processing Systems (NeurIPS 2023).