AtomDNN: A New Simulation Tool for Atomistic Modeling with Machine Learning Potential
Traditional atomistic modeling methods fall into two broad categories: those based on quantum mechanics (e.g., Density Functional Theory), and those based on empirical interatomic potentials. The former are accurate but computationally demanding, limited to nanometers in length scale. The latter offer much more efficient simulations but are less accurate, constrained by the parameterized functional forms used in the potentials. This trade-off has long been the hurdle for obtaining reliable modeling results for large material systems with atomistic details. Machine learning based potentials have shown great promise in addressing these challenges: rather than relying on a fixed physical functional form, they learn the shape of the energy landscape from the training dataset.
We found three key technical challenges in existing tools that prevent the practical application of machine learning potentials to large material systems:

• the computational efficiency of generating atom descriptors is low;

• most existing tools are not directly integrated with the widely used atomistic modeling package LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator);

• stress calculations for periodic solid-state systems are incorrect in many existing tools.

To overcome these challenges, we developed a new tool, called AtomDNN, which has the following features:

• uses the TensorFlow 2 platform to train a deep neural network (DNN) based potential;

• computes atom descriptors with LAMMPS, taking advantage of the highly efficient parallel algorithms already built into LAMMPS;

• seamless integration with LAMMPS without computational overhead;

• rigorous evaluation of stress.

As an example, AtomDNN has been applied to train a potential for the two-dimensional material MoTe2.
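To illustrate the general idea behind a DNN-based potential of this kind, the following minimal sketch (not AtomDNN's actual code; the network sizes and descriptor length are illustrative assumptions) shows a per-atom network in TensorFlow 2 that maps a fixed-length descriptor vector to an atomic energy, sums the atomic energies into a total energy, and differentiates with respect to the descriptors. In a LAMMPS-coupled workflow, the descriptor derivatives with respect to atomic positions (computed by LAMMPS) would then be contracted with these gradients to obtain forces and the virial stress.

```python
import tensorflow as tf

N_DESCRIPTORS = 30  # assumed descriptor length per atom (illustrative)

# Small fully connected network: descriptor vector -> per-atom energy.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="tanh"),
    tf.keras.layers.Dense(32, activation="tanh"),
    tf.keras.layers.Dense(1),  # scalar energy per atom
])


def total_energy_and_descriptor_grads(descriptors):
    """descriptors: (n_atoms, N_DESCRIPTORS) tensor for one configuration."""
    with tf.GradientTape() as tape:
        tape.watch(descriptors)
        e_atom = model(descriptors)        # (n_atoms, 1) per-atom energies
        e_total = tf.reduce_sum(e_atom)    # scalar total energy
    # dE/dG: contracted with dG/dr (descriptor derivatives from LAMMPS)
    # to obtain atomic forces and, for periodic systems, the stress.
    dE_dG = tape.gradient(e_total, descriptors)
    return e_total, dE_dG


# Example with random descriptors for a 10-atom configuration.
g = tf.random.normal((10, N_DESCRIPTORS))
E, dE_dG = total_energy_and_descriptor_grads(g)
```

Training such a model amounts to fitting the total energy (and, through the chain rule, forces) to reference data, e.g., from DFT calculations.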