Ensemble Knowledge Distillation for Machine Learning Interatomic Potentials
Sakib Matin and 7 other authors
Abstract: The quality of machine learning interatomic potentials (MLIPs) depends strongly on the quantity of training data as well as on the quantum chemistry (QC) level of theory used to generate it. Datasets produced with high-fidelity QC methods are typically restricted to small molecules and may lack energy gradients, which makes it difficult to train accurate MLIPs. We present an ensemble knowledge distillation (EKD) method to improve MLIP accuracy when training on energy-only datasets. First, multiple teacher models are trained to the QC energies and are then used to generate atomic forces for all configurations in the dataset. Next, the student MLIP is trained to both the QC energies and the ensemble-averaged forces generated by the teacher models. We apply this workflow to the ANI-1ccx dataset, in which the configuration energies are computed at the coupled-cluster level of theory. The resulting student MLIPs achieve new state-of-the-art accuracy on the COMP6 benchmark and show improved stability in molecular dynamics simulations.
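The sketch below illustrates the two-stage workflow described in the abstract: an ensemble of teacher models (assumed already trained to QC energies) supplies ensemble-averaged forces, and a student is then fit to both the QC energies and those force targets. All names and values here (ToyMLIP, the loss weights w_E and w_F, the dummy configuration) are hypothetical stand-ins, not the paper's actual models, datasets, or hyperparameters.

```python
import torch

# Toy stand-in for an MLIP: predicted energy as a function of atomic positions.
class ToyMLIP(torch.nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(3, hidden), torch.nn.Tanh(),
            torch.nn.Linear(hidden, 1),
        )

    def forward(self, positions):  # positions: (n_atoms, 3)
        # Total energy as a sum of per-atom contributions (toy model).
        return self.net(positions).sum()


def forces(model, positions):
    """Forces as the negative gradient of the predicted energy w.r.t. positions."""
    positions = positions.clone().requires_grad_(True)
    energy = model(positions)
    (grad,) = torch.autograd.grad(energy, positions, create_graph=True)
    return -grad


# --- Stage 1: ensemble of teachers, assumed already trained to QC energies ---
teachers = [ToyMLIP() for _ in range(4)]

# Dummy configuration with an illustrative reference QC energy.
positions = torch.randn(5, 3)
qc_energy = torch.tensor(-1.0)

# Ensemble-averaged teacher forces become the distillation targets.
target_forces = torch.stack(
    [forces(t, positions).detach() for t in teachers]
).mean(dim=0)

# --- Stage 2: student trained to QC energies and ensemble-averaged forces ---
student = ToyMLIP()
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
w_E, w_F = 1.0, 1.0  # hypothetical loss weights

for step in range(100):
    optimizer.zero_grad()
    pred_energy = student(positions)
    pred_forces = forces(student, positions)
    loss = w_E * (pred_energy - qc_energy) ** 2 \
         + w_F * ((pred_forces - target_forces) ** 2).mean()
    loss.backward()
    optimizer.step()
```

In an actual MLIP training setup the energy model, dataset, and loss weighting would differ; the point of the sketch is only the structure of the EKD loss, which combines a QC energy term with a force term whose targets come from the teacher ensemble average.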
Submission history
From: Sakib Matin
[v1] Tue, 18 Mar 2025 14:32:51 UTC (678 KB)
[v2] Wed, 19 Mar 2025 15:03:39 UTC (678 KB)
[v3] Thu, 12 Jun 2025 23:37:14 UTC (538 KB)