Machine Learning Potentials with the Iterative Boltzmann Inversion: Training to Experiment

Sakib Matin, Alice E.A. Allen, Justin Smith, Nicholas Lubbers, Ryan B. Jadrich, Richard Messerly, Benjamin Nebgen, Ying Wai Li, Sergei Tretiak, Kipton Barros

Research output: Contribution to journal › Article › peer-review

1 Scopus citations

Abstract

Methodologies for training machine learning potentials (MLPs) with quantum-mechanical simulation data have recently seen tremendous progress. Experimental data have a very different character than simulated data, and most MLP training procedures cannot be easily adapted to incorporate both types of data into the training process. We investigate a training procedure based on iterative Boltzmann inversion that produces a pair potential correction to an existing MLP using equilibrium radial distribution function data. By applying these corrections to an MLP for pure aluminum based on density functional theory, we observe that the resulting model largely addresses previous overstructuring in the melt phase. Interestingly, the corrected MLP also exhibits improved performance in predicting experimental diffusion constants, which are not included in the training procedure. The presented method does not require autodifferentiating through a molecular dynamics solver and does not make assumptions about the MLP architecture. Our results suggest a practical framework for incorporating experimental data into machine learning models to improve the accuracy of molecular dynamics simulations.
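The pair-potential correction described in the abstract follows the standard iterative Boltzmann inversion update, in which the potential is adjusted by kB·T·ln(g_sim/g_target) at each iteration until the simulated radial distribution function matches the experimental one. A minimal sketch of one such update step is shown below; this is an illustration of the generic IBI formula, not the authors' implementation, and all function and variable names are assumptions:

```python
import numpy as np

def ibi_correction(g_sim, g_target, temperature_K, eps=1e-8):
    """One iterative Boltzmann inversion (IBI) update step (illustrative).

    Returns the pair-potential correction
        dU(r) = kB * T * ln(g_sim(r) / g_target(r)),
    which is added to the current pair potential. Where the simulated RDF is
    overstructured (g_sim > g_target), the correction is positive (more
    repulsive), pushing the simulated structure toward the target.
    """
    kB = 0.008314462618  # Boltzmann constant in kJ/(mol K)
    # Clip to avoid log(0) in sparsely sampled regions of the RDF.
    g_sim = np.clip(np.asarray(g_sim, dtype=float), eps, None)
    g_target = np.clip(np.asarray(g_target, dtype=float), eps, None)
    return kB * temperature_K * np.log(g_sim / g_target)
```

In the scheme the paper describes, a correction of this form would be accumulated on top of the fixed MLP over successive molecular dynamics iterations, which is why no differentiation through the MD solver is required.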

Original language: English
Pages (from-to): 1274-1281
Number of pages: 8
Journal: Journal of Chemical Theory and Computation
Volume: 20
Issue number: 3
DOIs
State: Published - Feb 13 2024
