`paper/paper.bib`: 20 additions & 22 deletions

@@ -190,17 +190,16 @@ @article{Larsen-17-06
   urldate = {2022-11-08}
 }
 
-@misc{Kovacs-25-01,
-  title = {{{MACE-OFF}}: {{Transferable Short Range Machine Learning Force Fields}} for {{Organic Molecules}}},
-  author = {Kov{\'a}cs, D{\'a}vid P{\'e}ter and Moore, J. Harry and Browning, Nicholas J. and Batatia, Ilyes and Horton, Joshua T. and Pu, Yixuan and Kapil, Venkat and Witt, William C. and Magd{\u a}u, Ioan-Bogdan and Cole, Daniel J. and Cs{\'a}nyi, G{\'a}bor},
-  year = {2025},
-  month = jan,
-  number = {arXiv:2312.15211},
-  eprint = {2312.15211},
-  primaryclass = {physics},
-  doi = {10.48550/arXiv.2312.15211},
-  urldate = {2025-01-30},
-  archiveprefix = {arXiv}
+@article{Kovacs-25-05,
+  title = {{{MACE-OFF}}: {{Short-Range Transferable Machine Learning Force Fields}} for {{Organic Molecules}}},
+  author = {Kov{\'a}cs, D{\'a}vid P{\'e}ter and Moore, J. Harry and Browning, Nicholas J. and Batatia, Ilyes and Horton, Joshua T. and Pu, Yixuan and Kapil, Venkat and Witt, William C. and Magd{\u a}u, Ioan-Bogdan and Cole, Daniel J. and Cs{\'a}nyi, G{\'a}bor},
+  year = 2025,
+  journal = {Journal of the American Chemical Society},
`paper/paper.md`: 3 additions & 3 deletions

@@ -47,7 +47,7 @@ The `graph-pes` package provides a **unified interface and framework** for defin
 
 # Related work
 
-`graph-pes` is beginning to drive projects within our research group, and we hope that it will be useful to many others. In recent work, we have described the use of `graph-pes` for fitting NequIP models to datasets created using `autoplex`[@Liu-25-08], for assessing zero-shot performance of different graph-network MLIPs [@Mahmoud-25-02], and for distilling atomistic foundation models [@Gardner-25-06].
+`graph-pes` is beginning to drive projects within our research group, and we hope that it will be useful to many others. In recent work, we have described the use of `graph-pes` for fitting NequIP models to datasets created using `autoplex`[@Liu-25-08], for assessing zero-shot performance of different graph-network MLIPs [@Mahmoud-25-11], and for distilling atomistic foundation models [@Gardner-25-06].
 
 A number of existing packages offer training and validation pipelines for particular ML-PES architectures, including `schnetpack`[@schutt2019schnetpack; @schutt2023schnetpack], `deepmd-kit`[@Wang-18-07; @Zeng-23-08], `nequip`[@Batzner-22-05], `mace-torch`[@Batatia-22-10], `torchmd-net`[@TorchMDNet], and `fairchem`[@fairchem].
 These frameworks focus on their associated model families and do not share a common interface for training.
@@ -70,7 +70,7 @@ Full API details are available in the project documentation.
 All MLIP models in `graph-pes` are implemented as subclasses of the `GraphPESModel` base class.
 Implementations need only define a forward pass that returns a local energy for each atom or a total energy for the structure;
 the framework handles the calculation of forces and stress tensors in a conservative manner via automatic differentiation.
-We also support models that return direct force and stress tensor predictions (e.g., `TensorNet` or `orb-v3-*` with their optional direct force readout heads).
+We also support models that return direct force and stress tensor predictions (_e.g._, `TensorNet` or `orb-v3-*` with their optional direct force readout heads).
 
 Building on the `GraphPESModel` class, we provide independent re-implementations of popular MLIP architectures, including `PaiNN`[@Schutt-21-06], `EDDP`[@Pickard-22-07], `NequIP`[@Batzner-22-05], `MACE`[@Batatia-22-10], and `TensorNet`[@Simeon-23-06].
 We use building blocks provided by `e3nn`[@Geiger-22-07] to implement models that act on spherical tensor decompositions.
@@ -90,7 +90,7 @@ Because all models conform to the same interface, all training features can be u
 
 ## Easy access to foundation models
 
-A recent area of research is the development of "foundational" MLIPs that can describe the potential-energy surface of a wide range of systems. `graph-pes` integrates directly with the `mace-torch`, `mattersim`, and `orb-models` packages to provide access to, among others, the `MACE-MP`[@Batatia-25-11], `MatterSim`[@Yang-24-05], `orb-v2`[@Neumann-24-10], `MACE-OFF`[@Kovacs-25-01], `Egret-v1`[@Mann-25-05], and `orb-v3`[@Rhodes-25-04] families of models. Each of these integrations generates `GraphPESModels` that are directly compatible with all relevant `graph-pes` features.
+A recent area of research is the development of "foundational" MLIPs that can describe the potential-energy surface of a wide range of systems. `graph-pes` integrates directly with the `mace-torch`, `mattersim`, and `orb-models` packages to provide access to, among others, the `MACE-MP`[@Batatia-25-11], `MatterSim`[@Yang-24-05], `orb-v2`[@Neumann-24-10], `MACE-OFF`[@Kovacs-25-05], `Egret-v1`[@Mann-25-05], and `orb-v3`[@Rhodes-25-04] families of models. Each of these integrations generates `GraphPESModels` that are directly compatible with all relevant `graph-pes` features.
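The context lines above note that a `GraphPESModel` subclass only needs to supply per-atom local energies, with conservative forces derived by automatic differentiation. A minimal sketch of that pattern in plain PyTorch (a toy harmonic energy function; names here are illustrative, not the actual `graph-pes` API):

```python
# Sketch of the "local energies + autodiff forces" idea (assumptions:
# toy potential, illustrative names; not the real graph-pes API).
import torch

def local_energies(positions: torch.Tensor) -> torch.Tensor:
    """Toy per-atom energy: a harmonic well around the origin."""
    return 0.5 * (positions ** 2).sum(dim=-1)  # shape: (n_atoms,)

positions = torch.tensor([[0.0, 0.0, 1.0],
                          [0.0, 2.0, 0.0]], requires_grad=True)

# total energy is the sum of local contributions;
# conservative forces follow as F = -dE/dr via autograd
total_energy = local_energies(positions).sum()
forces = -torch.autograd.grad(total_energy, positions)[0]

print(total_energy.item())  # 2.5
print(forces)               # equals -positions for this toy potential
```

Because the forces are gradients of a single scalar energy, they are conservative by construction, which is the property the framework guarantees for models that do not use direct force readout heads.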