- use Turbo as better initial color map on ptable heatmaps (heatmap now changeable, required sveriodic-table update)
- add CGCNN+P metrics to model-stats.json
- update model-metrics.svelte table
- compile_metrics.py: import df_metrics, df_wbm from matbench_discovery.preds
- remove dates from figure file names
data/wbm/readme.md (+2 -2)
@@ -25,7 +25,7 @@ The full set of processing steps used to curate the WBM test set from the raw data
   <caption>WBM Formation energy distribution. 524 materials outside green dashed lines were discarded.<br />(zoom out on this plot to see discarded samples)</caption>
   <slot name="hist-e-form-per-atom">
-    <img src="./figs/2022-12-07-hist-wbm-e-form-per-atom.svg" alt="WBM formation energy histogram indicating outlier cutoffs">
+    <img src="./figs/wbm-e-form-per-atom.svg" alt="WBM formation energy histogram indicating outlier cutoffs">
   </slot>

 - apply the [`MaterialsProject2020Compatibility`](https://pymatgen.org/pymatgen.entries.compatibility.html#pymatgen.entries.compatibility.MaterialsProject2020Compatibility) energy correction scheme to the formation energies
@@ -99,5 +99,5 @@ The number of stable materials (according to the MP convex hull which is spanned
 > Note: [According to the authors](https://www.nature.com/articles/s41524-020-00481-6#Sec2), the stability rate w.r.t. the more complete hull constructed from the combined train and test set (MP + WBM) for the first 3 rounds of elemental substitution is 18,479 out of 189,981 crystals ($\approx$ 9.7%).

 <slot name="wbm-each-hist">
-  <img src="./figs/2023-01-26-wbm-each-hist.svg" alt="WBM energy above MP convex hull distribution">
+  <img src="./figs/wbm-each-hist.svg" alt="WBM energy above MP convex hull distribution">
models/bowsr/metadata.yml (+2)
@@ -33,4 +33,6 @@ hyperparams:
   n_iter: 100

 notes:
+  description: BOWSR is a Bayesian optimizer with symmetry constraints that uses a graph deep learning energy model to perform "DFT-free" relaxations of crystal structures.
+  long: The authors show that this iterative approach improves the accuracy of ML-predicted formation energies over single-shot predictions.
   training: Uses same version of MEGNet as standalone MEGNet.
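The iterative optimize-against-a-surrogate idea in the BOWSR notes can be shown with a toy sketch. Everything here is a stand-in assumption: a quadratic plays the role of the ML energy model (MEGNet in the real BOWSR) and a random accept-if-better lattice proposal replaces the actual Bayesian acquisition step; only `n_iter=100` mirrors the hyperparam above.

```python
import numpy as np

rng = np.random.default_rng(0)


def surrogate_energy(lattice_param: float) -> float:
    # Stand-in for the ML energy model: toy quadratic with minimum at a = 4.0 Å.
    return (lattice_param - 4.0) ** 2


def relax(a_init: float, n_iter: int = 100, step: float = 0.1) -> tuple[float, float]:
    """Iteratively propose symmetry-preserving lattice changes, keep improvements."""
    a_best, e_best = a_init, surrogate_energy(a_init)
    for _ in range(n_iter):
        a_new = a_best + rng.normal(scale=step)
        e_new = surrogate_energy(a_new)
        if e_new < e_best:  # accept only energy-lowering proposals
            a_best, e_best = a_new, e_new
    return a_best, e_best


a_relaxed, e_relaxed = relax(a_init=5.0)
print(e_relaxed <= surrogate_energy(5.0))  # True: energy never increases
```

The accept-only loop guarantees the final surrogate energy is no higher than the starting one, which is the "relaxation" being emulated.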
models/cgcnn/metadata.yml (+8)
@@ -24,6 +24,10 @@
 hyperparams:
   Ensemble Size: 10

+notes:
+  description: Published in 2017, CGCNN was the first crystal graph convolutional neural network to directly learn 8 different DFT-computed material properties from a graph representing the atoms and bonds in a crystal.
+  long: It showed that, just like in other areas of ML, given large training sets, embeddings that outperform human-engineered features can be learned directly from the data.
+
 - model_name: CGCNN+P
   model_version: 0.1.0 # the aviary version
   matbench_discovery_version: 1.0

@@ -54,3 +58,7 @@
 hyperparams:
   Ensemble Size: 10
   Perturbations: 5
+
+notes:
+  description: This work proposes simple, physically motivated structure perturbations to augment CGCNN's training data of relaxed structures with structures resembling unrelaxed ones but mapped to the same DFT final energy.
+  long: From this, the model should learn to map structures to their nearest energy basin, which is supported by a lower energy error on unrelaxed structures.
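The augmentation idea in the CGCNN+P notes can be sketched in a few lines. This is a hypothetical illustration, not the paper's exact scheme: uniform Gaussian displacements stand in for the physically motivated perturbations, and the key point is only that every perturbed copy keeps the relaxed structure's DFT energy as its training label.

```python
import numpy as np

rng = np.random.default_rng(42)


def perturb_structures(frac_coords: np.ndarray, energy: float,
                       n_perturb: int = 5, scale: float = 0.05):
    """Create n_perturb randomly displaced copies of a relaxed structure,
    all labeled with the relaxed structure's DFT final energy."""
    samples = [(frac_coords, energy)]  # keep the original relaxed structure
    for _ in range(n_perturb):
        displaced = frac_coords + rng.normal(scale=scale, size=frac_coords.shape)
        samples.append((displaced, energy))  # same target energy as the relaxed structure
    return samples


coords = rng.random((4, 3))  # 4 atoms, fractional coordinates
augmented = perturb_structures(coords, energy=-1.23)
print(len(augmented))  # 6: the relaxed structure plus 5 perturbations
```

`n_perturb=5` mirrors the `Perturbations: 5` hyperparam above.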
models/m3gnet/metadata.yml (+3)
@@ -22,6 +22,8 @@
   pandas: 1.5.1
 trained_on_benchmark: false
 notes:
+  description: M3GNet is a GNN-based universal (as in full periodic table) interatomic potential for materials trained on up to 3-body interactions in the initial, middle and final frames of MP DFT relaxations.
+  long: It thereby learns to emulate structure relaxation, MD simulations and property prediction of materials across diverse chemical spaces.
   training: Using pre-trained model released with paper. Was only trained on a subset of 62,783 MP relaxation trajectories in the 2018 database release (see [related issue](https://github.com/materialsvirtuallab/m3gnet/issues/20#issuecomment-1207087219)).

 - model_name: M3GNet + MEGNet

@@ -58,4 +60,5 @@
   pandas: 1.5.1
 trained_on_benchmark: false
 notes:
+  description: This combination of models uses M3GNet to relax initial structures, which are then passed to MEGNet to predict the formation energy.
   training: Using pre-trained model released with paper. Was only trained on a subset of 62,783 MP relaxation trajectories in the 2018 database release (see [related issue](https://github.com/materialsvirtuallab/m3gnet/issues/20#issuecomment-1207087219)).
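The M3GNet + MEGNet combination described in these notes is a two-stage pipeline: relax first, then predict on the relaxed structure. A minimal sketch with stub functions; neither call reflects the real m3gnet or megnet APIs, and the structure dict and returned energy are purely illustrative.

```python
def m3gnet_relax(structure: dict) -> dict:
    # Stub: would run an M3GNet interatomic-potential structure relaxation.
    return {**structure, "relaxed": True}


def megnet_e_form(structure: dict) -> float:
    # Stub: would return MEGNet's predicted formation energy (eV/atom).
    return -1.0 if structure["relaxed"] else 0.0


initial = {"formula": "NaCl", "relaxed": False}
# Chain the two models: MEGNet sees the M3GNet-relaxed structure, not the raw input.
e_form = megnet_e_form(m3gnet_relax(initial))
print(e_form)  # -1.0
```

The point of the composition is that MEGNet, trained on relaxed structures, receives inputs closer to its training distribution.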
models/megnet/metadata.yml (+2)
@@ -29,5 +29,7 @@ requirements:
   numpy: 1.24.0
   pandas: 1.5.1
 trained_on_benchmark: false
+
 notes:
+  description: MatErials Graph Network is another GNN for material properties of relaxed structures which showed that learned element embeddings encode periodic chemical trends and can be transfer-learned from large data sets (formation energies) to predictions on small-data properties (band gaps, elastic moduli).
   training: Using pre-trained model released with paper. Was only trained on `MP-crystals-2018.6.1` dataset [available on Figshare](https://figshare.com/articles/Graphs_of_materials_project/7451351).
+  description: A random forest trained to map the combo of composition-based Magpie features and structure-based relaxation-invariant Voronoi tessellation features (bond angles, coordination numbers, ...) to DFT formation energies.
+  long: This is an old model that predates most deep learning for materials but significantly improved over Coulomb matrix and partial radial distribution function methods. It therefore serves as a good baseline model to see what modern ML buys us.
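The random-forest baseline described above amounts to fitting scikit-learn's `RandomForestRegressor` on a fixed feature matrix. A sketch with synthetic stand-in data: in the real model, `X` would be Magpie composition features concatenated with Voronoi tessellation features, and `y` the DFT formation energies; here both are random.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins for the featurised training set.
n_samples, n_features = 200, 10
X = rng.random((n_samples, n_features))
y = X @ rng.random(n_features)  # fake "DFT formation energies"

model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X, y)
preds = model.predict(X)
print(preds.shape)  # (200,)
```

Because the Voronoi features are relaxation-invariant, the same featurisation can be applied to unrelaxed input structures, which is what makes this a usable discovery baseline.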
models/wrenformer/metadata.yml (+4)
@@ -28,3 +28,7 @@ trained_on_benchmark: true

 hyperparams:
   Ensemble Size: 10
+
+notes:
+  description: Wrenformer is a standard PyTorch Transformer Encoder trained to learn material embeddings from the composition, space group and Wyckoff positions in a structure.
+  long: It builds on [Roost](https://doi.org/10.1038/s41467-020-19964-7) and [Wren](https://doi.org/10.1126/sciadv.abn4117) by being a fast structure-free model that is still able to distinguish polymorphs through symmetry.
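A minimal sketch of the "standard PyTorch Transformer Encoder" named in the Wrenformer description: a sequence of per-Wyckoff-site token embeddings is encoded and pooled into a single material embedding. The dimensions, sequence length and mean-pooling choice are illustrative assumptions, not Wrenformer's actual architecture details.

```python
import torch
from torch import nn

enc_layer = nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(enc_layer, num_layers=2)

# One "material" as a sequence of 6 Wyckoff-site token embeddings (batch, seq, dim).
tokens = torch.randn(1, 6, 32)
embedding = encoder(tokens).mean(dim=1)  # pool over tokens -> material embedding
print(tuple(embedding.shape))  # (1, 32)
```

Since the tokens encode only composition plus symmetry (space group and Wyckoff positions), no relaxed atomic coordinates are needed, which is what makes the model "structure-free" yet polymorph-aware.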