@@ -47,6 +47,63 @@ references:
     type : chapter
     URL : https://doi.org/10.1007/978-94-011-4653-1_21
 
+  - id : allen_learning_2023
+    abstract : >-
+      The development of machine learning models has led to an abundance of
+      datasets containing quantum mechanical (QM) calculations for molecular and
+      material systems. However, traditional training methods for machine learning
+      models are unable to leverage the plethora of data available as they require
+      that each dataset be generated using the same QM method. Taking machine
+      learning interatomic potentials (MLIPs) as an example, we show that
+      meta-learning techniques, a recent advancement from the machine learning
+      community, can be used to fit multiple levels of QM theory in the same
+      training process. Meta-learning changes the training procedure to learn a
+      representation that can be easily re-trained to new tasks with small amounts
+      of data. We then demonstrate that meta-learning enables simultaneously
+      training to multiple large organic molecule datasets. As a proof of concept,
+      we examine the performance of a MLIP refit to a small drug-like molecule and
+      show that pre-training potentials to multiple levels of theory with
+      meta-learning improves performance. This difference in performance can be
+      seen both in the reduced error and in the improved smoothness of the
+      potential energy surface produced. We therefore show that meta-learning can
+      utilize existing datasets with inconsistent QM levels of theory to produce
+      models that are better at specializing to new datasets. This opens new
+      routes for creating pre-trained, foundational models for interatomic
+      potentials.
+    accessed :
+      - year : 2023
+        month : 7
+        day : 30
+    author :
+      - family : Allen
+        given : Alice E. A.
+      - family : Lubbers
+        given : Nicholas
+      - family : Matin
+        given : Sakib
+      - family : Smith
+        given : Justin
+      - family : Messerly
+        given : Richard
+      - family : Tretiak
+        given : Sergei
+      - family : Barros
+        given : Kipton
+    citation-key : allen_learning_2023
+    issued :
+      - year : 2023
+        month : 7
+        day : 8
+    number : arXiv:2307.04012
+    publisher : arXiv
+    source : arXiv.org
+    title : >-
+      Learning Together: Towards foundational models for machine learning
+      interatomic potentials with meta-learning
+    title-short : Learning Together
+    type : article
+    URL : http://arxiv.org/abs/2307.04012
+
   - id : aykol_rational_2021
     abstract : >-
       The rational solid-state synthesis of inorganic compounds is formulated as
@@ -1152,7 +1209,7 @@ references:
     URL : https://www.nature.com/articles/s41524-022-00891-8
     volume : ' 8'
 
-  - id : glawe_optimal_2016a
+  - id : glawe_optimal_2016
     abstract : >-
       Starting from the experimental data contained in the inorganic crystal
       structure database, we use a statistical analysis to determine the
@@ -1177,7 +1234,7 @@ references:
         given : E. K. U.
       - family : Marques
         given : Miguel A. L.
-    citation-key : glawe_optimal_2016a
+    citation-key : glawe_optimal_2016
     container-title : New Journal of Physics
     container-title-short : New J. Phys.
     DOI : 10.1088/1367-2630/18/9/093011
@@ -1905,7 +1962,7 @@ references:
     URL : https://www.nature.com/articles/nature17439
     volume : ' 533'
 
-  - id : rupp_fast_2012a
+  - id : rupp_fast_2012
     abstract : >-
       We introduce a machine learning model to predict atomization energies of a
       diverse set of organic molecules, based on nuclear charges and atomic
@@ -1930,7 +1987,7 @@ references:
       - family : Lilienfeld
         given : O. Anatole
         non-dropping-particle : von
-    citation-key : rupp_fast_2012a
+    citation-key : rupp_fast_2012
     container-title : Physical Review Letters
     container-title-short : Phys. Rev. Lett.
     DOI : 10.1103/PhysRevLett.108.058301
@@ -2371,6 +2428,45 @@ references:
     type : article-journal
     URL : http://arxiv.org/abs/1706.03762
 
+  - id : vonlilienfeld_retrospective_2020
+    abstract : >-
+      Over the last decade, we have witnessed the emergence of ever more machine
+      learning applications in all aspects of the chemical sciences. Here, we
+      highlight specific achievements of machine learning models in the field of
+      computational chemistry by considering selected studies of electronic
+      structure, interatomic potentials, and chemical compound space in
+      chronological order.
+    accessed :
+      - year : 2023
+        month : 7
+        day : 29
+    author :
+      - family : Lilienfeld
+        given : O. Anatole
+        non-dropping-particle : von
+      - family : Burke
+        given : Kieron
+    citation-key : vonlilienfeld_retrospective_2020
+    container-title : Nature Communications
+    container-title-short : Nat Commun
+    DOI : 10.1038/s41467-020-18556-9
+    ISSN : 2041-1723
+    issue : ' 1'
+    issued :
+      - year : 2020
+        month : 9
+        day : 29
+    language : en
+    license : 2020 The Author(s)
+    number : ' 1'
+    page : ' 4895'
+    publisher : Nature Publishing Group
+    source : www.nature.com
+    title : Retrospective on a decade of machine learning for chemical discovery
+    type : article-journal
+    URL : https://www.nature.com/articles/s41467-020-18556-9
+    volume : ' 11'
+
   - id : wang_predicting_2021
     abstract : >-
       We propose an efficient high-throughput scheme for the discovery of stable