Inconsistency in GNoME's F1 Scores on Matbench #91

Closed
shrshr111 opened this issue Feb 20, 2024 · 1 comment · Fixed by #92
I noticed that the F1 scores for GNoME listed on two different web pages within the Matbench Discovery section appear to be inconsistent:

https://matbench-discovery.materialsproject.org/models
and
https://matbench-discovery.materialsproject.org/

Could someone please clarify why there is a difference in the reported F1 scores for GNoME? Thanks!

janosh (Owner) commented Feb 20, 2024

thanks for reporting! that is indeed a mistake. the model page was still showing metrics for the complete test set, whereas the landing page recently transitioned to showing metrics on a subset of the WBM test set restricted to new and unique structure prototypes. see #75 for details.
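For context, the two numbers come from computing the same metric on two different slices of the test set. Below is a minimal sketch (toy data and hypothetical column names, not the repo's actual code) of how F1 on the complete test set can differ from F1 on the unique-prototype subset:

```python
"""Sketch only: illustrates how the same classifier yields different F1 scores
on the full WBM test set vs. the unique-prototype subset. Column names and
values here are made up for illustration."""
import pandas as pd
from sklearn.metrics import f1_score

# toy stand-in for the test set: true vs. predicted energy above hull (eV/atom)
# plus a flag marking structures with new and unique prototypes
df = pd.DataFrame({
    "e_above_hull_true": [-0.02, 0.01, -0.05, 0.10, -0.01, 0.03],
    "e_above_hull_pred": [-0.01, -0.02, -0.04, 0.12, 0.02, 0.05],
    "unique_prototype": [True, False, True, True, False, True],
})

stable_true = df.e_above_hull_true <= 0  # stable = on or below the convex hull
stable_pred = df.e_above_hull_pred <= 0

f1_full = f1_score(stable_true, stable_pred)  # metric over the complete test set

mask = df.unique_prototype  # restrict to new and unique prototypes
f1_uniq = f1_score(stable_true[mask], stable_pred[mask])

print(f"full test set F1:    {f1_full:.3f}")
print(f"unique-prototype F1: {f1_uniq:.3f}")  # generally differs from the above
```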

@janosh janosh added bug Something isn't working site Website related labels Feb 20, 2024
janosh added a commit that referenced this issue Feb 20, 2024
* breaking: LabelEnum.dict() -> val_dict(), add label_dict() method

* add training_set col of main metrics table into urls

* fix gnome.yml targets: EFS -> EF

also add missing predictions note

* test mbd/enums.py and LabelEnum

* show model-stats-uniq-protos.json on /models to fix mismatch with landing page metrics (closes #91)

* render missing_preds notes in ModelCard tooltip

* rename model schema .yml + d.ts files

* clickable links to training sets in metrics table