
Modify the examples optimization level #230


Merged
echarlaix merged 3 commits into main from modify-opt-level on Jun 22, 2022

Conversation

echarlaix (Collaborator)

In this PR, we modify the ONNX Runtime optimization level used in the examples in the quickstart and the different READMEs, so that users create hardware-agnostic optimized graphs.
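For context, a minimal sketch of the kind of configuration the updated examples point users toward (class and argument names follow optimum.onnxruntime; this snippet is illustrative, not copied from the PR diff):

    from optimum.onnxruntime.configuration import OptimizationConfig

    # Optimization level 1 enables only basic, hardware-agnostic graph
    # optimizations (e.g. constant folding and redundant node elimination),
    # so the optimized graph can be exported once and run on any target hardware.
    optimization_config = OptimizationConfig(optimization_level=1)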

echarlaix marked this pull request as ready for review on June 22, 2022 at 10:04
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

regisss (Contributor) left a comment


It looks good to me @echarlaix!!
I just have one question because I'm curious.

@@ -53,7 +53,7 @@ def test_optimize(self):
             "roberta-base",
             "google/electra-small-discriminator",
         }
-        optimization_config = OptimizationConfig(optimization_level=99, optimize_with_onnxruntime_only=False)
+        optimization_config = OptimizationConfig(optimization_level=2, optimize_with_onnxruntime_only=False)
regisss (Contributor)


Why 2 and not 1 here for optimization_level?

echarlaix (Collaborator, Author)


For the tests, I think it's better to verify that even the extended graph optimizations don't impact the models' outputs too much. An optimization level of 99 doesn't seem useful for our use cases.
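For reference, here is a rough sketch of how these optimization_level values correspond to ONNX Runtime's own graph optimization levels (this mapping is an assumption based on the onnxruntime Python API, not something stated in this PR):

    import onnxruntime as ort

    opts = ort.SessionOptions()
    # Optimum level -> ONNX Runtime GraphOptimizationLevel:
    #   0 -> ORT_DISABLE_ALL     (no graph optimizations)
    #   1 -> ORT_ENABLE_BASIC    (hardware-agnostic, e.g. constant folding)
    #   2 -> ORT_ENABLE_EXTENDED (adds node fusions that may be hardware-specific)
    #  99 -> ORT_ENABLE_ALL      (additionally enables layout optimizations)
    opts.graph_optimization_level = ort.GraphOptimizationLevel.ORT_ENABLE_EXTENDED
    # "model.onnx" is a placeholder path used for illustration.
    session = ort.InferenceSession("model.onnx", sess_options=opts)

Under this reading, level 2 exercises the extended fusions in the tests, while the examples stay at the hardware-agnostic level.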

echarlaix merged commit d501248 into main on Jun 22, 2022
echarlaix deleted the modify-opt-level branch on June 22, 2022 at 14:21