Fix TransformersModel: torch_dtype and trust_remote_code not passed to VLM #1012


Merged · 3 commits · Mar 17, 2025

Conversation

Louis-Gv
Contributor

Fixes TransformersModel initialisation: `torch_dtype` and `trust_remote_code` were not being passed to `AutoModelForImageTextToText.from_pretrained()`.

The model can now be loaded in the desired dtype.
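For context, the fix amounts to forwarding the two kwargs through to the VLM loader, the same way the text-only path already does. The sketch below is an illustration of the pattern, not the exact smolagents code; the `load_vlm` helper and its kwargs handling are assumptions, while the `transformers` class names and parameters are real:

```python
def load_vlm(model_id, torch_dtype=None, trust_remote_code=False, **kwargs):
    """Load a VLM, forwarding dtype/remote-code flags (hypothetical helper).

    Lazy import so the example stays self-contained; in smolagents the
    import lives at module level.
    """
    from transformers import AutoModelForImageTextToText, AutoProcessor

    model = AutoModelForImageTextToText.from_pretrained(
        model_id,
        torch_dtype=torch_dtype,              # was dropped: model loaded in default dtype
        trust_remote_code=trust_remote_code,  # was dropped: custom-code repos failed to load
        **kwargs,
    )
    processor = AutoProcessor.from_pretrained(
        model_id, trust_remote_code=trust_remote_code
    )
    return model, processor
```

With the flags forwarded, e.g. `load_vlm("org/model", torch_dtype="bfloat16")` actually loads the weights in bfloat16 instead of silently falling back to the default dtype.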

Member

@albertvillanova albertvillanova left a comment


Thanks! Good catch!

@albertvillanova
Member

I fixed the code style (`make style`) and added a regression test.

@albertvillanova albertvillanova changed the title Fix TransformersModel: torch_dtype and trust_remote_code not passed Fix TransformersModel: torch_dtype and trust_remote_code not passed to VLM Mar 17, 2025
@albertvillanova albertvillanova merged commit 83e971a into huggingface:main Mar 17, 2025
3 checks passed