
AutoModel(s) do not respect the revision flag while loading custom models #18537

Closed
@ankrgyl

Description


System Info

  • transformers version: 4.21.1
  • Platform: macOS-12.4-arm64-arm-64bit
  • Python version: 3.10.5
  • Huggingface_hub version: 0.8.1
  • PyTorch version (GPU?): 1.12.1 (False)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: no
  • Using distributed or parallel set-up in script?: no

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

from transformers import AutoModelForImageClassification

m = AutoModelForImageClassification.from_pretrained(
    "sgugger/custom-resnet50d",
    trust_remote_code=True,
    revision="ed94a7c6247d8aedce4647f00f20de6875b5b292",
)
# It will print:
# Explicitly passing a `revision` is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.

I stepped through the code and observed that `AutoConfig.from_pretrained` here consumes the `revision` entry from `kwargs`, so by the time execution reaches line 433 it is no longer present, and the model is loaded without it. I believe the same issue applies to `use_auth_token`.
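The underlying pattern can be illustrated without transformers at all. This is a minimal sketch (the function names are hypothetical stand-ins, not the real transformers internals): a config-loading helper pops `revision` from the kwargs it returns, so the subsequent model-loading step never sees the pinned commit and falls back to its default.

```python
def config_from_pretrained(name, **kwargs):
    # Stand-in for AutoConfig.from_pretrained: it consumes `revision`
    # and returns only the kwargs it did not recognize.
    revision = kwargs.pop("revision", None)
    config = {"name": name, "revision": revision}
    return config, kwargs  # `revision` is absent from the returned kwargs

def model_from_pretrained(name, **kwargs):
    # Stand-in for the auto-class entry point: the model-loading step
    # only sees the kwargs returned by the config step.
    config, remaining = config_from_pretrained(name, **kwargs)
    model_revision = remaining.get("revision", "main")  # falls back to "main"
    return config, model_revision

config, model_revision = model_from_pretrained(
    "sgugger/custom-resnet50d",
    trust_remote_code=True,
    revision="ed94a7c6247d8aedce4647f00f20de6875b5b292",
)
print(config["revision"])   # the config step did see the pinned revision
print(model_revision)       # "main": the model step did not
```

So the user-supplied `revision` reaches the configuration but is silently dropped before the model download, which is exactly why the warning still fires.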

Expected behavior

I think the `revision` (and `use_auth_token`) should propagate to both the configuration and the model, so that pinning a commit actually pins the code that gets executed.
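One possible shape of a fix, sketched on the same hypothetical stand-ins as above (not the actual transformers code): capture the keys that must reach the model load before the config step consumes them, then restore them afterwards.

```python
def config_from_pretrained(name, **kwargs):
    # Stand-in for AutoConfig.from_pretrained: consumes `revision`.
    kwargs.pop("revision", None)
    return {"name": name}, kwargs

def model_from_pretrained(name, **kwargs):
    # Hypothetical fix: remember the keys the model load also needs,
    # and re-apply them after the config step has consumed its copy.
    shared = {k: kwargs[k] for k in ("revision", "use_auth_token") if k in kwargs}
    config, remaining = config_from_pretrained(name, **kwargs)
    remaining.update(shared)  # propagate `revision` to the model-loading step
    return remaining.get("revision", "main")

print(model_from_pretrained("sgugger/custom-resnet50d", revision="abc123"))
```

With this, the pinned revision survives to the model-loading step instead of being swallowed by the config step.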
