🤗 Transformers: the model-definition framework for state-of-the-art machine learning models for text, vision, audio, and multimodal tasks, for both inference and training.


fix: prefer registered config over remote code in AutoConfig.from_pretrained (#45094)

* Prefer registered classes over remote code in Auto*.from_pretrained

When a class has been explicitly registered via AutoConfig.register()
(or other Auto*.register()), it should take precedence over auto_map
remote code. Previously, trust_remote_code=True with auto_map entries
in config.json would always load remote code, ignoring the registration.

This applies the same fix across all Auto classes:
AutoConfig, AutoModel, AutoTokenizer, AutoProcessor,
AutoFeatureExtractor, AutoImageProcessor, AutoVideoProcessor.

The old behavior caused issues for downstream libraries (e.g., vLLM) that vendor fixed classes for models with broken remote code: internal calls from AutoTokenizer/AutoProcessor would bypass the registration and load the broken remote class.
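The precedence change described above can be sketched in isolation. This is a hypothetical, self-contained model of the resolution order, not the actual transformers implementation; the function name and registry shape are illustrative assumptions:

```python
# Sketch (assumed, not the real transformers code) of the fixed
# precedence: an explicitly registered local class wins over a
# remote-code class referenced by auto_map in config.json, even
# when trust_remote_code=True.

def resolve_loader(model_type, auto_map_entry, registry, trust_remote_code):
    """Pick which class source to use for a given model type.

    registry:       dict mapping model_type -> locally registered class
    auto_map_entry: remote-code class reference from config.json, or None
    """
    registered = registry.get(model_type)
    if registered is not None:
        # Explicit registration takes precedence over remote code.
        return ("registered", registered)
    if auto_map_entry is not None and trust_remote_code:
        # No registration: fall back to trusted remote code.
        return ("remote", auto_map_entry)
    raise ValueError(f"no loader available for {model_type!r}")
```

Before the fix, the first branch was effectively skipped whenever an auto_map entry was present and trust_remote_code=True.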

Fixes: #45093

Co-Authored-By: Claude Opus 4.6 (1M context) <[email protected]>

* Only ignore remote code if local code is explicit (i.e. doesn't belong to transformers)
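One way to express this bullet's "is the local code explicit?" check is by module origin. This is a hedged sketch of one plausible heuristic, not necessarily the detection the commit implements:

```python
# Assumed heuristic: treat a registered class as an "explicit"
# (user-provided) registration only if it is defined outside the
# transformers package; classes from transformers itself are the
# library's own defaults and should not shadow remote code.

def is_explicit_registration(cls):
    module = getattr(cls, "__module__", "") or ""
    return not (module == "transformers" or module.startswith("transformers."))
```

Under this heuristic, only user-registered classes override auto_map remote code; the library's built-in mappings still defer to trust_remote_code=True.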

Signed-off-by: Harry Mellor <[email protected]>

* update tests

Signed-off-by: Harry Mellor <[email protected]>

* `make style`

Signed-off-by: Harry Mellor <[email protected]>

* fix tokenizer test

Signed-off-by: Harry Mellor <[email protected]>

* Fix tests

Signed-off-by: Harry Mellor <[email protected]>

* Fix explicit registration detection

Signed-off-by: Harry Mellor <[email protected]>

* make style

Signed-off-by: Harry Mellor <[email protected]>

---------

Signed-off-by: Harry Mellor <[email protected]>
Co-authored-by: Claude Opus 4.6 (1M context) <[email protected]>
Co-authored-by: Harry Mellor <[email protected]>
Fang Han committed
81db7d3513a7045ef96c55eec71b8075c529d098
Parent: 2dba8e0
Committed by GitHub <[email protected]> on 3/31/2026, 2:40:50 PM