🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.
[`LlamaTokenizer`] make unk_token_length a property (#25689)
Author: Arthur
Commit: 6e6da5e4b860d98d3b625fe5c63db4e83087b6ff
Parent: b85b880
Committed by: GitHub <noreply@github.com>
Date: 8/24/2023, 6:03:34 AM
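The change turns `unk_token_length` from a value computed once at construction time into a property that is recomputed on each access, so it stays in sync if `unk_token` is later changed. A minimal sketch of the pattern, using a simplified stand-in class rather than the actual `LlamaTokenizer` internals (the real implementation derives the length from the SentencePiece encoding of the unknown token):

```python
class Tokenizer:
    """Simplified stand-in illustrating the attribute-to-property change."""

    def __init__(self, unk_token="<unk>"):
        self.unk_token = unk_token
        # Before the change: unk_token_length was computed once here,
        # so it went stale if unk_token was reassigned afterwards.

    @property
    def unk_token_length(self):
        # After the change: recomputed on every access, always reflecting
        # the current unk_token. (len() stands in for the real
        # SentencePiece-based length computation.)
        return len(self.unk_token)


tok = Tokenizer()
print(tok.unk_token_length)  # 5
tok.unk_token = "<unknown>"
print(tok.unk_token_length)  # 9
```

Because a property carries no stored state, there is also no attribute to keep consistent during serialization or token replacement.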