🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal domains, for both inference and training.

Change assignee for tokenizers (#15088)

Lysandre Debut committed
Commit: 42d57549b82014834706ca86515eb6cc6431b3cb
Parent: a54961c
Committed by GitHub <noreply@github.com> on 1/10/2022, 2:22:48 PM