
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models for text, vision, audio, and multimodal tasks, for both inference and training.


Fix the inconsistent optionality of attention_mask (#37153)

* debugging issue 36758

* debugging issue 36758

* debugging issue 36758

* updated attn_mask type specification in _flash_attention_forward

* removed pdb

* added a blank line

* removed indentation
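
The bullets above are the squashed commit history; the substantive change is the type specification of the attention mask in `_flash_attention_forward`, making its optionality explicit. A minimal sketch of the pattern, using simplified hypothetical names and plain Python types rather than the actual Transformers signature:

```python
from typing import Optional

# Hypothetical simplified sketch (not the real Transformers code):
# the mask parameter is explicitly Optional, so callers may omit it
# and the forward pass handles the None case uniformly.
def flash_attention_forward(
    scores: list[float],
    attention_mask: Optional[list[int]] = None,
) -> list[float]:
    # No mask supplied: every position is attended to unchanged.
    if attention_mask is None:
        return scores
    # Masked positions (mask value 0) are zeroed out.
    return [s if m else 0.0 for s, m in zip(scores, attention_mask)]
```

Declaring the parameter `Optional[...] = None` rather than relying on an implicit convention keeps the signature consistent with how callers actually invoke it.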
Yufeng Xu committed
bf41e54fc8242dafa31bf6203e3d505bcb907119
Parent: 3249c5d
Committed by GitHub <[email protected]> on 4/1/2025, 2:31:10 PM