hpcaitech/ColossalAI (Python)
Making large AI models cheaper, faster and more accessible
grpo-code
ColossalAI/extensions/pybind/flash_attention
__init__.py                     470 B
flash_attention_dao_cuda.py     3.7 KB
flash_attention_npu.py          2.0 KB
flash_attention_sdpa_cuda.py    1.8 KB
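The three backend files above suggest a runtime dispatch among flash-attention implementations (Dao's CUDA kernels, Ascend NPU, and PyTorch SDPA). The following is a minimal sketch of such a backend-registry pattern, assuming nothing about ColossalAI's actual loader API; every function and registry name here is hypothetical.

```python
# Hedged sketch of a backend registry for flash attention.
# All names are hypothetical; ColossalAI's real loader may differ.
from typing import Callable, Dict

_BACKENDS: Dict[str, Callable[[], str]] = {}


def register_backend(name: str):
    """Decorator that records a loader function under a backend name."""
    def decorator(fn: Callable[[], str]) -> Callable[[], str]:
        _BACKENDS[name] = fn
        return fn
    return decorator


@register_backend("dao_cuda")
def load_dao_cuda() -> str:
    # Would import the Dao flash-attention CUDA extension here.
    return "flash_attention_dao_cuda"


@register_backend("npu")
def load_npu() -> str:
    # Would import the Ascend NPU implementation here.
    return "flash_attention_npu"


@register_backend("sdpa_cuda")
def load_sdpa_cuda() -> str:
    # Would fall back to PyTorch's scaled_dot_product_attention.
    return "flash_attention_sdpa_cuda"


def select_backend(preferred: list) -> str:
    """Return the first registered backend from a preference list."""
    for name in preferred:
        if name in _BACKENDS:
            return _BACKENDS[name]()
    raise RuntimeError("no flash-attention backend available")
```

In a real extension layer, each loader would attempt its hardware-specific import and the selector would skip backends whose dependencies are missing, rather than returning strings.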