tejas / flash-attention
Fast and memory-efficient exact attention
Primary language: Python
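As context for the repository description, the following is a minimal NumPy sketch of plain exact scaled dot-product attention, the quantity FlashAttention computes in a tiled, memory-efficient way on GPU. It is an illustration only, not the library's API: the function name `exact_attention` and its shapes are assumptions, and this reference version materializes the full O(n^2) score matrix, which is precisely what the library avoids.

```python
import numpy as np

def exact_attention(q, k, v, causal=False):
    """Reference (non-flash) exact scaled dot-product attention.

    q, k, v: arrays of shape (seq_len, head_dim). Illustrative only;
    materializes the full (n, n) score matrix, unlike FlashAttention.
    """
    scale = 1.0 / np.sqrt(q.shape[-1])
    scores = (q @ k.T) * scale                    # (n, n) attention logits
    if causal:
        n = scores.shape[0]
        mask = np.triu(np.ones((n, n), dtype=bool), k=1)
        scores = np.where(mask, -np.inf, scores)  # hide future positions
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v                            # (n, head_dim) output

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((4, 8)) for _ in range(3))
out = exact_attention(q, k, v, causal=True)
print(out.shape)  # (4, 8)
```

With the causal mask, the first query position can only attend to itself, so the first output row equals `v[0]` exactly; that invariant is a quick sanity check for any attention implementation.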
Branch: main (1 branch)
Latest commit: "[Cute,Bwd,Sm100] Implement cluster" by Tri Dao (c59ecd8, 5 months ago)
1,120 commits
Directories:
.github/
assets/
benchmarks/
csrc/
examples/
flash_attn/
hopper/
tests/
training/
Files:
.gitignore (284 B)
.gitmodules (235 B)
.pre-commit-config.yaml (721 B)
AUTHORS (29 B)
LICENSE (1.5 KB)
Makefile (126 B)
MANIFEST.in (343 B)
README.md (22.3 KB)
setup.py (27.0 KB)
usage.md (6.8 KB)