Stars and forks statistics for the HazyResearch/flash-attention repository. As of 7 December 2023, the repository has 7,243 stars and 599 forks.
FlashAttention
This repository provides the official implementation of FlashAttention and FlashAttention-2 from the following papers.
FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness. Tri Dao, Daniel Y. Fu, Stefano Ermon, Atri Rudra, Christopher Ré. Paper: https://arxiv.org/abs/2205.14135. IEEE Spectrum article about our submission to the MLPerf 2.0 benchmark using FlashAttention.
FlashAttention-2: Faster Attention with Better Parallelism and Work Partitioning. Tri Dao. Paper: ...
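The snippet below is a minimal usage sketch, not taken from this page: it assumes the `flash_attn` Python package built from this repository and its `flash_attn_func` interface as documented in the FlashAttention-2 README, with fp16/bf16 tensors on a CUDA device.

```python
# Hedged sketch: assumes flash_attn is installed and exposes flash_attn_func
# (per the FlashAttention-2 README); inputs must be fp16/bf16 on a CUDA GPU.
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 16, 64
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact attention computed without materializing the full seqlen x seqlen
# attention matrix in GPU memory; causal=True applies a causal mask.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)
```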
repo | languages | stars | stars (weekly) | forks | forks (weekly) |
---|---|---|---|---|---|
Dion-Systems/4coder | C++, C, Objective-C++ | 457 | 0 | 106 | 0 |
PJLab-ADG/SensorsCalibration | C++, Fortran, CMake | 1.6k | 0 | 406 | 0 |
wkhtmltopdf/wkhtmltopdf | C++, QMake | 13.4k | 0 | 1.8k | 0 |
klezVirus/CVE-2021-40444 | HTML, Python, Other | 766 | 0 | 170 | 0 |
reactor/reactor-netty | Java, Other | 2.4k | 0 | 603 | 0 |
alibaba/GraphScope | Rust, C++, Java | 2.9k | +9 | 452 | +1 |
pinojs/pino | JavaScript, TypeScript, Other | 12.2k | 0 | 843 | 0 |
wekan/wekan | JavaScript, HTML, CSS | 18.8k | 0 | 2.8k | 0 |
elebumm/RedditVideoMakerBot | Python, HTML, Shell | 5.6k | 0 | 1.5k | 0 |
EssayKillerBrain/EssayKiller_V2 | Python, C, Java | 5.3k | 0 | 916 | 0 |