HazyResearch/flash-attention

Fast and memory-efficient exact attention

Languages: Python, C++, Cuda, Other

This is the stars and forks stats page for the HazyResearch/flash-attention repository. As of 28 Apr 2024, the repository has 7,243 stars and 599 forks.

FlashAttention

This repository provides the official implementation of FlashAttention and FlashAttention-2 from the following papers.

FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness
Tri Dao, Daniel Y. Fu, Stefano Ermon, Atri Rudra, Christopher Ré
Paper: https://arxiv.org/abs/2205.14135
IEEE Spectrum article about our submission to the MLPerf 2.0 benchmark using FlashAttention.

FlashAttention-2: Faster Attention with Better Parallelism and Work Partitioning
Tri Dao
Paper: ...
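To give a sense of how the library is typically called from Python, here is a minimal sketch, not taken from the repository's README: it assumes the flash-attn 2.x package is installed with CUDA support and uses its flash_attn_func entry point with the library's (batch, seqlen, nheads, headdim) tensor layout in fp16.

```python
# Minimal usage sketch (assumption: flash-attn 2.x installed, CUDA GPU available).
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64
# flash-attn expects (batch, seqlen, nheads, headdim) tensors in fp16 or bf16 on GPU.
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact attention computed tile-by-tile in on-chip SRAM, so the full
# seqlen x seqlen score matrix is never materialized in GPU HBM.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # (batch, seqlen, nheads, headdim)
```

The call is a drop-in replacement for a standard scaled-dot-product attention over the same q, k, v tensors; memory use scales linearly rather than quadratically in sequence length.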
repo | languages | stars | weekly change | forks | weekly change
Dion-Systems/4coder | C++, C, Objective-C++ | 457 | 0 | 106 | 0
PJLab-ADG/SensorsCalibration | C++, Fortran, CMake | 1.6k | 0 | 406 | 0
wkhtmltopdf/wkhtmltopdf | C++, QMake | 13.4k | 0 | 1.8k | 0
klezVirus/CVE-2021-40444 | HTML, Python, Other | 766 | 0 | 170 | 0
reactor/reactor-netty | Java, Other | 2.4k | 0 | 603 | 0
alibaba/GraphScope | Rust, C++, Java | 2.9k | +9 | 452 | +1
pinojs/pino | JavaScript, TypeScript, Other | 12.2k | 0 | 843 | 0
wekan/wekan | JavaScript, HTML, CSS | 18.8k | 0 | 2.8k | 0
elebumm/RedditVideoMakerBot | Python, HTML, Shell | 5.6k | 0 | 1.5k | 0
EssayKillerBrain/EssayKiller_V2 | Python, C, Java | 5.3k | 0 | 916 | 0