nikopj/FlashAttention.jl

Julia implementation of the flash-attention operation for neural networks.

Languages: Julia, C++, CUDA, TeX, Other

Stars and forks stats for the nikopj/FlashAttention.jl repository: as of 10 May 2024, it has 8 stars and 1 fork.

FlashAttention.jl: a Julia implementation of Flash-Attention, a fused kernel for the dot-product attention operation in transformer neural networks.

Recommended Julia install method. Install Julia via the juliaup manager:

    curl -fsSL https://install.julialang.org | sh

Todo:
- Julia implementation
  - Naive algorithms: the following are naive in the sense that they all form the normalized adjacency matrix P. Note that GPU versions come for free-ish from all of these naive algorithms by making use of NNlib.jl (a sketch of such a naive formulation follows below).
    - naive...
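For context, here is a minimal sketch of the kind of "naive" attention the excerpt describes: one that explicitly materializes the normalized matrix P = softmax(KᵀQ / √d) before multiplying by the values, with NNlib.jl supplying the softmax. The function name `naive_attention` and the array layout (features along the first dimension, one column per query) are assumptions for illustration, not the package's actual API.

```julia
using NNlib: softmax

# Naive dot-product attention: forms the full normalized matrix P in memory,
# which is exactly what the fused flash-attention kernel avoids.
function naive_attention(Q, K, V)
    d = size(Q, 1)               # feature dimension
    S = (K' * Q) ./ sqrt(d)      # similarity scores: keys × queries
    P = softmax(S; dims=1)       # normalized over keys for each query column
    return V * P                 # weighted sum of values, features × queries
end

# Toy usage: 8 features, 16 keys/values, 4 queries (sizes chosen arbitrarily).
Q, K, V = randn(8, 4), randn(8, 16), randn(8, 16)
O = naive_attention(Q, K, V)     # 8×4 output
```

Because NNlib's `softmax` and ordinary matrix multiplication both dispatch on GPU arrays, a formulation like this is what makes the "GPU version for free-ish" claim plausible; the cost is the O(keys × queries) memory for P.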