These are the star and fork statistics for the nikopj/FlashAttention.jl repository. As of 10 May 2024, the repository has 8 stars and 1 fork.
# FlashAttention.jl

Julia implementation of Flash-Attention, a fused kernel for the dot-product attention operation in transformer neural networks.

## Recommended Julia Install Method

Install Julia via the juliaup manager:

```
curl -fsSL https://install.julialang.org | sh
```

## Todo

Julia implementation. Naive algorithms: the following are naive in the sense that they all form the normalized adjacency matrix P. Note that GPU versions come for free-ish from all of these naive algorithms by making use of NNlib.jl. naive...
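The "naive" baselines mentioned above all materialize the full normalized attention matrix P, which is exactly the memory cost that Flash-Attention's fused kernel avoids. A minimal sketch of that naive algorithm, written here in Python/NumPy rather than Julia for illustration (the function name `naive_attention` is hypothetical, not from the package):

```python
import numpy as np

def naive_attention(Q, K, V):
    """Naive dot-product attention: explicitly forms the normalized matrix P."""
    d = Q.shape[-1]
    # Score matrix S = Q K^T / sqrt(d); O(n^2) memory -- the step
    # Flash-Attention avoids materializing.
    S = Q @ K.T / np.sqrt(d)
    # Numerically stable row-wise softmax gives the normalized matrix P.
    P = np.exp(S - S.max(axis=-1, keepdims=True))
    P /= P.sum(axis=-1, keepdims=True)
    return P @ V

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
out = naive_attention(Q, K, V)
print(out.shape)  # prints (4, 8)
```

Each row of P sums to 1, so the output is a convex combination of the rows of V; the fused kernel computes the same result block-by-block without ever storing P.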
| repo | languages | stars | stars Δ/week | forks | forks Δ/week |
|---|---|---|---|---|---|
| IMFS-MMB/mmb-rep | MATLAB, C, AMPL | 42 | 0 | 29 | 0 |
| mthcht/Purpleteam | PowerShell, Python, Other | 92 | 0 | 6 | 0 |
| DevDayBe/edition-2023 | JavaScript, Q#, TypeScript | 2 | 0 | 8 | 0 |
| getdozer/dozer | Rust, Other | 1.3k | 0 | 98 | 0 |
| uhyo/nitrogql | Rust, TypeScript, JavaScript | 277 | 0 | 8 | 0 |
| dandi/dandi.github.io | SCSS, HTML, Other | 0 | 0 | 12 | 0 |
| zigtur/Rareskills-Solidity-Bootcamp | Solidity, JavaScript, Yul | 86 | 0 | 14 | 0 |
| evmts/evmts-monorepo | TypeScript, JavaScript, Other | 104 | 0 | 6 | 0 |
| reworkd/AgentGPT | TypeScript, Python, MDX | 26.9k | 0 | 8.9k | 0 |
| ohmplatform/FreedomGPT | TypeScript, Other | 2.3k | 0 | 304 | 0 |