Stars and forks statistics for the SHI-Labs/Neighborhood-Attention-Transformer repository. As of 7 May 2024, this repository has 900 stars and 80 forks.
Neighborhood Attention Transformers: powerful hierarchical vision transformers based on sliding-window attention. Neighborhood Attention (NA, local attention) was introduced in our original paper, NAT, and runs efficiently with our extension to PyTorch, NATTEN. We recently introduced a new model, DiNAT, which extends NA by dilating neighborhoods (DiNA, sparse global attention, a.k.a. dilated local attention). Combinations of NA/DiNA are capable of preserving locality, maintaining translational equivariance, expanding...
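To make the neighborhood scheme concrete, here is a minimal, hedged sketch in plain Python of which key indices a 1-D query attends to under NA, and under DiNA with a dilation factor. The function name `na1d_indices` is hypothetical and the edge-handling rule (shifting the window inward so border queries keep a full window, rather than zero-padding) follows the behavior the papers describe; NATTEN itself implements this in optimized CUDA/C++ kernels, not like this.

```python
def na1d_indices(i, length, kernel_size, dilation=1):
    """Key indices attended by query i under 1-D (dilated) neighborhood attention.

    A toy sketch, not NATTEN's API: border queries keep a full window of
    kernel_size keys by shifting the window inward instead of zero-padding.
    With dilation d, a query only attends within its residue class mod d.
    """
    r = i % dilation                                  # residue class of the query
    count = (length - r + dilation - 1) // dilation   # number of keys in this class
    pos = i // dilation                               # query's position within its class
    start = pos - kernel_size // 2                    # centered window...
    start = max(0, min(start, count - kernel_size))   # ...shifted inward at the borders
    return [r + (start + j) * dilation for j in range(kernel_size)]


# Undilated NA: interior queries get a centered window, edge queries a shifted one.
print(na1d_indices(4, 8, 3))              # [3, 4, 5]
print(na1d_indices(0, 8, 3))              # [0, 1, 2]
# DiNA: same window size, but keys are spaced by the dilation factor.
print(na1d_indices(0, 8, 3, dilation=2))  # [0, 2, 4]
```

Note how dilation widens the receptive field without adding attended keys, which is the mechanism DiNAT alternates with plain NA across layers.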
| Repository | Languages | Stars | Stars (weekly) | Forks | Forks (weekly) |
|---|---|---|---|---|---|
| fei-protocol/checkthechain | Python | 784 | +2 | 93 | 0 |
| codePerfectPlus/awesomeScripts | Python, HTML, Other | 519 | 0 | 271 | 0 |
| Jon-Becker/heimdall-rs | Rust, Shell | 862 | +4 | 77 | +2 |
| gardener/website-generator | SCSS, HTML, JavaScript | 8 | 0 | 17 | 0 |
| Darlington02/circom-next-starter | Solidity, JavaScript, CSS | 46 | 0 | 3 | 0 |
| codemistic/Web3 | JavaScript, Solidity, Shell | 21 | 0 | 29 | 0 |
| Inventor77/Nutiveal-Nutrition-Data | Python | 21 | 0 | 22 | 0 |
| ts1/BLEUnlock | Swift, Shell, C | 1.6k | 0 | 119 | 0 |
| PRBonn/kiss-icp | Python, C++, Jupyter Notebook | 1.1k | 0 | 200 | 0 |
| CamHenlin/FocusedEdit | C, CMake, C++ | 52 | 0 | 1 | 0 |