SHI-Labs/Neighborhood-Attention-Transformer

[CVPR 2023] Neighborhood Attention Transformer and [arXiv] Dilated Neighborhood Attention Transformer repository.

Languages: Python, Shell. Topics: pytorch, neighborhood-attention.
Stars and forks stats for the SHI-Labs/Neighborhood-Attention-Transformer repository: as of 7 May 2024, it has 900 stars and 80 forks.

Neighborhood Attention Transformers

Powerful hierarchical vision transformers based on sliding-window attention. Neighborhood Attention (NA, a local attention mechanism) was introduced in our original paper, NAT, and runs efficiently with NATTEN, our extension to PyTorch. We recently introduced a new model, DiNAT, which extends NA by dilating neighborhoods (DiNA, a sparse global attention, a.k.a. dilated local attention). Combinations of NA/DiNA are capable of preserving locality, maintaining translational equivariance, expanding...
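Since the excerpt above describes NA and DiNA only at a high level, a minimal self-contained sketch of the mechanism may help. The snippet below is an illustrative pure-PyTorch implementation written for this page, not the repository's code or NATTEN's API: the function name and tensor layout are assumptions, it zero-pads at feature-map borders for simplicity (the paper instead clamps each neighborhood inside the feature map), and it is far slower than NATTEN's fused kernels. Setting dilation > 1 yields the dilated variant (DiNA).

```python
import torch
import torch.nn.functional as F

def neighborhood_attention_2d(q, k, v, kernel_size=7, dilation=1):
    """Naive 2D neighborhood attention (illustrative, not the NATTEN kernel).

    q, k, v: (batch, heads, head_dim, height, width).
    Each query attends to a kernel_size x kernel_size window of keys/values
    centered on it; dilation > 1 spaces the window out (DiNA). Borders are
    zero-padded here for simplicity; the paper clamps windows inside the map.
    """
    B, h, C, H, W = q.shape
    pad = dilation * (kernel_size // 2)
    K2 = kernel_size * kernel_size

    def windows(x):
        # Gather each pixel's K2 neighbors: (B*h, C, K2, H*W).
        x = F.unfold(x.reshape(B * h, C, H, W), kernel_size,
                     dilation=dilation, padding=pad)
        return x.reshape(B * h, C, K2, H * W)

    k_win, v_win = windows(k), windows(v)
    q = q.reshape(B * h, C, 1, H * W)

    # Scaled dot-product logits between each query and its K2 neighbors.
    attn = (q * k_win).sum(dim=1) * C ** -0.5        # (B*h, K2, H*W)
    attn = attn.softmax(dim=1)

    out = (attn.unsqueeze(1) * v_win).sum(dim=2)     # (B*h, C, H*W)
    return out.reshape(B, h, C, H, W)

# Tiny smoke test: 4 heads of dim 16 on a 14x14 feature map.
q = torch.randn(2, 4, 16, 14, 14)
k, v = torch.randn_like(q), torch.randn_like(q)
print(neighborhood_attention_2d(q, k, v, kernel_size=7, dilation=2).shape)
# torch.Size([2, 4, 16, 14, 14])
```

In the repository itself these operations are provided by the NATTEN package, which exposes ready-made modules (e.g. NeighborhoodAttention2D with kernel_size and dilation arguments); the exact module names and signatures may vary across NATTEN versions.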