abertsch72/unlimiformer

Public repo for the preprint "Unlimiformer: Long-Range Transformers with Unlimited Length Input"

As of 29 Mar 2024, this repository had 932 stars and 58 forks.

Unlimiformer: Long-Range Transformers with Unlimited Length Input (NeurIPS 2023)

This is the official implementation of the paper: Amanda Bertsch, Uri Alon, Graham Neubig, and Matthew R. Gormley: Unlimiformer: Long-Range Transformers with Unlimited Length Input (NeurIPS 2023).

Unlimiformer is a method for augmenting pretrained encoder-decoder models with retrieval-based attention, without changing the mathematical definition of attention. This allows the use of unlimited-length inputs...
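The core idea described above can be illustrated with a minimal sketch: instead of attending over every encoder hidden state, each decoder query retrieves only its top-k highest-scoring keys and computes a standard softmax attention over just that subset. This is a simplified NumPy illustration of retrieval-based attention, not the repository's actual code; the real Unlimiformer uses a k-NN index over the encoder's hidden states, and the function and variable names here are hypothetical.

```python
import numpy as np

def knn_attention(query, keys, values, k):
    """Simplified sketch of retrieval-based attention: score all keys,
    keep only the top-k, and softmax-attend over that retrieved subset.
    (Unlimiformer itself retrieves via a k-NN index rather than scoring
    every key explicitly; this brute-force version is for illustration.)"""
    scores = keys @ query                 # dot-product score for every key
    topk = np.argsort(scores)[-k:]        # indices of the k best-scoring keys
    s = scores[topk]
    w = np.exp(s - s.max())               # numerically stable softmax...
    w /= w.sum()                          # ...restricted to the retrieved keys
    return w @ values[topk]               # weighted sum of retrieved values

# Usage: a "long input" of 10,000 encoder states, one decoder query.
rng = np.random.default_rng(0)
keys = rng.standard_normal((10_000, 64))
values = rng.standard_normal((10_000, 64))
query = rng.standard_normal(64)
out = knn_attention(query, keys, values, k=16)  # attends over 16 states, not 10,000
```

Because attention is computed only over the retrieved subset, the per-query cost depends on k rather than on the full input length, which is what lets the input grow without bound.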