Stars and forks stats for the abertsch72/unlimiformer repository. As of 29 Mar 2024, this repository has 932 stars and 58 forks.
Unlimiformer: Long-Range Transformers with Unlimited Length Input (NeurIPS 2023)

This is the official implementation of the paper "Unlimiformer: Long-Range Transformers with Unlimited Length Input" by Amanda Bertsch, Uri Alon, Graham Neubig, and Matthew R. Gormley (NeurIPS 2023). Unlimiformer is a method for augmenting pretrained encoder-decoder models with retrieval-based attention, without changing the mathematical definition of attention. This allows the use of unlimited length inputs...
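The core idea of retrieval-based attention can be sketched as follows: instead of attending over every encoder hidden state, each decoder query retrieves only its top-k highest-scoring encoder keys and computes softmax attention over that subset. This is a minimal illustrative sketch, not the actual Unlimiformer implementation (which retrieves from an index over the encoder states of a long, chunk-encoded input); the function name and NumPy setup here are assumptions for the example.

```python
import numpy as np

def topk_retrieval_attention(query, keys, values, k=4):
    """Attention restricted to the k encoder states with the highest
    dot-product score against the query -- a simplified stand-in for
    retrieval-based attention over a very long encoder sequence."""
    scores = keys @ query                       # (seq_len,) raw dot products
    topk = np.argpartition(scores, -k)[-k:]     # indices of the k best keys
    sub = scores[topk] / np.sqrt(len(query))    # scaled scores, retrieved only
    weights = np.exp(sub - sub.max())           # stable softmax over k scores
    weights /= weights.sum()
    return weights @ values[topk]               # (d_model,) attention output
```

Because the softmax is applied only to the retrieved scores, the cost per query is O(k) after retrieval rather than O(seq_len); with k equal to the full sequence length, the result matches standard attention exactly.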
| repo | languages | stars | weekly change | forks | weekly change |
|---|---|---|---|---|---|
| mosaicml/diffusion | Python, Other | 466 | 0 | 48 | 0 |
| h2oai/h2ogpt | Python, TeX, Shell | 8k | 0 | 960 | 0 |
| jasperan/whatsapp-osint | Python | 573 | 0 | 111 | 0 |
| revanced/revanced-api | Python, Dockerfile | 30 | 0 | 5 | 0 |
| QuantumPackage/qp2 | Fortran, OCaml, Python | 62 | 0 | 28 | 0 |
| mpoon/gpt-repository-loader | Python | 2.1k | +5 | 152 | -1 |
| dandelionsllm/pandallm | Python, Jupyter Notebook, Shell | 1k | 0 | 83 | 0 |
| aguirre-valeria/educando-ecommerce | CSS, SCSS, HTML | 2 | 0 | 0 | 0 |
| svelteland/emojimix | Svelte, Python, JavaScript | 31 | 0 | 5 | 0 |
| ZYKsslm/BlackStone_Music_GUI | Python | 183 | 0 | 12 | 0 |