Stars and forks stats for the PiotrNawrot/nanoT5 repository. As of 02 May 2024, this repository has 822 stars and 49 forks.
nanoT5 (Encoder-Decoder / Pre-training + Fine-Tuning)
[Paper] | TLDR | Motivation | Setup | Pre-training | Fine-tuning | Extras | Conclusions | References | Cite | Issues
TLDR: This repository comprises the code to reproduce the pre-training of a "Large Language Model" (T5) under a limited budget (1x A100 GPU, < 24 hours) in PyTorch. We start from the randomly initialised T5-base-v1.1 (248M parameters) model, pre-train it on the English subset of the C4 dataset, and then fine-tune it on Super-Natural...
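The TLDR above describes starting from a randomly initialised T5-base-v1.1 and pre-training it on the English subset of C4. A minimal sketch of that starting point, assuming the Hugging Face `transformers` and `datasets` APIs (this is illustrative, not code taken from the nanoT5 repository itself):

```python
# Sketch: random-init T5-base-v1.1 plus streaming access to C4 (English).
# Assumes `transformers` and `datasets` are installed; not nanoT5's own code.
from transformers import T5Config, T5ForConditionalGeneration, AutoTokenizer
from datasets import load_dataset

# Take the t5-v1_1-base architecture/config, but NOT its pre-trained weights.
config = T5Config.from_pretrained("google/t5-v1_1-base")
model = T5ForConditionalGeneration(config)  # weights are randomly initialised
tokenizer = AutoTokenizer.from_pretrained("google/t5-v1_1-base")

# English subset of C4, streamed so the full corpus never has to fit on disk.
c4_en = load_dataset("allenai/c4", "en", split="train", streaming=True)

# Roughly 248M parameters, matching the figure quoted in the TLDR.
print(f"Parameters: {sum(p.numel() for p in model.parameters()) / 1e6:.0f}M")
```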
repo | languages | stars | stars (weekly change) | forks | forks (weekly change) |
---|---|---|---|---|---|
rogue-hack-lab/translux | Other, Python | 2 | 0 | 3 | 0 |
zen-mod/ZEN | SQF, Pawn, Python | 104 | 0 | 47 | 0 |
sdatkinson/NeuralAmpModelerPlugin | Python, C++, Shell | 1.5k | +5 | 110 | 0 |
PySlurm/pyslurm | Cython, Python, Other | 398 | 0 | 108 | 0 |
catppuccin/minecraft | Python, GLSL | 198 | 0 | 4 | 0 |
navapbc/platform-test-nextjs | HCL, Shell, TypeScript | 2 | 0 | 0 | 0 |
drshahizan/special-topic-data-engineering | Python, JavaScript, CSS | 81 | 0 | 63 | 0 |
Ayanaminn/N46Whisper | Jupyter Notebook, Python | 1.2k | 0 | 99 | 0 |
rpinedaec83/pachaBack | JavaScript, Python, HTML | 0 | 0 | 26 | 0 |
Beomi/KoAlpaca | Jupyter Notebook, Python, Shell | 1.3k | 0 | 219 | 0 |