karpathy/nanoGPT

The simplest, fastest repository for training/finetuning medium-sized GPTs.

Primary language: Python. As of 22 Apr 2024, the repository has 25,133 stars and 3,429 forks.

nanoGPT is a rewrite of minGPT that prioritizes teeth over education. Still under active development, but currently the file train.py reproduces GPT-2 (124M) on OpenWebText, running on a single 8×A100 40GB node in about 4 days of training. The code itself is plain and readable: train.py is a ~300-line boilerplate training loop and model.py a ~300-line GPT model definition, which can optionally load the GPT-2 weights from...
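For scale, the 124M figure can be reproduced with a back-of-envelope parameter count using the published GPT-2 "small" hyperparameters (12 layers, 12 heads, 768-dim embeddings, 50,257-token vocabulary, 1,024-token context). This is an illustrative sketch under those assumptions, not code from the repository:

```python
# Back-of-envelope parameter count for GPT-2 "small" (the 124M model).
# Hyperparameters are the published GPT-2 small config, assumed here.
n_layer, n_embd = 12, 768
vocab_size, block_size = 50257, 1024

wte = vocab_size * n_embd  # token embeddings (tied with the LM head)
wpe = block_size * n_embd  # learned position embeddings

# One transformer block: attention + MLP + two LayerNorms (weights and biases)
attn = (n_embd * 3 * n_embd + 3 * n_embd) + (n_embd * n_embd + n_embd)
mlp = (n_embd * 4 * n_embd + 4 * n_embd) + (4 * n_embd * n_embd + n_embd)
lns = 2 * 2 * n_embd
block = attn + mlp + lns

ln_f = 2 * n_embd  # final LayerNorm
total = wte + wpe + n_layer * block + ln_f
print(f"{total:,} parameters (~{total / 1e6:.0f}M)")  # ~124M
```

Because the LM head is weight-tied to the token embedding, it adds no extra parameters; tools that exclude the position embeddings from the count report a slightly smaller total (~123.65M), which is why quoted figures vary a little.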
| Repo | Languages | Stars | Stars (weekly) | Forks | Forks (weekly) |
| --- | --- | --- | --- | --- | --- |
| kuca-belludo/urnas | Python | 92 | 0 | 17 | 0 |
| apachecn/ailearning | Python, JavaScript, CSS | 36.6k | 0 | 11.3k | 0 |
| Aeternalis-Ingenium/FastAPI-Backend-Template | Python, Dockerfile, Other | 446 | 0 | 66 | 0 |
| AsYetUntitled/Framework | SQF, C++, Python | 237 | 0 | 316 | 0 |
| ilaria-manco/multimodal-ml-music | TeX, Python | 243 | 0 | 10 | 0 |
| corkami/pocs | Assembly, Python, Batchfile | 1.4k | 0 | 202 | 0 |
| DFHack/dfhack | C++, Lua, C | 1.8k | 0 | 459 | 0 |
| mwidera/iac-labs | CSS, HTML, JavaScript | 0 | 0 | 34 | 0 |
| LazyDeveloperr/Mogneius | Dockerfile, Python, Shell | 2 | 0 | 23 | 0 |
| lihop/godot-xterm | GDScript, C++, Python | 73 | 0 | 11 | 0 |