karpathy/llama2.c

Inference Llama 2 in one file of pure C

Languages: C, Python, Jupyter Notebook, Makefile
This is the stars and forks stats page for the karpathy/llama2.c repository. As of 1 May 2024, it has 12,451 stars and 1,208 forks.

llama2.c

Have you ever wanted to inference a baby Llama 2 model in pure C? No? Well, now you can! Train the Llama 2 LLM architecture in PyTorch, then inference it with one simple 700-line C file (run.c). You might think that you need many-billion-parameter LLMs to do anything useful, but in fact very small LLMs can have surprisingly strong performance if you make the domain narrow enough (ref: TinyStories paper). This repo is a "fullstack" train + inference solution for the Llama 2 LLM, with focus...
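The core idea the README describes is a plain autoregressive generation loop: run a forward pass, pick the next token, feed it back in, repeat. The sketch below is only a toy illustration of that loop shape, not the repository's actual code: `forward` here is a made-up bigram table standing in for run.c's transformer forward pass, and `sample_argmax` stands in for its sampling code, so the example compiles on its own.

```c
/* Toy sketch of a token-by-token generation loop (not run.c itself). */
#include <stdio.h>

#define VOCAB 4          /* toy vocabulary size */
#define STEPS 8          /* number of tokens to generate */

/* Toy "model": logits depend only on the previous token (a fixed bigram
 * table). In run.c this step is a full Llama 2 transformer forward pass. */
static void forward(int prev_token, float logits[VOCAB]) {
    static const float table[VOCAB][VOCAB] = {
        {0.1f, 2.0f, 0.3f, 0.1f},
        {0.2f, 0.1f, 1.5f, 0.4f},
        {0.3f, 0.2f, 0.1f, 1.8f},
        {1.2f, 0.3f, 0.2f, 0.1f},
    };
    for (int i = 0; i < VOCAB; i++) logits[i] = table[prev_token][i];
}

/* Greedy sampling: take the highest-logit token. Real inference code
 * typically also offers temperature / top-p sampling, omitted here. */
static int sample_argmax(const float logits[VOCAB]) {
    int best = 0;
    for (int i = 1; i < VOCAB; i++)
        if (logits[i] > logits[best]) best = i;
    return best;
}

int main(void) {
    int token = 0;                     /* start token */
    float logits[VOCAB];
    for (int step = 0; step < STEPS; step++) {
        forward(token, logits);        /* logits for the next position */
        token = sample_argmax(logits); /* choose the next token */
        printf("%d ", token);          /* real code would detokenize */
    }
    printf("\n");
    return 0;
}
```

Compiled with any C compiler (e.g. `cc sketch.c -o sketch`), this prints a short stream of toy token ids; the real run.c additionally loads trained model weights from a checkpoint file and detokenizes the generated tokens into text.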
repo                            techs                                stars   stars (wk)   forks   forks (wk)
liltom-eth/llama2-webui         Jupyter Notebook, Python, Roff       1.7k    0            162     0
langchain-ai/langchain          Python, Jupyter Notebook, Makefile   64.3k   +48          29.1k   +122
unode/firefox_decrypt           Python                               1.6k    0            285     0
wlandau/instantiate             R, Shell, C                          17      0            1       0
cpfiffer/julia-deeplearning     HTML, SCSS, Julia                    28      +1           7       0
Miyamura80/ZKMicrophone         Solidity, Ruby, TypeScript           69      0            1       0
juxtopposed/stacksorted         JavaScript, Svelte, CSS              101     0            27      0
SITU-Research/codec             Svelte, JavaScript, CSS              45      0            6       0
adobe/spectrum-web-components   TypeScript, CSS, JavaScript          993     0            168     0
TinyVG/sdk                      Zig, C#, JavaScript                  227     0            8       0