jzhang38/TinyLlama

The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.

Languages: Python, Shell
These are the stars and forks stats for the jzhang38/TinyLlama repository. As of 07 Dec 2023, the repository has 2,953 stars and 117 forks.

TinyLlama-1.1B

The TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens. With some proper optimization, this can be achieved within a span of "just" 90 days using 16 A100-40G GPUs πŸš€πŸš€. Training started on 2023-09-01. TinyLlama adopts exactly the same architecture and tokenizer as Llama 2, so it can be plugged into many open-source projects built upon Llama. It is also compact, with only 1.1B parameters. This...
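For scale, the stated budget works out to roughly 3 Γ— 10ΒΉΒ² tokens / (90 days Γ— 24 h Γ— 3600 s Γ— 16 GPUs) β‰ˆ 24,000 tokens per second per GPU. Because TinyLlama shares Llama 2's architecture and tokenizer, it loads through the standard Llama code paths in Hugging Face transformers. Below is a minimal sketch (not taken from the TinyLlama README) of what that drop-in use looks like; the model id is an assumption, so substitute whichever published TinyLlama checkpoint you actually intend to use.

```python
# Sketch: loading a TinyLlama checkpoint with the standard Hugging Face Llama classes.
# The model id below is an assumed/hypothetical checkpoint name, not confirmed by the README.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumption: replace with the checkpoint you use

tokenizer = AutoTokenizer.from_pretrained(model_id)      # same tokenizer as Llama 2
model = AutoModelForCausalLM.from_pretrained(model_id)   # resolves to the Llama architecture

# Quick generation check
inputs = tokenizer("TinyLlama is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the architecture matches Llama 2, the same pattern applies to other Llama-compatible tooling; only the checkpoint name changes.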
repo | techs | stars | weekly | forks | weekly
aigc-apps/sd-webui-EasyPhoto | Python | 2.5k | 0 | 179 | 0
ReshotAI/gaussian-painters | Python | 267 | 0 | 16 | 0
georgian-io/LLM-Finetuning-Hub | Python, Shell, Dockerfile | 374 | +6 | 26 | +1
botesjuan/Burp-Suite-Certified-Practitioner-Exam-Study | Python | 675 | 0 | 183 | 0
Probesys/lotemplate | Rich Text Format, Python, HTML | 18 | 0 | 1 | 0
cherryramatisdev/4noobs_tracker | Ruby, HTML, CSS | 31 | 0 | 5 | 0
kentonv/lanparty | Shell | 105 | 0 | 5 | 0
avehtari/BDA_course_Aalto | TeX, R, Lua | 2.1k | 0 | 547 | 0
robotgradianta/dotfiles | Vim Script, Shell | 0 | 0 | 0 | 0
conda-forge/rasam-feedstock | Batchfile, Shell | 0 | 0 | 0 | 0