juncongmoo/chatllama

ChatLLaMA 📢 Open source implementation for LLaMA-based ChatGPT runnable on a single GPU. 15x faster training process than ChatGPT

Primary language: Python

As of 26 Apr 2024, this repository has 1,144 stars and 142 forks.

🔥 Please check pyllama for LLaMA installation and single-GPU inference setup. 🔥 To train ChatGPT in 5 minutes, see minichatgpt.

Meta has recently released LLaMA, a collection of foundational large language models ranging from 7 to 65 billion parameters. LLaMA is creating a lot of excitement because it is smaller than GPT-3 but has better performance. For example, LLaMA's 13B...
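As a rough sketch of the pyllama setup the description points to, the commands below install pyllama and fetch the 7B weights. The `llama.download` module path and the `--model_size` flag are assumptions based on the pyllama README; verify them against the project's current documentation before running.

```shell
# Hedged sketch: install pyllama and download the smallest LLaMA checkpoint.
# The module path (llama.download) and the --model_size flag are assumptions
# taken from the pyllama README; check the project's docs before use.
pip install pyllama -U

# Fetch the 7B LLaMA weights — the smallest variant, and the one most
# likely to fit a single consumer GPU for inference.
python -m llama.download --model_size 7B
```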
| Repo | Languages | Stars | Stars (weekly) | Forks | Forks (weekly) |
|---|---|---|---|---|---|
| krishnaik06/mlproject | Jupyter Notebook, Python, HTML | 218 | +3 | 356 | +9 |
| plotchy/defi-detective | Solidity, Svelte, Rust | 59 | 0 | 5 | 0 |
| slatinsky/DiscordChatExporter-frontend | Svelte, Python, TypeScript | 239 | +6 | 9 | 0 |
| thewtex/dotfiles | Vim Script, Shell, Lua | 2 | 0 | 0 | 0 |
| ashpil/moonshine | Zig, HLSL, Python | 89 | 0 | 2 | 0 |
| WordAsImage/Word-As-Image | Python, Shell | 1k | 0 | 71 | 0 |
| snesrev/sm | C, Python, Other | 401 | +13 | 0 | 0 |
| unum-cloud/ujrpc | C++, C, Python | 855 | 0 | 23 | 0 |
| SteveMacenski/slam_toolbox | C++, CMake, Python | 1.2k | +7 | 424 | 0 |
| sifis-home/wp5-cicd-example | Dockerfile, Python | 0 | 0 | 1 | 0 |