randaller/llama-chat

Chat with Meta's LLaMA models at home made easy

Python

Stars and forks stats for the randaller/llama-chat repository: as of 27 Apr 2024, it has 800 stars and 113 forks.

This repository is a chat example with LLaMA (arXiv) models running on a typical home PC. You will just need an NVIDIA video card and some RAM to chat with the model. By using the HF version, you may fine-tune the model to any desired task.

Copyright

This repo is heavily based on Meta's original repo: https://github.com/facebookresearch/llama, on Steve Manuatu's repo: https://github.com/venuatu/llama, and on Shawn Presser's repo: https://github.com/shawwn/llama. HF...
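As a rough illustration of how one might chat with an HF-converted LLaMA checkpoint, here is a minimal sketch using the Hugging Face transformers library. This is not the repo's own code: the model path, the `User:`/`Bot:` prompt format, and the helper names are assumptions for illustration only.

```python
# Hypothetical sketch: chatting with a locally HF-converted LLaMA checkpoint
# via Hugging Face transformers. The prompt format and model path are
# illustrative assumptions, not taken from the llama-chat repo itself.

def build_prompt(history, user_message):
    """Flatten prior (user, bot) turns plus the new message into one prompt."""
    lines = []
    for user, bot in history:
        lines.append(f"User: {user}")
        lines.append(f"Bot: {bot}")
    lines.append(f"User: {user_message}")
    lines.append("Bot:")
    return "\n".join(lines)

def generate_reply(model_path, history, user_message, max_new_tokens=128):
    # Heavy imports kept local so build_prompt() works without torch installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_path)
    model = AutoModelForCausalLM.from_pretrained(
        model_path, torch_dtype=torch.float16, device_map="auto")
    inputs = tokenizer(build_prompt(history, user_message),
                       return_tensors="pt").to(model.device)
    with torch.no_grad():
        output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True).strip()

if __name__ == "__main__":
    # Prompt construction works standalone; generation needs local weights,
    # e.g. generate_reply("./llama-7b-hf", [], "Hello!")
    print(build_prompt([("Hi", "Hello!")], "How are you?"))
```

The half-precision load (`torch_dtype=torch.float16`) plus `device_map="auto"` is one common way to fit a 7B-parameter model on a single consumer GPU, spilling to CPU RAM when needed.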
| Repo | Languages | Stars | Weekly | Forks | Weekly |
|---|---|---|---|---|---|
| wangshub/Douyin-Bot | Python | 8.8k | 0 | 1.9k | 0 |
| FuroBath/ArtifacterImageGen | Python | 55 | 0 | 37 | 0 |
| tp7309/TTDeDroid | Python, Shell, Batchfile | 738 | 0 | 162 | 0 |
| ZhUyU1997/open-pose-editor | TypeScript, CSS, Python | 569 | 0 | 140 | 0 |
| efabless/openlane2 | Python, Tcl, Nix | 70 | 0 | 6 | 0 |
| PacktPublishing/Modern-Computer-Architecture-and-Organization-Second-Edition | Assembly, Python, VHDL | 129 | 0 | 32 | 0 |
| idaholab/malamute | C++, Python, Makefile | 7 | 0 | 17 | 0 |
| manateelazycat/mind-wave | Emacs Lisp, Python | 147 | 0 | 22 | 0 |
| ggerganov/llama.cpp | C, C++, Cuda | 42.1k | +338 | 5.9k | +51 |
| dukelec/cd_pnp | CSS, JavaScript, Python | 161 | +5 | 24 | +1 |