a16z-infra/cog-llama-template

LLaMA Cog template

As of 11 May 2024, the a16z-infra/cog-llama-template repository has 261 stars and 38 forks.

🦙 NOTE: This is an experimental branch that depends on exllama. For now, you should:

git clone https://github.com/turboderp/exllama
cd exllama
git checkout e8a544f95b3fd64dfa5549eeeafb85b1ac71a793

We're working on a proper integration.

This Cog template works with LLaMA 1 and 2. LLaMA is an open-source language model from Meta Research that performs as well as closed-source models. This is a guide to running LLaMA in the cloud using Replicate. You'll use the...
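After cloning exllama at the pinned commit, a Cog template like this one is typically exercised locally before pushing to Replicate. A minimal sketch, assuming the template's predictor exposes a `prompt` input (treat the input name as an assumption; check the template's predict.py):

```shell
# Build the container image defined by the template's cog.yaml
cog build

# Run a single local prediction; -i sets named inputs on the predictor.
# `prompt` is assumed to be the input this LLaMA template exposes.
cog predict -i prompt="Tell me a fact about llamas."
```

Pushing the built model to Replicate is then a matter of `cog push` against a model you have created there.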