facebookresearch/llama

Inference code for LLaMA models

Languages: Python, Shell
Star and fork statistics for the facebookresearch/llama repository. As of 19 Apr 2024, it has 43,304 stars and 7,282 forks.

Llama 2

We are unlocking the power of large language models. Our latest version of Llama is now accessible to individuals, creators, researchers and businesses of all sizes so that they can experiment, innovate and scale their ideas responsibly.

This release includes model weights and starting code for pretrained and fine-tuned Llama language models, ranging from 7B to 70B parameters. This repository is intended as a minimal example to load Llama 2 models and run inference. For more detailed examples...
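The "minimal example" workflow can be sketched as a shell session. The script name and flags below follow the repository's documented examples, but the checkpoint directory and tokenizer path are placeholders: the weights are not public downloads and must first be requested from Meta.

```shell
# Install the package and its dependencies from a clone of the repo.
pip install -e .

# After obtaining the weights via Meta's request form, run the
# pretrained text-completion example on a single GPU.
# llama-2-7b/ and tokenizer.model are placeholder paths.
torchrun --nproc_per_node 1 example_text_completion.py \
    --ckpt_dir llama-2-7b/ \
    --tokenizer_path tokenizer.model \
    --max_seq_len 128 \
    --max_batch_size 4
```

Note that `--nproc_per_node` must match the model-parallel size of the checkpoint (1 for the 7B model, larger values for the bigger variants), since each checkpoint shard is loaded by one process.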
| Repo | Techs | Stars | Weekly | Forks | Weekly |
|---|---|---|---|---|---|
| derrian-distro/LoRA_Easy_Training_Scripts | Python, Other | 690 | 0 | 75 | 0 |
| twitter/communitynotes | Python | 889 | 0 | 134 | 0 |
| coolzilj/Blender-ControlNet | Python | 697 | 0 | 68 | 0 |
| Christbowel/CVE-2023-25136 | Python | 104 | 0 | 23 | 0 |
| oobabooga/text-generation-webui | Python, CSS, JavaScript | 24.9k | +317 | 3.3k | +57 |
| facebookresearch/fairscale | Python, Other | 2.6k | 0 | 246 | 0 |
| buildkite/docs | Ruby, SCSS, HTML | 36 | 0 | 246 | +1 |
| rust-nostr/nostr | Rust, Makefile, JavaScript | 271 | 0 | 58 | 0 |
| primitivefinance/solstat | Solidity, TypeScript, Shell | 112 | 0 | 5 | 0 |
| epfml/OptML_course | Jupyter Notebook, TeX, Python | 896 | 0 | 276 | 0 |