hlhr202/llama-node

Believe in AI democratization. llama-node is a Node.js library backed by llama-rs, llama.cpp, and rwkv.cpp that runs locally on your laptop CPU. It supports llama/alpaca/gpt4all/vicuna/rwkv models.

Languages: Rust, TypeScript, JavaScript, Python, CSS, MDX, Other.
Topics: nodejs, ai, embeddings, llama, gpt, napi, napi-rs, large-language-models, llm, rwkv, langchain, llamacpp, llama-rs, llama-node.
This is the stars and forks stats page for the hlhr202/llama-node repository. As of 29 Apr 2024, the repository has 777 stars and 56 forks.

LLaMA Node (llama-node): a Node.js library for large language models. Official documentation available; picture generated by Stable Diffusion.

Contents: Introduction, Supported models, Supported platforms, Installation, Manual compilation, CUDA support, Acknowledgments (Models/Inferencing tools dependencies, Some source code comes from), Community.

Introduction: This project is in an early stage and is not production ready; we do not follow semantic versioning. The API for Node.js may change in the future, use it with...
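To give a sense of how the library is used, here is a minimal sketch of loading a local GGML model and streaming a completion from Node.js. The import paths, class names, and option names follow the project's documented llama.cpp-backend examples but are assumptions here and may differ between llama-node versions; the model filename is hypothetical.

```ts
// Sketch only: import paths, class names, and options are assumptions based on
// llama-node's documented llama.cpp backend example; verify against the version you install.
import { LLM } from "llama-node";
import { LLamaCpp } from "llama-node/dist/llm/llama-cpp.js";
import path from "path";

const llama = new LLM(LLamaCpp);

async function main() {
  // Path to a locally downloaded GGML model file (hypothetical filename).
  const modelPath = path.resolve(process.cwd(), "./ggml-vicuna-7b-q4.bin");

  // Load the model into memory for CPU inference.
  await llama.load({ modelPath, enableLogging: true, nCtx: 1024, seed: 0, useMmap: true });

  // Stream generated tokens to stdout as they are produced.
  await llama.createCompletion(
    {
      prompt: "Explain AI democratization in one sentence.",
      nThreads: 4,
      nTokPredict: 128,
      temp: 0.2,
    },
    (response) => process.stdout.write(response.token)
  );
}

main();
```

The same LLM wrapper is reported to accept other backends (llama-rs, rwkv.cpp) by swapping the backend class passed to the constructor, again subject to version differences.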
| Repo | Techs | Stars | Stars weekly | Forks | Forks weekly |
| --- | --- | --- | --- | --- | --- |
| supabase/edge-runtime | TypeScript, Rust, JavaScript | 430 | +2 | 31 | +1 |
| YJDoc2/8086-Emulator | Rust, RenderScript | 266 | 0 | 39 | 0 |
| rust-lang/log | Rust | 1.9k | +10 | 237 | -1 |
| valory-xyz/autonolas-governance | Solidity, JavaScript, Other | 8 | 0 | 0 | 0 |
| InternetMaximalism/intmax-interoperability-plugin | Solidity, TypeScript, Rust | 6 | 0 | 4 | 0 |
| oddsdk/odd-app-template | Svelte, CSS, TypeScript | 27 | 0 | 9 | 0 |
| mozilla/glean-dictionary | Svelte, Python, JavaScript | 22 | 0 | 42 | 0 |
| eumemic/ai-legion | TypeScript, JavaScript | 1.2k | 0 | 141 | 0 |
| lgrammel/gptagent.js | TypeScript, JavaScript, Shell | 206 | 0 | 32 | 0 |
| dabit3/nextjs-chatgpt-plugin-starter | TypeScript, JavaScript | 206 | +1 | 24 | 0 |