kennethleungty/Llama-2-Open-Source-LLM-CPU-Inference

Running Llama 2 and other Open-Source LLMs on CPU Inference Locally for Document Q&A

Language: Python
Topics: python, nlp, machine-learning, natural-language-processing, cpu, deep-learning, transformers, llama, language-models, faiss, sentence-transformers, cpu-inference, large-language-models, llm, chatgpt, langchain, document-qa, open-source-llm, c-transformers, llama-2
These are the stars and forks stats for the kennethleungty/Llama-2-Open-Source-LLM-CPU-Inference repository. As of 05 May 2024, the repository has 820 stars and 153 forks.

Clearly explained guide for running quantized open-source LLM applications on CPUs using Llama 2, C Transformers, GGML, and LangChain.

Step-by-step guide on TowardsDataScience: https://towardsdatascience.com/running-llama-2-on-cpu-inference-for-document-q-a-3d636037a3d8

Context

Third-party commercial large language model (LLM) providers like OpenAI's GPT4 have democratized LLM use via simple API calls. However, ...
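As a rough illustration of how these pieces can be wired together for CPU-only document Q&A, here is a minimal sketch (not the repo's exact code), assuming `langchain`, `ctransformers`, `sentence-transformers`, `faiss-cpu`, and `pypdf` are installed; the file paths, embedding model, and generation parameters are illustrative assumptions.

```python
# Minimal sketch: document Q&A on CPU with a quantized Llama 2 GGML model
# via C Transformers, FAISS, sentence-transformers, and LangChain.
from langchain.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.vectorstores import FAISS
from langchain.llms import CTransformers
from langchain.chains import RetrievalQA

# 1. Load and chunk a source document (hypothetical file path)
docs = PyPDFLoader("data/manual.pdf").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

# 2. Embed the chunks with a sentence-transformers model and index them in FAISS
embeddings = HuggingFaceEmbeddings(
    model_name="sentence-transformers/all-MiniLM-L6-v2",
    model_kwargs={"device": "cpu"},
)
vectorstore = FAISS.from_documents(chunks, embeddings)

# 3. Load a locally downloaded, quantized Llama 2 GGML model on CPU via C Transformers
llm = CTransformers(
    model="models/llama-2-7b-chat.ggmlv3.q8_0.bin",  # illustrative local GGML file
    model_type="llama",
    config={"max_new_tokens": 256, "temperature": 0.01},
)

# 4. Retrieval-augmented Q&A over the indexed document
qa = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=vectorstore.as_retriever(search_kwargs={"k": 2}),
)
print(qa.run("What does the document say about warranty coverage?"))
```

The same pattern works for other GGML-format open-source models by swapping the model file and `model_type`; everything above runs entirely on CPU.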
| Repo | Techs | Stars | Stars (weekly) | Forks | Forks (weekly) |
|---|---|---|---|---|---|
| Jamie-Stirling/RetNet | Python | 893 | 0 | 82 | 0 |
| JayZeeDesign/researcher-gpt | Python | 363 | 0 | 207 | 0 |
| psychic-api/rag-stack | TypeScript, Python, HCL | 1.1k | 0 | 90 | 0 |
| Fadi002/unshackle | Shell, Python, C++ | 1.5k | 0 | 85 | 0 |
| facebookresearch/amortized-optimization-tutorial | TeX, Python | 212 | 0 | 12 | 0 |
| microsoft/TypeChat | TypeScript, Nunjucks, JavaScript | 6.8k | +61 | 301 | +2 |
| mattzcarey/code-review-gpt | TypeScript, JavaScript, CSS | 866 | +25 | 60 | +4 |
| apptension/saas-boilerplate | TypeScript, Python, MDX | 1.1k | 0 | 93 | 0 |
| YvanYin/Metric3D | Python, Shell | 294 | 0 | 12 | 0 |
| Maknee/minigpt4.cpp | C++, Python, CMake | 482 | +4 | 17 | +2 |