ItsPi3141/alpaca-electron

The simplest way to run Alpaca (and other LLaMA-based local LLMs) on your own computer

Languages: JavaScript, HTML, Python, CSS, Dockerfile

Topics: electron, windows, macos, linux, chat, docker, app, ui, ai, electron-app, webui, llama, alpaca, chatgpt, dalai, alpaca-electron
These are the stars and forks stats for the ItsPi3141/alpaca-electron repository. As of 28 Apr 2024, it has 1,193 stars and 132 forks.

Alpaca Electron

Alpaca Electron is built from the ground up to be the easiest way to chat with the Alpaca AI models. No command line or compiling needed!

📃 Features + to-do

- Runs locally on your computer; an internet connection is not needed except when downloading models
- Compact and efficient, since it uses llama.cpp as its backend (which supports Alpaca & Vicuna too)
- Runs on CPU, so anyone can run it without an expensive graphics card
- No external dependencies...