rhohndorf/Auto-Llama-cpp

Uses Auto-GPT with Llama.cpp

Languages: Python, Dockerfile
This is the stars and forks stats for the rhohndorf/Auto-Llama-cpp repository. As of 29 Mar 2024, this repository has 333 stars and 58 forks.

Auto-Llama-cpp: An Autonomous Llama Experiment

This is a fork of Auto-GPT with added support for locally running llama models through llama.cpp. This is more of a proof of concept: it's sloooow, and most of the time you're fighting with the context window being too small or with the model's answer not being valid JSON. But sometimes it works, and then it's really quite magical what even such a small model comes up with. Obviously, don't expect GPT-4 brilliance here.

Supported Models
Since this uses llama.cpp...
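The excerpt above points at two practical issues with running the agent on a local model: the context window is small, and the model's reply is often not valid JSON even though Auto-GPT expects a JSON-formatted command. The sketch below is not taken from the repository; it assumes the llama-cpp-python bindings and a hypothetical local model path, and only shows the basic load, prompt, and JSON-parse step where that failure mode shows up.

```python
# Minimal sketch (not from the repo): load a local llama.cpp model via
# llama-cpp-python, ask for a JSON reply, and handle the malformed-JSON case.
import json

from llama_cpp import Llama

# Model path and context size are illustrative placeholders.
llm = Llama(model_path="models/ggml-model-q4_0.bin", n_ctx=2048)

prompt = (
    "You are an autonomous agent. Reply ONLY with JSON of the form "
    '{"thoughts": "...", "command": {"name": "...", "args": {}}}.\n'
    "Goal: summarise the file README.md"
)

result = llm(prompt, max_tokens=256, stop=["\n\n"])
text = result["choices"][0]["text"]

try:
    reply = json.loads(text)
except json.JSONDecodeError:
    # Small local models frequently return malformed JSON, which is the
    # failure mode the README describes; a real agent would retry or repair.
    reply = None

print(reply)
```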
| Repo | Techs | Stars | Stars (weekly) | Forks | Forks (weekly) |
| --- | --- | --- | --- | --- | --- |
| Doriandarko/BabyAGIChatGPT | Python | 358 | 0 | 56 | 0 |
| tudat-team/tudat-resources-feedstock | Batchfile, Shell, Python | 3 | 0 | 1 | 0 |
| noQ-sweden/noq | Java, TypeScript, Bicep | 8 | 0 | 0 | 0 |
| Azure/aca-dotnet-workshop | Bicep, C#, HTML | 57 | 0 | 30 | 0 |
| rvs/planDC | Dockerfile, Shell | 70 | 0 | 1 | 0 |
| mlesniew/PicoMQTT | C++, Python, C | 170 | +1 | 19 | 0 |
| daquexian/onnx-simplifier | C++, Python, CMake | 3.2k | 0 | 354 | 0 |
| ColdBox/Building-Human-Friendly-Scheduled-Tasks | ColdFusion, Dockerfile | 0 | 0 | 0 | 0 |
| gustavo-depaula/stalin-sort | Coq, Assembly, Python | 1.3k | +4 | 164 | 0 |
| claytondukes/autogpt-docker | Dockerfile | 35 | 0 | 9 | 0 |