Me163/rusty_llama

A simple ChatGPT clone with both the frontend and backend written in Rust. Uses open-source language models and TailwindCSS.

Languages: Rust, TypeScript, Other
Stars and forks statistics for the Me163/rusty_llama repository: as of 10 May 2024 it has 231 stars and 37 forks.

Rusty Llama Webapp

A webapp that showcases how to build a simple chatbot using only Rust, TailwindCSS, and an open-source language model such as a variant of GPT, LLaMA, etc.

Setup Instructions

Hardware

By default, the project has Apple's Metal acceleration enabled. If you are not on a macOS system, you may need to disable the metal feature in Cargo.toml. Similarly, if you are on a system with an Nvidia GPU, you may need to add CUDA as a feature (I haven't tested this, anyone who...
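The feature switches mentioned above live in Cargo.toml. A minimal sketch of what that section could look like, assuming the model backend is the rustformers llm crate and that its metal and cublas features are simply forwarded; the dependency spec and feature names in the actual repository may differ:

    # Cargo.toml (sketch) -- the feature names below are assumptions,
    # not copied from the repository.
    [dependencies]
    llm = { git = "https://github.com/rustformers/llm", default-features = false }

    [features]
    default = ["metal"]     # Apple Metal acceleration, on by default
    metal = ["llm/metal"]   # remove "metal" from `default` on non-macOS systems
    cuda = ["llm/cublas"]   # untested: enable for Nvidia GPUs via cuBLAS

With a layout like this, a non-macOS build could be produced with cargo build --no-default-features, optionally adding --features cuda on an Nvidia system.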
repo | techs | stars | stars (weekly) | forks | forks (weekly)
openwebf/webf | Dart, C++, JavaScript | 1k | 0 | 80 | 0
bancolombia/async-dataflow | Elixir, TypeScript, Dart | 14 | 0 | 6 | 0
OnurGumus/FunPizzaShop | F#, CSS, JavaScript | 25 | 0 | 1 | 0
fantom-lang/fantom | Fantom, C#, Java | 104 | 0 | 26 | 0
SWMFsoftware/GITM2 | Fortran, Python, IDL | 0 | 0 | 0 | 0
letianzj/quanttrader | HTML, Python, JavaScript | 358 | 0 | 90 | 0
weaviate/weaviate-io | MDX, Python, JavaScript | 36 | 0 | 94 | 0
Developer-DAO/academy | MDX, TypeScript, Other | 104 | -1 | 49 | 0
facebookresearch/playtorch | MDX, TypeScript, C++ | 806 | 0 | 105 | 0
coq-community/vscoq | OCaml, TypeScript, Coq | 263 | 0 | 54 | 0