Stars and forks statistics for the jankais3r/LLaMA_MPS repository. As of 19 April 2024, this repository has 572 stars and 47 forks.
# LLaMA_MPS

Run LLaMA (and Stanford-Alpaca) inference on Apple Silicon GPUs. As you can see, unlike other LLMs, LLaMA is not biased in any way 😄

## Initial setup steps

1. Clone this repo: `git clone https://github.com/jankais3r/LLaMA_MPS`
2. Install Python dependencies: `cd LLaMA_MPS`, then `pip3 install virtualenv`, `python3 -m venv env`, `source env/bin/activate`, `pip3 install -r requirements.txt`, `pip3 install -e .`

## LLaMA-specific setup

3. Download the model weights and put them into a folder called `models` (e.g., `LLaMA_MPS/models/7B`)
4. …
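The setup steps above assume the model weights end up under `models/7B` inside the cloned repo. A minimal pre-flight check like the sketch below (not part of the repo; the path is taken from the README's example, and other sizes such as `13B` would use the same pattern) can confirm the layout before attempting inference:

```shell
# Sketch: verify the weights folder the README expects exists and is non-empty.
# MODEL_DIR follows the README's example (LLaMA_MPS/models/7B); this script only
# checks the directory, it does not download anything.
MODEL_DIR="models/7B"
if [ -d "$MODEL_DIR" ] && [ -n "$(ls -A "$MODEL_DIR" 2>/dev/null)" ]; then
  STATUS="ready"
else
  STATUS="missing weights in $MODEL_DIR"
fi
echo "$STATUS"
```

Running this from the repo root prints `ready` once the weights are in place, or a `missing weights` message otherwise.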
| Repo | Languages | Stars | Stars (weekly Δ) | Forks | Forks (weekly Δ) |
|---|---|---|---|---|---|
| kivy/kivy | Python, Cython, C | 16.3k | 0 | 3.1k | 0 |
| hyperonym/basaran | Python, JavaScript, CSS | 1.2k | 0 | 70 | 0 |
| lucidrains/gigagan-pytorch | Python | 1.5k | 0 | 68 | 0 |
| 100daysofdevops/N-days-of-automation | Shell, Python, Dockerfile | 75 | 0 | 80 | 0 |
| noobnooc/ohmygpt | TypeScript, JavaScript, Shell | 238 | 0 | 141 | 0 |
| cosin2077/chaty | TypeScript, CSS, JavaScript | 451 | 0 | 42 | 0 |
| retrage/gpt-macro | Rust | 483 | 0 | 10 | 0 |
| toumorokoshi/toumorokoshi.github.com | SCSS, HTML, Python | 4 | 0 | 1 | 0 |
| clabby/substratum | Solidity, Python | 58 | 0 | 5 | 0 |
| ztjhz/FreeChatGPT | TypeScript, CSS, JavaScript | 6.1k | +53 | 2.1k | +26 |