Stars and forks statistics for the /bigscience-workshop/petals repository. As of 27 Apr 2024, this repository has 7,655 stars and 372 forks.
Run large language models at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading. Generate text with distributed Llama 2 (70B), Falcon (40B+), BLOOM (176B) (or their derivatives), and fine-tune them for your own tasks, right from your desktop computer or Google Colab:

```python
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

# Choose any model available at https://health.petals.dev
model_name = "petals-team/StableBeluga2"
...
```
| repo | languages | stars | weekly Δ | forks | weekly Δ |
|---|---|---|---|---|---|
| EnableSecurity/wafw00f | Python, Other | 4.5k | 0 | 885 | 0 |
| KartoffelToby/better_thermostat | Python, SCSS, Shell | 478 | 0 | 84 | 0 |
| xianyuyimu/TVBOX- | JavaScript, Python | 863 | 0 | 154 | 0 |
| kwindrem/GuiMods | Python, QML, CSS | 140 | 0 | 19 | 0 |
| znah/deepdream_c | SourcePawn, C, Python | 205 | 0 | 24 | 0 |
| loia5tqd001/Dang-Ky-Hoc-Phan-UIT | SCSS, TypeScript, Other | 178 | 0 | 9 | 0 |
| rkalis/ethroulette | Solidity, TypeScript, JavaScript | 47 | 0 | 25 | 0 |
| bsg-external/ee477-hammer-cad | Tcl, Python, Makefile | 0 | 0 | 0 | 0 |
| Ruzim/NSFC-application-template-latex | TeX, Other | 506 | +5 | 116 | 0 |
| stevenpetryk/mafs | TypeScript, CSS, Other | 3.1k | 0 | 73 | 0 |