These are the stars and forks statistics for the InternLM/xtuner repository. As of 28 Apr 2024, this repository has 419 stars and 35 forks.
English | Simplified Chinese

Join us on Twitter, Discord and WeChat

News
[2023.09.20] Support InternLM-20B models!
[2023.09.06] Support Baichuan2 models!
[2023.08.30] XTuner is released, with multiple fine-tuned adapters on HuggingFace.

Introduction
XTuner is a toolkit for efficiently fine-tuning LLMs, developed by the MMRazor and MMDeploy teams.
Efficiency: Supports LLM fine-tuning on consumer-grade GPUs. The minimum GPU memory required for 7B LLM fine-tuning is only 8 GB, meaning that users can use nearly...
repo | languages | stars | stars (weekly) | forks | forks (weekly) |
---|---|---|---|---|---|
FederatedAI/eggroll | Python, Scala, Java | 240 | 0 | 70 | 0 |
cmsc330fall23/cmsc330fall23 | Python, OCaml, Shell | 49 | 0 | 19 | 0 |
cofactoryai/textbase | Python | 1.3k | 0 | 360 | 0 |
estebanpdl/osintgpt | Python | 228 | 0 | 32 | 0 |
sgammon/rules_graalvm | Starlark, Python, Other | 12 | 0 | 5 | 0 |
luban-agi/Awesome-AIGC-Tutorials | | 1.9k | 0 | 103 | 0 |
dwrensha/acronymy-assistant | AGS Script, JavaScript, Python | 22 | 0 | 2 | 0 |
Poing-Studios/godot-admob-plugin | GDScript, Python | 147 | 0 | 11 | 0 |
tastypepperoni/PPLBlade | Go, Python | 327 | 0 | 38 | 0 |
zenml-io/mlstacks | HCL, Python, Shell | 220 | 0 | 20 | 0 |