mymusise/ChatGLM-Tuning

An affordable ChatGPT implementation, based on ChatGLM-6B + LoRA.

Languages and topics: Python, Jupyter Notebook, lora, peft, chatgpt, chatglm
These are the stars and forks stats for the mymusise/ChatGLM-Tuning repository. As of 29 Apr 2024, it has 3,345 stars and 412 forks.

ChatGLM-Tuning: an affordable ChatGPT implementation that finetunes Tsinghua's ChatGLM-6B with LoRA.

Dataset: alpaca. If you have Colab access, you can try it directly on Colab. Related links: the official ptuning code, a Demo, and an open-source take on Wenxin Yiyan (文心一言).

S1 Finetune

Preparation
- GPU: >= 16 GB of VRAM (24 GB or more recommended)
- Environment: python >= 3.8; CUDA >= 11.6 plus CUPTI, cuDNN, TensorRT, and the other usual deep-learning dependencies
- pip3 install -r requirements.txt

Data preprocessing

Convert the alpaca dataset to jsonl:

    python cover_alpaca2jsonl.py \
        --data_path data/alpaca_data.json \
        --save_path data/alpaca_data.jsonl

Tokenization:

    python tokenize_dataset_rows.py \
        --jsonl_path data/alpaca_data.jsonl \
        ...
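For context, here is a minimal sketch of what the alpaca-to-jsonl conversion step typically does, assuming the standard Stanford Alpaca record layout (instruction/input/output). The context/target field names and the prompt template below are illustrative assumptions; cover_alpaca2jsonl.py in the repository defines the exact format.

```python
# Hypothetical sketch of the alpaca -> jsonl conversion step.
# Field names ("context"/"target") and the prompt template are assumptions;
# the repository's cover_alpaca2jsonl.py defines the actual output format.
import json

def format_example(example: dict) -> dict:
    # Alpaca records carry "instruction", an optional "input", and "output".
    context = f"Instruction: {example['instruction']}\n"
    if example.get("input"):
        context += f"Input: {example['input']}\n"
    context += "Answer: "
    return {"context": context, "target": example["output"]}

with open("data/alpaca_data.json", encoding="utf-8") as f:
    examples = json.load(f)

# Write one JSON object per line (jsonl).
with open("data/alpaca_data.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(format_example(example), ensure_ascii=False) + "\n")
```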
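The finetune itself attaches LoRA adapters to ChatGLM-6B through the peft library (both appear in the repository's topics). Below is a minimal sketch assuming the Hugging Face transformers/peft APIs; the rank, alpha, dropout, and target module names are illustrative guesses, not necessarily the repository's exact settings.

```python
# Minimal LoRA-on-ChatGLM-6B sketch using transformers + peft.
# Hyperparameters (r, lora_alpha, lora_dropout, target_modules) are assumptions
# for illustration; see the repository's finetune script for the real values.
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                  # low-rank dimension of the adapters
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["query_key_value"],   # ChatGLM's fused attention projection
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()        # only the LoRA adapter weights are trainable
```

Because only the adapter weights are updated while the 6B-parameter base model stays frozen, a single GPU with 16-24 GB of VRAM (as listed in the requirements above) is enough for training.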