PhoebusSi/Alpaca-CoT

We unified the interfaces of instruction-tuning data (e.g., CoT data), multiple LLMs, and parameter-efficient methods (e.g., LoRA, P-Tuning) for easy use, building an LLM-IFT research and usage platform that researchers can get started with quickly. Meanwhile, we created a new branch to build a Tabular LLM. We welcome open-source contributors to open any meaningful PR on this repo and help integrate as many LLM-related techniques as possible.
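As a rough illustration of the parameter-efficient side of such a platform, here is a minimal sketch that attaches a LoRA adapter to a causal language model using the Hugging Face transformers and peft libraries. The checkpoint name and LoRA hyperparameters are illustrative placeholders, not the project's actual configuration:

```python
# Minimal sketch: wrap a causal LM with a LoRA adapter for instruction tuning.
# Checkpoint name and hyperparameters are placeholders, not Alpaca-CoT defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base_model = "huggyllama/llama-7b"  # placeholder base checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# LoRA trains small low-rank matrices injected into the attention projections
# while the original weights stay frozen.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a small fraction of parameters are trainable
```

The same wrapper pattern is what makes it easy to swap in other parameter-efficient methods (e.g., prefix- or P-Tuning) without touching the base model code.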

Languages: Jupyter Notebook, Python, MDX, Other. Topics: tabular-data, pytorch, llama, lora, moss, alpaca, cot, tabul, tabular-model, p-tuning, llm, parameter-efficient, chatgpt, instruction-tuning, chatglm.
These are the stars and forks stats for the PhoebusSi/Alpaca-CoT repository. As of 29 Apr 2024, it has 2,119 stars and 203 forks.

中文 | English

Alpaca-CoT: An Instruction-Tuning Platform with Unified Interface of Instruction Collection, Parameter-efficient Methods, and Large Language Models

This is the repository for the Alpaca-CoT project, which aims to build an instruction finetuning (IFT) platform with extensive instruction collection (especially the CoT datasets) and a unified interface for various large language models and parameter-efficient methods. We are constantly expanding our instruction-tuning data collection, ...
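Instruction collections of this kind commonly use the Alpaca-style instruction/input/output schema. The record and prompt template below are a hedged approximation for illustration only, not necessarily the exact schema or template this repository uses:

```python
# Sketch of an Alpaca-style instruction-tuning record with a CoT-style answer,
# plus a simple prompt template (an approximation, not Alpaca-CoT's exact template).
example = {
    "instruction": "If a train travels 60 km in 1.5 hours, what is its average speed?",
    "input": "",
    "output": "Let's think step by step. Speed = distance / time = 60 / 1.5 = 40 km/h.",
}

def build_prompt(record: dict) -> str:
    """Turn an instruction/input/output record into a training prompt."""
    if record["input"]:
        return (
            "Below is an instruction that describes a task, paired with an input.\n\n"
            f"### Instruction:\n{record['instruction']}\n\n"
            f"### Input:\n{record['input']}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task.\n\n"
        f"### Instruction:\n{record['instruction']}\n\n"
        "### Response:\n"
    )

print(build_prompt(example) + example["output"])
```

Keeping every dataset in one such schema is what lets a single training script consume CoT data, dialogue data, and tabular-task data interchangeably.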
| Repo | Techs | Stars | Stars weekly | Forks | Forks weekly |
| --- | --- | --- | --- | --- | --- |
| LC1332/CamelBell-Chinese-LoRA | Jupyter Notebook, Python | 174 | 0 | 17 | 0 |
| alura-es-cursos/inmersion-datos | Jupyter Notebook | 34 | 0 | 22 | 0 |
| vuki656/package-info.nvim | Lua, Other | 296 | 0 | 20 | 0 |
| ChirikjianLab/Marching-Primitives | MATLAB, Python | 101 | 0 | 3 | 0 |
| nf-core/proteomicslfq | Nextflow, R, Python | 32 | 0 | 17 | 0 |
| gfngfn/SATySFi | OCaml, Other | 1.1k | 0 | 81 | 0 |
| redteamguides/redteamguides.github.io | HTML, CSS, JavaScript | 127 | 0 | 21 | 0 |
| yandex-praktikum/anfisa1sprint | HTML, Python | 6 | 0 | 720 | 0 |
| homanp/langchain-ui | JavaScript | 811 | +3 | 142 | 0 |
| daodao97/chatdoc | JavaScript, CSS, HTML | 424 | 0 | 96 | 0 |