Star and fork statistics for the amazon-science/auto-cot repository. As of 3 May 2024, this repository has 783 stars and 80 forks.
Auto-CoT: Automatic Chain of Thought Prompting in Large Language Models (ICLR 2023)

Cheer AI up with the "let's think step by step" prompt? More plz. Let's think not just step by step, but also one by one. Auto-CoT uses more cheers & diversity to save huge manual effort in chain-of-thought prompt design, matching or even exceeding the performance of manual design on GPT-3. Check out our 25-page paper for more information.

Requirements: Python>=3.8; `pip install torch==1.8.2+cu111 torchtext==0.9.2...`
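The README blurb above describes Auto-CoT's core idea: sample *diverse* demonstration questions (rather than hand-picking them) and let the model generate its own rationales via the zero-shot "Let's think step by step" trigger. A toy sketch of those two stages, with assumptions labeled: a crude bag-of-words overlap stands in for the paper's Sentence-BERT embeddings, and greedy farthest-point sampling stands in for its k-means clustering; `sample_diverse` and `build_prompt` are hypothetical helper names, not the repository's API.

```python
from collections import Counter

def bow(text):
    """Crude bag-of-words vector (stand-in for a sentence embedding)."""
    return Counter(text.lower().split())

def similarity(a, b):
    """Cosine-like overlap between two bag-of-words Counters."""
    shared = sum((a & b).values())
    norm = (sum(a.values()) * sum(b.values())) ** 0.5
    return shared / norm if norm else 0.0

def sample_diverse(questions, k):
    """Greedy farthest-point sampling: approximates picking one
    representative question per cluster, for demo diversity."""
    chosen = [questions[0]]
    while len(chosen) < k:
        # pick the question least similar to everything chosen so far
        best = max(
            (q for q in questions if q not in chosen),
            key=lambda q: -max(similarity(bow(q), bow(c)) for c in chosen),
        )
        chosen.append(best)
    return chosen

def build_prompt(demo_questions, demo_rationales, test_question):
    """Assemble demonstrations plus the zero-shot-CoT trigger for the
    test question; in Auto-CoT the rationales themselves are produced
    by the LLM with the same trigger."""
    parts = [
        f"Q: {q}\nA: Let's think step by step. {r}"
        for q, r in zip(demo_questions, demo_rationales)
    ]
    parts.append(f"Q: {test_question}\nA: Let's think step by step.")
    return "\n\n".join(parts)
```

The design point this illustrates is that diversity, not quality curation, is what the method automates: spreading demonstrations across question clusters reduces the chance that one cluster's flawed auto-generated rationales dominate the prompt.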
| repo | techs | stars | stars (weekly Δ) | forks | forks (weekly Δ) |
|---|---|---|---|---|---|
| ckczzj/CHAN | MATLAB, HTML, Python | 72 | 0 | 3 | 0 |
| foundryzero/binder-trace | Python, JavaScript | 330 | 0 | 23 | 0 |
| Lightning-AI/lit-parrot | Python, Jupyter Notebook | 3.1k | 0 | 307 | 0 |
| yxuansu/PandaGPT | Python, Shell | 646 | 0 | 51 | 0 |
| reactive-python/reactpy | Python, TypeScript, Other | 6.9k | 0 | 317 | 0 |
| mitre/caldera | Python, HTML, CSS | 4.7k | 0 | 946 | 0 |
| pentilm/FactAI | Python, PureBasic, Dockerfile | 214 | 0 | 40 | 0 |
| KAT-Advanced-Medical/KAM | SQF, Pawn, Python | 44 | 0 | 62 | 0 |
| chenyk1990/reproducible_research | TeX, Python, MATLAB | 40 | 0 | 20 | 0 |
| gavalar/vimconfig | Vim Script, Perl, Shell | 0 | 0 | 1 | 0 |