amazon-science/auto-cot

Official implementation for "Automatic Chain of Thought Prompting in Large Language Models" (stay tuned & more will be updated)

Languages: Jupyter Notebook, Python. Topics: reasoning, gpt-3, gpt3-resources, gpt3-prompts, large-language-models, prompt-engineering, chain-of-thought
This page shows the stars and forks stats for the amazon-science/auto-cot repository. As of 03 May 2024, the repository has 783 stars and 80 forks.

Auto-CoT: Automatic Chain of Thought Prompting in Large Language Models (ICLR 2023)

Cheer AI up with the "let's think step by step" prompt? More plz. Let's think not just step by step, but also one by one. Auto-CoT uses more cheers & diversity to SAVE huge manual efforts in chain-of-thought prompt design, matching or even exceeding the performance of manual design on GPT-3. Check out our 25-page paper for more information.

Requirements

Python>=3.8

pip install torch==1.8.2+cu111 torchtext==0.9.2...
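To make the recipe concrete, below is a minimal sketch of the Auto-CoT idea, not the official implementation: cluster the task's questions by similarity, pick one representative per cluster, elicit a rationale for each with the Zero-Shot-CoT trigger "Let's think step by step", and assemble the results into a few-shot chain-of-thought prompt. TF-IDF plus KMeans stand in for the encoder used in the paper, and `call_llm` is a hypothetical placeholder for whatever completion API you use.

```python
# Minimal sketch of Auto-CoT-style demo construction (illustrative only).
from typing import List

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans
from sklearn.metrics import pairwise_distances_argmin_min


def call_llm(prompt: str) -> str:
    # Hypothetical placeholder: replace with a real LLM completion call.
    return "<model-generated reasoning>"


def build_auto_cot_prompt(questions: List[str], n_demos: int = 4) -> str:
    # Embed the questions (TF-IDF here for simplicity).
    vectorizer = TfidfVectorizer()
    embeddings = vectorizer.fit_transform(questions)

    # Cluster the questions and take the one closest to each cluster centre.
    kmeans = KMeans(n_clusters=n_demos, n_init=10, random_state=0).fit(embeddings)
    closest, _ = pairwise_distances_argmin_min(kmeans.cluster_centers_, embeddings)

    # Generate a rationale for each representative question via Zero-Shot-CoT.
    demos = []
    for idx in closest:
        q = questions[idx]
        rationale = call_llm(f"Q: {q}\nA: Let's think step by step.")
        demos.append(f"Q: {q}\nA: Let's think step by step. {rationale}")

    # The assembled demos are prepended to the test question at inference time.
    return "\n\n".join(demos)


if __name__ == "__main__":
    sample_questions = [
        "If there are 3 cars and 2 more arrive, how many cars are there?",
        "A shop sells pens at $2 each. How much do 5 pens cost?",
        "Tom has 10 apples and eats 4. How many are left?",
        "A train travels 60 km in 1 hour. How far does it go in 3 hours?",
    ]
    print(build_auto_cot_prompt(sample_questions, n_demos=2))
```

For the actual clustering, decoding, and dataset handling used in the paper, see the notebooks and scripts in this repository.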
| Repo | Techs | Stars | Stars weekly | Forks | Forks weekly |
| --- | --- | --- | --- | --- | --- |
| ckczzj/CHAN | MATLAB, HTML, Python | 72 | 0 | 3 | 0 |
| foundryzero/binder-trace | Python, JavaScript | 330 | 0 | 23 | 0 |
| Lightning-AI/lit-parrot | Python, Jupyter Notebook | 3.1k | 0 | 307 | 0 |
| yxuansu/PandaGPT | Python, Shell | 646 | 0 | 51 | 0 |
| reactive-python/reactpy | Python, TypeScript, Other | 6.9k | 0 | 317 | 0 |
| mitre/caldera | Python, HTML, CSS | 4.7k | 0 | 946 | 0 |
| pentilm/FactAI | Python, PureBasic, Dockerfile | 214 | 0 | 40 | 0 |
| KAT-Advanced-Medical/KAM | SQF, Pawn, Python | 44 | 0 | 62 | 0 |
| chenyk1990/reproducible_research | TeX, Python, MATLAB | 40 | 0 | 20 | 0 |
| gavalar/vimconfig | Vim Script, Perl, Shell | 0 | 0 | 1 | 0 |