jackmpcollins/magentic

Seamlessly integrate LLMs as Python functions

Tags: Python, agent, ai, chatbot, prompt, openai, gpt, magentic, pydantic, openai-api, llm, chatgpt
This is the stars and forks stats page for the jackmpcollins/magentic repository. As of 12 May 2024, the repository has 1043 stars and 36 forks.

magentic

Easily integrate Large Language Models into your Python code. Simply use the @prompt decorator to create functions that return structured output from the LLM. Mix LLM queries and function calling with regular Python code to create complex logic.

magentic is:
- Compact: Query LLMs without duplicating boilerplate code.
- Atomic: Prompts are functions that can be individually tested and reasoned about.
- Transparent: Create "chains" using regular Python code. Define all of your own prompts.
- Compatible: ...