CodedotAl/gpt-code-clippy

The full description can be found here: https://discuss.huggingface.co/t/pretrain-gpt-neo-for-open-source-github-copilot-model/7678?u=ncoop57

Languages: Python, Jupyter Notebook, Other
This page shows the stars and forks stats for the CodedotAl/gpt-code-clippy repository. As of 05 May 2024, the repository has 3201 stars and 217 forks.

GPT-Code-Clippy (GPT-CC)

Please refer to our new GitHub Wiki, which documents in detail our efforts in creating the open source version of GitHub Copilot. Courtesy of the awesome Aimee Trevett!

Introduction
GPT-Code-Clippy (GPT-CC) is an open source version of GitHub Copilot, a language model -- based on GPT-3, called GPT-Codex -- that is fine-tuned on publicly available code from GitHub.

Datasets
The dataset used to train GPT-CC is obtained from SEART GitHub Search using the following...
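The excerpt above describes GPT-CC as a Copilot-style code-completion model fine-tuned from GPT-Neo. As a rough illustration of how such a checkpoint can be queried, here is a minimal sketch using the Hugging Face transformers library. The model ID below is an assumption chosen for illustration only; substitute whichever GPT-CC checkpoint the project actually publishes on the Hub.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed checkpoint name for illustration; replace with a real GPT-CC model ID.
MODEL_ID = "flax-community/gpt-neo-125M-code-clippy"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

# A code prompt to complete, in the spirit of Copilot-style usage.
prompt = 'def fibonacci(n):\n    """Return the n-th Fibonacci number."""\n'
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short completion; tune max_new_tokens and temperature as needed.
outputs = model.generate(
    **inputs,
    max_new_tokens=48,
    do_sample=True,
    temperature=0.2,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

This is only a sketch of the standard transformers text-generation workflow under the assumptions stated above, not the project's documented usage; consult the repository's Wiki for the recommended checkpoints and settings.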
| Repo | Techs | Stars | Stars (weekly) | Forks | Forks (weekly) |
| --- | --- | --- | --- | --- | --- |
| PabloJMoreno/mbli_powerbi_r_python | Roff, Jupyter Notebook | 42 | 0 | 10 | 0 |
| THU-DSP-LAB/ventus-gpgpu | Scala, Tcl, Verilog | 311 | 0 | 41 | 0 |
| RayeRen/rayeren.github.io | SCSS, JavaScript, HTML | 31 | 0 | 107 | 0 |
| saucepoint/0xMonaco | Solidity, Julia, Other | 53 | 0 | 2 | 0 |
| marktoda/paradigm-ctf | Solidity, JavaScript, Ruby | 40 | 0 | 0 | 0 |
| tguelcan/MovieHub | Svelte, TypeScript, JavaScript | 33 | 0 | 5 | 0 |
| jacobeisenstein/gt-nlp-class | TeX, Jupyter Notebook, Python | 4.8k | 0 | 1.1k | 0 |
| IUCompilerCourse/Essentials-of-Compilation | TeX, Other | 1k | +14 | 121 | +1 |
| usememos/memos | Go, TypeScript, Less | 19.6k | 0 | 1.4k | 0 |
| outbrain/outbrain-iOS-Framework | Objective-C, Ruby, Other | 0 | 0 | 2 | 0 |