EleutherAI/gpt-neox

An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.

Languages: Python, C++, Cuda, C, Dockerfile, Shell. Topics: transformers, language-model, gpt-3, deepspeed-library.
Stars and forks statistics for the EleutherAI/gpt-neox repository. As of 3 May 2024, the repository has 6,078 stars and 872 forks.

GPT-NeoX

This repository records EleutherAI's library for training large-scale language models on GPUs. Our current framework is based on NVIDIA's Megatron Language Model and has been augmented with techniques from DeepSpeed as well as some novel optimizations. We aim to make this repo a centralized and accessible place to gather techniques for training large-scale autoregressive language models, and accelerate research into large-scale training. This library is in widespread use in academic, industry,...
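For readers unfamiliar with the DeepSpeed-based training pattern the library builds on, below is a minimal, hypothetical sketch of wrapping a small autoregressive transformer with `deepspeed.initialize` and running one training step. This is not GPT-NeoX source code; the model, sizes, and config values are illustrative assumptions only, and it assumes a CUDA device.

```python
# Hypothetical sketch: training a tiny autoregressive transformer with DeepSpeed.
# NOT GPT-NeoX code; it only illustrates the DeepSpeed pattern (engine wrapping,
# fp16/ZeRO config, backward/step) that the library builds on.
import torch
import torch.nn as nn
import deepspeed


class TinyCausalLM(nn.Module):
    """Stand-in decoder-only model; GPT-NeoX uses Megatron-style parallel layers."""

    def __init__(self, vocab_size=50304, d_model=512, n_layers=4, n_heads=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, vocab_size, bias=False)

    def forward(self, tokens):
        seq_len = tokens.size(1)
        # Causal mask so each position only attends to earlier tokens.
        mask = torch.triu(torch.ones(seq_len, seq_len, device=tokens.device), 1).bool()
        hidden = self.blocks(self.embed(tokens), mask=mask)
        return self.lm_head(hidden)


# Illustrative DeepSpeed config (values are assumptions, not GPT-NeoX defaults).
ds_config = {
    "train_micro_batch_size_per_gpu": 4,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 1},
    "optimizer": {"type": "Adam", "params": {"lr": 6e-4}},
}

model = TinyCausalLM()
engine, _, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

# One training step on random token ids (a real run streams tokenized text).
tokens = torch.randint(0, 50304, (4, 128), device=engine.device)
logits = engine(tokens[:, :-1])
loss = nn.functional.cross_entropy(
    logits.reshape(-1, logits.size(-1)), tokens[:, 1:].reshape(-1)
)
engine.backward(loss)
engine.step()
```

In practice a script like this would be launched with the `deepspeed` launcher across ranks; GPT-NeoX itself drives training through its own launch scripts and YAML configuration files rather than ad-hoc code like the above.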