bigscience-workshop/bigscience

Central place for the engineering/scaling WG: documentation, SLURM scripts and logs, compute environment, and data.

Languages: Shell, Python, Makefile. Topics: nlp, training, machine-learning, models.
As of 1 May 2024, this repository has 857 stars and 89 forks.

bigscience

Research workshop on large language models - The Summer of Language Models 21.

At the moment we have 2 code repos:

- https://github.com/bigscience-workshop/Megatron-DeepSpeed - this is our flagship code base
- https://github.com/bigscience-workshop/bigscience - (this repo) for everything else: docs, experiments, etc.

Currently, the most active segments of this repo are:

- JZ - lots of information about our work environment, which helps evaluate, plan and get things done
- Experiments - many experiments...