extreme-bert/extreme-bert

ExtremeBERT is a toolkit that accelerates the pretraining of customized language models on customized datasets, described in the paper “ExtremeBERT: A Toolkit for Accelerating Pretraining of Customized BERT”.

Languages: Python, Shell. Topics: python, nlp, machine-learning, natural-language-processing, deep-learning, pytorch, transformer, language-models, language-model, bert
As of 29 Mar 2024, the extreme-bert/extreme-bert repository has 283 stars and 15 forks.

ExtremeBERT: Accelerate your LM Pre-training!

ExtremeBERT is a toolkit that accelerates the pretraining and finetuning of BERT on customized datasets. Take a quick look at our documentation and paper.

Features

Simple Installation
We simplify the installation of the dependencies required by the package. They can all be installed with a single command, source install.sh. No other steps are needed!

Fast Training
Pretraining time can be reduced...
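The one-command installation described above can be sketched as the following shell session; the repository URL is taken from the project page, and install.sh is the installer named in the README (this is a setup sketch, not verified against a particular release):

```shell
# Fetch the toolkit and install its dependencies with the documented
# single command. `source` is used (rather than `bash install.sh`) so
# that any environment variables the script exports persist in the
# current shell session.
git clone https://github.com/extreme-bert/extreme-bert.git
cd extreme-bert
source install.sh
```

Using source here assumes the script modifies the caller's environment; if it does not, running it as a plain script would work equally well.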