These are star and fork statistics for the google-research/bert repository. As of 07 Dec 2023, this repository has 35,419 stars and 9,406 forks.
BERT ***** New March 11th, 2020: Smaller BERT Models ***** This is a release of 24 smaller BERT models (English only, uncased, trained with WordPiece masking) referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models. We have shown that the standard BERT recipe (including model architecture and training objective) is effective on a wide range of model sizes, beyond BERT-Base and BERT-Large. The smaller BERT models are intended for environments with restricted...
repo | languages | stars | weekly Δ | forks | weekly Δ
---|---|---|---|---|---
ml-tooling/best-of-ml-python | | 14.5k | 0 | 2.2k | 0
alibaba/EasyNLP | Python, Shell | 1.8k | 0 | 226 | 0
programthink/zhao | Python | 13k | 0 | 2.8k | 0
getredash/redash | Python, JavaScript, TypeScript | 23.9k | 0 | 4.2k | 0
doldecomp/melee | Assembly, C, Python | 521 | 0 | 57 | 0
google/brotli | TypeScript, C, C# | 12.6k | 0 | 1.2k | 0
F5OEO/rpitx | C, Assembly, C++ | 3.6k | 0 | 481 | 0
sisl/AA222Project2 | Julia, Python, Shell | 1 | 0 | 15 | 0
DataTalksClub/mlops-zoomcamp | Jupyter Notebook, Python, Other | 7.8k | 0 | 1.6k | 0
Bai-Yu-Lan/SH-COVID19 | Jupyter Notebook, Python | 122 | 0 | 13 | 0
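Counts like the ones above can be reproduced from the GitHub REST API: `GET /repos/{owner}/{repo}` returns a JSON object whose `stargazers_count` and `forks_count` fields hold the current star and fork totals. A minimal sketch using only the standard library (the helper names here are ours, not part of any official client):

```python
import json
from urllib.request import urlopen

def parse_stats(payload: dict) -> tuple[int, int]:
    # GitHub's repository endpoint exposes these two fields directly.
    return payload["stargazers_count"], payload["forks_count"]

def repo_stats(owner: str, repo: str) -> tuple[int, int]:
    """Fetch (stars, forks) for a public GitHub repository."""
    url = f"https://api.github.com/repos/{owner}/{repo}"
    with urlopen(url) as resp:
        return parse_stats(json.load(resp))

# Example (requires network access):
# stars, forks = repo_stats("google-research", "bert")
```

Unauthenticated requests to this endpoint are rate-limited, so a token would be needed for bulk collection; the weekly-change columns would come from comparing two such snapshots taken a week apart.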