google-research/bert

TensorFlow code and pre-trained models for BERT

Languages: Python, Jupyter Notebook
Topics: nlp, natural-language-processing, google, tensorflow, natural-language-understanding
This page shows stars and forks stats for the google-research/bert repository. As of 27 Apr 2024, the repository has 35,419 stars and 9,406 forks.

BERT

***** New March 11th, 2020: Smaller BERT Models *****

This is a release of 24 smaller BERT models (English only, uncased, trained with WordPiece masking) referenced in Well-Read Students Learn Better: On the Importance of Pre-training Compact Models. We have shown that the standard BERT recipe (including model architecture and training objective) is effective on a wide range of model sizes, beyond BERT-Base and BERT-Large. The smaller BERT models are intended for environments with restricted...
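For illustration, here is a minimal sketch of loading one of these compact checkpoints and encoding a sentence. It assumes the Hugging Face `transformers` library (with PyTorch) rather than this repository's own TensorFlow code, and the mirrored model name `google/bert_uncased_L-4_H-512_A-8` is an assumption about what is published on the model hub:

```python
# Minimal sketch: load a compact BERT checkpoint and encode a sentence.
# Assumes the Hugging Face `transformers` library and that the mirrored
# checkpoint name below exists on the model hub; the original release
# ships TensorFlow checkpoints, so treat this as illustrative only.
from transformers import AutoModel, AutoTokenizer

model_name = "google/bert_uncased_L-4_H-512_A-8"  # 4 layers, hidden size 512, 8 heads
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

inputs = tokenizer("BERT scales down well.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, 512)
```

The compact models share BERT-Base's architecture with fewer layers and smaller hidden sizes, so they drop into the same fine-tuning recipe with no code changes beyond the checkpoint name.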
| Repo | Languages | Stars | Stars (weekly) | Forks | Forks (weekly) |
|---|---|---|---|---|---|
| ml-tooling/best-of-ml-python | | 14.5k | 0 | 2.2k | 0 |
| alibaba/EasyNLP | Python, Shell | 1.8k | 0 | 226 | 0 |
| programthink/zhao | Python | 13k | 0 | 2.8k | 0 |
| getredash/redash | Python, JavaScript, TypeScript | 23.9k | 0 | 4.2k | 0 |
| doldecomp/melee | Assembly, C, Python | 521 | 0 | 57 | 0 |
| google/brotli | TypeScript, C, C# | 12.6k | 0 | 1.2k | 0 |
| F5OEO/rpitx | C, Assembly, C++ | 3.6k | 0 | 481 | 0 |
| sisl/AA222Project2 | Julia, Python, Shell | 1 | 0 | 15 | 0 |
| DataTalksClub/mlops-zoomcamp | Jupyter Notebook, Python, Other | 7.8k | 0 | 1.6k | 0 |
| Bai-Yu-Lan/SH-COVID19 | Jupyter Notebook, Python | 122 | 0 | 13 | 0 |