NetEase-FuXi/EET

Easy and Efficient Transformer: A Scalable Inference Solution for Large NLP Models

Topics: Cuda, Python, C++, eet, bert, gpt2, bert-inference-performance, gpt2-inference-performance
Stars and forks stats for the NetEase-FuXi/EET repository. As of 29 Apr 2024, the repository has 241 stars and 44 forks.

Easy and Efficient Transformer (a Chinese README is also available). EET (Easy and Efficient Transformer) is a friendly PyTorch inference plugin focused on Transformer-based models, aiming to make mega-size models affordable.

Features:
- Supports mega-size models on a single GPU. EET can serve an 11B model online within a 24 GB Ampere GPU.
- Expertise in inference for multi-modal and NLP tasks (CLIP/GPT-3/Bert/Seq2seq, etc.).
- High performance. Makes transformer-based models faster...