abacaj/mpt-30B-inference

Run inference on MPT-30B using CPU

Topics: Python, ggml, ctransformers, mpt-30b
Stars and forks stats for the abacaj/mpt-30B-inference repository: as of 08 May 2024, it has 561 stars and 326 forks.

MPT 30B inference code using CPU

Run inference on the latest MPT-30B model using your CPU. This inference code uses a ggml quantized model. To run the model we'll use a library called ctransformers, which has Python bindings to ggml. Turn style with history is available on the latest commit.

Video of initial demo: 2023-06-25.20-13-24.mp4

Requirements: I recommend you use Docker for this model; it will make everything easier for you. Minimum specs: a system with 32 GB of RAM. Recommend to...
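Since the README describes turn-style chat with history, a small sketch of how a prompt might be assembled from prior turns can illustrate the idea. The delimiter format and the helper name here are assumptions for illustration, not taken from the repository's code.

```python
# Hypothetical sketch: build a single prompt string from chat history
# for an instruction-tuned model. The "### Instruction"/"### Response"
# delimiters are an assumed format, not copied from the repo.

def build_prompt(history, user_message):
    """history: list of (user, assistant) turn pairs."""
    parts = []
    for user, assistant in history:
        parts.append(f"### Instruction\n{user}\n### Response\n{assistant}")
    # Leave the final response open for the model to complete.
    parts.append(f"### Instruction\n{user_message}\n### Response\n")
    return "\n".join(parts)

prompt = build_prompt([("Hi", "Hello!")], "What is MPT-30B?")
print(prompt)
```

With a quantized ggml model file on disk, the prompt would then be passed to a model loaded through ctransformers (e.g. `AutoModelForCausalLM.from_pretrained(..., model_type="mpt")`); consult the repository's own inference script for the exact call it uses.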