wangzhaode/ChatGLM-MNN

An LLM deployment project based on MNN.

Languages: C++, HTML, Java, JavaScript, Dockerfile, PowerShell, Other. Topics: cpp, opencl, cuda, mnn, chatglm-6b, chatglm2-6b, qwen-7b, baichuan2-7b, codegeex2-6b.
Stars and forks stats for the wangzhaode/ChatGLM-MNN repository: as of 26 Apr 2024, it has 926 stars and 143 forks.

## mnn-llm

Read me in English

### Model Support

To export an LLM to ONNX, use llm-export. The following models are currently supported, each available as an onnx-fp32 and an mnn-int4 export:

- chatglm-6b
- chatglm2-6b
- codegeex2-6b
- Qwen-7B-Chat
- Baichuan2-7B-Chat
- Llama-2-7b-chat

#### Download an int4 model

```bash
# <model> like `chatglm-6b`
# linux/macos
./script/download_model.sh <model>
# windows
./script/download_model.ps1 <model>
```

### Build

Current build status is tracked for Linux, macOS, Windows, and Android.

#### Local compilation

```bash
# linux
./script/linux_build.sh
# macos
./script/macos_build.sh
# windows (msvc)
./script/windows_build.ps1
# android
./script/android_build.sh
```

The CPU backend is used by default. To use another backend, add the corresponding MNN compile macro in the build script (a minimal sketch of a manual CMake build with these macros is given after the Reference list):

- cuda: `-DMNN_CUDA=ON`
- opencl: `-DMNN_OPENCL=ON`

### Run

```bash
# linux/macos
./cli_demo    # cli demo
./web_demo    # web ui demo

# windows
.\Debug\cli_demo.exe
.\Debug\web_demo.exe

# android
adb push libs/*.so build/libllm.so build/cli_demo /data/local/tmp
adb push model_dir /data/local/tmp
adb shell "cd /data/local/tmp && export LD_LIBRARY_PATH=. && ./cli_demo -m model"
```

### Reference

chatglm-6b, chatglm2-6b, codegeex2-6b, Baichuan2-7B-Chat, Qwen-7B-Chat, cpp-httplib, chatgpt-web, cppjieba, ChatViewDemo
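The build scripts hide the underlying CMake invocation, so where exactly to add the backend macro depends on the script contents. As a minimal sketch, assuming the project wraps a standard out-of-tree CMake configure/build (the `build/` directory and targets here are assumptions, not taken from the repository), enabling the CUDA backend might look like this:

```bash
# Hypothetical manual build with the CUDA backend enabled.
# Assumes a standard CMakeLists.txt at the repository root; check
# script/linux_build.sh for the project's actual invocation.
mkdir -p build && cd build
cmake .. -DMNN_CUDA=ON        # use -DMNN_OPENCL=ON for the OpenCL backend instead
make -j"$(nproc)"
```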
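Putting the steps above together, a typical Linux session (using `chatglm-6b` as the example model) could look like the sketch below. The commands are taken from the sections above; whether `cli_demo` needs an explicit model path on desktop (as the Android example's `-m` flag suggests) depends on the build, so treat the last line as a sketch rather than a guaranteed invocation.

```bash
# End-to-end quickstart on linux, combining the download, build, and run steps above.
./script/download_model.sh chatglm-6b   # fetch the int4 model
./script/linux_build.sh                 # build with the default CPU backend
./cli_demo                              # command-line demo (web_demo starts the web UI)
```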