ray-project/ray-llm

RayLLM - LLMs on Ray

Languages: Python, Other. Topics: distributed-systems, transformers, ray, serving, large-language-models, llm, llmops, llm-serving, llm-inference
This page shows stars and forks stats for the ray-project/ray-llm repository. As of 28 Apr 2024, the repository has 736 stars and 47 forks.

Try it now: 🦜🔍 Ray Aviary Explorer 🦜🔍

RayLLM (formerly known as Aviary) is an LLM serving solution that makes it easy to deploy and manage a variety of open-source LLMs, built on Ray Serve. It does this by:

- Providing an extensive suite of pre-configured open-source LLMs, with defaults that work out of the box.
- Supporting Transformer models hosted on the Hugging Face Hub or present on local disk.
- Simplifying the deployment of multiple LLMs.
- Simplifying the addition of new LLMs.
- Offering...
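Since RayLLM is built on Ray Serve, a deployed model is reachable over HTTP. The sketch below shows one way a client might query such a deployment; the base URL (http://localhost:8000/v1), the chat-completions route, the payload fields, and the model ID are all assumptions made for illustration, not details taken from this page, so the actual interface should be checked against the project's README.

```python
# Minimal sketch of querying a RayLLM deployment over HTTP.
# Assumptions (not from this page): the service listens on localhost:8000
# and exposes an OpenAI-style chat completions route; the model ID is a
# placeholder for whatever model was actually deployed.
import requests

BASE_URL = "http://localhost:8000/v1"           # assumed local Ray Serve endpoint
MODEL_ID = "meta-llama/Llama-2-7b-chat-hf"      # placeholder Hugging Face model ID


def ask(prompt: str) -> str:
    """Send a single chat prompt and return the generated text."""
    response = requests.post(
        f"{BASE_URL}/chat/completions",
        json={
            "model": MODEL_ID,
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.7,
        },
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask("What is Ray Serve?"))
```

If the deployment follows the OpenAI API convention, existing OpenAI client libraries could likely be pointed at the same base URL instead of using raw HTTP requests; this should also be verified against the project documentation.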