triton-inference-server/server

The Triton Inference Server provides an optimized cloud and edge inferencing solution.

Languages: Python, Shell, C++, Java, CMake, Roff, Other
Topics: machine-learning, cloud, deep-learning, gpu, inference, edge, datacenter
Stars and forks stats for the triton-inference-server/server repository. As of 25 Apr 2024, this repository has 6,163 stars and 1,261 forks.

Triton Inference Server

LATEST RELEASE: You are currently on the main branch, which tracks under-development progress towards the next release. The current release is version 2.38.0 and corresponds to the 23.09 container release on [NVIDIA GPU Cloud (NGC)](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/tritonserver).

Triton Inference Server is open-source inference serving software that streamlines AI inferencing. Triton enables teams to deploy any AI model from multiple deep learning and machine learning frameworks.
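Since each release maps to an NGC container, the usual workflow is to run that container and then send requests to Triton's HTTP endpoint (port 8000 by default; gRPC is served on 8001). Below is a minimal sketch using the official tritonclient Python package against a hypothetical model named my_model with one FP32 input INPUT0 and one output OUTPUT0; only the client API calls come from the real library, the model and tensor names are illustrative assumptions.

```python
# Minimal sketch: query a running Triton server over its HTTP endpoint.
# Assumes: pip install "tritonclient[http]", a server listening on
# localhost:8000, and a hypothetical model "my_model" whose config declares
# an FP32 input "INPUT0" of shape [1, 4] and an output "OUTPUT0".
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Standard health checks exposed by Triton's HTTP/REST API.
assert client.is_server_live()
assert client.is_server_ready()

# Input name, shape, and datatype must match the model's config.pbtxt.
inp = httpclient.InferInput("INPUT0", [1, 4], "FP32")
inp.set_data_from_numpy(np.random.rand(1, 4).astype(np.float32))

result = client.infer(model_name="my_model", inputs=[inp])
print(result.as_numpy("OUTPUT0"))  # output tensor returned by the server
```

The same calls exist in tritonclient.grpc with a near-identical interface, so switching transports is largely a matter of changing the import and the port.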
| Repo | Techs | Stars | Stars weekly | Forks | Forks weekly |
| --- | --- | --- | --- | --- | --- |
| TJU-DRL-LAB/AI-Optimizer | Python, Cython, Other | 3.3k | 0 | 594 | 0 |
| inspec/inspec | Ruby, HTML, Shell | 2.7k | +5 | 684 | 0 |
| thoughtbot/shoulda-matchers | Ruby, JavaScript, Other | 3.4k | +2 | 908 | 0 |
| railsadminteam/rails_admin | Ruby, JavaScript, HTML | 7.8k | 0 | 2.3k | 0 |
| BetaSu/fe-hunter | JavaScript | 1.6k | 0 | 118 | 0 |
| datastacktv/data-engineer-roadmap | | 11.6k | 0 | 1.3k | 0 |
| robmarkcole/satellite-image-deep-learning | | 6.9k | 0 | 1.2k | 0 |
| jhuckaby/webcamjs | ActionScript, JavaScript, Shell | 2.5k | 0 | 1.1k | 0 |
| SalesforceFoundation/NPSP | Apex, JavaScript, HTML | 605 | 0 | 349 | 0 |
| klezVirus/inceptor | Assembly, C, Python | 1.4k | 0 | 245 | 0 |