dmlc/xgboost

Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python, R, Java, Scala, C++ and more. Runs on a single machine, Hadoop, Spark, Dask, Flink and DataFlow.

Languages: C++, Python, Cuda, Scala, R, Java, Other
Topics: distributed-systems, machine-learning, xgboost, gbdt, gbm, gbrt
This is the stars and forks stats for the dmlc/xgboost repository. As of 29 Mar 2024, this repository has 24,789 stars and 8,667 forks.

eXtreme Gradient Boosting | Community | Documentation | Resources | Contributors | Release Notes

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Kubernetes, Hadoop, ...
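As a rough illustration of the tree boosting workflow described above, here is a minimal sketch using XGBoost's scikit-learn-style Python API. The synthetic dataset and the hyperparameter values are illustrative assumptions, not settings taken from the repository.

```python
# Minimal sketch: training a gradient-boosted tree classifier with XGBoost.
# Dataset and hyperparameters below are illustrative assumptions only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# Synthetic binary classification data standing in for any tabular dataset.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Boosting rounds are sequential (each tree fits the gradients of the loss),
# while split finding within each tree is parallelized across CPU cores.
model = XGBClassifier(
    n_estimators=200,   # number of boosting rounds (trees)
    max_depth=4,        # depth of each individual tree
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
    n_jobs=-1,          # use all available cores
)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
```

The same model can also be trained through the lower-level `xgboost.DMatrix` / `xgboost.train` interface, which is what the distributed backends build on.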
Repo | Techs | Stars | Weekly | Forks | Weekly
lab52io/LeakedHandlesFinder | C++, C | 254 | 0 | 42 | 0
nasa/fprime | C++, Python, CMake | 9.6k | 0 | 1.2k | 0
NVIDIAGameWorks/RTXGI | C++, HLSL, CMake | 580 | 0 | 45 | 0
learning-zone/javascript-interview-questions | HTML, JavaScript | 1.6k | +1 | 595 | +2
foundation/foundation-emails | HTML, SCSS, JavaScript | 7.7k | -1 | 1.1k | +2
roy-tian/learning-area | HTML, JavaScript, CSS | 763 | 0 | 1.5k | 0
divanov11/StudyBud | HTML, CSS, Python | 678 | 0 | 638 | 0
paperswithcode/ai-deadlines | JavaScript, HTML, Ruby | 5k | 0 | 847 | 0
faridrashidi/kaggle-solutions | HTML | 3.1k | 0 | 1.1k | 0
mmiscool/serialTerminal.com | G-code, HTML, JavaScript | 171 | 0 | 36 | 0