mlfoundations/open_clip

An open source implementation of CLIP.

Languages: Jupyter Notebook, Python, Other
Topics: computer-vision, deep-learning, pytorch, pretrained-models, language-model, contrastive-loss, multi-modal-learning, zero-shot-classification
Stars and forks stats for the mlfoundations/open_clip repository. As of 28 Apr 2024, this repository has 6,387 stars and 654 forks.

OpenCLIP [Paper] [Citations] [Clip Colab] [Coca Colab]

Welcome to an open source implementation of OpenAI's CLIP (Contrastive Language-Image Pre-training). Using this codebase, we have trained several models on a variety of data sources and compute budgets, ranging from small-scale experiments to larger runs, including models trained on datasets such as LAION-400M, LAION-2B, and DataComp-1B. Many of our models and their scaling properties are studied in detail in the paper on reproducible scaling laws...
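At inference time, CLIP-style zero-shot classification boils down to comparing one image embedding against a set of text embeddings by scaled cosine similarity and taking a softmax. The following is a minimal pure-Python sketch of that scoring step under toy, hand-picked embeddings; the function name, the embeddings, and the `logit_scale` default are illustrative assumptions, not open_clip's actual API.

```python
import math

def l2_normalize(v):
    # Scale a vector to unit length; CLIP normalizes both image and
    # text embeddings before comparing them.
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def clip_zero_shot_scores(image_emb, text_embs, logit_scale=100.0):
    # Softmax over scaled cosine similarities between one image embedding
    # and one text embedding per candidate label -- the core of CLIP-style
    # zero-shot classification. logit_scale is a hypothetical temperature.
    img = l2_normalize(image_emb)
    logits = []
    for txt in text_embs:
        t = l2_normalize(txt)
        logits.append(logit_scale * sum(a * b for a, b in zip(img, t)))
    # Numerically stable softmax over the per-label logits.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Toy 2-D embeddings (hypothetical): the image vector points almost
# exactly at the first label's text vector, so it should win.
probs = clip_zero_shot_scores([1.0, 0.1], [[0.9, 0.2], [0.1, 1.0]])
```

In the real library, the image and text embeddings would come from the trained vision and text towers, and the temperature is a learned parameter; only the normalize–dot–softmax pattern shown here carries over.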
| Repo | Techs | Stars | Weekly | Forks | Weekly |
| --- | --- | --- | --- | --- | --- |
| microsoft/WSL | PowerShell, Python, Shell | 15.9k | +27 | 815 | 0 |
| tzutalin/labelImg | Python, Shell, Makefile | 20.7k | +39 | 6k | +4 |
| Jack-Cherish/PythonPark | Python | 7.9k | 0 | 1.5k | 0 |
| Textualize/rich | Python, Other | 45k | +68 | 1.7k | -2 |
| ankurchavda/streamify | Python, Shell, HCL | 383 | 0 | 84 | 0 |
| drivendata/cookiecutter-data-science | Python, Makefile, Batchfile | 7k | 0 | 2.2k | 0 |
| elastic/elasticsearch-ruby | Ruby, Other | 1.9k | 0 | 603 | 0 |
| afshinea/stanford-cs-229-machine-learning | | 16k | +18 | 3.7k | +2 |
| xmos/lib_spdif | Assembly, XC, Java | 10 | 0 | 14 | 0 |
| fwupd/fwupd | C, Python, Meson | 2.3k | +26 | 382 | +3 |