segmind/distill-sd

Segmind Distilled diffusion

Topics: Python, inference, knowledge-distillation, distillation, stable-diffusion
These are the stars and forks stats for the segmind/distill-sd repository. As of 13 May 2024, this repository has 341 stars and 24 forks.

50% Smaller, Faster Stable Diffusion 🚀

Knowledge-distilled, smaller versions of Stable Diffusion. Unofficial implementation as described in BK-SDM. These distillation-trained models produce images of similar quality to the full-sized Stable-Diffusion model while being significantly faster and smaller.

Components of this repository:
- data.py contains scripts to download data for training.
- distill_training.py trains the U-Net using the methods described in the paper (a minimal loss sketch follows below). This might need additional configuration...
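For orientation, here is a minimal sketch of the BK-SDM-style distillation objective that this kind of training loop combines: the student U-Net's ordinary denoising loss plus output-level and feature-level distillation terms against a frozen teacher U-Net. This is not the repository's actual distill_training.py; the function name, loss weights, and toy tensor shapes are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_out, teacher_out, noise,
                      student_feats, teacher_feats,
                      lambda_out=1.0, lambda_feat=1.0):
    # Standard denoising (task) loss against the true noise target.
    task_loss = F.mse_loss(student_out, noise)
    # Output-level KD: match the frozen teacher's predicted noise.
    out_kd = F.mse_loss(student_out, teacher_out.detach())
    # Feature-level KD: match intermediate U-Net block activations.
    feat_kd = sum(F.mse_loss(s, t.detach())
                  for s, t in zip(student_feats, teacher_feats))
    return task_loss + lambda_out * out_kd + lambda_feat * feat_kd

# Toy tensors standing in for U-Net noise predictions and block features.
noise = torch.randn(2, 4, 64, 64)
student_out, teacher_out = torch.randn_like(noise), torch.randn_like(noise)
student_feats = [torch.randn(2, 320, 32, 32)]
teacher_feats = [torch.randn(2, 320, 32, 32)]
loss = distillation_loss(student_out, teacher_out, noise,
                         student_feats, teacher_feats)
```

In practice the student is initialized from a pruned copy of the teacher's U-Net, and the weighting between the task and distillation terms is a tuning choice; the repository's training script is the authoritative reference.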
| repo | techs | stars | weekly | forks | weekly |
| --- | --- | --- | --- | --- | --- |
| jupyterlab/jupyter-ai | Python, TypeScript, JavaScript | 2k | 0 | 167 | 0 |
| xorbitsai/inference | Python, JavaScript, Other | 1k | 0 | 63 | 0 |
| theLSA/hack-er-tools | Shell, Python | 219 | 0 | 65 | +1 |
| slippedandmissed/XLSL | C, C++, Yacc | 51 | 0 | 1 | 0 |
| digitalinnovationone/livecodingthefuture-docker-terraform | Dockerfile, Python | 16 | 0 | 10 | 0 |
| PacktPublishing/Causal-Inference-and-Discovery-in-Python | Jupyter Notebook, Python | 399 | 0 | 120 | 0 |
| BexTuychiev/tricking-data-science | Jupyter Notebook, Python | 151 | 0 | 29 | 0 |
| philschmid/easyllm | Jupyter Notebook, Python, Makefile | 295 | 0 | 24 | 0 |
| mshukor/UnIVAL | Jupyter Notebook, Python, Shell | 161 | 0 | 16 | 0 |
| QwenLM/Qwen-7B | Python, Shell | 5.4k | 0 | 389 | 0 |