These are the stars and forks stats for the segmind/distill-sd repository. As of 13 May 2024, this repository has 341 stars and 24 forks.
50% Smaller, Faster Stable Diffusion 🚀 Knowledge-distilled, smaller versions of Stable Diffusion; an unofficial implementation as described in BK-SDM. These distillation-trained models produce images of similar quality to the full-sized Stable Diffusion model while being significantly faster and smaller. Components of this repository:

- data.py contains scripts to download data for training.
- distill_training.py trains the U-Net using the methods described in the paper (see the sketch below). This might need additional configuration...
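For orientation, here is a minimal sketch of the BK-SDM-style distillation objective that a training script like distill_training.py computes: a task (denoising) loss plus output-level and feature-level knowledge-distillation terms. This is not the repository's actual code; the names (student_unet, teacher_unet, the .features attribute, the LAMBDA_* weights) are illustrative assumptions.

```python
# Hypothetical sketch of a BK-SDM-style distillation loss.
# Names and the `.features` attribute are assumptions, not the repo's API.
import torch
import torch.nn.functional as F

LAMBDA_OUT = 1.0   # weight for output-level knowledge distillation
LAMBDA_FEAT = 1.0  # weight for feature-level knowledge distillation

def distillation_loss(student_unet, teacher_unet, noisy_latents, timesteps,
                      encoder_hidden_states, target_noise):
    """Task loss + output KD + feature KD, in the spirit of BK-SDM."""
    # Student prediction (gradients flow through the student only).
    student_out = student_unet(noisy_latents, timesteps, encoder_hidden_states)
    with torch.no_grad():
        teacher_out = teacher_unet(noisy_latents, timesteps, encoder_hidden_states)

    # 1) Ordinary denoising objective against the true noise.
    loss_task = F.mse_loss(student_out.sample, target_noise)

    # 2) Output-level KD: match the teacher's predicted noise.
    loss_out = F.mse_loss(student_out.sample, teacher_out.sample)

    # 3) Feature-level KD: match intermediate block activations, assuming
    #    both U-Nets expose aligned per-block features via `.features`.
    loss_feat = sum(
        F.mse_loss(fs, ft)
        for fs, ft in zip(student_out.features, teacher_out.features)
    )

    return loss_task + LAMBDA_OUT * loss_out + LAMBDA_FEAT * loss_feat
```

The student U-Net keeps the teacher's layer layout but with blocks removed, so the per-block feature matching in step 3 assumes the retained blocks can be paired one-to-one with teacher blocks.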
repo | languages | stars | stars (weekly) | forks | forks (weekly) |
---|---|---|---|---|---|
jupyterlab/jupyter-ai | Python, TypeScript, JavaScript | 2k | 0 | 167 | 0 |
xorbitsai/inference | Python, JavaScript, Other | 1k | 0 | 63 | 0 |
theLSA/hack-er-tools | Shell, Python | 219 | 0 | 65 | +1 |
slippedandmissed/XLSL | C, C++, Yacc | 51 | 0 | 1 | 0 |
digitalinnovationone/livecodingthefuture-docker-terraform | Dockerfile, Python | 16 | 0 | 10 | 0 |
PacktPublishing/Causal-Inference-and-Discovery-in-Python | Jupyter Notebook, Python | 399 | 0 | 120 | 0 |
BexTuychiev/tricking-data-science | Jupyter Notebook, Python | 151 | 0 | 29 | 0 |
philschmid/easyllm | Jupyter Notebook, Python, Makefile | 295 | 0 | 24 | 0 |
mshukor/UnIVAL | Jupyter Notebook, Python, Shell | 161 | 0 | 16 | 0 |
QwenLM/Qwen-7B | Python, Shell | 5.4k | 0 | 389 | 0 |