One-Step Diffusion Distillation through Score Implicit Matching

Inference-only implementation of "One-Step Diffusion Distillation through Score Implicit Matching" (NeurIPS 2024). Paper: arXiv:2410.16794.

Overview

This repository contains inference-only code for SIM (Score Implicit Matching), an approach for distilling pre-trained diffusion models into efficient one-step generators. Unlike conventional diffusion models, which require many sampling steps, a SIM-distilled generator produces high-quality samples in a single step, and the distillation itself requires no training samples. The method computes gradients for a general family of score-based divergences between teacher and student, yielding strong results: an FID of 2.06 for unconditional generation and 1.96 for class-conditional generation on CIFAR-10. Applied to a state-of-the-art transformer-based diffusion model for text-to-image generation, SIM reaches an aesthetic score of 6.42, outperforming existing one-step generators.
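
Although this repository is inference-only, the idea above can be made concrete with a heavily simplified, hypothetical sketch of the alternating score-distillation update behind SIM and its relatives (Diff-Instruct, SiD). A plain score difference stands in for SIM's general score-based divergence; the real objective, weighting, and noise schedule differ, and every module name and the toy schedule below are placeholders, not this repository's code.

# Hypothetical sketch of one score-distillation step (not the SIM training code).
import torch

def distill_step(generator, online_score, teacher_score, opt_g, opt_s,
                 batch=4, dim=64, init_sigma=2.5, device="cuda"):
    # One-step generation: a single forward pass maps noise to a sample.
    z = init_sigma * torch.randn(batch, dim, device=device)
    x = generator(z)

    # Diffuse the sample to a random noise level (toy linear schedule).
    t = torch.rand(batch, device=device)
    sigma_t = (0.01 + t).view(-1, 1)
    eps = torch.randn(batch, dim, device=device)
    x_t = x + sigma_t * eps

    # (a) Denoising score matching keeps the online network tracking the
    # generator's current distribution; note the detach on x.
    dsm_loss = ((online_score(x.detach() + sigma_t * eps, t)
                 + eps / sigma_t) ** 2).mean()
    opt_s.zero_grad(); dsm_loss.backward(); opt_s.step()

    # (b) Generator update: the detached gap between the online score and the
    # frozen teacher score is pushed through x_t. This is the Diff-Instruct-style
    # special case of the divergence-gradient family that SIM generalizes.
    with torch.no_grad():
        gap = online_score(x_t, t) - teacher_score(x_t, t)
    g_loss = (gap * x_t).sum(dim=1).mean()
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()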

Released Models

We have released a model trained for more steps, with better generation quality. Please visit https://huggingface.co/maple-research-lab/SIM for the checkpoint.
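
The checkpoint can be fetched locally with the standard huggingface_hub client; passing the resulting directory as --dit_model_path is an assumption about the file layout, not documented repo behavior:

from huggingface_hub import snapshot_download

# Fetch the released SIM checkpoint from the Hugging Face Hub.
local_dir = snapshot_download(repo_id="maple-research-lab/SIM")
print(local_dir)  # candidate value for --dit_model_path in the command below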

Inference

python inference.py \
    --dit_model_path "/path/to/our_model" \
    --text_enc_path "/path/to/PixArt-alpha/t5-v1_1-xxl" \
    --vae_path "/path/to/PixArt-alpha/sd-vae-ft-ema" \
    --prompt "a colorful painting of a beautiful landscape" \
    --output_dir out-0 \
    --batch 4 --seed 112 --dtype bf16 --device cuda --init_sigma 2.5
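
For orientation, the sketch below shows what such a one-step text-to-image pipeline conceptually does with these inputs: encode the prompt with T5, run the distilled DiT once on noise drawn at init_sigma, and decode the latents with the VAE. It is not the repo's actual API; load_sim_generator, the DiT call signature, and the latent shape are hypothetical, while the T5 and VAE classes are standard Hugging Face components.

import torch
from transformers import T5EncoderModel, T5Tokenizer
from diffusers import AutoencoderKL

device, dtype = "cuda", torch.bfloat16

# Standard Hugging Face components referenced by the command above.
tok = T5Tokenizer.from_pretrained("/path/to/PixArt-alpha/t5-v1_1-xxl")
enc = T5EncoderModel.from_pretrained(
    "/path/to/PixArt-alpha/t5-v1_1-xxl", torch_dtype=dtype).to(device)
vae = AutoencoderKL.from_pretrained(
    "/path/to/PixArt-alpha/sd-vae-ft-ema", torch_dtype=dtype).to(device)
dit = load_sim_generator("/path/to/our_model").to(device)  # hypothetical loader

# Encode the prompt once with the T5 text encoder.
ids = tok("a colorful painting of a beautiful landscape",
          return_tensors="pt").input_ids.to(device)
with torch.no_grad():
    text_emb = enc(ids).last_hidden_state

# One-step generation: the DiT maps noise at init_sigma straight to latents.
torch.manual_seed(112)
z = 2.5 * torch.randn(4, 4, 64, 64, device=device, dtype=dtype)  # latent shape assumed
with torch.no_grad():
    latents = dit(z, text_emb)  # hypothetical one-call signature
    images = vae.decode(latents / vae.config.scaling_factor).sample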

License

One-Step Diffusion Distillation through Score Implicit Matching is released under the GNU Affero General Public License v3.0.

Acknowledgements

Zhengyang Geng is supported by funding from the Bosch Center for AI. Zico Kolter gratefully acknowledges Bosch’s funding for the lab.

We also acknowledge the authors of Diff-Instruct and Score identity Distillation (SiD) for their great contributions to high-quality diffusion-distillation code. We appreciate the authors of PixArt-α for making their DiT-based diffusion model public.

Collaboration

For inquiries about access to our latest models or about collaboration, please contact Guo-jun Qi: guojunq at gmail dot com

📄 Citation

@article{luo2024one,
  title={One-Step Diffusion Distillation through Score Implicit Matching},
  author={Luo, Weijian and Huang, Zemin and Geng, Zhengyang and Kolter, J Zico and Qi, Guo-jun},
  journal={arXiv preprint arXiv:2410.16794},
  year={2024}
}
