Progressive Compression with Universally Quantized Diffusion Models

Official implementation of our ICLR 2025 paper Progressive Compression with Universally Quantized Diffusion Models by Yibo Yang, Justus Will, and Stephan Mandt.

TLDR

Our new form of diffusion model, UQDM, enables practical progressive compression with an unconditional diffusion model - avoiding the computational intractability of Gaussian channel simulation by using universal quantization.
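The key trick is that universal quantization with a shared dither is distributionally equivalent to adding uniform noise, which makes the noise channel of the diffusion process directly simulable with a finite bitstream. A minimal NumPy sketch of this classical identity (illustrative only, not the repo's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)  # values to be quantized

# Shared dither: encoder and decoder draw the same u from common randomness.
u = rng.uniform(-0.5, 0.5, size=x.shape)

# Universal quantization: quantize the dithered value, then subtract the dither.
y = np.round(x + u) - u

# The reconstruction error y - x is distributed as Uniform(-0.5, 0.5),
# independent of x -- exactly as if uniform noise had been added.
err = y - x
assert np.all(np.abs(err) <= 0.5)
print(err.std())  # close to 1/sqrt(12) ~ 0.2887
```

Because the decoder knows `u`, only the integer `round(x + u)` needs to be entropy-coded, yet the end-to-end channel behaves like additive uniform noise.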

Setup

git clone https://github.com/mandt-lab/uqdm.git
cd uqdm
conda env create -f environment.yml
conda activate uqdm

To work with ImageNet64, download the npz dataset files from the official website:

  • Train(64x64) part1, Train(64x64) part2, Val(64x64)

and place them in ./data/imagenet64. During loading, our implementation removes the duplicate test images listed in ./data/imagenet64/removed.npy.
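For reference, the dedup step can be sketched as follows. This is a hypothetical illustration, assuming removed.npy stores the indices of duplicate images and the npz files use the official release's flat `data` array of shape (N, 3*64*64); the repo's actual loader may differ:

```python
import numpy as np

def load_val_dedup(npz_path, removed_path):
    """Load a downsampled-ImageNet npz file and drop duplicate images.

    Assumes (hypothetically) that removed_path stores flat indices of
    duplicates and that the npz holds a 'data' array of shape (N, 3*64*64).
    """
    with np.load(npz_path) as f:
        data = f['data']
    removed = np.load(removed_path)
    keep = np.setdiff1d(np.arange(len(data)), removed)
    # Reshape flat pixels to channel-first images.
    return data[keep].reshape(-1, 3, 64, 64)
```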

Usage

Load a pretrained model by placing its config.json and checkpoint.pt in a shared folder, for example:

from uqdm import load_checkpoint, load_data
model = load_checkpoint('checkpoints/uqdm-tiny')
train_iter, eval_iter = load_data('ImageNet64', model.config.data)

To train or evaluate, respectively, call:

model.trainer(train_iter, eval_iter)
model.evaluate(eval_iter)

To save the compressed representation of an image, and to reconstruct images from their compressed representations, use

image = next(iter(eval_iter))
compressed = model.compress(image)
reconstructions = model.decompress(compressed)
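To quantify reconstruction quality, PSNR is the standard metric. A minimal sketch (a generic helper, not part of the repo's API), assuming images are float arrays scaled to [0, 1]:

```python
import numpy as np

def psnr(x, x_hat, max_val=1.0):
    """Peak signal-to-noise ratio in dB between an image and its reconstruction."""
    x = np.asarray(x, dtype=np.float64)
    x_hat = np.asarray(x_hat, dtype=np.float64)
    mse = np.mean((x - x_hat) ** 2)
    if mse == 0:
        return float('inf')  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)
```

Evaluating PSNR at each partial bitstream length traces out the progressive rate-distortion behavior that the paper targets.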

Citation

@article{yang2025universal,
    title={Progressive Compression with Universally Quantized Diffusion Models},
    author={Yibo Yang and Justus Will and Stephan Mandt},
    journal={International Conference on Learning Representations},
    year={2025}
}