Mitigating the Noise Shift for Denoising Generative Models via Noise Awareness Guidance
Abstract
Noise Awareness Guidance (NAG) addresses noise shift in diffusion models by aligning sampling trajectories with the pre-defined noise schedule, improving generation quality.
Existing denoising generative models rely on solving discretized reverse-time SDEs or ODEs. In this paper, we identify a long-overlooked yet pervasive issue in this family of models: a misalignment between the pre-defined noise level and the actual noise level encoded in intermediate states during sampling. We refer to this misalignment as noise shift. Through empirical analysis, we demonstrate that noise shift is widespread in modern diffusion models and exhibits a systematic bias, leading to sub-optimal generation both because the denoiser is evaluated on out-of-distribution inputs and because it yields inaccurate denoising updates. To address this problem, we propose Noise Awareness Guidance (NAG), a simple yet effective correction method that explicitly steers sampling trajectories to remain consistent with the pre-defined noise schedule. We further introduce a classifier-free variant of NAG, which jointly trains a noise-conditional and a noise-unconditional model via noise-condition dropout, thereby eliminating the need for external classifiers. Extensive experiments, including ImageNet generation and various supervised fine-tuning tasks, show that NAG consistently mitigates noise shift and substantially improves the generation quality of mainstream diffusion models.
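To make the classifier-free variant described above concrete, below is a minimal, hypothetical Python sketch of how such a guidance term could enter a single Euler sampling step. The function name `nag_step`, the epsilon-prediction interface `model(x, noise_cond)`, the guidance weight `w_nag`, the dropout token `uncond_token`, and the variance-exploding update are all illustrative assumptions, not the paper's reference implementation.

```python
import torch

def nag_step(model, x_t, sigma_t, sigma_next, w_nag=1.5, uncond_token=-1.0):
    """One Euler-style denoising update with a noise-awareness correction (sketch).

    Assumes `model(x, noise_cond)` predicts the noise (epsilon) in `x`, and that
    during training the noise condition was randomly dropped (replaced by
    `uncond_token`), yielding a jointly trained noise-unconditional branch.
    """
    eps_cond = model(x_t, sigma_t)         # prediction at the scheduled noise level
    eps_uncond = model(x_t, uncond_token)  # prediction with the noise condition dropped

    # Classifier-free-style combination: amplify the component of the prediction
    # that is informed by the scheduled noise level, steering the trajectory back
    # toward states consistent with the pre-defined schedule.
    eps_guided = eps_uncond + w_nag * (eps_cond - eps_uncond)

    # Plain Euler update under a variance-exploding parameterization,
    # where x_t = x_0 + sigma_t * eps (assumption for this sketch).
    return x_t + (sigma_next - sigma_t) * eps_guided


if __name__ == "__main__":
    # Dummy epsilon-predictor for illustration only; replace with a real diffusion model.
    dummy_model = lambda x, c: torch.zeros_like(x)
    x = torch.randn(1, 3, 32, 32)
    x = nag_step(dummy_model, x, sigma_t=80.0, sigma_next=60.0)
    print(x.shape)  # torch.Size([1, 3, 32, 32])
```

In this sketch the guidance weight plays the same role as the scale in classifier-free guidance, except that the condition being amplified is the scheduled noise level rather than a class label or text prompt.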
Community
TL;DR: We identify a noise shift problem in diffusion models, where intermediate states deviate from the pre-defined noise schedule, and propose Noise Awareness Guidance (NAG) to correct it, significantly improving generation quality.
This is an automated message from the Librarian Bot. The following papers, similar to this one, were recommended by the Semantic Scholar API:
- HiGS: History-Guided Sampling for Plug-and-Play Enhancement of Diffusion Models (2025)
- NoiseShift: Resolution-Aware Noise Recalibration for Better Low-Resolution Image Generation (2025)
- Guiding Noisy Label Conditional Diffusion Models with Score-based Discriminator Correction (2025)
- CMT: Mid-Training for Efficient Learning of Consistency, Mean Flow, and Flow Map Models (2025)
- Temporal Score Rescaling for Temperature Sampling in Diffusion and Flow Models (2025)
- S2-Guidance: Stochastic Self Guidance for Training-Free Enhancement of Diffusion Models (2025)
- Flow Matching in the Low-Noise Regime: Pathologies and a Contrastive Remedy (2025)