Abstract: |
Score-based diffusion models generate new samples by learning the score associated with a diffusion process. When the score is approximated accurately, the effectiveness of these models can be justified theoretically through the differential equations that govern the sampling process. Nevertheless, empirical evidence shows that models whose neural networks use multiplicative noise conditioning can still produce high-quality samples even when their capacity is clearly insufficient to learn the correct score. We offer a theoretical explanation for this phenomenon by examining the qualitative behavior of the differential equations governing the diffusion processes, using suitable Lyapunov functions.