Diffusion Models (two papers)
Lee Sharma - UMD
Wednesday, July 20, 2022, 3:15-4:15 pm
Abstract

In this meeting, we will discuss two papers on diffusion models.

Paper 1: Deep Unsupervised Learning using Nonequilibrium Thermodynamics (2015)
Paper 1 URL: https://arxiv.org/abs/1503.03585
Paper 1 Abstract: "A central problem in machine learning involves modeling complex data-sets using highly flexible families of probability distributions in which learning, sampling, inference, and evaluation are still analytically or computationally tractable. Here, we develop an approach that simultaneously achieves both flexibility and tractability. The essential idea, inspired by non-equilibrium statistical physics, is to systematically and slowly destroy structure in a data distribution through an iterative forward diffusion process. We then learn a reverse diffusion process that restores structure in data, yielding a highly flexible and tractable generative model of the data. This approach allows us to rapidly learn, sample from, and evaluate probabilities in deep generative models with thousands of layers or time steps, as well as to compute conditional and posterior probabilities under the learned model. We additionally release an open source reference implementation of the algorithm."
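
For intuition, here is a minimal Python sketch of the forward diffusion process the abstract describes, in which structure in the data is destroyed by mixing in Gaussian noise step by step. This is an illustration under assumed names and parameters (T, betas, q_sample, and the linear noise schedule are stand-ins, not the paper's released implementation):

    import numpy as np

    T = 1000                            # number of diffusion time steps (assumed)
    betas = np.linspace(1e-4, 0.02, T)  # per-step noise variances (a common schedule)
    alpha_bars = np.cumprod(1.0 - betas)

    def q_sample(x0, t, rng=np.random.default_rng()):
        """Sample x_t from q(x_t | x_0) in closed form:
        x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * noise."""
        noise = rng.standard_normal(x0.shape)
        return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * noise

    x0 = np.ones(4)             # toy "data" point
    print(q_sample(x0, 10))     # early step: still close to x0
    print(q_sample(x0, T - 1))  # final step: essentially pure Gaussian noise

The reverse diffusion process in the paper is a learned model trained to undo these steps one at a time, which is what makes the resulting generative model both flexible and tractable.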

Paper 2: Denoising Diffusion Probabilistic Models (2020)
Paper 2 URL: https://arxiv.org/abs/2006.11239
Paper 2 Abstract: "We present high quality image synthesis results using diffusion probabilistic models, a class of latent variable models inspired by considerations from nonequilibrium thermodynamics. Our best results are obtained by training on a weighted variational bound designed according to a novel connection between diffusion probabilistic models and denoising score matching with Langevin dynamics, and our models naturally admit a progressive lossy decompression scheme that can be interpreted as a generalization of autoregressive decoding. On the unconditional CIFAR10 dataset, we obtain an Inception score of 9.46 and a state-of-the-art FID score of 3.17. On 256x256 LSUN, we obtain sample quality similar to ProgressiveGAN."
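
Continuing the sketch above (again with assumed names; eps_model stands in for the U-Net the paper trains), the simplified form of DDPM's weighted variational bound amounts to noising the data to a random timestep in closed form and regressing the network's output onto the injected noise:

    import numpy as np

    T = 1000
    betas = np.linspace(1e-4, 0.02, T)   # linear schedule used in the paper
    alpha_bars = np.cumprod(1.0 - betas)

    def ddpm_loss(eps_model, x0, rng=np.random.default_rng()):
        """Simplified objective: E_{t, x0, eps} ||eps - eps_model(x_t, t)||^2."""
        t = rng.integers(0, T)               # uniform random timestep
        eps = rng.standard_normal(x0.shape)  # noise injected into the data
        x_t = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
        return np.mean((eps - eps_model(x_t, t)) ** 2)

    # Toy usage with a trivial "model" that always predicts zero noise:
    print(ddpm_loss(lambda x_t, t: np.zeros_like(x_t), np.ones(4)))

Sampling then runs the learned reverse chain from pure noise back to data, a procedure the paper relates to annealed Langevin dynamics via its connection to denoising score matching.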

For more information and our full schedule, see our website (https://leesharma.com/physics-ai-reading-group/).

This talk is organized by Lee Sharma