Deep latent variable models for generative modelling
Speaker: Gilles Louppe (ULiège)
Abstract: Deep generative models of all kinds have recently produced
high-quality samples across a wide variety of data modalities, such as
images, speech, and text. In this tutorial, we will study the family of
deep latent variable models from the ground up. First, we will work
through a derivation of variational autoencoders using variational
inference. We will then generalize to hierarchical extensions and
finally to denoising diffusion probabilistic models.
The tutorial will include both mathematical derivations and code
demonstrations.
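As a taste of the kind of derivation the tutorial covers, the sketch below estimates the evidence lower bound (ELBO) for a toy Gaussian VAE on a single datapoint: the reparameterization trick draws z from q(z|x), the KL term against a standard-normal prior is computed in closed form, and a fixed linear "decoder" stands in for a neural network. All shapes, parameter values, and the decoder itself are illustrative assumptions, not part of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy encoder output for one datapoint x: mean and log-variance of q(z|x).
mu = np.array([0.5, -0.3])
log_var = np.array([-1.0, 0.2])

# Reparameterization trick: z = mu + sigma * eps with eps ~ N(0, I),
# so the sample is differentiable with respect to (mu, log_var).
eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * log_var) * eps

# Closed-form KL divergence KL(q(z|x) || p(z)) for a diagonal Gaussian
# posterior against the standard-normal prior p(z) = N(0, I).
kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

# Stand-in decoder: x_hat = W z, with log p(x|z) a unit-variance Gaussian.
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, -0.5]])
x = np.array([0.4, -0.2, 0.1])
x_hat = W @ z
log_px_z = -0.5 * np.sum((x - x_hat) ** 2) - 0.5 * x.size * np.log(2 * np.pi)

# Single-sample Monte Carlo estimate of the ELBO: E_q[log p(x|z)] - KL.
elbo = log_px_z - kl
print(f"KL = {kl:.4f}, ELBO estimate = {elbo:.4f}")
```

In a real VAE, mu and log_var come from an encoder network and the decoder is trained jointly by maximizing this ELBO over a dataset; the closed-form KL term is what makes the Gaussian case especially convenient.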