
AI and Natural Sciences
Feb 16 (Thu), 2023
10:00 ~ 12:00
Kim, Younggeun
Lee, Jongmin
Department of Biostatistics, Columbia University
[AI] Recent topics on conditional generative models: statistical distance of conditional distributions and identifiability
Conditional generation is the task of constructing target data given conditioning data, e.g., predicting future video frames from past frames or generating brain images from clinical characteristics. In this talk, I give a brief introduction to two important topics in conditional generative modeling: (i) statistical distances between conditional distributions and (ii) identifiability.

For (i), I show how statistical distances between conditional distributions relate to those between the corresponding joint distributions for three popular families of statistical distances: f-divergences, the Wasserstein distance, and integral probability metrics. Building on the results for the Wasserstein distance, I present a new conditional generative model that minimizes the expected Wasserstein distance between conditional distributions. It can be viewed either as an extension of Wasserstein autoencoders to conditional generation or as a Wasserstein counterpart of stochastic video generation.

For (ii), identifiability is an essential property for recovering the true latent factors that generate the data. The recently proposed identifiable variational autoencoder (iVAE) framework builds an identifiable generation structure from covariates to latent independent components (ICs) to observations. Although this identifiability is appealing, I show that iVAEs can admit local-minimum solutions in which the observations and the approximated ICs are conditionally independent given the covariates. To overcome this problem, I present a new approach based on mixtures of the encoder and posterior distributions.
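As a rough illustration of the quantity the first topic is built around (a sketch under simplifying assumptions, not the speakers' actual method), the snippet below estimates an expected 1-Wasserstein distance between conditional distributions by averaging per-condition empirical W1 distances. In one dimension, the empirical W1 between two equal-size samples is simply the mean absolute difference of their sorted values. The conditions, the Gaussian "true" and "model" conditionals, and all names here are hypothetical.

```python
import random


def w1_empirical(xs, ys):
    """1-Wasserstein distance between two equal-size empirical samples:
    the mean absolute difference of the sorted values (exact in 1D)."""
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)


random.seed(0)

# Hypothetical setup: for each condition c, the true conditional is N(c, 1)
# and a "model" generates N(c + 0.5, 1).  The expected conditional W1 is
# estimated by averaging the per-condition distances over c in {0, 1}.
conditions = [0.0, 1.0]
dists = []
for c in conditions:
    true_samples = [random.gauss(c, 1.0) for _ in range(2000)]
    model_samples = [random.gauss(c + 0.5, 1.0) for _ in range(2000)]
    dists.append(w1_empirical(true_samples, model_samples))

expected_w1 = sum(dists) / len(dists)
print(round(expected_w1, 2))  # should land near the mean shift of 0.5
```

Because each conditional differs from its model counterpart by a mean shift of 0.5, the averaged estimate concentrates near 0.5; a conditional generative model of the kind described in the abstract would minimize such an expectation over conditions.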