Seminars
- FIELD: AI and Natural Sciences
- DATE: Feb 16 (Fri), 2024
- TIME: 09:00 ~ 11:00
- PLACE: Online
- SPEAKER: Tobias Schroeder
- HOST: Yoon, Sangwoong
- INSTITUTE: Imperial College London
- TITLE: Energy Discrepancy: Efficient Training of Energy-Based Models without Scores or MCMC
- ABSTRACT: Energy-Based Models (EBMs) are unnormalised statistical models inspired by statistical physics that estimate the data-generating distribution with deep neural networks. As such, energy-based models are a powerful approach to generative modelling, with applications in downstream tasks such as inverse problems, regression, and anomaly detection. However, the widespread adoption of EBMs has been hindered by the difficulty of training them. This talk gives a brief introduction to energy-based models and standard practices for their training. We then introduce Energy Discrepancy (ED), a novel loss for training energy-based models that can be computed without scores or expensive Markov Chain Monte Carlo methods. We show that Energy Discrepancy approximates standard estimation losses such as score matching and maximum likelihood estimation under different limits, effectively interpolating between the two. Through numerical experiments, we demonstrate that ED learns low-dimensional data distributions faster and more accurately than competing approaches. For high-dimensional image data, we explore how the manifold hypothesis places limitations on our approach and demonstrate the effectiveness of Energy Discrepancy by training the energy-based model as a prior in a variational decoder model. Finally, we show how to extend our approach to discrete data modalities.
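As a rough illustration of the kind of score-free, MCMC-free objective described in the abstract, the PyTorch sketch below trains a small energy network by contrasting each data point against negatives drawn from a Gaussian perturbation of that point. This is a minimal sketch, not the speaker's implementation: the Gaussian perturbation kernel, the noise scale `t`, the number of negatives `m`, the stabilisation weight `w`, and the toy data set are illustrative assumptions, and the exact Energy Discrepancy loss may differ in detail.

```python
import math

import torch
import torch.nn as nn


class EnergyNet(nn.Module):
    """Small MLP assigning a scalar energy E(x) to low-dimensional data points."""

    def __init__(self, dim: int = 2, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)


def energy_discrepancy_loss(energy: nn.Module, x: torch.Tensor,
                            t: float = 0.1, m: int = 16, w: float = 1.0) -> torch.Tensor:
    """Monte-Carlo estimate of a Gaussian-perturbation, energy-discrepancy-style loss.

    Each data point x is perturbed to y = x + sqrt(t) * noise, and m negative
    points are drawn around y with the same noise scale. The loss contrasts
    E(x) with the energies of the negatives through a log-sum-exp, so neither
    score functions nor MCMC samples from the model are required.
    """
    n, d = x.shape
    y = x + math.sqrt(t) * torch.randn_like(x)                    # perturbed data
    neg = y.unsqueeze(1) + math.sqrt(t) * torch.randn(n, m, d, device=x.device)
    e_pos = energy(x)                                             # shape (n,)
    e_neg = energy(neg.reshape(n * m, d)).reshape(n, m)           # shape (n, m)
    contrast = e_pos.unsqueeze(1) - e_neg                         # E(x) - E(x'_j)
    # log( w/m + (1/m) * sum_j exp(E(x) - E(x'_j)) ), with w as a stabilisation term
    log_w = torch.full((n, 1), math.log(w), device=x.device)
    loss = torch.logsumexp(torch.cat([contrast, log_w], dim=1), dim=1) - math.log(m)
    return loss.mean()


if __name__ == "__main__":
    # Toy two-mode data set (hypothetical), just to make the sketch runnable.
    energy = EnergyNet()
    opt = torch.optim.Adam(energy.parameters(), lr=1e-3)
    centres = torch.tensor([[-2.0, 0.0], [2.0, 0.0]])
    for step in range(1000):
        x = centres[torch.randint(0, 2, (256,))] + 0.3 * torch.randn(256, 2)
        loss = energy_discrepancy_loss(energy, x)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

Because the negatives come from a fixed perturbation of the data rather than from the model itself, no sampler has to be run inside the training loop, which is the practical appeal highlighted in the talk title.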