Directions in ML: Latent Stochastic Differential Equations: An Unexplored Model Class [Talk]

We show how to do gradient-based stochastic variational inference in stochastic differential equations (SDEs) in a way that allows the use of adaptive SDE solvers. This lets us scalably fit a new family of richly parameterized distributions over irregularly sampled time series. We apply latent SDEs to motion-capture data and use them to demonstrate infinitely deep Bayesian neural networks. We also discuss the pros and cons of this barely explored model class, comparing it to Gaussian processes and neural processes.
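For background, a latent SDE posits a latent state z(t) evolving as dz = f(z, t) dt + g(z, t) dW, where f (drift) and g (diffusion) are parameterized by neural networks and fit by variational inference. The sketch below only illustrates simulating such a process with fixed-step Euler-Maruyama; the function names and the toy Ornstein-Uhlenbeck dynamics are illustrative assumptions, not the adaptive solvers or learned dynamics from the talk:

```python
import numpy as np

def euler_maruyama(f, g, z0, t0, t1, n_steps, rng):
    """Simulate dz = f(z, t) dt + g(z, t) dW with fixed-step Euler-Maruyama."""
    dt = (t1 - t0) / n_steps
    z = np.array(z0, dtype=float)
    path = [z.copy()]
    t = t0
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=z.shape)  # Brownian increment
        z = z + f(z, t) * dt + g(z, t) * dW
        t += dt
        path.append(z.copy())
    return np.stack(path)

# Toy latent dynamics: an Ornstein-Uhlenbeck process, standing in for the
# neural-network drift and diffusion a latent SDE would learn.
drift = lambda z, t: -0.5 * z                    # pulls the state toward zero
diffusion = lambda z, t: 0.3 * np.ones_like(z)   # constant noise scale

rng = np.random.default_rng(0)
path = euler_maruyama(drift, diffusion, z0=[1.0, -1.0],
                      t0=0.0, t1=1.0, n_steps=100, rng=rng)
print(path.shape)  # (101, 2): 100 steps plus the initial state
```

In a latent SDE model, observed time-series values at arbitrary (possibly irregular) times are read off such a continuous-time latent path, which is why adaptive solvers and irregular sampling fit naturally into this model class.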

Learn more about the 2020-2021 Directions in ML: AutoML and Automating Algorithms virtual speaker series.

Speaker Bio

David Duvenaud is an assistant professor of computer science at the University of Toronto. His research focuses on continuous-time models, latent-variable models, and deep learning. He completed his postdoc at Harvard University and his Ph.D. at the University of Cambridge. David is a founding member of the Vector Institute for Artificial Intelligence and co-founded Invenia, an energy forecasting company.

Date:
Speaker:
David Duvenaud
Affiliation:
University of Toronto