
Towards probabilistic representation learning of continuous-time dynamics

Resource type
Thesis
Thesis type
(Thesis) Ph.D.
Date created
2023-02-28
Authors/Contributors
Author: Deng, Ruizhi
Abstract
This dissertation studies the problem of learning continuous-time dynamics from discrete observations using probabilistic approaches. We first propose a normalizing flow model for continuous-time stochastic processes that transforms simple Wiener processes into more complex processes using indexed invertible mappings. This design permits efficient sampling and exact density evaluation of the transformed processes on arbitrary time grids. We further propose to augment the indexed normalizing flows with an expressive, generic neural stochastic differential equation that models the latent dynamics; the indexed normalizing flows decode trajectories of the latent dynamics into continuous processes in the observation space. A piece-wise treatment of the latent and variational posterior processes is proposed for optimization and inference. The thesis concludes with an extension of the popular particle filtering method to continuous-time models based on latent neural stochastic differential equations, using the same piece-wise treatment and importance weighting. The proposed particle filtering method can serve as a drop-in replacement for inference methods based on variational approximations and shows consistent improvements across models and tasks.
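As a rough illustration of the first idea in the abstract, the minimal NumPy sketch below samples a standard Wiener process on an arbitrary time grid, pushes it through a toy time-indexed invertible map, and evaluates the exact log-density of the result via the change-of-variables formula. The map indexed_affine_flow and its parameters mu(t) and sigma(t) are hypothetical stand-ins for the learned indexed normalizing flows described in the thesis, not the author's actual model.

import numpy as np

def sample_wiener(t_grid, rng):
    # Sample a standard Wiener process at an arbitrary sorted time grid.
    dt = np.diff(t_grid, prepend=0.0)
    return np.cumsum(rng.normal(scale=np.sqrt(dt)))

def indexed_affine_flow(t_grid, w):
    # Toy time-indexed invertible map F_t(w) = mu(t) + sigma(t) * w.
    # mu and sigma are hypothetical placeholders for a learned indexed flow.
    mu, sigma = np.sin(t_grid), 1.0 + 0.5 * t_grid
    return mu + sigma * w

def log_density(t_grid, x):
    # Exact log-density of the transformed process at x on t_grid:
    # invert the map, score the Gaussian Wiener increments, and add the
    # change-of-variables (log-Jacobian) correction.
    mu, sigma = np.sin(t_grid), 1.0 + 0.5 * t_grid
    w = (x - mu) / sigma                       # invert F_t pointwise
    dt = np.diff(t_grid, prepend=0.0)
    dw = np.diff(w, prepend=0.0)               # W(0) = 0, so the first increment is w[0]
    log_p_wiener = np.sum(-0.5 * dw**2 / dt - 0.5 * np.log(2.0 * np.pi * dt))
    return log_p_wiener - np.sum(np.log(sigma))

rng = np.random.default_rng(0)
t = np.array([0.3, 0.7, 1.0, 2.5])             # arbitrary time grid
x = indexed_affine_flow(t, sample_wiener(t, rng))
print(log_density(t, x))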
Document
Extent
110 pages.
Identifier
etd22374
Copyright statement
Copyright is held by the author(s).
Permissions
This thesis may be printed or downloaded for non-commercial research and scholarly purposes.
Supervisor or Senior Supervisor
Thesis advisor: Mori, Greg
Language
English
Member of collection
Download file: etd22374.pdf (80.31 MB)
