12/07/2020

Learning From Irregularly-Sampled Time Series: A Missing Data Perspective

Steven Cheng-Xian Li, Benjamin Marlin

Keywords: Deep Learning - Generative Models and Autoencoders

Abstract: Irregularly-sampled time series occur in many domains including healthcare. They can be challenging to model because they do not naturally yield a fixed-dimensional representation as required by many standard machine learning models. In this paper, we consider irregular sampling from the perspective of missing data. We model observed irregularly sampled time series data as a sequence of index-value pairs sampled from a continuous but unobserved function. We introduce an encoder-decoder framework for learning from such generic indexed sequences. We propose learning methods for this framework based on variational autoencoders and generative adversarial networks. We focus on the continuous-time case and introduce continuous convolutional layers that can interface with existing neural network architectures. We investigate two applications of this framework: interpolation and time series classification. Experiments show that our models are able to achieve competitive or better classification results on irregularly sampled multivariate time series classification tasks compared to recent RNN models while offering significantly faster training times.
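The core idea in the abstract — treating an irregularly sampled series as (index, value) pairs drawn from an unobserved continuous function, then aggregating them with a continuous convolution to obtain a fixed-dimensional representation — can be sketched as follows. This is an illustrative example only, not the authors' implementation: the RBF kernel, bandwidth, and function names are assumptions, and the paper's actual layers use learnable kernels inside an encoder-decoder network.

```python
import numpy as np

def rbf_kernel(dt, bandwidth=0.5):
    """Smoothing kernel over time differences (hypothetical choice;
    the paper's continuous convolutional layers learn their kernels)."""
    return np.exp(-(dt / bandwidth) ** 2)

def continuous_conv(times, values, query_times, bandwidth=0.5):
    """Kernel-weighted aggregation of irregular (time, value) pairs
    evaluated at arbitrary query times."""
    dt = query_times[:, None] - times[None, :]      # (Q, N) time differences
    w = rbf_kernel(dt, bandwidth)                   # (Q, N) kernel weights
    w = w / (w.sum(axis=1, keepdims=True) + 1e-8)   # normalize per query point
    return w @ values                               # (Q,) aggregated values

# Irregular observations of an unobserved continuous function
times = np.array([0.1, 0.35, 0.4, 0.9, 1.7])
values = np.sin(2 * np.pi * times)

# Evaluating on a fixed grid yields the fixed-dimensional representation
# that standard downstream models (e.g. classifiers) require.
grid = np.linspace(0.0, 2.0, 8)
encoding = continuous_conv(times, values, grid)
print(encoding.shape)  # (8,)
```

Because the query grid is fixed regardless of how many observations arrive or when, the output interfaces directly with ordinary fixed-input architectures, which is the property the abstract highlights.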

Talk and paper published at the ICML 2020 virtual conference.
