Neural Controlled Differential Equations for Irregular Time Series
(NeurIPS 2020 Spotlight)
[arXiv, YouTube]

Building on the well-understood mathematical theory of controlled differential equations, we demonstrate how to construct models that:

  • Act directly on irregularly-sampled partially-observed multivariate time series.
  • May be trained with memory-efficient adjoint backpropagation - even across observations.
  • Demonstrate state-of-the-art performance.

They are straightforward to implement and evaluate using existing tools, in particular PyTorch and the torchcde library.


Library

See torchcde.

Example

We encourage looking at example.py, which demonstrates how to use the library to train a Neural CDE model to predict the chirality of a spiral.
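If you just want a feel for the overall structure of such a model before reading example.py, the following is a minimal sketch of that setup: a small vector field, an initial network, a linear readout on the terminal hidden state, and a binary cross-entropy training loop. The CDEFunc and NeuralCDEClassifier classes, the toy data, and the hyperparameters here are illustrative stand-ins rather than the actual contents of example.py.

import torch
import torchcde

# Vector field f_theta: maps the hidden state z to a matrix of shape
# (hidden_channels, input_channels).
class CDEFunc(torch.nn.Module):
    def __init__(self, input_channels, hidden_channels):
        super().__init__()
        self.input_channels = input_channels
        self.hidden_channels = hidden_channels
        self.linear = torch.nn.Linear(hidden_channels,
                                      hidden_channels * input_channels)

    def forward(self, t, z):
        return self.linear(z).tanh().view(*z.shape[:-1],
                                          self.hidden_channels,
                                          self.input_channels)

# Initial network + CDE solve + linear readout on the terminal hidden state.
class NeuralCDEClassifier(torch.nn.Module):
    def __init__(self, input_channels, hidden_channels):
        super().__init__()
        self.initial = torch.nn.Linear(input_channels, hidden_channels)
        self.func = CDEFunc(input_channels, hidden_channels)
        self.readout = torch.nn.Linear(hidden_channels, 1)

    def forward(self, coeffs):
        X = torchcde.NaturalCubicSpline(coeffs)
        z0 = self.initial(X.evaluate(X.interval[0]))
        zt = torchcde.cdeint(X=X, func=self.func, z0=z0, t=X.interval)
        return self.readout(zt[:, -1]).squeeze(-1)

# Toy data: random paths with time included as the first channel, and
# arbitrary binary labels. example.py uses spirals with chirality labels instead.
batch, length, input_channels, hidden_channels = 32, 20, 3, 8
t = torch.linspace(0, 1, length).unsqueeze(0).unsqueeze(-1).expand(batch, length, 1)
x = torch.cat([t, torch.randn(batch, length, input_channels - 1)], dim=2)
y = torch.randint(0, 2, (batch,)).float()

model = NeuralCDEClassifier(input_channels, hidden_channels)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
coeffs = torchcde.natural_cubic_spline_coeffs(x)

for epoch in range(10):
    optimizer.zero_grad()
    loss = torch.nn.functional.binary_cross_entropy_with_logits(model(coeffs), y)
    loss.backward()
    optimizer.step()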

Also see irregular_data.py for demonstrations of how to handle variable-length inputs, irregular sampling, and missing data, none of which require changing the model.
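As a minimal sketch of the missing-data case: one way to represent missing observations is to fill them in with NaN and include the observation times as a channel; the interpolation step then produces a continuous path and nothing downstream needs to change. (The exact conventions used, including those for variable-length batches, are spelled out in irregular_data.py; the values below are made up for illustration.)

import torch
import torchcde

# One irregularly-sampled series: 6 observations, 2 channels (time + one value),
# with two of the values missing (NaN).
t = torch.tensor([0.0, 0.1, 0.4, 0.45, 0.8, 1.0])                        # irregular times
values = torch.tensor([0.2, float('nan'), 0.5, 0.6, float('nan'), 0.1])  # NaN = missing
x = torch.stack([t, values], dim=-1).unsqueeze(0)                        # shape (1, 6, 2)

# The interpolation handles the NaNs; the model itself is unchanged.
coeffs = torchcde.natural_cubic_spline_coeffs(x)
X = torchcde.NaturalCubicSpline(coeffs)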

A short, self-contained example:

import torch
import torchcde

# Create some data
batch, length, input_channels = 1, 10, 2
hidden_channels = 3
t = torch.linspace(0, 1, length)
t_ = t.unsqueeze(0).unsqueeze(-1).expand(batch, length, 1)
x_ = torch.rand(batch, length, input_channels - 1)
x = torch.cat([t_, x_], dim=2)  # include time as a channel

# Interpolate it
coeffs = torchcde.natural_cubic_spline_coeffs(x)
X = torchcde.NaturalCubicSpline(coeffs)

# Create the CDE vector field. It maps the hidden state z to a matrix of
# shape (hidden_channels, input_channels), which is matrix-multiplied
# against the derivative of the control path X.
class F(torch.nn.Module):
    def __init__(self):
        super(F, self).__init__()
        self.linear = torch.nn.Linear(hidden_channels,
                                      hidden_channels * input_channels)

    def forward(self, t, z):
        return self.linear(z).view(batch, hidden_channels, input_channels)

func = F()
z0 = torch.rand(batch, hidden_channels)

# Integrate it: solve the CDE driven by X, starting from z0, over X.interval
zt = torchcde.cdeint(X=X, func=func, z0=z0, t=X.interval)
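The tensor zt returned by torchcde.cdeint contains the solution at each time in t; with t=X.interval (just the initial and terminal times) it has shape (batch, 2, hidden_channels), and zt[:, -1] is the terminal hidden state that would typically be fed to a readout layer, as in example.py. cdeint also exposes an adjoint argument controlling the memory-efficient adjoint backpropagation mentioned above.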

Reproducing experiments

Everything needed to reproduce the experiments of the paper can be found in the experiments folder; see there for details.

Results

Neural CDEs achieve state-of-the-art performance across a range of benchmarks; see the paper for the full results tables on each dataset.

Citation

@article{kidger2020neuralcde,
    title={{N}eural {C}ontrolled {D}ifferential {E}quations for {I}rregular {T}ime {S}eries},
    author={Kidger, Patrick and Morrill, James and Foster, James and Lyons, Terry},
    journal={Advances in Neural Information Processing Systems},
    year={2020}
}