A Deterministic Approximation to Neural SDEs

Andreas Look, Melih Kandemir, Barbara Rakitsch, Jan Peters

Research output: Contribution to journal › Journal article › Research › peer-review



Neural Stochastic Differential Equations (NSDEs) model the drift and diffusion functions of a stochastic process as neural networks. While NSDEs are known to make accurate predictions, their uncertainty quantification properties have so far remained unexplored. We report the empirical finding that obtaining well-calibrated uncertainty estimates from NSDEs is computationally prohibitive. As a remedy, we develop a computationally affordable deterministic scheme that accurately approximates the transition kernel when the dynamics are governed by an NSDE. Our method introduces a bidimensional moment matching algorithm: vertical along the neural network layers and horizontal along the time direction, which benefits from an original combination of effective approximations. Our deterministic approximation of the transition kernel is applicable to both training and prediction. We observe in multiple experiments that the uncertainty calibration quality of our method can be matched by Monte Carlo sampling only at high computational cost. Thanks to the numerical stability of deterministic training, our method also improves prediction accuracy.
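To illustrate the "horizontal" (along-time) direction of the moment matching described above, the sketch below deterministically propagates a Gaussian approximation of the state through an Euler-Maruyama discretization of a 1-D SDE. The drift `f` and diffusion `g` are hypothetical placeholders (in the paper they are neural networks), and the first-order linearization used for the variance update is one simple choice of approximation, not necessarily the exact scheme the authors combine.

```python
import numpy as np

def f(x):
    # Hypothetical drift; in an NSDE this would be a neural network.
    return -0.5 * x

def g(x):
    # Hypothetical diffusion; in an NSDE this would also be a neural network.
    return 0.2 * np.ones_like(x)

def moment_step(m, P, dt):
    """One deterministic moment-matching step for dx = f(x)dt + g(x)dW.

    Instead of sampling trajectories, propagate the mean m and
    variance P of a Gaussian state approximation through time.
    """
    eps = 1e-6
    # Finite-difference Jacobian of the drift around the mean
    # (first-order linearization; scalar state here).
    F = (f(m + eps) - f(m - eps)) / (2 * eps)
    m_next = m + f(m) * dt                       # mean update
    P_next = P + (2 * F * P + g(m) ** 2) * dt    # variance update
    return m_next, P_next

# Propagate the transition kernel approximation over 100 steps.
m, P = np.array([1.0]), np.array([0.5])
for _ in range(100):
    m, P = moment_step(m, P, dt=0.01)
```

A Monte Carlo estimate of the same transition kernel would need many simulated trajectories per step; the deterministic recursion above costs one drift and one diffusion evaluation per step, which is the efficiency argument the abstract makes.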
Original language: English
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Issue number: 4
Pages (from-to): 4023-4037
Publication status: Published - 1 Apr 2023


  • cs.LG
  • stat.ML
  • Costs
  • Monte Carlo methods
  • Uncertainty
  • Neural stochastic differential equations
  • Stochastic processes
  • Calibration
  • Computational efficiency
  • Moment matching
  • Kernel
  • Uncertainty propagation

