2 code implementations • 10 May 2024 • Andraž Jelinčič, James Foster, Patrick Kidger
With the aim of using high-order SDE solvers adaptively, we extend the VBT to generate the integrals of Brownian motion that such solvers require, in addition to the Brownian increments.
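The recursion at the heart of a Virtual Brownian Tree is conditional midpoint sampling of a Brownian bridge. A minimal NumPy sketch of that rule (illustrative only; unlike a real VBT it does not use splittable random keys, so repeated queries of the same path are not reproducible, and it omits the Lévy-area integrals the paper adds):

```python
import numpy as np

def bridge_midpoint(rng, w_s, w_t, s, t):
    """Sample W at the midpoint of [s, t], given W(s)=w_s and W(t)=w_t.

    Conditional on its endpoints, the midpoint of a Brownian bridge is
    Gaussian with mean (w_s + w_t)/2 and variance (t - s)/4.
    """
    mean = 0.5 * (w_s + w_t)
    std = np.sqrt((t - s) / 4.0)
    return mean + std * rng.standard_normal()

def query(rng, t_query, s=0.0, t=1.0, w_s=0.0, w_t=None, depth=20):
    """Approximate W(t_query) by repeated dyadic bisection of [s, t]."""
    if w_t is None:
        # Endpoint value: W(t) - W(s) ~ N(0, t - s).
        w_t = np.sqrt(t - s) * rng.standard_normal()
    for _ in range(depth):
        m = 0.5 * (s + t)
        w_m = bridge_midpoint(rng, w_s, w_t, s, t)
        if t_query < m:
            t, w_t = m, w_m
        else:
            s, w_s = m, w_m
    return 0.5 * (w_s + w_t)
```

A true VBT keys each bisection off a deterministic tree of random seeds, so any query at any tolerance reproduces the same underlying path.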
no code implementations • 4 Aug 2023 • Andraž Jelinčič, Jiajie Tao, William F. Turner, Thomas Cass, James Foster, Hao Ni
In this paper, we propose LévyGAN, a deep-learning-based model for generating approximate samples of Lévy area conditional on a Brownian increment.
2 code implementations • NeurIPS 2021 • Patrick Kidger, James Foster, Xuechen Li, Terry Lyons
This reduces computational cost (giving up to a 1.87× speedup) and removes the numerical truncation errors associated with the gradient penalty.
1 code implementation • 25 Mar 2021 • David Smith, Frederik Geth, Elliott Vercoe, Andrew Feutrill, Ming Ding, Jonathan Chan, James Foster, Thierry Rakotoarivelo
For the modeling, design and planning of future energy transmission networks, it is vital for stakeholders to access faithful and useful power flow data, while provably maintaining the business confidentiality of service providers.
no code implementations • 19 Feb 2021 • James Foster, Karen Habermann
We study approximations for the Lévy area of Brownian motion which are based on the Fourier series expansion and a polynomial expansion of the associated Brownian bridge.
Probability · Numerical Analysis · Number Theory · MSC: 60F05, 60H35, 60J65, 41A10, 42A10, 11M06
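The Lévy area of a two-dimensional Brownian motion (W¹, W²) is A_t = ½ ∫₀ᵗ (W¹ dW² − W² dW¹). A NumPy sketch that approximates it from increments on a fine grid (a naive left-point discretisation for illustration, not the paper's Fourier or polynomial expansions):

```python
import numpy as np

def levy_area(dw1, dw2):
    """Approximate A = 0.5 * int (W1 dW2 - W2 dW1) by a left-point Riemann sum.

    dw1, dw2: Brownian increments of the two coordinates on a uniform grid.
    """
    # Path values at the left endpoint of each subinterval.
    w1 = np.concatenate([[0.0], np.cumsum(dw1)])[:-1]
    w2 = np.concatenate([[0.0], np.cumsum(dw2)])[:-1]
    return 0.5 * np.sum(w1 * dw2 - w2 * dw1)
```

A quick sanity check is the classical identity E[A_t²] = t²/4: Monte Carlo estimates of Var(A_1) over many simulated paths should be close to 0.25.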
1 code implementation • 6 Feb 2021 • Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser, Terry Lyons
Stochastic differential equations (SDEs) are a staple of mathematical modelling of temporal dynamics.
no code implementations • 1 Jan 2021 • Patrick Kidger, James Foster, Xuechen Li, Harald Oberhauser, Terry Lyons
Several authors have introduced Neural Stochastic Differential Equations (Neural SDEs), often involving complex theory with various limitations.
no code implementations • 28 Sep 2020 • James Morrill, Patrick Kidger, Cristopher Salvi, James Foster, Terry Lyons
Neural Controlled Differential Equations (Neural CDEs) are the continuous-time analogue of an RNN, just as Neural ODEs are analogous to ResNets.
3 code implementations • 17 Sep 2020 • James Morrill, Cristopher Salvi, Patrick Kidger, James Foster, Terry Lyons
Neural controlled differential equations (CDEs) are the continuous-time analogue of recurrent neural networks, as Neural ODEs are to residual networks, and offer a memory-efficient continuous-time way to model functions of potentially irregular time series.
Ranked #4 on Time Series Classification on EigenWorms
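Concretely, a CDE evolves a hidden state as dz = f_θ(z) dX_t, driven by the data path X rather than by time alone. A toy Euler discretisation in NumPy (the vector field, shapes and initialisation here are hypothetical illustrations, not the paper's architecture):

```python
import numpy as np

def neural_cde_euler(z0, X, f):
    """Euler discretisation of the CDE dz = f(z) dX.

    z0: initial hidden state, shape (h,)
    X:  control path, shape (n, d) -- e.g. time-augmented observations
    f:  vector field mapping a state (h,) to an (h, d) matrix
    """
    z = z0
    for k in range(len(X) - 1):
        # The state update is driven by the increment of the control path.
        z = z + f(z) @ (X[k + 1] - X[k])
    return z

def make_vector_field(rng, h, d):
    """A tiny one-layer tanh 'neural' vector field (hypothetical)."""
    W = rng.standard_normal((h * d, h)) * 0.1
    b = rng.standard_normal(h * d) * 0.1
    return lambda z: np.tanh(W @ z + b).reshape(h, d)
```

Because the update is driven by increments of X, irregular or partially observed sampling times are handled by interpolating X, which is the setting the Neural CDE papers target.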
4 code implementations • 26 Jun 2020 • Cristopher Salvi, Thomas Cass, James Foster, Terry Lyons, Weixin Yang
Recently, there has been an increased interest in the development of kernel methods for learning with sequential data.
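One such kernel is the signature kernel, the inner product of two paths' signatures. For piecewise-linear paths, a depth-2 truncation has a simple closed form; a NumPy sketch (an illustrative truncation only, not the untruncated kernel studied in the paper):

```python
import numpy as np

def sig_level2(path):
    """Signature of a piecewise-linear path, truncated at depth 2.

    path: array of shape (n, d).
    Returns the level-1 vector (d,) and level-2 matrix (d, d).
    """
    inc = np.diff(path, axis=0)              # increments, shape (n-1, d)
    s1 = inc.sum(axis=0)                     # level 1: total increment
    # Running path value at the left endpoint of each linear piece.
    w = np.concatenate([np.zeros((1, inc.shape[1])), np.cumsum(inc, axis=0)])[:-1]
    # Level 2: sum_{k<l} inc_k (x) inc_l + 0.5 * sum_k inc_k (x) inc_k.
    s2 = w.T @ inc + 0.5 * inc.T @ inc
    return s1, s2

def truncated_sig_kernel(x, y):
    """Depth-2 truncated signature kernel: 1 + <S1x, S1y> + <S2x, S2y>_F."""
    s1x, s2x = sig_level2(x)
    s1y, s2y = sig_level2(y)
    return 1.0 + s1x @ s1y + np.sum(s2x * s2y)
```

A useful correctness check is the shuffle identity S² + (S²)ᵀ = S¹ ⊗ S¹, which the closed form above satisfies exactly.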
5 code implementations • NeurIPS 2020 • Patrick Kidger, James Morrill, James Foster, Terry Lyons
The resulting neural controlled differential equation model is directly applicable to the general setting of partially-observed irregularly-sampled multivariate time series, and (unlike previous work on this problem) it may utilise memory-efficient adjoint-based backpropagation even across observations.