
Ensemble Kalman Inversion for Sparse Learning of Dynamical Systems from Time-Averaged Data

Updated: Aug 19, 2022

T. Schneider, A. Stuart, J.-L. Wu

Accepted by the Journal of Computational Physics; preprint at arXiv:2007.06175 (2022).


Enforcing sparse structure within learning has led to significant advances in purely data-driven discovery of dynamical systems. However, such methods typically require access not only to time series of the state of the dynamical system, but also to its time derivative. In many applications the data are available only in the form of time averages, such as moments and autocorrelation functions.

We propose a sparse learning methodology to discover the vector fields defining a (possibly stochastic or partial) differential equation, using time-averaged statistics derived from time-series data.
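As a rough illustration of the kind of data we have in mind, here is a minimal sketch (not the paper's code) of forming time-averaged statistics from a sampled trajectory; the choice of moments, lags, and burn-in are placeholders chosen for illustration only.

```python
import numpy as np

def time_averaged_statistics(x, lags=(1, 5, 10), burn_in=0):
    """x : (T, d) sampled state trajectory; returns a vector of time-averaged statistics."""
    x = x[burn_in:]
    mean = x.mean(0)                      # first moments
    second = (x ** 2).mean(0)             # second moments
    xc = x - mean
    var = (xc ** 2).mean(0)
    # Lagged autocorrelations, one entry per state component and lag.
    autocorr = [(xc[:-k] * xc[k:]).mean(0) / var for k in lags]
    return np.concatenate([mean, second, *autocorr])
```

Statistics of this form play the role of the observed data in the inverse problem below.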

Such a formulation of sparse learning naturally leads to a nonlinear inverse problem, to which we apply the methodology of ensemble Kalman inversion (EKI). EKI is chosen because it can be formulated in terms of the iterative solution of quadratic optimization problems, on which sparsity is then easily imposed.
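To make the iterative structure concrete, the following is a minimal sketch of a single EKI step with perturbed observations, using a crude hard-thresholding stand-in for the sparsity constraint (the paper imposes sparsity within the quadratic optimization itself). The forward map G, data y, noise covariance Gamma, and threshold are all illustrative placeholders, not the paper's settings.

```python
import numpy as np

def eki_step(U, G, y, Gamma, rng, threshold=1e-2):
    """One EKI iteration.

    U     : (J, d) ensemble of parameter vectors (e.g. candidate coefficients
            of the vector field in a dictionary of basis functions).
    G     : callable mapping a parameter vector to predicted time-averaged statistics.
    y     : (m,) observed time-averaged statistics.
    Gamma : (m, m) observational noise covariance.
    """
    J = U.shape[0]
    Gu = np.array([G(u) for u in U])          # (J, m) forward evaluations
    dU, dG = U - U.mean(0), Gu - Gu.mean(0)
    C_ug = dU.T @ dG / J                      # (d, m) parameter-output cross-covariance
    C_gg = dG.T @ dG / J                      # (m, m) output covariance
    # Perturbed observations, one independent draw per ensemble member.
    Y = y + rng.multivariate_normal(np.zeros(len(y)), Gamma, size=J)
    K = C_ug @ np.linalg.inv(C_gg + Gamma)    # Kalman-type gain
    U_new = U + (Y - Gu) @ K.T                # minimizer of a quadratic objective
    # Sparsity surrogate: zero out small coefficients after the update.
    U_new[np.abs(U_new) < threshold] = 0.0
    return U_new
```

Because the update solves a quadratic problem in the ensemble space, it is derivative-free with respect to the forward map, which is what makes time-averaged statistics a convenient choice of data.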

We then apply the EKI-based sparse learning methodology to various examples governed by stochastic differential equations (a noisy Lorenz 63 system), ordinary differential equations (the Lorenz 96 system and coalescence equations), and a partial differential equation (the Kuramoto-Sivashinsky equation).
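For readers who want to reproduce the flavor of the stochastic example, here is an illustrative sketch of generating trajectory data from a noisy Lorenz 63 system via Euler-Maruyama; the parameter values, noise level, and step size are placeholders rather than the paper's experimental settings.

```python
import numpy as np

def noisy_lorenz63(T=100.0, dt=1e-3, sigma=10.0, rho=28.0, beta=8.0 / 3.0,
                   noise=1.0, x0=(1.0, 1.0, 1.0), seed=0):
    """Euler-Maruyama simulation of Lorenz 63 with additive white noise."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    x = np.empty((n, 3))
    x[0] = x0
    for i in range(n - 1):
        xi, yi, zi = x[i]
        drift = np.array([sigma * (yi - xi),
                          xi * (rho - zi) - yi,
                          xi * yi - beta * zi])
        x[i + 1] = x[i] + dt * drift + noise * np.sqrt(dt) * rng.standard_normal(3)
    return x
```

A trajectory of this kind would be reduced to time-averaged statistics (as sketched above) before being used as data in the inversion.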

The results demonstrate that data-driven discovery of differential equations can be achieved using sparse EKI with time-averaged statistics. The proposed sparse learning methodology extends the scope of purely data-driven discovery of differential equations to previously challenging applications and data-acquisition scenarios.
