  STENCIL-NET for equation-free forecasting from data.

Maddu, S., Sturm, D., Cheeseman, B., Müller, C. L., & Sbalzarini, I. F. (2023). STENCIL-NET for equation-free forecasting from data. Scientific Reports, 13(1): 12787. doi:10.1038/s41598-023-39418-6.

Creators:
Maddu, Suryanarayana1, Author           
Sturm, Dominik1, Author
Cheeseman, Bevan1, Author           
Müller, Christian L., Author
Sbalzarini, Ivo F.1, Author           
Affiliations:
1Max Planck Institute for Molecular Cell Biology and Genetics, Max Planck Society

Content
 Abstract: We present an artificial neural network architecture, termed STENCIL-NET, for equation-free forecasting of spatiotemporal dynamics from data. STENCIL-NET works by learning a discrete propagator that is able to reproduce the spatiotemporal dynamics of the training data. This data-driven propagator can then be used to forecast or extrapolate dynamics without needing to know a governing equation. STENCIL-NET does not learn a governing equation, nor an approximation to the data themselves. It instead learns a discrete propagator that reproduces the data. It therefore generalizes well to different dynamics and different grid resolutions. By analogy with classic numerical methods, we show that the discrete forecasting operators learned by STENCIL-NET are numerically stable and accurate for data represented on regular Cartesian grids. A once-trained STENCIL-NET model can be used for equation-free forecasting on larger spatial domains and for longer times than it was trained for, as an autonomous predictor of chaotic dynamics, as a coarse-graining method, and as a data-adaptive de-noising method, as we illustrate in numerical experiments. In all tests, STENCIL-NET generalizes better and is computationally more efficient, both in training and inference, than neural network architectures based on local (CNN) or global (FNO) nonlinear convolutions.
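The core idea of the abstract — a learned discrete propagator that maps local stencil neighborhoods on a regular grid to the next time step — can be sketched as follows. This is a toy illustration only, not the authors' STENCIL-NET architecture: the stencil width, MLP layer sizes, weight initialization, and residual (forward-Euler-like) update are all assumptions made for the sketch.

```python
import numpy as np

def stencil_propagator(u, w1, b1, w2, b2):
    """One forward step of a toy learned propagator.

    For each grid point, gather a 3-point stencil of neighbors
    (periodic boundary), pass it through a tiny MLP, and add the
    result to the current state (residual, forward-Euler-like form).
    Because the same stencil network is applied at every grid point,
    the operator is resolution- and domain-size-agnostic by construction.
    """
    # Gather stencil neighborhoods: shape (n_points, 3) via periodic shifts
    patches = np.stack([np.roll(u, 1), u, np.roll(u, -1)], axis=1)
    hidden = np.tanh(patches @ w1 + b1)   # (n_points, n_hidden)
    increment = hidden @ w2 + b2          # (n_points, 1)
    return u + increment.ravel()

rng = np.random.default_rng(0)
n, n_hidden = 64, 8
u = np.sin(2 * np.pi * np.arange(n) / n)  # initial state on a periodic 1D grid
w1 = 0.1 * rng.standard_normal((3, n_hidden))
b1 = np.zeros(n_hidden)
w2 = 0.1 * rng.standard_normal((n_hidden, 1))
b2 = np.zeros(1)

# Roll the (here untrained) propagator forward in time, equation-free:
for _ in range(10):
    u = stencil_propagator(u, w1, b1, w2, b2)
```

In the paper, the weights would be trained so that rolling the propagator forward reproduces the training trajectories; here they are random, so the sketch only shows the data flow. Note how forecasting on a larger domain needs no retraining: only the length of `u` changes, not the per-point stencil network.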

Details
Dates: 2023-08-07
Publication Status: Issued
Identifiers: DOI: 10.1038/s41598-023-39418-6
Other: cbg-8592
PMID: 37550328

Source 1

Title: Scientific Reports
Other: Sci Rep
Source Genre: Journal
Volume / Issue: 13 (1)
Sequence Number: 12787