RECTIFIED DEEP NEURAL NETWORKS OVERCOME THE CURSE OF DIMENSIONALITY IN THE NUMERICAL APPROXIMATION OF GRADIENT-DEPENDENT SEMILINEAR HEAT EQUATIONS

Ariel Neufeld, Tuan Anh Nguyen

Research output: Contribution to journal › Article › peer-review

Abstract

Numerical experiments indicate that deep learning algorithms overcome the curse of dimensionality when approximating solutions of semilinear PDEs. For certain linear PDEs and semilinear PDEs with gradient-independent nonlinearities, this has also been proved mathematically, i.e., it has been shown that the number of parameters of the approximating DNN increases at most polynomially in both the PDE dimension d ∈ ℕ and the reciprocal of the prescribed accuracy ϵ ∈ (0, 1). The main contribution of this paper is to rigorously prove for the first time that deep neural networks can also overcome the curse of dimensionality in the approximation of a certain class of nonlinear PDEs with gradient-dependent nonlinearities.
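For orientation, a minimal sketch of the setting suggested by the title and abstract; the precise formulation in the paper may differ, and the symbols f, T, c, C and the parameter count P below are illustrative assumptions rather than the paper's notation. A gradient-dependent semilinear heat equation has the form

\[
\partial_t u(t,x) = \Delta_x u(t,x) + f\big(u(t,x), \nabla_x u(t,x)\big), \qquad (t,x) \in [0,T] \times \mathbb{R}^d,
\]

where the nonlinearity f depends on the gradient ∇_x u. "Overcoming the curse of dimensionality" then means, roughly, that there exist constants c, C > 0 such that for every dimension d ∈ ℕ and every accuracy ϵ ∈ (0, 1) there is a rectified (ReLU) DNN approximating the solution to accuracy ϵ whose number of parameters P(d, ϵ) satisfies

\[
P(d,\epsilon) \le C\, d^{c}\, \epsilon^{-c},
\]

i.e., the parameter count grows at most polynomially in both d and 1/ϵ, rather than exponentially in d.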

Original language: English
Pages (from-to): 883-912
Number of pages: 30
Journal: Communications in Mathematical Sciences
Volume: 23
Issue number: 4
DOIs
Publication status: Published - 2025
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2025 International Press

ASJC Scopus Subject Areas

  • General Mathematics
  • Applied Mathematics

Keywords

  • curse of dimensionality
  • gradient-dependent nonlinearity
  • multilevel Monte Carlo
  • multilevel Picard
  • partial differential equation
  • PDEs

