Deep ReLU neural networks overcome the curse of dimensionality when approximating semilinear partial integro-differential equations

Ariel Neufeld*, Tuan Anh Nguyen, Sizhou Wu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we consider nonlinear partial integro-differential equations (PIDEs) with gradient-independent, Lipschitz continuous nonlinearities and prove that deep neural networks with the ReLU activation function can approximate solutions of such semilinear PIDEs without the curse of dimensionality, in the sense that the required number of parameters in the deep neural networks grows at most polynomially in both the dimension d of the corresponding PIDE and the reciprocal of the prescribed accuracy ε.
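In informal notation (the symbols below are illustrative and not taken verbatim from the paper), the statement can be read as follows: for every dimension d ∈ ℕ and accuracy ε ∈ (0,1] there exists a ReLU network ψ_{d,ε} approximating the solution u_d of the d-dimensional semilinear PIDE to accuracy ε whose number of parameters satisfies a bound of the form

\[
\mathcal{P}(\psi_{d,\varepsilon}) \;\le\; C\, d^{c}\, \varepsilon^{-c},
\]

where C, c > 0 are constants independent of d and ε.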

Original language: English
Journal: Analysis and Applications
DOIs
Publication status: Accepted/In press - 2025
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2025 World Scientific Publishing Company.

ASJC Scopus Subject Areas

  • Analysis
  • Applied Mathematics

Keywords

  • Curse of dimensionality
  • deep neural networks
  • high-dimensional partial integro-differential equations
  • high-dimensional PDEs
  • multilevel Picard approximations
  • stochastic differential equations with jumps
  • stochastic fixed point equations
