"Data-Efficient Operator Learning for PDEs: Sparse Recovery and Deep Neural Networks"
Nick Dexter
Department of Scientific Computing,
Florida State University (FSU)
Wednesday, Sep 17, 2025
- Nespresso & Teatime - 417 DSL Commons
- 03:00 to 03:30 PM Eastern Time (US and Canada)
- Colloquium - 499 DSL Seminar Room
- 03:30 to 04:30 PM Eastern Time (US and Canada)

Join via Zoom: Meeting # 942 7359 5552
Abstract:
Operator learning arises widely in scientific computing, especially for partial differential equations (PDEs), where the operators of interest map between Banach or Hilbert spaces. We study the more general setting of Banach-valued operators, focusing on holomorphic mappings with broad applications. Two complementary approaches are considered: compressed sensing (CS) for sparse polynomial approximation, and deep neural networks (DNNs) with arbitrary encoders/decoders and standard feedforward architectures. CS exploits sparsity to achieve accurate approximations from few samples, while our DNN analysis identifies architectures with optimal generalization bounds independent of operator regularity, whose width and depth depend only on the number of training samples. We show that deep learning is minimax-optimal for this problem, i.e., no recovery method can surpass its generalization bounds beyond logarithmic factors, and that fully connected networks admit uncountably many optimal minimizers. A full error analysis accounting for approximation, discretization, and optimization errors is provided, along with numerical experiments on challenging PDE models, including parametric diffusion with and without advection and reaction terms, the Navier-Stokes-Brinkman equations, and the Boussinesq equations. Finally, we extend CS techniques for sparse orthonormal polynomial approximation to Banach-valued settings, using a GPU-accelerated restarted primal-dual algorithm for ℓ1-minimization, enabling a direct comparison of the two approaches and highlighting their respective strengths.
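As a concrete illustration of the CS idea in the abstract, the sketch below recovers a sparse polynomial coefficient vector from few random samples via ℓ1-minimization. It is a minimal, scalar-valued toy, assuming a Chebyshev basis and solved with plain ISTA (iterative soft-thresholding), not the Banach-valued setting or the GPU-accelerated restarted primal-dual solver from the talk; all problem sizes and solver parameters are illustrative.

```python
import numpy as np

# Minimal sketch (not the speaker's method): recover a sparse Chebyshev
# coefficient vector from few random samples via l1-regularized least
# squares, solved with plain ISTA. Sizes and parameters are assumptions.

rng = np.random.default_rng(0)
N, m, s = 200, 60, 5                     # basis size, samples, sparsity

# Sparse ground-truth coefficients in a Chebyshev basis
c_true = np.zeros(N)
c_true[rng.choice(N, s, replace=False)] = rng.standard_normal(s)

# Sampling matrix: Chebyshev polynomials at uniform random points,
# using T_k(x) = cos(k * arccos(x))
x = rng.uniform(-1.0, 1.0, m)
A = np.cos(np.outer(np.arccos(x), np.arange(N)))
b = A @ c_true                           # noiseless measurements

# ISTA: c <- soft_threshold(c - t * A^T (A c - b), t * lam)
t = 1.0 / np.linalg.norm(A, 2) ** 2      # step size 1 / ||A||_2^2
lam = 1e-4                               # small penalty ~ basis pursuit
c = np.zeros(N)
for _ in range(20000):
    g = c - t * (A.T @ (A @ c - b))
    c = np.sign(g) * np.maximum(np.abs(g) - t * lam, 0.0)

print("relative coefficient error:",
      np.linalg.norm(c - c_true) / np.linalg.norm(c_true))
```

With m well below N, the ℓ1 penalty drives the iterates toward the true s-sparse coefficient vector, which is the mechanism behind "accurate approximations from few samples" in the abstract.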
