The Operator Learning Problem
What problem are we solving? The spatial domain D, function spaces A and U, the operator G† that maps functions to functions, the parametric approximation, and discretization with resolution invariance. Grounded in 2D Darcy flow throughout.
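To make the operator G† concrete before the 2D case, here is a minimal sketch of a 1D Darcy analogue: a finite-difference solver that maps a coefficient function a to the solution u of -(a u')' = f with zero Dirichlet boundaries. The function name `darcy_1d` and the 1D reduction are ours, for illustration; the series itself works with 2D Darcy flow.

```python
import numpy as np

def darcy_1d(a, f):
    """Illustrative ground-truth operator G-dagger for 1D Darcy flow:
    coefficient a (sampled on a uniform grid) -> solution u of
    -(a u')' = f on (0,1) with u(0) = u(1) = 0, via centered finite
    differences. The grid size n is the discretization resolution."""
    n = len(a)
    h = 1.0 / (n - 1)
    a_half = 0.5 * (a[:-1] + a[1:])          # a at cell midpoints, length n-1
    A = np.zeros((n - 2, n - 2))             # tridiagonal system for interior nodes
    for i in range(1, n - 1):
        A[i - 1, i - 1] = (a_half[i - 1] + a_half[i]) / h**2
        if i > 1:
            A[i - 1, i - 2] = -a_half[i - 1] / h**2
        if i < n - 2:
            A[i - 1, i] = -a_half[i] / h**2
    u = np.zeros(n)
    u[1:-1] = np.linalg.solve(A, f[1:-1])
    return u

# sanity check against a known solution: a = 1, f = pi^2 sin(pi x) => u = sin(pi x)
x = np.linspace(0.0, 1.0, 101)
a = np.ones(101)
f = np.pi**2 * np.sin(np.pi * x)
u = darcy_1d(a, f)
err = np.max(np.abs(u - np.sin(np.pi * x)))   # small O(h^2) discretization error
```

Note that the same function works unchanged at any resolution n; resolution invariance asks the *learned* surrogate to share this property.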
The Neural Operator Architecture
The lift–iterate–project pipeline. The lifting operator P, the iterative kernel integral layer (Definition 1), the kernel integral operator (Definition 2), the projection Q, and the full forward pass with concrete dimensions.
From Kernels to Fourier Space
Brief Fourier refresher, the convolution theorem, translation-invariant kernels, the key insight (Definition 3): learn R directly in Fourier space. Concrete tensor shapes and mode truncation.
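The convolution theorem that motivates learning R in Fourier space can be checked numerically in a few lines: on a periodic grid, circular convolution in physical space equals pointwise multiplication of Fourier coefficients. This is a verification sketch of the identity, not code from the series.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 32
f = rng.standard_normal(n)
g = rng.standard_normal(n)

# circular convolution computed directly from the definition
direct = np.array([sum(f[j] * g[(i - j) % n] for j in range(n)) for i in range(n)])

# convolution theorem: FFT, multiply pointwise, inverse FFT
via_fft = np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)).real

print(np.allclose(direct, via_fft))  # True
```

A translation-invariant kernel makes the kernel integral operator exactly such a convolution, which is why its action can be parameterized by a learned multiplier R acting mode-by-mode in Fourier space.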
The Complete FNO Layer & Forward Pass
The full Fourier layer equation, two parallel paths (global Fourier + local W), the discrete FFT implementation, a complete forward-pass walkthrough with dimensions, and complexity analysis.
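The two parallel paths of the Fourier layer can be sketched in NumPy for the 1D case. This is a minimal illustration, not the reference implementation: `R`, `W`, and `modes` follow the section's notation, the shapes are toy values, and ReLU stands in for the GeLU activation for brevity.

```python
import numpy as np

def fourier_layer(v, R, W, modes):
    """One simplified 1D FNO layer: global spectral path + local linear path.
    v: (n_grid, width) real input function values
    R: (modes, width, width) complex learned spectral weights
    W: (width, width) pointwise linear weights
    modes: number of low frequencies kept (mode truncation)"""
    n, width = v.shape
    v_hat = np.fft.rfft(v, axis=0)                 # FFT along the grid: (n//2 + 1, width)
    out_hat = np.zeros_like(v_hat)
    # multiply only the lowest `modes` frequencies by R; higher modes are truncated
    out_hat[:modes] = np.einsum("kij,kj->ki", R, v_hat[:modes])
    spectral = np.fft.irfft(out_hat, n=n, axis=0)  # back to physical space
    local = v @ W                                  # local (pointwise) path
    return np.maximum(spectral + local, 0.0)       # ReLU here; GeLU in practice

# toy shapes: 64 grid points, channel width 8, keep 12 modes
rng = np.random.default_rng(0)
v = rng.standard_normal((64, 8))
R = (rng.standard_normal((12, 8, 8)) + 1j * rng.standard_normal((12, 8, 8))) / 8
W = rng.standard_normal((8, 8)) / 8
out = fourier_layer(v, R, W, modes=12)
```

The spectral path costs O(n log n) for the FFTs plus O(modes · width²) for the mode-wise matrix multiplies, which is the complexity argument the section develops.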
FNO on Real PDEs
Burgers' equation (1D), Darcy flow (2D) with full experimental results, Navier–Stokes (2D+time), zero-shot super-resolution, and an honest assessment of what FNO can and can't do.
Elastic Waves & Ultrasonic NDT
The elastic wave equation, P-waves and S-waves, Snell's law and mode conversion, the angle beam NDT setup, cracks as scatterers, and casting NDT as an operator learning problem.
FNO for Crack Detection
Crack parameterization, FNO-3D vs signal-level architectures, training data generation from COMSOL, evaluation metrics, and the challenges of applying FNO to high-frequency wave scattering.