Physics-Informed Neural Network for 2D Acoustic Scattering
PINN solver for plane-wave scattering off a rigid cylinder, validated against the exact Bessel/Hankel series solution across wavenumbers ka = 0.5 to 2π. Includes Fourier feature ablation, BGT2 absorbing boundary conditions, and a multi-scatterer honeycomb extension. Interactive results dashboard with field comparisons, training diagnostics, and error analysis.
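For reference, the exact benchmark series mentioned above is the standard sound-hard (Neumann) cylinder result; this is the textbook form, not a transcription of the project's own code:

```latex
% Plane wave e^{ikx} incident on a rigid cylinder of radius a:
% the scattered field enforcing \partial_r u = 0 at r = a is
u_{\mathrm{sc}}(r,\theta)
  = -\sum_{n=0}^{\infty} \epsilon_n\, i^n\,
    \frac{J_n'(ka)}{H_n^{(1)\prime}(ka)}\, H_n^{(1)}(kr)\cos(n\theta),
\qquad \epsilon_0 = 1,\ \epsilon_{n\ge 1} = 2,
```

with $J_n$ the Bessel and $H_n^{(1)}$ the outgoing Hankel functions; the PINN field is compared against a truncation of this sum.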
VAN Sampling of 1D Ising Model in an External Field
Variational autoregressive network for sampling spin configurations of the 1D Ising model with an external magnetic field. Investigates the bias introduced by autoregressive factorization and compares VAN free energy estimates against exact transfer-matrix results.
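The exact baseline used for comparison is the closed-form transfer-matrix free energy; a minimal sketch (function name and signature are illustrative, not the project's API):

```python
import math

def ising_free_energy(beta, J, h):
    """Exact free energy per spin of the 1D Ising model
    H = -J * sum_i s_i s_{i+1} - h * sum_i s_i
    in the thermodynamic limit: f = -(1/beta) * ln(lambda_+),
    where lambda_+ is the larger eigenvalue of the 2x2 transfer matrix."""
    lam_plus = (math.exp(beta * J) * math.cosh(beta * h)
                + math.sqrt(math.exp(2 * beta * J) * math.sinh(beta * h) ** 2
                            + math.exp(-2 * beta * J)))
    return -math.log(lam_plus) / beta
```

At zero field this reduces to the familiar $f = -\tfrac{1}{\beta}\ln(2\cosh\beta J)$, which makes a convenient sanity check.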
A Guided Introduction to Topological Deep Learning
Background for reading Hajij et al., "Topological Deep Learning: Going Beyond Graph Data." Builds from scratch: graphs → simplicial complexes → cell complexes → combinatorial complexes → higher-order message passing → Hodge theory → spectral methods. Includes worked examples, SVG diagrams, and connections to thermal physics throughout.
Discovering Graph Convolutions: From Circulant Matrices to the Graph Laplacian
How the DFT and graph convolutions emerge from the same idea — simultaneous diagonalization of a commuting algebra. Following Bamieh (2018): the DFT is discovered, not postulated. Then the same philosophy carries forward to graphs, the Laplacian, and GNNs. Includes interactive eigenvisual app and full worked examples.
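The central fact can be demonstrated in a few lines of pure Python: the DFT matrix diagonalizes every circulant matrix, including the ring-graph Laplacian. A self-contained sketch (helper names are mine, not the article's):

```python
import cmath

def circulant(c):
    """Circulant matrix with first column c: C[j][k] = c[(j - k) mod n]."""
    n = len(c)
    return [[c[(j - k) % n] for k in range(n)] for j in range(n)]

def dft_matrix(n):
    """Unitary DFT matrix F[m][k] = exp(-2*pi*i*m*k/n) / sqrt(n)."""
    w = cmath.exp(-2j * cmath.pi / n)
    s = 1 / cmath.sqrt(n)
    return [[s * w ** (m * k) for k in range(n)] for m in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def conj_transpose(A):
    return [[A[j][i].conjugate() for j in range(len(A))]
            for i in range(len(A[0]))]

# The Laplacian of a 4-node ring graph is circulant with first column [2,-1,0,-1].
C = circulant([2.0, -1.0, 0.0, -1.0])
F = dft_matrix(4)
D = matmul(matmul(F, C), conj_transpose(F))  # F C F^H should be diagonal
off_diag = max(abs(D[i][j]) for i in range(4) for j in range(4) if i != j)
```

The diagonal of `D` recovers the ring Laplacian's eigenvalues $2 - 2\cos(2\pi m/4)$, i.e. $\{0, 2, 4, 2\}$, which is exactly the "graph Fourier" spectrum the article builds toward.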
From Mean Field Theory to Variational Autoregressive Networks
Background for reading Wu, Wang & Zhang, "Solving Statistical Mechanics Using Variational Autoregressive Networks" (arXiv:1809.10606). Builds from the variational free energy principle through naïve mean field theory to the one-layer VAN architecture, with worked numerical examples, the bias question, and the REINFORCE training loop.
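The two formulas at the heart of that pipeline, in the paper's notation (the baseline symbol $b$ is my label for the standard variance-reduction term):

```latex
% Variational free energy of a model q_\theta(s) against the
% Boltzmann target e^{-\beta E(s)}/Z; an upper bound on the true F:
F_q = \frac{1}{\beta}\,\mathbb{E}_{s \sim q_\theta}
      \!\left[\beta E(s) + \ln q_\theta(s)\right] \;\ge\; F,
% REINFORCE (score-function) gradient used to minimize F_q:
\nabla_\theta F_q = \frac{1}{\beta}\,\mathbb{E}_{s \sim q_\theta}
      \!\left[\bigl(\beta E(s) + \ln q_\theta(s) - b\bigr)
      \nabla_\theta \ln q_\theta(s)\right],
```

where $b$ (e.g. the batch mean of $\beta E + \ln q_\theta$) reduces gradient variance without introducing bias.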
Physics-Informed Neural Networks (PINNs)
Embedding PDE constraints into loss functions — theory, training dynamics, failure modes, and when they actually work. See the Helmholtz PINN project for a worked example.
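The "PDE constraint in the loss" idea in one formula (generic composite loss; the weighting $\lambda$ and point counts are the usual conventions, not specific to this article):

```latex
% For a PDE N[u] = 0 with boundary/data condition u = g on \partial\Omega,
% a network u_\theta is trained on the composite loss
\mathcal{L}(\theta) =
  \underbrace{\frac{1}{N_b}\sum_{i=1}^{N_b}
    \bigl|u_\theta(x_b^i) - g(x_b^i)\bigr|^2}_{\text{boundary/data term}}
  + \lambda\,
  \underbrace{\frac{1}{N_c}\sum_{j=1}^{N_c}
    \bigl|\mathcal{N}[u_\theta](x_c^j)\bigr|^2}_{\text{PDE residual at collocation points}},
```

where the residual $\mathcal{N}[u_\theta]$ is evaluated via automatic differentiation of the network.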
Fourier Neural Operators for Parametric PDEs
Background for reading Li et al., "Fourier Neural Operator..." (arXiv:2010.08895). Sentence-by-sentence breakdown of operator learning, the neural operator architecture, and the Fourier-space parameterization, with concrete Darcy-flow examples and SVG diagrams throughout.
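The layer update the breakdown centers on, as given in Li et al.:

```latex
% One Fourier layer: a pointwise linear term plus a kernel integral
% parameterized directly in Fourier space and truncated to k_max modes:
v_{t+1}(x) = \sigma\!\Bigl( W\, v_t(x)
  + \mathcal{F}^{-1}\bigl( R_\phi \cdot \mathcal{F} v_t \bigr)(x) \Bigr),
```

where $\mathcal{F}$ is the Fourier transform over the spatial domain, $R_\phi$ the learned complex weights on the retained modes, and $W$ a local linear map.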
DeepONet & Neural Operator Theory
Universal approximation for operators, branch-trunk architectures, and the mathematical foundations connecting all neural PDE solvers.
PDEs for Machine Learning
The PDE theory neural solvers actually need: weak solutions, Sobolev spaces, variational formulations, and well-posedness.
Optimization for Deep Learning
SGD, Adam, Muon, loss landscapes, learning rate schedules, and why neural networks train at all.
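For orientation, the Adam update in its standard form (Kingma & Ba), which the notes build on:

```latex
m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad
v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2,
\hat{m}_t = \frac{m_t}{1-\beta_1^t}, \qquad
\hat{v}_t = \frac{v_t}{1-\beta_2^t}, \qquad
\theta_t = \theta_{t-1} - \alpha\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}.
```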
Autodiff & Backpropagation from Scratch
The chain rule, computational graphs, reverse-mode automatic differentiation, and building a minimal autograd engine in Python — the calculus that makes deep learning work.
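The "minimal autograd engine" idea fits in one small class: each node stores its parents and a local backward rule, and `backward()` replays the chain rule in reverse topological order. A sketch in the same spirit (not the article's actual engine):

```python
class Value:
    """Minimal scalar autograd node: data, accumulated grad, and a local rule."""

    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def rule():  # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = rule
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def rule():  # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = rule
        return out

    def backward(self):
        # Topologically sort the graph, then apply local rules in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x, y = Value(2.0), Value(3.0)
f = x * y + x        # f = xy + x, so df/dx = y + 1 and df/dy = x
f.backward()
```

After `backward()`, `x.grad == 4.0` and `y.grad == 2.0`, matching the hand-computed partials.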
Molecular Dynamics Simulations
Force fields, integration schemes, thermostats, ensembles, and the bridge to machine-learned interatomic potentials.
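The workhorse integration scheme in that toolbox is velocity Verlet, in its standard form:

```latex
x(t+\Delta t) = x(t) + v(t)\,\Delta t + \tfrac{1}{2}\,a(t)\,\Delta t^2,
v(t+\Delta t) = v(t) + \tfrac{1}{2}\,\bigl[a(t) + a(t+\Delta t)\bigr]\,\Delta t,
```

which is time-reversible and symplectic, the properties that make long MD trajectories energy-stable.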
Computational Electromagnetics
FDTD, method of moments, and how Maxwell's equations connect to the de Rham complex in topological deep learning.