A Guided Introduction to Topological Deep Learning

Background for reading Hajij et al., "Topological Deep Learning: Going Beyond Graph Data." Builds from scratch with worked examples, step-by-step matrix computations, and connections to physics.

Prerequisites: Linear algebra (matrix multiplication, eigenvalues, null spaces), basic calculus, some familiarity with neural networks. No algebraic topology background assumed — everything is built from first principles.
Part I — The Discrete de Rham Complex
01

Graphs as Combinatorial Objects

Why go beyond graphs. The adjacency matrix, signed incidence matrix, degree matrix, and graph Laplacian. Full eigendecomposition and spectral structure. The graph Fourier transform.

Sections 1–2 · 8 figures · Worked eigenvalue example · Spectral graph convolution
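The objects above can be sketched in a few lines of NumPy. This is a minimal illustration on a triangle graph (three mutually connected nodes), not the chapter's running example: build the adjacency and degree matrices, form the Laplacian, take its eigendecomposition, and read off the graph Fourier transform of a node signal.

```python
# Sketch: Laplacian spectrum of a 3-node complete graph (an illustrative
# choice, not necessarily the course's worked example).
import numpy as np

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)   # adjacency matrix
D = np.diag(A.sum(axis=1))               # degree matrix
L = D - A                                # combinatorial graph Laplacian

# L is symmetric, so eigh gives a real spectrum and orthonormal eigenvectors.
eigvals, eigvecs = np.linalg.eigh(L)
# A connected graph has a single zero eigenvalue with a constant eigenvector;
# here the spectrum is {0, 3, 3}.

x = np.array([1.0, 2.0, 3.0])            # a signal on the nodes
x_hat = eigvecs.T @ x                    # graph Fourier transform of x
x_back = eigvecs @ x_hat                 # inverse transform recovers x
```

Spectral graph convolution then amounts to rescaling the entries of `x_hat` before transforming back.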
02

Message Passing on Graphs

From feedforward neural networks to GNNs. The GCN layer deconstructed: normalized adjacency, neighborhood aggregation, linear transform, and activation — with a full numerical walkthrough on our running example.

Feedforward review · GCN equation · Â construction · Complete toy example · Receptive field
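The GCN layer described above can be sketched as follows, on a 3-node path graph with made-up features and weights (all illustrative assumptions, not the chapter's toy example): add self-loops, symmetrically normalize, aggregate, transform, and apply a nonlinearity.

```python
# Sketch of one GCN layer, H' = ReLU(Â X W), with Â = D̃^{-1/2}(A + I)D̃^{-1/2}.
# Graph, features, and weights are illustrative assumptions.
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)     # path graph 0 - 1 - 2
A_tilde = A + np.eye(3)                    # add self-loops
d = A_tilde.sum(axis=1)                    # degrees with self-loops
A_hat = A_tilde / np.sqrt(np.outer(d, d))  # symmetric normalization

X = np.array([[1.0], [0.0], [0.0]])        # one scalar feature per node
W = np.array([[2.0]])                      # 1x1 weight matrix
H = np.maximum(A_hat @ X @ W, 0.0)         # aggregate, transform, ReLU
```

Stacking k such layers grows each node's receptive field to its k-hop neighborhood, which is why node 2 still reads zero after a single layer here.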
03

Edge Signals & the Discrete Curl

Promoting the graph to a simplicial complex. The boundary matrix B₁₂, oriented face boundaries, 1-cochains as edge flows, the discrete curl, and the Hodge Laplacian L₁ on edges.

Oriented boundaries · curl(grad) = 0 · L₁ = L₁ᵈᵒʷⁿ + L₁ᵘᵖ
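The identities above can be checked numerically on the smallest interesting complex, a single filled triangle (an illustrative assumption; `B01` and `B12` name the rank-0-to-1 and rank-1-to-2 incidence matrices, following the rank-pair convention in the chapter's `B₁₂`):

```python
# Sketch: boundary matrices of a filled triangle (nodes 0,1,2; edges
# e01, e02, e12 oriented low-to-high; one face 012). Illustrative example.
import numpy as np

B01 = np.array([[-1, -1,  0],    # node 0
                [ 1,  0, -1],    # node 1
                [ 0,  1,  1]],   # node 2
               dtype=float)      # columns: e01, e02, e12
B12 = np.array([[ 1],            # boundary of face 012 = e01 - e02 + e12
                [-1],
                [ 1]], dtype=float)

# "The boundary of a boundary is empty": B01 @ B12 = 0, the matrix form
# of curl(grad) = 0.
assert np.allclose(B01 @ B12, 0)

L1_down = B01.T @ B01            # gradient-side part of the Hodge Laplacian
L1_up   = B12 @ B12.T            # curl-side part
L1 = L1_down + L1_up             # L1 = L1_down + L1_up
# For the filled triangle, L1 = 3I: no zero eigenvalues, hence no
# harmonic edge flows (the hole is filled).
```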
04

Face Signals & the Hodge Story

2-cochains on faces, the face Laplacian L₂, Betti numbers as topological invariants, the Euler characteristic, and the complete discrete de Rham complex with all operators and eigenvalues.

Betti numbers · Euler–Poincaré theorem · Full Hodge diamond
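Betti numbers as described above are null-space dimensions of the Hodge Laplacians, which makes them easy to compute. A minimal sketch on a hollow triangle (three nodes, three edges, no face; an illustrative choice):

```python
# Sketch: Betti numbers of a hollow triangle via Laplacian nullities.
import numpy as np

B01 = np.array([[-1, -1,  0],
                [ 1,  0, -1],
                [ 0,  1,  1]], dtype=float)   # nodes x edges incidence

L0 = B01 @ B01.T        # graph Laplacian on nodes
L1 = B01.T @ B01        # edge Laplacian (no up-part: there are no faces)

def nullity(M, tol=1e-10):
    """Dimension of the null space of a symmetric PSD matrix."""
    return int(np.sum(np.linalg.eigvalsh(M) < tol))

beta0 = nullity(L0)     # number of connected components
beta1 = nullity(L1)     # number of independent 1-dimensional holes
chi = 3 - 3 + 0         # Euler characteristic: nodes - edges + faces
# Euler-Poincare: chi = beta0 - beta1 + beta2, here 0 = 1 - 1 + 0.
```

Filling in the face would add an up-Laplacian term to `L1`, kill its null space, and raise χ to 1, consistent with β₁ dropping to 0.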
Part II — Topological Domains
05

Simplicial, Cell & Combinatorial Complexes

The hierarchy of topological domains: simplicial complexes (rigid triangulations), cell complexes (flexible polygons), and combinatorial complexes (the maximally general setting from Hajij et al.).

Closure properties · CC axioms · Comparison table
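The CC axioms lend themselves to a direct check. A minimal sketch, assuming the definition in Hajij et al. (cells are nonempty subsets of the vertex set, every vertex is a rank-0 cell, and rank is order-preserving under inclusion); the data structures here are illustrative, not from the paper:

```python
# Sketch: validating a combinatorial complex given as a dict mapping
# frozenset-of-vertices cells to integer ranks. Illustrative assumption.

def is_combinatorial_complex(cells):
    """Check the CC axioms for cells: dict[frozenset, int]."""
    vertices = set().union(*cells)
    # Axiom: every vertex appears as a rank-0 cell.
    if any(cells.get(frozenset({v})) != 0 for v in vertices):
        return False
    # Axiom: rank is order-preserving, x ⊂ y implies rk(x) <= rk(y).
    for x, rx in cells.items():
        for y, ry in cells.items():
            if x < y and rx > ry:
                return False
    return True

cc = {frozenset({v}): 0 for v in "abc"}
cc[frozenset("ab")] = 1
cc[frozenset("abc")] = 2   # note: the rank-2 cell need not contain all
                           # of its edges as cells, because a CC has no
                           # closure axiom, unlike a simplicial complex
```

That last comment is exactly what separates the maximally general setting from the rigid triangulations at the other end of the hierarchy.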
Part III — Neural Networks on Topological Domains
06

Signals, Neighborhoods & Message Passing

Feature spaces on cells of any rank (cochains), the three types of topological adjacency, and the general higher-order message passing (HOMP) framework.

Co-adjacency · Incidence · Upper/lower adjacency · HOMP update rule
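One HOMP update step can be sketched in matrix form: a cell's new feature aggregates messages routed through several neighborhood matrices at once, each with its own learnable weight. The complex (two nodes, one edge), features, and weights below are illustrative assumptions, and the sum-then-nonlinearity form is one simple instance of the general framework:

```python
# Sketch: one higher-order message-passing step for rank-0 cells,
# combining a node-node adjacency neighborhood with an edge-to-node
# incidence neighborhood. All matrices and weights are illustrative.
import numpy as np

A0  = np.array([[0, 1], [1, 0]], dtype=float)   # node-node adjacency
B01 = np.array([[-1], [1]], dtype=float)        # node-edge incidence

H_nodes = np.array([[1.0], [2.0]])              # rank-0 features
H_edges = np.array([[0.5]])                     # rank-1 features
W_adj = np.array([[1.0]])                       # weight per neighborhood
W_inc = np.array([[1.0]])

# One message per neighborhood type, summed, then a pointwise nonlinearity.
# |B01| routes edge features to their endpoint nodes regardless of sign.
update = A0 @ H_nodes @ W_adj + np.abs(B01) @ H_edges @ W_inc
H_nodes_new = np.tanh(update)
```

Replacing `A0` with upper/lower adjacency or co-adjacency matrices of other ranks gives the corresponding higher-order variants.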
07

Hodge Theory & the Architecture Zoo

The Hodge Laplacian hierarchy, spectral methods for higher-order signals, and a map of existing topological neural network architectures: SNN, MPSN, CWN, CAN, CTNN, and more.

Hodge decomposition · Architecture landscape · Hajij unification
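The Hodge decomposition that underpins these spectral methods splits any edge flow into gradient, curl, and harmonic parts. A minimal sketch on a filled triangle with a made-up flow (both illustrative assumptions), using least-squares projections onto the images of the boundary operators:

```python
# Sketch: Hodge decomposition x = x_grad + x_curl + x_harm of an edge flow.
import numpy as np

B01 = np.array([[-1, -1,  0],
                [ 1,  0, -1],
                [ 0,  1,  1]], dtype=float)   # nodes x edges
B12 = np.array([[1], [-1], [1]], dtype=float) # edges x faces

x = np.array([1.0, 0.0, 1.0])                 # flow on edges e01, e02, e12

# Gradient part: closest flow induced by node potentials, im(B01.T).
p, *_ = np.linalg.lstsq(B01.T, x, rcond=None)
x_grad = B01.T @ p
# Curl part: closest flow induced by face circulations, im(B12).
c, *_ = np.linalg.lstsq(B12, x - x_grad, rcond=None)
x_curl = B12 @ c
# Harmonic remainder: zero here, since the filled triangle has no holes.
x_harm = x - x_grad - x_curl
```

Because B01 @ B12 = 0, the gradient and curl components are orthogonal, so the three pieces are uniquely determined.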
08

Applications & Reading Roadmap

Connecting topological deep learning to thermal physics and semiconductor simulation. Message passing as PDE iteration. Section-by-section guide to the full Hajij et al. paper.

Chip package → CC mapping · Anisotropic message passing · Key papers
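The "message passing as PDE iteration" idea above has a one-line core: an explicit-Euler step of graph heat diffusion, x ← x − αLx, is a message-passing round with fixed weights. A minimal sketch on a 3-node path graph (graph and step size are illustrative assumptions, not the chapter's chip-package example):

```python
# Sketch: heat diffusion on a graph as repeated message passing.
import numpy as np

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)   # path graph 0 - 1 - 2
L = np.diag(A.sum(axis=1)) - A           # graph Laplacian

x = np.array([1.0, 0.0, 0.0])            # all the "heat" starts at node 0
alpha = 0.1                              # step size; stable if alpha < 2/λ_max

for _ in range(500):
    x = x - alpha * (L @ x)              # one Euler step = one aggregation

# Diffusion conserves total heat and converges to the constant signal,
# i.e. to the Laplacian's null space.
```

Anisotropic variants replace the fixed coefficient α with edge-dependent (learned or material-dependent) conductances, which is where the thermal-simulation mapping comes in.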