Chapter 07

Hodge Theory & the Architecture Zoo

The Hodge Laplacian hierarchy, spectral theory for higher-order signals, and a map of existing topological neural network architectures.

In this chapter
  1. Hodge Theory & the Laplacian
  2. The Architecture Zoo

9. Hodge Theory & the Laplacian

If you know the graph Laplacian $\mathbf{L} = \mathbf{D} - \mathbf{A}$ (or its normalized variants), you know that it governs diffusion on graphs and that its eigenvectors provide a spectral basis for graph signals. The Hodge Laplacian generalizes this to all ranks.

Definition — Hodge Laplacian

For rank-$k$ cells in a simplicial or cell complex, the $k$-th Hodge Laplacian is:

$$\mathbf{L}_k = \underbrace{\mathbf{B}_{k-1,k}^\top \mathbf{B}_{k-1,k}}_{\text{lower Laplacian } \mathbf{L}_k^{\text{down}}} + \underbrace{\mathbf{B}_{k,k+1} \mathbf{B}_{k,k+1}^\top}_{\text{upper Laplacian } \mathbf{L}_k^{\text{up}}}$$

with the convention $\mathbf{B}_{-1,0} = 0$ and $\mathbf{B}_{K,K+1} = 0$. For $k=0$, this recovers the graph Laplacian: $\mathbf{L}_0 = \mathbf{B}_{0,1}\mathbf{B}_{0,1}^\top$.
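As a concrete check, here is the hierarchy for the smallest interesting complex, a filled triangle. The orientation convention and the NumPy encoding below are illustrative choices, not prescribed by the text:

```python
import numpy as np

# Oriented incidence matrices for a filled triangle {0, 1, 2}:
# edges e01, e02, e12 (tail gets -1, head gets +1), and one 2-cell.
B1 = np.array([[-1, -1,  0],   # vertex 0
               [ 1,  0, -1],   # vertex 1
               [ 0,  1,  1]])  # vertex 2
B2 = np.array([[ 1],           # boundary of the face: +e01 - e02 + e12
               [-1],
               [ 1]])

L0 = B1 @ B1.T                 # graph Laplacian (upper part only; B_{-1,0} = 0)
L1 = B1.T @ B1 + B2 @ B2.T     # edge Laplacian: lower part + upper part

# L0 coincides with D - A for the triangle graph
A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])
D = np.diag(A.sum(axis=1))
assert np.array_equal(L0, D - A)

# Boundaries of boundaries vanish: B1 @ B2 = 0
assert np.array_equal(B1 @ B2, np.zeros((3, 1)))
```

The last assertion is the discrete analogue of $\partial \circ \partial = 0$, which is what makes the lower/upper split of each $\mathbf{L}_k$ well behaved.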

[Figure: the Hodge Laplacian hierarchy across ranks (L₀ on vertices, L₁ on edges, L₂ on faces, L₃ on volumes), linked by the incidence matrices B₀₁, B₁₂, B₂₃. Each L_k splits into a lower part B_{k-1,k}ᵀ B_{k-1,k} and an upper part B_{k,k+1} B_{k,k+1}ᵀ.]
Figure 9.1. The hierarchy of Hodge Laplacians, connected by incidence (boundary) matrices $B_{k,k+1}$. Each $L_k$ has a "lower" part (from boundaries below) and an "upper" part (from cofaces above). The graph Laplacian $L_0$ is the first in the sequence.

Physics connection — Hodge decomposition: The kernel of $L_k$ (the harmonic cochains) encodes topological features: $\dim \ker L_k = \beta_k$, the $k$-th Betti number. For $k=0$, $\beta_0$ counts connected components; for $k=1$, $\beta_1$ counts independent loops. This is the discrete Hodge theorem, the analogue of the decomposition of differential forms into exact, coexact, and harmonic components. Every $k$-cochain $\omega$ admits an orthogonal decomposition

$$\omega = \underbrace{d_{k-1} \alpha}_{\text{exact}} + \underbrace{d_k^* \beta}_{\text{coexact}} + \underbrace{\gamma}_{\text{harmonic}}$$

where $d_k$ is the coboundary operator ($\mathbf{B}_{k,k+1}^\top$ in matrix form) and $d_k^*$ is its adjoint.
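The theorem can be verified numerically on a toy complex: $\beta_k$ is the number of near-zero eigenvalues of $L_k$. A sketch (the `betti` helper, tolerance, and matrices are illustrative):

```python
import numpy as np

def betti(L, tol=1e-8):
    """Betti number = dim ker L_k = count of (numerically) zero eigenvalues."""
    return int(np.sum(np.linalg.eigvalsh(L) < tol))

B1 = np.array([[-1, -1,  0],
               [ 1,  0, -1],
               [ 0,  1,  1]])     # vertex-edge incidence of a triangle
B2 = np.array([[1], [-1], [1]])   # edge-face incidence of the filled triangle

# Hollow triangle (edges only): one independent loop
L1_hollow = B1.T @ B1
assert betti(L1_hollow) == 1      # beta_1 = 1

# Filled triangle: the 2-cell fills in the loop
L1_filled = B1.T @ B1 + B2 @ B2.T
assert betti(L1_filled) == 0      # beta_1 = 0

# And at rank 0: one connected component
assert betti(B1 @ B1.T) == 1      # beta_0 = 1
```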

Why Hodge Matters for Neural Networks

The Hodge Laplacian provides a spectrally meaningful diffusion operator for each rank. Just as spectral GNNs use eigenvectors of $L_0$ to define graph convolutions (via the graph Fourier transform), spectral topological networks can use eigenvectors of $L_k$ to define convolutions on $k$-cochains. The Hodge decomposition ensures these convolutions respect the topological structure of the complex.
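A minimal sketch of such a spectral convolution on edge signals, assuming the toy $L_1$ of a filled triangle (the filter choice and the `spectral_filter` helper are illustrative, not a specific architecture's API):

```python
import numpy as np

B1 = np.array([[-1, -1, 0], [1, 0, -1], [0, 1, 1]])
B2 = np.array([[1], [-1], [1]])
L1 = B1.T @ B1 + B2 @ B2.T

evals, U = np.linalg.eigh(L1)          # eigenbasis = "Hodge Fourier" modes

def spectral_filter(x, g):
    """Apply a spectral filter g(lambda) to a 1-cochain x."""
    return U @ (g(evals) * (U.T @ x))  # transform, filter, inverse transform

x = np.array([1.0, 0.5, -0.3])                          # signal on the 3 edges
y = spectral_filter(x, lambda lam: np.exp(-0.5 * lam))  # heat-kernel smoothing
```

With the identity filter $g(\lambda) = 1$ the signal passes through unchanged, mirroring how the graph Fourier transform behaves at rank 0.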



10. The Architecture Zoo

The Hajij et al. paper surveys and unifies a large number of architectures that had been proposed independently. Here is a map of the landscape:

[Figure: the architecture landscape by domain. Graph domain: GCN, GAT, GraphSAGE. Simplicial domain: SNN, SCA, MPSN, SCN. Cell domain: CWN, CAN. Combinatorial complex domain (unifies all above): CC-XN, General CC, HOAN, CTNN / Copresheaf (Vinci).]

Key architecture acronyms: SNN = Simplicial Neural Network · SCA = Simplicial Complex Attention · MPSN = Message Passing Simplicial Network · SCN = Simplicial Convolutional Network · CWN = CW Network · CAN = Cell Attention Network · CC-XN = Combinatorial Complex Network · HOAN = Higher-Order Attention · CTNN = Copresheaf Topological Neural Network
Figure 10.1. The architecture landscape, organized by domain type. The combinatorial complex (bottom) subsumes all others. Dashed arrows indicate that graph, simplicial, and cell architectures are all special cases of the CC framework. The CTNN (Vinci AI) further generalizes by adding directional, learnable maps between cells (copresheaf structure).

Selected Architecture Summaries

Simplicial Neural Networks (SNN / SCN)

Operate on simplicial complexes, using the Hodge Laplacians $L_k$ as message-passing operators. The update for rank-$k$ features applies a polynomial filter in $L_k$, analogous to ChebNet on graphs. Because polynomial filters leave the exact, coexact, and harmonic subspaces invariant, these architectures respect the Hodge decomposition.
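A sketch of such a layer with a hand-rolled polynomial in $L_k$; the `snn_layer` helper and its shapes are hypothetical for illustration, not any specific paper's code:

```python
import numpy as np

def snn_layer(X, Lk, theta, activation=np.tanh):
    """One simplicial convolution: a polynomial filter in L_k.
    X: (n_k, f_in) features on rank-k cells;
    theta: list of (f_in, f_out) weight matrices, one per power of L_k."""
    out = np.zeros((X.shape[0], theta[0].shape[1]))
    P = np.eye(Lk.shape[0])          # L_k^0
    for W in theta:                  # out = sum_p  L_k^p  X  W_p
        out = out + P @ X @ W
        P = P @ Lk
    return activation(out)

# Toy usage: 2-dimensional features on the 3 edges of a filled triangle
B1 = np.array([[-1, -1, 0], [1, 0, -1], [0, 1, 1]])
B2 = np.array([[1], [-1], [1]])
L1 = B1.T @ B1 + B2 @ B2.T
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
theta = [0.1 * np.eye(2), 0.05 * np.eye(2)]   # degree-1 filter
H = snn_layer(X, L1, theta)
assert H.shape == (3, 2)
```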

Message Passing Simplicial Networks (MPSN)

The most general simplicial architecture: it uses separate message functions for boundary ($\mathcal{N}_\downarrow$), coboundary ($\mathcal{N}_\uparrow$), and adjacency ($\mathcal{N}_{\text{adj}}$) neighborhoods at each rank, making it a strict generalization of the standard graph MPNN.
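A simplified sketch of one such update with linear messages and sum aggregation (the real architecture uses learnable message MLPs and also passes boundary/coboundary features across ranks; `mpsn_layer` is a hypothetical name):

```python
import numpy as np

def mpsn_layer(Xk, B_low, B_up, W_self, W_down, W_up):
    """Simplified rank-k update: aggregate over lower- and upper-adjacent cells.
    B_low = B_{k-1,k} (incidence to the rank below),
    B_up  = B_{k,k+1} (incidence to the rank above)."""
    A_down = B_low.T @ B_low     # lower adjacency: cells sharing a boundary
    A_up = B_up @ B_up.T         # upper adjacency: cells sharing a coface
    H = Xk @ W_self + A_down @ Xk @ W_down + A_up @ Xk @ W_up
    return np.maximum(H, 0.0)    # ReLU

# Toy usage on the edges (rank 1) of a filled triangle
B1 = np.array([[-1, -1, 0], [1, 0, -1], [0, 1, 1]])
B2 = np.array([[1], [-1], [1]])
Xk = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
W = 0.1 * np.eye(2)
H = mpsn_layer(Xk, B1, B2, W, W, W)
assert H.shape == (3, 2)
```

Setting `W_down` and `W_up` to zero collapses the update to a per-cell transform, which is one way to see that the three neighborhood terms are genuinely separate channels.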

CW Networks (CWN)

Extend MPSN to cell complexes, allowing non-simplicial cells. The key addition: messages can now flow along boundaries of arbitrary polygon/polyhedron cells, not just triangles and tetrahedra.

The Hajij et al. Unification

The paper's central technical contribution is showing that all of these architectures — and several more — can be expressed as instances of a single tensor diagram on a CC. The tensor diagram specifies which neighborhood matrices are used, how messages are computed, and how they're aggregated. Different choices of these components recover different existing architectures.
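To make the idea concrete, here is a toy layer parameterized only by a choice of neighborhood matrices and weights. This is a loose sketch in that spirit, not the paper's tensor-diagram formalism; all names are illustrative:

```python
import numpy as np

def cc_layer(features, neighborhoods, weights, act=np.tanh):
    """Generic higher-order layer: each target rank sums messages pushed
    forward by its chosen neighborhood matrices.
    features:      {rank: (n_r, f) array}
    neighborhoods: {target_rank: [(source_rank, matrix, weight_key), ...]}
    weights:       {weight_key: (f, f_out) array}"""
    out = {}
    for s, Xs in features.items():
        msgs = [G @ features[r] @ weights[k]
                for (r, G, k) in neighborhoods.get(s, [])]
        out[s] = act(sum(msgs)) if msgs else Xs
    return out

# Choosing a single normalized adjacency on rank 0 recovers a GCN-style layer.
A_hat = np.array([[0.5, 0.5], [0.5, 0.5]])   # toy normalized adjacency
feats = {0: np.array([[1.0], [0.0]])}
W = {"gcn": np.array([[2.0]])}
out = cc_layer(feats, {0: [(0, A_hat, "gcn")]}, W)
```

Swapping in incidence or upper-adjacency matrices at other ranks yields simplicial- or cell-style layers from the same template, which is the unification in miniature.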