Hi, I am Ameya Vikrama Singh, a college student with a recreational interest in mathematics. I was involved in Olympiad math in school, and these days I pursue it at leisure.
Here we will talk about two interesting applications of linear algebra in graph theory. It is highly recommended that you have some familiarity with linear algebra: the definition of linear independence, rank, the rank-nullity theorem, determinants, and some idea of eigenvalues. 3b1b's playlist is a great place to start, and you do not need very rigorous insight into these concepts. Some familiarity with the definitions of graph theory will help too. $G(n,e)$ denotes a simple graph $G$ with $n$ vertices and $e$ edges; all graphs in this post are simple, though you may extend the results beyond them. I encourage you to ponder the (why?) checkpoints as you follow the post.
Here we present a few definitions:
Adjacency Matrix $A$: The adjacency matrix associated with a simple graph $G(n,e)$ is an $n \times n$ matrix such that $a_{ij}=1$ if vertices $i$ and $j$ are connected by an edge, and $0$ otherwise. For example, the adjacency matrix of the graph in figure 1 is given by
$$A = \begin{pmatrix}0 & 1 & 1 & 1 & 0 \\ 1 & 0 & 0 & 1 & 1 \\ 1 & 0 &0 & 1 & 0 \\ 1 & 1 & 1 & 0 & 1 \\ 0 & 1 & 0 & 1 & 0 \\\end{pmatrix}$$
Degree Matrix $D$: The degree matrix is an $n\times n$ diagonal matrix whose diagonal elements are the degrees of the corresponding nodes. For fig 1:
$$D = \begin{pmatrix}3&0&0&0&0\\0&3&0&0&0\\0&0&2&0&0\\0&0&0&4&0\\0&0&0&0&2\end{pmatrix}$$
Directed Incidence Matrix $B$: Assign an arbitrary direction to every edge of the undirected graph, making it a simple directed graph. The incidence matrix of such a directed graph is an $n \times e$ matrix in which each column corresponds to an edge and contains a $+1$ at the vertex the edge leaves and a $-1$ at the vertex it enters. For example, (one of) the incidence matrices for fig 1 is:
$$B = \begin{pmatrix}1 & 1 & 1 & 0 & 0 &0 & 0 \\ -1 & 0 & 0 & 0 & -1 & 0&1\\ 0 & 0&-1&1&0&0&0\\0&-1&0&-1&1&-1&0\\0&0&0&0&0&1&-1\end{pmatrix}$$
Laplacian Matrix $L$: The Laplacian matrix is defined as $L = D-A$. Notice that the product $BB^T$ is independent of the orientation assigned to the undirected graph, and that $L = D-A = BB^T$ (why?). If you are familiar with the Laplacian operator ($\nabla^2$) from calculus, you might see how this matrix can be compared to a discrete version of the second derivative. Indeed, a function $f:V \rightarrow \mathbb{R}$ on the vertices can be represented as an $n$-vector, and applying the Laplacian matrix to it measures, at each vertex, how much the value of $f$ differs from the average over its neighbours (scaled by the degree). If you are familiar with the mean-value property, you might start to see what the Laplacian matrix represents as an operator.
$$L = \begin{pmatrix}3 & -1 & -1 & -1 & 0 \\ -1 & 3 & 0 & -1 & -1 \\ -1 & 0 &2& -1 & 0 \\ -1 & -1 & -1 & 4 & -1 \\ 0 & -1 & 0 & -1 & 2 \\\end{pmatrix}$$
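If you like verifying such identities numerically, here is a minimal sketch in Python/NumPy. I re-index the vertices of fig 1 from $0$, and the edge list and orientation below are my reading of the columns of the $B$ displayed above:

```python
import numpy as np

# Adjacency matrix of fig 1 (vertices 1..5 re-indexed as 0..4).
A = np.array([[0, 1, 1, 1, 0],
              [1, 0, 0, 1, 1],
              [1, 0, 0, 1, 0],
              [1, 1, 1, 0, 1],
              [0, 1, 0, 1, 0]])

# Degree matrix: row sums of A on the diagonal.
D = np.diag(A.sum(axis=1))

# Directed incidence matrix: one column per edge, +1 at the tail
# and -1 at the head of the arbitrarily chosen orientation.
edges = [(0, 1), (0, 3), (0, 2), (2, 3), (3, 1), (4, 3), (1, 4)]
B = np.zeros((5, len(edges)), dtype=int)
for k, (u, v) in enumerate(edges):
    B[u, k], B[v, k] = 1, -1

L = D - A
assert (L == B @ B.T).all()  # L = D - A = B B^T, for any orientation
```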
We begin by analysing a few properties of these matrices, pertaining to graph theory. Each row and column of the Laplacian matrix sums to $0$. This implies that the Laplacian's determinant is $0$, and thus $0$ is one of its eigenvalues, with eigenvector $(1,1,\cdots,1)$. Now, there is this cool theorem from linear algebra that says $\text{rank}(BB^T) = \text{rank}(B)$, so $\text{rank}(L) = \text{rank}(B)$, and also this other cool theorem which says that the number of linearly independent rows of $B$ equals the number of linearly independent columns of $B$. An immediate corollary is that $B$ can have rank at most $n-1$ (why?). Now consider a certain $n-1$ columns of $B$ that are linearly independent. These represent $n-1$ edges in the graph. If these edges contained a cycle (a path starting and ending at the same vertex), the columns couldn't be linearly independent (why?). Thus, these $n-1$ edges contain no cycle, which implies they must form a tree (note that you also need connectedness for a tree; why should these edges be connected?). This gives us a necessary and sufficient condition for the connectedness of a graph: that $B$, and hence $L$, has rank $n-1$.
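A quick numerical check of this criterion on fig 1, a sketch using NumPy's numerical rank:

```python
import numpy as np

# Laplacian of fig 1, computed above.
L = np.array([[ 3, -1, -1, -1,  0],
              [-1,  3,  0, -1, -1],
              [-1,  0,  2, -1,  0],
              [-1, -1, -1,  4, -1],
              [ 0, -1,  0, -1,  2]])

# rank(L) == n - 1 certifies that the graph is connected.
print(np.linalg.matrix_rank(L))  # 4, i.e. n - 1
```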
Try extending this result to the number of components in a disconnected graph.
Electric Circuits and Kirchhoff's Laws
In Physics Olympiad circles, there is a well-known result about electric circuits called Foster's Theorem. It appears as Problem 4.9 in Electricity and Magnetism by E.M. Purcell. The proof provided in the text considers a clever superposition of currents which simplifies the summation. Here we provide an alternative proof which shows the power of linear algebra.
The theorem goes as follows: the sum of the resistance distances across all the edges of a connected graph equals $n-1$. The resistance distance across an edge means the effective resistance between its endpoints, assuming every edge of the circuit is a $1\,\Omega$ resistor. For example, for fig 1, the resistances across the endpoints of the edges are $\frac{4}{7}, \, \frac{10}{21}, \, \frac{13}{21}, \, \frac{13}{21}, \,\frac{10}{21}, \, \frac{13}{21}, \, \frac{13}{21}$, and their sum is $4 = 5-1$.
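Before the proof, you can check the statement numerically. One standard route (different from the proof below) computes resistance distances from the Moore-Penrose pseudoinverse $L^+$ via $R_{uv} = (e_u - e_v)^T L^+ (e_u - e_v)$; a sketch, reusing the matrices from above:

```python
import numpy as np

L = np.array([[ 3, -1, -1, -1,  0],
              [-1,  3,  0, -1, -1],
              [-1,  0,  2, -1,  0],
              [-1, -1, -1,  4, -1],
              [ 0, -1,  0, -1,  2]])
Lp = np.linalg.pinv(L)  # Moore-Penrose pseudoinverse of the Laplacian

edges = [(0, 1), (0, 3), (0, 2), (2, 3), (3, 1), (4, 3), (1, 4)]
total = 0.0
for u, v in edges:
    x = np.zeros(5)
    x[u], x[v] = 1, -1
    total += x @ Lp @ x  # resistance distance across edge (u, v)
print(total)  # 4.0 = n - 1, as Foster's theorem predicts
```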
Ohm's Law tells us that this resistance equals the potential difference between the endpoints when a current of 1 A is fed into one endpoint and drawn out of the other. To evaluate this potential difference, we use Kirchhoff's Laws: for a network of $1\,\Omega$ resistors, Kirchhoff's current law at each node gives $Lv = i$, where $v$ is the vector of node potentials and $i$ is the vector of currents injected at the nodes (why?). Consider the current vectors which involve feeding 1 A into a node and taking it out from a neighbour. There are $e$ such vectors, and they are exactly the columns of $B$! For each of these $e$ current vectors, there is a corresponding $v$ vector. We can write the $e$ equations simultaneously by letting the current matrix be the $n \times e$ matrix $B$ itself, and letting $V$ be the corresponding $n \times e$ matrix whose columns represent the solutions for each case. The equation now becomes
$$LV = B$$
But $L$ is equal to $BB^T$, and we have $$B B^TV = B$$
You might see where this is going, but it's a bit more subtle than cancelling $B$. What is the product $B^TV$? It is an $e \times e$ matrix, and we care particularly about its diagonal entries, which give the potential difference across each edge in the case where the current is fed through that edge's endpoints. These diagonal entries are exactly the resistances across the endpoints of the edges! Thus the sum we are looking for is the sum of the diagonal entries of $B^TV$, which is called the trace. The trace of a matrix is equal to the sum of its eigenvalues (why?). Try to complete the proof from here.
Each row of $B$ is a left eigenvector of $B^TV$ with eigenvalue $1$, since $B(B^TV) = B$. There are $n-1$ linearly independent ones (why?), so $B^TV$ has the eigenvalue $1$ at least $n-1$ times. But since $\text{rank}(XY) \le \min\{\text{rank}(X), \text{rank}(Y)\}$, the matrix $B^TV$ can have rank at most $n-1$. Thus $n-1$ of its eigenvalues are $1$, and the rest must be $0$. The sum of the eigenvalues, and thus the trace, is equal to $n-1$. Can you obtain a similar result when the resistances are not all equal? Can you modify the method to prove Foster's Second Theorem?
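Here is a numerical version of the argument, as a sketch. Since $L$ is singular, I solve $LV = B$ with the pseudoinverse; this fixes the free additive constant in each column of $V$, which does not affect $B^TV$ because $B^T\mathbf{1} = 0$:

```python
import numpy as np

edges = [(0, 1), (0, 3), (0, 2), (2, 3), (3, 1), (4, 3), (1, 4)]
B = np.zeros((5, len(edges)))
for k, (u, v) in enumerate(edges):
    B[u, k], B[v, k] = 1, -1

# One solution of L V = B (valid since each column of B sums to zero).
V = np.linalg.pinv(B @ B.T) @ B
M = B.T @ V  # diagonal entries: resistance across each edge

print(np.round(np.diag(M), 4))  # the seven resistance distances above
print(np.trace(M))              # 4.0 = n - 1
print(np.round(np.sort(np.linalg.eigvalsh(M)), 6))  # e-n+1 zeros, n-1 ones
```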
The discrete Laplacian in electric circuits has much the same utility as the continuous Laplacian from calculus. Just as the Laplacian operator in calculus is Hermitian, the Laplacian matrix is real and symmetric (a special case of being Hermitian). By the spectral theorem, it has an orthonormal eigenbasis in which it is diagonal. Transferring Poisson's equation into the eigenbasis of the Laplacian reduces the problem to a Fourier decomposition, and this provides a mechanical method for solving finite or infinite circuits. (If you are unfamiliar with this method, I recommend checking out a mathematical physics/quantum mechanics resource, which will typically contain all of the fun and none of the rigour :) ) An illustration of this method on a famous problem can be seen here: Infinite Grid of Resistors.
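A sketch of that mechanical method on fig 1, assuming the setup above (the zero mode, i.e. the constant potential, is projected out before dividing by the eigenvalues):

```python
import numpy as np

L = np.array([[ 3, -1, -1, -1,  0],
              [-1,  3,  0, -1, -1],
              [-1,  0,  2, -1,  0],
              [-1, -1, -1,  4, -1],
              [ 0, -1,  0, -1,  2]], dtype=float)

lam, Q = np.linalg.eigh(L)  # L = Q diag(lam) Q^T, Q orthogonal
i = np.zeros(5)
i[0], i[1] = 1, -1          # 1 A in at node 1, out at node 2

c = Q.T @ i                              # expand i in the eigenbasis
c = np.where(np.isclose(lam, 0), 0, c)   # drop the constant (zero) mode
v = Q @ (c / np.where(np.isclose(lam, 0), 1.0, lam))

print(v[0] - v[1])  # 0.5714... = 4/7, the resistance across edge (1,2)
```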
Counting Trees
A tree is a connected graph with $n$ vertices and no cycles. It must have $n-1$ edges. A labelled tree is a tree whose nodes are labelled. We will count the number of labelled trees on $n$ vertices by algebraic methods.
Spanning Tree: A subgraph of a connected graph $G(n,e)$ is a spanning tree if it contains all vertices of $G$ and is a tree. Two spanning trees are the same if they have the same vertices and edges.
Can you draw the spanning trees of the graph in fig 1? There are 21 of them. The number of spanning trees of a graph is denoted $\tau$.
Consider a complete graph on $n$ vertices, denoted $K_n$, where every pair of vertices is adjacent. Let the nodes be labelled from $1$ to $n$. The spanning trees of this graph are in one-to-one correspondence with the labelled trees on $n$ vertices. We will count the number of spanning trees of a connected graph.
Recall the condition for $n-1$ edges to form a tree: their corresponding columns in $B$ are linearly independent. This means that there is some $(n-1) \times (n-1)$ submatrix (obtained by removing a row) whose determinant is non-zero. In fact, you get the same determinant, up to sign, no matter which row is removed (why?). What is the value of this determinant for any of the spanning trees in fig 1? Prove that it is $\pm 1$ for any tree (induction ftw), and $0$ otherwise.
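You can test this exhaustively on fig 1; a sketch that also previews the count of 21 from earlier:

```python
import numpy as np
from itertools import combinations

edges = [(0, 1), (0, 3), (0, 2), (2, 3), (3, 1), (4, 3), (1, 4)]
B = np.zeros((5, len(edges)))
for k, (u, v) in enumerate(edges):
    B[u, k], B[v, k] = 1, -1

B_red = B[:-1, :]  # delete one row (here the last)

# Determinant of each 4x4 submatrix: +-1 when the 4 chosen edges form
# a spanning tree, 0 when they contain a cycle.
dets = [round(np.linalg.det(B_red[:, list(S)]))
        for S in combinations(range(len(edges)), 4)]
print(sorted(set(dets)))          # [-1, 0, 1]
print(sum(d != 0 for d in dets))  # 21 spanning trees
```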
Cauchy-Binet Formula
For matrices $A_{m \times n}$ and $B_{n \times m}$, with $n \ge m$, the determinant $\text{det}(AB)$ is given by the Cauchy-Binet summation:
$$\text{det}(AB) = \sum \text{det}(A_S) \text{det}(B_S)$$
where the summation is over all choices of $m$ columns of $A$ and the corresponding $m$ rows of $B$. The proof can be taken as an exercise, or can be found here.
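A quick random sanity check of the formula, a sketch reusing the letters $A$ and $B$ for two generic matrices:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(3, 5))  # m = 3, n = 5
B = rng.integers(-3, 4, size=(5, 3))

# Sum over all choices S of 3 columns of A and the same 3 rows of B.
cb = sum(np.linalg.det(A[:, list(S)]) * np.linalg.det(B[list(S), :])
         for S in combinations(range(5), 3))
print(round(cb), round(np.linalg.det(A @ B)))  # equal
```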
The Matrix-Tree Theorem
We apply the Cauchy-Binet formula to the product $BB^T$. Since we know the determinants of $B$ with one row removed, we remove a certain row from $B$, and a certain column (not necessarily the same one) from $B^T$. Now we have an $(n-1) \times e$ matrix and an $e \times (n-1)$ matrix, whose $(n-1) \times (n-1)$ determinants are all $\pm 1$ or $0$, and of the same sign between $B$ and $B^T$ (why?).
Thus, each term in the Cauchy-Binet summation is $1$ if the corresponding $n-1$ edges form a tree, and $0$ otherwise. The sum counts the number of spanning trees of the graph, and it is equal to the chosen cofactor of the Laplacian! So every cofactor of the Laplacian is equal to the same value, the number of spanning trees of the graph. It can also be shown that this cofactor is equal to
$$\tau = \dfrac{1}{n} \lambda_2 \lambda_3 \cdots \lambda_n$$
where $\lambda_2 \cdots \lambda_n$ are the non-zero eigenvalues of the Laplacian. (Consider the coefficient of $x$ in the characteristic polynomial of $L$)
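Both forms of the theorem, checked on fig 1; a sketch:

```python
import numpy as np

L = np.array([[ 3, -1, -1, -1,  0],
              [-1,  3,  0, -1, -1],
              [-1,  0,  2, -1,  0],
              [-1, -1, -1,  4, -1],
              [ 0, -1,  0, -1,  2]], dtype=float)

# Any cofactor of L counts the spanning trees ...
print(round(np.linalg.det(L[1:, 1:])))  # 21

# ... as does the product of the nonzero eigenvalues, divided by n.
lam = np.sort(np.linalg.eigvalsh(L))
print(round(np.prod(lam[1:]) / len(L)))  # 21
```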
$K_n$ and $K_{m,n}$
What is the number of spanning trees of $K_n$? The Laplacian of $K_n$ is given by
$$\begin{pmatrix} n-1 & -1 & -1 & \cdots \\ -1 & n-1 & -1 & \cdots \\ -1 & -1 & n-1 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix} $$
Clearly, $n$ is an eigenvalue, and since $L - n I$ has nullity $n-1$, the eigenvalue $n$ has multiplicity $n-1$; these are the values of $\lambda_2, \lambda_3, \cdots, \lambda_n$. Thus, from the Matrix-Tree theorem, the number of spanning trees is $$\tau = n^{n-2}$$ An elegant combinatorial argument for the same formula, by means of the Prufer Code, can be found here.
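Cayley's formula against the Matrix-Tree count, for small $n$; a sketch:

```python
import numpy as np

# Laplacian of K_n is n*I - J; delete a row and a column, take the det.
for n in range(2, 9):
    L = n * np.eye(n) - np.ones((n, n))
    tau = round(np.linalg.det(L[1:, 1:]))
    print(n, tau, n ** (n - 2))  # the two counts agree
```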
How many spanning trees does the Complete Bipartite Graph $K_{m,n}$ have? The Laplacian is
$$\begin{pmatrix} m & \cdots & 0 & -1 & \cdots & -1 \\ \vdots & \ddots & \vdots &\vdots & \ddots & \vdots \\ 0 & \cdots & m & -1 & \cdots & -1\\ -1 & \cdots & -1 & n &\cdots &0 \\ \vdots & \ddots & \vdots &\vdots & \ddots & \vdots \\ -1 & \cdots& -1&0&\cdots&n\end{pmatrix}$$
with $m$ repeated $n$ times and vice-versa. Thus, we get that $m$ is an eigenvalue of multiplicity $n-1$, $n$ is an eigenvalue of multiplicity $m-1$, and from the trace, $m+n$ and $0$ are the remaining eigenvalues. Thus, the number of spanning trees is
$$\tau = n^{m-1} m^{n-1}$$
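And the same check for $K_{m,n}$; a sketch, building the Laplacian block by block as displayed above:

```python
import numpy as np

def spanning_trees_K_mn(m, n):
    # Laplacian of K_{m,n}: the part with n vertices has degree m,
    # the part with m vertices has degree n, and -1 between the parts.
    L = np.zeros((m + n, m + n))
    L[:n, :n] = m * np.eye(n)
    L[n:, n:] = n * np.eye(m)
    L[:n, n:] = -1
    L[n:, :n] = -1
    return round(np.linalg.det(L[1:, 1:]))  # any cofactor of L

for m, n in [(2, 3), (3, 3), (2, 5), (4, 4)]:
    print(m, n, spanning_trees_K_mn(m, n), n ** (m - 1) * m ** (n - 1))
```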
Can you modify the Prufer Code to obtain an encoding for these bipartite spanning trees?