Adjacency and Laplacian Matrices of a Graph and Their Special Properties
Analyzing the eigenvalues of an adjacency matrix A(G) of a graph G can provide valuable insights into its structural and connectivity properties. This analysis is called spectral analysis of a graph, and the set of eigenvalues of a graph is called the spectrum of the graph.
In the adjacency matrix, the entry $A_{ij}$ is 1 if there’s an edge between nodes $i$ and $j$, and 0 otherwise.
If the graph is undirected, A is symmetric (since if $A_{ij}=1$ then $A_{ji}=1$ as well) with real values, and therefore all of its eigenvalues are real (not complex).
Proof: Let A be real and symmetric, thus $A^T = A$ and (1): $\bar{A}=A$, where the bar notation means the complex conjugate. Consider an eigenvalue of A (possibly complex, a priori) with a non-zero eigenvector: $Ax = \lambda x$. We can complex conjugate both sides and get (2): $\bar{A} \bar{x} = \bar{\lambda} \bar{x}$. Now let's look at the scalar: $$\lambda \bar{x}^T x = \bar{x}^T (\lambda x) = \bar{x}^T (A x) = (A^T \bar{x} )^T x = (A \bar{x} )^T x \overbrace{=}^{(1)} (\bar{A} \bar{x} )^T x \overbrace{=}^{(2)} (\bar{\lambda} \bar{x} )^T x = \bar{\lambda}\, \bar{x}^T x$$ Since x is non-zero, $\bar{x}^T x = \sum_i |x_i|^2 > 0$, so dividing by it gives $\lambda = \bar{\lambda}$, i.e. $\lambda$ is real. $\square$
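As a quick sanity check, here's a minimal numerical sketch (using numpy; the small example graph below is our own choice, nothing special about it):

```python
import numpy as np

# Adjacency matrix of a small undirected graph:
# a triangle on nodes 0, 1, 2 plus a pendant node 3 attached to node 2.
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

assert np.allclose(A, A.T)            # undirected graph -> symmetric matrix

eigvals = np.linalg.eigvals(A)        # general eigensolver
print(np.max(np.abs(eigvals.imag)))   # imaginary parts are (numerically) zero

# For symmetric matrices, eigvalsh returns the real spectrum directly, sorted ascending.
print(np.linalg.eigvalsh(A))
```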
We can show that the greatest eigenvalue of A(G) is bounded above by the maximum degree of nodes in the graph $d_{max}$. The degree of a vertex is the number of vertices it is connected to.
Proof: Take the greatest eigenvalue $\lambda_1$ and a corresponding eigenvector $v$. Let $i$ be the entry in which $v$ has its maximum value, so $v_i \geq v_j$ for all $j$; by replacing $v$ with $-v$ if needed, we may also assume $v_i > 0$. $$ \lambda_1 v_i = (\lambda_1 v)_i = (Av)_i = \sum_{j=1}^{n} A_{ij} v_j \leq \sum_{j=1}^{n} A_{ij} v_i = \Big( \sum_{j=1}^{n} A_{ij}\Big) v_i = deg(i) \: v_i \leq d_{max} v_i $$ Dividing by $v_i > 0$ gives $\lambda_1 \leq d_{max}$. $\square$
The upper bound becomes an equality when the graph is $d$-regular (every node has exactly $d$ neighbours): the all-ones vector is then an eigenvector with eigenvalue $d$, so the maximum degree $d$ is also the largest eigenvalue. It can also be shown that the largest eigenvalue is bounded below by the average degree of the graph, so $d_{avg} \leq \lambda_1 \leq d_{max}$.
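Here's a small numerical check of both bounds (again a numpy sketch; the path graph is chosen just for illustration):

```python
import numpy as np

# A path graph on 4 nodes: 0-1-2-3 (degrees 1, 2, 2, 1).
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

degrees = A.sum(axis=1)
lambda_1 = np.linalg.eigvalsh(A)[-1]   # largest eigenvalue (eigvalsh sorts ascending)

print(degrees.mean(), lambda_1, degrees.max())   # ~1.5, ~1.618, 2.0
assert degrees.mean() <= lambda_1 <= degrees.max()
```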
Number of connected components of a graph
Define D as the diagonal degree matrix, in which $D_{ii}=deg(v_i)$, and then we can define the Laplacian matrix as $L=D-A$.
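In code, once we have the adjacency matrix, building the Laplacian is a one-liner (a numpy sketch; the two-component example graph is our own illustration and reappears below):

```python
import numpy as np

# Example with two connected components: an edge 0-1 and a triangle 2-3-4.
A = np.array([
    [0, 1, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 1, 0, 1],
    [0, 0, 1, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))   # diagonal degree matrix, D_ii = deg(v_i)
L = D - A                    # graph Laplacian
print(L)
```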
We can show that the number of connected components in G is equal to the number of zero eigenvalues of L.
Proof: First we'll show that the number of zero eigenvalues of L is $\geq$ the number of connected components, and then we'll show that it is $\leq$ the number of connected components.
Let's try to construct the null space of L. Let $k$ be the number of connected components in G, and $S_1, \dots, S_k$ the connected components.
Let's define $k$ representative vectors $u^1, \dots, u^k$, each of dimension $|V|$ (the number of vertices), in the following way: $u^i_j = 1$ if $j\in S_i$ and $0$ otherwise. In simple words, a representative vector has 1's exactly in the positions of the vertices that belong to its component.
Clearly, the $u$ vectors are orthogonal, since different components have disjoint vertices, so $u^i \cdot u^j = 0$ for $i \neq j$.
For any vector $x$, what is $Lx$? Let's use the Laplacian definition and look at the $m$'th entry of $Lx$: $$(Lx)_m = ((D-A)x)_m = deg(m)x_m - (Ax)_m = \sum_{(m,n)\in E}^{}x_m - \sum_{(m,n)\in E}^{}x_n = \sum_{(m,n)\in E}^{}(x_m -x_n) $$ In simple words, the $m$'th entry of $Lx$ is the sum, over every neighbour $n$ of $m$, of the difference $x_m - x_n$.
So what's the value of $Lu^i$? $$(Lu^i)_m = \sum_{(m,n)\in E}^{}(u^i_m -u^i_n) $$ If $m$ is in the component $S_i$, then $u^i_m=1$, and since $n$ is connected to $m$ it is in $S_i$ as well, so $u^i_n=1$ and $(u^i_m -u^i_n) = 0$. If $m$ is not in $S_i$, then $u^i_m=0$, and since $n$ is connected to $m$ it is also not in $S_i$, so $u^i_n=0$ and again $(u^i_m -u^i_n) = 0$. So in any case the sum is zero for every entry $m$, hence $Lu^i= 0$.
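We can verify this numerically: on the two-component example graph from above, the representative vectors are indeed in the null space of L (a numpy sketch):

```python
import numpy as np

# Same two-component example as above: an edge 0-1 and a triangle 2-3-4.
A = np.array([
    [0, 1, 0, 0, 0],
    [1, 0, 0, 0, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 1, 0, 1],
    [0, 0, 1, 1, 0],
], dtype=float)
L = np.diag(A.sum(axis=1)) - A

u1 = np.array([1, 1, 0, 0, 0], dtype=float)  # representative vector of {0, 1}
u2 = np.array([0, 0, 1, 1, 1], dtype=float)  # representative vector of {2, 3, 4}

print(L @ u1, L @ u2)                        # both are all-zero vectors
assert np.allclose(L @ u1, 0) and np.allclose(L @ u2, 0)
```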
So we just found $k$ orthogonal vectors, each an eigenvector of L corresponding to the zero eigenvalue. So, the number of zero eigenvalues of L is at least $k$.
Now let's try to prove that the number of zero eigenvalues of L is at most $k$. For any vector $x$, what is the scalar $x^TLx$?
$$x^TLx = \sum_{m\in V}^{}x_m \Big(\sum_{(m,n)\in E}^{}(x_m -x_n)\Big) = \sum_{(m,n)\in E}^{}x_m(x_m -x_n)$$ Here the sum over $(m,n)\in E$ runs over ordered pairs, i.e. each edge is counted twice, once in each direction. Let's split the summation: $$ = \sum_{(m,n)\in E,\: m<n}^{}x_m(x_m -x_n) + \sum_{(m,n)\in E,\: m>n}^{}x_m(x_m -x_n) $$ Let's just swap the variable names in the last summation: $$ = \sum_{(m,n)\in E,\: m<n}^{}x_m(x_m -x_n) + \sum_{(m,n)\in E,\: n>m}^{}x_n(x_n -x_m) $$ And merge back: $$ = \sum_{(m,n)\in E,\: m<n}^{}\big(x_m(x_m -x_n) - x_n(x_m -x_n)\big) = \sum_{(m,n)\in E,\: m<n}^{} (x_m -x_n)^2 $$ In words: if $x$ is a vector of numbers placed on the graph nodes, the scalar $x^TLx$ is the sum of the squared differences between the values at the two endpoints of each edge, counting each edge once. In particular $x^TLx \geq 0$, thus L is positive semidefinite.
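We can also verify this identity numerically on a random undirected graph (a numpy sketch; the graph and the vector $x$ are generated at random):

```python
import numpy as np

# Check x^T L x == sum over edges of (x_m - x_n)^2 on a random undirected graph.
rng = np.random.default_rng(0)
n = 6
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.triu(A, k=1)          # keep each potential edge once, no self-loops
A = A + A.T                  # symmetrize -> undirected adjacency matrix
L = np.diag(A.sum(axis=1)) - A

x = rng.standard_normal(n)
quad_form = x @ L @ x
edge_sum = sum((x[m] - x[k]) ** 2
               for m in range(n) for k in range(m + 1, n) if A[m, k] == 1)

print(quad_form, edge_sum)
assert np.isclose(quad_form, edge_sum)
```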
If u is a null eigenvector of L, it means that $Lu=0$, and in particular the scalar $u^TLu=0$. By the identity above, that means $u_m=u_n$ for every edge $(m,n) \in E$. Following a path between any two vertices in the same connected component, a null eigenvector u must therefore have the same entry value for all vertices that are in the same connected component. Any such vector is a linear combination of the $u^i$ vectors we constructed above, which have 1's in exactly one connected component each. So the null space of L is spanned by at most $k$ vectors, and this concludes our proof that the eigenvalue 0 is repeated at most $k$ times, and that the number of zero eigenvalues is exactly the number of connected components in G. $$ \square $$
In addition, the smallest eigenvalue of L is always 0. Proof: Since we showed that L is PSD, we know from linear algebra that all its eigenvalues are nonnegative. If we choose a vector u of all 1’s, we get $Lu=0$ (every row of L sums to zero, since the degree on the diagonal cancels the corresponding row of A), hence 0 is an eigenvalue, and it must be the smallest one. $\square$
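Putting it all together, here's a minimal numpy sketch that counts the (numerically) zero eigenvalues of L on a hand-built graph with three components and recovers the number of connected components:

```python
import numpy as np

# Graph with three components: an isolated node 0, an edge 1-2, and a triangle 3-4-5.
A = np.zeros((6, 6))
for m, n in [(1, 2), (3, 4), (4, 5), (3, 5)]:
    A[m, n] = A[n, m] = 1.0
L = np.diag(A.sum(axis=1)) - A

eigvals = np.linalg.eigvalsh(L)                   # real eigenvalues, ascending
num_zero = int(np.sum(np.isclose(eigvals, 0.0)))  # count (numerically) zero eigenvalues
print(eigvals)                                    # [0, 0, 0, 2, 3, 3] up to rounding
print("connected components:", num_zero)          # 3
```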