We've represented graphs using a vertex set V and an edge set E. The difficulty is that there aren't many things we can do with a set. Instead, we can represent a graph using the adjacency matrix. The adjacency matrix of a graph is a matrix A where the (i, j) entry is 1, for now, if there is an edge between vertices i and j, and 0 if there is no edge between the two vertices. And it's worth at least mentioning that at this point, we've now joined two major areas of mathematics, namely, linear algebra and graph theory.

For example, let's consider this graph. Vertex 1 is joined to vertices 2, 3, and 5, so the second, third, and fifth entries in the first row should be 1, and the rest should be 0s. Vertex 2 is joined to vertices 1 and 4, so A21 and A24 are both 1, while the remaining entries in the row should be 0. Vertex 3 is joined to vertices 1 and 5, so A31 and A35 are 1, and the rest of the entries are 0. Vertex 4 is joined to vertices 2 and 5, so A42 and A45 should be 1, and everything else is 0. Finally, vertex 5 is joined to vertices 1, 3, and 4, so the first, third, and fourth entries in the last row are 1, and the rest are 0.

Because we've now represented the graph as a matrix, we can apply every tool of linear algebra. This opens up whole new realms of possibilities for what we can do with the graph. But we'll begin with a key feature of the adjacency matrix. Suppose M is the adjacency matrix for a graph G with n vertices. For any k, the (i, j) entry of M to the power k is the number of distinct length-k walks from i to j in G.

For example, let's take a look at our graph, and we might want to find how many length-5 walks there are between vertices 1 and 4. Remember we have the adjacency matrix, and because we're looking for length-5 walks, we want to find A to the 5th. Now we could find A to the 5th by multiplying A by A by A by A by A by... I've lost count. More efficiently, we can use the fast-powering algorithm.
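The construction just described can be sketched in a few lines of code. This is a minimal sketch, assuming the edge list of the example graph from the lecture, with the vertex labels 1 through 5 mapped to indices 0 through 4:

```python
# Build the adjacency matrix of the lecture's example graph.
# Edges, read off from the description above: 1-2, 1-3, 1-5, 2-4, 3-5, 4-5.
edges = [(1, 2), (1, 3), (1, 5), (2, 4), (3, 5), (4, 5)]

n = 5
A = [[0] * n for _ in range(n)]
for u, v in edges:
    A[u - 1][v - 1] = 1  # record the edge in row u...
    A[v - 1][u - 1] = 1  # ...and symmetrically in row v, since the graph is undirected

for row in A:
    print(row)
# First row -> [0, 1, 1, 0, 1]: vertex 1 is joined to vertices 2, 3, and 5.
```

Note the matrix comes out symmetric, as it must for an undirected graph.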
The fast-powering algorithm is based on the idea that it's just as easy to find the square of a matrix as it is to multiply two arbitrary matrices. In particular, what we'll do is repeatedly square our matrix: we'll find A squared, then the square of A squared, which is A to the 4th, then the square of A to the 4th, which is A to the 8th, and so on. And if we want to find A to the n, we'll multiply together the appropriate powers of A.

So we find A squared, which is A multiplied by A. Then A to the 4th is A squared times A squared. Now we have A, A squared, and A to the 4th, so we can find A to the 5th, which is A to the 4th times A. We want the number of length-5 walks between vertices 1 and 4, so we look at the (1, 4) entry of A to the 5th, which is 10. And so there are 10 length-5 walks between vertices 1 and 4.

Now, just to keep my mathematician card current, let's go ahead and prove this theorem. Because we're doing a repeated multiplication, this is the type of thing we'll prove by induction. Clearly the statement is true for k equals 1. Now suppose the statement is true for some k. For convenience, we'll let A equal M to the k plus 1 and B equal M to the k, and consider the (i, j) entry of A. Since M to the k plus 1 is M to the k times M, we can write A as BM. Trust me, if we try to do this any other way, we're going to drown in a sea of subscripts.

Remember how we find the (i, j) entry of a product of two matrices? We take the entries of the i-th row of the first matrix, that's Bi1, Bi2, and so on, and multiply them by the corresponding entries of the j-th column of the second matrix, that's M1j, M2j, and so on. Now, by our induction hypothesis, Bij is the number of walks from i to j of length k. So let's consider the first addend, Bi1 times M1j. This is the number of length-k walks from i to vertex 1, times M1j.
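The repeated-squaring computation above can be sketched as follows. This is one way to implement fast powering, assuming the same example graph; the function names are my own:

```python
def mat_mult(X, Y):
    """Multiply two square integer matrices."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(A, p):
    """Compute A**p by repeated squaring: form A^2, A^4, A^8, ... and
    multiply together the powers matching the binary digits of p."""
    n = len(A)
    result = [[1 if i == j else 0 for j in range(n)] for i in range(n)]  # identity
    base = A
    while p > 0:
        if p % 2 == 1:               # this power of two appears in p
            result = mat_mult(result, base)
        base = mat_mult(base, base)  # square: A -> A^2 -> A^4 -> ...
        p //= 2
    return result

# Adjacency matrix of the example graph.
A = [[0, 1, 1, 0, 1],
     [1, 0, 0, 1, 0],
     [1, 0, 0, 0, 1],
     [0, 1, 0, 0, 1],
     [1, 0, 1, 1, 0]]

A5 = mat_pow(A, 5)
print(A5[0][3])  # the (1, 4) entry of A^5 -> 10 length-5 walks from vertex 1 to vertex 4
```

For A to the 5th, binary 101 means we multiply A to the 4th by A, exactly the combination used above; only three matrix multiplications are needed instead of four.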
And remember M1j is 1 if there is an edge between vertex 1 and vertex j, and 0 otherwise. So we can view this first addend as the number of length-(k plus 1) walks from i to j whose next-to-last vertex is vertex 1. Likewise, the second addend is the number of such walks whose next-to-last vertex is vertex 2, then vertex 3, and so on. When we add them all together, their sum is the number of length-(k plus 1) walks from i to j, regardless of which vertex they pass through last, and that proves our theorem.

It's important to remember that we are computing the number of walks from i to j, and a walk in a graph might repeat a vertex and might retrace an edge. So while a power of the adjacency matrix is useful, we might need to take additional steps to answer a combinatorial question. We'll take a look at that next.
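The induction step we just proved recurses on the last step of the walk, and we can turn that recursion directly into a sanity check. Here is a sketch that counts walks by brute force, assuming the example graph, and agrees with the matrix-power answer of 10:

```python
# Adjacency matrix of the example graph.
A = [[0, 1, 1, 0, 1],
     [1, 0, 0, 1, 0],
     [1, 0, 0, 0, 1],
     [0, 1, 0, 0, 1],
     [1, 0, 1, 1, 0]]

def count_walks(A, i, j, k):
    """Number of walks of length k from i to j (0-indexed), mirroring the
    proof: a length-k walk to j is a length-(k-1) walk to some neighbor l
    of j, followed by the edge from l to j."""
    if k == 0:
        return 1 if i == j else 0  # the empty walk stays at its start
    return sum(count_walks(A, i, l, k - 1) for l in range(len(A)) if A[l][j])

print(count_walks(A, 0, 3, 5))  # length-5 walks from vertex 1 to vertex 4 -> 10
```

Because walks may repeat vertices and retrace edges, this count grows quickly; the matrix-power computation gives the same numbers far more efficiently.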