Welcome back to our lecture series, everyone, based on the textbook Linear Algebra Done Openly. As usual, I will be your professor today, Dr. Andrew Misseldine. In this part of the lecture series, we are starting chapter six, which is all about eigenvalues and eigenvectors, which is also the name of section 6.1. In this first part of 6.1, I want to show you what it actually means to have an eigenvalue and an eigenvector, so you can get used to these things. In chapter six, you're going to see the prefix eigen- all over the place. We have eigenvalues, eigenvectors, eigenpairs, eigenspaces, eigenbases. We're going to sound like Old MacDonald's farm, because we have an eigen eigen here and an eigen eigen there, here an eigen, there an eigen, everywhere an eigen eigen. You're going to see this eigen terminology all over the place. So to motivate and explain what an eigenvalue is in the first place, consider the following seemingly normal vectors and matrix. We have a two-by-two matrix A, with entries three, negative two, one, zero, and two vectors, u and v: negative one, one for u, and two, one for v. If we start to take products of these things, let's compute A times u first: three, negative two, one, zero times negative one, one. By the usual rules of matrix multiplication, we end up with negative three minus two for the first entry and negative one plus zero for the second. That simplifies, of course, to negative five and negative one. So that's the first product right there. How about the second one? Well, if we take A times v, we have three, negative two, one, zero, that's our A, and v, remember, was two, one. By matrix multiplication, we end up with six minus two for the first entry and two plus zero for the second entry, which turns out to be four and two. Fairly basic calculation right there.
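As a quick sanity check on the arithmetic above (this isn't part of the lecture, just a small Python sketch with a hand-rolled helper), here are the two products computed in code:

```python
def mat_vec(A, x):
    """Multiply a 2x2 matrix A by a 2-entry vector x, entry by entry."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

# The matrix and vectors from the example.
A = [[3, -2],
     [1, 0]]
u = [-1, 1]
v = [2, 1]

print(mat_vec(A, u))  # [-5, -1]
print(mat_vec(A, v))  # [4, 2], which is 2 * [2, 1], i.e., 2 * v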
And when you compare the two side by side, you might not notice any significant difference between them, but the mathematician inside of me says: look at A times v, you have a four and a two. These are both even numbers. I can't help myself; I must factor out the common factor of two. If you take out the two, you get two times the vector two, one. And that draws my attention: two, one, huh? I feel like I've seen this vector somewhere before. Oh wait, aha, the vector v was two, one. So the second product had the property that A times v was the same thing as two times v. And that's kind of interesting: I got the same vector back again. Multiplying by A is the same thing as multiplying by two. Well, that would have been useful to know the first time around. I mean, matrix multiplication isn't the worst thing in the world, but it's a lot more complicated than just multiplying by two. That would have been kind of useful to know in the first place. And this example is exactly what we mean by an eigenvector and an eigenvalue. More generally, suppose we have a matrix A, and we do require that it be a square matrix: A is n by n. Now suppose there exists a non-zero vector x; we're not allowing the zero vector, and we'll explain in another video why we're kicking it out. The zero vector might feel kind of sad: gosh, you don't let me be inside linearly independent sets, you don't let me be inside orthogonal sets, and now you won't let me be inside eigen sets. What's wrong? Well, zero is exceptional, and it needs to be treated as such. So imagine we have a non-zero vector x. We call this an eigenvector of A if there exists some scalar, and by tradition, the scalar we use for eigenvalues is typically the Greek letter lambda.
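The defining equation A x = lambda x can be checked mechanically. Here is a minimal sketch (my own helper, not from the lecture) that tests whether a given scalar and vector form an eigenvalue/eigenvector pair for a two-by-two matrix, including the rule that the zero vector is excluded:

```python
def is_eigenpair(A, lam, x):
    """Return True if A @ x == lam * x for 2x2 A and a NON-zero vector x."""
    if x == [0, 0]:
        return False  # the zero vector is excluded by definition
    Ax = [A[0][0] * x[0] + A[0][1] * x[1],
          A[1][0] * x[0] + A[1][1] * x[1]]
    return Ax == [lam * x[0], lam * x[1]]

A = [[3, -2],
     [1, 0]]

print(is_eigenpair(A, 2, [2, 1]))   # True: v = (2, 1) is an eigenvector with eigenvalue 2
print(is_eigenpair(A, 2, [-1, 1]))  # False: A*u = (-5, -1), which is not 2*u
```

Note that the check returns False for u = (-1, 1) with lambda = 2, matching the first product in the example: A times u gave (-5, -1), which is not a scalar multiple of u by 2.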
And so we have the property that multiplying the vector by the matrix A is the same thing as multiplying the vector by the scalar lambda: A x = lambda x. So if multiplying x by A is the same thing as scaling it by some number, we say that x is an eigenvector of A. And the number involved in this equation, lambda, we call an eigenvalue of the matrix A, and it corresponds to the eigenvector x. We act like the eigenvalues belong to the matrix A. In this chapter, we're going to talk more about these eigenvalues and their associated eigenvectors and everything else here. But the idea is that eigenvalues are scalars which do the same thing as matrix multiplication in special situations. All right, stay tuned for the next video, where we'll talk about how to determine whether we actually have an eigenvalue or eigenvector. Feel free to subscribe as usual. See you next time. Bye.