The main idea for this section is the non-singular matrix: those matrices which have matrix inverses (matrix reciprocals, you could call them), so that division is essentially possible for non-singular matrices. Now, showing that a matrix is non-singular just comes down to producing an inverse matrix: here it is, everyone can see the inverse matrix. That's the same thing you'd have to do to prove there's a Bigfoot, right? All you have to do is go capture him and show him to the world: here's Bigfoot. But what happens if there's no Bigfoot? How do we prove that he doesn't exist? Our inability to find him does not mean he doesn't exist; it could be that Bigfoot is really good at hiding from noisy humans in the California forest. So proving that something doesn't exist is a much more challenging task. In this regard, we're going to introduce the non-singular matrix theorem, a huge theorem for which we're not going to provide most of the details here. What it does is provide a list of equivalent conditions for a matrix being non-singular. And I'm not kidding: when we look at this on the screen, it fills the entire page. We go down to condition xx, which in Roman numerals is 20. So this is a very long list, and some of these conditions are naturally connected to each other, but let's talk about a few highlights. First, condition one: being non-singular means you're non-singular. That's a no-brainer, and by itself it gives us no new information. What does equivalent mean here? When we say two statements are logically equivalent, we mean either both statements are true or both statements are false.
And so when we look at this list of 20 items (which, I'll warn you, is not comprehensive; these are just some of the conditions equivalent to being a non-singular matrix): when one of these statements is true, it turns out all 20 are true, and if one statement is false, then all 20 are false as well. So if a matrix is non-singular, what does that mean? Well, first of all, it means the matrix is invertible; that is, it has a matrix inverse, which we call A inverse. Conditions one and two are obviously the same thing, because this is actually how we defined non-singular matrices: a matrix is non-singular if and only if it's invertible. But condition number three might seem like the same thing, yet it's a little different. It says a matrix A is non-singular if and only if there exists a matrix C such that CA equals the identity. That says there exists some kind of left inverse. Because matrix multiplication is not commutative, multiplying on the left is not the same as multiplying on the right. And yet if there's a matrix which multiplies A on the left to produce the identity, we claim this is equivalent to having an inverse, that is, equivalent to being non-singular. So checking for a left inverse is actually sufficient to show that a matrix has an inverse. Condition four talks about right inverses: there exists a matrix which, multiplying A on the right, gives the identity. So having a one-sided inverse is equivalent to having a two-sided inverse, a.k.a. a regular inverse. And when we check whether two matrices are inverses of each other, we only have to check one side; we don't have to do both multiplications, because for a square matrix it always turns out to be the case.
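That one-sided-implies-two-sided claim is easy to check numerically. Here's a minimal sketch (my own example matrix, not from the lecture): we compute a left inverse C of a square matrix A and observe that it's automatically a right inverse too.

```python
# Checking the one-sided inverse claim numerically with NumPy.
# The matrix A here is an arbitrary invertible 2x2 example.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# Compute a left inverse C by solving C A = I, i.e. A^T C^T = I.
C = np.linalg.solve(A.T, np.eye(2)).T

left = C @ A    # the identity, by construction
right = A @ C   # the theorem says this is also the identity
```

For a rectangular matrix this would fail: a left inverse of a tall matrix need not be a right inverse. The equivalence is special to square matrices.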
All of these conditions are equivalent to being a non-singular matrix. Condition five says: look at the equation Ax = 0, our homogeneous system of linear equations. Such systems are always consistent, because you can always take x to be the appropriate zero vector, so we always have the trivial solution. But if there is no other solution, that is, if the only solution to Ax = 0 is the trivial solution, then the matrix is non-singular. It's important throughout this conversation that the matrix is square; if you relax that to an m by n matrix, many of these statements no longer hold, because after all, only square matrices can have inverses; a rectangular matrix cannot. So if Ax = 0 has only the trivial solution, that's equivalent to being non-singular: the matrix inverse will exist, and we'll actually see in the next lecture why that is the case. And that connects us to a lot of things, right? If Ax = 0 has only the trivial solution, that means the equation Ax = 0 has no free variables. And the existence of free variables has nothing to do with the vector on the right-hand side, so Ax = b has no free variables either; that's exactly condition six. Another thing to consider: if Ax = b has no free variables, then, since A is square, you're not going to have any rows of zeros when you row reduce A, which means the system is necessarily consistent, because the only thing that can make a system inconsistent is a row saying zero equals something non-zero. So Ax = b is consistent for every vector b. Some other related conditions: the column vectors of A have to be linearly independent, and if the column vectors are linearly independent, then they actually form a basis for the column space.
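Conditions five and six can be illustrated with a quick numerical sketch (my own example matrix): full rank means no free variables, so the null space is trivial and Ax = b is uniquely solvable for any b.

```python
# For a non-singular A, the homogeneous system Ax = 0 has only the trivial
# solution; numerically this shows up as full rank (no free variables).
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])          # det = -2, so non-singular

n = A.shape[1]
rank = np.linalg.matrix_rank(A)
only_trivial = (rank == n)           # null space is {0}

# And Ax = b is then consistent (uniquely solvable) for every b:
b = np.array([5.0, 6.0])
x = np.linalg.solve(A, b)
```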
And because of that independence, the dimension of the column space, which is the rank, comes from having a pivot in each column: since the columns are linearly independent, the rank has to be n. And if the rank is n, the column space, which is spanned by the columns of A, has to be all of the n-dimensional space F^n. So the columns of A span F^n; they're a basis for F^n. We can also say that if A is non-singular, then its transpose is non-singular as well: if you have a pivot in each and every column, then when you take the transpose, columns become rows, and since you have the same number of rows as columns, you get a pivot in every row, but still a pivot in every column too. So A transpose will be invertible, which means the rows of A span F^n, because they form the columns of A transpose, whose column space spans F^n. The rows of A are linearly independent; the rows of A form a basis for F^n. So you see all these connections: once we've connected things together, we get all these other properties. I'd encourage you to read through this list to familiarize yourself with all these things equivalent to being a non-singular matrix. We've talked about free variables; we've talked about spanning and linear independence; we've talked about A transpose. Another important one: A has n pivot positions, which is the same as saying the rank is n. And this one is important too, because our inversion algorithm, which we're going to present in the next lecture, is based largely on this fact: A is row equivalent to the identity matrix.
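The transpose condition is another one that's easy to sanity-check numerically. A small sketch (my own example): a non-singular matrix and its transpose have the same rank, and the transpose is invertible too.

```python
# A non-singular matrix has a pivot in every column, so its transpose has a
# pivot in every row (and, being square, in every column), hence is also
# non-singular.
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])            # non-singular (det = 3)

rank_A = np.linalg.matrix_rank(A)
rank_At = np.linalg.matrix_rank(A.T)  # same as rank_A
At_inv = np.linalg.inv(A.T)           # A^T is invertible as well
```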
That is the same thing as being non-singular, because a square matrix with a pivot in every single column row reduces to the identity; I want you to convince yourself of that. We can also make a connection to matrix transformations: the linear transformation that sends x to Ax has to be an injective map if A is non-singular, and it also has to be onto. It has to be one-to-one and onto, and if you get one of those conditions, you automatically get both, which is pretty impressive. So these are just 20 properties; this is not an exhaustive list by any means. Now, some consequences of the non-singular matrix theorem. A corollary that's actually quite interesting: if a product of two matrices is non-singular, then the two factors had to be non-singular themselves. This is sort of a weaker version of the zero product property. With numbers, if you have xy = 0, then, for scalars, that implies x = 0 or y = 0. We can't do that for matrices, because you can have a product of non-zero singular matrices that equals the zero matrix. So we don't get the zero product property, but we get something akin to it: if a product of two matrices is invertible, meaning non-singular, then the two factors had to be invertible as well. This is a consequence of the non-singular matrix theorem, although I'm leaving the proof as an exercise to the viewer. What I want to do now is talk about some examples: what type of information will guarantee that a matrix is non-singular or singular?
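The failure of the zero product property mentioned above is worth seeing concretely. A sketch with matrices of my own choosing: two non-zero (necessarily singular) matrices whose product is the zero matrix.

```python
# The zero product property fails for matrices: AB can be the zero matrix
# even though neither A nor B is zero. Both factors here are singular
# (rank 1), consistent with the corollary: a non-singular product would
# have forced both factors to be non-singular.
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[1.0, 0.0],
              [0.0, 0.0]])

product = A @ B   # the 2x2 zero matrix
```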
So for example, suppose we have a three by three matrix over the complex numbers, and suppose we know its nullity is equal to zero. Can we say anything about it being singular or non-singular? Well, remember, the nullity is the number of non-pivot columns, while the rank of the matrix is the number of pivots. So one thing we see very quickly is that the rank is n minus the nullity: if you take away the number of non-pivot columns from the number of columns, that gives you the number of pivot columns. But n take away zero is just n, and as we saw on our list previously, condition 11 says that the rank of the matrix being n implies that it's non-singular. So if your nullity is zero, your rank is n, and we can definitely say this matrix is non-singular. I wrote n here; for us it would specifically be the number three, but the fact that it was three didn't have much bearing, because the argument works for any n by n matrix. And the fact that it was over the complex numbers made no difference whatsoever. Let's look at another example. Suppose this time A is a five by five matrix over the field Z_2, and furthermore, assume that the equation Ax = 0 has eight solutions, which is possible here: the null space of A would be a three-dimensional subspace of Z_2^5, since 2^3 = 8. What does this say about singular versus non-singular? Well, we talked about this one on a previous slide: the homogeneous system having only the trivial solution is equivalent to being non-singular, but what we see here is that there are eight solutions, so there are seven non-trivial solutions. So Ax = 0 has non-trivial solutions.
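The rank-nullity bookkeeping from the first example can be sketched like this (my own example matrix, over the reals with NumPy floats rather than C, since the field doesn't affect the argument):

```python
# Nullity 0 forces rank n, which by the non-singular matrix theorem means
# the matrix is non-singular, so its inverse exists.
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [2.0, 1.0, 0.0],
              [3.0, 4.0, 1.0]])      # 3x3 with a pivot in every column

n = A.shape[1]
rank = np.linalg.matrix_rank(A)
nullity = n - rank                    # rank-nullity: rank + nullity = n

A_inv = np.linalg.inv(A)              # nullity 0, so this succeeds
```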
And what that implies is that this matrix is not non-singular, so it's singular, and this follows from the non-singular matrix theorem. Let's do one last one. Suppose A is a two by two matrix such that A times the vector (1, 2) is equal to the vector (1, 2). What can we infer? Is this matrix singular or non-singular? Well, it turns out it depends; in fact, we don't have enough information. And why do I know that? Because there are actually two possibilities. We could take A to be the identity matrix: the identity matrix times the vector (1, 2) gives the vector (1, 2), and the identity matrix is non-singular. So it's certainly possible that the matrix is non-singular. But on the other hand, we could take A to be the matrix with rows (1, 0) and (2, 0). Consider that one for a moment; I'll let you verify the arithmetic, but if you multiply this matrix A by the vector (1, 2), you get the vector (1, 2). Yet this matrix is singular. How do I know it's singular? Look at the column of zeros: the column vectors of this matrix A are linearly dependent, because the set of columns contains the zero vector. And if the column vectors are linearly dependent, the matrix is singular, because the non-singular matrix theorem told us that to be non-singular, the column vectors have to be linearly independent; dependent implies singular. So we don't have enough information to answer this question: with the information provided, the matrix could be singular, like the second example, or non-singular, like the identity. We can't nail down whether it's singular or not.
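The two matrices from this last example can be checked directly: both send the vector (1, 2) to itself, yet one is non-singular and the other singular, so fixing a single vector doesn't determine singularity.

```python
# Both candidate matrices map v = (1, 2) to itself, but I2 is non-singular
# while S, with its column of zeros, is singular (rank 1 < 2).
import numpy as np

v = np.array([1.0, 2.0])
I2 = np.eye(2)                       # the non-singular possibility
S = np.array([[1.0, 0.0],
              [2.0, 0.0]])           # the singular possibility

rank_S = np.linalg.matrix_rank(S)    # 1, so S is singular
```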
And so that brings us to the end of section 3.3 about matrix inverses, but it does not bring us to the end of our conversation about inverse matrices. What we've done here with the non-singular matrix theorem is learn ways of identifying when a matrix has an inverse and when it doesn't. But when it does have an inverse, with the exception of two by two matrices, we do not yet have an algorithm to find what that inverse is. That will be remedied in the next section, 3.4, where we'll introduce elementary matrices as a tool to help us compute the inverse of a non-singular matrix. Thanks for watching, everyone. If you've been learning things in these videos and this lecture series, please hit the like button. If you want to learn more cool facts about linear algebra, or see other math videos, feel free to subscribe. And as always, if you have any questions on any of these videos, feel free to post them in the comments below, and I'll be happy to reply when I can. Have a great day, everyone. Bye.