Hi, I'm Zor. Welcome to a new Zor Education lecture. I would like to spend some time talking about an operation on matrices which we haven't really discussed yet: division. Well, let me start from division of numbers. A very convenient way to introduce division of numbers is to first introduce an inverse number. If you have an integer N, we define a new number, which we write symbolically as 1/N, and which by definition gives 1 when multiplied by N. This is essentially how the rational numbers are built from the integers. We will approach the same question for matrices in exactly the same way. So, first, for a matrix we have to define something we can call an inverse matrix: a matrix which, multiplied by our matrix, gives the equivalent of 1 in the matrix world. So what is the equivalent of 1 in the matrix world? Well, we have already talked about the identity matrix, sometimes called the unit matrix: the matrix with 1's on the main diagonal and 0's everywhere else. The property of this matrix is that if it is multiplied by any other matrix, the result is exactly that matrix; we already showed this in the previous lectures on matrix multiplication. In matrix notation, if the identity matrix is I, then I times any matrix A gives back A. The multiplication works on both sides, it doesn't really matter. In this case, by the way, the multiplication is commutative, which, generally speaking, is not the case for matrices. So, we know what 1 actually is in the matrix world: it's the identity matrix, or unit matrix. Now the question is: can we define an inverse of a matrix A, usually written as A to the power of minus 1, by analogy with the numerical notation?
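The identity-matrix property described above can be checked in a few lines of NumPy; the particular 2×2 matrix here is just my own illustrative choice.

```python
import numpy as np

# A hypothetical 2x2 matrix chosen for illustration (any matrix behaves the same)
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
I = np.eye(2)  # the 2x2 identity matrix: 1's on the main diagonal

# Multiplying by the identity on either side leaves A unchanged,
# and in this special case the multiplication commutes
print(np.array_equal(I @ A, A))  # True
print(np.array_equal(A @ I, A))  # True
```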
We don't really use the fraction-style notation 1/A, and I will tell you why later. In any case, we would like the inverse matrix to have the property that, multiplied by the original matrix, it gives the identity matrix, and it is supposed to work on both sides. So, for a matrix A, the inverse matrix is the matrix we write symbolically as A⁻¹ (by the way, it's not really a power, just a symbolic notation), with the property that A⁻¹ multiplied by A, on the left or on the right, gives the identity matrix. That's how we will try to introduce the division operation through multiplication: division by something in the numerical world is basically multiplication by its inverse, and that's how we will define the inverse in the matrix world and approach the problem of division. Okay, so this is the definition of the inverse matrix. But a definition must be reasonable and valid, and any definition should have certain properties. The two most important questions about any definition are these. First, existence: we can define an object which maybe doesn't exist at all; maybe there are no matrices with this property. Second, and very important, uniqueness: assuming such a matrix exists, does the definition really pin it down? Maybe there is more than one matrix with these properties, in which case the definition is fuzzy. Yes, we could call any matrix with this property an inverse, but that's not really nice; we would like a unique matrix with this property. So these two very important aspects, existence and uniqueness of whatever we define, we have to discuss right now.
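The two-sided definition can be sketched numerically; I pick an invertible 2×2 matrix for illustration, and `np.linalg.inv` computes its inverse.

```python
import numpy as np

# An invertible 2x2 matrix chosen for illustration (determinant = 2*3 - 1*5 = 1)
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
A_inv = np.linalg.inv(A)  # NumPy computes the inverse numerically

# By definition the inverse works on both sides (up to floating-point rounding)
print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```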
Okay, so first of all let me speak about the dimensions of the matrices. Can a matrix A with m rows and n columns, where m is not equal to n (so, not a square matrix), be invertible? Does it have an inverse matrix? Well, let's just think about it. We would like both products, A⁻¹ times A and A times A⁻¹, to be defined. Remember: the product of two matrices is defined only if the number of columns of the left factor equals the number of rows of the right factor. So if A has dimension m×n, then for A times A⁻¹ to be defined, A⁻¹ must have n rows; and for A⁻¹ times A to be defined, A⁻¹ must have m columns. So we found out that the inverse of A would have to have dimension n×m. But now let's talk about the results. If I multiply A⁻¹, which is n×m, by A, which is m×n, I get an n×n matrix. On the other side, multiplying A, which is m×n, by A⁻¹, which is n×m, the result is m×m. Now, that's not nice: we cannot say these two are the same; even if they were both identity matrices, they would be identity matrices of different dimensions, one n×n and the other m×m. So my point is that if a matrix has a different number of rows and columns, it cannot be inverted, end of story. Only square matrices can be inverted, and this is basically the reason why: multiplying left by right gives an identity of one dimension, and right by left gives another, which is not really a good way to define an object.
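The dimension argument is easy to see in code; the 2×3 matrix and the 3×2 candidate below are my own illustrative choices.

```python
import numpy as np

# A is 2x3 (m = 2 rows, n = 3 columns); any candidate "inverse" B
# would have to be 3x2 for both products to be defined
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

print((A @ B).shape)  # (2, 2)
print((B @ A).shape)  # (3, 3)
# The two products have different dimensions, so they can never both
# equal the identity matrix of a single size
```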
So, the first thing to remember is that only square matrices can be inverted; the inverse is defined basically only for square matrices. A square matrix has an inverse only if there exists another matrix of the same dimension, n×n, with these properties. Alright, fine. Now, before addressing the existence and uniqueness questions, let me talk about division. Why did we introduce the inverse matrix in order to talk about division? Again, as an analogy with numbers: we can have three quarters, and we know what it is, right? It's 3 times 1/4, or 3 times 4⁻¹. What's interesting is that 1/4 times 3, or 4⁻¹ times 3, is exactly the same number, because multiplication of numbers is commutative. Now let's talk about matrices. What if we want to divide A by B? Analogously, we could say this is A times B⁻¹. But why did we write it in that order? We could equally well have written B⁻¹ times A. The problem is that these are not equal to each other in the general case, because matrix multiplication, as we know, is not commutative. This means writing A/B is not really a good idea: the horizontal bar separating numerator from denominator does not tell us on which side the inverse goes, so it is not a well-defined thing. What is well defined is A times B⁻¹, or B⁻¹ times A. So whenever we want to talk about division, we actually have to specify explicitly which product we really mean, and at that point it doesn't make any sense to talk about division at all, because it's just multiplication. What does make sense is to talk about the inverse matrix B⁻¹ and about multiplication.
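The ambiguity of "A divided by B" shows up immediately in NumPy; both matrices below are my own invertible examples.

```python
import numpy as np

# Two invertible 2x2 matrices chosen for illustration
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[2.0, 1.0],
              [5.0, 3.0]])
B_inv = np.linalg.inv(B)

# "A divided by B" is ambiguous: the two candidate products differ,
# because matrix multiplication is not commutative
print(np.allclose(A @ B_inv, B_inv @ A))  # False for these matrices
```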
So, forget about division of matrices. Obviously, somebody could talk about it, but to be precise, let's not talk about division of matrices. We will talk about multiplication of matrices and the concept of the inverse matrix, and then about how we position the inverse matrix relative to the other one. That's our business, and that's why we prefer to forget about division of matrices and talk about multiplication, specifying explicitly what kind of multiplication we need: what's on the left, what's on the right. So that actually closes the issue of division of matrices. There is no valid point in talking about it; what does make sense is the inverse matrix and multiplication of matrices. That's why you won't find the concept of matrix division discussed in any substantial part of the literature. Okay, now let's talk about existence. We have defined our inverse matrix, and it is defined only for square matrices. Now, does it always exist? No, it does not. Let's put it this way: not for every matrix can I find an inverse. Similarly with numbers, by the way: there is no inverse for zero, right? So, what is the equivalent of the number zero in the matrix world? Well, for instance, the matrix containing all zeros: a null matrix, if you wish. If A is a null matrix, with all elements equal to zero, then obviously, no matter what A⁻¹ is, the result of the multiplication will contain only zeros, because the elements of A participate in every entry of the product; that's just how the formulas of matrix multiplication look. So the result is the null matrix, which is obviously not the identity matrix. We cannot get the identity matrix if A is a null matrix. That's trivial example number one.
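A quick NumPy check of trivial example number one: asked to invert the null matrix, `np.linalg.inv` raises a `LinAlgError`, its standard signal for a singular matrix.

```python
import numpy as np

Z = np.zeros((2, 2))  # the 2x2 null matrix

# NumPy refuses to invert it: the null matrix is singular
try:
    np.linalg.inv(Z)
except np.linalg.LinAlgError:
    print("no inverse exists for the null matrix")
```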
A little bit less trivial example: what if the matrix A contains a row or a column of zeros? Again, you know from the rules of matrix multiplication that a zero row (or column) in a factor produces a zero row (or column) in the product. Let me just give you an example. Take the matrix with first row (x, y) and second row (0, 0), and let (a, b; c, d) be a candidate for its inverse. Both factors are 2×2, so the product is a 2×2 matrix. Multiplying (a, b; c, d) by (x, y; 0, 0): the (1,1) element is the first row times the first column, ax + b·0 = ax; the (2,1) element is cx + d·0 = cx; the (1,2) element is ay, and the (2,2) element is cy. Now the other way around: multiplying (x, y; 0, 0) by (a, b; c, d), the first row of the product is (xa + yc, xb + yd) and the second row is (0, 0). If (a, b; c, d) is to be an inverse of this matrix, both products should equal the identity. But, as I said, a zero row in the left factor gives a zero row in the product, and that can never be an identity matrix, because the identity has a 1 in every row and every column. With a zero column the situation is mirrored: the multiplication on the right gives a zero column, and the product on the other side does not. So one of these two cases will always be present: if either a row or a column of A is zero, at least one of the two products will definitely not be the identity. So there is no inverse; there is no matrix (a, b; c, d) whose multiplication with A results in the identity matrix. Okay, fine. So this is again a fairly trivial example.
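The zero-row argument, with concrete numbers standing in for (x, y; 0, 0) and an arbitrary candidate (a, b; c, d):

```python
import numpy as np

# A matrix with a zero second row, and an arbitrary candidate inverse
A = np.array([[1.0, 2.0],   # plays the role of (x, y; 0, 0)
              [0.0, 0.0]])
B = np.array([[5.0, 6.0],   # plays the role of (a, b; c, d)
              [7.0, 8.0]])

P = A @ B
print(P)                              # second row is all zeros
print(np.array_equal(P[1], [0, 0]))  # True -- cannot be the identity
```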
So, the most trivial example is the null matrix, which does not have an inverse. The next, slightly less trivial one is when one row or one column (or several of them) is zero, not necessarily the entire matrix. And I have two other examples that are slightly more interesting. Consider a matrix in which one row is proportional to another row, and let me take a concrete example: (2, 4; 3, 6). Take this as A, and consider some candidate (a, b; c, d) for its inverse. The result of the product must be the identity matrix, right? Well, let's see what happens. The product of (2, 4; 3, 6) and (a, b; c, d) has first row (2a + 4c, 2b + 4d) and second row (3a + 6c, 3b + 6d). Now, what do we have as a result? We have the same proportionality between these two rows: the second row of A is the first multiplied by 1.5, and the second row of the product is also the first row of the product multiplied by 1.5. Now let's look at the identity matrix. It has 1's on the main diagonal, and there is no proportionality there: the rows of the identity are completely independent, with no linear dependency between them. You obviously cannot multiply (1, 0) by 1.5 and get (0, 1). So my point is: if two rows of A are proportional to each other, then no matter what candidate inverse we consider, the product carries the same proportionality between its rows; and since the product must equal the identity, which cannot have this proportionality, there is no inverse matrix. Now, in this example the columns happen to be proportional as well, and to demonstrate that column proportionality results in exactly the same thing, we have to reverse the order: multiply the candidate inverse (a, b; c, d) by our matrix (2, 4; 3, 6) on the right.
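The proportional-rows example from the lecture, checked numerically; `B` is an arbitrary candidate inverse, and the product inherits the same 1.5 proportionality between rows.

```python
import numpy as np

# Rows proportional: row 2 = 1.5 * row 1 (the columns are proportional too)
A = np.array([[2.0, 4.0],
              [3.0, 6.0]])
B = np.array([[5.0, 6.0],   # an arbitrary candidate inverse
              [7.0, 8.0]])

P = A @ B
print(np.allclose(P[1], 1.5 * P[0]))      # True -- proportionality preserved
print(np.isclose(np.linalg.det(A), 0.0))  # True -- A is singular, no inverse
print(np.linalg.matrix_rank(A))           # 1 -- rows linearly dependent
```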
Now, in this case, the result of (a, b; c, d) times (2, 4; 3, 6) has entries 2a + 3b and 4a + 6b in the first row, and 2c + 3d and 4c + 6d in the second. As you see, the columns of the product are proportional: this second column is double the first, exactly as in A itself, where the second column is double the first. So this proportionality is again preserved: proportionality between the columns is preserved when we multiply by this matrix on the right, just as proportionality between the rows is preserved when we multiply by it on the left. And again, the identity matrix has no columns or rows proportional to each other. So that's another, a little bit less trivial, example of a non-existent inverse matrix. And the last example of a matrix that does not have an inverse involves a linear combination, which is slightly more complex than proportionality. Proportionality is a relation between just two rows, say: one row is another multiplied by a certain coefficient. Linear dependency is a similar relation among multiple rows: you multiply one row by one number, another row by another number, add them, and you get a third row, something like that. In this case, exactly the same logic leads to the same linear dependency in the result of the multiplication; you can check it yourself on simple matrices, it's very easy. So the point is that linear dependency of the rows, or linear dependency of the columns, of the matrix A is yet another cause of a non-existent inverse matrix. Well, maybe there are no inverse matrices at all; maybe there exist only matrices which do not have an inverse? That's not the case, and we will have problem sessions where I will actually derive concrete inverse matrices for matrices which do have them. All right, so I wanted to make sure you understand that not every matrix has an inverse, but there are many matrices which do, and we will address that. And the last question which I would like to address is uniqueness.
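The linear-combination case can be illustrated with a 3×3 matrix of my own choosing, in which the third row is row one plus twice row two:

```python
import numpy as np

# Row 3 = 1*row 1 + 2*row 2: a linear combination, not simple proportionality
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 2.0, 4.0]])

print(np.linalg.matrix_rank(A))           # 2, not 3 -- rows linearly dependent
print(np.isclose(np.linalg.det(A), 0.0))  # True -- singular, no inverse exists
```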
So, in case the inverse matrix exists, is it unique? I mean, maybe there are two different inverse matrices for the same A, right? This can actually be formulated as a little theorem, a mini-theorem if you like, and it's very simple. Suppose a matrix A has two inverses, U and V; I would like to prove that U and V are the same matrix. If both are inverses, then by the definition of the inverse matrix, U·A = A·U = I and V·A = A·V = I; and by the way, all of these matrices have exactly the same dimension, n×n square matrices, whatever n is. Okay, so let me use the equation A·U = I first, and multiply both sides by V on the left: V·(A·U) = V·I. Now, multiplication of matrices, as we know, is associative; if you don't remember that, go back to one of the previous lectures, where I discussed and proved it. The associative law allows us to move the parentheses, so I can write (V·A)·U = V·I. On the right side, V·I is just V, since multiplying any matrix by the identity gives back that matrix. On the left side, V·A = I, so we have I·U, and again, the identity multiplied by any matrix gives that matrix, so I·U = U. Hence U = V, and that's exactly what we had to prove. All right, that basically concludes my introduction to the inverse matrix. We will spend some time in the problem-solving lectures finding out what exactly these inverse matrices are and calculating them. Meanwhile, this is the end of the theoretical part, so to speak. I invite you to unizord.com, where this lecture also comes with notes; do go through the notes, it's very important. And basically, that's it.
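The chain of equalities in this mini-theorem can be written out compactly, starting from A·U = I and multiplying by V on the left:

```latex
V \;=\; V I \;=\; V (A U) \;=\; (V A) U \;=\; I U \;=\; U
```

Each step uses only the identity property, the definition of the two inverses, and associativity, so any two inverses of the same matrix must coincide.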
Good luck, and see you in the next lecture, about problems related to inverse matrices. Thank you very much.