Welcome back to our lecture series, Linear Algebra Done Openly. As usual, I'll be your professor today, Dr. Andrew Missildine. Now, we mentioned in the previous section, Section 3.1 on matrix operations, that the set F^(n×n), the set of all n-by-n matrices with coefficients coming from the field F, forms a vector space, where matrix addition and scalar multiplication satisfy the eight properties of Definition 1.3.1, that is, the definition of a vector space, those eight axioms. So this tells us things like: matrix addition commutes; matrix addition is associative; there is an additive identity with respect to matrix addition, the so-called zero matrix; every matrix has an additive inverse, which is really just negative one times that matrix; scalar multiplication distributes over matrix addition; matrix addition distributes over scalar multiplication; scalar multiplication is associative; and multiplying by one gives you back the original matrix. So all of those nice properties we had before are true for these matrix operations of addition and scalar multiplication. But in the previous section we introduced some new operations that didn't have counterparts in the vector spaces we studied in Chapter 2. Some examples would be the trace map, transposition, and matrix multiplication. What happens to these? What kind of properties can we state about those operations? Those are the things I want to talk about right here and right now. So let's begin with the transposition map. Suppose that A and B are two matrices of the same size; let's say they're both m-by-n matrices, just to be clear. Then it makes sense that we could add the matrices together, at least in this first situation, so we can take A + B. What if you take A + B and then take the transpose of that? How does transposition interact with matrix addition?
Well, it turns out that transposition distributes over matrix addition: (A + B) transpose is equal to A transpose plus B transpose. Remember, the transposition operation switches rows to columns and columns to rows. When you add two matrices together and take the transpose, the transpose just distributes, right? It's A transpose plus B transpose. Likewise, how does it interact with a scalar? Let's say c is some scalar from the field F. If you take (cA) transpose, this is the same thing as c times A transpose. That is, the scalar essentially comes out of the transposition process and doesn't really interact with the transposition at all. I just want to pause right here and look at properties one and two; we'll get to three and four in a second. What this tells us is that transposition preserves matrix addition and it preserves scalar multiplication. This is what we refer to as the linearity property. More specifically, it means that transposition can be thought of as a linear transformation. What I mean is the following: we can think of transposition as a map that takes a matrix from the vector space F^(m×n) and sends it to a matrix of the form n-by-m. It switches rows and columns, so those dimensions swap. So the transposition map sends a matrix from F^(m×n) to F^(n×m), and since both of these spaces of matrices are vector spaces, we can ask: is this map linear? Properties one and two tell us that it is. Transposition is a linear map on these spaces of matrices. Now, what else do we have here? Property three tells us that if you take the transpose of the transpose, (A transpose) transpose, the double transpose gives you back the original matrix A. Taking the transpose twice sends you back to where you started. That's quite useful here.
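These first three properties are easy to check numerically. Here's a minimal sketch in plain Python, representing matrices as lists of rows; the helper functions and example matrices are my own illustration, not from the lecture:

```python
def transpose(A):
    # Switch rows to columns: entry (i, j) of A becomes entry (j, i).
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

def add(A, B):
    # Entrywise sum of two same-size matrices.
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def scale(c, A):
    # Multiply every entry of A by the scalar c.
    return [[c * A[i][j] for j in range(len(A[0]))] for i in range(len(A))]

A = [[1, 2, 3],
     [4, 5, 6]]
B = [[7, 8, 9],
     [0, 1, 2]]

# Property 1: (A + B)^T == A^T + B^T
print(transpose(add(A, B)) == add(transpose(A), transpose(B)))  # True

# Property 2: (cA)^T == c(A^T)
print(transpose(scale(5, A)) == scale(5, transpose(A)))  # True

# Property 3: (A^T)^T == A
print(transpose(transpose(A)) == A)  # True
```

Any other 2-by-3 matrices would work just as well here; linearity holds entrywise, not just for this example.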
But what this also tells you is that the inverse of the transpose map is the transpose map itself; the transpose undoes itself, like taking the reciprocal of a reciprocal. That's really nice. And so for the fourth property, we're going to consider how transposition interacts with matrix multiplication. Now in this situation, A would have to be, say, an m-by-n matrix and B an n-by-p matrix in order for the product to be compatible, right? Because the n's cancel out, and the product AB would be an m-by-p matrix. So think about that. I'll just erase this right here; we don't need it right now. So we have m-by-n times n-by-p, and that gives you an m-by-p matrix. That's what we get for AB. So what does (AB) transpose give us? (AB) transpose is equal to the matrix B transpose times A transpose. You'll notice that when you take the transpose of a product, it actually switches things around: B comes first now and A comes second. This is what we often refer to in the literature, and this is not a joke, as the shoe-sock principle. What do shoes and socks have to do with anything here? Well, the idea is the following. In the morning you put your socks on, then your shoes; in the afternoon or evening, when you return home, you take your shoes off, then your socks. The process reverses itself. So we first had sock then shoe, and undoing it gives shoe then sock. Transposition flips things around in the same way, and let's actually see why it has to. If B is an n-by-p matrix, then B transpose is a p-by-n matrix, and if A is an m-by-n matrix, then its transpose is an n-by-m matrix. Multiplying those, p-by-n times n-by-m, gives you a p-by-m matrix. And since AB is an m-by-p matrix, its transpose should indeed be p-by-m, which is exactly the shape of B transpose times A transpose.
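Here's that dimension bookkeeping made concrete, a small sketch in plain Python with A taken as 2-by-3 and B as 3-by-4; the helpers and the specific entries are my own, chosen only for illustration:

```python
def transpose(A):
    # Entry (i, j) of A becomes entry (j, i) of the transpose.
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

def matmul(A, B):
    # Standard matrix product: (AB)[i][k] = sum over j of A[i][j] * B[j][k].
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2, 3],
     [4, 5, 6]]          # 2-by-3

B = [[1, 0, 2, 1],
     [0, 1, 1, 0],
     [3, 1, 0, 2]]       # 3-by-4

# Shoe-sock principle: (AB)^T == B^T A^T, a 4-by-2 matrix.
print(transpose(matmul(A, B)) == matmul(transpose(B), transpose(A)))  # True

# The "wrong" order A^T B^T would be a (3-by-2) times (4-by-4... no,
# 4-by-3) product, which is not even defined, so matmul would fail on it.
```

Note that B transpose is 4-by-3 and A transpose is 3-by-2, so the product in the correct order is 4-by-2, matching the transpose of the 2-by-4 matrix AB.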
If you didn't flip these things around... I'm going to draw this one in red, so you think of death and blood right here, because this is the wrong way to do it. If you did A transpose times B transpose, that would look like n-by-m (that's A transpose) times p-by-n (that's B transpose). We don't even know if such a matrix product is defined or not. So you have to flip things around in order for the matrix product to even make sense; you hit the shoe-sock principle when it comes to transposition. This is the thing to remember most: when you take the transpose of a product, it switches the factors around by the shoe-sock principle. This is actually alluding to something we'll talk about in the next video, which is that matrix multiplication, as it was defined, is not a commutative operation: A times B does not equal B times A in general. Oftentimes this is because, like we see here, even if A times B is well-defined, B times A might not exist; the product might be incompatible. But it turns out there are other issues going on there too, and we'll talk about that later. Let's talk about the trace for a moment. Remember, the trace of a matrix is the sum of the main diagonal: you add together all the terms on the diagonal. That's the trace. So imagine we have two matrices that are n-by-n, right? We only define the trace for square matrices. So what can we say about the trace map? Well, if you take the trace of the sum of two matrices, it equals the trace of A plus the trace of B. So the trace preserves addition, and that kind of makes sense, because how do you find the diagonal entries of A + B? You just add together the corresponding diagonal entries of A and of B. And since those scalars come from a field, where addition is associative and commutative, we can break apart and regroup those terms. And so the sum of the diagonal entries of A + B is the sum of the diagonal entries of A plus the sum of the diagonal entries of B.
Well, what if we put a scalar multiple in there? If you take the trace of c times A, well, multiplying A by c multiplies every entry of the matrix by c, and in particular the diagonal entries get multiplied by c as well. So as you add up the terms of the diagonal of cA, every one of them has a factor of c, and you can factor it out. What's left behind is the sum of the diagonal entries of A, and you multiply that by c. So scalar multiplication is preserved by the trace. Now, if I look at properties one and two again, what is this telling us? If we think of the trace as a map from F^(n×n), which is an n-squared-dimensional vector space, down to a scalar, and a scalar we can think of as living in a one-dimensional vector space, then both the square matrices and the field itself can be viewed as vector spaces. And as such, if we have a map between vector spaces, we should ask: is it a linear map? Properties one and two right here tell us that the trace is in fact a linear transformation. What a surprise, that all these operations we keep considering are linear transformations! Hopefully it's not a surprise; we are in linear algebra, and the things we care about in this class are linear things. If the trace were not linear, we probably wouldn't care about it. So yes, the transposition map is a linear transformation, and so is the trace map. And I should mention that when it comes to the transpose map, we can make similar statements about the conjugate transpose for complex matrices. So, property three: how does the trace interact with transposition? If you take the trace of A transpose, that's actually the same thing as the trace of A, because as you switch rows to columns and columns to rows, I want you to notice that the main diagonal of the transpose is identical to the one you started with. Transposition doesn't change anything with respect to the diagonal.
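The trace properties so far can be verified the same way. Again a minimal sketch in plain Python; the helper functions and the 2-by-2 examples are my own, not the lecture's:

```python
def trace(A):
    # Sum of the main-diagonal entries of a square matrix.
    return sum(A[i][i] for i in range(len(A)))

def add(A, B):
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def scale(c, A):
    return [[c * A[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def transpose(A):
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

print(trace(add(A, B)) == trace(A) + trace(B))  # True: trace preserves addition
print(trace(scale(3, A)) == 3 * trace(A))       # True: trace preserves scaling
print(trace(transpose(A)) == trace(A))          # True: same diagonal, same trace
```

The last check is the point of property three: transposing moves the off-diagonal entries around but fixes every diagonal entry, so the sum along the diagonal is unchanged.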
So as you add up along the diagonal, A transpose has the same trace as A; transposition doesn't do anything to the trace. What about matrix multiplication? The trace of AB is actually equal to the trace of BA. Now why is this property important? Like I was mentioning earlier when we talked about transposition, matrix multiplication is not necessarily commutative: A times B existing doesn't guarantee that B times A exists. Now in this setting, I'll get the red pen out again for blood. If you take an n-by-n matrix and multiply it by an n-by-n matrix, you get an n-by-n matrix. In this situation, A times B is defined and B times A is also defined; because these are square matrices, both AB and BA exist. But what we are not saying here is that AB is equal to BA. In general these matrices will be distinct from each other, and check out the next video to see a counterexample showing that matrix multiplication does not commute. So we're not saying that AB equals BA, but what we are saying is that the traces of these two likely distinct matrices are still the same number. And also notice here that we are not saying that the trace of AB is equal to the trace of A times the trace of B. So the trace map does not preserve matrix multiplication, but it does tell us that as the factors swap places, the trace stays the same number, which is actually quite an impressive thing. When we look at some examples, and I'll point this out in the next video, I'll show you a counterexample where AB and BA are different matrices, but the traces still turn out to be the same, even though the diagonal entries are not the same. That's what's kind of curious here; this is a genuinely non-elementary fact.
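Here's a small concrete instance of exactly that phenomenon, sketched in plain Python; the particular matrices A and B are my own choice, not the counterexample from the next video:

```python
def matmul(A, B):
    # (AB)[i][k] = sum over j of A[i][j] * B[j][k].
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

A = [[1, 2],
     [3, 4]]
B = [[0, 1],
     [1, 0]]

AB = matmul(A, B)   # [[2, 1], [4, 3]]
BA = matmul(B, A)   # [[3, 4], [1, 2]]

print(AB == BA)                          # False: the products are different matrices
print(trace(AB) == trace(BA))            # True: both traces equal 5
print(trace(AB) == trace(A) * trace(B))  # False: 5 != 5 * 0
```

So AB and BA differ entry by entry, even on their diagonals (2 and 3 versus 3 and 2), yet the diagonal sums agree, and the last line shows the trace genuinely does not preserve products.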
Speaking of all these properties of matrix multiplication: I've given some bad news, right? Matrix multiplication will not commute, and we'll see some counterexamples of that soon. But there are some good algebraic properties of matrix multiplication that we can count on, similar to multiplication we've seen in the past. For example, matrix multiplication is associative: A times (BC) is the same thing as (AB) times C. You can either multiply the first two then the last, or the last two then the first, and those two products are the same, so matrix multiplication is associative. Matrix multiplication is also distributive. If you have A times (B + C), you can distribute the A and get AB + AC; that's the left distributive law. You also get the right distributive law: (A + B) times C distributes to AC + BC. It's important that we mention both of these properties because, as I keep mentioning, matrix multiplication is non-commutative, so the fact that it distributes on the left does not automatically mean it distributes on the right, although matrix multiplication does work out in that regard. What about scalar multiplication? How does matrix multiplication interact with that? If you take the product AB and you scale it by some scalar c, this is the same thing as scaling A and then multiplying, or scaling B and then multiplying. So even though matrix multiplication does not commute in general, it turns out scalar multiplication does slide through the product: you can scale the product, the first factor, or the second factor, and it doesn't matter, you get the same thing. And the last property I wanted to mention in this video is that matrix multiplication does have a multiplicative identity, what we call the identity matrix.
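All four of these algebraic laws can be spot-checked on small examples. A minimal sketch in plain Python, with helper functions and 2-by-2 matrices of my own choosing:

```python
def matmul(A, B):
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

def add(A, B):
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def scale(c, A):
    return [[c * A[i][j] for j in range(len(A[0]))] for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
C = [[2, 0], [1, 3]]

# Associativity: A(BC) == (AB)C
print(matmul(A, matmul(B, C)) == matmul(matmul(A, B), C))  # True

# Left and right distributive laws
print(matmul(A, add(B, C)) == add(matmul(A, B), matmul(A, C)))  # True
print(matmul(add(A, B), C) == add(matmul(A, C), matmul(B, C)))  # True

# Scalars slide through the product: c(AB) == (cA)B == A(cB)
print(scale(7, matmul(A, B)) == matmul(scale(7, A), B) == matmul(A, scale(7, B)))  # True
```

A spot check on one triple of matrices is of course not a proof; the actual proofs expand both sides entrywise using the definition of the product, as the text sketches.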
If A is an m-by-n matrix, then I_m, which is the m-by-m identity matrix, ones along the diagonal and zeros everywhere else, satisfies I_m times A equals A, and likewise A times I_n also equals A. So if you multiply by the identity on the left or the right, you always get back A. Now, the identity you choose does depend on the size of A. If A has m rows, you need I_m on the left; if A has n columns, then you need I_n on the right. Of course, if A is an n-by-n square matrix, the same identity works on both sides. But because there can be a mismatch between the number of rows and the number of columns, you have to multiply on the left or right by the appropriately sized identity matrix. So this gives us some important properties of matrix multiplication, matrix transposes, and matrix traces, the operations we learned about in the previous section. And for the most part, these things behave very, very nicely. But like I said, there are some properties of matrix multiplication that don't turn out to work so nicely, and we're going to explore those in our next video.
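The size bookkeeping for the identity matrix can be sketched the same way. Again plain Python, with a 2-by-3 example matrix of my own, so m = 2 and n = 3:

```python
def identity(n):
    # The n-by-n identity: ones on the diagonal, zeros elsewhere.
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2, 3],
     [4, 5, 6]]   # 2-by-3, so m = 2 rows and n = 3 columns

print(matmul(identity(2), A) == A)  # True: I_2 acts as identity on the left
print(matmul(A, identity(3)) == A)  # True: I_3 acts as identity on the right

# Using the wrong size, e.g. matmul(identity(3), A), would be a
# (3-by-3)(2-by-3) product, which is not defined.
```

For a square matrix the two identities coincide, which is why the subscript only matters in the rectangular case.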