In this video, I want to talk a little bit about groups of matrices. So take the set M_n(R) to be the set of all n by n matrices with real entries. Now this by itself doesn't form a group, because not every element is invertible. So at the moment this is just a set, but it is a set with a binary operation: we can multiply a square matrix by a square matrix, and that gives us back a square matrix. Matrix multiplication is associative, and it has an identity, which we typically call the identity matrix: ones along the diagonal and zeros everywhere else. But you don't in general have inverses, so this set of matrices doesn't form a group. It actually forms something we call a semigroup, or more precisely a monoid. A semigroup is a set with an associative binary operation; if it also has an identity, we call it a monoid. So a monoid has associativity and an identity, while a semigroup has just an associative binary operation. These are generalizations of groups, and M_n(R) doesn't quite make it to a group. But some matrices do have inverses; these are called non-singular matrices. What if we restrict our attention to non-singular matrices? This actually does give us a group. It's denoted GL_n(R) and is called the general linear group. "Linear" because matrices are a big deal in linear algebra, and "general" because there are other linear groups, like the special linear group, the orthogonal group, and the unitary groups. (I guess Hermitian matrices don't form a group.) The point is there are lots and lots of different types of matrix groups, and we'll talk about some more of them in the future.
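To see concretely why M_n(R) is only a monoid and not a group, here is a small sketch of my own (not from the video) using numpy: one matrix below is invertible and lands in GL_2(R), while the other is singular and has no inverse.

```python
import numpy as np

# Illustrative example (my own, not from the video): matrix multiplication
# on M_2(R) is associative and has an identity, but inverses can fail.

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # non-singular: det(A) = -2, so A is in GL_2(R)
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # singular: the second row is twice the first

print(np.linalg.det(A))      # nonzero determinant
print(np.linalg.det(B))      # zero determinant, so B has no inverse

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))   # A times its inverse is the identity
```

Trying `np.linalg.inv(B)` instead would raise a `LinAlgError`, which is exactly the failure of inverses that keeps M_2(R) from being a group.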
And so if we only pay attention to invertible, that is non-singular, matrices, we get a group. Now we can actually build a group inside of the general linear group, and you can see this below here. It's going to be a group of eight matrices, and it forms what's called the quaternion group. To simplify things, let me use the following symbols. We'll use the number 1 to describe the identity matrix [[1, 0], [0, 1]]. We'll use the letter i to represent the matrix [[0, 1], [-1, 0]]. We'll use the symbol j to denote the matrix [[0, i], [i, 0]]. I forgot to mention that these are complex matrices now, because we're using imaginary numbers. And the symbol k will denote the matrix [[i, 0], [0, -i]]. So these are the four particular matrices I'll be interested in; notice that 1 is, of course, the identity matrix. We're going to take a set of matrices in GL_2(C), the invertible two by two matrices with complex entries: the eight matrices consisting of the identity and negative one times the identity; i and also -i, where we scale the matrix i by negative one, so -i looks like [[0, -1], [1, 0]]; j and -j, where -j looks like [[0, -i], [-i, 0]], you can do that one; and then k and -k, and you get the idea: -k has -i and i on the diagonal. We claim that these eight matrices, together with the usual notion of matrix multiplication, form a group. Now something is going to be pretty obvious: because the multiplication of these eight matrices is just usual matrix multiplication, it is associative.
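The eight matrices above are easy to write down on a computer. Here is a sketch of my own (the variable names are my choices, not from the video) that defines them with numpy, using Python's `1j` for the imaginary unit, and checks the closure claim directly by multiplying every pair.

```python
import numpy as np

# The four basic quaternion matrices in GL_2(C), as defined in the text.
one = np.array([[1, 0], [0, 1]], dtype=complex)
i   = np.array([[0, 1], [-1, 0]], dtype=complex)
j   = np.array([[0, 1j], [1j, 0]], dtype=complex)
k   = np.array([[1j, 0], [0, -1j]], dtype=complex)

# The eight elements of the quaternion group Q8.
Q8 = [one, -one, i, -i, j, -j, k, -k]

def in_Q8(M):
    # Every product has exact entries 0, +/-1, +/-1j, so exact comparison is safe.
    return any(np.array_equal(M, X) for X in Q8)

# Closure: the product of any two elements is again one of the eight.
closed = all(in_Q8(A @ B) for A in Q8 for B in Q8)
print(closed)   # True
```

Since all sixty-four products land back in the set, matrix multiplication really does restrict to a binary operation on these eight matrices.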
And we do have an identity element. Great. Do we have inverses, though? That one's a little bit harder to say. In fact, do we even have an operation in hand? Be careful: we know that there is a map, multiplication, from GL_2(C) × GL_2(C) to GL_2(C). Matrix multiplication takes a matrix times a matrix and gives us a matrix, but that's not what we're interested in right now. What we need is this: if we take something in Q8 and combine it with something in Q8, will it give us something in Q8? Because if that doesn't happen, we don't actually have a binary operation on Q8; instead, we only have a binary operation on the complex matrices. So can we restrict the binary operation to this smaller subset? This is information we can gain from the Cayley table, which you now see illustrated right here. The identity is pretty clear: if you multiply any matrix by the identity, you just get back that element again. Also, if you multiply any matrix by -1, it's almost like the identity, but you toggle between the positive and negative of the matrices: -1 times 1 gives you -1, but -1 times -1 gives you 1. If you multiply -1 by i, you get -i, and -1 times -i gives you i. So multiplication by -1 acts just like scaling, which is actually why we call these two 1 and -1 here: they're just scalar matrices. What happens when you multiply by i? i times 1 is i, and i times -1 is -i; that's nice. If you take i times i, you actually get -1, and i times -i is actually 1. So looking right here, we now have a candidate: i has -i as its inverse, and you can double-check that -i, when you multiply it by i, gives you 1. So i has an inverse, and so does -i. That's great.
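The inverse claims in that paragraph can be checked mechanically. This is a small sketch of my own (not from the video) verifying that i squared is -1 and that -i is a two-sided inverse of i.

```python
import numpy as np

# Checking the Cayley-table claims: i*i = -1, i*(-i) = 1, (-i)*i = 1.
one = np.array([[1, 0], [0, 1]], dtype=complex)
i   = np.array([[0, 1], [-1, 0]], dtype=complex)

print(np.array_equal(i @ i, -one))    # True: i^2 = -1
print(np.array_equal(i @ -i, one))    # True: -i is a right inverse of i
print(np.array_equal(-i @ i, one))    # True: and a left inverse too
```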
Let's see, what happens when you do i times j? I'll let you double-check the details, but if you take the matrix i and multiply it by the matrix j, this is actually equal to the matrix k right here. Ooh, I guess I'm going to back up on this one; I'm actually going to do the details, just in case you have any doubt. We take [[0, 1], [-1, 0]] and multiply it by j, which is [[0, i], [i, 0]], and just go through the usual matrix product. Row one times column one gives you an i. Row one times column two gives you a zero. Row two times column one gives you a zero, and row two times column two gives you a -i. And that's exactly k. All right, that's not so bad. Let's go back up to our Cayley table. So i times j is actually equal to k, and i times -j is -k; that's not too surprising. i times k, I claim, is -j. Well, let's do that one here, and we'll erase this. We're computing ik: i, remember, is [[0, 1], [-1, 0]], and k is [[i, 0], [0, -i]]. Row one times column one gives you a zero. Row one times column two gives you a -i. Row two times column one gives you a -i, and row two times column two gives you a zero. If you factor out the negative ones, this gives you negative times [[0, i], [i, 0]], which is in fact -j right there. So yeah, that agrees with what we wanted. And the last one, you'll see, is that i times -k gives you j. So if you multiply i by any of these matrices, you get a matrix back inside the group Q8. Multiplying by -i does something very similar to that. Going through these (again, I'm not going to check every one): j times 1 gives you j, j times -1 gives you -j, and j times i gives you a -k.
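Those hand computations are quick to confirm numerically. A sketch of my own (not from the video) verifying the three products just worked out:

```python
import numpy as np

# Verifying the row in the Cayley table for i: i*j = k, i*k = -j, i*(-k) = j.
i = np.array([[0, 1], [-1, 0]], dtype=complex)
j = np.array([[0, 1j], [1j, 0]], dtype=complex)
k = np.array([[1j, 0], [0, -1j]], dtype=complex)

print(np.array_equal(i @ j, k))      # True
print(np.array_equal(i @ k, -j))     # True
print(np.array_equal(i @ -k, j))     # True
```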
Notice the following: i and j give you k, but j and i give you -k. When you switch the order, you actually get different matrices. This is an example of a non-abelian group. So, we did ij already; what if we do ji? Let's do ji right here. This looks like [[0, i], [i, 0]], and then you multiply that by [[0, 1], [-1, 0]]. Did I copy that down right? Yes, I did. So if we do the multiplication here, the first entry is -i, the next one is zero, the next one is zero, and the last one is i. That's not quite k: if you factor out the negative one, you end up with negative times [[i, 0], [0, -i]], and that's then -k. So this is in fact a non-commutative operation, and that's not too surprising; matrix multiplication usually doesn't commute. You can then go through these and check them one by one. j times -i is k. j times j is -1, and j times -j is 1; oh, that's the inverse of j. Likewise, the inverse of -j is in fact j; they're inverses of each other. And then you can also verify (I'm not going to do this one, and I promise this time) that if you take j times k, you actually get an i, and j times -k is -i right here. And then you can keep on going through all of these and see that you have closure here. So this is a group with eight elements, and it's non-abelian because the multiplication does not commute. It's kind of interesting in that regard. I also want to make a connection with the cross product you might have seen, like in multivariable calculus or linear algebra. You often have these unit vectors: the vector i, which looks like (1, 0, 0); the vector j, which looks like (0, 1, 0); and the vector k, which looks like (0, 0, 1).
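The non-abelian behavior and the inverse of j can both be confirmed the same way. Here's a short sketch of my own (not from the video) checking that ij and ji differ by a sign and that j squared is -1.

```python
import numpy as np

# Non-commutativity and inverses: ij = k but ji = -k, and j*j = -1,
# so -j is the inverse of j.
one = np.array([[1, 0], [0, 1]], dtype=complex)
i   = np.array([[0, 1], [-1, 0]], dtype=complex)
j   = np.array([[0, 1j], [1j, 0]], dtype=complex)
k   = np.array([[1j, 0], [0, -1j]], dtype=complex)

print(np.array_equal(i @ j, k))       # True
print(np.array_equal(j @ i, -k))      # True: switching the order flips the sign
print(np.array_equal(j @ j, -one))    # True: j^2 = -1
print(np.array_equal(j @ -j, one))    # True: so -j is the inverse of j
```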
And so when you do the cross product between these vectors, i cross j is equal to the vector k, but j cross i is equal to the vector -k. Likewise, if you do i cross k, you get -j, and if you do k cross i, you end up with just j. I want you to be aware that the reason we label these matrices i, j, and k is that the way they multiply together is exactly how these unit vectors behave under the cross product. So it turns out that the cross product from multivariable calculus can actually be used to construct a group structure, although I did it specifically by representing the elements not as unit vectors but as complex two by two matrices.
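The parallel between the two multiplications is easy to check. This is a sketch of my own (the `_vec` names are my choices, not from the video) comparing the cross products of the unit vectors against the sign pattern we found for the quaternion matrices.

```python
import numpy as np

# The unit vectors under the cross product follow the same sign pattern
# as the quaternion matrices: i x j = k, j x i = -k, i x k = -j, k x i = j.
i_vec = np.array([1, 0, 0])
j_vec = np.array([0, 1, 0])
k_vec = np.array([0, 0, 1])

print(np.array_equal(np.cross(i_vec, j_vec), k_vec))    # True
print(np.array_equal(np.cross(j_vec, i_vec), -k_vec))   # True
print(np.array_equal(np.cross(i_vec, k_vec), -j_vec))   # True
print(np.array_equal(np.cross(k_vec, i_vec), j_vec))    # True
```

Note one difference: the cross product itself has i × i = 0 rather than -1, which is why the video builds the actual group out of the two by two complex matrices rather than out of the vectors directly.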