Welcome back to our lecture series, Math 4220, Abstract Algebra 1 for students at Southern Utah University. As usual, I'll be your professor today, Dr. Andrew Missildine. In lecture 32, we're beginning chapter 11 in Tom Judson's Abstract Algebra book, about group homomorphisms. We've talked about the idea of an isomorphism previously in this lecture series, and so now we're going to focus on the broader concept of homomorphisms. The two objects are related, and we'll see exactly what that means in just a second.

So for the definition, we consider a map phi between two groups, the first one called G with its operation star, the second one called H with its operation circle. Now notice the binary operations of the two groups might not have anything to do with each other. This doesn't have to be addition to addition or multiplication to multiplication or anything like that. A map phi from G to H, where again these are groups, is called a homomorphism, or sometimes a group homomorphism to emphasize the group structure. When you get deeper into algebra and meet things like rings and modules, it becomes much more imperative to clarify that; this series has been predominantly about groups, so we'll just call them homomorphisms. So a map phi from G to H is called a homomorphism if phi preserves the group operations. Now this is language I've used before. What it meant when we talked about isomorphisms was that phi preserves the operation if for all elements a and b inside the domain group G, we have that phi(a star b) equals phi(a) circle phi(b). In other words, a homomorphism is a function between groups with the homomorphic property, which we introduced previously when it came to isomorphisms.

So note that isomorphisms, as we've defined them, are just bijective homomorphisms. While isomorphisms preserve the group structure exactly between the two groups, a homomorphism is a function that need only preserve part of the group structure. So what we get from homomorphisms is that these are functions from a group to a group that preserve the operation. That is, the image of a product is itself equal to a product between the two respective groups. But some information could be lost if the homomorphism is not a true bijection, right?

Now in terms of the language here, we've learned previously that isomorphism means "the same shape"; morph in this case means shape, and homo in this context means one. So if you look at the etymology of these words, they mean roughly the same thing: a homomorphism is "one shape." Now we're not saying that the groups are exactly the same, but if there's a homomorphism between them, something is probably preserved. There are some similarities, even if it's not a perfect similarity.

So let's look at some examples from linear algebra to motivate this, some groups that live inside linear algebra. For example, we could take the general linear group of n-by-n real matrices. This is the set of all n-by-n real matrices which are non-singular, that is, they have an inverse; with respect to matrix multiplication, this is a group, and a pretty important group in fact. The determinant, the usual determinant of a matrix, forms a homomorphism from the general linear group to the real numbers with respect to multiplication. So remember our star right here: this is the non-zero real numbers with respect to multiplication.
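To make the determinant example concrete, here is a quick numerical sanity check, my own illustration rather than anything from the lecture, that det(AB) = det(A) det(B) for matrices in GL_n(R):

```python
import numpy as np

# Check the homomorphic property of det : GL_n(R) -> (R*, x),
# namely det(AB) = det(A) * det(B).
rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
# Random Gaussian matrices are invertible with probability 1, so A and B
# lie in GL_n(R) (nonzero determinant) almost surely.
lhs = np.linalg.det(A @ B)
rhs = np.linalg.det(A) * np.linalg.det(B)
print(np.isclose(lhs, rhs))  # True, up to floating-point error
```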
And how do we see this homomorphism? Well, this is a property that one typically sees in a linear algebra class: det(AB) = det(A) det(B). So the determinant has this multiplicative property, which in the context of group theory is the homomorphic property. That is, products are preserved: the determinant of a product is equal to the product of the determinants. So the determinant forms a homomorphism from the general linear group to real multiplication.

Another example coming from linear algebra: take the trace map. Remember what the trace map does? If you have an n-by-n matrix, the trace is the sum of the diagonal entries; you add those all together. So take the vector space of n-by-n real matrices; in this context, this is going to be an additive group, so we add matrices together. And then the trace is a map from n-by-n matrices to the real numbers with respect to addition; the operation in play here is addition. Well, the trace map also has the homomorphic property with respect to addition: if you take the trace of the sum of two matrices, this equals the trace of A plus the trace of B. So this is the homomorphic property between these two additive groups.

So before we had these multiplicative groups, multiplication of matrices versus multiplication of real numbers, and then we had addition of matrices with addition of real numbers. The trace is homomorphic on addition of matrices, and the determinant is homomorphic on multiplication of matrices. Now, in order for a map between two groups to be a homomorphism, the operations don't have to carry the same name, addition with addition or multiplication with multiplication, but we often label them that way because of the similarities, I should mention here.

Now, continuing with this example, I should mention that this homomorphism is somewhat limited in its direction. The determinant, for example, is not a homomorphism on, I guess we're using the notation R^(n-by-n) right here, with respect to addition. If you take det(A + B), this is not in general equal to det(A) + det(B). So the determinant doesn't preserve the addition of matrices; it preserves the multiplication of matrices. It's a multiplicative homomorphism here.

Let's look at another example. This is also, believe it or not, a very important homomorphism. We can come up with a homomorphism from the symmetric group, S sub n, to the cyclic group of order two. And I'm going to actually represent the cyclic group of order two in this situation as the set of second roots of unity; that is, we take the cyclic subgroup generated by negative one inside of C star. So we're talking about 1 and -1, just the two elements right there. So we can define a map, often denoted sgn for short, from S_n to Z_2. This is going to be a function on permutations which assigns to a permutation either the number 1 or the number -1. This is called the signum of sigma, which is just a fancy Latin word meaning the sign of sigma. Is it a positive one or is it a negative one? We're really just attaching a sign to it. And so some people often call this the sign of sigma.
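As a companion check, again my own sketch and not part of the lecture, here is the trace preserving addition while the determinant fails to:

```python
import numpy as np

# tr(A + B) = tr(A) + tr(B) holds for all matrices, while
# det(A + B) = det(A) + det(B) fails for generic A and B.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))  # True
print(np.isclose(np.linalg.det(A + B),
                 np.linalg.det(A) + np.linalg.det(B)))  # False, almost surely
```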
Properly, it's the signum there, but I won't get too upset if you call it the sign of the permutation. And I want you to try to convince yourself that this is in fact a homomorphism; it turns out it does preserve the multiplication here. We've actually used this map once before, when we proved that every permutation is either even or odd. If you recall, what we did previously is we took these permutations in S_n and transformed them into permutation matrices, so each one would be an element inside of the general linear group GL_n(R). And then we took the determinant of that matrix, which would give us a number, and since these were permutation matrices, it had to be plus or minus one; those are the options. And so this would then be something landing in Z_2 right here. Now, on the previous slide, we talked about the determinant, how it's a map from GL_n(R) to the real numbers multiplicatively. Well, because the input is a permutation matrix, we can guarantee that the determinant is either plus or minus one, so we see Z_2 as a subgroup of R star right here.

And so this signum map is actually the composition of two homomorphisms. There's the determinant, which we talked about previously, and then there's this other one, the inclusion map: we include the permutations inside of the general linear group via this permutation representation as a matrix. We'll talk about more of this in the future, but the composition of homomorphisms is going to be homomorphic as well, because if the first map preserves the operation, and then the second one preserves the operation, then the composition preserves it too; transitivity basically comes into play here, and we see that.

Since we've used this argument before, maybe it comes as no surprise what the signum is really telling us here. An even permutation times an even permutation gives you an even permutation, because even permutations are exactly those permutations whose signum is equal to one, and one times one is one. If you take an even combined with an odd, you'll get an odd, which is one times negative one is negative one, and that goes the other direction as well: an odd combined with an even is an odd. But finally, an odd combined with an odd is an even, because you get negative one times negative one, which is positive one.

And so when you look at this right here, it kind of looks like the Cayley table for Z_2, right? If I were to label the rows and columns zero, one and do addition mod two, you'd get zero, one; one, zero. If instead we do the multiplication here, labeling plus one, minus one, you get one times one, which is one; one times negative one, which is negative one; then negative one; and plus one right there. This looks exactly like the Cayley table of Z_2. And that's not actually a surprise here whatsoever, because in fact this function right here is onto. One thing we're going to see in the future with the First Isomorphism Theorem is that if you have an onto homomorphism, then you'll actually get this type of structure; we're capturing the codomain in some regard.

So please note that none of the above examples are actually isomorphisms, with the exception, of course, when we talk about the determinant here, or the trace.
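Here is a small sketch of that composition; the helper names perm_matrix and sgn are my own, and I'm assuming the standard convention that the matrix of a permutation p sends the i-th standard basis vector to the p(i)-th:

```python
import numpy as np
from itertools import permutations

def perm_matrix(p):
    """Permutation matrix M with M e_i = e_{p(i)}."""
    n = len(p)
    M = np.zeros((n, n))
    for i, j in enumerate(p):
        M[j, i] = 1.0
    return M

def sgn(p):
    # signum as the composition det . (permutation representation);
    # the determinant of a permutation matrix is always +1 or -1
    return round(np.linalg.det(perm_matrix(p)))

# Homomorphic property: sgn(p . q) = sgn(p) * sgn(q) for all p, q in S_3.
for p in permutations(range(3)):
    for q in permutations(range(3)):
        pq = tuple(p[q[i]] for i in range(3))  # composition: p after q
        assert sgn(pq) == sgn(p) * sgn(q)
print("sgn preserves products on S_3")
```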
If you take n to equal one, in those cases these are isomorphisms, but for general determinants and traces, these are not isomorphisms whatsoever. And for the signum, if you take n to be two, then that would be an isomorphism in that situation. So other than these sorts of trivialities, these are not isomorphisms. But in all of these examples, the maps are in fact surjective. Every nonzero real number can be represented as the determinant of a non-singular matrix. Every real number can be expressed as the trace of a matrix. And as there are both even and odd permutations, the signum hits both one and negative one. So these are all onto maps.

So a little bit of jargon for you here. If you have an onto, that is, a surjective, homomorphism, this is what's commonly referred to as an epimorphism. Now, I don't believe this term is used in Judson's textbook anywhere, but it is commonly used. And it'll be proven in the future that the codomain of an epimorphism is actually isomorphic to a factor group of the domain; that's what the First Isomorphism Theorem is going to tell us.

Now, as you're probably wondering: okay, is there a special name for one-to-one homomorphisms? The answer is yes. A one-to-one homomorphism is called a monomorphism, mono here meaning one. The permutation representation above, embedding S_n inside of the general linear group, is an example of a monomorphism. And with a monomorphism, we will be able to prove, again as a consequence of the First Isomorphism Theorem, that the image of a monomorphism is isomorphic to the domain.
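As a final illustration, again my own and not from the lecture, here is a check that sgn is onto on S_3 and that the even permutations, the preimage of +1, make up exactly half the group; that half is A_3, foreshadowing the factor-group structure the First Isomorphism Theorem will describe:

```python
from itertools import permutations

def sgn(p):
    # sign via inversion count: parity of the number of inversions
    inv = sum(1 for i in range(len(p)) for j in range(i + 1, len(p))
              if p[i] > p[j])
    return 1 if inv % 2 == 0 else -1

S3 = list(permutations(range(3)))
print(sorted({sgn(p) for p in S3}))         # [-1, 1]  -> sgn is onto
print(len([p for p in S3 if sgn(p) == 1]))  # 3 = |S_3| / 2, the elements of A_3
```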