Welcome back to our lecture series, Math 4220 Abstract Algebra 1, for students at Southern Utah University. As usual, I'll be your professor, Dr. Andrew Missildine. This is the first of four lectures based on Chapter 8 of Tom Judson's Abstract Algebra textbook, and it's about coding theory. Now, before you get excited and think we're going to be writing computer programs, what we're actually going to be talking about is algebraic coding theory: the idea of using abstract algebra, in our case group theory and linear algebra, to construct so-called error-detecting and error-correcting codes. To give a little background on what we're trying to do, consider the following. Digital communications are often transmitted as sequences of binary numbers; that is, when digital computers communicate with each other, they communicate with zeros and ones. A single zero or one is what we refer to as a bit, the fundamental unit of information for a digital computer. Unfortunately, during transmission, noise can interfere with the original message and alter the bits in the message. And when I say noise, I mean it in a metaphorical sense. The computer, obviously, won't be hearing anything; it doesn't have ears. But as language speakers, we're used to things like this. You might be talking to someone, they say something, and you go, "Huh? I can't hear you, there's too much distraction, too much noise going on here. Can you say that again?" Computer communications have similar problems, although of a different nature. This noise can cause errors to get into a signal, and the significance of each bit matters.
So if the computer is listening to this message, the sender might have meant to send a zero, but it might actually have been heard as a one instead: the zero became a one in transmission. A single error in a bit can make the transmitted message useless because it's gibberish, or it could actually become dangerous because the computer was commanded to do something really bad. Imagine an important file: if the command was to copy the file, but the receiver instead heard "delete the file," you can see how that would be a bad thing. I'm not saying that's exactly what's going to happen here, but the point is that we want the transmission to be correct. The issue we're going to see is that we can't expect flawless transmission to happen all the time. So instead, we need our codes to be able to detect, and if possible correct, errors in transmission. Group theory is going to be a valuable tool in this regard. And I should mention again that, as language speakers, humans are hardwired for this type of thing. How many times have you been talking to someone and asked, "What did you say?" You question them because you don't believe what you heard is what they meant to say. You've detected an error in the communication, so you're requesting a retransmission. Computers need to be able to do that as well. But also, how many times have you done the following? A friend or family member is talking to you, and you say, "What did you say? Never mind, I got it." What just happened there? You detected an error and requested a retransmission, but even before you finished your request, you figured out what they said: you were able to correct the error inside your own brain.
So humans are hardwired to do this with language. How do we do it? How can a computer simulate this process? That's what we're going to be talking about in Chapter 8 in this lecture series. Now, the first thing I want to cover in this video is the necessity of error detection and error correction, and for that I'll have to make a short detour into random variables. I'll keep this diagram on the screen. Let me talk about the idea of a binary symmetric channel, which is what's illustrated in the little diagram to the right. A binary symmetric channel is a model that consists of a transmitter capable of sending a binary signal, together with a receiver. The idea is that over here you have a sender, which you can think of as just one part of a computer (don't worry about the computer engineering whatsoever), and you also have a receiver. They want to send bits of information, a zero or a one, through this symmetric channel. Now, as the sender sends its bit to the receiver, there are two possibilities: either the bit it wants to send is a zero, or the bit it wants to send is a one. Let p be the probability that a bit is transmitted correctly. So if the sender wants to send the bit zero, there is a p chance that the zero will arrive correctly. But there's also a q chance, where q = 1 - p is the complementary probability, that a one will be received erroneously in place of the zero. And this is a binary symmetric channel, meaning the other direction works the same way: if the sender wants to send a one, there is a p chance that the one will arrive correctly and a q chance that a zero will arrive incorrectly.
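The channel model just described is easy to simulate. Here is a minimal sketch in Python (not from the lecture; the function name `transmit` is my own choice): each bit passes through unchanged with probability p and is flipped with probability q = 1 - p, independently of the other bits.

```python
import random

def transmit(bits, p, rng=random):
    """Simulate a binary symmetric channel: each bit arrives intact
    with probability p and is flipped with probability q = 1 - p."""
    return [b if rng.random() < p else 1 - b for b in bits]

random.seed(0)
message = [0, 1, 1, 0, 1, 0, 0, 1] * 64          # a 512-bit message
received = transmit(message, p=0.995)
errors = sum(m != r for m, r in zip(message, received))
print(f"{errors} bit(s) flipped out of {len(message)}")
```

Running this a few times with different seeds gives a feel for how often a "very reliable" channel still corrupts a long message.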
Now, the likelihood of sending a zero correctly is the same as the likelihood of sending a one correctly, so we do have a symmetric channel in that regard. This leads to the idea of a Bernoulli trial, sometimes called a binomial trial, from probability theory. It's the idea of: if I flip a coin five times, what's the probability that I'll get exactly three heads? In the case of flipping a coin, heads and tails are equally likely; there's a 50% chance of each. But suppose instead that there's a 90% chance the bit will be sent correctly and a 10% chance that an error could occur. When you look into Bernoulli trials, you get the following probability function for this random variable. Here X is a random variable, meaning its value is decided by chance, by randomness. Suppose we send n bits. What is the chance that exactly k of the bits are sent with an error? In other words, what is the probability that our random variable equals k? Without going into the details of the probability theory, the answer is P(X = k) = (n choose k) p^(n-k) q^k: the binomial coefficient n choose k, times p to the (n - k) power, times q to the k power. That's how one computes these Bernoulli trial probabilities, the binomial distribution. Let me show you an example. Assume the probability of transmitting a correct bit is 99.5%. That seems very accurate; that's really good. You would love it if that were your grade in abstract algebra, right? So, 99.5% accuracy. What happens if we send a 500-bit message, which is actually quite modest for a computer? What's the probability that this 500-bit message would have no errors in it whatsoever?
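The binomial formula above translates directly into a few lines of Python. This is a sketch of my own (the name `p_errors` is not from the lecture), using the standard library's `math.comb` for the binomial coefficient:

```python
from math import comb

def p_errors(n, k, q):
    """P(X = k): probability that exactly k of n transmitted bits are
    flipped, where each bit is flipped independently with probability q."""
    p = 1 - q
    return comb(n, k) * p ** (n - k) * q ** k

# Sanity check with the fair-coin example: 5 flips, exactly 3 heads.
# (n choose k) p^(n-k) q^k = 10 * (1/2)^2 * (1/2)^3 = 10/32
print(round(p_errors(5, 3, 0.5), 4))   # 0.3125
```

Note the symmetry in the coin case: with q = 1/2, "3 heads" and "3 tails" have the same probability, as the formula predicts.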
Well, by the formula from above, we're trying to find the probability that X equals zero. So we take 500 choose zero, times 99.5% to the 500th power, times 0.5% to the zeroth power, and that comes out to approximately 8%. Let that sink in for a second. There is only an 8% chance that this 500-bit message would be sent with no errors whatsoever. So we should anticipate errors, right? The complement tells us there's a 92% chance that there will be at least one error in the transmission. Well, what's the probability of having exactly one error? If we do the calculation for X equals one, we take the binomial coefficient 500 choose 1, times the success probability to the 499th power, times the failure probability to the first power, and we get about a 20% chance of exactly one error. And if you do the same calculation for X equals two, the probability of exactly two errors, you get about 26%. (You can double-check these for me.) So when you put those together, there is about a 46% chance that there will be exactly one or two errors on this binary symmetric channel; you can see that calculation going on right here. So what's the probability of more than two errors? That's also still going to be about 46%. So one has to expect that there will be errors; we can't just stick our heads in the sand and pretend we can transmit without them. Errors are likely going to happen. There won't be a lot of errors, mind you, but the chance of some error is huge. Even though the probability of success on each bit is so, so high, when you have enough trials, errors will eventually sneak in.
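The worked example above can be checked numerically. This is a sketch of my own, not from the lecture (the helper name `p_errors` is my choice), plugging n = 500 and q = 0.005 into the binomial formula:

```python
from math import comb

def p_errors(n, k, q):
    """P(exactly k errors among n bits), each flipped with probability q."""
    return comb(n, k) * (1 - q) ** (n - k) * q ** k

n, q = 500, 0.005                 # 500 bits, 0.5% error rate per bit
p0 = p_errors(n, 0, q)            # no errors
p1 = p_errors(n, 1, q)            # exactly one error
p2 = p_errors(n, 2, q)            # exactly two errors
print(f"P(X = 0) ~ {p0:.2f}")               # ~ 0.08  (only 8% error-free!)
print(f"P(X = 1) ~ {p1:.2f}")               # ~ 0.20
print(f"P(X = 2) ~ {p2:.2f}")               # ~ 0.26
print(f"P(X > 2) ~ {1 - p0 - p1 - p2:.2f}") # ~ 0.46
```

The four probabilities match the 8%, 20%, 26%, and 46% figures quoted in the lecture.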
This is a conversation that happened a lot in the year 2020, when the coronavirus was prevalent around the world, right? People would say things like, "Well, the survival rate is something like 99.9%." Sure, for most of us average Joes, that's a really good thing: I'm most likely going to survive this. But multiply the complement by the population of a country or the world. The US population, for example, was close to 330 million people. If you take the failure rate, the death rate of 0.1%, and multiply it by that population, you get hundreds of thousands of people in the US alone, and millions worldwide. That's a huge number, right? The same principle applies to transmission errors. We can't just pretend errors aren't going to happen. They are going to happen, and therefore our computer systems need to be designed in such a way that they can detect, and possibly even correct, these errors when they inevitably occur.