So far, what we've been doing is making very sequential programs. If I think about it, this is what I've been doing: I have a step, print something, get some input, then I move on, maybe do a mathematical calculation, then move on and do something else. Mostly I've been writing sequential, structured programs. But what if I want to make a deviation? Instead of just going down and down and down, what if I suddenly meet a fork in the road? On a flowchart, you always see a diamond when this happens: I'm asking a question, and I want to go either left or right. And that's where we start to get into today's class. So let's think about this in terms of how the computer operates. Remember, the computer can only operate in ones and zeros, so it's very difficult for it to handle anything above that. But ones and zeros, if we shift into a logic frame of mind for a second, map nicely: I can imagine zero as a false, and one as a true. And if we take a bunch of these true/false decisions and stack them up, we start to get, as you can probably guess, artificial intelligence. Say, for example, we want a computer to look at the circle on the left and decide: is that a circle? You may say so. Through that process, maybe I draw hundreds of circles, one after another, until it has really good examples of what a circle is. Then I tell it what doesn't look like a circle: that's not a circle, that's not a circle, that's not a circle. And all of a sudden the computer can look at a new shape and make a decision: it's close enough, so it's a circle. So how do we start building that kind of decision-making process?
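To make the ones-and-zeros idea concrete, here's a small sketch in Python. This isn't a metaphor in Python: the boolean type is actually built on top of the integers, so True really does behave like 1 and False like 0.

```python
# In Python, the ones and zeros map directly onto booleans:
# bool is a subclass of int, so True behaves like 1 and False like 0.
print(True == 1)    # the "one" case: True
print(False == 0)   # the "zero" case: True
print(int(True))    # converting a boolean back to its number: 1
print(True + True)  # booleans even participate in arithmetic: 2
```

That last line looks strange, but it's a handy trick later on, for example when counting how many items in a list pass a test.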
We obviously start very small. Let's take a look at Python for a second. We started in the shell and introduced the idea that Python is a big, massive calculator: 3 plus 1 gives me 4, 3 minus 1 gives me 2. But I want to take a second and remember that we introduced something known as the ALU. Like I asked before: what is that? If you don't know, you should probably look it up. It stands for Arithmetic Logic Unit. Now, the arithmetic part is what we've been doing so far: we've been doing math. But that second part is the logic unit, and that's something we can also use inside a programming language. I can do comparisons. So instead of writing a math expression that outputs, say, 4 or 2, I can ask: tell me whether 3 is greater than 2. That is a true statement. Tell me whether 3 is greater than 4. That is a false statement. And we start to build these up. Now, one thing we run into: if we remember math class for a second, we had these symbols, the less-than-or-equal-to sign, the greater-than-or-equal-to sign, and the not-equal-to sign. Unfortunately, we don't have those on the keyboard, just like I didn't have the division symbol on my keyboard. So what did we do? We found a compromise: keys that are all on the keyboard, and we agreed to use those instead. Now, one thing I want to point out: most of these are simple, but this one right here, the equal sign, is a problem. We have to think about something called proofs. You guys remember proofs? They were not the fun part of any math class. Once proofs walk in, we have to rethink how the equal sign works.
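Here's a sketch of both halves of the ALU idea in the Python shell: the arithmetic we've been doing, the comparisons we're adding today, and the keyboard compromises standing in for the math-class symbols.

```python
# The arithmetic side: expressions that produce numbers.
print(3 + 1)   # 4
print(3 - 1)   # 2

# The logic side: comparisons that produce True or False.
print(3 > 2)   # True
print(3 > 4)   # False

# Keyboard compromises for the math-class symbols:
print(3 <= 3)  # <= stands in for the less-than-or-equal sign: True
print(5 >= 2)  # >= stands in for the greater-than-or-equal sign: True
print(3 != 4)  # != stands in for the not-equal sign: True
```

Every comparison line produces one of exactly two values, True or False, which is that ones-and-zeros idea surfacing in the language itself.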
Because if I come over into Python, I can do something like x equals 3. No errors whatsoever. However, it starts to break down, because in Python the equal sign is an assignment operator: it takes whatever is on the right side and sticks it into the variable on the left. So something we may have learned in, say, math class, "x equals 3, so check that x plus 1 equals 4," falls apart. If I type x + 1 = 4, I get an error: you see, it says "cannot assign to operator." It freaks out. There are a lot of different errors it can produce here, but the big focal point is that the equal sign has one very specific purpose. Now, if I do want to make a comparison, I find another compromise: two equal signs. x + 1 == 4 says, instead of trying to assign the value, check whether the left side equals the right side. And so x + 1 == 4 lets me see the answer: True.
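The whole assignment-versus-comparison distinction can be sketched in a few lines. The commented-out line shows the one that breaks:

```python
# = assigns: take the value on the right, store it in the name on the left.
x = 3

# Trying to "solve" an equation the math-class way is a syntax error:
#   x + 1 = 4   -> SyntaxError: cannot assign to operator
# because the left side of = must be a variable name, not an expression.

# == compares instead of assigning, and produces True or False.
print(x + 1 == 4)  # True: 3 + 1 does equal 4
print(x + 1 == 5)  # False
```

A common beginner bug is typing = where you meant ==, so when Python complains about assignment in a place you expected a comparison, count your equal signs first.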