So just a little bit of basic programming logic. If we think about programming for a second, all we're doing is telling the computer to do a specific set of instructions, which, as you can already guess, is no different than baking a cake. You know, I've got to turn on the oven, melt some chocolate, put in the butter, the sugar, the eggs, the flour. I mix my wet ingredients into my dry ingredients, stir the mixing bowl for one minute, continue stirring as I go through the whole song and dance, bake for 30 to 35 minutes, and then serve with the chocolate sauce. But all of a sudden, we have to think about that set of instructions. So how do we write something that the computer can understand? How can we use a programming language in a way that it understands us? And that's where we get into this idea of what it even means to have a language. I know that sounds weird, programming seems simplistic, but think about English for a second. How is it that you understand the words coming out of my mouth? Hopefully you do understand the words coming out of my mouth. It works because we understand some of the basic syntax and the basic structure of the language. Programming is no different. They all break down the same basic way. One of the things we have is something known as a primitive construct. Take English. English is simple: it's broken down into words and punctuation. "How are you doing?" Question mark. I immediately know that's a question and that it has a purpose to it. Python is along the same lines. The only difference is that instead of words and punctuation, it works with numbers, things we call strings, and operators. We'll talk about those operators in just a little bit, but this is where we get into syntax.
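To make those primitive constructs concrete, here's a minimal sketch in Python; the specific values are just illustrative:

```python
# Python's primitive constructs: numbers, strings, and operators.
print(42)           # a number (an integer)
print(1.8)          # a number (a float)
print("hello")      # a string: text wrapped in quotes
print(1.8 + 1.8)    # a number, an operator, a number -- valid syntax
print("ab" + "cd")  # operators work on strings too: this concatenates them
```

Each line is a valid combination of those primitives, the same way a valid English sentence is a valid combination of words and punctuation.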
So, something like 1.8 plus 1.8. That is valid Python syntax: a number, an operator, a number. That's something Python considers valid. However, if you look at the word I've written down here, this is actually not valid English syntax. Looking at it for a second, I'm seeing a lot of symbols that are not from the English language. Furthermore, that's not really a word in English. That's where we get into static semantics: determining whether a string has meaning. Look at that string again. Yes, it has symbols that are not English, but even if we were to cover those up, get rid of this guy, this guy, and these two, we'd be left with t, u, n, a, q. That word has no meaning in the English language. That's static semantics. It's the same concept when stringing together words. "I am Spartacus" is syntactically correct: I, am, and Spartacus are all words in the English language. And it's semantically correct, too. It makes sense. "I are Spartacus," however, does not. It's not semantically valid, because even though all the words are individually correct, the sentence isn't. You wouldn't say "I are Spartacus" unless you were Bizarro. Same concept in Python. Take, for example, 1.8 divided by "cat". That's not semantically valid. The syntax is there, a number, an operator, a string, but it is not semantically correct, because, well, I can't divide by a cat. Cats are not numerical values. That's animal cruelty. And then finally, we get into basic semantics. One of the differences between programming and a traditional language is that in English, things can be a little more ambiguous. "Python is so awesome" can be a truthful statement. I can be genuinely excited about it: Python is awesome! Or, if programming and computer science are feeling a little more difficult, and I'm loathing Python a little bit: "Python is so awesome."
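Here's what that syntactically-valid-but-semantically-broken case looks like in actual Python. One caveat worth hedging: Python doesn't catch this kind of error statically, it only complains when the line actually runs, at which point it raises a TypeError:

```python
# Valid syntax AND valid semantics: a number, an operator, a number.
print(1.8 + 1.8)

# Valid syntax but broken semantics: a number, an operator, a string.
# Python raises a TypeError when this line executes.
try:
    result = 1.8 / "cat"
except TypeError as err:
    print("semantic error:", err)
```

The shape of `1.8 / "cat"` is perfectly legal, which is exactly why it parses at all; it's the meaning of dividing by a cat that Python rejects.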
Now I'm being sarcastic with that. In a programming language, in Python, however, there is only one meaning. python = "awesome" means exactly one thing: the name is bound to the string of seven characters, A-W-E-S-O-M-E, which spells out "awesome". That's one of the big differences when it comes to a programming language: a statement can only have one meaning. x can only be five. If I want to change x, it has to forget five. It has to take on whatever new value I give it. If I say python = "bad", it's no longer awesome. "Awesome" is gone, thrown out of memory completely. Python is now bad.
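The rebinding behavior described above can be sketched in a few lines; the variable names here just echo the lecture's examples:

```python
# Assignment binds a name to exactly one value at a time.
python = "awesome"   # the name now refers to the seven-character string
print(python)

x = 5                # right now, x can only be 5
x = 10               # rebinding: x "forgets" 5 and takes on 10
print(x)

python = "bad"       # nothing refers to "awesome" anymore;
print(python)        # the old string is free to be thrown out of memory
```

There's no ambiguity or sarcasm available to the interpreter: after the last line, asking for `python` can only ever give you "bad".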