Hey, I'm Charlotte, and I'm here to talk about artificial intelligence. Did you know that there are seven senses in our bodies, not five? The first five you probably already know: sight, hearing, smell, taste, and touch. The other two might come as a surprise. They are proprioception, which is how you know where your body is in space, and your vestibular system, which is your sense of balance. Those seven things together make up how you perceive the world. Now computers, in order to do the work that they do, also need to perceive the world, and they use those same seven senses.

Take vision, for example. A computer may take a picture or a video, and that's one way it perceives. But it might also take a picture through a telescope or a microscope, seeing things our eyes could never perceive without the help of technology. It might even see wavelengths of light that we can't, such as the ultraviolet light that some other animals can perceive.

Humans use our sense of smell to sense emotion and to know what's happening around us. If you walk in the door at home and you smell cookies, you probably feel really good. If you walk in and you smell fire, you might feel worried. A computer can also smell. It can have a fire detector that's connected to the fire department, so before you even get home, the fire department is already on its way. That computer system can detect the chemical signature of fire and keep you safe.

You may have gone through an airport where, if you bring food, they need to test it and see that its chemicals are the things they should find in food and not something that's illegal on an airplane. They'll bring out a little pad and rub it on the food, and that is one way a computer might taste.

Another sense we have is touch, and for us it seems very natural. We can pick up a pen, a phone, or a heavy brick, and we know exactly how much force we need to pick up each one of those objects. One of the big challenges for computers and robotics is knowing how much force it takes to pick up a single object. When you see a six-month-old picking up Cheerios, they're learning not only how to move their body that way, but how much force it takes to grab a Cheerio with just enough force that they don't crush it. Robots are still learning some of those same lessons about touch.

A computer can hear sounds both higher and lower than the range we can hear, and it can capture them at a faster or slower rate. Some sounds a computer can perceive and analyze, but we have to see them on a graph, because they're just too fast for our ears to hear. There's a small sketch of exactly that at the end of this section.

The vestibular system, that sense of balance, is a real challenge for computers right now: they have to be able to orient themselves in space. You're probably picturing a robot trying to walk, and we know robots don't really do that well yet. But we have that sensor right here in our phones, and it's what lets us play video games by tilting the screen, using that orientation in space. Your GPS knows whether you're going east or west or north or south. That's proprioception: knowing where it is in space. A small sketch of how a phone might turn its motion-sensor readings into an orientation appears below.
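Here's a minimal sketch of how a device could estimate its tilt from an accelerometer. The read_accelerometer function is hypothetical, a stand-in for whatever a real phone's motion API provides; the math is just reading the direction of gravity.

```python
import math

def read_accelerometer():
    """Hypothetical sensor read: returns acceleration in g along (x, y, z).
    A real phone would get these numbers from its motion API; here we
    pretend the device is tilted slightly toward the user."""
    return (0.0, 0.17, 0.98)

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll in degrees from the gravity vector.
    When the device is roughly still, the accelerometer mostly measures
    gravity, so the direction of that vector reveals the orientation."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay**2 + az**2)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

pitch, roll = tilt_angles(*read_accelerometer())
print(f"pitch: {pitch:.1f} deg, roll: {roll:.1f} deg")
```

This is the same trick a phone uses to rotate its screen or steer a game character when you tilt it.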
And that vestibular sense of balance is where robots really struggle: can they walk up the stairs? That's a challenge for a robot because it requires a lot of different sensors and motors working together very precisely, where the timing has to come out just right and the robot has to adapt. We're really good at adapting. Computers aren't as good at adapting; what they are good at is finding patterns. So when you take all seven of these senses and bring them together, we can put computer processing behind them and do amazing things. Let's get started.
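As a first taste of that pattern finding, here's a minimal sketch of a computer "hearing" a sound too fast for human ears. The two tones and the sample rate are made up for illustration; the Fourier transform that spots them is the standard tool for turning sound into a graph.

```python
import numpy as np

# A computer hears sound as a long stream of numbers sampled very quickly.
# Synthesize one second of audio containing a 440 Hz tone we can hear and
# a 40,000 Hz ultrasonic tone we cannot.
sample_rate = 96_000  # samples per second, fast enough to capture 40 kHz
t = np.arange(sample_rate) / sample_rate
signal = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 40_000 * t)

# The Fourier transform turns the raw stream into a graph of how strong
# each frequency is, which is how we "see" sounds we can't hear.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / sample_rate)

# Report the two strongest patterns the computer found.
top_two = sorted(freqs[np.argsort(spectrum)[-2:]])
print(f"dominant tones: {top_two[0]:.0f} Hz and {top_two[1]:.0f} Hz")
```

Both tones show up clearly on the graph, even though our ears would only ever notice one of them.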