Here's a really cool way to run an AI locally on your own computer, using Ollama. Ollama runs on almost any kind of computer; here I'm running it on my M1 Mac. There are also installers for Linux, and you can run it on Windows using WSL. Just download the version you need, install it, and then you have access to all of these models. Each model can be good at different things, so it's fun to experiment and see which ones are best at certain tasks. For now, though, we're going to stick with the Mistral model. This model works well for a chatbot like ChatGPT. To run it, we just copy this command and paste it into our terminal. I already have the Mistral model downloaded, but if I didn't, it would download it for me here. Since this is running locally on my M1 Mac, it takes a few seconds to fully load up. So let's ask it a question: "Tell me a joke." And now we have our own private AI chatbot running locally on our laptop. Thanks for watching!
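For reference, the terminal steps described above look roughly like this, assuming the `ollama` CLI is installed and the model is named `mistral` in Ollama's library:

```shell
# Download the Mistral model (ollama run does this automatically
# on first use, but you can pull it ahead of time)
ollama pull mistral

# Start an interactive chat session in the terminal
ollama run mistral
# >>> Tell me a joke

# Ollama also exposes a local REST API on port 11434, so you can
# send a one-off prompt without the interactive session
curl http://localhost:11434/api/generate \
  -d '{"model": "mistral", "prompt": "Tell me a joke", "stream": false}'
```

The first load can take a few seconds while the model weights are read into memory; later prompts in the same session respond faster.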