Our next topic in mathematics and data principles is something called Big O. And if you're wondering what Big O is all about, well, it's about time. Or you can think of it as how long it takes to do a particular operation: the speed of the operation. If you want to be really precise, the growth rate of a function, how much more it requires as you add elements, is called its order. That's what the O stands for: order. Big O gives the rate at which things grow as the number of elements grows. And what's funny is that there can be really surprising differences. Let me show you how it works with a few different kinds of growth rates, or Big O classes. First off, there are the ones where I'd say you're sort of on the spot: you can get stuff done right away. The simplest one is O(1), which is constant order. That's something that takes the same amount of time no matter what. You can send an email to 10,000 people by hitting one button, and it's done. The number of elements, the number of people, the number of operations: it takes the same amount of time regardless. Up from that is logarithmic, O(log n): you take the number of operations and get the logarithm of that. You'll see it increases, but it's really only a small increase, and it tapers off really quickly. An example is finding an item in a sorted array; not a big deal. The next one up from that looks like a big change, but in the grand scheme of things, it's not. This is a linear function, O(n), where each operation takes the same unit of time. So if you have 50 operations, it takes 50 units of time; if you're storing 50 things, it takes 50 units of space. Finding an item in an unsorted list is usually going to be linear time. Then we have the functions where I say, you know, you'd better just pack a lunch, because it's going to take a little while. The best example of this is what's called log-linear, O(n log n): you take the number of items, and you multiply that number by the log of the number of items.
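The constant, logarithmic, and linear lookups just described can be sketched in Python. This is a minimal illustration; the function names here are mine, not standard terms, though `bisect_left` is from the Python standard library:

```python
from bisect import bisect_left

def constant_lookup(d, key):
    # O(1): a dict lookup takes roughly the same time at any size.
    return d.get(key)

def binary_search(sorted_items, target):
    # O(log n): each comparison halves the remaining search range.
    i = bisect_left(sorted_items, target)
    if i < len(sorted_items) and sorted_items[i] == target:
        return i
    return -1

def linear_search(items, target):
    # O(n): in the worst case, every element gets checked once.
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1
```

Doubling the size of the sorted list adds only one step to the binary search, while it doubles the worst-case work of the linear search; that is the gap between the log curve and the straight line on the chart.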
And an example of this is something called a fast Fourier transform, which is used for dealing with, for instance, sound or anything that varies over time. You can see it takes a lot longer. If you've got 30 elements, you're way up at the top of this particular chart, at 100 units of time or 100 units of space, however you want to put it. And it looks like a lot. But really, that's nothing compared to the next set, where I say, you know, you're just going to be camping out; you might as well go home. That includes something like the quadratic, O(n^2), where you square the number of elements. You can see how that just shoots straight up; that's quadratic growth. An example is multiplying two n-digit numbers by the standard schoolbook method. So if you're multiplying two numbers that each have 10 digits, it's going to take a long time. Even more extreme is this one, the exponential, O(2^n): you raise 2 to the power of the number of items you have. You'll notice, by the way, that the red line here doesn't even go to the top. That's because the graphing software I'm using doesn't draw past my upper limit there, so it gets cut off. But this is a really demanding kind of thing; an example is finding an exact solution to what's called the traveling salesman problem using dynamic programming. That's an example of an exponential rate of growth. And then one more I want to mention, which is sort of catastrophic, is factorial, O(n!): you take the factorial of the number of elements, the number followed by an exclamation point. And you see that one cuts off really soon, because it basically goes straight up. With any meaningful number of elements, it's going to be hugely demanding. For instance, if you're familiar with the traveling salesman problem, trying to find a solution through brute-force search just takes an extraordinary amount of time. And so, you know, before something like that is done, you're probably just going to, you know, shut it down and wish you'd never even started.
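The factorial brute-force approach to the traveling salesman problem can be sketched like this. It's a toy illustration: `tsp_brute_force` is a hypothetical helper name of mine, and it assumes the input is a square matrix of distances between cities:

```python
from itertools import permutations

def tsp_brute_force(dist):
    # O(n!): tries every possible ordering of the cities.
    # dist[i][j] is the distance from city i to city j.
    n = len(dist)
    best = float('inf')
    # Fix city 0 as the start/end and permute the rest.
    for perm in permutations(range(1, n)):
        route = (0,) + perm + (0,)
        cost = sum(dist[route[i]][route[i + 1]] for i in range(n))
        best = min(best, cost)
    return best
```

With 4 cities this loop runs 3! = 6 times; with 20 cities it would run 19! times, on the order of 10^17, which is why the factorial curve "basically goes straight up."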
The other thing to know about this is that not only do some things take longer than others, but some of these methods, some functions, are more variable than others. So for instance, if you're working with data that you want to sort, there are different kinds of sorts, or sorting methods. There's something called an insertion sort, and what you find is that on its best day, with already-sorted input, it's linear, O(n); that's not bad. On the other hand, its average case is quadratic, and that's a huge difference between the two. Selection sort, on the other hand, is quadratic in the best case and quadratic in the average case; it's always consistent. So it's kind of funny: it takes a long time, but at least you know how long it's going to take, versus the variability of something like an insertion sort. So in sum, let me say a few things about Big O. Number one, you need to know that certain functions or procedures vary in speed. And the same thing applies to making demands on the computer's memory or storage space: they vary in their demands. Also, some of them are inconsistent: really efficient sometimes, and really slow or really demanding at other times. Probably the most important thing here is to be aware of the demands of what you're doing; you can't, for instance, just run through every single possible solution, or, you know, your company will be dead before you get an answer. So be mindful of that, so you can use your time well and get the insight you need in the time that you need it.
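To close, the insertion sort versus selection sort contrast discussed above can be sketched in Python. These are minimal textbook versions, not the implementations Python itself uses for sorting:

```python
def insertion_sort(items):
    # Best case O(n) on already-sorted input (the inner loop never runs);
    # average and worst case O(n^2).
    a = list(items)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]   # shift larger elements right
            j -= 1
        a[j + 1] = key
    return a

def selection_sort(items):
    # Always O(n^2): it scans the entire unsorted tail for each position,
    # regardless of the input's initial order. Slow, but consistent.
    a = list(items)
    for i in range(len(a)):
        m = min(range(i, len(a)), key=a.__getitem__)
        a[i], a[m] = a[m], a[i]
    return a
```

Hand insertion sort a list that's already in order and it breezes through in one pass; selection sort does exactly the same amount of work either way. That's the consistency-versus-variability trade-off described above.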