So we've already started to discuss a few different types of operation run times. The first one we discussed was constant time, O(1). The idea behind this is that no matter how big the input size gets, the operation always runs at a constant speed; the time never changes. Then we introduced what we classify as linear time, O(n). A similar kind of argument applies here: as n increases from zero upward, what actually happens is that the time climbs in pretty much a straight line, continuing up as the input size goes up. However, those are not the only two behaviors out there. We also have something known as logarithmic time, O(log n). The idea behind log is that there's a diminishing return going on. What do I mean? Well, at the start, yes, the time is still going to increase, and it'll increase at roughly the same pace as linear time. At some point, though, it starts to taper off, and even if the input size keeps increasing, the time stays nearly flat; in our example we could say it levels off around 20, and it pretty much stays at 20 over time. That's the idea of diminishing return: even if I go from an input size of 50 to 100, we see no real jump in time, versus the increase we saw going from zero to 50.
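To make those three growth rates concrete, here's a minimal sketch in Python (the function names are my own, not from the lecture): one operation that takes constant time, one that visits every element, and a binary search, which is the classic example of the halving behavior that produces O(log n).

```python
def constant_lookup(items, i):
    # O(1): a single index operation, regardless of how long the list is
    return items[i]

def linear_count(items, target):
    # O(n): we touch every element exactly once, so time grows in a
    # straight line with the input size
    count = 0
    for item in items:
        if item == target:
            count += 1
    return count

def binary_search(sorted_items, target):
    # O(log n): each comparison cuts the remaining search range in half,
    # so doubling the input adds only one more step -- the "diminishing
    # return" described above. Returns the index, or -1 if not found.
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

Going from 50 elements to 100 adds exactly one extra halving step to the binary search, which is why the log curve looks almost flat at larger sizes.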
We saw at least some increase there. So that's logarithmic time. We can also look a little bit at the opposite side. If I have logarithmic time and I have, say, linear time, I can combine them into what we call log-linear time, O(n log n). This is really interesting, and it's where you'll find a lot of problems, because sometimes we do have to run through every single element in, say, a list, but then as we process them we can handle them in some kind of diminishing-return style. One good example: let's say I wanted to find all the unique letters in a sentence. I'd have to go through every single letter, which is linear, but I could manipulate the data in some way so that I only have to look at certain letters as others get cut off. This is kind of interesting because it's a little more than linear; I wouldn't call it quadratic just yet, but it climbs pretty high pretty quickly as the input size increases, because I'm multiplying those two factors, n and log n, together. We can continue to grow from here. Another way to think about this: with that same algorithm of finding all the unique letters in a sentence, one of the ways I might do it is to store all the letters I've seen and then search through that stored list. The problem is that since I have to go through that list every time,
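The unique-letters example above is informal, so as a cleaner illustration of where O(n log n) actually comes from, here's a minimal merge sort sketch (my own example, not from the lecture): the list is split in half about log n times, and every level of splitting still has to merge all n elements.

```python
def merge_sort(items):
    # O(n log n): log n levels of halving (the diminishing-return part),
    # each of which merges all n elements (the linear part)
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Merge the two sorted halves -- linear in the number of elements
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

This is why sorting-based approaches sit between linear and quadratic: more than one pass over the data, but far fewer than n passes.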
I might run into what we call quadratic time, O(n²). I have to go through every letter in the sentence, and for each one I go through every letter in my stored unique-letter list. The problem, as you can kind of imagine, is that since I'm not getting a diminishing return anymore, I'm doing n times n, and this starts to jump incredibly quickly as well. And we can keep on going: we have one final one to look at, exponential time, O(2ⁿ). The idea here is that instead of n squared, we flip those two around, so the exponent is now the n, and as our input size increases we keep doubling every single time. If you think about that for just a second: something like two to the power of thirty-two already gets us to about 4.3 billion. So we're getting really big really quickly, even at small input sizes, and this curve starts skyrocketing pretty much right away.
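The two shapes described here can be sketched as follows (these function names are my own illustrations): the quadratic version of the unique-letters idea, where every letter triggers a scan of the stored list, and a subset generator, where each extra element doubles the total work.

```python
def unique_letters_quadratic(sentence):
    # O(n^2): for each of the n letters we scan the whole "seen" list
    # (up to n entries) before deciding whether to keep it -- no
    # diminishing return, just n times n
    seen = []
    for ch in sentence:
        if not ch.isalpha():
            continue
        found = False
        for existing in seen:  # inner linear scan makes this quadratic
            if existing == ch:
                found = True
                break
        if not found:
            seen.append(ch)
    return seen

def subsets(items):
    # O(2^n): every element doubles the number of subsets, so the work
    # doubles each time the input grows by one -- 32 elements already
    # means over four billion subsets
    if not items:
        return [[]]
    rest = subsets(items[1:])
    return rest + [[items[0]] + s for s in rest]
```

Replacing the inner scan in the quadratic version with a hash set membership check is exactly the kind of change that pulls an algorithm back down toward linear time.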