Hello everyone, welcome to the lecture on algorithm growth rates. By the end of this session, you will be able to describe algorithm growth rate, differentiate algorithm complexity classes, and compare and order growth rate functions. In this session on growth functions, we will familiarize ourselves with algorithm growth rate, learn what algorithm complexity classes are, and then compare growth rate functions.

What is the rate of growth of an algorithm? The rate at which running time increases as a function of the input is called the rate of growth. The running time of an algorithm depends on n, that is, the size of the input, and on the number of operations required for each input item. We measure the time requirement of an algorithm as a function of the problem size. Algorithm analysis is all about understanding growth rate: as the amount of data gets bigger, how many more resources will my algorithm take? Typically, we describe the resource growth rate of a piece of code in terms of a function.

To help us understand, let's take a simple example. Below, we have two different pieces of code to find the square of a number; for a moment, forget that the square of any number n is simply n times n. Here we have algorithm A and algorithm B. In algorithm A, we calculate the square of n with a loop: for i equal to 1 to n, we add n to a running total, so this first solution requires a loop that executes n times. In algorithm B, we calculate the square directly with the multiplication operator. So which is the better approach? The second one, of course, because it uses a single mathematical operation instead of a loop. As we can see, different algorithms require different amounts of running time and space. The average running time of an algorithm is a function f(n) of the problem size n. Based on f(n), algorithms are classified into different groups called complexity classes.
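The two square-of-n algorithms described above can be sketched as follows. This is a minimal illustration in Python, not the exact code from the slides:

```python
def square_loop(n):
    """Algorithm A: repeated addition.

    The loop body runs n times (for non-negative n), so this is O(n).
    """
    result = 0
    for _ in range(n):
        result += n          # add n to itself, n times
    return result


def square_multiply(n):
    """Algorithm B: a single multiplication -- O(1), constant time."""
    return n * n
```

Both functions return the same answer (for example, 25 when n is 5), but algorithm A performs n additions while algorithm B performs one multiplication regardless of n, which is exactly why the second approach is better.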
Now, as seen here, these algorithms are classified into constant time algorithms, that is O(1); logarithmic algorithms, O(log n); linear algorithms; quadratic algorithms; polynomial algorithms; and exponential algorithms. We will now look at each of these in turn.

First, O(1), the constant time algorithm. A constant time algorithm always takes the same amount of time to execute. Consider the given statement 1: the running time of this statement does not change in relation to n. The execution time of such an algorithm is independent of the size of the input. A constant growth rate is one where the resource need does not grow: processing one piece of data takes the same amount of resources as processing one million pieces of data. The graph of such a growth rate looks like a horizontal line, as shown over here.

Now, O(n), the linear algorithm. A linear algorithm is one whose runtime grows in direct proportion to n. Consider the for loop given here: for i equal to 0, i less than n, i++. The time complexity of this loop is linear, meaning the running time is directly proportional to n. A linear growth rate is one where the resources needed are directly proportional to the amount of data, so the growth rate can be described as a straight line, though not a horizontal one as we saw for O(1). An algorithm has linear time complexity if the time to execute it is directly proportional to n; an example is linear search.

Next, O(log n), the logarithmic time algorithm. Here, the runtime grows logarithmically in proportion to n. Consider the given piece of code: the algorithm divides the working area in half.
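The three complexity classes just described can be sketched like this. These are my own illustrative functions, not the exact code from the slides:

```python
def constant_example(data):
    """O(1): touches a single element, regardless of len(data)."""
    return data[0]


def linear_example(data, target):
    """O(n): linear search -- in the worst case, scans every element."""
    for i, value in enumerate(data):
        if value == target:
            return i
    return -1


def logarithmic_example(data, target):
    """O(log n): binary search on sorted data -- halves the range each step."""
    low, high = 0, len(data) - 1
    while low <= high:
        mid = (low + high) // 2
        if data[mid] == target:
            return mid
        elif data[mid] < target:
            low = mid + 1        # target is in the upper half
        else:
            high = mid - 1       # target is in the lower half
    return -1
```

On a million-element list, `constant_example` does one step, `linear_example` may do a million, and `logarithmic_example` does about twenty, since 2^20 is roughly a million.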
With each iteration, the working range is halved, so the running time is proportional to the number of times n can be divided by 2; here, n is high minus low. Such an algorithm has logarithmic time complexity; an example is binary search, which is often used to search sorted data sets. A logarithmic growth rate is one where the resource need grows by one unit each time the data is doubled. This effectively means that as the amount of data gets bigger, the curve describing the growth rate gets flatter, approaching a horizontal line but never reaching it. The following graph shows what a curve of this nature would look like.

Next is O(n²), the quadratic time algorithm. Consider the given piece of code: here, there are two for loops, one nested inside the other. This is common with algorithms that involve nested iteration. A quadratic growth rate is one that can be described by a parabola. As shown, O(n²) represents an algorithm whose performance is directly proportional to the square of the size of the input set: when n doubles, the running time increases fourfold. So if we look here, this is the first loop and this is the second loop; the running time is n², and the graph of a quadratic runtime would look like this.

Moving ahead to the exponential time algorithm, whose runtime grows even faster than any polynomial algorithm. Consider the given piece of code for Fibonacci. While its curve may look very similar to a quadratic curve at first, it grows significantly faster. An exponential growth rate is one where each extra unit of data requires a doubling of the resources. As you can see here, the blue curve starts off looking almost flat, but quickly shoots up until it is nearly vertical; note that it can never actually become vertical. An example of an O(2ⁿ), that is exponential time, algorithm is the recursive calculation of Fibonacci numbers.
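The quadratic and exponential patterns described above can be sketched as follows. Again, these are my own minimal illustrations rather than the slides' exact code:

```python
def quadratic_example(n):
    """O(n^2): two nested loops, each running n times -- n * n iterations."""
    count = 0
    for i in range(n):
        for j in range(n):
            count += 1       # the body executes n * n times in total
    return count


def fib(n):
    """Exponential time: naive recursive Fibonacci.

    Each call spawns two more calls, so the call tree roughly doubles
    with every increase in n.
    """
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)
```

Notice that `quadratic_example(10)` performs 100 iterations while `quadratic_example(20)` performs 400: doubling n quadruples the work, which is the fourfold jump the quadratic curve shows.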
This chart gives us an overview of the algorithm types and their average running times with respect to time efficiency: constant time, independent of n; logarithmic, log n; linear, n; log-linear, n log n; quadratic, proportional to n²; cubic; and so on. Now, some examples from runtime analysis: for a logarithmic algorithm, we have binary search; for a linear algorithm, linear search; for superlinear (n log n) algorithms, heap sort and merge sort; and for exponential algorithms, the Tower of Hanoi and the Fibonacci series.

We have ordered the functions over here: O(1) is less than O(log n), which is less than O(n), and so on. Now, a comparison of growth rate functions. If you take n as 1, log n is 0, n² is 1, and 2ⁿ is 2. When you take n as 2, log n becomes 1, n² becomes 4, and 2ⁿ becomes 4, and so on. As you can see, the values keep increasing; at some points n² and 2ⁿ become equal, but after that 2ⁿ keeps pulling ahead.

So, comparing the growth rate functions: O(1), constant time, is best, followed by O(log n), logarithmic time; O(n), linear time; O(n log n), log-linear time; then quadratic time; then cubic time; and exponential time is the worst. Here is a graph representing these functions. Pause the video and solve the question: order the functions from slowest-growing to fastest-growing, and decide which kind of growth best characterizes each function. This is the answer. Thank you.
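The comparison of growth rate functions above can be reproduced with a short script. This is my own illustration of the table the lecture walks through:

```python
import math

# Print the growth functions from the lecture side by side
# for a few input sizes, so their relative growth is visible.
print(f"{'n':>5} {'log n':>8} {'n log n':>10} {'n^2':>8} {'2^n':>12}")
for n in (1, 2, 4, 8, 16, 32):
    log_n = math.log2(n)
    print(f"{n:>5} {log_n:>8.0f} {n * log_n:>10.0f} {n**2:>8} {2**n:>12}")
```

At n = 2 and n = 4 the columns for n² and 2ⁿ are equal (4 and 4, then 16 and 16), which is the moment of equality mentioned above; from n = 8 onward, 2ⁿ pulls far ahead of every other column.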