Hello friends. I'm Sanjay Gupta. Welcome to Sanjay Gupta Tech School. In this video, I'm going to explain asymptotic analysis, a topic from data structures and algorithms. First I'm going to explain the theoretical points.

Asymptotic analysis of an algorithm refers to defining the mathematical framing of its run-time performance. Using asymptotic analysis, we can conclude the best-case, average-case, and worst-case scenarios of an algorithm. Asymptotic analysis is input bound: if there is no input to the algorithm, it is concluded to work in constant time. Other than the input, all other factors are considered constant.

Asymptotic analysis refers to computing the running time of any operation in mathematical units of computation. For example, the running time of one operation may be computed as f(n) = n, and for another operation it may be computed as g(n) = n². This means the running time of the first operation will increase linearly as n increases, while the running time of the second operation will increase quadratically as n increases. Similarly, the running times of both operations will be nearly the same if n is significantly small. So this was the example.

Now, to represent asymptotic behaviour, we have three notations: first is Big-O notation, second is Omega notation, and third is Theta notation. Let's understand all three notations with the help of diagrams.

First is Big-O notation. The notation O(n) is the formal way to express the upper bound of an algorithm's running time. It measures the worst-case time complexity, that is, the longest amount of time an algorithm can possibly take to complete. So Big-O notation is used to describe the worst case.
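The linear-versus-quadratic comparison above can be sketched in a few lines of Python. This is my own illustrative example, not from the video: `f` and `g` are hypothetical operations that count their own steps, so we can watch f(n) = n and g(n) = n² diverge as n grows.

```python
def f(n):
    """A hypothetical operation taking n steps: linear growth."""
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def g(n):
    """A hypothetical operation taking n * n steps: quadratic growth."""
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps

for n in (2, 10, 100):
    print(n, f(n), g(n))
```

For small n the step counts are close (f(2) = 2 versus g(2) = 4), but as n grows the quadratic operation pulls far ahead (f(100) = 100 versus g(100) = 10000), which is exactly the point of asymptotic comparison.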
In this diagram, you can see the dark line representing f(n) and the dotted line representing g(n). This relates to Big-O notation: it shows the upper bound, or you can say the worst-case scenario.

Second is Omega notation. The notation Ω(n) is the formal way to express the lower bound of an algorithm's running time, so we can call it the best case. It measures the best-case time complexity, the shortest amount of time an algorithm can possibly take to complete.

Third is Theta notation. Theta notation is the formal way to express both the lower bound and the upper bound of an algorithm's running time, so it basically describes the average case. To summarize: Big-O notation works on the worst case, Omega notation works on the best case, and Theta notation works on average-case complexity.

These three notations are related to asymptotic analysis. Generally in examinations there will be a question like: "Explain asymptotic analysis and the notations used for it." You can explain these three notations with the help of these diagrams. I hope this will help you in your exam preparation. And if you want to watch more videos related to data structures and algorithms, just go to the description of this video. You will find a link to the playlist of DSA videos. Follow that link and you will find all the data-structure-related videos there. Thank you for watching this video.
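The three cases can also be seen concretely in a simple linear search. This is my own sketch, not from the video: the function counts comparisons, so the best case (target at the front, Omega) takes 1 comparison while the worst case (target absent, Big-O) takes n comparisons.

```python
def linear_search(arr, target):
    """Return (index, comparisons) for target in arr, or (-1, comparisons)."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = [7, 3, 9, 1, 5]

# Best case (Omega): target is the first element -> 1 comparison.
print(linear_search(data, 7))

# Worst case (Big-O): target is absent -> all 5 elements are compared.
print(linear_search(data, 4))
```

On average, over random positions of the target, roughly n/2 comparisons are needed, which is why the average case (Theta) is also linear for this algorithm.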