In this video session, we are going to look at the concept of memoization using the Fibonacci series. We will be using Python as the language of implementation, and I advise you to check the links provided in the description to understand more about memoization. As a learning outcome for this session, it is planned that at the end of this session students will be able to understand the concept of memoization and its implementation in Python. We will look at what exactly memoization is and implement it in a Pythonic way, using some of Python's built-in container types so that the implementation comes out a little more elegantly in Python syntax.

As this is a live hands-on, I'll be using the PyCharm IDE. I expect the audience to have a fundamental knowledge of Python and a working copy of the latest stable PyCharm; the Community Edition is enough for our needs in this video session.

Before we do any hands-on implementation of algorithms in Python, it is very crucial to know the time complexity of containers and collections in Python. When we design algorithms, we write pseudocode and calculate the time complexity based on that pseudocode, or on the algorithmic representation in general. But when we implement those algorithms in a language (Python here, but it could be C, Java, or another language), most developers simply presume that a specific operation, for example sorting, searching, or finding the minimum element in a list or some other container, has a particular complexity. Very often this assumption results in a badly performing implementation. So it is very important to understand what those operations are and what their time complexity is in Python's CPython runtime.

We will go through Python's standard time complexity reference, which lists the time complexity of common operations on containers and collections from the standard library and the collections module. Apart from that, we will also look into a Python module called heapq, which lets me build min-heaps and also gives me a priority queue implementation in Python. We will touch on that aspect as well and see what the complexity of a heap or priority queue constructed with that module will be.

So let me switch to Python's time complexity reference. We are now on the page that documents these complexities; they are based on the current implementation of CPython, and this is for Python 3. When you look at it, the important aspect to concentrate on when you give a Python implementation to an algorithm is these complexities. For example, if you take a list and append an element at the end of that list, then the complexity of that operation is O(1).
That is the average case. The average case is enough for a theoretical consideration, but when you are writing a program for practical use it is also important to understand the amortized worst-case complexity; for now, though, we will skip that and talk mainly about the average-case columns. So append is O(1), but notice that if you insert at the beginning of a list instead, the complexity suddenly becomes much higher, O(n). You can also see "pop last": removing the last element of a list is O(1), but popping an element from the middle of the list is O(n). So in our algorithm implementation, if we perform that operation, we have to count it as O(n), not O(1). That is the most common misconception among developers and novice beginners of the language: they assume their algorithm and their program have equal complexity. No; it depends on how well you correlate Python's (or any language's) internal implementation with your algorithm's derived complexity. Since our Python implementation of the algorithm depends on these details, it is very important that we look at this chart. That was for the list.

If you want a double-ended queue, the best data structure is the deque from the collections module, which lets you append left and append right, that is, insert and delete at both ends of the container, in O(1). That is what is recommended.

Two more operations we use very frequently are searching in a list and sorting. When you search in a list, for example `x in s`, it is O(n); finding the minimum and maximum of a list is also O(n). But, surprisingly, getting the length of a list is O(1), quite contrary to the common perception that finding the length of a list would be a linear operation. It is a beauty of Python that it gives you the length of a list in O(1).

Next is the set. Checking whether an element is present in a set or not is O(1) on average, and operations such as union, intersection, and difference scale with the sizes of the sets involved. We very often need such operations in algorithm implementations.

Then there are the dictionaries. In the average case, dictionary operations are O(1): get item, set item, and delete item. That is what we will mostly rely on; the amortized worst case is O(n), but we will not give much attention to that here. Iterating over a dictionary is O(n), and copying a dictionary is also O(n). Do not think that creating a copy of a dictionary is a constant-time operation; whenever you copy a dictionary, it is O(n).

Another very important operation is sort, which we use very frequently in our algorithms. Sorting is O(n log n), and it is based on the Timsort implementation. A very important note: the built-in sort, whether it is list.sort or sorted, is always O(n log n) in Python.

So that is the time complexity of the basic operations. As mentioned, we may also use another module, heapq, since most algorithms do require you to employ priority queues and heaps.
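To make these costs concrete, here is a minimal sketch (not part of the original video) that exercises the operations just discussed, plus a first look at the heapq module covered next. The variable names are my own, and the Big-O notes in the comments simply restate the reference page.

```python
from collections import deque
import heapq

items = [5, 2, 9, 1]

items.append(7)       # O(1): appending at the end of a list
items.insert(0, 3)    # O(n): inserting at the front shifts every element
items.pop()           # O(1): "pop last" removes the final element
items.pop(0)          # O(n): popping from the front or middle is linear

d = deque(items)
d.appendleft(0)       # O(1): a deque supports constant-time insertion
d.append(10)          # O(1): and deletion at both ends

print(9 in items)     # O(n): membership search scans the list
print(len(items))     # O(1): the length is stored, not recomputed
print(min(items), max(items))   # O(n) each

s = {1, 2, 3}
print(2 in s)         # O(1) on average: set membership is a hash lookup

counts = {"a": 1}
counts["b"] = 2       # O(1) on average: dict set item
print(counts.get("a"))   # O(1) on average: dict get item
copy = counts.copy()  # O(n): copying a dictionary touches every entry

print(sorted(items))  # O(n log n): Timsort, same for items.sort()

heap = []
for x in items:
    heapq.heappush(heap, x)   # O(log n) per insertion into a min-heap
print(heapq.heappop(heap))    # O(log n): retrieves the smallest element
```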
The heapq module gives you a min-heap representation and is based on the textbook heap algorithms, but its API differs a bit from the textbooks: it uses zero-based indexing and gives you a min-heap, and the heap itself is just a regular Python list, so it is essentially an array-based heap implementation. The complexities are the standard heap complexities: insertion and retrieval of the smallest element are O(log n), and heap-sorting n elements is O(n log n). This is very beneficial when you want to implement a heap in our algorithms, specifically for something like the optimal file merge pattern or any other algorithm that requires maintaining a heap or tree-based structure.

So let us switch to the PyCharm IDE and start writing the code. Let us look at the problem of memoization, and as we said, we are going to solve it for the Fibonacci series: given n, I have to calculate the nth element of the series. The usual Fibonacci program is the recursive one, where I recursively call f(n - 1) and f(n - 2). Starting from that recursive version, we have made a few optimizations here: I took a list of size n + 1, and each element of this list has been initialized with -1 (a sketch of this code is shown below).

Now let us look at the definition of the Fibonacci function. When I get an n, I first check if it is less than or equal to 1; these are the base cases that make the recursion terminate, and if so I return n. Otherwise, before making a recursive call, I check whether this value was already computed. If the memoization list at index n is still -1, the value is the same as what we initialized it with, so it was not computed yet. In that case I make the recursive call, store the result I obtain in the list at that index, and then return the value. Later in the recursion trace, if Fibonacci of that same n is called again, perhaps in some other branch of the tree, the entry is no longer -1, which means the value is already computed; there is no need to go through that whole recursive branch again. We go directly to the else case and return the previously computed value. At the end I print the value of the Fibonacci series for n.

If you observe this, it greatly reduces the recursion tree and also removes a lot of the recursion overhead involved in repeating the same computations again and again. In short, this is a good example of memoization when we have recursive logic, especially recursive logic where the same subproblems recur across many branches of the recursion tree. It greatly minimizes the calls, and the performance of the algorithm gets boosted. You can now analyze this program and see how much performance it actually gains; generally, memoization does tend to help an algorithm improve its performance. That's it for this hands-on. Let us move to the next slide.
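The code typed in PyCharm is not reproduced in the transcript, so here is a minimal sketch of the memoized Fibonacci described above. The names fib and memo are my own, and I pass the memoization list as a parameter, whereas in the video it may simply be a list defined alongside the function; otherwise the structure follows the walkthrough: a list of size n + 1 initialized with -1, base cases for n <= 1, and a stored result reused on repeated calls.

```python
def fib(n, memo):
    # Base cases of the recursion: fib(0) = 0, fib(1) = 1
    if n <= 1:
        return n
    # -1 means this value has not been computed yet
    if memo[n] == -1:
        # Compute it once via the usual recursive definition
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    # On every later visit, return the previously stored result
    return memo[n]


n = 30
memo = [-1] * (n + 1)   # list of size n + 1, each element initialized with -1
print(fib(n, memo))     # prints the nth Fibonacci number, 832040 for n = 30
```

You can trace the calls for a small n here and use that to answer the reflection question that follows.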
Okay, so now that we have seen the Python implementation of memoization for the Fibonacci series example, I have a quick question for you as a reflection: do you think memoization helps in improving an algorithm's time complexity? You can pause the video and analyze the algorithm, or watch the video again up to this point, and guess the answer. The answer to this question is yes: in most cases memoization helps improve an algorithm's time complexity. For this Fibonacci example, each value is computed only once, so the plain recursion's exponential running time comes down to linear in n. In some cases it might not directly improve the asymptotic time complexity, but it would still help reduce the work and optimize the program for a specific runtime environment. That's it for this video. Thank you.