In this video session we are going to look at a Python implementation of the algorithm problem called the optimal file merge pattern. Let us first see the outcome planned for this session: by the end of it, the student will be able to provide a Python implementation for the optimal file merge pattern problem. From here onwards we will see a live, hands-on demonstration of writing Python code for this problem. Before we move to the PyCharm IDE, I would like to make sure that the audience watching this video has a good knowledge of Python fundamentals and a recent, stable PyCharm installation; the Community Edition is enough for our needs. So let us switch to the PyCharm IDE and start writing code for the problem. Okay, before we do any hands-on implementation of algorithms in Python, it is very crucial to know the time complexity of the containers and collections in Python. When we design an algorithm, we write pseudocode and calculate the time complexity based on that pseudocode, or on the algorithmic representation in general. But when we implement those algorithms in an actual language, for example Python, and in some cases C, Java, or another language, most developers simply presume that a specific operation, maybe sorting, searching, or finding the minimum element in a list or some other container, has a particular complexity. Most often this assumption results in a bad implementation. So it is very important that we understand what those operations are and what their time complexities are in Python's CPython runtime.
So we will go through Python's standard time-complexity reference, which lists the time complexity of the various basic operations on containers and collections from Python's standard library and the collections module. Apart from that, we will also look into a Python module called heapq, which lets me build min-heaps and also gives me a priority-queue implementation in Python. We will touch on that aspect as well and see exactly what the complexity of a heap or priority queue constructed using that module will be. So let me switch to Python's time-complexity page. We are now on a page in the standard Python docs where you can see that these complexities are based on the current implementation of CPython, and it is Python 3 we are looking at. Now, the important aspect you need to concentrate on when you give a Python implementation of this algorithm is these complexities. For example, if you take a list and append an element at the end of that list, then the complexity of that operation is O(1) in the average case. For a practical implementation it is also important to understand the amortized worst-case complexity, but for now we will skip that slightly and talk about the average-case complexities. So append is O(1), yet you can see that when I insert at the beginning instead, the complexity is suddenly much higher. And look at "pop last" here: removing the last element from the list is O(1), but popping an element from the middle of the list has complexity O(n).
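The list costs just described can be sketched in a few lines (the list `s` and its values here are purely illustrative, not the example from the video):

```python
# Illustrative sketch of the list complexities discussed above.
s = list(range(5))            # [0, 1, 2, 3, 4]

s.append(99)                  # append at the end: O(1) on average
last = s.pop()                # pop the last element: O(1)

s.insert(0, -1)               # insert at the beginning: O(n) -- every element shifts
first = s.pop(0)              # pop from the front (or middle): O(n)

print(last, first, s)         # 99 -1 [0, 1, 2, 3, 4]
```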
So you can see that in our algorithm implementation, if we do that operation, we have to count it as O(n), not O(1). That is the most common misconception among developers or novice beginners with a language: they keep in mind that their algorithm and their program have equal complexity. No; it depends on how well you correlate Python's (or any language's) internal implementation with your algorithm's derived complexity. Since the Python implementation of the algorithm we are going to see hands-on depends on Python, it is very important that we look at this chart. So that was for the list. If you want a double-ended queue, the best data structure is the deque, which gives you append left and append right, that is, insertion and deletion at both ends of the container in O(1). So that is what is recommended. Two more operations we use very frequently are sorting and searching in a list. When you search in a list, for example `x in s`, it is O(n). Finding the minimum or maximum of a list is also O(n), but you can see that, surprisingly, getting the length of a list is O(1), quite contrary to the common developer perception that finding the length of a list would be an O(n) operation. It is a beauty of Python that it gives you the length of a list in O(1). Next is the set. Operations like symmetric difference take time proportional to the sizes of the sets involved, but testing whether an element is present in a set is O(1) in the average case, and we very often need such operations in algorithm implementations. And then there are the dictionaries. As you can see, in the average case dictionary operations are always O(1): get item, set item, and delete item. That is what we mostly need to worry about, although the amortized worst-case complexity is O(n). Fine, we will not give much attention to that; we will rely on the average-case assumptions.
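A small sketch of the deque and the O(1) membership tests just mentioned (the variable names and values are my own illustration):

```python
from collections import deque

d = deque([2, 3, 4])
d.appendleft(1)        # O(1) insertion at the left end
d.append(5)            # O(1) insertion at the right end
left = d.popleft()     # O(1) deletion at the left end
right = d.pop()        # O(1) deletion at the right end

s = {10, 20, 30}
present = 20 in s      # set membership test: O(1) average case

counts = {"a": 1}
counts["b"] = 2        # dict set item: O(1) average case

print(left, right, present, len(d))  # 1 5 True 3
```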
So iterating over a dictionary is O(n), and copying a dictionary is O(n). Do not think that creating a copy of a dictionary is a single operation; no, whenever you create a copy of a dictionary, it is O(n). Another very important operation, by looking at this, is sort, which we use very frequently in our algorithms. As you can see, the complexity of sorting is O(n log n), and it is based on the Timsort implementation. Very important: note that the built-in sort, whether `list.sort` or `sorted`, is always O(n log n) in Python. So much for the time complexity of the basic operations. As we have said, we will be using another module, heapq; most algorithms do require employing priority queues and heaps. This module gives you a min-heap representation, and it is based on the textbook heap algorithms, though its API differs slightly from the textbooks: it gives you a zero-indexed min-heap built on a plain Python list. Heapifying a list of n elements is O(n), each insertion or retrieval is O(log n), and repeatedly popping everything, as in heapsort, is O(n log n) overall. So this is also very beneficial when you want to implement a heap, specifically for the optimal file merge pattern or any other algorithm that requires maintaining a heap or tree-based structure. Given this understanding, we can now jump to the Python implementation of our algorithm. Okay, now we are in the PyCharm IDE. This is the program for the optimal file merge pattern. As you can see, we are relying here on the heapq module, which lets me create heap-like data structures in Python.
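As a quick sketch of the heapq behavior described above (the input list here is just an example of mine):

```python
import heapq

data = [30, 10, 20, 5]
heapq.heapify(data)              # rearranges the list into a min-heap in O(n)
smallest = heapq.heappop(data)   # O(log n); always returns the current minimum
heapq.heappush(data, 1)          # O(log n) insertion; heap order is preserved

print(smallest, data[0])         # 5 1  (data[0] is always the current minimum)
```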
So if I have files, and this list denotes the sizes of those files, then as per the optimal file merge pattern problem, I am supposed to find the minimum number of operations needed to merge all these files. This code sticks to the standard algorithmic approach. First we heapify: we create a copy of the file list, call it the heap of files, and heapify it so that it is structured in the form of a heap. Once it is structured as a heap, I keep processing the heap as long as the number of nodes in it is greater than one. Technically, what I am doing is removing the first two elements from the heap; since this is a min-heap, it guarantees that these are the two minimum values in it. Once I get the two smallest files, I merge them, and the merged size, the value c, is sent back into the heap, because this is the resultant merged file and I now need to consider merging it with the remaining files. The moment I put it into the heap using heappush, the module automatically makes sure that this value is properly inserted into the heap. So in the next iteration, when I pop a and b, it properly makes sure that I get the minimum values, which I merge exactly as the standard optimal file merge pattern algorithm prescribes. In each iteration, I also note down the cost of the merge by adding it to the total operations. So in the end, the print statement prints the total number of operations required to merge these files, and this total will be the minimum possible for the given case. Now if I execute this, you should see that for the given input the minimum number of operations is 68.
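The program described above can be sketched as follows. The variable names (`heap_files`, `total_operations`) mirror the transcript's description, but the input list `[5, 6, 10, 15]` is my own hypothetical choice; it happens to produce the 68 mentioned in the video, though the actual list used on screen is not visible in the transcript:

```python
import heapq

def optimal_merge_cost(file_sizes):
    """Greedy optimal file merge: repeatedly merge the two smallest files.

    Returns the minimum total number of merge operations.
    """
    heap_files = list(file_sizes)       # work on a copy of the file sizes
    heapq.heapify(heap_files)           # O(n) min-heap construction
    total_operations = 0
    while len(heap_files) > 1:          # keep merging while more than one file remains
        a = heapq.heappop(heap_files)   # smallest file
        b = heapq.heappop(heap_files)   # second-smallest file
        c = a + b                       # cost of merging these two files
        total_operations += c
        heapq.heappush(heap_files, c)   # merged file goes back into the heap
    return total_operations

files = [5, 6, 10, 15]                  # hypothetical input, not necessarily the one on screen
print(optimal_merge_cost(files))        # 68 for this particular list
```

The push after each merge is what keeps the greedy invariant: the next pop pair is always the two cheapest remaining files.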
So you can verify this by pausing the video and trying to solve this input manually. Once again, you can guess the algorithm design paradigm: we always take the minimum possible pair and merge it. So that is the program for this video section; let us move to the next slide. Okay, now that we have seen the Python implementation for the problem, can you guess what algorithm design paradigm we have used? You can pause the video at this point and make your guess. The answer is the greedy approach. So that is it for this video. Thank you.