This time we're going to look at Amdahl's law in a different light. One of its most useful applications is deciding how and when to optimize a program. Here we have a series of three programs that we'd like to run, and we're going to assume that the amount of computation each program does is fixed, so improving the performance of one program doesn't change the distribution of work: program one will still account for 10% of our total computation even if it takes much less time. This should give us some insight into how optimization works, what limits we run into, and where Amdahl's law fits in.

To begin, I want to know how much time I spend on each of these programs on average, which I can find with a weighted average. Program one is 10% of the total computation and takes five seconds per run, so it contributes 10% × 5 s = 0.5 s to the average execution. Program two is 20% of my computation at six seconds per run, which gives me 1.2 s. Program three takes the remaining 70% and requires one second to execute, contributing 0.7 s. So for the average program execution, I have half a second from program one, 1.2 seconds from program two, and 0.7 seconds from program three. The useful thing this tells me is that program two is currently the most expensive item: either because I need to run program two many times, or simply because each run of program two is expensive.
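The weighted average above can be sketched in a few lines of Python. The percentages and per-run times come straight from the example; the variable names are my own.

```python
# (fraction of total computation, seconds per run) for each program,
# taken from the example's numbers.
workload = {
    "program 1": (0.10, 5.0),
    "program 2": (0.20, 6.0),
    "program 3": (0.70, 1.0),
}

# Average time each program contributes to one "average execution".
contributions = {name: frac * secs for name, (frac, secs) in workload.items()}
average_time = sum(contributions.values())

for name, t in contributions.items():
    print(f"{name}: {t:.1f} s")   # 0.5 s, 1.2 s, 0.7 s
print(f"total: {average_time:.1f} s")  # 2.4 s per average execution
```

Running this confirms that program two's 1.2 s is the largest contribution, which is what makes it the first optimization target.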
Either way, I will get the most benefit out of optimizing program two, because its 1.2-second contribution is larger than any of the other options. The second most expensive program is program three. So if I were going to optimize these programs, I'd start with program two, improve it as much as possible, and later switch to optimizing program three.

But when is it worth making that switch? When am I more likely to get more benefit from program three? That's when program two's average contribution gets down to 0.7 seconds as well: once those two numbers match, it would be equally productive to optimize either one. That happens once I've sped program two up by a factor of 1.2 over 0.7, because applying that inverted fraction to the 1.2-second contribution cancels the 1.2 and leaves 0.7 seconds. To see what that means for the execution time of an individual run of program two, I take the six seconds it currently takes and multiply by 0.7 divided by 1.2. Multiplying both of those numbers by ten gives something easier to work with: the six mostly cancels the twelve, leaving 6 × 7/12 = 7/2 seconds, or 3.5 seconds. So once a run of program two takes 3.5 seconds or less, my effort is better spent on program three. I'm switching from optimizing program two to program three because beyond that point I'm spending more time on average in program three than in program two. But we can also consider what happens if I just keep trying to improve program two.
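The break-even arithmetic above can be checked directly. This is just the example's numbers again; `speedup_needed` and `break_even_runtime` are names I've introduced for the two quantities derived in the text.

```python
# Program 2: 20% of the computation at 6 s per run, contributing 1.2 s
# to the average execution; program 3 contributes 0.7 s.
frac2, time2 = 0.20, 6.0
avg2 = frac2 * time2   # 1.2 s
avg3 = 0.70 * 1.0      # 0.7 s

# Program 2 stops being the best target once its average contribution
# falls to program 3's 0.7 s, i.e. after a speedup of 1.2/0.7.
speedup_needed = avg2 / avg3              # 12/7, about 1.71x
break_even_runtime = time2 * avg3 / avg2  # 6 s x 0.7/1.2 = 3.5 s

print(f"speedup needed: {speedup_needed:.2f}x")
print(f"break-even per-run time: {break_even_runtime} s")
```

Either form of the answer is equivalent: speed program two up by about 1.71×, or equivalently get its per-run time down to 3.5 seconds.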
Eventually I'm going to hit a bottleneck where program two has been improved as much as possible. It's hardly taking any time at all, and the rest of my time goes to running program one or program three, yet I'm still exerting all of my effort on program two and getting diminishing returns. Continuing to focus on program two once it's no longer the most expensive program is simply not efficient; at that point we should switch to one of the other programs. This is essentially what Amdahl's law is telling us: as long as you focus on only one component, your optimization will run into diminishing returns.
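The diminishing returns can be made concrete with the standard Amdahl's law formula, overall speedup = 1 / ((1 − p) + p/s), where p is the fraction of time spent in the component being optimized and s is that component's speedup. The formula itself isn't spelled out in the discussion above, so this is a sketch of my own applying it to the example's numbers: program two is 1.2 s of the 2.4 s average execution, so p = 0.5.

```python
# Fraction of the average execution spent in program 2: 1.2 s of 2.4 s.
p = 1.2 / 2.4

def overall_speedup(s):
    """Amdahl's law: overall speedup when program 2 alone is sped up by s."""
    return 1.0 / ((1.0 - p) + p / s)

for s in (2, 10, 100, 1000):
    print(f"{s:>5}x on program 2 -> {overall_speedup(s):.3f}x overall")
```

Even an infinite speedup of program two alone can never do better than 1/(1 − p) = 2× overall, and the table makes the flattening visible: going from a 100× to a 1000× speedup of program two barely moves the overall number.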