One particularly important result comes from the following optimization problem: minimize the sum of the squares of the differences. So suppose I have three given numbers a, b, and c. I want to find a value x so that the sum of the squared differences between x and the three numbers is as small as possible. So my function is going to be f(x) = (x - a)^2 + (x - b)^2 + (x - c)^2.

So I'll differentiate: f'(x) = 2(x - a) + 2(x - b) + 2(x - c). Since this is again a polynomial, the derivative is defined everywhere, and so my critical points will occur only where the derivative equals zero. I'll solve that equation for x: setting the derivative to zero rearranges to 3x = a + b + c. A useful feature of solving for the critical points this way is that comparing the two sides of this equation tells us the sign of the derivative. So we have a critical point at x = (a + b + c)/3. If x is less than this value, then 3x will be less than a + b + c, so the derivative will be negative, and f(x) decreases until it reaches this point. Similarly, if x is greater than (a + b + c)/3, then 3x will be greater than a + b + c, so the derivative will be positive, and f(x) increases after this point. And so that means x = (a + b + c)/3 corresponds to a local minimum value; in fact, since f decreases everywhere to its left and increases everywhere to its right, it is the global minimum.

Let's try to generalize this. If we had more values a1 through an and wanted to minimize the sum of the squared differences, we'd find our derivative as before, and the algebra would tell us that a critical point occurs at x = (a1 + a2 + ... + an)/n. We'd be able to do the same sort of analysis and find that this corresponds to a local minimum value.

At this point, you should ask yourself: self, where have we seen this before? And it turns out that this is just the mean of those numbers. And this gives us an important result: given a set of numbers, the mean minimizes the sum of the squared differences.
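To make the algebra for the general case explicit, here is a short sketch of the derivation in LaTeX notation; the values a_1 through a_n and the function f are the ones from the discussion above.

```latex
\[
  f(x) = \sum_{i=1}^{n} (x - a_i)^2,
  \qquad
  f'(x) = \sum_{i=1}^{n} 2(x - a_i) = 2\Bigl(nx - \sum_{i=1}^{n} a_i\Bigr).
\]
\[
  f'(x) = 0
  \quad\Longleftrightarrow\quad
  x = \frac{1}{n}\sum_{i=1}^{n} a_i.
\]
```

Just as in the three-number case, the derivative is negative to the left of this value and positive to the right, which is what identifies it as the minimum.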
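As a quick numerical sanity check, here is a small Python sketch; the sample values and the helper name sum_sq_diff are made up for illustration. It compares the sum of squared differences at the mean against many random candidates:

```python
import random

def sum_sq_diff(x, values):
    """Sum of squared differences between x and each number in values."""
    return sum((x - a) ** 2 for a in values)

# Hypothetical sample data, standing in for a1 through an.
values = [2.0, 5.0, 11.0]
mean = sum(values) / len(values)

# No random candidate should ever beat the mean.
for _ in range(10_000):
    x = random.uniform(min(values) - 10, max(values) + 10)
    assert sum_sq_diff(mean, values) <= sum_sq_diff(x, values)

print(f"mean = {mean:.4f}, minimum sum of squares = {sum_sq_diff(mean, values):.4f}")
```

Every random candidate loses to the mean, matching the result above: the gap between f(x) and f(mean) works out to n times (x - mean)^2, which is never negative.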