So, to find an extreme value of an objective function subject to a constraint, we can start by looking at places where the gradient of the objective function is lambda times the gradient of the constraint function. But while these places will be extreme values, they might not be the extreme values we want, so we should always check whether we found a maximum or a minimum. Let's take a look at some examples.

For example, a farmer has 240 feet of fencing and wants to enclose a rectangular paddock for sheep. One side of the enclosure uses an existing wall and does not require fencing. What are the dimensions that give the maximum area? You should recognize this as a Calculus I optimization problem, but we'll look at it in multivariable calculus. The old way: we found A(x), a single-variable function giving the area, found the critical values of A'(x), and then determined which critical value corresponds to a maximum.

Using Lagrange multipliers means we need to identify an objective function, which we want to maximize, and a constraint function. We want to maximize the area, so let's put down variables representing our length and width, and, in a spasm of creativity, we'll call one x and the other y. We want to maximize the area, x times y. And since the 240 feet of fencing have to go along three sides of the paddock, the constraint is that 2x + y = 240. So we'll let our constraint function be p(x, y) = 2x + y, and our objective function A(x, y) = xy. And we want the gradient of A to be some constant lambda times the gradient of p.

Now, we know what A and p are, so we can find the gradients. Since p(x, y) = 2x + y, its gradient is the vector of partials with respect to x and y, which is ⟨2, 1⟩. Likewise, for A(x, y) = xy, the partials with respect to x and y give the gradient ⟨y, x⟩. We want the gradient of A to be lambda times the gradient of p, so let's substitute those values in.
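As a quick sanity check on those gradients, a central-difference approximation recovers ⟨2, 1⟩ for the constraint and ⟨y, x⟩ for the objective. This is just a sketch; the test point (3, 7) and step size are arbitrary choices of mine.

```python
# Sanity check of the gradients with central differences (a sketch;
# the test point (3, 7) and the step size h are arbitrary choices).
def grad(f, x, y, h=1e-6):
    return ((f(x + h, y) - f(x - h, y)) / (2 * h),
            (f(x, y + h) - f(x, y - h)) / (2 * h))

p = lambda x, y: 2 * x + y    # constraint:  grad p = <2, 1>
A = lambda x, y: x * y        # objective:   grad A = <y, x>

gp = grad(p, 3.0, 7.0)        # approximately (2, 1) at any point
gA = grad(A, 3.0, 7.0)        # approximately (y, x) = (7, 3)
```

Since p is linear and A is bilinear, the central differences here are essentially exact up to floating-point rounding.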
And we'll complete the scalar multiplication by multiplying each component by lambda. This gives us a vector equation we need to solve. To solve it, we compare components: the first component on the left, y, equals the first component on the right, 2λ; and similarly the second component, x, equals λ. Now, you might notice there's just one problem: we have three unknowns, x, y, and lambda, but only two equations. We need a third equation. Fortunately, we do have another equation involving x and y, and that's our constraint equation, 2x + y = 240. So now we have three equations in three unknowns, and we can solve. Let's take the constraint equation and substitute x = λ and y = 2λ, which gives 2λ + 2λ = 240, so λ = 60. And since x and y are expressed in terms of lambda, we get x = 60 and y = 120, our solution.

Again, we know this value is an extreme value, so we can test whether it's actually the maximum. The important thing here is that since we know it is an extreme value, it's either a maximum or a minimum. So if it's not the maximum, it must be a minimum, and every other value of our objective function would be greater. So we can compare the value at (60, 120) with the objective function at any other point in the feasible region. Now, since 2x + y must equal 240, we can take a value like, oh, I don't know, how about x = 120, y = 0, and compare A(60, 120) with A(120, 0). And since A(120, 0) = 0 is less than A(60, 120) = 7200, our extreme value can't be a minimum, so it must be the maximum.

The method of Lagrange multipliers extends to higher dimensions. In general, to find an extreme value of an objective function subject to a constraint, we seek a solution to: the gradient of the objective equals lambda times the gradient of the constraint.
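The little system above (y = 2λ, x = λ, and the constraint) can be carried out in a few lines of Python; this is a sketch of the substitution we just did, with variable names of my choosing.

```python
# Lagrange system for the fencing problem:
#   grad A = lambda * grad p  gives  y = 2*lam  and  x = lam,
#   together with the constraint 2*x + y = 240.
# Substituting x = lam, y = 2*lam into the constraint: 4*lam = 240.
lam = 240 / 4                 # lambda = 60
x, y = lam, 2 * lam           # x = 60, y = 120

def A(x, y):                  # objective: area of the paddock
    return x * y

assert 2 * x + y == 240       # the point is feasible
# Compare against another feasible point, (120, 0):
assert A(x, y) > A(120, 0)    # 7200 > 0, so (60, 120) is not the minimum
```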
And remember the geometric significance here: any solution corresponds to a point where the normal to the level set of the constraint and the normal to the level set of the objective function (at a specific value) coincide. We'd say they are coincident, but that sounds like it's accidental, whereas we're very much trying to make this happen. It's also worth mentioning that we could instead solve lambda times the gradient of the objective equals the gradient of the constraint; it doesn't matter which side the lambda is on.

So, for example, let's find the point on the plane that is closest to the point (1, -3, 5). Since the point must be on the plane, the equation of the plane is the constraint, and this will be the level surface f(x, y, z) = 30, where f(x, y, z) is the variable portion of the plane's equation. Now, the closest point will have the least distance to (1, -3, 5), and we can find the distance to the point using our distance formula. Here's a useful trick from single-variable calculus: if we want to minimize distance, we can instead minimize the square of the distance. So the objective function we'll use is the squared distance, L(x, y, z). You might remember the reason we do this: if we try to differentiate something with a square root in it, that square root makes the derivative messier and more complicated.

So we want the gradient of L to be lambda times the gradient of f, and that means we'll need our partial derivatives: the partials of L with respect to x, y, and z, and the partials of f with respect to x, y, and z. Then we have an equality between two vectors, so equating the components, first to first, second to second, third to third, we get a system of equations we can try to solve. And again, there are actually four unknowns here, so we need a fourth equation, and that comes from our constraint.
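To see why minimizing the squared distance is legitimate, here is a small numerical illustration on a 1-D slice through the problem: the distance to (1, -3, 5) and its square are minimized at the same point, because t ↦ t² is increasing for t ≥ 0. The slice and the grid here are my own choices, purely for illustration.

```python
import math

# Distance from the point (x, 0, 0) to (1, -3, 5), and its square.
# Both are minimized at the same x, since squaring preserves order
# for nonnegative numbers.
def dist(x):
    return math.sqrt((x - 1)**2 + (-3)**2 + 5**2)

def dist_sq(x):
    return (x - 1)**2 + (-3)**2 + 5**2

xs = [i / 100 for i in range(-300, 301)]   # grid from -3 to 3
best_d = min(xs, key=dist)                 # minimizer of the distance
best_d2 = min(xs, key=dist_sq)             # minimizer of its square
assert best_d == best_d2 == 1.0            # same point, x = 1
```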
Now, since our constraint is linear in x, y, and z, we might actually want to solve these equations for x, y, and z in terms of lambda. Substituting into the constraint equation, we find the value of lambda, and once we have lambda, we can find x, y, and z. And since we're looking for a point, these are the x, y, and z coordinates of a point where the objective function has an extreme value. Again, while this is an extreme value, it might not be the minimum we're looking for, so we should check: could this be a maximum? In this case, the geometry says no: if we stay on the plane, we can get as far away from the point as we want, so the distance has no maximum. And that means the point we found is in fact the point on the plane closest to the given point.
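The whole solve can be sketched in Python. The transcript doesn't record the plane's actual equation, only that the level is 30, so the coefficients below are a hypothetical plane of my own choosing; the steps (solve each component equation for a coordinate in terms of λ, substitute into the constraint, solve for λ) match the lecture.

```python
# Closest point on a plane a*x + b*y + c*z = d to the point (1, -3, 5).
# The plane's coefficients here are a hypothetical example with d = 30;
# the lecture's actual plane equation isn't in the transcript.
a, b, c, d = 1.0, 2.0, 5.0, 30.0
px, py, pz = 1.0, -3.0, 5.0       # the given point (1, -3, 5)

# grad L = lambda * grad f gives 2*(x - px) = lam*a, and similarly
# for y and z, so x = px + lam*a/2, y = py + lam*b/2, z = pz + lam*c/2.
# Substituting into the constraint and solving for lambda:
lam = 2 * (d - (a * px + b * py + c * pz)) / (a**2 + b**2 + c**2)
x = px + lam * a / 2
y = py + lam * b / 2
z = pz + lam * c / 2

assert abs(a * x + b * y + c * z - d) < 1e-9   # point lies on the plane
```

Because the constraint is linear and the objective is the squared distance, the system is linear in λ and solves in closed form, which is what makes this example so clean.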