What if we have more than one constraint? It's important to recognize that constraints are really inequalities. For example, to maximize the area enclosed by 240 feet of fencing, you could consider areas that use less fencing. Similarly, to find the shortest distance between a point and a curve, you could consider points on the far side of the curve. With just one constraint, there's usually no reason to worry about the inequalities. But if there's more than one constraint, the inequalities become genuine possibilities: at the optimum, some constraints may hold with room to spare rather than with equality.

Now, in what we're about to do, it helps to think of the constraints as resources. While in the real world these resources are different types of things, we can imagine converting one into the other; we actually do this in real-world projects by assigning a cost to each resource required. So our different constraint functions can be combined into a single function. Note that this combined function is not by itself a constraint function, and we'll do an example to see why.

So we'll come back to our table and chair factory and say that putting together a chair costs five hours of cutting and two hours of finishing, and putting together a table costs two hours of cutting and three hours of finishing. If 40 hours are available for cutting and 20 hours are available for finishing, let's write the constraint functions, write a single function expressing the total cutting and finishing used, and explain why the resulting function is not a constraint.

So suppose we assemble x chairs and y tables. Since each chair takes five hours of cutting and each table takes two hours of cutting, the total cutting time for the chairs and tables will be 5x + 2y. Since there are 40 hours available for cutting, we require 5x + 2y ≤ 40. Similarly, since each chair takes two hours of finishing and each table takes three hours of finishing, the total finishing time for the chairs and tables will be 2x + 3y.
And since there are 20 hours available for finishing, we require 2x + 3y ≤ 20. Now, we can incorporate both constraints into a single function if we convert the cutting hours and finishing hours into the same type of quantity. In the real world, we often do that by computing a monetary cost. Suppose it costs $1 per hour to cut and $2 per hour to finish. Then the cost to cut x chairs and y tables will be 1(5x + 2y), the cost to finish x chairs and y tables will be 2(2x + 3y), and so the cost to cut and finish x chairs and y tables will be the sum, 9x + 8y.

So why isn't this a constraint function? For concreteness, suppose instead it costs $10 per hour to cut but $15 an hour to finish. Now imagine we've used all available cutting time, 40 hours, but only 10 hours of finishing. Our cost will be 40($10) + 10($15) = $550. But if we decrease the amount of finishing time, we could increase the cutting time and maintain the same cost. For example, if our finishing time went down to eight hours, we could increase the cutting time to 43 hours and still spend the same $550. But this requires more cutting time than we actually have. So the reason this is not a constraint is that the same dollar amount might correspond to both feasible and infeasible values of x and y.

Now, while this new function, f, is not a constraint, it does share some of the properties of a constraint, and those emerge as follows. Suppose we've optimized l. Then the resources used correspond to a value of f, and again there's a corresponding level curve. As before, if the level curve of l is not tangent to the level curve of f, then nearby points with the same value of f, that is, the same use of resources, will give higher or lower values of l. So by exactly the same argument that gave us the Lagrangian in the first place, the optimal values of l will occur when the gradient of l is a scalar multiple of the gradient of f. And here we can introduce one useful simplification.
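The argument above can be sketched in a few lines of code. The $10 and $15 rates and the two allocations are from the example; the helper function names are my own.

```python
# Sketch of the factory example: the two resource constraints and the
# combined cost function that converts both resources into dollars.

CUT_LIMIT, FINISH_LIMIT = 40, 20       # available hours of each resource
CUT_RATE, FINISH_RATE = 10, 15         # dollars per hour, as in the example

def hours_used(chairs, tables):
    """Cutting and finishing hours needed to build the given mix."""
    return 5 * chairs + 2 * tables, 2 * chairs + 3 * tables

def feasible(cut_hours, finish_hours):
    """Check both resource constraints."""
    return cut_hours <= CUT_LIMIT and finish_hours <= FINISH_LIMIT

def cost(cut_hours, finish_hours):
    """Combined 'single function': total dollars spent on both resources."""
    return CUT_RATE * cut_hours + FINISH_RATE * finish_hours

# Two allocations with the SAME cost, one feasible and one not,
# which is exactly why cost alone can't serve as a constraint:
print(cost(40, 10), feasible(40, 10))   # 550 True
print(cost(43, 8), feasible(43, 8))     # 550 False (43 > 40 cutting hours)
```

The point of the last two lines is the argument from the lecture: the level set cost = $550 contains both points we can afford in hours and points we can't.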
Since the effect of multiplying the gradient of f by lambda would just be to multiply each lambda_i by some constant, we can require equality of the gradients and let the coefficients lambda_1 through lambda_n absorb this scalar multiple lambda. And this leads us to the following strategy. Suppose we want to optimize l subject to some set of constraints. An optimal value will occur at a solution of: the gradient of l equals the sum of Lagrange-multiplier multiples of the gradients of the individual constraint functions. Now, generally speaking, these systems are too hard to solve by hand, so we'll focus on setting up the system of equations that we need to solve.

So, for example, let's set up the system of equations needed to find the minimum distance between the origin and the curve formed by the intersection of two surfaces. You might recognize that z² = x² + y² is a cone and 2x + 3y − 4z = 12 is a plane, so we quite literally have a conic section. Since we have to be on the surface z² = x² + y² and also on the plane 2x + 3y − 4z = 12, these are both our constraints. We want to rewrite our constraints as functions, so we'll get all the variables on one side. One constraint is f(x, y, z) = 0, where f(x, y, z) = x² + y² − z². For our other surface, all the variables are already on one side, and since we're differentiating, an added or subtracted constant can be ignored. So our second constraint can be g(x, y, z) = 12, where g(x, y, z) = 2x + 3y − 4z, the variable part of the expression. This gives us our two constraints. And since we're trying to find the distance from the origin, we'll use our old standby, the square of the distance, as our objective function: l(x, y, z) = x² + y² + z². So we want to find lambda and mu where the gradient of l is lambda times the gradient of f plus mu times the gradient of g. We'll fill in those partial derivatives, and comparing components gives us three equations.
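Writing out those partial derivatives explicitly (the lecture fills these in on the board; this is just the computation made visible):

```latex
% Objective and constraints:
%   l(x,y,z) = x^2 + y^2 + z^2          (squared distance from the origin)
%   f(x,y,z) = x^2 + y^2 - z^2 = 0     (the cone)
%   g(x,y,z) = 2x + 3y - 4z = 12       (the plane)
\nabla l = \lambda\,\nabla f + \mu\,\nabla g
\;\Longrightarrow\;
\begin{cases}
2x = 2\lambda x + 2\mu,\\
2y = 2\lambda y + 3\mu,\\
2z = -2\lambda z - 4\mu.
\end{cases}
```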
Now there are five unknowns: x, y, z, lambda, and mu. So we need two more equations, and we get those from our constraints. Again, these systems of equations are typically extremely nonlinear and extraordinarily challenging to solve by hand, so when we solve them, we'll probably use some sort of technology. In this particular case, the system leads to a fourth-degree equation, and we find two sets of real solutions. And if we consider the geometry of the situation, one of these points is the closest point to the origin, and the other is the farthest point from the origin.
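As one possible instance of "some sort of technology," here is a numerical root-finder applied to the five-equation system above. The initial guess is my own, chosen to land on the nearer of the two real solutions; a guess with large negative z would find the farther one instead.

```python
# Numerically solving the Lagrange system for the cone/plane example.
import numpy as np
from scipy.optimize import fsolve

def equations(v):
    x, y, z, lam, mu = v
    return [
        2*x - (2*lam*x + 2*mu),    # l_x = lam*f_x + mu*g_x
        2*y - (2*lam*y + 3*mu),    # l_y = lam*f_y + mu*g_y
        2*z - (-2*lam*z - 4*mu),   # l_z = lam*f_z + mu*g_z
        x**2 + y**2 - z**2,        # constraint: on the cone
        2*x + 3*y - 4*z - 12,      # constraint: on the plane
    ]

# Start near the branch of the conic section closer to the origin:
x, y, z, lam, mu = fsolve(equations, [1.0, 1.3, -1.6, 0.05, 0.8])
distance = np.sqrt(x**2 + y**2 + z**2)
print((round(x, 3), round(y, 3), round(z, 3)), round(distance, 3))
```

The solution satisfies both constraints to machine precision, and the resulting distance is the minimum distance from the origin to the conic section.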