One of the most useful things we can do with calculus, and one of the more important things in higher mathematics, is something called polynomial approximation. The basic problem is this: I want to use a polynomial to approximate a function. Before we try to solve this problem, there are some questions we need to answer. First, what degree should the polynomial be? Second, what are its coefficients? More generally, there are some other considerations we might want to be aware of. How accurate is our approximation, and what is it worth to gain more accuracy? And importantly, how extensible is our process? Can we make it more accurate? Can we make it apply to more types of functions? How easy is it to do these things? All of these are good things to keep in mind. But the one thing to remember is: if you don't play, you can't win. So, while these are important questions to ask, let's try to answer them as we go along, rather than trying to settle them before we start. For example, let's try to set up a system of equations to approximate f(x) = √x using a second-degree polynomial, p(x) = a0 + a1·x + a2·x². If f(x) is approximately equal to this polynomial, then evaluating both at particular values of x gives us an approximate equality, which we'll treat as an equation. Since there are three unknowns, a0, a1, and a2, we'll need three equations, and therefore three values of x. But which ones? Again, if you don't play, you can't win. We'll try out some values of x and see where they take us. Let's try x = 1. On the one hand, we know that f(1) = √1 = 1. And so, if our function is well approximated by the polynomial, then the function value at 1 should equal the polynomial's value at 1.
So, I'll substitute and clean things up a little bit, and that gives us one equation with three unknowns: a0 + a1 + a2 = 1. If we can find two other equations, we can solve the system. So, let's try x = 2. If x = 2, then f(2) = √2, and substituting 2 into our function and into our polynomial gives an equation involving a0, a1, and a2. Except there's just one problem: we don't have an exact value for √2, and using any approximation of it will reduce the accuracy of our final answer. Computer scientists have a saying for this: garbage in, garbage out. You should start with exact values. So, we want a value of x for which we know √x precisely. Well, how about x = 4? Our function value at 4 is √4, which we know is exactly 2. And if we have a good approximation, the polynomial at x = 4 should match the function at x = 4. Substituting those in and doing a little bit of algebra gives us a second equation: a0 + 4·a1 + 16·a2 = 2. So, that's two equations; we need a third. Again, we want a value of x with an exact function value, so how about x = 9? Since f(9) = √9 = 3, substituting x = 9 gives us the third equation, a0 + 9·a1 + 81·a2 = 3, which is the last of the three we need to solve for a0, a1, and a2. Now, this is a system of three equations in three unknowns, and solving it gives a polynomial interpolation. Polynomial interpolations are generally accurate, especially between the chosen values of x; in this case, we chose x = 1, x = 4, and x = 9. But it's a lot of work to solve this system of equations. So, what can we do? Well, we want to solve the problem in the hardest way possible. No, no, no, wait, wait. We want to solve the problem in a way that's easy to implement.
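As a sanity check, the three interpolation conditions above form a small linear system that a computer can solve directly. Here is a minimal sketch, assuming NumPy is available; the points x = 1, 4, 9 and the polynomial a0 + a1·x + a2·x² come from the discussion above.

```python
import numpy as np

# Interpolation conditions p(x) = sqrt(x) at x = 1, 4, 9,
# where p(x) = a0 + a1*x + a2*x^2.
A = np.array([
    [1.0, 1.0,  1.0],   # p(1) = 1
    [1.0, 4.0, 16.0],   # p(4) = 2
    [1.0, 9.0, 81.0],   # p(9) = 3
])
b = np.array([1.0, 2.0, 3.0])

a0, a1, a2 = np.linalg.solve(A, b)

def p(x):
    # Evaluate the interpolating polynomial.
    return a0 + a1 * x + a2 * x**2

print(p(2.0))   # an approximation of sqrt(2)
```

Solving gives a0 = 3/5, a1 = 5/12, a2 = -1/60, and p(2) ≈ 1.367, reasonably close to √2 ≈ 1.414 even though x = 2 was not one of the interpolation points.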
And one simplification is to use x = 0 as one of our values, which eliminates almost all of the terms of the polynomial: evaluating our function and our polynomial at 0 gives the much simpler equation 0 = a0. Now we have four equations but only need three, so we can get rid of any one of the earlier equations and still have a system we can solve. Still, this is a hard problem to solve, and it's not easily extensible. If I wanted to find a third-, fourth-, or fifth-degree polynomial approximation, I'd have to solve a system of equations in four, five, or six unknowns. If only we knew some mathematics beyond algebra. Oh, wait a minute. We do know mathematics more powerful than algebra. We've been learning calculus, so let's see what we can do with that. It's still convenient to use x = 0: letting x = 0 and treating the approximation as an equality still gives us the equation 0 = a0. So, let's try something from calculus. Suppose we differentiate. If √x is equal to our second-degree polynomial, then surely the derivative of √x is equal to the derivative of our second-degree polynomial. It's on the Internet; it must be true. This is something that will bear more thought, but for right now, we'll assume that the derivatives are equal. And as before, we'll choose a value of x to get an equation. If x = 0, we have... ah, that won't work: the derivative of √x is 1/(2√x), which is undefined at 0. We'll have to pick a different value of x. And remember, garbage in, garbage out: whatever value of x we choose, we need to make sure we can evaluate both sides exactly. So, if x = 1, we have 1/2 = a1 + 2·a2, which gives us a second equation. And we could get a third equation by letting x = 4, which gives 1/4 = a1 + 8·a2. We now have three equations and can solve them, except this problem is still too hard and still not easy to extend. So, what can we do? The secrets of the universe can be found on a shampoo bottle: lather, rinse, repeat. We differentiated once.
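The derivative-matching version can be set up and solved the same way. A sketch, again assuming NumPy; the three conditions are p(0) = 0 from the function value at x = 0, plus matching the derivative 1/(2√x) at x = 1 and x = 4, with p(x) = a0 + a1·x + a2·x² so that p'(x) = a1 + 2·a2·x.

```python
import numpy as np

# Conditions: p(0) = 0, p'(1) = 1/2, p'(4) = 1/4,
# where p(x) = a0 + a1*x + a2*x^2 and p'(x) = a1 + 2*a2*x.
A = np.array([
    [1.0, 0.0, 0.0],   # p(0)  = a0
    [0.0, 1.0, 2.0],   # p'(1) = a1 + 2*a2
    [0.0, 1.0, 8.0],   # p'(4) = a1 + 8*a2
])
b = np.array([0.0, 0.5, 0.25])

a0, a1, a2 = np.linalg.solve(A, b)
print(a0, a1, a2)   # 0, 7/12, -1/24
```

This mixes one function-value condition with two derivative conditions, exactly the system set up above; it is only meant to show that the setup is solvable, not that the resulting fit is a good one.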
Why not differentiate a bunch of times? So, let's say I have my second-degree polynomial. I'll differentiate it once, twice. And if my function and all its derivatives are defined at x = 0, then I can find the coefficients directly by evaluating the function and its derivatives at 0: differentiating the polynomial n times and plugging in x = 0 leaves just n!·an, so an = f^(n)(0)/n!. And let's say I'm not satisfied with a second-degree polynomial. I could use a cubic, or an even higher-degree polynomial. So that means whatever degree of polynomial I want to use, we can very easily extend this approach to find all of the coefficients. And this leads to something called the Maclaurin series. Suppose a function f and all of its derivatives exist at x = 0. The Maclaurin series of f is the power series, sum from n = 0 to infinity of an·xⁿ, where an = f^(n)(0)/n! is determined by the value of the nth derivative of f at x = 0. In the next video, we'll take a look at finding the Maclaurin series for some functions.
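To make the coefficient recipe concrete, here is a minimal sketch using f(x) = eˣ as an example function (an assumption for illustration; its derivatives at 0 are all 1, so every coefficient is 1/n!). Note that √x itself has no Maclaurin series, since, as we saw, its derivative is undefined at x = 0.

```python
import math

# Maclaurin partial sum for f(x) = e^x: every derivative of e^x at 0
# equals 1, so the coefficients are a_n = f^(n)(0)/n! = 1/n!.
def maclaurin_partial_sum(x, degree):
    # sum_{n=0}^{degree} x^n / n!
    return sum(x**n / math.factorial(n) for n in range(degree + 1))

print(maclaurin_partial_sum(1.0, 10))  # close to e = 2.71828...
```

Even the degree-10 partial sum at x = 1 agrees with e to several decimal places, a preview of how quickly these series can converge for well-behaved functions.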