So this is our last video for lecture 43 in our series, but it's also our first video based on section 11.9 of James Stewart's Calculus textbook, and we'll talk more about what it means to represent a function as a power series in lecture 44, so stay tuned for the next video there. What I want to do at the end of lecture 43 is give you some motivation for why we care about power series at all. I said that a power series is going to connect the theory of series we've been developing over the last several lectures with the theory of continuous and differentiable functions we learned about before. And the reason is the following: a power series is an infinite discrete sum, so it's like a series, but it also has a continuous variable x, so it's like a continuous function. It's this marriage between the discrete and the continuous. And since it is a continuous, potentially even differentiable function, we can ask ourselves: can we take the derivative of a power series? Can we take the integral of a power series? The answer here is going to be yes. Suppose we have a power series, the sum of Cn times (x minus a) to the n, a power series with coefficient sequence Cn and center a, and suppose we know the radius of convergence R is positive; it could be finite or infinite, but it's not the case that the radius of convergence equals zero. In that situation, a power series is in fact differentiable, and since it's differentiable, that also means it's continuous. It will be differentiable on the open interval from a minus R to a plus R. Now, the interval of convergence might include one or both endpoints, so it might have some brackets on it, something like this.
We can't guarantee differentiability at an included endpoint, but on the open interval from a minus R to a plus R we can always guarantee differentiability. In that case, if our function f is defined by the power series, the derivative f prime of x will be the sum from n equals 1 to infinity of n times Cn times (x minus a) to the n minus 1. That formula might seem to come out of nowhere, but if you look at it step by step, the power series in expanded form looks like C0 plus C1 times (x minus a) plus C2 times (x minus a) squared, etc. When we take the derivative of this thing, the derivative of the constant C0 is zero; it's gone, it's dead. The derivative of C1 times (x minus a) is just C1, since the (x minus a) disappears. That's actually why the derivative starts at n equals 1: the constant term, which is n equals 0, disappears when you take the derivative. The next term gives you 2 times C2 times (x minus a), the one after that gives 3 times C3 times (x minus a) squared, and this pattern continues, right? If you ignore the Cn for a moment and look at just the factor n times (x minus a) to the n minus 1, that's just the power rule, right? If the (x minus a) is confusing you, just consider the derivative of Cn times x to the n: since Cn doesn't depend on x at all, the derivative is n times Cn times x to the n minus 1. The idea here is that a power series is essentially just an infinite polynomial. And how do we take the derivative of a polynomial? We go term by term, term by term, and we apply the power rule to each one. That's how we get the derivative of a polynomial, and it turns out that's how it's going to work for power series as well.
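If you like seeing things computationally, here is a small sketch of that term-by-term rule. This is my own illustration, not part of the lecture: it represents a power series by its list of coefficients Cn and applies the power rule to each entry.

```python
# Sketch: term-by-term differentiation of a power series sum c_n (x - a)^n,
# represented by its coefficient list [c0, c1, c2, ...].

def differentiate(coeffs):
    """The nth term c_n (x-a)^n becomes n * c_n (x-a)^(n-1); the constant drops out."""
    return [n * c for n, c in enumerate(coeffs)][1:]

# Example: 1 + 2(x-a) + 3(x-a)^2 differentiates to 2 + 6(x-a).
print(differentiate([1, 2, 3]))  # [2, 6]
```

Notice the slice `[1:]` is exactly the observation from the lecture that the n equals 0 term dies and the derivative series starts at n equals 1.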
Now, there are some convergence issues lurking here, like why we're justified in using linearity at the infinite stage. You can address that with ideas like uniform convergence, but that takes us beyond the scope of Calculus 2. The upshot is: we can take the derivative of a power series term by term by term, just like we do with polynomials. And the same applies to antiderivatives. If we take the antiderivative of this function, there's going to be a constant term, so we write that first: C plus. Then the original constant C0 becomes C0 times (x minus a). The next term becomes C1 times (x minus a) squared over 2, the one after that C2 times (x minus a) cubed over 3; you basically just raise each power by 1 and divide by the new power. And that's exactly what this formula is doing: you have your constant C, and then a sum from n equals 0 to infinity of the coefficient Cn divided by n plus 1, times (x minus a) raised to the n plus 1 power. You just apply the antiderivative power rule term by term over the entire infinite series, and that's how we take the antiderivative of a power series. So let's look at an example of such a thing. We have the function f of x defined as the power series sum from n equals 0 to infinity of (negative 2x) to the nth power over 3 to the n. We want to take the derivative of this thing, and we want to take the antiderivative of this thing. So be aware that in expanded form, f of x looks like 1 minus 2/3 x plus 4/9 x squared minus 8/27 x cubed plus 16/81 x to the fourth.
And then we get minus 32 over 243 x to the fifth, and this pattern would continue indefinitely, on and on and on. Now, before we take the derivative, we should actually ask: what is the radius of convergence here? Because our derivative and antiderivative will only be defined on that interval of convergence. With this one, we don't need a full-blown ratio test to determine the radius of convergence, because we can rewrite the general term and notice this is a geometric series: it looks like (negative 2x over 3) to the n. Since it's a geometric series, it will be convergent so long as the common ratio r has absolute value less than 1. That just means the absolute value of negative 2x over 3 is less than 1, which tells us that 2/3 times the absolute value of x is less than 1. Multiplying both sides by 3/2, we see that the absolute value of x needs to be less than 3/2. So 3/2 is our radius of convergence, and the domains of our derivative and antiderivative here are going to be from negative 3/2 to positive 3/2. Normally we would also plug in 3/2 and negative 3/2 and check the endpoints, but at both endpoints the ratio has absolute value 1, so the geometric series diverges there; the interval of convergence is the open interval, and in any case the theorem only guarantees the derivative and antiderivative on the open interval even when the function itself converges at an endpoint. So let's look at the derivative, f prime of x. Although it's nice to have a general formula, sometimes I like to just work term by term, right? If you take the derivative of 1, it just dies off.
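As a quick numerical sanity check (my own sketch, not from the lecture, assuming the standard geometric closed form 1/(1 - r), which here works out to 3/(3 + 2x)): partial sums of the series should approach that closed form for any x inside the interval from negative 3/2 to 3/2.

```python
# Sketch: the geometric series sum ((-2x/3)^n, n >= 0) should converge to
# 3 / (3 + 2x) whenever |x| < 3/2.

def partial_sum(x, terms=200):
    return sum((-2 * x / 3) ** n for n in range(terms))

x = 1.0  # a point inside (-3/2, 3/2)
closed_form = 3 / (3 + 2 * x)
print(abs(partial_sum(x) - closed_form) < 1e-9)  # True
```

Trying an x with absolute value bigger than 3/2 instead makes the terms grow, which is exactly the divergence the ratio condition rules out.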
So we're going to get negative 2/3, plus 8/9 x, minus 24/27 x squared, plus 64/81 x cubed, and then minus 160/243 x to the fourth, dot, dot, dot, keep on going, right? There's this infinite domino effect falling down here. But it's good to also write this as a general power series, since the derivative of a power series is itself a power series: the sum from n equals 1 to infinity of n times (negative 2/3) to the n times x to the n minus 1. The coefficient (negative 2/3) to the n is unaffected by this process, but the power of x drops to n minus 1. So this gives us a general formula for the derivative, but we can also just work term by term if that's sufficient for us. We can also do the antiderivative, the integral of f of x dx; we'll come back to the general formula in just a moment. Working term by term on the original function, you're going to get C, plus x, minus 1/3 x squared (from negative 2/3 x, since we raise to x squared and divide by 2), plus 4/27 x cubed (from 4/9 x squared, dividing by 3), minus 2/27 x to the fourth (from negative 8/27 x cubed, dividing by 4, since 8 divided by 4 is 2), and then plus 16/405 x to the fifth (from 16/81 x to the fourth, dividing by 5, since 5 times 81 is 405). Note that's a positive value right there, and this pattern will, of course, continue on.
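Here is another sanity check of my own (not part of the lecture): assuming the geometric closed form f(x) = 3/(3 + 2x), ordinary calculus gives f prime of x = negative 6 over (3 + 2x) squared, and the term-by-term derivative series should agree with that inside the interval of convergence.

```python
# Sketch: the term-by-term derivative sum_{n>=1} n * (-2/3)^n * x^(n-1)
# should match -6 / (3 + 2x)^2, the derivative of the closed form 3 / (3 + 2x).

def f_prime_series(x, terms=300):
    return sum(n * (-2 / 3) ** n * x ** (n - 1) for n in range(1, terms))

x = 0.5
print(abs(f_prime_series(x) - (-6 / (3 + 2 * x) ** 2)) < 1e-9)  # True
```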
And so we can do it term by term by term, but what's the general formula of this thing? The general formula we get here is: (negative 2 over 3) to the n, times x raised to the n plus 1, and then you divide all of this by n plus 1. If you don't like that form so much, you can rewrite it one more time as a big fraction: (negative 2) to the n times x to the n plus 1, over 3 to the n times (n plus 1). This gives us a formula for the antiderivative. And so the derivative and antiderivative are defined on that interval of convergence from negative 3/2 to 3/2. So that's all there is to calculating the derivative or antiderivative of a power series. It's fairly simple; it's really just treating it like a big polynomial. But it turns out that observation is actually a powerful one that helps us in forthcoming lectures. We're going to see why it's important to be able to calculate the derivative and antiderivative of a power series, so stay tuned for that in our next lecture. As always, I do want to encourage anyone who's watching these videos: if you have any questions, please, please, please post your questions in the comments section here on YouTube. I'd be happy to answer any questions you have. If you learned something in this video, please push the like button. If you want to see more cool videos about mathematics, subscribe to get updates, or better still, put some comments down below and make some recommendations of things you'd like to learn about, and I'd be happy to make videos like that. Other than that, I'll see you next time, everyone. Keep on calculating. Bye.
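One last sketch of my own to close the loop (again assuming the closed form 3/(3 + 2x), not something stated in the lecture): with the constant C set to 0, the term-by-term antiderivative formula should match the antiderivative of 3/(3 + 2x) that vanishes at x = 0, namely (3/2) times the natural log of (3 + 2x)/3.

```python
import math

# Sketch: with C = 0, the term-by-term antiderivative
# sum_{n>=0} (-2)^n * x^(n+1) / (3^n * (n+1))
# should match (3/2) * ln((3 + 2x) / 3) on (-3/2, 3/2).

def F_series(x, terms=300):
    return sum((-2) ** n * x ** (n + 1) / (3 ** n * (n + 1)) for n in range(terms))

x = 0.5
print(abs(F_series(x) - 1.5 * math.log((3 + 2 * x) / 3)) < 1e-9)  # True
```

Different choices of C just shift the answer by a constant, which is why the formula carries that "plus C" out front.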