So let's take a few more examples of determining whether or not a particular set deserves membership in club vector space. Again, the main thing we need to check is whether the set as described meets all 10 requirements for being a vector space. So let's consider the following set: the polynomials with integer coefficients, using ordinary polynomial arithmetic, as a candidate vector space over R. Remember that this means our scalars are drawn from R, even though our vectors are polynomials.

The first two requirements are closure requirements, and the key property of closure is that a sum or scalar multiple must again be something in our set. So if I add two polynomials with integer coefficients, do I get a polynomial with integer coefficients? The answer is yes, so we are closed under addition. Next we check closure under scalar multiplication: if I take a scalar and multiply it by one of these vectors, do I get another one of these vectors? Here it's important to remember that our scalars come from the set of real numbers, while our vectors are the polynomials with integer coefficients. If I multiply a polynomial with integer coefficients by a real number, I won't necessarily get a polynomial with integer coefficients. So we fail closure under scalar multiplication, and we do not have a vector space.

Well, let's try to expand our set. Rather than the polynomials with integer coefficients, let's look at the polynomials with real coefficients and run through our checklist. First, closure: if I take two polynomials with real coefficients and add them, do I get a polynomial with real coefficients? Yes, so we pass the first test. If we take a scalar, drawn from our set of real numbers, and multiply it by one of our vectors, a polynomial with real coefficients, do we get a polynomial with real coefficients?
And again, the answer is yes, so we pass the second requirement.

Next, we have commutativity: if I take two polynomials with real coefficients, it doesn't make a difference in which order I add them, so addition is commutative. Likewise, if I take three polynomials, it doesn't matter whether I add the second and third together and then add that to the first, or add the first and second together and then add the third. So addition of polynomials is both commutative and associative.

Next, we'll check our zero vector requirement. There is a zero polynomial, namely 0, where if I add 0 to any polynomial, I get the same thing I started with. And every polynomial f has an additive inverse -f, obtained by multiplying each coefficient of f by negative one. Again, it is important to verify not only that these things exist, but that they're also elements of our set.

Multiplication by the scalar 1: well, 1 times f does give us f, because we're using ordinary scalar multiplication. Associativity of scalar multiplication: (ab)f is the same as a(bf), and again this follows because we're using ordinary multiplication. And we have our two forms of the distributive property: does a(f + g) equal af + ag, and does (a + b)f equal af + bf? I think we'd agree both are true, because we're using ordinary polynomial arithmetic. And so this set, as described, does in fact form a vector space.

How about a different example? This one requires a little bit of calculus. Prove or disprove: the set of polynomials, with f + g defined to be the antiderivative of the product fg, where the constant of integration is set to zero, and with c times f(x) defined using ordinary scalar multiplication, is a vector space over R. Now, here it's important to go back to the idea that there are only so many symbols.
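The closure and axiom checks above can be sketched in code. Here is a minimal Python sketch (the coefficient-list representation and the names `poly_add` and `poly_scale` are my own, not from the lecture) that represents a polynomial as a list of coefficients [c0, c1, c2, ...] and spot-checks the requirements on sample polynomials. Keep in mind that passing on a few examples is evidence, not a proof.

```python
def poly_add(f, g):
    """Ordinary polynomial addition: add coefficients term by term."""
    n = max(len(f), len(g))
    f = f + [0] * (n - len(f))
    g = g + [0] * (n - len(g))
    return [a + b for a, b in zip(f, g)]

def poly_scale(c, f):
    """Ordinary scalar multiplication: multiply every coefficient by c."""
    return [c * a for a in f]

# Integer coefficients: closed under addition, but NOT under real scalars.
f_int, g_int = [1, 2], [0, 0, 3]        # 1 + 2x and 3x^2
print(poly_add(f_int, g_int))           # [1, 2, 3]: still integer coefficients
print(poly_scale(0.5, f_int))           # [0.5, 1.0]: no longer integer -- closure fails

# Real coefficients: spot-check the remaining axioms on sample polynomials.
f, g, h = [1.5, -2.0], [0.0, 3.0, 1.0], [2.0, 0.0, 0.0, 4.0]
a, b, zero = 2.0, 3.0, []

assert poly_add(f, g) == poly_add(g, f)                             # commutativity
assert poly_add(poly_add(f, g), h) == poly_add(f, poly_add(g, h))   # associativity
assert poly_add(f, zero) == f                                       # zero vector
assert poly_add(f, poly_scale(-1, f)) == [0.0, 0.0]                 # additive inverse
assert poly_scale(1, f) == f                                        # scalar 1
assert poly_scale(a * b, f) == poly_scale(a, poly_scale(b, f))      # (ab)f = a(bf)
assert poly_scale(a, poly_add(f, g)) == poly_add(poly_scale(a, f), poly_scale(a, g))
assert poly_scale(a + b, f) == poly_add(poly_scale(a, f), poly_scale(b, f))
print("all spot checks pass")
```

Exact values like 0.5 and 1.5 are chosen so the floating-point equality checks are exact; for arbitrary real coefficients you'd compare within a tolerance.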
And in this particular case, the symbol + does not mean ordinary polynomial addition; instead it means the result of a particular function that operates on two polynomials, namely the antiderivative of their product with the constant of integration set equal to zero.

So we'll check our properties. First, closure under addition: if I take two polynomials f and g and add them according to this definition, that the sum is the antiderivative of the product, do I get another polynomial? That seems to be true. The product of two polynomials is a polynomial, and when I take the antiderivative, I get another polynomial. So this set is closed under the defined addition. How about scalar multiplication? Well, here scalar multiplication is defined using ordinary multiplication, so c times f(x) will still be a polynomial, and this set is closed under scalar multiplication.

Next, let's check associativity. Again, an example is not a proof, but it can give us some insight into how a proof would proceed. I want to take three polynomials and add them in different ways using this particular definition of addition and see if we get the same result. So let's consider x^2 + (x^3 + x^4). According to our definition of addition, x^3 + x^4 is the antiderivative of the product x^3 * x^4 = x^7, which is x^8/8, with our constant of integration set equal to zero. Now it's important to remember that this plus still refers to our defined addition. So to find x^2 + x^8/8, I take the antiderivative of that product, x^10/8, which gives x^11/88. What happens if I instead find the sum by adding the first two terms together? The sum x^2 + x^3 is the antiderivative of the product x^2 * x^3 = x^5.
And again, this plus refers to addition by integration of the product, so x^2 + x^3 gives x^6/6. Then adding x^4 to that means taking the antiderivative of (x^6/6) * x^4 = x^10/6, which gives x^11/66.

And here's the thing to notice: if I do the addition of the last two terms first, I get x^11/88, but if I do the addition of the first two terms first, I get x^11/66. The two groupings give different results, so this operation is simply not associative, and any attempt to prove associativity is bound to fail. We fail the associativity requirement, our application for club vector space gets bounced, and we are not a vector space.
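The failed associativity check can be reproduced in code. Here is a minimal Python sketch (the names `poly_mul` and `oplus` are mine, not the lecture's) that uses exact rational arithmetic so the fractions 1/88 and 1/66 come out exactly:

```python
from fractions import Fraction

def poly_mul(f, g):
    """Ordinary polynomial product of coefficient lists [c0, c1, ...]."""
    out = [Fraction(0)] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] += Fraction(a) * Fraction(b)
    return out

def oplus(f, g):
    """The defined 'addition': antiderivative of f*g, constant of integration 0."""
    product = poly_mul(f, g)
    return [Fraction(0)] + [c / (k + 1) for k, c in enumerate(product)]

x2 = [0, 0, 1]           # x^2
x3 = [0, 0, 0, 1]        # x^3
x4 = [0, 0, 0, 0, 1]     # x^4

left = oplus(x2, oplus(x3, x4))     # x^2 + (x^3 + x^4)
right = oplus(oplus(x2, x3), x4)    # (x^2 + x^3) + x^4
print(left[-1])       # 1/88: left is x^11/88
print(right[-1])      # 1/66: right is x^11/66
print(left == right)  # False: associativity fails
```

Both groupings produce a multiple of x^11, but with different leading coefficients, which is exactly the counterexample worked out above.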