An important extension of the notion of independence occurs when we consider our vectors to be functions. So suppose V is a set of functions. How can we find a basis for the span of our set of vectors? A good way to start the analysis is to try to solve a specific example of this kind of problem. So suppose my set V consists of the functions f1(x), f2(x), and f3(x). Do these form a basis for the span of our vectors?

Before we proceed, we need to establish a few ideas. First, since our determination of whether a set of vectors is independent or dependent rests on whether we can find a linear combination equal to zero, it's important to understand that a linear combination of functions is equal to zero only if it equals zero for all values of the variable. For example, 3x + 7 is not zero, even though there are specific values of x that make it equal to zero. On the other hand, a more complicated expression may, after all the dust settles, turn out to be equal to zero regardless of the value of x. That is a linear combination equal to zero.

With this idea in mind, the question at hand is whether there's a nontrivial linear combination a1 f1(x) + a2 f2(x) + a3 f3(x) that equals zero. Here we have to go back to algebra and remind ourselves that a polynomial is zero if and only if all of its coefficients are zero. So to determine whether the coefficients a1, a2, and a3 can make this zero, we expand and collect like terms. After the dust settles, we get the polynomial equation

(a1 + a3)x^2 + (3a1 + 3a2 + 7a3)x + (5a1 + 7a2 + 3a3) = 0,

and we want to solve this system. So I'll set it up as a system of linear equations — each coefficient must equal zero — and try to solve it. Now the easiest way to answer this question is to note that the determinant of the coefficient matrix is a non-zero value, and so this system has exactly one solution.
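As a quick numerical sketch of that last step (helper name `det3` is just for illustration), here is the coefficient matrix read off from the expanded polynomial above, with its determinant computed by hand-rolled cofactor expansion:

```python
# Coefficient matrix read off from the expanded polynomial:
# rows are the x^2, x, and constant coefficients; columns are a1, a2, a3.
A = [
    [1, 0, 1],   # a1 + a3           (x^2 coefficient)
    [3, 3, 7],   # 3a1 + 3a2 + 7a3   (x coefficient)
    [5, 7, 3],   # 5a1 + 7a2 + 3a3   (constant term)
]

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

print(det3(A))  # -34: non-zero, so a1 = a2 = a3 = 0 is the only solution
```

Since the determinant is non-zero, the homogeneous system has only the trivial solution.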
And by inspection, we can see that that solution has to be a1 = a2 = a3 = 0. Going back to the original problem, this means that the only linear combination that gives us zero is the trivial linear combination, and so the vectors are independent.

Now this approach works on polynomials, but even then it takes a lot of effort, and so an important question to ask yourself is: self, can we make this more efficient? Well, since we're trying to solve "linear combination equals zero" for a1, a2, and a3, one thing we might do is look for other equations with the same solutions. And here's where we might invoke a little calculus. Take the original equation. Because a1, a2, and a3 are constants, differentiation won't change them; on the other hand, differentiation will change the functions and give us a new equation. So if I differentiate with respect to x, I get a new equation involving a1, a2, and a3. But wait, there's more. Remember, anything you can do once, you can do as many times as you'd like. If I want another equation, I can just differentiate again, and this gives me a third equation, again involving our unknowns a1, a2, and a3.

What's important to recognize here is that the unknowns — the things we don't know — are the values a1, a2, and a3. The coefficients are everything else: whatever multiplies a1, a2, and a3 in the first equation, in the second equation, and in the third equation — that is, the functions and their first and second derivatives — and these form our coefficient matrix. What we'd like is for the original equation to have a unique solution, which means the system of equations must have a unique solution, and that happens when the determinant is not equal to zero. So let's check out that determinant.
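As a sketch of this determinant-by-differentiation idea — assuming the three functions consistent with the coefficients above are f1(x) = x^2 + 3x + 5, f2(x) = 3x + 7, and f3(x) = x^2 + 7x + 3 — we can build the rows from the functions and their first two derivatives (written out by hand) and evaluate the determinant at any x we like:

```python
def wronskian3(fs, dfs, d2fs, x):
    """3x3 determinant whose rows are the functions and their first two derivatives."""
    rows = [[f(x) for f in fs], [f(x) for f in dfs], [f(x) for f in d2fs]]
    (a, b, c), (d, e, f_), (g, h, i) = rows
    return a * (e * i - f_ * h) - b * (d * i - f_ * g) + c * (d * h - e * g)

# The three polynomials and their derivatives, differentiated by hand.
fs   = [lambda x: x**2 + 3*x + 5, lambda x: 3*x + 7, lambda x: x**2 + 7*x + 3]
dfs  = [lambda x: 2*x + 3,        lambda x: 3,       lambda x: 2*x + 7]
d2fs = [lambda x: 2,              lambda x: 0,       lambda x: 2]

print(wronskian3(fs, dfs, d2fs, 0.0))  # 68.0 — and the same at every other x
```

The x-dependent terms cancel, so the determinant is the constant 68 for every x.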
We note that the third row has constants and a zero, so we might try to expand along that third row. Expanding our determinant along the third row gives a rather daunting expression, but after a bit more polynomial algebra we find that the determinant is equal to 68. And because the determinant is not equal to zero, the original equation has a unique solution.

The preceding suggests a general way to approach the problem of determining whether a set of functions is independent. Given a set of n functions, we want to determine whether "linear combination equals zero" has a unique solution, in which case our set of functions is a set of independent vectors, and we can differentiate repeatedly to form a system of n equations in our unknowns a1 through an. If the determinant of the coefficient matrix is non-zero, then the original system has a unique solution and our vectors are independent. The determinant itself is known as the Wronskian.

For example, suppose I have the set of functions x^2 + 2x + 5, 2x + 7, and x^2 - 2x - 9, and we'd like to know if this forms a basis. In particular, are these independent functions? So we want to know whether some nontrivial linear combination of these functions gives us the value zero. We set up our equation, and since there are three unknowns a1, a2, and a3, we need three equations, which we can find by differentiation. Whether our original equation has nontrivial solutions depends on whether there is a unique solution to the system of equations, and we can answer that question by finding the determinant of the coefficient matrix — which will in fact be zero. Here the dependence can be checked directly: the third function is the first minus twice the second. So our original equation has nontrivial solutions and our vectors are not independent. Consequently, V does not form a basis.

If the vectors in our set are polynomial functions, we don't actually need to use the Wronskian.
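A quick check of this example — taking the third function to be x^2 - 2x - 9, the version for which the determinant actually vanishes — confirms both that the Wronskian is zero at every sample point and that the dependence f3 = f1 - 2·f2 holds:

```python
f1 = lambda x: x**2 + 2*x + 5
f2 = lambda x: 2*x + 7
f3 = lambda x: x**2 - 2*x - 9

def wronskian(x):
    # Rows: the functions, their first derivatives, and their second
    # derivatives (all differentiated by hand).
    rows = [[f1(x), f2(x), f3(x)],
            [2*x + 2, 2, 2*x - 2],
            [2, 0, 2]]
    (a, b, c), (d, e, f), (g, h, i) = rows
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

print([wronskian(x) for x in (0, 1, -2, 5)])  # [0, 0, 0, 0]
# The dependence itself: f3 = f1 - 2*f2 at every point.
print(all(f3(x) == f1(x) - 2 * f2(x) for x in range(-5, 6)))  # True
```

Because the third column of the Wronskian matrix is a fixed linear combination of the first two, the determinant is identically zero.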
And that's because a linear combination of polynomials equal to zero corresponds to a system of equations that we can solve, albeit with some difficulty. A more important use of the Wronskian occurs when our functions are not polynomials. So, for example, let's take the set of functions sin 2x, cos 3x, and e^x, and see if these form a set of independent vectors. We set up our system of equations by differentiation, and the determinant of the coefficient matrix is our Wronskian.

Now, calculating the determinant of a matrix can be difficult, especially if the matrix is sizable. We can simplify the problem using a power tool, provided we take certain safety precautions — and the most important safety precaution is making sure that you understand why you're doing something. In this case, we're interested in the determinant of the Wronskian because it tells us whether a nontrivial linear combination of these vectors produces zero. But since these vectors are functions, a linear combination equal to zero means that for every value of x, that linear combination gives us zero — and so every value of x gives us a determinant of zero. Turning that around: if even a single value of x produces a non-zero determinant, then the original equation has only trivial solutions. What this means is that if we can find any value of x that gives a non-zero determinant, we can conclude that our original vectors are in fact independent.

So rather than computing the determinant of the full Wronskian, we might note that at x = 0 our determinant simplifies, and we can calculate the determinant of this simpler expression. And since that determinant is not equal to zero, the system has a unique solution and our vectors are independent.
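The evaluate-at-a-single-point shortcut can be sketched like this: we write out the derivatives of sin 2x, cos 3x, and e^x by hand, evaluate the Wronskian at the convenient point x = 0, and get a non-zero number, which is all we need:

```python
import math

def wronskian(x):
    """Wronskian of sin(2x), cos(3x), e^x; derivatives written out by hand."""
    rows = [
        [math.sin(2*x),    math.cos(3*x),    math.exp(x)],   # the functions
        [2*math.cos(2*x),  -3*math.sin(3*x), math.exp(x)],   # first derivatives
        [-4*math.sin(2*x), -9*math.cos(3*x), math.exp(x)],   # second derivatives
    ]
    (a, b, c), (d, e, f), (g, h, i) = rows
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

print(wronskian(0.0))  # -20.0: non-zero at one point, so the functions are independent
```

At x = 0 the matrix collapses to small integer entries, which is exactly why that point makes the hand computation easy as well.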