An important concept related to the span of a set of vectors is the independence or dependence of that set. A set of vectors is said to be dependent if one or more of the vectors in the set can be expressed as a linear combination of the others; the set is independent otherwise. It's easiest to think of a dependent set as having redundant elements, and it's important to remember that it's the set of vectors that is dependent, not any specific vector. Now, mathematicians are a strange group of people, because one of the things we like is problems, and this concept raises the following one: given a set of vectors V, determine whether it is independent or dependent. So the problem becomes: we want to know whether any of these vectors can be expressed as a linear combination of the others. But which vector? Let's think about that. Again, a little analysis goes a long way. Suppose one of our vectors can be expressed as a linear combination of the others. Then I can write down that equation and rewrite it so that all of the vectors are on the left, and buried somewhere in there is a minus one times the vector we started with. So what can we do next? Something that helps is to remember that consistency counts: it's easier to look at things when they are consistent. In this case, most of the terms on the left-hand side are scalar multiples of a vector; the only anomalous one is the minus one times our chosen vector. But minus one times a vector is itself a coefficient times that vector, so every term on the left-hand side is a scalar multiple of a vector in our set. And that leads us to an important conclusion.
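The rearrangement described here can be written out explicitly. The transcript's on-screen equations aren't shown, so the symbols below (a set $\{v_1, \dots, v_n\}$, with $v_k$ as the vector assumed to be a combination of the others) are my own reconstruction:

```latex
% Suppose v_k is a linear combination of the other vectors in the set:
v_k = a_1 v_1 + \cdots + a_{k-1} v_{k-1} + a_{k+1} v_{k+1} + \cdots + a_n v_n
% Move every term to the left; v_k appears with coefficient -1,
% so every term is a scalar multiple of a vector in the set:
a_1 v_1 + \cdots + a_{k-1} v_{k-1} + (-1)\,v_k + a_{k+1} v_{k+1} + \cdots + a_n v_n = \mathbf{0}
```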
If a vector in our set can be expressed as a linear combination of the others, then we can express the zero vector as a linear combination of the vectors in our set. Now it's important to notice that we reached this conclusion by starting with the assumption that the set was dependent, that is, that one of the vectors could be expressed as a linear combination of the others. But what if we didn't know that already? Well, we might start with our equation once again: a linear combination of the vectors set equal to zero. By inspection, which is to say just looking at the equation, we know there's one solution: make all of the coefficients a_i equal to zero. But what if we have a solution with some a_i not equal to zero? Then buried somewhere in the equation is that nonzero coefficient. So I'll solve the equation for that term, and because the coefficient is not zero, I can multiply by negative one over a_i, which gives me v_i as a linear combination of the other vectors. Now, if I put the two sides together, what I see is this: if I can express a vector as a linear combination of the others, then I can write the zero vector as a linear combination of the vectors in the set, and, here's the important idea, at least one of the coefficients in that combination is not equal to zero. We call this a non-trivial linear combination. Likewise, if I can write the zero vector as a non-trivial linear combination of the vectors in our set, then I can isolate one of those vectors (I don't know which one, but I can definitely isolate one of them) and express it as a linear combination of the others. And this proves a very important theorem in linear algebra: a set of vectors is independent if and only if the only solution to the linear combination equal to zero is to have all of the coefficients equal to zero. This means we can determine whether a set of vectors is independent or dependent by solving a system of equations.
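The converse direction sketched here can also be written out. Again the transcript's equations aren't shown, so this is a reconstruction using the same symbols as before:

```latex
% Start from a nontrivial solution, i.e. some coefficient a_i \neq 0:
a_1 v_1 + a_2 v_2 + \cdots + a_n v_n = \mathbf{0}
% Solve for the term with the nonzero coefficient:
a_i v_i = -\left( a_1 v_1 + \cdots + a_{i-1} v_{i-1} + a_{i+1} v_{i+1} + \cdots + a_n v_n \right)
% Multiply both sides by -1/a_i (legal because a_i \neq 0),
% expressing v_i as a linear combination of the others:
v_i = -\frac{a_1}{a_i} v_1 - \cdots - \frac{a_{i-1}}{a_i} v_{i-1} - \frac{a_{i+1}}{a_i} v_{i+1} - \cdots - \frac{a_n}{a_i} v_n
```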
For example, suppose I have a set of vectors and I want to determine whether the set is dependent or independent. Our theorem says we can decide this by looking for the linear combinations that give us the zero vector, so we try to express zero as a linear combination of the vectors. That gives us an augmented coefficient matrix, and since all of the constants are zeros, we can omit them from the row reduction until the very end of the process. So we row reduce the coefficient matrix, restore the constants (which are still zero), and then use back substitution to solve for x3, x2, and x1. And because the only solution to this linear combination equal to zero is the trivial solution, with all of the coefficients equal to zero, it follows that the vectors must be independent.
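The row-reduction test described above can be sketched in code. The specific vectors from the example aren't shown in the transcript, so the vectors below are my own choices; the procedure is the same: put the vectors in the columns of a matrix, row reduce (omitting the zero constants, since they never change), and check whether every column has a pivot, which happens exactly when the only solution is the trivial one.

```python
from fractions import Fraction

def is_independent(vectors):
    """Return True iff the set of vectors is linearly independent.

    Builds the matrix whose columns are the vectors and row reduces it.
    The set is independent exactly when every column has a pivot, i.e.
    the homogeneous system has only the trivial solution.
    """
    rows = len(vectors[0])
    cols = len(vectors)
    # Matrix with the vectors as columns; exact Fraction arithmetic
    # avoids floating-point round-off during elimination.
    A = [[Fraction(vectors[j][i]) for j in range(cols)] for i in range(rows)]
    pivots = 0
    for col in range(cols):
        # Find a row at or below the current pivot row with a nonzero entry.
        pivot_row = next((r for r in range(pivots, rows) if A[r][col] != 0), None)
        if pivot_row is None:
            continue  # free variable in this column: a nontrivial solution exists
        A[pivots], A[pivot_row] = A[pivot_row], A[pivots]
        # Eliminate the entries below the pivot.
        for r in range(pivots + 1, rows):
            factor = A[r][col] / A[pivots][col]
            A[r] = [a - factor * b for a, b in zip(A[r], A[pivots])]
        pivots += 1
    return pivots == cols

# Example vectors (my own choices, not the ones from the lecture):
print(is_independent([(1, 0, 2), (0, 1, 1), (1, 1, 0)]))  # independent
print(is_independent([(1, 0), (0, 1), (1, 1)]))           # dependent: 3 vectors in R^2
```

Using exact rational arithmetic here is a deliberate choice: with floats, an entry that should be exactly zero after elimination can come out as a tiny nonzero number and produce a spurious pivot.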