Given a set of vectors, we can ask whether they are independent or dependent. Now let's see what we can do with a set of vectors that is dependent.

Suppose we want to determine whether a set of vectors is dependent or independent. Remember, we can decide this by seeing which linear combinations give us the zero vector. So we set a linear combination of the vectors equal to zero, create the coefficient matrix, and row-reduce it. For the four vectors (3, 1, 5), (1, 4, 5), (2, -3, 0), and (7, 6, 15), our system in row echelon form is 3x1 + x2 + 2x3 + 7x4 = 0 and x2 - x3 + x4 = 0. Since x3 and x4 are non-leading (free) variables, we can parameterize them, say x3 = s and x4 = t, and back substitution gives us our solutions. Now if s and t are both zero, then x1, x2, x3, and x4 are all zero. But if s and t are not both zero, we get nontrivial solutions to this equation, and so we can express the zero vector as a nontrivial linear combination of these four vectors. So the vectors in this set are dependent.

This leads to two more definitions. If I have a set of independent vectors that spans the vector space, we say we have a basis for the vector space. And since mathematicians like to count, the next important definition is that of dimension: if I have a basis for my vector space, the dimension of the vector space is just the number of vectors in that basis.

Again, every good definition leads to a problem. Let S be a set of vectors that spans a vector space V. If S is an independent set of vectors, we're done, because then S is a basis, and the dimension is the number of vectors in the basis. But what if my set of vectors is not independent? Then I'll have to figure out which vectors are redundant. So let's do a little analysis. If S is a dependent set, then we know that the equation setting the linear combination equal to zero has solutions with some coefficient ai not equal to zero.
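The row reduction described above can be checked by hand or by machine. Here is a minimal Python sketch, using exact fractions, that row-reduces the coefficient matrix whose columns are the four vectors and reports which columns are leading (basic) and which are free; the helper name `row_echelon` is just illustrative:

```python
from fractions import Fraction

# Coefficient matrix: columns are the vectors (3,1,5), (1,4,5), (2,-3,0), (7,6,15)
A = [[Fraction(x) for x in row] for row in
     [[3, 1, 2, 7],
      [1, 4, -3, 6],
      [5, 5, 0, 15]]]

def row_echelon(M):
    """Forward elimination; returns (echelon form, list of pivot columns)."""
    M = [row[:] for row in M]
    pivots, r = [], 0
    for c in range(len(M[0])):
        # Find a row at or below r with a nonzero entry in column c.
        p = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if p is None:
            continue  # no pivot in this column: it is a free variable's column
        M[r], M[p] = M[p], M[r]
        for i in range(r + 1, len(M)):
            factor = M[i][c] / M[r][c]
            M[i] = [a - factor * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    return M, pivots

E, pivots = row_echelon(A)
free = [c for c in range(4) if c not in pivots]
print(pivots)  # [0, 1]  -> x1, x2 are leading variables
print(free)    # [2, 3]  -> x3, x4 are free, matching the parameterization
```

The second echelon row is a scalar multiple of x2 - x3 + x4 = 0, matching the system quoted above.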
Remember, we typically parameterize these solutions, and our variables split into free variables and basic (leading) variables. Suppose I have a vector that corresponds to a free variable. Because it corresponds to a free variable, I can always choose the parameters so that its coefficient ai is not equal to zero, and, so as not to confuse the issue, I'll set the other free variables to zero. What this means is that we get an equation we can solve for that vector vi.

Let's see how this works. Go back to our example, where we've already determined that the set of vectors is not independent, and in fact we found the parameterized solutions. What if I let s = 1 and t = 0? Then we get an equation that includes the third vector, (2, -3, 0), but not the fourth vector, (7, 6, 15). That means we can solve the equation for the third vector in terms of the first two, and we find that (2, -3, 0) = (3, 1, 5) + (-1)(1, 4, 5).

That worked so well, let's try it again, this time with s = 0 and t = 1. In this case we have an equation that involves the fourth vector but not the third. So we can express zero as a linear combination of the first two vectors together with the fourth vector, and solve for that fourth vector: (7, 6, 15) = 2(3, 1, 5) + (1, 4, 5).

So I can write the vector (2, -3, 0) as a linear combination of (3, 1, 5) and (1, 4, 5), and likewise I can write the vector (7, 6, 15) as a linear combination of (3, 1, 5) and (1, 4, 5). That means any linear combination of the vectors in my set can be expressed as a linear combination of these two vectors alone, and I don't need the other two vectors.
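The two dependence relations just found can be verified directly with componentwise arithmetic. A small Python check (the helper name `comb` is just illustrative):

```python
# The four vectors from the example.
v1, v2 = (3, 1, 5), (1, 4, 5)
v3, v4 = (2, -3, 0), (7, 6, 15)

def comb(c1, u, c2, w):
    """Linear combination c1*u + c2*w, computed componentwise."""
    return tuple(c1 * a + c2 * b for a, b in zip(u, w))

# (s, t) = (1, 0) lets us solve for the third vector:
print(comb(1, v1, -1, v2))  # (2, -3, 0), which is v3
# (s, t) = (0, 1) lets us solve for the fourth vector:
print(comb(2, v1, 1, v2))   # (7, 6, 15), which is v4
```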
And so that means there are two vectors in our basis, (3, 1, 5) and (1, 4, 5), and the span of our original set, that is, the vector space spanned by these vectors, is a two-dimensional vector space.