Consider now the vectors v1 = (1, 2, 3), v2 = (3, 1, -2), and v3 = (-3, 4, 13). We want to determine whether or not this set of vectors is linearly independent. Remember what that means: we're asking whether we can find scalars c1, c2, c3 so that c1 v1 + c2 v2 + c3 v3 equals the zero vector. Now, there's always one way of doing that, right? We can always just slap in the scalars zero, zero, zero. That gives us the zero vector, but that's the trivial way of doing it. Can we produce the zero vector in a non-trivial way? If we can't, we say the vectors are linearly independent. If there is some non-trivial way of doing it, we say the vectors are linearly dependent.

Now, suppose we form a matrix A whose column vectors are exactly v1, v2, v3. Trying to find a combination that equals zero is the same as looking for a vector x so that Ax equals the zero vector. This matrix equation represents a homogeneous system of equations, and we're trying to figure out what its solutions look like. We don't really care whether it's consistent, because we already know it is: we have, for example, the trivial solution x = 0. We're trying to figure out whether there are non-trivial solutions, which comes down to asking whether this linear system has multiple solutions. At the beginning of this course, we called that the dependent case, and this is actually why we call it that: Ax = 0 has multiple solutions precisely when the column vectors in play are linearly dependent.

So, are there non-trivial solutions to this system of equations? To work that out, we set up an augmented matrix whose coefficient part is just the matrix A from above: its first column is v1, its second column is v2, its third column is v3. To be proper, we should augment this with the zero vector (0, 0, 0). You'll notice that I originally omitted that; we'll see in a moment why.

Solving this system, we look at the (1,1) position, and there's already a one there, which is great. To get rid of the two below it, we take row two minus two times row one, and to get rid of the three below that, we take row three minus three times row one. In row two: two minus two is zero, one minus six is negative five, four plus six is ten, and zero minus zero is still zero. In row three: three minus three is zero, negative two minus nine is negative eleven, thirteen plus nine is twenty-two, and zero minus zero is again just zero. Notice that had we kept the augmented column of zeros at the end, it would not have changed under these row operations.
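If you'd like to check that arithmetic by machine, here is a minimal sketch, assuming Python with SymPy (my choice of tool; the lecture itself works entirely by hand), that builds A from the columns v1, v2, v3 and applies the same two row replacements:

```python
from sympy import Matrix

# Columns are v1 = (1, 2, 3), v2 = (3, 1, -2), v3 = (-3, 4, 13).
A = Matrix([
    [1,  3, -3],
    [2,  1,  4],
    [3, -2, 13],
])

# The two row replacements from the lecture:
# R2 -> R2 - 2*R1, then R3 -> R3 - 3*R1.
A[1, :] = A.row(1) - 2 * A.row(0)
A[2, :] = A.row(2) - 3 * A.row(0)

print(A)  # Matrix([[1, 3, -3], [0, -5, 10], [0, -11, 22]])
```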
Moving on to the next pivot position, notice that in the second row everything is divisible by five: we have a negative five, a ten, and a zero. If we divide through by negative five to get a one in that spot, that's scaling by negative one fifth. And while I'm at it, looking at the third row, everything there is divisible by eleven, so I'll divide row three by negative eleven. What then happens is the negative five becomes a one and the ten becomes a negative two. And what happens to the zero? Zero divided by negative five is still a zero. Likewise, negative eleven divided by negative eleven is a one, twenty-two divided by negative eleven is a negative two, and zero divided by negative eleven is still a zero. We don't do anything to the first row.

So notice: when we did row replacements, the zeros didn't change, and when we did scaling, the zeros in that last column didn't change. If for some reason we had to do the interchange operation, because we wanted to swap rows, we'd just be swapping a zero with a zero, and the zeros still wouldn't change. One thing to observe here is that when you're working with a homogeneous system of equations, the three row operations will never change a column of zeros into anything other than a column of zeros. As such, we often just omit that augmented column for homogeneous systems, since we know nothing is going to happen in that last column; arithmetic-wise, it's not offering us anything.

To get rid of the one below our current pivot position, we take row three minus row two, and that gives us the matrix you see right here. The first and second columns are pivot columns, and we now have a row of zeros, which, if we throw the augmented column back in, just says that zero equals zero. There's no problem with that; the system is consistent, but consistency was never the issue. What matters is that the third column lacks a pivot, so this non-pivot column produces a free variable. And we've seen in the past that a consistent system has multiple solutions exactly when it has a free variable. So this system does in fact have multiple solutions, and any solution other than the zero vector itself is a non-trivial solution to the homogeneous system. Since non-trivial solutions exist, the set of vectors is linearly dependent.

So basically: when you have a set of vectors, write them as the columns of a matrix and row reduce that matrix. If the result has any non-pivot columns, the set of vectors is linearly dependent. If every column turns out to be a pivot column, the vectors are linearly independent. We'll see an example of that sometime in the next video.
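To illustrate that pivot-column test in code, here is a short sketch, again assuming SymPy, that reduces A all the way and reads off which columns have pivots:

```python
from sympy import Matrix

A = Matrix([
    [1,  3, -3],
    [2,  1,  4],
    [3, -2, 13],
])

# rref() returns the reduced row echelon form together with
# the indices of the pivot columns.
R, pivot_cols = A.rref()
print(R)           # Matrix([[1, 0, 3], [0, 1, -2], [0, 0, 0]])
print(pivot_cols)  # (0, 1): column 3 has no pivot, so x3 is free.

# Fewer pivot columns than columns means non-trivial solutions exist,
# i.e., the column vectors are linearly dependent.
print(len(pivot_cols) < A.cols)  # True
```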
Now, since we have a set of vectors which are linearly dependent, it becomes relevant to look for a linear dependence relation: can we actually exhibit coefficients, not all zero, that combine the vectors to produce the zero vector? To find one, we continue to row reduce our matrix. The matrix we had was in echelon form, but not reduced row echelon form. To finish, we take row one minus three times row two: the three in the second entry becomes a zero, and negative three plus six makes the third entry a positive three.

And so this matrix right here is our RREF, the reduced row echelon form. Let's translate it back into a system of linear equations. In this example I've been omitting the augmented column, because it's the zero column, which never changes under these row operations; but since this is a homogeneous system, we should picture a column of zeros on the right. So the first row, one zero three, means 1 x1 + 0 x2 + 3 x3 = 0. The next row, zero one negative two, tells us that 0 x1 + 1 x2 - 2 x3 = 0. And the last row of zeros is just the equation zero equals zero, which we don't need whatsoever.

Now take the two equations in hand. Because there was no pivot in the third column, x3 is our free variable, and we solve for the dependent variables in terms of it. When we do that, we end up with x1 = -3 x3 and x2 = 2 x3, with x3 free. So we can set x3 to be whatever we want. If we set x3 to zero, then x1 and x2 are likewise zero, and we've just reproduced the trivial solution. But what if we take x3 to be something non-zero? If we set x3 equal to one, then x1 is negative three and x2 is two, and I claim that this is a non-trivial solution, and it gives us a dependence relation.

We take negative three times the first vector, which, remember, was (1, 2, 3); two times the second vector, which you don't see on the screen here, but it was (3, 1, -2); and one times the third vector, which, recall, was (-3, 4, 13). When we linearly combine the three vectors using these coefficients, the first becomes (-3, -6, -9), the second becomes (6, 2, -4), and the third, being multiplied by one, is just the same vector again, (-3, 4, 13). Adding these together component-wise: negative three plus six minus three, and I'll just stop you right there, that's a zero. Negative six plus two plus four is also zero. And negative nine minus four plus thirteen, well, nine and four give us thirteen, oh, looky there, so that's zero as well. So this does in fact give us a non-trivial way to combine the vectors: negative three v1 plus two v2 plus one v3 equals the zero vector. This dependence relation is evidence that the vectors were linearly dependent: we were able to combine v1, v2, v3 using coefficients that are not all zero and produce the zero vector, and therefore the vectors are linearly dependent.
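As one last sanity check, here is a tiny sketch, SymPy again by assumption, verifying that the coefficients -3, 2, 1 really do combine v1, v2, v3 into the zero vector:

```python
from sympy import Matrix

v1 = Matrix([1, 2, 3])
v2 = Matrix([3, 1, -2])
v3 = Matrix([-3, 4, 13])

# The dependence relation found above: -3*v1 + 2*v2 + 1*v3.
combo = -3 * v1 + 2 * v2 + 1 * v3
print(combo)                       # Matrix([[0], [0], [0]])
print(combo == Matrix([0, 0, 0]))  # True: a non-trivial combination giving zero
```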