Let's determine whether the columns of the matrix A are linearly independent, where A has first row 1, -1, -1, second row -1, 2, 4, and third row 2, -4, -7. So we're thinking of the first column as a vector a1 = (1, -1, 2), the second as a2 = (-1, 2, -4), and the third as a3 = (-1, 4, -7). Are these three vectors linearly independent? Is there some nontrivial way to combine them to get the zero vector, or can that only be done with the coefficients 0, 0, 0? To find out, we have to solve the homogeneous system Ax = 0 and determine whether it has nontrivial solutions. That's what we have to ascertain. This goes back to solving a system of linear equations, and strictly speaking we should work with the augmented matrix [A | 0]. But row operations (interchange, replacement, and scaling) never change the zero column, so we often just omit it; arithmetic-wise it never changes, and writing it is a waste of ink in that respect. Looking at the first pivot position, the 1 in the (1, 1) spot, I need to get rid of the numbers below it. Take row 2 plus row 1: -1 + 1 cancels, 2 - 1 is 1, and 4 - 1 is 3. To get rid of the 2 in the third row, take row 3 minus 2 times row 1: 2 - 2 cancels, -4 + 2 is -2, and -7 + 2 is -5, which you see here.
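The two replacement operations above can be sketched numerically. This is just a NumPy illustration of the arithmetic, not anything the lecture requires; the variable names are my own:

```python
import numpy as np

# The matrix A from the example (float entries so in-place row updates work).
A = np.array([[ 1., -1., -1.],
              [-1.,  2.,  4.],
              [ 2., -4., -7.]])

# Clear the entries below the (1,1) pivot:
A[1] = A[1] + A[0]      # row 2 -> row 2 + row 1
A[2] = A[2] - 2 * A[0]  # row 3 -> row 3 - 2*row 1

print(A)
# first column below the pivot is now zero; second row is (0, 1, 3),
# third row is (0, -2, -5), matching the hand computation
```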
We then move to the next pivot position, the (2, 2) entry. It's already a 1, so that's great. To get rid of the -2 below it, take row 3 plus 2 times row 2: -2 + 2 cancels to 0, and -5 + 6 is 1. So that entry is already a 1 as well. Notice that this matrix is now in echelon form. Is it reduced row echelon form? No. But from echelon form we can already determine the nature of the solution set: whether the linear system is consistent or inconsistent, and, when it is consistent, whether the solution is unique or there are multiple solutions. Homogeneous systems are always consistent, so that wasn't the question at hand. A consistent system has multiple solutions exactly when it has a free variable, and free variables come from columns with no pivot in them. But as you can see, there is a pivot in each and every column. Because there are no free variables, there are no multiple solutions to the equation Ax = 0, and because there are no nontrivial solutions, our set of vectors is in fact linearly independent. So the column vectors are linearly independent, and we could see that because there were no free variables in the system. That's all one has to do to check linear independence. If you want to check whether a set of vectors is linearly independent, put those vectors into a matrix as columns and reduce that matrix to any echelon form (RREF is acceptable but not required). If every column has a pivot, then the original set of vectors is linearly independent. If there is a free variable, that is, a column with no pivot in it, then you can continue to solve the system to determine a dependence relation.
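The pivot-counting test above is easy to sketch with SymPy, whose `rref()` returns both the reduced matrix and the pivot-column indices. This is an illustration of the procedure, not part of the lecture:

```python
from sympy import Matrix

A = Matrix([[ 1, -1, -1],
            [-1,  2,  4],
            [ 2, -4, -7]])

# rref() returns (reduced matrix, tuple of pivot-column indices).
R, pivots = A.rref()

# A pivot in every column means no free variables, hence independence.
independent = len(pivots) == A.cols
print(pivots, independent)
```

Here every column turns out to be a pivot column, agreeing with the hand computation.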
Either way, from an echelon form you can determine whether a set of vectors is linearly independent or linearly dependent.
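To illustrate the dependent case mentioned above, here is a made-up example (my own matrix B, not from the lecture) whose third column is the sum of the first two. With a column lacking a pivot, the null space of Bx = 0 is nontrivial, and any nonzero null vector gives a dependence relation among the columns:

```python
from sympy import Matrix

# Hypothetical dependent example: column 3 = column 1 + column 2.
B = Matrix([[ 1, -1,  0],
            [-1,  2,  1],
            [ 2, -4, -2]])

R, pivots = B.rref()
print(pivots)  # fewer pivot columns than columns, so a free variable exists

# nullspace() returns basis vectors for the nontrivial solutions of Bx = 0;
# the entries of each vector are the coefficients of a dependence relation.
for v in B.nullspace():
    print(v.T)
```

Each printed vector x satisfies Bx = 0, i.e., it combines the columns of B nontrivially to give the zero vector.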