Let us continue our discussion on dimensions, in particular dimensions of subspaces. I would like to prove a few results today on computing the dimensions of certain subspaces, but before that let me give you one or two general results which were hinted at a couple of lectures ago, when we discussed the notion of linear independence. Let me start with this result: I have a square matrix A with real entries, of order n; if the columns of A are linearly independent, then the matrix A is invertible. You might remember that we made use of this fact in a numerical example; let us now prove the general result. Actually there is a corresponding result for the rows: if the rows of a square matrix A are linearly independent, then it is invertible, and that is a result which you could prove using this one, so I will not state it separately; I will only prove this result. Let me use superscripts 1, 2, 3, etc. to denote the columns of A, that is, I write A as (A1, A2, A3, ..., An). What is given is that these columns are linearly independent; we must show that A is invertible. We will make use of a result that we proved some time ago: a matrix A is invertible if and only if the non-homogeneous equation A x = b has a solution for all b. So let us consider the system A x = E1, where E1 is the first standard basis vector of R^n; it is the column vector whose first coordinate is 1 and all other entries are 0. The claim is that the above system has a solution. We will prove this claim. Once we have this we can change the right hand side: what we will show is that if the columns are linearly independent, then this system has a solution, the next system A x = E2 has a solution, then A x = E3, and so on up to A x = En. All these n systems have a solution. We will prove that.
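To make the hypothesis concrete, here is a small sketch in plain Python (my own illustration; the matrix `A` below and the helper names are made up, not from the lecture) that tests whether the columns of a square matrix are linearly independent, by row reducing over the rationals and counting pivots.

```python
from fractions import Fraction

def rank(rows):
    """Gaussian elimination over the rationals; the rank is the number of pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0  # index of the next pivot row
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def columns_independent(A):
    """The columns of A are linearly independent iff rank(A) equals their number."""
    return rank(A) == len(A[0])

A = [[2, 1],
     [1, 1]]  # columns (2,1) and (1,1): neither is a multiple of the other
print(columns_independent(A))  # True
```

By the result we are about to prove, a True answer for a square matrix already tells us that the matrix is invertible.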
Once we prove that these n systems have a solution, it will follow that A is invertible; I will give the details. But before that, how do we show that the system has a solution? I have this right hand side vector E1, a vector in R^n. Let us now recall what we proved in the last lecture: if I have a linearly independent subset of a vector space having the same number of elements as the dimension of the space, then this linearly independent subset must be a basis. The column vectors A1, A2, ..., An are linearly independent, and there are n of them in R^n, so they form a basis of R^n; in particular they are a spanning set, and any vector in R^n can be written as a linear combination of these vectors. So for E1 there exist scalars, I will call them x1, x2, ..., xn, such that E1 = x1 A1 + x2 A2 + ... + xn An. Now let us rewrite this using the matrix A. Can you see that this is (A1, A2, ..., An) times the column vector (x1, x2, ..., xn)? This can be easily verified: the product is x1 A1 + x2 A2 + ... + xn An, which is exactly this expression. But this matrix is my A, so the expression is equal to A x. What I have shown is that A x = E1, that is, the system A x = E1 has a solution; that comes from the fact that E1 belongs to the span of the column vectors of A. What I have done for E1 can be done for E2, E3, etc. Let me simply say: similarly, A x = Ei has a solution for 1 ≤ i ≤ n; with i varying from 2 to n, all these systems have a solution. What I will do is denote the solution of the first system by x^1 and the solution of the ith system by x^i. I will collect the column vectors x^1, ..., x^n and denote that by capital X.
Let capital X = (x^1, x^2, ..., x^n). Please observe that for a given vector its components are denoted using subscripts, while different vectors are denoted using superscripts: E1, E2, etc. appearing here are different vectors, and the components of x^1, say, are x1, x2, ..., xn. So I have a matrix now; it belongs to R^(n x n). Remember that each of these is a column vector, so this is an n x n matrix. Now consider A X, A multiplied with the matrix X. It is A times (x^1, x^2, ..., x^n), and using matrix multiplication you can verify that this is the same as (A x^1, A x^2, ..., A x^n). But this is (E1, E2, ..., En): E1 is (1, 0, ..., 0), E2 is (0, 1, 0, ..., 0), and so on. So can you see that this is the identity of order n? What we have shown is that this matrix X satisfies the equation A X = I. Again go back to a result that we proved: if I have a square matrix which has either a left inverse or a right inverse, then it must be invertible. So the conclusion follows: A has a right inverse, and since A is square, A is invertible. This is one of the results which I thought you must know. So let us move on. I want to discuss the formula for the dimension of a sum of two subspaces. Let me give a motivating example and then prove the general result. Let us look at the case of R^2: I have a horizontal axis, a vertical axis and the origin. Let us look at a line like this, the so-called y = x; I will write it as x2 = x1. Let me take some other line, say x2 = 2 x1, something like this; here x2 is the height. Both lines pass through the origin. Let us call the first one w1; what I know is that any line passing through the origin is a subspace.
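The construction in this proof can be carried out numerically. The sketch below (my own illustration, with a made-up 2 x 2 matrix and helper names) solves A x = Ei for each standard basis vector by Gauss-Jordan elimination, assembles the solution vectors x^i as the columns of X, and checks that A X = I, exactly as in the argument above.

```python
from fractions import Fraction

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination (A assumed square and invertible)."""
    n = len(A)
    M = [[Fraction(x) for x in row] + [Fraction(v)] for row, v in zip(A, b)]
    for c in range(n):
        p = next(i for i in range(c, n) if M[i][c] != 0)  # pivot row
        M[c], M[p] = M[p], M[c]
        M[c] = [x / M[c][c] for x in M[c]]                # normalise pivot to 1
        for i in range(n):
            if i != c and M[i][c] != 0:
                M[i] = [a - M[i][c] * v for a, v in zip(M[i], M[c])]
    return [row[n] for row in M]

A = [[2, 1],
     [1, 1]]  # columns are linearly independent
n = len(A)
# Solve A x = E_i for each standard basis vector E_i of R^n.
cols = [solve(A, [1 if j == i else 0 for j in range(n)]) for i in range(n)]
X = [[cols[j][i] for j in range(n)] for i in range(n)]  # x^i becomes column i of X
AX = [[sum(A[i][k] * X[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
print(AX == [[1, 0], [0, 1]])  # True: X is a right inverse of A
```

Since A is square and X is a right inverse, the result quoted in the lecture gives that A is invertible.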
So this w1 is a subspace, and the other line w2 is a subspace. I want to look at w1 + w2. The dimension of R^2 is 2, the dimension of w1 is 1, the dimension of w2 is 1. What we observe here is that dim(w1 + w2) = dim w1 + dim w2: in this particular example, I am claiming, w1 + w2 has dimension 2, that is, it is the whole of R^2. What is the reason for that? The reason is as follows. I must show that w1 + w2 has a basis consisting of 2 elements; then it will follow that w1 + w2 is 2-dimensional, that is, equal to R^2. Take any nonzero vector lying on w1, call that u, and take another nonzero vector on w2, call that v. Since they are on 2 different lines, one is not a multiple of the other, because if one were a multiple of the other they would lie on the same line. So u and v are linearly independent, and u and v span w1 + w2. Is that obvious? Anything in w1 can be written as a multiple of u, anything in w2 can be written as a multiple of v, so anything in w1 + w2 can be written as a linear combination of u and v. And so w1 + w2 is R^2. In this example the sum of the two subspaces is equal to the original space that we started with, and what we observe is that dim(w1 + w2) = dim w1 + dim w2. We will look at a generalization of this particular example in a general vector space, not necessarily R^2, but this formula does not always hold in a general vector space. What we have not made use of is the fact that the intersection is the singleton {0}: in this example w1 ∩ w2 = {0}. So we will show that the formula holds if w1 ∩ w2 = {0}. We want to prove the result that generalizes this equation; first let me take the case when the intersection is {0}. So I would like to prove the following result.
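The R^2 picture can be made concrete with a small sketch (my own example: u = (1, 1) on the line x2 = x1 and v = (1, 2) on the line x2 = 2 x1); it writes an arbitrary vector as a combination of u and v by solving the 2 x 2 system with Cramer's rule.

```python
from fractions import Fraction

def decompose(u, v, w):
    """Write w = a*u + b*v for two non-proportional vectors u, v in R^2 (Cramer's rule)."""
    det = u[0] * v[1] - u[1] * v[0]
    assert det != 0, "u and v lie on the same line through the origin"
    a = Fraction(w[0] * v[1] - w[1] * v[0], det)
    b = Fraction(u[0] * w[1] - u[1] * w[0], det)
    return a, b

u = (1, 1)  # a nonzero vector on w1: x2 = x1
v = (1, 2)  # a nonzero vector on w2: x2 = 2*x1
a, b = decompose(u, v, (3, 5))
print(a, b)  # 1 2, since (3, 5) = 1*(1, 1) + 2*(1, 2)
```

The nonzero determinant is exactly the statement that u and v are linearly independent, so every vector of R^2 decomposes this way and w1 + w2 = R^2.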
Let w1 and w2 be subspaces of a finite dimensional vector space V such that w1 ∩ w2 = {0}. Remember that w1 ∩ w2 must be a subspace, since the intersection of two subspaces is a subspace, so it must contain at least the 0 vector; here it contains precisely the 0 vector. Then we have the following formula: dim(w1 + w2) = dim w1 + dim w2. We will later prove a result which is a generalization of this, but for that we need the notion of extending a basis for a subspace to a basis for the entire space, so I will give that a little later. So I want to prove this result. Let us observe that everything here is well defined: w1 + w2 was defined earlier and we know that it is a subspace, so one can talk about its dimension, and we also know that the dimension of a subspace cannot exceed the dimension of V, which is finite dimensional. Similarly the other two numbers are well defined integers, since w1 and w2 are subspaces. How do we prove it? The proof is probably along expected lines. The idea is as follows: take a basis for w1, take a basis for w2, and simply join them; that will turn out to be a basis for w1 + w2 in this case, because the intersection is the singleton {0}; otherwise it would not be. So let B1 = {u1, u2, ..., ul} be a basis of w1, and let B2 = {v1, v2, ..., vk} be a basis of w2. I have taken a basis for w1 and a basis for w2, so I know the dimensions: dim w1 = l and dim w2 = k. What is the claim? Let me call B the union of these two bases, B = B1 ∪ B2 = {u1, u2, ..., ul, v1, v2, ..., vk}. The claim is that this is a basis of w1 + w2.
To prove that this is a basis of w1 + w2, we need to verify two conditions: that it is linearly independent and that it is a spanning subset. Let us verify that it is a spanning subset; that is almost immediate. Take z in w1 + w2. Then there exist x in w1 and y in w2 such that z = x + y; that is the definition of w1 + w2. Now x is in w1 and w1 has B1 as a basis, so x is a linear combination: x = β1 u1 + β2 u2 + ... + βl ul, where of course u1, u2, ..., ul belong to w1, coming from the basis. Similarly y is in w2, so it is a linear combination of v1, v2, ..., vk: y = γ1 v1 + γ2 v2 + ... + γk vk, where of course v1, v2, ..., vk come from w2, just to emphasize. Then look at z: z = x + y = β1 u1 + ... + βl ul + γ1 v1 + ... + γk vk. No further interpretation of the right hand side is needed; I simply observe that it belongs to span B, being a linear combination of the vectors of B. So what we have shown is that B is a spanning set: I started with an arbitrary z in w1 + w2 and showed that it is in span B, so w1 + w2 = span B. The last step then is to show that B is linearly independent, that is, that u1, u2, ..., ul, v1, ..., vk are linearly independent vectors. To show that, you must consider a linear combination, equate it to 0, and show that the scalars are 0. Remember that till now we have not made use of the fact that w1 ∩ w2 = {0}; we will use that now. Consider a linear combination α1 u1 + α2 u2 + ... + αl ul + δ1 v1 + ... + δk vk and equate it to 0. Now the first part is a vector in w1; I will call it u. It is a linear combination of u1, u2, ..., ul, which are in w1.
So the first part is a vector in w1, and the second part is a vector in w2; I will call it v. So what I have is u + v = 0, that is, u = -v. But u is in w1 and v is in w2: on the left hand side I have a vector in w1, on the right hand side a vector in w2, and they are equal, so u, and equally v, belongs to w1 ∩ w2, which I know is {0}. So u = 0. But go back and see what u is: u = α1 u1 + α2 u2 + ... + αl ul. This is 0, and now I make use of the fact that u1, u2, ..., ul form a basis, so they are linearly independent. It follows that α1, α2, ..., αl must all equal 0. So I have taken care of the first part: the coefficients corresponding to the first part are 0. Similarly v = 0, because v = -u, and since v is 0 it follows by a similar argument that δ1 = δ2 = ... = δk = 0. So what we have done is start with a linear combination of the vectors u1, ..., ul, v1, ..., vk equated to 0, and shown that this holds only if all the scalars are 0. So the vectors are linearly independent, and B is a basis of the sum w1 + w2. Now you see the formula holds. Yes? Sir, can you repeat why u = -v belongs to w1 ∩ w2? Certainly: u belongs to w1 and v belongs to w2, and u = -v. Now v is already in w2; since v = -u and u belongs to w1, and w1 is a subspace, -u belongs to w1, so v belongs to w1 also. So v belongs to w1 ∩ w2, and so does u, because w1 ∩ w2 is again a subspace: if a vector is in a subspace, its negative is also in that subspace.
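With the theorem proved, we can sanity-check it on a concrete example (mine, not from the lecture): in R^3 take w1 spanned by (1, 1, 0) and (0, 0, 1), and w2 spanned by (1, -1, 0); the intersection is {0}, and joining the bases gives a basis of w1 + w2, so the dimensions add.

```python
from fractions import Fraction

def rank(rows):
    """Gaussian elimination over the rationals; the rank is the number of pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

B1 = [[1, 1, 0], [0, 0, 1]]  # basis of w1
B2 = [[1, -1, 0]]            # basis of w2; here w1 ∩ w2 = {0}
dim_W1, dim_W2 = rank(B1), rank(B2)
dim_sum = rank(B1 + B2)      # the joined bases span w1 + w2
print(dim_W1, dim_W2, dim_sum)  # 2 1 3: the dimensions add
```

The rank of the joined list being the sum of the individual ranks is exactly the linear independence of the union B1 ∪ B2 established in the proof.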
I hope it is clear. Okay, how does the formula hold? I have written down the explicit basis B of w1 + w2 in terms of the bases of w1 and w2, namely u1, u2, ..., ul, v1, v2, ..., vk, and we have shown that this is a basis of w1 + w2. So dim(w1 + w2) = l + k; but l is the dimension of w1 and k is the dimension of w2, since u1, ..., ul is a basis for w1 and v1, ..., vk is a basis for w2. So I have this equation. There is a more general formula; we will prove it a little later, because we need a small result before that. Now what is the result that we will need to prove this general formula? Let me state it. Let V be a finite dimensional vector space and S a linearly independent subset of V. Then S is part of a basis of V, that is, S can be extended to a basis of V. Any linearly independent subset of a finite dimensional vector space can be extended to a basis of V. What is the meaning of this? The meaning is: if S is a linearly independent subset of a finite dimensional vector space V, then there exists a basis script B of V such that S is contained in B. The proof will make use of a result that we proved earlier. What is given is that S is linearly independent. If span S = V, then there is nothing to prove. Why? If span S = V, then S is a spanning subset of V which is also linearly independent, and so S must be a basis. In this case, that is, when the span of S is the whole of V, there is nothing to prove: S is a basis, so trivially S is part of a basis of V.
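In R^n the extension in this statement can be carried out by a greedy procedure, which is essentially the proof we are about to see: keep appending a vector outside the current span. Here is a small sketch (my own illustration; it restricts the candidate vectors to the standard basis vectors, which is enough in R^n, since if the current span is not all of R^n then some standard basis vector lies outside it).

```python
from fractions import Fraction

def rank(rows):
    """Gaussian elimination over the rationals; the rank is the number of pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def extend_to_basis(S, n):
    """Extend a linearly independent list S of vectors in R^n to a basis of R^n.
    The standard basis vector e_i is appended exactly when it lies outside the
    span of what we have so far, i.e. when appending it raises the rank."""
    B = [list(v) for v in S]
    for i in range(n):
        e = [1 if j == i else 0 for j in range(n)]
        if rank(B + [e]) > rank(B):
            B.append(e)  # e_i is not in span(B), so B stays independent
    return B

B = extend_to_basis([[1, 1, 0]], 3)
print(B)  # [[1, 1, 0], [1, 0, 0], [0, 0, 1]]: a basis of R^3 containing S
```

Each appended vector raises the dimension of the span by 1, so the loop produces exactly n independent vectors; this is the termination argument of the proof in miniature.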
If span of S is not equal to V, let us consider that case. Span S is a subspace contained in V; it not being equal to V means there is a vector in V which is not in span S. So there exists a vector, I will call it x1, in V such that x1 does not belong to span S. Now let me write S explicitly: S = {u1, u2, ..., ul}, and I know that this is linearly independent. I am now appending x1 to this set: consider S1 = S ∪ {x1} = {u1, u2, ..., ul, x1}. We have encountered this situation before: since x1 does not belong to span S, it follows that this set S1 is linearly independent. If S1 were linearly dependent, then some vector in it would be a linear combination of the preceding vectors; that cannot happen for u1, u2, ..., ul, because they are already linearly independent, so the only way S1 could be linearly dependent is that x1 is a linear combination of u1, ..., ul, and that does not happen because x1 does not belong to span S. So S1 is linearly independent. Then it is like going back to the start: if span S1 = V we are done, for it would then follow, as before, that S1 is a basis of V, and what we want to show is that S can be extended to a basis; it is clear by the construction that S is contained in S1.
So if S1 is a basis, then S is contained in S1, and the linearly independent subset S that we started with is a subset of the basis S1. If span S1 is not equal to V, we repeat the procedure. But why should this process terminate? That is the question; let me go back and write down the inequalities for the dimensions. In the first case, dim(span S) = dim V, so the subspace span S equals V and S is a basis; in that case we know there is nothing to prove. We are looking at the case when span S is not equal to V: it is a proper subspace, not the whole space, so its dimension must be strictly less than dim V. In the next step the dimension has increased by 1: S1 has one element more than S, and since x1 lies outside span S, dim(span S1) = dim(span S) + 1. These dimensions are non-negative integers, so from the previous step we have bridged the gap by one integer. It is quite possible that span S1 is now equal to V, but if it is not, the dimension of the next subspace will again be one more. So this process has to terminate, because V is finite dimensional: the above process must terminate after at most dim V steps; in fact after at most dim V - dim(span S) steps, but in any case it does not exceed this number. Since this number is finite and every time you are increasing the dimension by 1, the procedure must stop. So at the end of the procedure you have a subset, let
us say Sk, which is a basis. Let me just conclude: at the end of the process we obtain a basis Sk of V such that S is contained in Sk, which is what we wanted to prove, namely that S is part of a basis of V. So let us make use of this result. There is a counterpart to this result which I will probably leave as an exercise. The counterpart is as follows; remember what this process informally means: you can start with a linearly independent subset of a vector space and, if the vector space is finite dimensional, go from this linearly independent subset to a basis, adding one vector at a time. A maximal linearly independent subset is a basis, and a minimal spanning set is also a basis; so a maximal linearly independent set as well as a minimal spanning set must be a basis of a finite dimensional vector space. I will leave that problem as an exercise for you to solve. Now using this I would like to derive the identity for the dimension of the sum of two subspaces in the general case, when the intersection is not necessarily the singleton {0}. So what I want to prove is this theorem: let W1 and W2 be subspaces of a finite dimensional vector space. Then the dimension of the sum W1 + W2 is the sum of the dimensions minus the dimension of the intersection: dim(W1 + W2) = dim W1 + dim W2 - dim(W1 ∩ W2). Remember, this time I have removed the restriction that the intersection must be {0}; I need to bring it in here on the right hand side. So this is a formula for the sum of two general subspaces W1 and W2. You can now see that this is more general than the result that I proved today, because there we also assumed that W1 ∩ W2 = {0}; in that case its dimension is 0, this last number does not appear, and the formula reduces to the earlier one.
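Before proving the formula, it is easy to sanity-check it numerically; the following sketch uses my own example in R^3, where W1 = span{e1, e2} and W2 = span{e2, e3} overlap in the line spanned by e2, so the formula predicts dim(W1 + W2) = 2 + 2 - 1 = 3.

```python
from fractions import Fraction

def rank(rows):
    """Gaussian elimination over the rationals; the rank is the number of pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(r + 1, len(m)):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

W1 = [[1, 0, 0], [0, 1, 0]]  # basis of W1, so dim W1 = 2
W2 = [[0, 1, 0], [0, 0, 1]]  # basis of W2, so dim W2 = 2; W1 ∩ W2 = span{(0,1,0)}
dim_sum = rank(W1 + W2)      # the union of the two bases spans W1 + W2
dim_cap = rank(W1) + rank(W2) - dim_sum  # the formula, rearranged
print(dim_sum, dim_cap)  # 3 1
```

The computed intersection dimension 1 agrees with the known intersection span{(0, 1, 0)}.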
The proof, you will see, is somewhat similar; the idea is as before, but this time I will do it a little differently. Look at W1 ∩ W2: it is not only a subspace of V, it is a subspace of W1 as well as of W2. So what I will do is start with a basis for W1 ∩ W2. This time I will not use B1, B2, etc.; I will write the bases down explicitly. Let y1, y2, ..., yl be a basis of W1 ∩ W2. This is a subspace of W1, and W1, itself a subspace, is a vector space in its own right; so I have a linearly independent subset of the vector space W1. All the spaces here are finite dimensional: V is finite dimensional, so W1, W2 and W1 ∩ W2 are all finite dimensional. I am now making use of the previous result: this linearly independent subset of W1 can be extended to a basis of W1, and similarly for W2. Let me write down the extended bases explicitly: let y1, y2, ..., yl, u1, u2, ..., us be a basis of W1 (this is possible by the previous result), and let y1, y2, ..., yl, v1, v2, ..., vk be a basis of W2. I started with the basis of W1 ∩ W2 and extended it to a basis of W1 as well as to a basis of W2; you observe that the first part of each comes from the basis of W1 ∩ W2. Now, since the vectors y1, ..., yl repeat, what I will do is collect the y's, u's and v's once each; I will call this script B: B = {y1, y2, ..., yl, u1, u2, ..., us, v1, v2, ..., vk}. The claim is that this is a basis of W1 + W2. Let us assume for the moment that we have proved that this is a basis
of W1 + W2, and let us quickly verify that the formula follows. If script B is a basis of W1 + W2, then we have the following: dim(W1 + W2) = l + s + k, since B has l + s + k vectors. Now look at dim W1 and dim W2: W1 has y1, ..., yl, u1, ..., us as a basis and W2 has y1, ..., yl, v1, ..., vk as a basis, so dim W1 = l + s and dim W2 = l + k. We started with W1 ∩ W2, and referring to the basis we wrote for it, dim(W1 ∩ W2) = l. So you can see that l + s + k = (l + s) + (l + k) - l: that is the right hand side, and l + s + k is the left hand side. So if I prove that B is a basis, then I am through. That B is a spanning set is easy to see, as before, so I will not prove the fact that span B = W1 + W2; there is no need to repeat the argument. We will only show linear independence. So consider a linear combination α1 y1 + ... + αl yl + β1 u1 + ... + βs us + γ1 v1 + ... + γk vk; I must equate this to 0 and then show that each of the scalars is 0. As before, the first part is a vector in W1 ∩ W2; let me call it y. The second part I will call u; it is in W1, being a linear combination of u1, ..., us. The third part is in W2; I will call it v. So y + u + v = 0. y belongs to W1 ∩ W2, hence in particular to W1. Let me rewrite this as follows: y + u = -v. y belongs to W1 and u belongs to W1, so y + u belongs to W1; v belongs to W2, so by a similar
argument as before, -v = y + u belongs to W1, and v is already in W2; since W1 is a subspace, v belongs to W1 too, so v belongs to W1 ∩ W2 (v or -v, it does not matter, since W1 ∩ W2 is a subspace). Now W1 ∩ W2 has y1, ..., yl as a basis, which means v can be written as v = δ1 y1 + ... + δl yl. But look at what v is: write down the expanded form of v, push everything to one side, and I get δ1 y1 + ... + δl yl - v = 0, that is, δ1 y1 + ... + δl yl - γ1 v1 - γ2 v2 - ... - γk vk = 0. Look at this last equation: it is a linear combination of y1, ..., yl, v1, ..., vk, and these vectors come from the basis of W2, so they are linearly independent. So this equation tells me each of the scalars here is 0: since y1, ..., yl, v1, ..., vk are linearly independent vectors, it follows that δ1 = δ2 = ... = δl = γ1 = ... = γk = 0. In particular γ1 = ... = γk = 0, which means, going back to the definition of v as γ1 v1 + ... + γk vk, that v = 0. Since v = 0, I get the equation y + u = 0, that is, y = -u. y belongs to W1 ∩ W2 and u belongs to W1, so both in particular belong to W1. Write down the expressions again and you will be able to show that y as well as u is 0. We already have the expressions for y and u, and it follows immediately; let me write it down quickly: α1 y1 + α2 y2 + ... + αl yl + β1 u1 + ... + βs us = 0.
This is what I have from the equation y + u = 0. But y1, y2, ..., yl, u1, u2, ..., us form a basis of W1, so they are linearly independent, and it follows that α1 = α2 = ... = αl = β1 = ... = βs = 0. I have simply used the definitions of y and u and the fact that y1, ..., yl, u1, ..., us are linearly independent, being members of this basis. So all the scalars in our linear combination are 0, B is linearly independent, and hence B is a basis of W1 + W2, which proves the formula. So let me stop with this. This concludes our discussion on vector spaces, subspaces, linear independence, bases and dimension. From the next lecture onwards I will discuss the notion of linear transformations, matrices of linear transformations, their properties, etc.