So, we will be using one of the most familiar techniques, namely mathematical induction. Let the dimension of V be 1. The assertion is obviously true there; forget about anything special, for any one-dimensional vector space you will always be able to do this. So that is the base case.

Now suppose the claim is true for all finite-dimensional vector spaces with dimension less than n, where n is the dimension of V. And what do we know we are after? The statement of the claim for dimension n: vectors v_1, ..., v_k whose chains together give n = m_{v_1} + ... + m_{v_k} + k basis vectors, that is, n = Σ_{i=1}^{k} m_{v_i} + k. So suppose the claim is true for all finite-dimensional vector spaces of dimension less than n: for dimension 1, dimension 2, dimension 3, up to dimension n - 1 we can always find such a basis. We have already seen it to be true for dimension 1, that is trivial, so let it be true for all dimensions from 1 to n - 1.

Now we are going to make a very special choice of a vector space whose dimension fits this criterion: consider the image of N. What can you say about the dimension of the image of N? Remember that N is nilpotent. Does it fit the criterion, that is, does it have dimension strictly less than n? The only situation in which the image of N could span the entire vector space is when N is non-singular. But if N were non-singular, could it be nilpotent? No: a non-singular matrix cannot be nilpotent, because a nilpotent matrix must have all its eigenvalues equal to 0, and there must be at least one eigenvector for that eigenvalue; the geometric multiplicity has a lower bound of 1, it cannot be less than 1, and we have already seen the existence of an eigenvalue and an eigenvector. So there must be at least one eigenvector corresponding to the eigenvalue 0, and along that direction, when you keep hitting with N, everything goes to 0.

So what can we say? The dimension of the kernel of N is at least one, and if the dimension of the kernel is at least one, then the dimension of V and the dimension of the image of N, which is the rank of N, cannot be equal. So the dimension of the image of N is strictly less than n. Again, convince yourself that this is clear; I have just argued why: because N is nilpotent it has an eigenvector for the eigenvalue 0, and that eigenvector is a vector living in the kernel.

So what I am saying is that I need to pick out a vector space which has dimension strictly less than n, because then I can invoke the Jordan canonical form for it; that is my inductive assumption. And I am claiming that the image of N is exactly one such vector space. Now, what can the dimension of the image of N be? At most the dimension of V, which is n. But can it actually be n? If it were, then N would be a non-singular matrix and there would be nothing in its kernel other than 0. But N has all its eigenvalues equal to 0, so the geometric multiplicity of the eigenvalue 0 must be at least one; in at least one direction N pulverizes everything. So the kernel is non-trivial; if the kernel is non-trivial, N cannot have full rank; and if N does not have full rank, its image cannot span the entire vector space V. So the dimension of the image, which is nothing but the rank, must be less than n; this is just the rank-nullity theorem.
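If you want to see this numerically, here is a minimal sketch in Python; the particular matrix is just an arbitrary strictly upper triangular nilpotent example chosen for illustration, not anything from the lecture. It checks that all eigenvalues of a nilpotent matrix are 0 and that its rank, the dimension of the image, is strictly less than n.

    import numpy as np

    # An arbitrary nilpotent matrix (strictly upper triangular), chosen only for illustration.
    N = np.array([[0., 1., 0., 0.],
                  [0., 0., 1., 0.],
                  [0., 0., 0., 0.],
                  [0., 0., 0., 0.]])
    n = N.shape[0]

    eigenvalues = np.linalg.eigvals(N)   # for a nilpotent matrix these are all 0
    rank = np.linalg.matrix_rank(N)      # dimension of the image of N

    print("eigenvalues:", np.round(eigenvalues, 10))
    print("rank:", rank, "< n =", n)
    assert np.allclose(eigenvalues, 0)
    assert rank < n                      # the image cannot be all of V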
So, definitely, at least for the image of N I can assume that the Jordan canonical form holds. That means, by our premise, there exist u_1, u_2, ..., u_k̂ (I do not know this number k̂ yet; it is possibly not equal to k) such that

N^{m_{u_1}} u_1, ..., N u_1, u_1, N^{m_{u_2}} u_2, ..., N u_2, u_2, ..., N^{m_{u_k̂}} u_k̂, ..., N u_k̂, u_k̂

is a basis. A basis for what, for which vector space? For the image of N. Even if we do not yet know that V has a Jordan basis, at least the image of N must have one by our induction assumption, because the image of N has dimension less than that of V, and by the inductive hypothesis every vector space of dimension less than n admits a Jordan basis. So the image of N admits a Jordan basis, and it looks like the above. Agreed? But the hypothesis also says something more, namely the following: that

N^{m_{u_1}} u_1, N^{m_{u_2}} u_2, ..., N^{m_{u_k̂}} u_k̂

is a basis for what, exactly? Exactly: it is a basis for the kernel of N restricted to the image of N, which is nothing but a fancy way of writing the intersection, image(N) ∩ kernel(N). That part of the kernel dwelling inside the image, that is what this means: the restriction of the kernel of N to the image of N. You cannot consider the entire kernel here. The entire kernel resides inside V, but we are only focusing our attention on a strictly smaller subspace of V, which is the image of N. So we cannot assume that the entire kernel lives inside it; only the part of the kernel which intersects with the image of N, the restriction of the kernel of N to the image of N, has its basis given by these vectors. Is this clear? Please take your time to absorb it. It is just applying the inductive hypothesis to a vector space whose dimension is less than that of V. That is what we do in induction: we see that something is true for 1, we assume it is true for every number smaller than a certain number, and then we show it is also true for the bigger number, and by induction we can extend it.

Now, what can you say about u_1, u_2, ..., u_k̂? Where do they come from? They are obviously part of the image of N. So can we not say that for every one of those fellows u_1 through u_k̂ there must exist a pre-image v_1 through v_k̂ such that when N acts on them it gives N v_i = u_i? That is what it means for something to be in the image of N: it must dwell in the column span of N, in matrix notation. A very important observation.

Next, what do you think is the relation between m_{v_i} and m_{u_i}? What is the relation, exactly? If you hit v_i a sufficient number of times with N it will definitely be pulverized, because N is nilpotent; a sufficiently high power of N lies in the annihilating ideal of v_i. Does everybody agree that m_{u_i} = m_{v_i} - 1? Exactly: we have already pulled one power of N out, because these fellows belong to the image. If you need to hit v_i a total of m_{v_i} + 1 times before it vanishes, then you need to hit u_i only m_{v_i} = m_{u_i} + 1 times, is it not? So the number of times you need to hit u_i is one less than the number of times you need to hit v_i, because u_i is obtained by hitting v_i once with N already.
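Here is a small Python sketch of this pre-image bookkeeping, with the same kind of arbitrary nilpotent example matrix and an arbitrary starting vector; the helper chain_power is just part of this sketch. Taking u = N v, the largest power of N that does not yet kill u is one less than the largest power that does not yet kill v.

    import numpy as np

    def chain_power(N, w, tol=1e-12):
        """Largest j with N^j w != 0 (so N^(j+1) w = 0); assumes N is nilpotent and w != 0."""
        j = 0
        x = N @ w
        while np.linalg.norm(x) > tol:
            j += 1
            x = N @ x
        return j

    # An arbitrary nilpotent example matrix, for illustration only.
    N = np.array([[0., 1., 0., 0.],
                  [0., 0., 1., 0.],
                  [0., 0., 0., 0.],
                  [0., 0., 0., 0.]])

    v = np.array([1., 2., 3., 4.])   # some vector not in the kernel of N
    u = N @ v                        # u lies in the image of N, with pre-image v

    print("m_v =", chain_power(N, v))   # largest power of N not yet killing v
    print("m_u =", chain_power(N, u))   # one less, since u = N v
    assert chain_power(N, u) == chain_power(N, v) - 1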
So, the number of times u_i needs to be hit before it gets pulverized is one short of the number of times v_i needs to be hit. This is true, nothing fancy, just bookkeeping. Now for the next non-trivial step. So far, except for a bit of notation and maybe some bookkeeping, none of these steps has used anything very fancy: just a bit of rank-nullity and keeping count of the indices and so on. I do tend to mess the indices up a bit sometimes, but I hope it is clear so far.

So, the next important point. We already have a basis for the kernel of N restricted to the image of N; can we not extend that basis to get a basis for the entire kernel of N? The basis we already have for kernel(N) ∩ image(N) is

N^{m_{u_1}} u_1, ..., N^{m_{u_k̂}} u_k̂.

Consider it and extend it to a basis for the kernel of N by augmenting with how many vectors? What is the dimension of the kernel of N? By our premise it is just k. Out of those k we already have k̂ fellows here, so we will need to add k - k̂ more fellows. Let us say those are v_{k̂+1}, v_{k̂+2}, ..., v_k, so that the union of this set and that set is a basis for the kernel of N. You see that the span of the added vectors is a complementary subspace: its direct sum with kernel(N) ∩ image(N) is the whole kernel of N. This is a very basic result we have already seen: if you have a subspace and you extend its basis to a basis of a bigger space by adding some more fellows, then the span of those additional fellows is a complementary subspace; no non-zero vector can live simultaneously in the complementary subspace and in the original subspace, so their sum is a direct sum. We saw this just a few lectures back. Anyway, that is besides the point; you can always do such an extension.

So now the big claim: v_1, v_2, ..., v_k̂, v_{k̂+1}, ..., v_k is the set we need. What do I mean by saying it is the set we need? It means this is exactly the set which will allow me to generate the entire Jordan basis, the Jordan chains. What is the implication? If you have copied this down I will just erase it. Let me also place the relation m_{v_i} - 1 = m_{u_i} here. The claim essentially boils down to this: if I just take these vectors and keep hitting them with N, hitting v_1 with N m_{v_1} times, hitting v_2 with N m_{v_2} times, and so on, then I will have exactly the entire Jordan basis, the Jordan chains, that will allow me to get to the Jordan canonical form.

What we need to show is that that particular set is linearly independent, because if I am able to show that, I will be done. Just look at the numbers: this set has k vectors, just as I need, and I am going to generate m_{v_1} extra vectors from v_1, m_{v_2} extra from v_2, m_{v_3} extra from v_3, and so on, so k + Σ_i m_{v_i} vectors in total, which is exactly n. So what remains is to show that the following linear relation forces all the coefficients to vanish; here N^0 is obviously just the identity, so each sum starts from j = 0.
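The subspace bookkeeping in this step can be checked numerically with a few lines of Python; the helper functions below are part of this sketch only (orthonormal bases computed from the SVD), and the matrix is the same arbitrary example as before. We compute k = dim ker N and k̂ = dim(ker N ∩ im N), and see how many extra vectors the extension needs.

    import numpy as np

    def null_space(A, tol=1e-12):
        """Orthonormal basis (as columns) for the kernel of A, from the SVD."""
        _, s, vh = np.linalg.svd(A)
        rank = int(np.sum(s > tol))
        return vh[rank:].T

    def column_space(A, tol=1e-12):
        """Orthonormal basis (as columns) for the image of A, from the SVD."""
        u, s, _ = np.linalg.svd(A)
        rank = int(np.sum(s > tol))
        return u[:, :rank]

    # The same arbitrary nilpotent example.
    N = np.array([[0., 1., 0., 0.],
                  [0., 0., 1., 0.],
                  [0., 0., 0., 0.],
                  [0., 0., 0., 0.]])

    K = null_space(N)       # basis of ker N
    C = column_space(N)     # basis of im N
    k = K.shape[1]

    # ker N ∩ im N consists of vectors C @ a with N @ (C @ a) = 0.
    k_hat = null_space(N @ C).shape[1]

    print("dim ker N (k)          =", k)
    print("dim ker N ∩ im N (k̂)  =", k_hat)
    print("vectors to add, k - k̂ =", k - k_hat)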
Σ_{j=0}^{m_{v_1}} α_{1,j} N^j v_1 + Σ_{j=0}^{m_{v_2}} α_{2,j} N^j v_2 + ... + Σ_{j=0}^{m_{v_k̂}} α_{k̂,j} N^j v_k̂ + α_{k̂+1,0} v_{k̂+1} + ... + α_{k,0} v_k = 0 implies α_{i,j} = 0 for all i, j.

This is now all I have to show, because I already have the adequate number of vectors to span the entire n-dimensional vector space: the count above is exactly n. I have a sufficient number of vectors already; if I am able to show that they are linearly independent, I am done, and this is the Jordan basis. I have already assured myself that I have sufficient numbers to constitute a basis, so since the dimension matches, provided the set is linearly independent it is guaranteed to be a spanning set as well.

So let us call this relation double star; I will not erase that equation, I will probably erase other things. Now hit double star with N on both sides. What happens? What happens to these fellows v_{k̂+1}, ..., v_k when you hit them with N? They are, after all, sitting inside the kernel of N, so they immediately vanish the moment you hit the equation with N on both sides. And not only them: in each of the first k̂ sums the highest term also vanishes, because N^{m_{v_i}+1} v_i = 0.

So, multiplying with N, what do we get? The first sum would start as Σ_j α_{1,j} N^{j+1} v_1, but it will not go up to j = m_{v_1}, because N^{m_{v_1}+1} v_1 already vanishes; I only need to consider it up to j = m_{v_1} - 1. So we get

Σ_{j=0}^{m_{v_1}-1} α_{1,j} N^{j+1} v_1 + Σ_{j=0}^{m_{v_2}-1} α_{2,j} N^{j+1} v_2 + ... + Σ_{j=0}^{m_{v_k̂}-1} α_{k̂,j} N^{j+1} v_k̂ = 0.

Is that clear? What has happened? All of those additional fellows have been knocked off, because they were sitting straight away inside the kernel, so that part vanished upon hitting with N; and the top term of each of the remaining sums also happened to be a fellow in the kernel, so it vanished too. That is the adjustment I have made: we added one power of N, and now I do not need to go up to m_{v_1}, because N^{m_{v_1}+1} acting on v_1 is anyway going to vanish.

Now I am going to make two changes. The upper limit m_{v_i} - 1 is exactly m_{u_i}, the relation we argued a while back, and N v_i is u_i, because u_i is that exact fellow I obtain upon acting with N on v_i. So N^{j+1} v_i = N^j u_i, and the equation becomes

Σ_{j=0}^{m_{u_1}} α_{1,j} N^j u_1 + Σ_{j=0}^{m_{u_2}} α_{2,j} N^j u_2 + ... + Σ_{j=0}^{m_{u_k̂}} α_{k̂,j} N^j u_k̂ = 0.

What does this tell you? What do I know about these fellows N^j u_i? They form the Jordan basis of the image of N, and are thus linearly independent. If they are linearly independent, then all of these coefficients must vanish: α_{i,j} = 0 for i = 1, ..., k̂ and j = 0, ..., m_{v_i} - 1. So, if I now use this relation back in the original equation, what are the only terms remaining? Only the top fellows of the first k̂ sums, whose coefficients α_{i,m_{v_i}} are perhaps still non-zero, and the coefficients α_{k̂+1,0}, ..., α_{k,0}, which are perhaps non-zero. But what do those terms constitute?
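As a sanity check of the big claim, here is a minimal Python sketch for the same arbitrary example matrix: we build all the chain vectors N^j v_i from a hand-picked v_1 and v_2 (for this small example v_1 is a pre-image of a chain generator in the image and v_2 extends the kernel basis, mirroring the construction above), and verify that they are n in number and linearly independent, hence a Jordan basis.

    import numpy as np

    # The same arbitrary nilpotent example as before.
    N = np.array([[0., 1., 0., 0.],
                  [0., 0., 1., 0.],
                  [0., 0., 0., 0.],
                  [0., 0., 0., 0.]])
    n = N.shape[0]

    def chain(N, v, tol=1e-12):
        """The vectors v, N v, N^2 v, ... up to the last non-zero one."""
        out, x = [], v.astype(float)
        while np.linalg.norm(x) > tol:
            out.append(x)
            x = N @ x
        return out

    v1 = np.array([0., 0., 1., 0.])   # generates the chain v1, N v1, N^2 v1
    v2 = np.array([0., 0., 0., 1.])   # already in ker N, a chain of length 1

    candidate = np.column_stack(chain(N, v1) + chain(N, v2))
    print("number of candidate vectors:", candidate.shape[1])               # equals n
    print("rank of the candidate set  :", np.linalg.matrix_rank(candidate)) # also n
    assert np.linalg.matrix_rank(candidate) == n   # linearly independent, hence a basis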
They also form a linearly independent set, because by my whole construction they exactly span the kernel of N: indeed N^{m_{v_i}} v_i = N^{m_{u_i}} u_i, and together with v_{k̂+1}, ..., v_k these are precisely the basis vectors we constructed for the kernel of N. So, in other words, every one of those coefficients has to be 0, like I argued. Let me just complete that; it will take me a couple of minutes. Plugging this back into double star we get

α_{1,m_{v_1}} N^{m_{v_1}} v_1 + α_{2,m_{v_2}} N^{m_{v_2}} v_2 + ... + α_{k̂,m_{v_k̂}} N^{m_{v_k̂}} v_k̂ + α_{k̂+1,0} v_{k̂+1} + ... + α_{k,0} v_k = 0.

Just as I said, these are the only terms that survive. But this is a linear combination of basis vectors for the kernel of N. Hence α_{i,j} = 0 for all i, j, as required. So there you have it, the proof of the Jordan canonical form.

To recap: first we write down this linear relation; we have done all the groundwork before that, and once it is written down the numbers are already in its favor, so all that you need to show is linear independence. Then you hit it with N on both sides, and many of those terms go off, precisely the terms in the kernel. The remaining terms, because they come from pre-images, when acted on once by N become not just arbitrary vectors in the image of N but exactly the basis elements of the image of N. So they are a linearly independent set, and their linear combination being 0 means each of those coefficients must be 0; it is a trivial linear combination. Then plug that back in, and the terms that survive are exactly the basis elements of the kernel, so those coefficients must also be 0. So, just by hitting it with N once, you arrive at this conclusion.

So this is the beauty of the Jordan canonical form; we hope we have elucidated substantially what the benefits of getting a Jordan canonical form are, prior to the proof. I wanted to keep the proof as the last point because sometimes it helps in proving certain things. Similar techniques also help, particularly when you are dealing with nilpotent matrices and when you get a linearly independent set upon acting repeatedly with the nilpotent matrix; those results are very interesting. If I keep hitting something repeatedly with a matrix until it becomes 0 at one point, and at no preceding stage has it been 0, then there are certain guarantees you can extract about the vectors generated along the way. Perhaps you can think of those beautiful little properties; you have to think about the annihilating ideals and such a bit. Those are some hints, think about them. There are adequate problems in your problem sheet, do work them well, and best of luck for your exams; we will try to keep the exam in a slot that is suited to everybody. Thank you.
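As a hint toward those problem-sheet exercises, here is one more minimal numerical sketch, with an arbitrary nilpotent matrix and an arbitrary starting vector of my own choosing: if N^p x = 0 while N^(p-1) x is not 0, the chain x, N x, ..., N^(p-1) x comes out linearly independent.

    import numpy as np

    # Arbitrary nilpotent example and starting vector, chosen only for illustration.
    N = np.array([[0., 1., 1., 0., 0.],
                  [0., 0., 1., 0., 1.],
                  [0., 0., 0., 1., 0.],
                  [0., 0., 0., 0., 1.],
                  [0., 0., 0., 0., 0.]])
    x = np.ones(5)

    vectors, y = [], x
    while np.linalg.norm(y) > 1e-12:   # collect x, N x, N^2 x, ... until it is pulverized
        vectors.append(y)
        y = N @ y

    p = len(vectors)
    print("chain length p =", p)
    print("rank of {x, Nx, ..., N^(p-1)x} =",
          np.linalg.matrix_rank(np.column_stack(vectors)))
    # The rank equals p: the chain is a linearly independent set.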