So, there is some more subtlety here: alpha 1 is this difference and alpha 2 is this difference. What we see is that alpha 3, which I started with as mu 1 minus mu 2 on the board, is actually the linear combination alpha 1 plus alpha 2. That is also clear from this diagram: you can do alpha 1 and then alpha 2 and you get alpha 3. So, since alpha 3 can be generated from the other two vectors, you call it just a positive root, whereas alpha 1 and alpha 2 are the simple roots. A simple root is one that cannot be broken into a linear combination of the other positive roots. So, alpha 1 is a simple root, alpha 2 is a simple root, and alpha 3 is a positive root that is not simple. It can also be shown that in any Lie algebra the number of simple roots equals the rank of the algebra. For SU(2) you will have only one simple root, for SU(3) you will have two simple roots, and so on. Is this clear? It is a little heavy, but if you understand it by comparison it is not difficult. I also explained the highest weight vector: if all of the raising operators acting on a particular weight vector give 0, for all i = 1, 2, 3, then you call it the highest weight vector, you denote that mu_H by lambda, and you can label any of these states by it. So, in this case, which one is the highest weight vector, can somebody tell me? This one, if you try to raise it, would have to go above, which is not possible. This one, if you have to raise it in this direction, it is not possible, and so on. So, you can show that this one is the highest weight vector for the defining representation, ok. Diagrammatically it is visible, but in general you can write the mu 1 vector mathematically and check whether all the raising operators annihilate it. Not just one: it should be annihilated by all three raising operators, those for alpha 1, alpha 2 and alpha 3.
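This check can be sketched numerically. A minimal sketch, assuming the usual conventions H1 = lambda 3 / 2, H2 = lambda 8 / 2, and my own choice of which Gell-Mann pairs build the raising operators for alpha 1, alpha 2, alpha 3 (the assignments in the comments are assumptions, not fixed by the lecture):

```python
import numpy as np

# Gell-Mann matrices l[1]..l[8] (assumed standard conventions)
l = np.zeros((9, 3, 3), dtype=complex)
l[1][0, 1] = l[1][1, 0] = 1
l[2][0, 1], l[2][1, 0] = -1j, 1j
l[3] = np.diag([1, -1, 0])
l[4][0, 2] = l[4][2, 0] = 1
l[5][0, 2], l[5][2, 0] = -1j, 1j
l[6][1, 2] = l[6][2, 1] = 1
l[7][1, 2], l[7][2, 1] = -1j, 1j
l[8] = np.diag([1, 1, -2]) / np.sqrt(3)

# Raising operators for the three positive roots (sign conventions assumed)
E1 = (l[4] + 1j * l[5]) / 2   # shifts a weight by alpha_1
E2 = (l[6] - 1j * l[7]) / 2   # shifts a weight by alpha_2
E3 = (l[1] + 1j * l[2]) / 2   # shifts a weight by alpha_3 = alpha_1 + alpha_2

# Candidate highest weight state mu_1 of the defining representation
mu1 = np.array([1, 0, 0], dtype=complex)
# All three raising operators annihilate it:
print([np.allclose(E @ mu1, 0) for E in (E1, E2, E3)])

# By contrast, the mu_2 state is NOT killed by all three raising operators
mu2 = np.array([0, 1, 0], dtype=complex)
print([np.allclose(E @ mu2, 0) for E in (E1, E2, E3)])
```

Only the state annihilated by every raising operator qualifies as the highest weight, which is exactly the criterion stated above.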
Only then do you call it a highest weight. In this particular diagram you can see that this state cannot be raised further: if you want to raise it in this direction you would have to go like this, but the diagram has only these three states, so it is not possible. If you have to raise it the other way, you would have to go like this, again not possible. Diagrammatically I can see it, but mathematically also you can work it out and show that it is not possible. So, it turns out that mu 1 is the highest weight vector, and we can denote lambda by that mu 1, ok. Now, the other thing, the analogue of your regular representation. You remember that for a discrete group the dimensionality of the regular representation matrices was equal to the order of the group, right? If you take, suppose, the discrete group C4, the order of the group is 4, so the regular representation has to be 4 x 4. Similarly, in the case of Lie algebras, here we had three states, but you can have a higher-dimensional representation whose number of states equals the number of generators. Just as the order of the group dictated the regular representation for you, here you can have the representation where the number of points you plot will be 8, because there are 8 generators. So, this is what higher-dimensional representations are: the lowest non-trivial representation has three states, ok, but you can write higher-dimensional representations, just as here, where instead of j equal to half I could go to j equal to 1, right. If I take j equal to 1, then J plus acting on the m equal to 1 state gives 0, so m equal to plus 1 is the top state, and then I get m equal to 0 and m equal to minus 1. So, there are three states, achieved by the J minus operation in this direction, right, you agree? So, three states are like the three generators of SU(2). This is not the lowest representation; there are three states.
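The j = 1 ladder construction described above can be sketched in code. The matrix elements come from the standard formula sqrt(j(j+1) − m(m±1)); the helper name `spin_matrices` is just an illustrative choice:

```python
import numpy as np

def spin_matrices(j):
    """Build J+, J-, Jz for spin j from <j,m+1|J+|j,m> = sqrt(j(j+1)-m(m+1))."""
    dim = int(2 * j + 1)
    m = j - np.arange(dim)                 # m = j, j-1, ..., -j
    Jz = np.diag(m).astype(complex)
    Jp = np.zeros((dim, dim), dtype=complex)
    for k in range(1, dim):                # J+ raises the state m[k] to m[k-1]
        Jp[k - 1, k] = np.sqrt(j * (j + 1) - m[k] * (m[k] + 1))
    return Jp, Jp.conj().T, Jz

# j = 1: three states with m = +1, 0, -1 -- as many states as SU(2) generators
Jp, Jm, Jz = spin_matrices(1)
print(np.diag(Jz).real)   # [ 1.  0. -1.]
```

Starting from m = +1, two applications of J minus walk down to m = −1, and one more application annihilates the state, so the diagram closes with exactly three points.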
So, it is not the lowest, defining representation of SU(2), but it is a three-dimensional representation of SU(2), clear? In general I can write the states as |j, m>. What is the dimension of this representation? It is (2j + 1)-dimensional. Why? Because m can take the values j, j minus 1, down to minus j. You can plot it on a weight lattice also: j, j minus 1, j minus 2, and so on, until finally you get minus j. The number of points on this weight diagram tells you the dimensionality of the vector space on which your SU(2) generators are going to act. Here each weight is not a vector; each one is a single number, the magnetic quantum number. This j equal to 1 case is a higher-dimensional representation of SU(2), and its dimension matches the number of generators; that is why it is called the adjoint representation. So, just as we called the analogous construction the regular representation, this one is called the adjoint representation: you have three states which can be mimicked as if they were the three generators, and you can associate each state with each of the generators, ok. In fact, you can show that this one is plus 1, then you have 0, and this one is minus 1. So, in fact, these are the three roots which you get, one of them being the zero root, and the same thing will happen even in the SU(3) case, ok. So, let me explain that also. The adjoint representation of SU(3): how do I plot it? Now it is a 2D diagram, and there should be 8 points somehow, because the adjoint representation means there should be 8 states, equal to the number of generators, which is 8, clear? So, it turns out that you can plot alpha 1, and then this one will be minus alpha 1; that is fine, but remember the number of roots is always the number of off-diagonal generators, right. So, basically what I am saying is that I am going to plot it using the explicit values of the roots, and you will see that this one has one-half and something here, and this one will have minus one-half and something, the same thing.
So, this point, the coordinate where that is happening, turns out to be minus alpha 1. It is not arbitrary: the values of the roots decide the points for you, and you are seeing them like mirror reflections, ok. If you look at the actual values, maybe I should have written the values, let me write them out. Alpha 1 is (1/2, root 3 / 2) and alpha 2 is (1/2, minus root 3 / 2). So, that is why this is happening, and this axis is h1 and that one is h2, clear? So, I need to draw it in a way that I can see those points here, ok. So, let me just show those values here. Alpha 1 plus alpha 2 turns out to be (1, 0), and then this one becomes (minus 1, 0), ok. And in the same way you can see how alpha 1 and alpha 2 are placed. But what I want you to appreciate is that I have put two circles at the center. What are those two circles? One corresponds to one diagonal generator and the other corresponds to the other diagonal generator. The diagonal generators have no roots; the root vectors have to do only with the off-diagonal generators, ok. So, that is why there are two zero values at the center. In the case of SU(2), you have one zero value at the center, associated with your diagonal generator, and these are the two off-diagonal generators. Similarly, here also you will find that there is a center point whose weight vector is 0; in fact, there are two linearly independent weight states with zero weight, ok. The non-trivial points are associated with the roots: you have the two simple roots, and alpha 3, which is (1, 0). Using them you construct your states, and the off-diagonal generators are associated with those corresponding states. I am only saying you will find six such points, ok: one point here, one point here, one point here, one point here, one point here and one point here.
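These explicit root values can be recovered from the weights of the defining representation, since each root is a difference of two weights (the lecture gives alpha 3 = mu 1 minus mu 2; which differences give alpha 1 and alpha 2 is my assumed convention). A sketch, again with H1 = lambda 3 / 2, H2 = lambda 8 / 2:

```python
import numpy as np

# The two diagonal Gell-Mann matrices (assumed conventions)
l3 = np.diag([1.0, -1.0, 0.0])
l8 = np.diag([1.0, 1.0, -2.0]) / np.sqrt(3)
H1, H2 = l3 / 2, l8 / 2

# Weights of the defining representation: mu_i = (H1_ii, H2_ii)
mu = [np.array([H1[i, i], H2[i, i]]) for i in range(3)]
print(mu)                   # mu_1, mu_2, mu_3 of the triangle diagram

a1 = mu[0] - mu[2]          # assumed: alpha_1 = mu_1 - mu_3 = (1/2,  sqrt(3)/2)
a2 = mu[2] - mu[1]          # assumed: alpha_2 = mu_3 - mu_2 = (1/2, -sqrt(3)/2)
a3 = mu[0] - mu[1]          # alpha_3 = mu_1 - mu_2 = (1, 0)
print(a1, a2, a3)
print(np.allclose(a1 + a2, a3))   # alpha_3 = alpha_1 + alpha_2, so it is not simple
```

The printed values reproduce exactly the (1/2, ±root 3 / 2) and (1, 0) coordinates quoted above.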
So, it is 1, 2, 3, 4, 5, 6, and then this center point is your Cartan subalgebra, which is your diagonal generators; there are two diagonal generators, so you will have two zeros. You can call these the weight vectors of the adjoint representation, but these weight vectors also turn out to be your root vectors, ok. So, I should say: the weight vectors of the adjoint representation are the root vectors, and the origin is a zero weight. You will have two zero-weight states in the adjoint representation for SU(3), and in the case of SU(2) you will have one zero-weight state in the adjoint. Is that clear? So, today I have taught you the defining representation, and, through a loose analogue of the regular representation, how to see the adjoint representation; the jargon is "adjoint representation". The interesting thing is that if you plot the weight diagram, there are eight points: six roots, plus the origin, which carries two zero weight vectors. That is why I have also called it the root diagram; the points are the only things I have marked. The origin is there, but it is due to the diagonal generators, so it should not be counted as a root. So, now I am going to warm you up with the formal definitions. Keep whatever I have taught you in mind, and then we look at the formal definition. If you take any book on Lie algebras, they will start this way; they will not start the way I did, ok. Let the Lie algebra have n basis elements, and suppose a subset of L of them, the H_i, obey [H_i, H_j] = 0. So, out of the total n basis elements you take a subset of L, look at their commutators, and they form an abelian subalgebra; this is what we call the Cartan subalgebra. For SU(2), L is just 1; for SU(3), L is 2; and so on. There are other groups also, which we have not really gotten into, but this is the way they will start.
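The claim that the adjoint weights are the six roots plus two zeros can be checked numerically: since [H_i, E_alpha] = alpha_i E_alpha, each ladder operator is an eigenvector of the adjoint action of the Cartan generators, while H1 and H2 themselves commute with them and sit at the origin. A sketch under the same assumed Gell-Mann conventions as before:

```python
import numpy as np

# Gell-Mann matrices l[1]..l[8] (assumed standard conventions)
l = np.zeros((9, 3, 3), dtype=complex)
l[1][0, 1] = l[1][1, 0] = 1
l[2][0, 1], l[2][1, 0] = -1j, 1j
l[3] = np.diag([1, -1, 0])
l[4][0, 2] = l[4][2, 0] = 1
l[5][0, 2], l[5][2, 0] = -1j, 1j
l[6][1, 2] = l[6][2, 1] = 1
l[7][1, 2], l[7][2, 1] = -1j, 1j
l[8] = np.diag([1, 1, -2]) / np.sqrt(3)

H1, H2 = l[3] / 2, l[8] / 2
comm = lambda a, b: a @ b - b @ a

# Ladder operators for the six roots (sign conventions assumed)
E = {'+a1': (l[4] + 1j*l[5]) / 2, '-a1': (l[4] - 1j*l[5]) / 2,
     '+a2': (l[6] - 1j*l[7]) / 2, '-a2': (l[6] + 1j*l[7]) / 2,
     '+a3': (l[1] + 1j*l[2]) / 2, '-a3': (l[1] - 1j*l[2]) / 2}

def root(X):
    """Read off (alpha_1, alpha_2) from [H_i, X] = alpha_i X."""
    n = np.vdot(X, X).real
    return np.array([np.vdot(X, comm(H, X)).real / n for H in (H1, H2)])

for name, Ea in E.items():
    print(name, root(Ea))                  # six nonzero adjoint weights = roots
print('H1', root(H1), 'H2', root(H2))      # two zero weights at the origin
```

Eight basis elements, eight adjoint weights: the six roots of the hexagon and a doubly degenerate zero at the center, exactly as in the diagram.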
The above abelian algebra is the Cartan subalgebra, and L denotes the rank of the Lie algebra. The remaining elements, which are always even in number, can be written as raising and lowering operators E_{plus or minus alpha}, I should have put plus or minus alpha here, where alpha denotes the root vector, ok, you are all fine? Just remember these two examples and then you will understand this. Now, what will be the number of components of this root vector? It will depend on the algebra. There are L generators which are diagonal, which means the alpha vector should have L components. SU(3) has L equal to 2, so you had two components; in general you will have L components. The other requirement is that closure of the Lie algebra amongst the generators is a must, ok. I am not going to prove this, but there are rigorous proofs of all these statements. If you take two generators, suppose the two generators are like J plus and J minus, you end up getting J_z, in your angular momentum or SU(2) algebra. A similar thing will happen in the other groups: if you take a raising operator and the corresponding lowering operator and take the commutator bracket, it will turn out to be a linear combination of the diagonal generators of the Cartan subalgebra, clear? These are only postulates, I am not proving them, but they can be proven. And similarly, you have to make sure that the algebra is closed. What all do you have to do? [H_1, H_2] is 0, but you should also check how H_1 and H_2 commute with the ladder operators, and how the algebra closes among the ladder operators themselves. Everything has to be checked; only then does it form a Lie algebra, you know that, right? So, whenever you have 8 generators, any commutator bracket of those 8 generators should be a linear combination of those 8 generators. Some of the coefficients could be 0, but that is the definition of a Lie algebra.
Sometimes, if you take two raising operators in two different directions, here if I take the raising operator for alpha 3 and the raising operator for alpha 2, that combination could be related to another ladder operator, right? So, you can show that such commutators of E_alpha with E_beta can give rise to a non-trivial coefficient times E_{alpha plus beta}. When will it be 0? When alpha plus beta is not a root of that Lie algebra. Suppose I take alpha 3 and alpha 2 and the sum is something new, which is not among plus or minus alpha 1, alpha 2, alpha 3; then the commutator is 0. So, alpha plus beta has to be a root. These are all formal definitions to validate that the abstract algebra, with n generators, of which L are diagonal, forms a Lie algebra, ok, and you can use it to do some computation. You know, I gave you the Gell-Mann matrices explicitly, and then you could see what the mu 1, mu 2 and mu 3 vectors are, but that is not the general way; you need to know how to use this abstract algebra to do computations, right? And we know how to do it very well in the SU(2) language. You all agree? If I say that I want to apply J plus in the SU(2) language, you know how to do it. You know those coefficients, you know how to do things. So, what we are going to do is say that these abstract algebras can be broken up, for every root vector, into what behaves like an SU(2) subalgebra. Then we will only ever do SU(2), and we get the answers for any arbitrary algebra, clear? So, that is what I am trying to put in here. Any compact semi-simple algebra you can break up into many SU(2) subalgebras: the J plus, J minus corresponding to a root alpha are the inverse of the norm of that vector times E_{plus or minus alpha}, and J_3 is the z-component associated with that alpha. So, how many SU(2) subalgebras can you form here, tell me? You can form one SU(2) algebra with this root, one SU(2) algebra with this, and one SU(2) algebra with this, clear?
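The root-addition rule for commutators of ladder operators can be seen concretely. A sketch with the same assumed conventions (with these conventions the proportionality constant for [E_1, E_2] happens to be 1):

```python
import numpy as np

# Gell-Mann matrices needed for the three raising operators (assumed conventions)
l = np.zeros((9, 3, 3), dtype=complex)
l[1][0, 1] = l[1][1, 0] = 1
l[2][0, 1], l[2][1, 0] = -1j, 1j
l[4][0, 2] = l[4][2, 0] = 1
l[5][0, 2], l[5][2, 0] = -1j, 1j
l[6][1, 2] = l[6][2, 1] = 1
l[7][1, 2], l[7][2, 1] = -1j, 1j

comm = lambda a, b: a @ b - b @ a

E1 = (l[4] + 1j*l[5]) / 2    # root alpha_1 = (1/2,  sqrt(3)/2)
E2 = (l[6] - 1j*l[7]) / 2    # root alpha_2 = (1/2, -sqrt(3)/2)
E3 = (l[1] + 1j*l[2]) / 2    # root alpha_3 = (1, 0) = alpha_1 + alpha_2

# alpha_1 + alpha_2 = alpha_3 IS a root, so [E1, E2] is proportional to E3
print(np.allclose(comm(E1, E2), E3))   # True

# alpha_2 + alpha_3 = (3/2, -sqrt(3)/2) is NOT a root, so the bracket vanishes
print(np.allclose(comm(E2, E3), 0))    # True
```

Both cases of the rule show up: a nonzero structure constant exactly when the sum of the roots is itself a root, and zero otherwise, which is how the closure of the algebra is maintained.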
So, this is what I am trying to show you: you can construct the J plus, J minus and J_3 for every root vector, every positive root vector, by using this construction. What is the first check to show that this is an SU(2) algebra? You know what an SU(2) algebra is, right? If J plus and J minus are J_1 plus or minus i J_2, then you can show that [J plus, J minus] = J_z and [J_z, J plus or minus] = plus or minus J plus or minus; now I am adding the alpha label everywhere, an alpha vector on each operator, and it is the same alpha vector throughout, ok. So, this means that for SU(3) there are 3 SU(2) subalgebras. How will I write it? Suppose I want to write J plus for alpha 1. It is written as 1 over mod alpha 1 times E_{plus alpha 1}, and the corresponding J_3 for alpha 1 involves the mod squared, where mod is just the norm of that vector, ok: take the dot product with itself, that is mod squared. There is one modification here: it is the alpha 1 vector dotted with the H vector. The H vector means the components H_1 and H_2, and the alpha vector also has 2 components; you have to take the dot product of the two. So, let us do a simple example. What is alpha 1? Alpha 1 was (1/2, root 3 / 2). So, what is mod alpha 1 squared? Just take the dot product of alpha 1 with itself, which is 1/4 plus 3/4, which is 1. So, this normalization factor is 1, and this one is again 1. What about the J_3? The alpha 1 vector is (1/2, root 3 / 2) and the H vector is (H_1, H_2), so it becomes one-half H_1 plus root 3 over 2 H_2. So, my advice for you is: I have given you the Gell-Mann matrices, and now I want you to check whether this algebra is satisfied, where the J_3 corresponding to this root vector is a linear combination of lambda 3 and lambda 8. Make sure that the algebra is satisfied; only then do you call this an SU(2) algebra. I have given you the Gell-Mann matrices, right, and I have given you what J plus is; similarly, you can write J minus.
So, use the complete right-hand side as a substitution here by taking the matrix forms, which for this root is lambda 1 plus or minus i lambda 2, and check whether the algebra is satisfied. I have given you an abstract notation, but you can verify it explicitly with the Gell-Mann matrices. Any doubts? This is mainly to do number crunching. If you want to compute actual values, write those out: when I apply a raising operator I say the result is proportional to the state with the weight vector shifted by the root, but that proportionality constant has to be fixed, and we have fixed this proportionality constant nicely for the SU(2) algebra, the square root of j(j + 1) minus m(m plus or minus 1) factor; we have done that. Now we will use this to fix them.
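This verification exercise can be sketched numerically for all three SU(2) subalgebras at once. One assumption beyond the lecture: the ladder operators are normalized with a factor 1/(2 sqrt 2), which makes [E_alpha, E_{-alpha}] equal alpha dot H so that the relations come out exactly as stated ([J+, J-] = J3, [J3, J±] = ±J±; rescaling J± by sqrt 2 would give the textbook [J+, J-] = 2 J3):

```python
import numpy as np

# Gell-Mann matrices l[1]..l[8] (assumed standard conventions)
l = np.zeros((9, 3, 3), dtype=complex)
l[1][0, 1] = l[1][1, 0] = 1
l[2][0, 1], l[2][1, 0] = -1j, 1j
l[3] = np.diag([1, -1, 0])
l[4][0, 2] = l[4][2, 0] = 1
l[5][0, 2], l[5][2, 0] = -1j, 1j
l[6][1, 2] = l[6][2, 1] = 1
l[7][1, 2], l[7][2, 1] = -1j, 1j
l[8] = np.diag([1, 1, -2]) / np.sqrt(3)

H1, H2 = l[3] / 2, l[8] / 2
comm = lambda a, b: a @ b - b @ a
s3 = np.sqrt(3)

# (root, E_{+alpha}) pairs; 1/(2 sqrt 2) is an assumed normalization
# chosen so that [E_alpha, E_{-alpha}] = alpha . H
roots = [(np.array([0.5,  s3/2]), (l[4] + 1j*l[5]) / (2*np.sqrt(2))),
         (np.array([0.5, -s3/2]), (l[6] - 1j*l[7]) / (2*np.sqrt(2))),
         (np.array([1.0,  0.0 ]), (l[1] + 1j*l[2]) / (2*np.sqrt(2)))]

ok = []
for alpha, Ep in roots:
    norm2 = alpha @ alpha                      # |alpha|^2 = 1 for all three here
    Jp = Ep / np.sqrt(norm2)                   # J+ = E_{+alpha} / |alpha|
    Jm = Jp.conj().T                           # J- = E_{-alpha} / |alpha|
    J3 = (alpha[0]*H1 + alpha[1]*H2) / norm2   # alpha . H / |alpha|^2:
                                               # a combination of lambda_3, lambda_8
    ok.append(np.allclose(comm(Jp, Jm), J3)    # [J+, J-] = J3
              and np.allclose(comm(J3, Jp), Jp)    # [J3, J+] = +J+
              and np.allclose(comm(J3, Jm), -Jm))  # [J3, J-] = -J-
print(ok)   # [True, True, True]
```

All three root directions close into an SU(2) subalgebra, and for alpha 1 the J3 is indeed the linear combination (1/2) H1 + (root 3 / 2) H2 of lambda 3 and lambda 8, as the exercise asks you to confirm.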