First of all, we'll continue the discussion of the action of SL2 and conclude, in fact, the discussion of the Chow ring of an abelian variety. Then we're going to start on this topic of representations of the diagonal. So let me quickly recall the setting we had last time for an abelian variety. A was principally polarized over the complex numbers, of dimension g; here is the polarization, which we denote by Θ — a principal polarization. We also had the Poincaré line bundle P, which we viewed on A × A; we explained that naturally, of course, it sits on A × Â, the product with the dual abelian variety, but we had identified A with its dual using the principal polarization. Okay, and our main tool here was this theorem of Mukai. So we consider a functor from the derived category of A to itself, which to an object α associates the twist by the kernel given by the Poincaré line bundle. So F is defined that way. The main result was that F is an equivalence, and its inverse is essentially F itself — more precisely, up to a shift and the involution (-1)^*, the pullback under multiplication by -1. We also said that we may of course view F on the Chow groups of the abelian variety, and there the result is that in the composition F ∘ F the same involution appears — the pullback (-1)^* — and the shift becomes a sign (-1)^g. On Chow, F applied to a class α is the pushforward of α intersected with the Chern character of the Poincaré line bundle, and we agreed to call λ its first Chern class. So this is the functor in this case.
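To fix notation, the statements just recalled can be written out as follows (P the Poincaré bundle, λ = c₁(P), [-g] the shift by -g):

```latex
% Mukai's functor and its inversion property, on the derived category:
F(\alpha) = Rp_*\big(q^*\alpha \otimes \mathcal{P}\big), \qquad
F \circ F \;\cong\; (-1)^*\,[-g].

% The same functor on Chow groups, with \lambda = c_1(\mathcal{P}):
F(\alpha) = p_*\big(q^*\alpha \cdot \operatorname{ch}(\mathcal{P})\big)
          = p_*\big(q^*\alpha \cdot e^{\lambda}\big), \qquad
F \circ F = (-1)^g\,(-1)^*.
```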
Okay, we were also working with a few important maps — let me just mention them quickly: we had the addition map m from A × A to A, and we had multiplication by n; this will come up immediately, so I'm just reminding you. So, we did one calculation with the Fourier–Mukai functor last time, and we calculated that if we compose this equivalence with the operation of tensoring by the polarization, and we cube the composition, this equals just a shift by -g — so almost the identity. So I guess we have these two main identities, both due to Mukai; he actually does this calculation explicitly in his classic paper. This suggests an action of SL2(Z). Now, for SL2(Z) we were of course looking at the standard generators S and T, and we know they generate SL2(Z), subject to the relations S⁴ = 1 and S² = (ST)³, which we see satisfied here. So we're of course tempted to say: T, in our representation on Chow, is the operation of multiplying — actually, on the derived category T would be tensoring by L; on Chow, I apologize, this is multiplication by e^θ, where θ is the first Chern class of the polarization. And S — well, almost, you see. Since there's a minus sign — if this were to work exactly, we could assign S to F, but you're missing a minus, you're missing the pullback by the involution — it is in fact rather F̃, which is F composed with (-1)^*, that plays the role of S. With that, the identity works out well. Okay. So one last thing we had seen was that the lower triangular matrix, calculated as T·S·T, is represented by the triple composition: tensoring with L,
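As a quick sanity check on the group theory being invoked (not, of course, on the geometry), one can verify the generator relations, and the factorization of the lower triangular unipotent as T·S·T, directly with 2×2 integer matrices; a minimal sketch:

```python
# Sanity check of the SL(2,Z) generator relations used above,
# with plain 2x2 integer matrices (no libraries needed).

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_pow(A, k):
    R = [[1, 0], [0, 1]]
    for _ in range(k):
        R = mat_mul(R, A)
    return R

S = [[0, -1], [1, 0]]   # the generator played by the (twisted) Fourier-Mukai functor
T = [[1, 1], [0, 1]]    # the generator played by multiplication by e^theta

I2 = [[1, 0], [0, 1]]
assert mat_pow(S, 4) == I2                            # S^4 = 1
assert mat_pow(S, 2) == mat_pow(mat_mul(S, T), 3)     # S^2 = (S T)^3 = -1
assert mat_mul(mat_mul(T, S), T) == [[1, 0], [1, 1]]  # T S T is the lower unipotent
```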
applying F, and tensoring with L again — and this we calculated to be just the Pontryagin product with L. Okay. It's sort of a beautiful picture. So what does this mean in Chow? In Chow you have that this matrix acts as follows: you send the class α to the pushforward, under the addition map, of the pullback of α from one factor times the Chern character of the polarization L from the other — here m is the addition map from A × A to A and p, q are the projections. So I hope this is all good so far. This establishes the SL2(Z) action on the Chow ring of A. At the same time, I also said something about the Chow decomposition, so let me say something about that as well. So we have an SL2(Z) action; now the Chow decomposition. This starts again with Mukai's theorem. What we saw, in a very elementary way, last time was that this identity — whose right-hand side is essentially the identity — leads to a decomposition of the diagonal class inside A × A. The diagonal, viewed as a correspondence, is the identity correspondence, and it decomposes into a sum of eigen-correspondences for the operation of multiplication by n — pullback under multiplication by n on one factor. So then this gives the Beauville decomposition of the Chow ring into eigenspaces. Here p is the codimension — cycles of codimension p — and these are eigenspaces for n^*. There is a range for the possible values of the index s — I don't want to insist, but this is something easy to see.
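In formulas, the decomposition takes the following shape (Beauville's indexing; the stated range for s is the one he proves):

```latex
\operatorname{CH}^p(A)_{\mathbf{Q}}
  \;=\; \bigoplus_{s=p-g}^{p} \operatorname{CH}^p_s(A),
\qquad
\operatorname{CH}^p_s(A)
  = \big\{\alpha \in \operatorname{CH}^p(A)_{\mathbf{Q}} \;:\;
    n^*\alpha = n^{2p-s}\,\alpha \ \text{for all } n \in \mathbf{Z}\big\}.
```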
And something I think I omitted to point out last time — it's obvious, but worth saying — is that this decomposition is multiplicative, which is a nice feature, just because products are well behaved under pullback; I mean the intersection product of cycles. Okay. So then, on the level of correspondences, we can view this operator n^* — which is very important, it will be quite present in the first part of our lecture, this pullback under multiplication by n — and it is immediate that it is written as a linear combination of these eigen-correspondences, with coefficients n^{2p-s}. All right. So this was just a quick recap of where we were; let me now continue, because we'd like to see this also on the Lie algebra level, not just on the group level. For that we still have a little bit of work to do. So here is where we are, and here is what we want: we have φ from SL2(Z) to the automorphism group of CH(A), and we'd like to extend it — I'll call it also φ — to SL2(Q). So let's define it; we'll start slowly, there isn't so much to do really, just a couple of elements to make sure we have this. Define it first on upper triangular matrices — so work with the group B of upper triangular matrices inside SL2. Okay, well, it's not so hard: if we take a matrix of the form [[1, a], [0, 1]], what should the operation on Chow be? We know that when a is 1 it was multiplication by e^θ, so this is just going to be sent to multiplication by e^{aθ}. Now we have to worry a bit about diagonal matrices — so for any n in Z, say — and this is the new element in some sense.
So where do you send a diagonal matrix, what is it represented by? We declare diag(n, 1/n) to go to the pullback under multiplication by n, with the prefactor n^{-g} — which means that the inverse matrix diag(1/n, n) should go, with the same normalization factor n^{-g}, to the pushforward under multiplication by n. This makes sense because, as we mentioned, n_* ∘ n^* is n^{2g} times the identity. Yeah. So, to have a well-defined representation φ from B to the automorphisms of Chow, we just need to check the basic relation for how these commute: diag(n, 1/n) · [[1, a], [0, 1]] = [[1, a n²], [0, 1]] · diag(n, 1/n). We need to check this on the level of the operators we defined, and it's enough to do it for an integer n. So what does this mean? With our definitions, what we need is the following: if we multiply a cycle α by e^{aθ} and then apply n^{-g} n^*, it should be the same as first applying n^{-g} n^* and then multiplying by the exponential e^{a n² θ} — for any cycle α. So we need to check this basic compatibility. Well, what does this mean? We're applying the pullback n^*, which behaves well under multiplication, and the main thing to remember here is that the pullback of θ under n is n²θ. Given that, the claim follows: n^{-g} n^*(e^{aθ} · α) = n^{-g} e^{a n² θ} · n^*α, and this indeed is the same as applying n^{-g} n^* and then multiplying by the exponential. Okay, so that checks out; it works well. And of course it's important to remember this identity n^*θ = n²θ, which I mentioned.
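The matrix identity underlying this compatibility check can itself be verified mechanically; a small sketch over the rationals (the geometric content, n^*θ = n²θ, is of course not captured here):

```python
from fractions import Fraction

def mat_mul(A, B):
    """Multiply two 2x2 matrices with exact rational entries."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# diag(n, 1/n) . [[1, a], [0, 1]]  ==  [[1, a n^2], [0, 1]] . diag(n, 1/n)
for n in [2, 3, 5, -2]:
    for a in [1, 2, 7]:
        D = [[Fraction(n), Fraction(0)], [Fraction(0), Fraction(1, n)]]
        Ta = [[Fraction(1), Fraction(a)], [Fraction(0), Fraction(1)]]
        Tan2 = [[Fraction(1), Fraction(a * n * n)], [Fraction(0), Fraction(1)]]
        assert mat_mul(D, Ta) == mat_mul(Tan2, D)
```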
This property of θ — in fact these are things worked out by Mumford, how line bundles behave: you can write down the pullback of any line bundle under the multiplication by n map, and in case the line bundle is symmetric, L is sent to L^{n²}; so on the level of first Chern classes you see this identity n^*θ = n²θ. Okay, so that's okay. So now we're going to use S to generate: since we have φ on all upper triangular matrices, we're going to use our SL2 generator S to move to the lower triangular matrices. We can generate everything this way, but again we have to check one compatibility. So, to extend φ to SL2(Q) — we of course want to have lower triangular matrices as well — we need to check the following basic relation in SL2, again involving the diagonal matrices: conjugating diag(n, 1/n) by S gives the inverse, diag(1/n, n). So we need to check that this basic SL2 relation is satisfied — and there's an argument to be made that this one relation is certainly necessary, but it's actually also sufficient. Okay. So let's see what this translates into for us. S is represented by the Fourier–Mukai functor with the additional twist by multiplication by -1; so, writing it with F̃ and its inverse, we want F̃ ∘ (n^{-g} n^*) ∘ F̃^{-1} = n^{-g} n_*. If we just shift terms to the other side, what this is equivalent to is the following well-known property of the Fourier–Mukai functor F: if you have an isogeny, the Fourier–Mukai functor composed with pullback under the isogeny is the same as pushforward under the isogeny composed with the Fourier–Mukai functor. This is okay for our isogeny, and it works out well — I just moved F̃ to the other side, a trivial manipulation.
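For the record, the two facts invoked here can be stated as follows — the first is Mumford's formula for pullback under multiplication by n, which specializes to n^*L ≅ L^{n²} when L is symmetric; the second is the exchange property of F under an isogeny f, written after the self-dual identifications made above:

```latex
n^* L \;\cong\; L^{\frac{n^2+n}{2}} \otimes (-1)^* L^{\frac{n^2-n}{2}},
\qquad\text{so for symmetric } L:\quad
n^* L \cong L^{n^2}, \qquad n^*\theta = n^2\,\theta;

F \circ f^* \;=\; f_* \circ F .
```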
So, yeah — the basic compatibility of these operators, as we defined them, actually leads to a representation of SL2(Q). And now, to deduce what the Lie algebra action is — this is the last point. We have already seen what the lower triangular matrix [[1, 0], [1, 1]] does: α goes to the Pontryagin product of α with e^θ. But we'd like to calculate now for an arbitrary lower triangular matrix, so we're going to calculate in particular what the operator for [[1, 0], [n², 1]] is. This is the last calculation we're doing, and it's a very simple one. Okay, so: multiplication by a lower triangular matrix like this is in fact the following composition — again I'm using our definition of the operators: pushforward by n, composed with Pontryagin product with e^θ, composed with pullback by n. In other words, this is n^{-2g} n_* ∘ (e^θ * −) ∘ n^*. So what we're saying is that α goes to n^{-2g} n_*(n^*α * e^θ), for some arbitrary Chow class α — I'm just unraveling here what this means. The feeling is that this is some sort of Pontryagin product as well — but which one exactly? Just to address this double pushforward — first by the addition map, then by the multiplication by n map — why don't I continue with a diagram. You start with (x, y) here, and you want to end up with nx + ny in this corner: you can either multiply by n on both factors and then apply the addition map, or apply the addition map and then multiply by n — so the square commutes, m ∘ (n × n) = n ∘ m. So then I just calculate: I use this diagram to reverse the order a bit.
This will indicate that there's a Pontryagin product going on, and then I'll just have to compute the pushforward — but that's going to be easy. So I'll do the slightly artificial thing of expressing this e^θ as a pullback under n: e^θ = n^*(e^{θ/n²}), since n^*θ = n²θ — so that the expression is symmetric, since I have a pullback under n for α as well. The advantage now is that I have the pushforward under multiplication by n applied to the pullback under multiplication by n on both factors, and this is simply a degree factor: the result is a factor of n^{4g}, because we're now on an abelian variety of dimension 2g, namely A × A. So overall, with the n^{-2g} prefactor, I have a factor n^{2g} and simply this Pontryagin product. In other words, α goes to n^{2g} times the Pontryagin product of α with e^{θ/n²}. So then I conclude — I worked this out for n² because it was convenient, but of course what this means, if I want a general expression for a lower triangular matrix [[1, 0], [a, 1]], is that the corresponding operator is the scaling factor a^g times the Pontryagin product with e^{θ/a}. When a is equal to 1, which we did before, you recover the Pontryagin product with e^θ. Okay, right. At this moment we're essentially done: to deduce the Lie algebra generators we just have to look at the first order in a. So let me write down everything that's relevant: [[1, a], [0, 1]] was just multiplication by e^{aθ}; [[1, 0], [a, 1]] is a^g times the Pontryagin product with e^{θ/a}; and diag(n, 1/n) and its inverse were, for any n, the correspondences n^{-g} n^* and n^{-g} n_*. So then, on the Lie algebra level: for X, we have — this is just multiplication by θ.
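Written out, the calculation just performed is the chain (using m ∘ (n × n) = n ∘ m in the middle step, and (n × n)_* (n × n)^* = n^{4g} · id since A × A has dimension 2g):

```latex
n^{-2g}\, n_*\big(n^*\alpha * e^{\theta}\big)
 = n^{-2g}\, n_*\big(n^*\alpha * n^* e^{\theta/n^2}\big)
 = n^{-2g}\, n_*\, m_*\,(n\times n)^*\big(\alpha \boxtimes e^{\theta/n^2}\big)
 = n^{-2g}\, m_*\,(n\times n)_*\,(n\times n)^*\big(\alpha \boxtimes e^{\theta/n^2}\big)
 = n^{-2g}\cdot n^{4g}\,\big(\alpha * e^{\theta/n^2}\big)
 = n^{2g}\,\big(\alpha * e^{\theta/n^2}\big).
```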
That is, it's intersection with the ample divisor θ. Now, for Y: here I have to track the term linear in a. This is just the Pontryagin product with θ^{g-1}/(g-1)! — I have to look at the order a^{g-1} term in the exponential, because of the prefactor a^g. So Y looks a little worse on the Lie algebra level. And then, of course, for H: if I apply it to an eigenvector in our decomposition — say α is a cycle of codimension p with eigenvalue n^{2p-s} — then what we get here is (2p - g - s)α: the -g, you see, comes from the prefactor, and there is the -s. Yeah, so this is the sl2 algebra acting. And what we see is that the Chow decomposition under n^* is also the eigenspace decomposition of H — the decomposition under pullback by multiplication by n is the eigenspace decomposition of the operator H. Let me just comment for a second: of course, the most interesting operator here, the one we worked hardest for, is this Y. Because what X does — this operation of intersecting with an ample divisor — is very expected in algebraic geometry, very natural; it increases codimension: you start with a cycle and increase its codimension by one. But the operator Y has to undo that: Y decreases the codimension. And that's actually a lot more subtle — it's harder in algebraic geometry to have operations on cycles which decrease codimension. And this is what Y does: if you look at this convolution with θ^{g-1}, that is of course a class of codimension g - 1, but you are also pushing forward under
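So, collecting the generators: for α of codimension p with n^*-eigenvalue n^{2p-s}, the action reads as follows (the bracket relation is the defining sl2 one):

```latex
X\alpha = \theta \cdot \alpha, \qquad
Y\alpha = \frac{\theta^{\,g-1}}{(g-1)!} * \alpha, \qquad
H\alpha = (2p - s - g)\,\alpha, \qquad
[X, Y] = H .
```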
the addition map, and that has dimension-g fibers — so in fact you achieve the drop in codimension. So, right — we had to work a little bit for it, and I wanted to just do it; it's very elementary, but these are actually pleasant calculations within an abelian variety, enjoyable calculations to do, because there's a lot of structure. [Question:] Can you say again how you find what Y·α is? I didn't understand that. [Answer:] How Y·α is what we've computed — well, because I know what e^{aY} is, so to speak; we know it on the group level. I obtained it from the formula for the lower triangular matrices: if you view this as a one-parameter subgroup of your group, the derivative gives you the corresponding Lie algebra direction. So I just picked out the first order in a in this expression. And the point is — and I think it's instructive to read Beauville; my exposition here mostly follows Beauville, though the calculations are mine, because everybody calculates a little bit differently — the point is that he also explores a bit what this action gives you: for example, when is multiplication by powers of θ a bijective map, or an injective map, on the Chow level. Because what we know is that on the topological side — if we now step back and look on the homological level — this gives an action of sl2 on the homology of A. Multiplication by the ample divisor θ — this is the hard Lefschetz setup in homology, and this Y is the dual Lefschetz operator, so to speak.
And, okay, you know, in Kähler geometry this is classically constructed using the metric: you do the calculations and the resulting action is independent of the Kähler metric, but you construct it using the Kähler metric. That's why in algebraic geometry, on natural, beautiful spaces such as an abelian variety, or a moduli space of sheaves, you'd like to see this intrinsically, in terms of the structure — you'd like to have an explicit expression for this SL2 action. And here, this was sort of a prime example: for an abelian variety, that can be done. And I should also say — just as a remark here — that in other settings we're of course not going to have this multiplication by n; so in the absence of this morphism, which is available on an abelian variety but not otherwise, you can look instead at the eigenspace decomposition of H in a Lefschetz sl2 triple. That's fine as long as you manage to lift this action and see it on the Chow level — or, for that matter, even on cohomology, but defined in terms of natural classes. And on Chow, for sure, it enriches your understanding of the Chow ring, because you have this extra, finer structure. Yeah. So this, I would say, ends the discussion I had in mind of this SL2 question for an abelian variety. As I said, the main references for this are Mukai's paper and also a few of Beauville's papers which follow this thread, exactly making this action explicit. Both sources are very clear and very beautifully written, so there's basically no obstruction to learning this very well.
Yeah, okay — so now, if this is okay, I will move on to talk a bit about the diagonal. And why — it may seem completely unrelated to what we were saying, but it's not quite unrelated: the diagonal, viewed as a correspondence, is the identity operator. So we try to work our way toward understanding the identity operator in terms of the available structure on a moduli space. Indeed, you saw that the Chow decomposition arises from viewing the diagonal as a composition of correspondences given by the Fourier–Mukai functor — so it does arise from a representation of the diagonal, in some sense. Okay, so let's start here. I was thinking of doing this in two simple settings first: the first is essentially very well known, and I'll just recap it, but the second is also simple, though not quite as simple. The first is the Grassmannian G(r, n) of r-planes in C^n. The second is a Quot scheme on P¹ — the simplest Quot scheme you can imagine, on the projective line. This is the scheme parametrizing short exact sequences 0 → E → C^n ⊗ O → F → 0 on P¹, where E has rank r and degree -d — sorry, rank r and degree minus d, that's what I meant. So, if this is okay with you, let's start with the first one. Here, for the diagonal, I have to look at two copies — you'll see shortly what I have in mind, and this will clarify things, although you might be bored by this discussion. So I want to look at the diagonal in the product of two copies of G(r, n): copy one and copy two, say. And each of them comes with a tautological sequence 0 → S → C^n ⊗ O → Q → 0 — this is just the tautological sequence on the Grassmannian.
So what I want to do here, for the diagonal, is consider the vector bundle Hom(S₁, Q₂) = S₁^∨ ⊗ Q₂ on the product — S₁ is pulled back from the first factor and Q₂ is the quotient bundle pulled back from the second factor. And there is a tautological section σ of this bundle. Over a point given by subspaces V₁ and V₂ in C^n, with quotients W₁ and W₂, σ at this point is the map from V₁ to W₂ given by the composition: first the inclusion V₁ ↪ C^n, then the projection C^n ↠ W₂. You compose, you get a homomorphism from V₁ to W₂ — so indeed you land exactly in the fiber. So it's a tautological section. The good property of this section, for us, is where it vanishes. When does it vanish? Precisely when you're talking about the same subspace on both sides: only if you pick the same subspace will this morphism from V₁ to W₂ evaluate to zero. So the zero locus Z(σ) is precisely the diagonal. At the same time, this is a vector bundle of rank r(n - r) — let me call it W. So W on G(r, n) × G(r, n) has rank r(n - r), and it has this tautological section σ which vanishes precisely along the diagonal, which has codimension r(n - r). So the diagonal is given by the top Chern class of W: we conclude that the class of the diagonal is [Δ] = c_top(W) = c_{r(n-r)}(S₁^∨ ⊗ Q₂). And you can decompose this: you can write it as a polynomial in the Chern classes of S₁ and the Chern classes of S₂.
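As a toy illustration in the smallest case G(1, 2) = P¹ — where S = O(-1), Q = O(1), so W = O(1) ⊠ O(1) and [Δ] = h₁ + h₂ in CH(P¹ × P¹) = Z[h₁, h₂]/(h₁², h₂²) — one can check symbolically that this class really acts as the identity correspondence. A sketch, assuming sympy is available:

```python
import sympy as sp

h1, h2 = sp.symbols('h1 h2')
Delta = h1 + h2  # c_1(S_1^v ⊗ Q_2) = c_1(O(1,1)) on P^1 x P^1

def reduce_chow(expr):
    """Reduce modulo h1^2 = h2^2 = 0, i.e. work in CH(P^1 x P^1)."""
    expr = sp.expand(expr)
    expr = sp.rem(expr, h1**2, h1)
    expr = sp.rem(expr, h2**2, h2)
    return sp.expand(expr)

def push_to_second_factor(expr):
    """Integrate over the first P^1: keep the coefficient of h1."""
    return sp.expand(expr).coeff(h1, 1)

# The diagonal acts as the identity: x on the first factor maps to x on the second.
for x1, x2 in [(sp.Integer(1), sp.Integer(1)), (h1, h2)]:
    assert push_to_second_factor(reduce_chow(x1 * Delta)) == x2
```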
And that has the following consequence — it establishes two things at once: first, that the Chow ring of the Grassmannian is the same as its cohomology — in other words, the Grassmannian is what one calls Chow-trivial — and also that the Chern classes of S generate. Okay, so let me record a remark here, in general, just so that we don't leave it. General remark: let's say that M is a nonsingular complete variety, and say that the class of the diagonal inside M × M decomposes as a sum of products of pullbacks from each factor, [Δ] = Σ_i p^*α_i · q^*β_i, for some index set. We've just seen that in the case of the Grassmannian it's possible to do this. Then the observation — this remark — is that the set {α_i} generates CH(M) as a Z-module, so additively. And this is simply the fact that the diagonal is the identity operator: if I take any class x, then x is a linear combination of the α_i, with coefficients given by degrees against the fundamental class. And of course you also see that if x is numerically trivial, then the intersection with every β_i is zero, which actually implies that x is trivial in Chow. It's a small step — the same if you choose to look in homology, where you draw the same conclusion — and in fact you immediately deduce that rational equivalence is the same as homological equivalence. Yeah, so in fact it's easy to see then that the cycle class map from Chow to cohomology is an isomorphism.
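The one-line computation behind the remark, using that Δ is the identity correspondence and the projection formula (only the zero-dimensional part of β_i · x contributes a degree):

```latex
x \;=\; p_*\big([\Delta]\cdot q^*x\big)
  \;=\; \sum_i p_*\big(p^*\alpha_i \cdot q^*(\beta_i \cdot x)\big)
  \;=\; \sum_i \deg(\beta_i \cdot x)\,\alpha_i .
```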
Okay — and moreover, if you don't have this decomposition on the level of Chow but you have it on the level of homology, you still conclude, in the same manner, that these classes appearing in the diagonal expression — the α_i — generate the homology ring. So it's a simple observation, but it helps: if one manages to express the diagonal in terms of the available structure, then you can draw some nice conclusions. So now I was going to go on — any questions about this? I apologize, maybe this discussion is too elementary, I'm not sure. Anyway, I wanted to discuss the less elementary case of the Quot scheme on P¹; however, I see I'm exactly at the five o'clock mark now. So let me just set it up. We fix r and n — we think of these parameters as fixed — and I think of the degree d as varying; Q_d, as I said, is the Quot scheme: the kernel E has rank r and degree -d, the quotient has rank n - r and degree d, and r can be anything between 1 and n. So, in this case also, the first question is what sort of variety this is. The first observation is that, of course, it's a projective variety, but it's also easy to see that it is nonsingular, of a dimension we'll compute. Notice that when the degree d is zero, E → C^n ⊗ O is given by constant maps, and you just get the Grassmannian: Q_0 is just the Grassmannian. Okay, but then things change. So how do we see the smoothness? We just have to calculate a little bit. The tangent space, as is the case with any Quot scheme, at a point [0 → E → C^n ⊗ O → F → 0] is given by Ext⁰(E, F) = Hom(E, F), and the obstructions lie in Ext¹(E, F) — but as we'll see, in this case this group is actually zero.
So let's see this quickly — maybe I'll stop before finishing the discussion, and just spend two more minutes, so that we get a bit familiar with the space. How do you see that Ext¹(E, F) is zero? We chase the exact sequences a bit — it's not hard — and let's see what it actually depends on. So, starting with the sequence 0 → E → C^n ⊗ O → F → 0 on P¹, we apply Hom(E, −) and look at the level of global Ext groups: the piece we're interested in ends with Ext¹(E, C^n ⊗ O) → Ext¹(E, F) → 0, the following term vanishing since we're on a curve. So we're interested in concluding that Ext¹(E, F) is zero — and what precedes it is in fact already zero: it is H¹ on P¹ of the dual of E twisted with C^n, that is H¹(P¹, E^∨ ⊗ C^n). The main point here is that this E, since we're on P¹ and it injects into C^n ⊗ O, decomposes as a sum of line bundles — just because we're on P¹, for no other reason — E = ⊕ O(-d_i), with all the d_i greater than or equal to zero. Which means that if you calculate here — if you take the dual, E^∨ = ⊕ O(d_i) — there's no H¹. So this gives zero, and Ext¹(E, F) is zero for this reason: because E splits like this. The smoothness is something very special in this case, and this little argument is worth it because it shows that you cannot expect this in general. For example, on a curve of higher genus you don't have this conclusion, so you cannot expect this Quot scheme to be smooth on a curve — it's not; but on P¹ it is. So it's a good toy model to consider, because it's simple enough. And then you calculate the dimension: the dimension is the dimension of this group Hom(E, F).
Okay, so just calculate on P¹, genus zero: the dimension is the Euler characteristic — for a vector bundle on the line, just degree plus rank. The F here has degree d, so Hom(E, F) = E^∨ ⊗ F has rank r(n - r) and degree dn, and it adds up to exactly what we were saying: r(n - r) + dn. Okay. Right — so in fact, from this exact sequence, Q_d is smooth; so it's a smooth projective variety, and it comes with a universal sequence — of course I wrote the tangent space at a point, but you can write it globally: the tangent bundle of the Quot scheme is π_* Hom(E, F), where E and F are the universal sheaves on Q_d × P¹ and π is the projection to Q_d. But from the discussion that we had, it's also easy to see, a little more strongly, that Ext¹(E₁, F₂) = 0 for any two — possibly distinct — points of the Quot scheme, and that Hom(E₁, F₂) has constant rank for any two points. So then — let me actually take one more minute, and then that's it, because we're reaching a conclusion and it's a good place to stop. On a product Q_d × Q_d, we can consider the complex obtained by pushing forward along P¹ — taking the projection π of Hom(E₁, F₂) — and let's call this W. Here the situation is as follows: you have two copies of the Quot scheme, each comes with its universal sequence — this E₁ is the universal subsheaf from the first, and F₂ is the universal quotient from the second — and π is just the projection from Q_d × Q_d × P¹ to Q_d × Q_d. And, you know, again, just as before, if you think for a second, there is exactly the same picture that we had for the Grassmannian. So we have this W on Q_d × Q_d, which a priori is a complex but, by the vanishing, is in fact just a vector bundle.
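The dimension count via Riemann–Roch on P¹ (in genus 0, χ(V) = deg V + rk V for a vector bundle V) can be tabulated mechanically; a sketch:

```python
def chi_p1(rank, degree):
    """Riemann-Roch on P^1 (genus 0): chi(V) = deg(V) + rk(V)."""
    return degree + rank

def dim_quot(n, r, d):
    """dim Q_d = chi(Hom(E, F)) for E of rank r, degree -d inside O^n on P^1."""
    rk_E, deg_E = r, -d
    rk_F, deg_F = n - r, d          # F = O^n / E
    # Hom(E, F) = E^v ⊗ F: rank multiplies, degree follows the tensor rule.
    rk_hom = rk_E * rk_F
    deg_hom = rk_F * (-deg_E) + rk_E * deg_F
    return chi_p1(rk_hom, deg_hom)

# Agrees with the closed form r(n - r) + d n stated in the lecture:
for n in range(2, 6):
    for r in range(1, n):
        for d in range(0, 5):
            assert dim_quot(n, r, d) == r * (n - r) + d * n
```

In particular, d = 0 recovers the dimension r(n - r) of the Grassmannian, consistent with Q_0 = G(r, n).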
And, exactly as in the Grassmannian case, there is a tautological section σ, and we conclude as before: because the rank of the bundle W is precisely equal to the codimension of the diagonal, and this σ vanishes exactly along the diagonal, the diagonal is the top Chern class, [Δ] = c_top(W). Okay. So you have this representation of the diagonal as well. [Question:] And does it split — does it split into pieces? [Answer:] Yes — and maybe we'll see this next time. I'll take it up next time and do a little calculation here; in fact I've gotten started, and I'll discuss this a bit more generally — I want to look at other moduli spaces next time. So I think it's a good time to stop, and I apologize — most of the discussion was taken by the abelian variety, but hopefully it was okay. So let me stop sharing and see if there are questions. I apologize that I went over the time. This is sort of really kind of easy in some sense, but it gives you, rather effortlessly, some results which are not completely trivial for the Quot scheme — as we'll see, we'll see some consequences. Alright, well, thanks for listening. I'll see you next time. Thank you.