So this lecture is part of an online course on commutative algebra, and it will be mainly about the following question: what is a syzygy? The word comes from a Greek word meaning a yoke: if you have two oxen pulling a plough you yoke them together, and we'll see a little later why the mathematical syzygy is called this. The aim of the next few lectures is Hilbert's theorem on the finite generation of rings, or algebras, of invariants. So I had better start by explaining what an invariant ring actually is. Let's look at example one: rotations of R^3, or we can allow rotations and reflections. Here we have a group, the orthogonal group O_3(R), or the special orthogonal group if we restrict to rotations; O_3(R) is just the group of 3 by 3 orthogonal matrices. Rotations and reflections preserve length, so you can think of length as being an invariant of this group: if you act by any group element on two points of R^3, the distance between those two points is preserved. Another way of saying that is that the action preserves the polynomial x^2 + y^2 + z^2, which is just the square of the length. So we can think of x^2 + y^2 + z^2 as an invariant of O_3(R) acting on the vector space R^3. Here x, y and z are linear functions on R^3, so this is just a polynomial on R^3. We can ask a similar question where you replace the orthogonal group by your favorite group and replace R^3 by any vector space acted on by this group. There is one slightly tricky technical question, which is: how does G act on polynomials? If G is acting on a vector space V, what we really want to know is how G acts on a polynomial function from V to R, or to our field k, or whatever. This is part of a more general question: suppose you have a function f from a space X to a space Y, where X and Y are both acted on by G.
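As a quick sanity check of the opening example (a minimal sketch of my own, not part of the lecture), one can verify numerically that x^2 + y^2 + z^2 is unchanged by an orthogonal transformation, here a rotation about the z-axis:

```python
import math

def rotate_z(theta, v):
    """Apply the rotation by angle theta about the z-axis to v = (x, y, z)."""
    x, y, z = v
    return (math.cos(theta) * x - math.sin(theta) * y,
            math.sin(theta) * x + math.cos(theta) * y,
            z)

def sq_length(v):
    """The invariant polynomial x^2 + y^2 + z^2."""
    x, y, z = v
    return x * x + y * y + z * z

v = (1.0, 2.0, 3.0)
w = rotate_z(0.7, v)
print(abs(sq_length(w) - sq_length(v)) < 1e-12)  # True: squared length is preserved
```

The angle 0.7 and the point (1, 2, 3) are arbitrary choices; any rotation and any point would do.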
In our example G acts trivially on k, but more generally we can allow G to act non-trivially on Y. So what is g·f? In order to define the action of g on f you need to say what g·f does to x, and this is usually defined as follows: (g·f)(x) = g(f(g^(-1) x)). There's a bit of a puzzle here: the outer g is fairly natural, but what is the g^(-1) doing? Why do we put in that inverse? I'll first explain why we put it in, and then I'll explain what goes wrong if you don't. The rule shouldn't really be written like this; it's better to write it as (g·f)(g·x) = g(f(x)). If you think about it, this is really just a special case of the principle that if a group acts on A × B then you want g·(a, b) = (g·a, g·b). Here B might be X and A might be the space of functions from X to Y, and then that principle turns out to be essentially this rule. Now if you replace x by g^(-1) x, you recover the rule (g·f)(x) = g(f(g^(-1) x)). So that rule looks a bit odd, but the other form looks much more natural, so maybe it's better to remember the action of G in that form. The other question is what happens if you leave the inverse out. If you leave it out you run into a nasty mess. Let's try putting (g·f)(x) = f(g·x), and let's make the action of G on Y trivial, just so that it's one less thing to think about. What happens with that definition? We run into the following horrible problem when we compute ((g1 g2)·f)(x). On the one hand this is equal to f(g1 g2 x), because we can just move g1 g2 to the inside. On the other hand it's equal to (g2·f)(g1 x), because we can move g1 inside, and that in turn equals f(g2 g1 x). And if you look at this you see we get a contradiction: these are actually different.
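To see concretely that the inverse is what makes this a genuine left action, here is a small experiment (my own illustration, not from the lecture) with the symmetric group on {0, 1, 2} acting on functions, where G acts trivially on the target:

```python
from itertools import permutations

def compose(g, h):
    """Composition of permutations stored as tuples: (g∘h)(i) = g(h(i))."""
    return tuple(g[h[i]] for i in range(len(g)))

def inverse(g):
    """Inverse permutation."""
    inv = [0] * len(g)
    for i, gi in enumerate(g):
        inv[gi] = i
    return tuple(inv)

def act(g, f):
    """The twisted action from the lecture: (g.f)(x) = f(g^{-1} x)."""
    ginv = inverse(g)
    return tuple(f[ginv[x]] for x in range(len(f)))

def act_naive(g, f):
    """The 'wrong' definition without the inverse: (g.f)(x) = f(g x)."""
    return tuple(f[g[x]] for x in range(len(f)))

f = (10, 20, 30)  # a function {0,1,2} -> numbers, stored as a tuple of values
good = all(act(compose(g1, g2), f) == act(g1, act(g2, f))
           for g1 in permutations(range(3)) for g2 in permutations(range(3)))
bad = all(act_naive(compose(g1, g2), f) == act_naive(g1, act_naive(g2, f))
          for g1 in permutations(range(3)) for g2 in permutations(range(3)))
print(good, bad)  # True False: the twisted action is a left action, the naive one is not
```

The check runs over every pair of permutations in S_3; the naive definition fails exactly because S_3 is non-abelian.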
So if you don't put in this inverse when you're defining actions of groups on functions, you really do run into horrible contradictions, so we had better never do this. Another way out is to put the g on the right: if G acts on the right on X, then you can indeed write (f·g)(x) = f(x·g) and get away with it. So there's this irritating and slightly confusing complication to remember when you want to define group actions on functions: you sometimes need to use the inverse of the group element. The next example of an invariant is just the determinant. Example two: suppose we take the special linear group SL_n over a field k acting on k^n. I'm just going to take n = 2, so you might have a matrix with entries a, b, c, d acting on vectors (x1, x2) in the usual way, which I can't be bothered to write out. There are no interesting invariants of this; in fact SL_2(k) acts transitively on all non-zero vectors of k^2, so you can't really have any non-constant invariant functions. On the other hand we can make the space bigger: we can have SL_2(k) acting on k^2 ⊕ k^2 ⊕ k^2 and so on, and what I'm going to do is take n copies. So we might have the matrix (a, b; c, d) acting on (x1, x2), which is one copy of k^2, and it might also act on (y1, y2). And now we can find an invariant: we can just take the determinant, which in this case is x1 y2 − x2 y1, and in general the determinant on k^n ⊕ ⋯ ⊕ k^n (n copies) is an invariant of SL_n(k). That's not terribly surprising, because you basically define the special linear group to be the matrices that preserve the determinant when acting on n by n matrices.
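Here is a small exact-arithmetic check (my own sketch) that x1 y2 − x2 y1 is unchanged when a single SL_2 matrix acts simultaneously on the two vectors:

```python
def apply(m, v):
    """Apply a 2x2 matrix m = ((a, b), (c, d)) to a vector v = (v1, v2)."""
    (a, b), (c, d) = m
    return (a * v[0] + b * v[1], c * v[0] + d * v[1])

def det2(x, y):
    """The invariant x1*y2 - x2*y1, the determinant with columns x and y."""
    return x[0] * y[1] - x[1] * y[0]

g = ((2, 3), (1, 2))   # determinant 2*2 - 3*1 = 1, so g lies in SL_2
x, y = (1, 4), (-2, 5)  # arbitrary sample vectors
print(det2(apply(g, x), apply(g, y)) == det2(x, y))  # True
```

With these numbers both sides come out to 13; the point is that the equality holds for every g of determinant one.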
So the determinant is an example of an invariant. By the way, if you look at the word determinant, it ends in -ant, and pretty much anything ending in -ant tends to be an invariant of something; I guess it comes from the -ant at the end of invariant. So you get things like determinant, resultant, discriminant, catalecticant and all sorts of other weird things, and they all tend to be invariants of some group acting on something. So far we've only had one invariant at a time, so let's look at a more complicated example where there are several invariants. This time I'm going to take G to be the symmetric group S_n of all permutations of n objects, and I'm going to let it act on C^n by permuting coordinates. The polynomial functions here are just polynomials in x1, ..., xn, and the symmetric group acts on polynomials by permuting x1, ..., xn. The problem is to find the invariant functions, or invariant polynomials. So we want to find polynomials that stay the same if you renumber all the variables. There's an obvious one: e1, where you just add them all up, and that obviously stays the same if you permute them. Or you can multiply them all together, or you can add up products of pairs of them, so we can take e2 = x1 x2 + x1 x3 + ⋯ + x_(n−1) x_n, and so on. These are the famous elementary symmetric functions, and you can write them all down by thinking of the x_i as roots of a polynomial: if I take (y − x1)(y − x2)⋯(y − xn), this is y^n − e1 y^(n−1) + e2 y^(n−2) − ⋯ ± en.
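That expansion can be checked by multiplying out the product one factor at a time; a small sketch (my own, with arbitrary sample roots):

```python
import math
from itertools import combinations

def expand_roots(xs):
    """Coefficients [c0, c1, ..., cn] of prod (y - x_i) = sum_k c_k y^k."""
    coeffs = [1]  # start with the constant polynomial 1
    for x in xs:
        # multiply the current polynomial by (y - x)
        shifted = [0] + coeffs                  # y * poly
        scaled = [x * c for c in coeffs] + [0]  # x * poly, padded to same length
        coeffs = [s - t for s, t in zip(shifted, scaled)]
    return coeffs

def elem_sym(xs, k):
    """The elementary symmetric function e_k evaluated at the values xs."""
    return sum(math.prod(sub) for sub in combinations(xs, k))

xs = [1, 2, 3]
coeffs = expand_roots(xs)
n = len(xs)
print(coeffs)  # [-6, 11, -6, 1], i.e. y^3 - 6y^2 + 11y - 6
print(all(coeffs[n - k] == (-1) ** k * elem_sym(xs, k) for k in range(n + 1)))  # True
```

So the coefficient of y^(n−k) is (−1)^k e_k, exactly as in the displayed identity.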
So this gives some obvious invariants, and the question is: is every invariant a polynomial in e1, ..., en? This is the basic theorem of symmetric functions, which says yes: every invariant polynomial in x1, ..., xn is a polynomial in e1, ..., en. The proof is quite easy, so I'll just give it in a couple of minutes. We define an order on the monomials: we say x1^m1 x2^m2 ⋯ is greater than x1^n1 x2^n2 ⋯ if it's bigger in lexicographic order, meaning m1 > n1, or m1 = n1 and m2 > n2, or m1 = n1, m2 = n2 and m3 > n3, and so on. Now suppose f is invariant, and look at the biggest monomial in f; suppose it's x1^n1 x2^n2 x3^n3 ⋯. All we do is subtract (x1 + x2 + ⋯)^(n1−n2) times (x1 x2 + ⋯)^(n2−n3) times (x1 x2 x3 + ⋯)^(n3−n4) and so on, in other words e1^(n1−n2) e2^(n2−n3) e3^(n3−n4) ⋯ times the leading coefficient, and this kills off the biggest monomial in f. Then you just continue: at each step we subtract a polynomial in these elementary symmetric functions and kill off the biggest monomial in f, and if we keep doing this we eventually make f zero. So this actually gives an algorithm for expressing every symmetric function as a polynomial in the elementary symmetric functions. There's one slight thing that looks a bit suspicious about this: if you look at the proof, where did we use the fact that f is a symmetric polynomial? It looks at first sight as if we've written every polynomial as a polynomial in symmetric functions. The key point is that if f is symmetric, then n1 ≥ n2 ≥ n3 ≥ ⋯, as you can easily check, and we need this because the exponents n1 − n2, n2 − n3, and so on must all be at least zero, as otherwise these things wouldn't be polynomials.
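The proof is completely effective, so it can be turned directly into code. Below is a sketch of the algorithm (my own implementation, not from the lecture); monomials are stored as exponent tuples, so Python's built-in tuple comparison gives exactly the lexicographic order used in the proof:

```python
from itertools import combinations

def poly_mul(p, q):
    """Multiply polynomials stored as {exponent tuple: coefficient}."""
    r = {}
    for m1, c1 in p.items():
        for m2, c2 in q.items():
            m = tuple(a + b for a, b in zip(m1, m2))
            r[m] = r.get(m, 0) + c1 * c2
    return {m: c for m, c in r.items() if c != 0}

def elem_sym_poly(n, k):
    """The elementary symmetric polynomial e_k in n variables."""
    return {tuple(1 if i in s else 0 for i in range(n)): 1
            for s in combinations(range(n), k)}

def in_elementaries(f, n):
    """Write a symmetric polynomial f in terms of e_1..e_n.  The result maps
    an exponent tuple (a1, ..., an) to the coefficient of e_1^a1 ... e_n^an."""
    f = dict(f)
    result = {}
    while f:
        m = max(f)  # lex-largest monomial of f; symmetry forces m[0] >= m[1] >= ...
        c = f[m]
        exps = tuple(m[i] - (m[i + 1] if i + 1 < n else 0) for i in range(n))
        result[exps] = result.get(exps, 0) + c
        term = {(0,) * n: c}  # build c * e_1^(m1-m2) * e_2^(m2-m3) * ...
        for i, e in enumerate(exps):
            for _ in range(e):
                term = poly_mul(term, elem_sym_poly(n, i + 1))
        for mon, coef in term.items():  # subtract it, killing the leading monomial
            f[mon] = f.get(mon, 0) - coef
        f = {mon: coef for mon, coef in f.items() if coef != 0}
    return result

# (x1 - x2)^2 = x1^2 - 2 x1 x2 + x2^2, as a dict in two variables:
print(in_elementaries({(2, 0): 1, (1, 1): -2, (0, 2): 1}, 2))
# {(2, 0): 1, (0, 1): -4}, i.e. e1^2 - 4 e2
```

The example recovers the identity (x1 − x2)^2 = e1^2 − 4 e2; running the same function on x1^2 + x2^2 gives e1^2 − 2 e2.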
So that shows that every symmetric polynomial, that is, every invariant of the symmetric group acting on polynomials, is generated by these. So the invariants of S_n acting on C^n form a finitely generated algebra over C: you can find a finite number of invariants such that every invariant is a polynomial in those with coefficients in C. This is maybe the first non-trivial example of invariants being finitely generated. In fact it's quite easy to check that the algebra of invariants is a polynomial ring over C on the invariants e1, ..., en; in other words there are no non-trivial relations between them, which I won't bother checking because it's very easy. This is actually very unusual: most of the time, if you've got a group acting on a space, the ring of invariants might be rather complicated. You can find a set of generators, but there might be some relations between the generators, as we will see in the next example. So I'll say this is unusual, and it tends to happen when G is a reflection group, that is, a group generated by reflections in hyperplanes; the symmetric group acting on C^n happens to be a reflection group, which is why its ring of invariants is particularly nice. Now we're going to see an example where the ring of invariants is a little bit more complicated. This time I'm going to take G to be A_n, the alternating group, and if you've forgotten what that is I'll tell you in a moment. The alternating group is a subgroup of S_n, the symmetric group, and we have the following polynomial, call it Δ, which is the product over i < j of (x_i − x_j). So it looks like x1 − x2 if n = 2, or (x1 − x2)(x1 − x3)(x2 − x3) if n = 3, and so on. Now you notice that every element of the symmetric group permutes these factors up to sign, so it changes Δ to either Δ or −Δ. So A_n is the subgroup of S_n fixing Δ, and you can see that A_n has index 2, at least if n ≥ 2, and
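The sign behaviour of Δ can be checked directly: permuting the variables multiplies Δ by the sign of the permutation, so the even permutations are exactly the ones fixing it. A quick sketch of my own, evaluating at sample values:

```python
from itertools import permutations

def delta(xs):
    """The product over i < j of (x_i - x_j)."""
    prod = 1
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            prod *= xs[i] - xs[j]
    return prod

def sign(perm):
    """Sign of a permutation given as a tuple, computed by counting inversions."""
    inv = sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm))
              if perm[i] > perm[j])
    return -1 if inv % 2 else 1

xs = (1, 2, 4)  # distinct sample values for x1, x2, x3
ok = all(delta(tuple(xs[p[i]] for i in range(3))) == sign(p) * delta(xs)
         for p in permutations(range(3)))
print(ok)  # True
```

Since the values are distinct, Δ is nonzero, so the identity really distinguishes Δ from −Δ: the three even permutations fix it and the three odd ones negate it.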
what I want to do now is ask: what are the invariant polynomials under the alternating group? Well, that's quite easy. We have the polynomials in e1, ..., en, and then Δ is also invariant, because we pretty much defined A_n to be the subgroup such that Δ is an invariant. We stated that there are no relations between e1, ..., en; however, there are relations between e1, ..., en and Δ, because you notice Δ^2 is symmetric, so it is a polynomial in e1, ..., en, and we can figure out what this is explicitly. For instance, take n = 2: then we find Δ^2 = (x1 − x2)^2 = (x1 + x2)^2 − 4 x1 x2 = e1^2 − 4 e2. When n gets larger you can still express Δ^2 in terms of these, but the expression gets kind of complicated. For instance, here I've got a picture of it for degree four polynomials: this is the discriminant of a fourth-degree polynomial, and you can see it's already getting a bit of a mess. It's two lines long, and as the degree goes up this discriminant gets worse and worse. This relation is an example of something called a syzygy. So let's sum up what happens for A_n: the ring of invariants is finitely generated by e1, ..., en, Δ, but there is a non-trivial relation, Δ^2 − (some polynomial in e1, ..., en) = 0. So here the invariant ring is no longer a polynomial ring; it's something a little bit more complicated. And it's fairly straightforward to check that this is essentially the only non-trivial relation: any other relation between Δ and the e_i is got by taking this relation and multiplying it by some polynomial. So that's an example of a first-order syzygy. However, things can get a little bit more complicated, so let's see an example of a second-
order syzygy. For this I'm going to take G to be the cyclic group of order three, and I'm going to let it act on a two-dimensional vector space in a very boring way. Suppose σ, a generator of G, satisfies σ^3 = 1, and put σ(x, y) = (ωx, ωy), where ω^3 = 1; so ω = e^(2πi/3), which is −1/2 + (√3/2)i if you're interested, which you probably aren't. Now let's try to find some invariant polynomials. It's not too difficult to figure out what all the invariant polynomials are: we notice that x^a y^b is invariant if a + b is divisible by three, because σ just multiplies it by a power of the cube root ω, which is 1 if a + b is divisible by three. So the ring of invariants is generated as an algebra by the following monomials: x^3, x^2 y, x y^2, y^3. Let's call these z0, z1, z2, z3, and it's easy to check that these generate the algebra of invariants. However, there are some relations, or syzygies, between these, because we notice that z0 z3 = z1 z2, z0 z2 = z1^2, and z1 z3 = z2^2. So here we have three syzygies. Well, things are a little bit more complicated than that, because we've not only got three syzygies. We can write the syzygies like this: we put z0 z3 − z1 z2, and call that a2; then we have z1^2 − z0 z2, and call that a3; and we have z2^2 − z1 z3, which I'm going to call a1. Now we notice that a1, a2 and a3 are related, because z1 a1 + z2 a2 + z3 a3 = 0. So the a_i are the first-order syzygies, and this is a relation between syzygies, a sort of second-order syzygy. So things are beginning to be a
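These relations, and the second-order syzygy tying them together, can be verified as polynomial identities. A sketch of my own, storing monomials in z0, ..., z3 as exponent 4-tuples:

```python
def pmul(p, q):
    """Multiply polynomials in z0..z3, stored as {exponent 4-tuple: coefficient}."""
    r = {}
    for m1, c1 in p.items():
        for m2, c2 in q.items():
            m = tuple(a + b for a, b in zip(m1, m2))
            r[m] = r.get(m, 0) + c1 * c2
    return {m: c for m, c in r.items() if c != 0}

def padd(*ps):
    """Add polynomials, dropping zero coefficients."""
    r = {}
    for p in ps:
        for m, c in p.items():
            r[m] = r.get(m, 0) + c
    return {m: c for m, c in r.items() if c != 0}

def neg(p):
    return {m: -c for m, c in p.items()}

def z(i):
    """The variable z_i as a polynomial."""
    return {tuple(1 if j == i else 0 for j in range(4)): 1}

# the three first-order syzygies from the lecture
a1 = padd(pmul(z(2), z(2)), neg(pmul(z(1), z(3))))  # z2^2 - z1 z3
a2 = padd(pmul(z(0), z(3)), neg(pmul(z(1), z(2))))  # z0 z3 - z1 z2
a3 = padd(pmul(z(1), z(1)), neg(pmul(z(0), z(2))))  # z1^2 - z0 z2

# the second-order syzygy: z1 a1 + z2 a2 + z3 a3 is identically zero
print(padd(pmul(z(1), a1), pmul(z(2), a2), pmul(z(3), a3)))  # {}

# and substituting z_i = x^(3-i) y^i makes each a_i vanish, as it must:
def ev(p, x, y):
    vals = (x**3, x**2 * y, x * y**2, y**3)
    prods = [c * vals[0]**m[0] * vals[1]**m[1] * vals[2]**m[2] * vals[3]**m[3]
             for m, c in p.items()]
    return sum(prods)

print(ev(a1, 2, 5), ev(a2, 2, 5), ev(a3, 2, 5))  # 0 0 0
```

The empty dict at the end confirms that the second-order syzygy holds coefficient by coefficient in the polynomial ring, not just at points.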
bit more complicated. In order to see what's going on, let's draw a picture. We've got a map from the polynomial ring k[z0, z1, z2, z3] onto the ring of invariants; let's call this polynomial ring R so that I don't have to keep writing it out. This map has a non-trivial kernel, which is an ideal, and this ideal is generated by the three elements a1, a2, a3. So we can map a1 to whatever it was, z2^2 − z1 z3, and a2 and a3 go to those other expressions; this gives a free module of rank three over R mapping onto the ideal of relations, which is a submodule of R. So this is a map of modules from one module to the other. And the relation between a1, a2 and a3 can be thought of as a map from a free module on one generator, call the generator b, to R^3: this maps b to z1 a1 + z2 a2 + z3 a3. Now what we have here is something called an exact sequence. An exact sequence means that the kernel of each map is exactly the image of the previous map. That's just saying that b gives you all the relations between a1, a2 and a3, so that any other relation between them is got by multiplying this one by an element of the ring R; and similarly a1, a2 and a3 form a basis of all the relations between z0, z1, z2 and z3. This is pretty much what happens in general. Suppose we've got a ring of invariants of some group acting on some vector space. We have a map from some polynomial ring, say k[z0, z1, ...], onto the invariant ring, and if I call the polynomial ring R, then we have some relations between the images of the z_i, which are syzygies, and we might be able to find some syzygies forming a basis of these, so we have some map from a free module R^m onto them, and
then there might be a map from some other free module onto the kernel of that, and from some other free module onto the kernel of that, and it might go on and on like this. So these are the first-order syzygies, these are the second-order syzygies, and so on; as you can imagine you can get higher orders, but I'm not going to write anything out because it really gets to be a bit of a mess. We can now ask the following questions. First of all, is R finitely generated as a k-algebra? In other words, can we find a finite number of elements z_i? Then we can ask: is the module of first-order syzygies finitely generated as an R-module? In other words, can we take the number m in R^m to be finite? Notice the contrast: we want the ring R to be finitely generated as an algebra, but we only want these to be finitely generated as modules. To see the difference, notice that, say, a polynomial ring k[x] is finitely generated as an algebra over k, because it's generated by x, but it's not finitely generated as a module over k, because you need an infinite number of powers of x to span it as a module. Then we can ask whether the module of second-order syzygies is finitely generated as an R-module: if we've got a finite number of first-order syzygies, we can ask whether the second-order syzygies relating them are also finitely generated, and so on. Actually, from this you can sort of see why a syzygy is called a syzygy: for instance, the second-order syzygy b is kind of yoking together the first-order syzygies a1, a2 and a3. A yoke ties several oxen together, and here the syzygy is tying several polynomials together, in a way that isn't really at all like oxen being tied together, but whatever. And then finally we can ask: does this sequence of free R-modules go on forever? You can see that in this particular example it stopped after two steps, because this map here was injective,
so we can ask: is this chain of modules finite? We've got three different sorts of finiteness questions: we want to know if the invariant ring is finitely generated as an algebra, we want to know if these modules of syzygies are finitely generated as modules, and we want to know if this long chain is finite in length. Hilbert showed that the answer is yes if G is reductive and your field k has characteristic zero; well, I guess he was working over the complex numbers, but it's close enough. What does reductive mean? I'm not going to worry about that too much, because we're not going to do the general case of reductive groups; we're just going to do the special case when G is a finite group and k has characteristic zero. So over the next few lectures we will be developing the commutative algebra necessary to prove at least the first two of Hilbert's theorems; the third one may or may not come later in the course, depending on what I feel like. In the next lecture we will have some more examples of invariant rings, which are rather closer to the problem that Hilbert was actually looking at.