Okay, right, so hello everyone, welcome to the graduate student online seminar, summer 2021. Before we get started I just want to say a couple of quick things. Despite the words "graduate student" being in the title, attendance is open to pretty much everyone, and people's backgrounds can vary a lot, so I'd like to ask people to be respectful of your friends and colleagues here during the seminar. That said, please do feel free to ask questions; the primary goal of these kinds of seminars is for it to be a nice, positive learning experience for everyone involved, and asking questions definitely helps. In the spirit of a usual in-person seminar, it's perhaps easiest, if you do have a question or would like a clarification or anything like that, to just go ahead and unmute yourself and politely interrupt. You can also post your questions in the chat; I'll try to keep an eye on it and I can relay any questions to the speaker. And then we will also leave a little bit of time at the end, if you have maybe a longer question or want a more involved or detailed explanation. So with that being said, it's my great pleasure to introduce Dan Summers, who will be talking to us about "RSK: A Risky Proposition."

Alright, so, thank you for the invitation to talk at the seminar. Clearly my date is wrong on my title page; I'll admit to not being able to find the TeX file this morning, so here's a PDF of the same talk I gave three years ago. A little background on me first. I finished my PhD at Drexel in 2019. I'm an algebraic combinatorialist, specifically in symmetric function theory, which is something I'll be talking about today. It's a branch of math that hopes to solve some kinds of hard algebraic geometry questions by just counting things, which makes me happy. I'm no longer associated with Drexel University; that's where I did my PhD. I currently teach high school.
I kind of bailed on academia, burnt out a little bit on research, and now I'm very happily teaching at a school in New York, and one of my students is a participant in today's talk, so there's a win right there. So, kind of the theme of this talk is the power of smashing different branches of math together. One example of this is in calculus: how do we find the perimeter of a circle? Well, we're good at polygons; we just add up all the sides. Circles, unfortunately, don't have sides, so I can't just add them all up, but if I take the limit of the polygons that we put around the circle, we can figure out the perimeter of the circle. So here's one fruitful part of math where we kind of smushed two different branches together, and this is true throughout a lot of mathematics. I like to do combinatorics — I'm an algebraic combinatorialist, but the "algebra" part of that sentence is really... I don't do that. I do the combinatorics: we count combinatorial objects that mean something in algebra. I know what some of the algebra words mean, but I don't chase diagrams, and I don't really want to know what a category is, because it scares me a little bit.

So let's just fix some definitions for today. A bijection is a one-to-one and onto function. Some basic examples we've seen in our lives before: "multiply everything by two" is a bijection from the natural numbers to the even natural numbers. There's a natural bijection from C to R × R, just by getting rid of the i and making the plus sign a comma. And then for finite sets of the same size, match them up however you'd like. So I think of combinatorics as the study of bijections on finite sets. A lot of people say combinatorics is the study of counting, but if you think about it, counting a set is just a bijection between that set and a subset of the natural numbers.
So I think of combinatorics as the study of bijections, because really, I deal with infinite sets, so we're not actually counting. So what kind of bijections do we care about? I care about bijections that mean something in algebra. And what kind of algebraic objects do I care about? Symmetric polynomials are the basic version of this. A symmetric polynomial in the variables x_1 through x_n is a polynomial where, if you exchange the roles of any two variables, you get the same polynomial back. You can also think of this as a group action of S_n: S_n permutes the variables, and each of these polynomials is invariant under all the permutations in S_n. So with two variables, here's an example: if I switch the roles of x_1 and x_2, it's the same polynomial, which I show here. Same with four variables. Now, the last one is a symmetric polynomial in five variables. However, you might notice that I just wrote the same polynomial down twice — and with five variables, this last one is not symmetric. The reason is that if I exchange the roles of x_5 and x_2, I get a different polynomial. So we need to take into consideration all the indeterminates in our polynomial ring; the number of variables you're considering can throw you for a loop. There are ways to take care of that, but we'll get there.

So the elementary symmetric polynomial of degree d in n variables is defined by this kind of gross sum, which is iterated over a bunch of i's. But the best way to think about it is: you add up all the ways to pick d of the x_i's, and you can't pick one twice. If you notice, between all the i's in the subscript of that sum are strict less-thans.
Right, so that means you can't pick the same one twice, so we're just adding up all ways to pick d of the x_i's without picking one twice — all monomials in d indeterminates where the power pattern is all ones. So those are the elementary symmetric polynomials. Then we have the homogeneous symmetric polynomials, which I always say "homogenous" and people make fun of me. The definition is almost exactly the same — these are real nice and easy to TeX up, you just change your less-thans to less-than-or-equal-tos — and it's all ways to pick d of the x_i's, where we can pick each one more than once. So these are two types of symmetric polynomials.

Right, I'm a combinatorialist, and we like our examples, so here are the definitions in action. e_2 in x_1, x_2, x_3: that's all ways to pick two of the indeterminates, and I can't pick anyone twice, so that's x_1x_2 + x_1x_3 + x_2x_3. Notice that if I exchange the roles of any two of the variables, we get the same polynomial. The fact that the e's in particular are symmetric — I mean, it's an easy fact to prove, but I don't think it's inherently obvious from the definition; the h's maybe more so. These are both symmetric. h_2: we get to pick two of x_1, x_2, x_3, but I can pick something more than once. So it's e_2 — that x_1x_2 + x_1x_3 + x_2x_3 — plus the ways to pick the same thing twice. So those are e's, those are h's. Now I do h_3 in x_1 and x_2, and this is a little weird. Did I make a mistake? Hmm. It's all ways to pick three of the x_1's and x_2's... Yeah, I made a mistake there: that should be plus x_1 cubed plus x_2 cubed. I need to pick three of the x_1's and x_2's, and I'm allowed to pick them more than once. That example isn't particularly important; this is the important one. If I want to compute e_3 of x_1, x_2, notice I'm not allowed to use a variable more than once. So this guy is defined to be one — right, zero, excuse me — this is defined to be zero: there's no way to
choose three indeterminates without choosing one twice. So we have another issue here — we talked before about the issue with the number of variables — and now some of these guys aren't really well defined, or a lot of them end up just being zero. How do we deal with that? Well, we switch to infinitely many variables. Now these are no longer polynomials; we call them symmetric functions, even though we're never plugging anything in — we're not thinking of them as functions that map from one set to another. We call them symmetric functions when we have infinitely many variables; it's just a bit of terminology that's been around a while and that we held on to for some reason. You'll notice the only thing that changed in my definitions is that there's no "less than or equal to n" at the end of the sum; we have infinitely many variables. And when we have infinitely many variables we leave off the variable list and just write e_d and h_d. So here are some examples: e_2 is all ways to pick two variables; h_2 and h_3 are the same kind of idea. And in e_3 you can get some bizarre terms — I picked one example at the bottom: one term of e_3 will be x_3 · x_43 · x_12345. That has to appear in there; it's all ways to choose three of the infinitely many variables.

So those are some families of symmetric functions. Give me a second to take a peek at the chat... Any questions so far? I know I went kind of fast, but this is the more basic stuff at the beginning. Right. So, these are great compact definitions, but they're pretty difficult to work with — like, if you want to multiply two of these things together, that's some giant interlacing thing, and I don't know, that sounds terrible.
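The two definitions so far can be sketched directly in Python — a minimal sketch, not from the talk, where a polynomial is a `Counter` mapping exponent vectors to coefficients, and the only difference between e_d and h_d is whether indices may repeat:

```python
from itertools import combinations, combinations_with_replacement
from collections import Counter

def elementary(d, n):
    """e_d in n variables: sum over all ways to pick d DISTINCT indices."""
    poly = Counter()
    for idx in combinations(range(n), d):
        expt = [0] * n              # exponent vector for x_1 .. x_n
        for i in idx:
            expt[i] += 1
        poly[tuple(expt)] += 1
    return poly

def homogeneous(d, n):
    """h_d in n variables: same sum, but indices may repeat."""
    poly = Counter()
    for idx in combinations_with_replacement(range(n), d):
        expt = [0] * n
        for i in idx:
            expt[i] += 1
        poly[tuple(expt)] += 1
    return poly

print(len(elementary(2, 3)))   # 3 monomials: x1x2 + x1x3 + x2x3
print(len(homogeneous(2, 3)))  # 6 monomials: e_2 plus the three squares
print(len(elementary(3, 2)))   # 0: e_3 in two variables is the zero polynomial
```

The last line is exactly the "defined to be zero" case from the slide: there is no way to choose three distinct indices out of two.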
So how do we encode this in combinatorics? This is kind of the power of my field. One more definition: we call a partition of an integer n a string of non-increasing positive integers whose sum is n. We write λ ⊢ n, with this weird little sideways T, and we call each λ_i a part of λ. So here are all the partitions of five: each is a string that adds up to five, and the entries can't get bigger as you go — they can get smaller or stay the same, but they can't get bigger. And for finite partitions, you can always pad with infinitely many zeros; that's not a big deal.

So we associate with each partition a Young diagram by taking the parts and drawing λ_i boxes in the i-th row. There are a lot of words here; it's just easier with examples. We're using what's known as English notation. The first part of this partition is four, so we put four boxes in the first row; the second part is also four, so four boxes in the second row; and then three boxes, two boxes, two boxes, one box. So this is the Young diagram associated with the partition λ; it has to be upper-left justified. And here's the Young diagram associated with the partition μ: three boxes, three boxes, one box, one box. We use these diagrams to encode our combinatorics, and it turns out they end up being pretty useful. And here's one more example with ν — I think I was just showing off my knowledge of Greek letters at this point, I'm not really sure why I needed another example.

Okay, so what is a semistandard Young tableau? Well, we're going to take the boxes and we're going to put numbers in them, so that as we go across a row from left to right, the integers can stay the same or get bigger, and as we go down a column, the integers have to get bigger. And we call the content of a semistandard Young tableau the string whose i-th entry counts the number of i's in the tableau. So let's look at this example here.
If you look at this one, you'll notice that if you read across the rows, each row either stays the same or gets bigger, and if you read down the columns, each column gets bigger. So yes, it's semistandard. And here's the content: the tableau has three 1's, three 2's, four 3's, zero 4's, three 5's, one 6, and two 7's. Here's another example — I'll let you guys take a peek: is this semistandard? Are the columns increasing and the rows non-decreasing? You can probably see pretty quickly that yes, it is, and the content is three 1's, two 2's, two 3's, two 4's, two 5's, and a 6. And finally, since I'm sure you were waiting for it, the third one is a non-example: this guy is not a semistandard tableau. If you look at it, there's one place that's not semistandard, and that's the 2 in the first row — that 2 comes before a 1. Actually there are two places — can anyone find a second place where this thing fails to be semistandard? Three places, you say? Ah man, world's greatest example. This column: it's 1 on top of 1, and then the second column is 2 on top of 2. Yeah, good — there are equal numbers on top of each other, and remember, you have to increase as you go down the columns.

So, again, why do I care? We're going to use these guys to define more types of symmetric polynomials. Given a list a of non-negative integers, we define x^a as x_1 to the first entry of the list, times x_2 to the second entry, and so on. And then we can define the Schur function of a partition — this is the same Schur that pops up all over the place if you took an advanced linear algebra class, Issai Schur. So here's our weird definition: s_λ is the sum, over all semistandard Young tableaux T of shape λ, of x^(content of T).
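The two semistandard conditions and the content statistic from the last few slides can be sketched in a few lines of Python — the helper names here are my own, not from the talk, and a tableau is a list of rows with the top row first (English notation):

```python
from collections import Counter

def is_semistandard(tableau):
    """Rows must weakly increase left to right;
    columns must strictly increase going down."""
    for row in tableau:
        if any(row[j] > row[j + 1] for j in range(len(row) - 1)):
            return False                      # a row decreased
    for r in range(len(tableau) - 1):
        upper, lower = tableau[r], tableau[r + 1]
        if any(upper[c] >= lower[c] for c in range(len(lower))):
            return False                      # a column failed to strictly increase
    return True

def content(tableau):
    """content[i-1] = number of i's appearing in the tableau."""
    counts = Counter(v for row in tableau for v in row)
    return [counts[i] for i in range(1, max(counts) + 1)]

print(is_semistandard([[1, 1, 2], [2, 3]]))  # True
print(is_semistandard([[1, 1], [1, 2]]))     # False: a column repeats a 1
print(content([[1, 1, 2], [2, 3]]))          # [2, 2, 1]: two 1's, two 2's, one 3
```

The second call fails for exactly the reason in the non-example on the slide: equal numbers stacked in a column.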
But let's see an example to make a little more sense of this. So s_{21}: that means I have shape (2,1), and I want the variables x_1, x_2, x_3 — we're going to restrict to finitely many variables, just so we have finite examples. So I want to fill the boxes with the numbers 1, 2, and 3 so that we have a semistandard Young tableau. Here are the eight ways to do it: these are all eight semistandard Young tableaux of shape (2,1) in the letters 1, 2, and 3. Each of these is associated with a monomial. The first guy on the top left, with entries 1, 1, 2, is x_1 squared x_2. Then we have 1, 1, 3, that's x_1 squared x_3. Then we have 1, 2, 2, that's x_1 x_2 squared. We have 1, 2, 3 and 1, 3, 2; those give two copies of x_1 x_2 x_3. And so on. And if you look at the polynomial we get, this guy is symmetric. It's not clear at all from this definition that Schur functions need to be symmetric. There's another definition in terms of determinants, which gives you a really nice clean proof that Schur functions are symmetric; if you want to do it from the tableau definition, you might look up something called the Bender–Knuth involution, which will show that these are symmetric — not particularly important for this talk. But this is symmetric. If you want the Schur function in infinitely many variables, we just need more tableaux — infinitely many tableaux — so it gets messy, but we can do it.

One more example: s_2 in x_1, x_2, x_3. The shape is just two boxes in the first row. Here are all the tableaux, and there's our polynomial. Anyone recognize this polynomial? Think about how I'm allowed to put numbers in the boxes — I mean, we've only talked about three types of polynomials today, so which other one would follow the same kinds of rules? Is it the h polynomial? It's the h's, right — it's the homogeneous one, because for h_2 I had to pick two variables, but I'm allowed to pick the same one twice.
Right, and that's all these tableaux are: I need to pick two numbers, but I'm allowed to pick the same one twice, because they live in the same row. That's h_2. Similarly, s_{11} is going to be e_2: I need to pick two numbers, except they can't be the same, because they live on top of each other. So the homogeneous and the elementary guys live inside the Schur functions — the Schur functions are a larger class of symmetric functions.

So who cares, right? We could mush symbols around; it's really not a big deal. We're getting to it. Combinatorialists are weird: we call numbers letters. So a word in the letters of a set A — all our letters come from the natural numbers — is a finite list of entries from the set A. Here's a word; we don't put commas, because... why? It's a different thing. And here is kind of the main bit of the talk: there's a bijection between words and pairs of tableaux, where one is semistandard and one is known as standard. Now, we haven't defined a standard Young tableau yet, so let's do that here. There are two ways to think about a standard Young tableau; I think the easiest is that it has to be semistandard, and you have to use each of the numbers up to the number of boxes exactly once. So if we have eight boxes, it's a semistandard Young tableau where I use each of the numbers from one to eight once. Here's an example: this guy is semistandard, and it's standard because each number is used exactly once, and we use all the numbers, starting at one and ending with the number of boxes, which is eight.

Okay. So if we take a word, we can create a semistandard and a standard Young tableau, and if I have the semistandard and the standard Young tableau, I can recover the word. This is indeed a bijection, and it's known as the Robinson–Schensted–Knuth algorithm — the RSK algorithm — making the name of this talk a risky proposition. I love bad wordplay.
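Before the insertion algorithm, the Schur examples above are easy to check by brute force — a sketch under my own naming, not from the talk, that tries every filling of a shape and keeps the semistandard ones:

```python
from itertools import product

def ssyt_of_shape(shape, n):
    """All semistandard fillings of a partition `shape`
    (top row first) with entries from 1..n, by brute force."""
    cells = [(r, c) for r, length in enumerate(shape) for c in range(length)]
    tableaux = []
    for filling in product(range(1, n + 1), repeat=len(cells)):
        t = dict(zip(cells, filling))
        rows_ok = all(t[(r, c)] <= t[(r, c + 1)]
                      for r, c in cells if (r, c + 1) in t)
        cols_ok = all(t[(r, c)] < t[(r + 1, c)]
                      for r, c in cells if (r + 1, c) in t)
        if rows_ok and cols_ok:
            tableaux.append([[t[(r, c)] for c in range(length)]
                             for r, length in enumerate(shape)])
    return tableaux

print(len(ssyt_of_shape((2, 1), 3)))  # 8, the eight tableaux on the slide
print(len(ssyt_of_shape((2,), 3)))    # 6, matching h_2 in three variables
print(len(ssyt_of_shape((1, 1), 2)))  # 1, matching e_2 in two variables
```

The three counts line up with the slide: eight tableaux for s_{21}(x_1, x_2, x_3), and the s_2 = h_2 and s_{11} = e_2 identifications.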
So, what's the insertion algorithm? That's a lot to read; if you get the slides afterwards you can go through it more slowly later. How do we insert the word a_1 a_2 a_3 ... a_n? Well, it's an inductive process. We start by creating a tableau with one number in one box: the first letter of our word, a_1. After we've inserted i letters, how do we insert a_{i+1}? We have some tableau. If a_{i+1} is at least as big as every number in the first row, just put it at the end of the first row, and you're done. If it's not, you find the leftmost entry in the row which is greater than the thing we're trying to insert, replace it with the thing we're trying to insert, then pretend the first row isn't there and do the same thing with the bumped entry in the second row, and continue until you're just putting something at the end of a row. Stop when you've inserted every letter. Trying to read the method like this is kind of gross, so let's just look at an example.

We want to make the word 23211 into a tableau. When I insert the number 2, I just put it into the tableau; there's nothing to do. And my Q tableau, my recording tableau, just records the way the boxes are being added: we added one box, and there it is. Now we want to insert the number 3, the second letter of my word. The number 3 is bigger than every entry in the first row of my P tableau, so it just gets stuck at the end, and again my recording tableau just records the way the boxes are being added. All right, so now I have the row 2, 3 and I want to insert the number 2. Well, 2 is not bigger than every number in row one, so what do I do? I find the leftmost instance of a number bigger than 2, which is that 3. I take the 3 out, I replace it with the 2, I pretend the first row doesn't exist, and I do the steps of the algorithm again.
I pretend the first row doesn't exist — there's no tableau there — so I just put the 3 in the second row, and that's the third box I created, so there's my Q tableau. Right, now what do I do with the 1? Well, 1 is not bigger than every entry in row one, so it replaces the leftmost number greater than it. Now I've got to do something with that 2: pretend row one's not there. That 2 in row two has to replace the leftmost entry which is greater than it, that 3. And then I still have to put that 3 somewhere, but I pretend the first two rows don't exist, so I just pop it down there at the bottom. And then this last 1: I look, and there is an entry in row one bigger than it — that 2 — so the 1 bumps it, and the bumped 2 just gets popped on at the end of row two. So this is the Robinson–Schensted–Knuth algorithm for taking a word and making two tableaux out of it, one which is semistandard and one which is standard. Knowing P and Q, you can reverse-engineer this thing: I know the fifth box was the last one added, and I can figure out how to undo each step. We're not going to do that here today; I don't care so much about the recording tableau.

I can also start with an existing tableau and insert a word into it — it's the same process. So we start by inserting that 1: in the first row it bumps out that 2; that 2 in the second row is going to bump out that 3; and that 3 just goes down there. After I've inserted that, I have the word 123 left: that 1 bumps out that 2, which can just go at the end of the next row; that 2 can just go at the end; and that 3 can just go at the end, and I have nothing left to insert. So this is just an insertion algorithm — we can take a word and smush it into a tableau. I know that was a lot. Oh, geez. Anyone have any questions about that algorithm? No? Okay.
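The row insertion just described fits in a dozen lines — a sketch with my own function name, where `bisect_right` finds the leftmost entry strictly greater than the letter being inserted:

```python
import bisect

def rsk(word):
    """Robinson-Schensted-Knuth row insertion: word -> (P, Q).
    P is the semistandard insertion tableau, Q the standard
    recording tableau (which box was added at each step)."""
    P, Q = [], []
    for step, a in enumerate(word, start=1):
        r = 0
        while True:
            if r == len(P):               # ran off the bottom: start a new row
                P.append([a])
                Q.append([step])
                break
            row = P[r]
            j = bisect.bisect_right(row, a)   # leftmost entry strictly > a
            if j == len(row):             # a is >= everything: append to the row
                row.append(a)
                Q[r].append(step)
                break
            row[j], a = a, row[j]         # bump, carry the old entry downward
            r += 1
    return P, Q

P, Q = rsk([2, 3, 2, 1, 1])
print(P)  # [[1, 1], [2, 2], [3]]
print(Q)  # [[1, 2], [3, 5], [4]]
```

Running it on the talk's word 23211 reproduces the example step by step, and the final P is semistandard while Q uses each of 1 through 5 exactly once.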
So, what happened? Well, I took a tableau — excuse me, a Young diagram — of shape (3,2,1) and I created one of shape (5,3,2). And notice where the boxes were added: they were all added along the outside of the shape, and in this example none of them lie on top of each other. They're either in the same row, or a new box lives on top of part of the old tableau — but no two new boxes are ever in the same column. This is not an accident; this is an important part of what's going on today. If you work hard enough, you'll notice that if your word is non-decreasing — which is what we had here, our word 1123 is non-decreasing — then the new boxes never lie on top of each other. And this is one of the bijections that we care about. What this is going to do is define a bijection between pairs consisting of a tableau and a non-decreasing word, and tableaux which differ from the original by adding boxes so that no two added boxes lie on top of each other. So we have this pair, a tableau and a non-decreasing word, and it lives in bijection with new tableaux where the added boxes never lie on top of each other.

Now, what is a non-decreasing word of length d? Well, in the terminology we introduced today, it's a semistandard Young tableau of shape (d): all the boxes in the first row. And furthermore, this algorithm gives us the fact that the numbers never change: the new tableau has the same numbers as the old two objects combined. So hopefully we can get some kind of algebraic fact out of this. And the fact is this: take a Schur function of shape λ and multiply it by a Schur function of shape (d), where (d) is just a single row — just one number. So think about it.
s_λ: we're summing over all the semistandard Young tableaux of shape λ. s_d: we're summing over all the semistandard Young tableaux of shape (d). The distributive property says we're going to have to pair all these things up in all possible ways — that's how multiplication works, it's really just a pairing, kind of an interlacing. And from this algorithm we know how to pair them. So let's see an example. We want to do s_{11} times s_2. s_{11} in two variables is x_1x_2; s_2 in two variables is x_1² + x_1x_2 + x_2². If I want to multiply these, I just need to think about it in terms of tableaux. So here's the tableau version: multiplying them out with the distributive property is the same as writing them as ordered pairs of tableaux. But RSK gives us a way to manage pairs of tableaux where the second one is just a row — it can create a new tableau from each pair. So let's go from pairs of tableaux to single tableaux via RSK. This third list right here is just the insertion of each one-row tableau into the one-column tableau, which tells us that s_{11} times s_2 is just equal to s_{31}, because we get all tableaux of shape (3,1) in the variables x_1 and x_2. So this is what RSK tells us: there is a natural way to pair up pairs of tableaux, in this kind of restricted case, with single tableaux of a different shape.

This is known as the Pieri rule: s_λ times s_d is the sum of all Schur functions whose shapes are created from λ by adding d boxes, no two of which lie on top of each other. Now, the specific example here is really small because there are only two variables, and that's not always how it works out, so let's do the Pieri rule in three variables. If I want to do s_{21} times s_2, I've got to add two boxes to the shape (2,1) so that the two boxes never lie on top of each other.
The ways to do that are shown there: I either add both boxes to the first row; or one to the first row and one to the second row; or one to the first row and one in a new third row; or one to the second row and one in a new third row. So, by the Pieri rule, that's what the answer should be. And let's check it. Here's s_{21}, which we get by just writing out all the tableaux; here's s_2, again gotten by writing out all the tableaux. And if I want to multiply these out, that's 48 terms — I don't want to do that. So this combinatorial trick actually saves us a whole lot of time, and this is a lot of what the study of symmetric functions relies on: how can I expand products of symmetric functions into Schur functions? So I have a problem to leave you with: what if I want to do s_λ times e_d — that's e_d as the Schur function whose shape is a single column? Similar methods work, but I'll leave it to you to figure out.

And this is really kind of the end of my talk, but let me say a little about why people care about Schur functions. Schur functions originally came up in the study of the representation theory of the symmetric group: if you want to figure out how to play around with irreducible representations, it turns out it's easier to just deal with Schur functions, because there's a nice correspondence between the Schur functions and the irreducible representations of the symmetric group. And from there a kind of big field of study came out of it. They pop up in geometry, particularly when studying the Grassmannian and the affine Grassmannian — I believe the quantum cohomology of one of those has to do with Schur functions — and they pop up again over in Schubert calculus. They seem to be inextricably linked to hard geometric problems. Though to me, this all feels like doing adult Sudoku.
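That Pieri computation can be verified without doing the 48-term multiplication by hand — a brute-force sketch (my own helper names, not the talk's) that builds each Schur polynomial from its semistandard fillings and compares both sides as monomial counts:

```python
from itertools import product
from collections import Counter

def schur(shape, n):
    """s_shape(x_1..x_n) as a Counter {exponent vector: coefficient},
    computed by brute-forcing all semistandard fillings."""
    cells = [(r, ln) for r, length in enumerate(shape) for ln in range(length)]
    poly = Counter()
    for f in product(range(1, n + 1), repeat=len(cells)):
        t = dict(zip(cells, f))
        rows_ok = all(t[(r, c)] <= t[(r, c + 1)]
                      for r, c in cells if (r, c + 1) in t)
        cols_ok = all(t[(r, c)] < t[(r + 1, c)]
                      for r, c in cells if (r + 1, c) in t)
        if rows_ok and cols_ok:
            e = [0] * n
            for v in f:
                e[v - 1] += 1            # record the monomial x^content
            poly[tuple(e)] += 1
    return poly

def multiply(p, q):
    """Product of two polynomials stored as exponent-vector Counters."""
    out = Counter()
    for a, ca in p.items():
        for b, cb in q.items():
            out[tuple(x + y for x, y in zip(a, b))] += ca * cb
    return out

n = 3
lhs = multiply(schur((2, 1), n), schur((2,), n))
rhs = schur((4, 1), n) + schur((3, 2), n) + schur((3, 1, 1), n) + schur((2, 2, 1), n)
print(lhs == rhs)          # the Pieri expansion checks out
print(sum(lhs.values()))   # 48: the 8 x 6 term pairs we avoided writing out
```

The four shapes on the right are exactly the four box-additions listed above: (4,1), (3,2), (3,1,1), and (2,2,1).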
They really are a very important mathematical object with a pretty rich body of study around them right now. So, I know this was to try to get you guys into, or understanding, another bit of math that maybe is not taught at your school — I have no idea if there's anyone at Georgia doing anything like this — but I think it's awesome, and I think you should read about it. If you want references, I'd be happy to give them to you. If you have any questions, shoot them my way. Otherwise, thanks.

Okay, thank you so much. Let's go ahead and give a round of applause — you were wonderful. Thank you. Feel free to unmute yourself and clap, or clap emoji. So, anybody have any questions?

Just to double-check, with this kind of last slide: this is giving you a way to expand out that huge product of polynomials without ever having to do the multiplication, right? You just know it's some combination of other ones?

Right, so, yeah, it turns out that the ring of symmetric functions — you can think of it as a ring, and you can also think of it as a vector space over Q. The Schur functions themselves form a basis for this vector space. So it gives you a way to do this ring thing, this multiplication, and just expand in terms of a very nice natural basis of the vector space.

And if you're maybe only concerned about some bounded number of these — maybe you only ever want to go up to a certain degree — could you actually write down some change-of-basis matrix?

Yeah, yeah. And the entries in that change-of-basis matrix are a lot of really cool combinatorics. I guess we have a little more sophisticated audience here than the last time I gave this talk, so: the h's and the e's themselves don't give a basis for the vector space, but if you take products of them, they give you a basis as well.
So the natural basis would be: fix a monomial — say you want x_1 x_2 squared — and then you just add up all the ways to do x_1 x_2 squared. Those are the monomial symmetric functions. That's kind of the easiest basis, but they're like the worst to compute with. So the h's and the e's are kind of the second easiest to compute with, maybe a little easier to find than the Schur functions, but again, you need to take products of them to get a basis. The Schur functions themselves are a basis. Any linearly independent set of which there are "partition-many" — that's probably not the correct term, but I think you get what I mean — is a basis. It is an infinite-dimensional vector space, so it's not always easy to poke around with. It's really cool.

Would you say it falls under just algebraic combinatorics, like strictly in that realm?

So, yeah — I mean, algebraic combinatorics is people that do a lot of different stuff, a lot of which I don't understand. We're the symmetric function theory crew, so if you want to learn more about this stuff, Google "symmetric function theory" rather than "algebraic combinatorics." Algebraic combinatorics also includes people that do hard algebra stuff with Hopf algebras, and again, that's not my jam; I like to count things. I sat in on some talk once — I don't even know, it was like a mini-course, and I didn't even get a half hour into it before I was completely lost, and it was by an algebraic combinatorialist, just some guy doing hard algebra instead of moving boxes around. It's a small field, but it's kind of very active, and it's a wonderful research community — I've never met nicer people than I have in algebraic combinatorics. But we're a smaller group; most of us are in Philadelphia or San Diego.
There are a few of us elsewhere — I know someone at Virginia Tech, I know someone at Virginia — but, you know, it's a good group. Any other questions before I stop the recording? Okay, let me go ahead and thank Dan once more. Thank you. Thank you.