So, great. So thank you, Michael and Alec, for the kind words. Thank you, Seth and John, for the wonderful introduction and all the memories. And thank you all for being here. I know it's a busy time; there is the FOCS deadline today, which is one of our key conferences, but many of you still made it, so great. I would like to begin by thanking Charlotte Fischer for the generous gift to the university endowing this chair. I would also like to thank my family, my wife Agate, and Julia and David, for all their love and support.

So it's really a great honor to hold the Patrick Fischer chair. As we already heard, Patrick Fischer was an amazing researcher, one of the early pioneers of the field, who did great work in database theory and complexity theory. But what's even more amazing is that he was a great visionary and a real leader in the field. He founded the special interest group called SIGACT, which formed the community for algorithms and computation theory. This was back in 1968, when theoretical computer science was just beginning and was still considered an esoteric area of mathematics, and he had the vision to see its potential and bring the community together. Then, in 1969, he founded the STOC conference, which we still live and breathe by: there are two flagship conferences in the field, FOCS and STOC. He was the chair for its first five editions, and he really steered STOC to become what it is today.

So, really impressive work, and even more amazing, he leaves a legacy of truly outstanding researchers. Dennis Ritchie, the creator of the C language, in which Unix was programmed, and a Turing Award winner, was a student of Patrick Fischer. So were amazing complexity theorists like Arnold Rosenberg and Albert Meyer, who in turn had their own descendants, many of them names we know well these days: Nancy Lynch, Leonid Levin, and so on. I was looking it up, and according to the Mathematics Genealogy Project he has 415 descendants, which, as we all know, is probably a vast undercount. So it's a really impressive lifetime of work.

So let me talk today a bit about algorithms, the area I work in. Algorithms have been around almost as long as human thought. The first algorithms, surprisingly, were already known to the Babylonians about 4,000 years ago. Here is a tablet where they worked out an algorithm for computing the square root of two, very quickly and to very high accuracy (a small sketch of the iteration appears below). Algorithms were then discovered by several civilizations over the years, starting with the Babylonians, the Greeks, and so on. In fact, the word "algorithm" comes from the last name of Al-Khwarizmi, a very influential scholar of the medieval Islamic world, and a very interesting character if you read about him. The word "algebra" comes from a book he wrote called Al-Jabr, back in the 9th century, and he wrote an amazing list of other books and founded whole fields.

So that was the early history. But the field of algorithms really started taking off in the first half of the 20th century, when the first computing machines started coming into the picture. These early machines, as you all know, were huge, clunky, very slow, and had very limited memory. So if you wanted to get anything useful done, you had to be smart about how you programmed these things and come up with some clever ways.
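Coming back to the tablet for a moment: the iteration usually associated with Babylonian square-root computation is simple enough to state in a few lines. Here is a minimal sketch in Python (my own illustration, with a made-up function name; it is not from the talk or the tablet itself): start from any positive guess and repeatedly average it with n divided by the guess.

```python
def babylonian_sqrt(n, iterations=6):
    """Babylonian (Heron's) method for the square root of n > 0.

    Repeatedly average the current guess x with n / x; each step
    roughly doubles the number of correct digits.
    """
    x = float(n)  # any positive starting guess works
    for _ in range(iterations):
        x = (x + n / x) / 2
    return x

print(babylonian_sqrt(2))  # 1.41421356..., close to the value on the tablet
```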
And that's what really got people thinking about algorithms systematically. At the same time, there was a more abstract line of thought: people started asking what computation really is. One of the most famous moments is from the mathematical giant David Hilbert, in his famous ICM address in 1900, which set the course for 20th-century mathematics. He asked, essentially: given a true mathematical statement, can you always prove it systematically, by a sequence of steps from some axioms? Is there a systematic computation you can do to prove or disprove something? This got people thinking about what computation really means, what you can and cannot do, and we all know the impressive works of Gödel and Turing that laid the foundations of this field.

So that was happening at the theoretical level. At the algorithmic level, the focus was more on how to program these giant machines in clever ways, and people started coming up with very impressive and clever techniques to do things faster. The big breakthrough in the field came in the early 70s, when the notion of efficient computation was discovered. By efficient we mean the following: maybe you can solve something in finite time, but what you really care about is whether you can do it efficiently or inefficiently, tractable versus intractable. We have all heard of the P versus NP problem; all of that was starting around this time. In fact, Cook's P versus NP paper appeared at STOC 1971, where Patrick Fischer was the chair.

With this discovery the field really took off; this was the golden age, and it still continues. People understood the notion of tractable versus intractable, and the way I view it, this notion freed researchers from thinking about specific machines. People started reasoning about computation abstractly, in terms of resources like space and time, and very quickly they discovered that problems have a very rich structure. If you zoom in further, there is this whole thing called the complexity zoo, which still keeps evolving. This led to the discovery of a lot of rich problems and of the various algorithmic techniques needed for them, and the field grew intensely, borrowing many techniques from established areas like mathematics and logic.

I think a couple of quotes capture the feeling of that time very well. Here is a famous one by Richard Feynman: computer science is not as old as physics, but it has had a far more intense upbringing. So it's right up there. And similarly, the well-known saying that computer science is no more about computers than astronomy is about telescopes. So this was the amazing growth that was happening.

Now to another part of the world, and my own story. I got into computer science near the end of this phase, from 1995 to 1999, at IIT Bombay in India. Of course, at that time I didn't know anything about all this cool stuff; I was just a high school kid. But I liked coding and mathematical problem solving, and really the main reason I got into computer science is that all the cool kids I knew went into computer science.
And this was an age when being cool still meant you were nerdy and good at math problem solving. From what I hear from the new generation, things may have changed a bit. So anyway, I got into computer science, and we were lucky at IIT: we had several excellent faculty, many of whom had actually been faculty in the US and decided to come to India, or were on visiting positions for a while. My advisor, Abhiram Ranade, was at UC Berkeley for a while and then decided to move to India. It was really great to see these excellent researchers teaching cutting-edge material, and many of us got hooked. In fact, at my last count, out of our class of 40 students, about 10 are now faculty at various US universities.

I then went on to do my PhD at Carnegie Mellon; my advisor was Avrim Blum. As Seth said, I worked on various algorithmic problems, finding more efficient algorithms for NP-hard problems, or better approximations, which was kind of the craze in those days and is still a very popular area.

But around the time I was graduating, something remarkable was happening in theoretical computer science, a new phase that people started calling the computational lens. As the field of theoretical CS was maturing, people saw that TCS gave a special view through which you could look at various different sciences, and it revealed a much deeper and more interesting structure that had been hidden earlier. Again, we all know these examples. Take biology: once you start thinking of it as a computation that nature is performing, it reveals much, much more. The same with algorithmic game theory, quantum computing, and so on. This cannot be said of many fields; TCS has this lens that brings into view a lot of interesting structure that is otherwise hidden.

One such area, which I fell into accidentally and will talk about more later, is called discrepancy theory. It's an area of mathematics, and once we applied this lens to it, it became the area of algorithmic discrepancy.

Let me actually start with a puzzle before getting into what this all really means. This is a high-school Olympiad-type puzzle; maybe some of you have seen it. I really like this puzzle. It says: suppose I give you any 10 numbers in the range 1 through 100 (here's an example). Then you can always find two different sets of numbers in this list that sum up to the same thing. In this example, if you take the numbers in red and the numbers in blue, you can check that their sums are equal. And the point is that this always holds, no matter which 10 numbers you choose. So I hope the problem makes sense. It's a cute fact, and there is actually much more to it, as we'll see. No matter what 10 numbers you take, you will always find two such sets that sum up to the same thing.

And how do you prove such a thing? The proof is really amazing, and I'll actually give it to you. Call the numbers x1 through x10; the idea is a counting argument. On one side, look at all the possible sets you can form from these 10 numbers. There are 2^10 = 1024 such combinations, because each element is either picked or not. Write them on one side, and on the other side, for each set, write the sum of that set.
Now notice that the numbers are in the range 1 through 100 and each set contains at most 10 of them, so all these sums are at most 1,000. The crucial point is that 1,024 is more than the 1,001 possible values a sum can take (the sum can also be zero, for the empty set). So some two sets out there must have the same sum, simply by the pigeonhole principle, which is a fancy name for a basic observation: if you have more pigeons than pigeonholes, some hole must contain at least two pigeons. So there will be two sets that sum up to the same thing, and that's the whole proof. (If the two sets overlap, remove their common elements; the sums stay equal and the sets become disjoint.)

So it's a cool proof, but it's also very unsatisfying if you think about it, because it shows that something exists while giving you absolutely no clue how to find such sets, or what they even sum to, or anything for that matter. These are called existential proofs. Now, in our toy example I actually found the sets by brute force; you can simply code it up (see the sketch below). But the same argument works if you have, say, a thousand numbers, each of them maybe 300 digits long, and then brute force takes essentially infinite time; you can never hope to solve that by trying all possible combinations.

So again, if you shine the computational lens on this problem, you don't just want to know whether two such sets exist; you want to know whether you can actually find them. And that forces you to think much, much deeper than the high-school-level puzzle. Indeed, people have studied this extensively for the last 40 years, and we still have no clue whether it can be done efficiently. The belief in the area is that it's probably quite unlikely; we just don't know, after thinking about it for so long. Then again, this does not rule out that tomorrow some smart person comes along with a way to solve it. And while the way I presented it might make it seem like a toy problem, it is actually something very fundamental: higher-dimensional variants of this problem come up in what's called lattice-based cryptography, which Chris Peikert is an expert on. In fact, some of the main candidate constructions we know for cryptographic protocols that are resistant against quantum computers are based on exactly these problems. So there's really an amazing body of work trying to understand these problems.

More generally, these non-constructive, or existential, proofs come up all over mathematics. Many of us have probably heard about Nash equilibria in games, and again, their existence is shown via a fixed-point theorem; it doesn't tell you how to actually find one for a particular game.
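To make the brute-force remark concrete, here is a minimal sketch in Python of the 10-number case (my own illustration; the function name and the example numbers are not from the talk): enumerate all 2^10 subsets, record each subset's sum, and stop at the first repeated sum. Removing any common elements then leaves two disjoint sets with equal sums.

```python
from itertools import combinations

def equal_sum_sets(nums):
    """Find two disjoint subsets of nums (as index lists) with equal sums.

    For 10 numbers in 1..100 this always succeeds, by pigeonhole:
    2**10 = 1024 subsets, but only 1001 possible sums (0 through 1000).
    """
    seen = {}  # maps a sum to the first subset (tuple of indices) achieving it
    for r in range(len(nums) + 1):
        for subset in combinations(range(len(nums)), r):
            s = sum(nums[i] for i in subset)
            if s in seen:
                a, b = set(seen[s]), set(subset)
                common = a & b  # dropping shared elements keeps the sums equal
                return sorted(a - common), sorted(b - common)
            seen[s] = subset
    return None  # unreachable for 10 numbers in the range 1..100

nums = [3, 9, 14, 19, 25, 36, 47, 58, 71, 92]  # any 10 numbers in 1..100
red, blue = equal_sum_sets(nums)
print([nums[i] for i in red], [nums[i] for i in blue])  # equal sums
```

With a thousand 300-digit numbers the same pigeonhole argument still proves existence, but this enumeration becomes hopeless, and that gap between existence and finding is exactly the point.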
So again, yeah, these non-constructive arguments are used all over the place. Coming back to discrepancy theory: the status in this field was something similar. I won't define precisely what discrepancy means; at a very rough level, you are given some points in high dimension, which you can think of as vectors, and you want to divide them into two sets, say the green points and the blue points, such that their averages are roughly the same: the average of the green guys should land roughly where the average of the blue guys does, and you want to come up with such a partition. This may look like an esoteric problem, but it comes up in various areas in very surprising ways; it's a very basic subroutine, and many books have been written on it. It's also a very beautiful area of mathematics, studied for almost 100 years by famous mathematicians, with many, many beautiful arguments. But these arguments were mostly non-constructive, in the sense I showed you.

In fact, one of the pioneers of this field, as John mentioned, is Joel Spencer, a professor at NYU, who settled many of the big questions in this area using non-constructive approaches. In the book where he describes these methods, he has a whole chapter on this, and he opens it with: every mathematician has a result he is most pleased with, and here is mine. He gives this non-constructive proof, and at the bottom he mentions various reasons why he thinks it should not be possible to make it algorithmic. Because of these non-constructive arguments, people believed it shouldn't be possible; it's just not clear how to find any structure to exploit.

So, out of sheer luck: I was aware of Spencer's result, but I never read that line at the bottom. Probably, had I read it, I would never have dared to think about the problem. But somehow I missed it, and it turns out it is actually possible: I gave such an algorithm in 2010. It was kind of an unusual algorithm. It took the non-constructive argument and combined it with techniques from probability and optimization to actually find such a low-discrepancy coloring. I won't go into the details, but roughly, it starts with some simple solution and does a kind of random motion that ends up at the right answer.

This was a very unusual argument at the time, and over the last 10 years there has been remarkable work by various people in the field; the area has really exploded and led to the resolution of several long-standing open problems. Also, in the sense of computer science shining a lens on areas of mathematics, it has actually enriched several of these areas and revealed very deep underlying connections between them. This line of work even gets invited to talks at the International Congress of Mathematicians now, so mathematicians are actually interested in the insights algorithms people have to offer on these classical mathematical questions.

So I call this method a guided random walk: you are doing a random walk, but it's somehow not completely random; it's guided by some fancy black box.
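To make the balancing objective concrete, here is a toy sketch in Python. Everything in it (the names, the random instance, the random-restart baseline) is my own illustration, and it is emphatically not the 2010 algorithm; it only shows what the objective is: assign each vector a sign, plus or minus one, so that every coordinate of the signed sum stays small, which is the same as splitting the vectors into two sets whose sums nearly agree.

```python
import random

def discrepancy(signs, vectors):
    """Largest |coordinate| of the signed sum of the vectors.

    A small value means the +1 set and the -1 set have nearly equal sums,
    hence roughly equal averages when the two sets have similar sizes.
    """
    d = len(vectors[0])
    col = [sum(s * v[j] for s, v in zip(signs, vectors)) for j in range(d)]
    return max(abs(x) for x in col)

random.seed(0)
n, d = 40, 10  # toy instance: n random 0/1 vectors in d dimensions
vectors = [[random.randint(0, 1) for _ in range(d)] for _ in range(n)]

# Naive baseline (not the guided walk): try many uniformly random sign
# assignments and keep the best coloring found.
best = min(
    ([random.choice((-1, 1)) for _ in range(n)] for _ in range(2000)),
    key=lambda s: discrepancy(s, vectors),
)
print("best random-coloring discrepancy:", discrepancy(best, vectors))
```

Roughly speaking, the guided walk replaces this blind guessing with many tiny correlated random steps on the signs, steered at each step by an optimization subroutine (the fancy black box), so that the walk drifts toward a low-discrepancy coloring rather than a typical one.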
This guided random walk is also, I feel, a good metaphor for how my research career has evolved over the years. I've worked on several topics and problems, often motivated by colleagues or by the flavor of the day, so usually it feels like exploring randomly and stumbling in the dark. But it's not completely random; there is some guiding force behind it, often great mentors and colleagues. Here are some of them, to whom I'm really grateful for my career. I've also learned a lot from my students and postdocs over the years; they have been really fantastic collaborators.

So that was about the past. Looking ahead, theoretical CS is still a very vibrant and dynamic field, evolving every day, with various new areas coming up. One topic I'm particularly interested in is the connection between discrete and continuous ideas in mathematics. Discrepancy had a similar flavor: we were solving a very discrete question using something like Brownian motion. There is much more here, and it's a very hot topic these days.

At a non-technical level, I feel there are also various other challenges. One is that there is a lot of over-specialization in the field: it has grown so rapidly that people have holed themselves up in narrow areas, and it's very hard even for researchers from one sub-area of algorithms to talk to people from another. I would actually like to hear opinions on what to do here, especially for something like algorithms, which is used by people in practice. I feel there should be more emphasis on unification and better exposition, and if there are other ideas out there on how we can bridge these gaps, that's something I'm very interested in. In particular, all the insights we have as theoreticians could clearly have great impact in practice, but I feel we're not talking to each other as much as we should.

Finally, looking ahead: here I am in Michigan CSE, and I'm very excited to work with the theory group and, more broadly, with people in CS and across the university. I look forward to my next collection of random walks, in research and beyond. So thank you.