Sean called it complex thermostatistics, and I didn't really know what to do with that, except that there's complexity, and everyone's been talking about complexity so far. So I'm going to try to give a definition that roots things and helps with the narrative, because otherwise everything's complex: there's lots of stuff, so it's complex; there are big numbers, so it's complex. Okay, and then there's thermodynamics, statistical mechanics, condensed matter physics, soft matter physics, high energy physics: is all of that complex? So I'm going to go through all of these, at least one slide each. First, Sean said who I am, and the way I like to say it is that I learn quickly and seek complex problems. So okay, there's complexity right there. I like to do multi-scale physics modeling and prediction at various levels of complexity, from back-of-the-envelope to computationally intensive simulation, and the definition of complexity I'll give actually makes sense of that single phrase. And then I do lots of research.
Right now I work in advanced computing, massively parallel classical computing and also quantum computing, and in AI and ML. I've also worked in materials science, dynamical systems, physics, robotics, and all this other stuff. So, a real quick slide on the connection between thermodynamics, statistical mechanics, dynamical systems, and complexity. These people span about two centuries, from Boltzmann to Ken Wilson. Way back then, thermodynamics was a set of axioms at the macroscopic level, and they built a consistent microscopic theory that agreed with those underlying, top-level thermodynamic axioms. But I think the important part is what came out of all that work. We're talking about ideal gases and phase transitions, but they extracted a whole bunch of useful tools from this study that go on to be used in other complex fields, and I think we've gotten a taste of that so far. For me, this is the one-page takeaway from thermo and stat mech and all that: the tools you got out of it and how applicable they are in other domains, not necessarily exactly, but things like phase transitions, self-similarity and universality, divergences, and then attractors, bifurcations, and limit cycles. You can find all of this in books like Hilborn and Strogatz for nonlinear dynamics, and for stat mech and condensed matter physics, Pathria, McQuarrie, and Ashcroft and Mermin are all good ones.
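To make the "attractors, bifurcations, and limit cycles" toolbox concrete, here is a minimal sketch of my own (not from the talk): the logistic map, the standard textbook example, settles onto a fixed point, a periodic cycle, or a chaotic attractor depending on its parameter.

```python
# Toy illustration: the logistic map x -> r * x * (1 - x) exhibits the
# attractors and bifurcations mentioned above as r is varied.
def logistic_attractor(r, x0=0.5, transient=500, keep=64):
    """Iterate the logistic map, discard the transient, and return the
    distinct values the orbit settles onto (rounded for comparison)."""
    x = x0
    for _ in range(transient):          # let the orbit relax onto the attractor
        x = r * x * (1 - x)
    orbit = []
    for _ in range(keep):               # sample the attractor itself
        x = r * x * (1 - x)
        orbit.append(round(x, 6))
    return sorted(set(orbit))

# r = 2.9: a single fixed point; r = 3.2: a period-2 limit cycle;
# r = 3.9: chaos, so the sampled orbit never repeats.
print(len(logistic_attractor(2.9)))   # 1
print(len(logistic_attractor(3.2)))   # 2
print(len(logistic_attractor(3.9)))   # many distinct values
```

One simple kernel, qualitatively different long-time behavior as a single knob turns: that is the bifurcation story in miniature.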
Okay, so I would say that my definition of complexity and thermodynamics and all that is what's contained in those texts. Now I want a good definition of complexity, so that we can compare different systems, because I want to look at lots of different systems: from simple systems that might have complexity all the way up to the entire universe and reality. The definition of complexity I like, because it lets you play around with those different scales, is Kolmogorov complexity, which is just the length of the shortest computer program, in some language, that produces an object. Here's an example: this is the Mandelbrot set, and this object can be petabytes in size; you can keep iterating on it forever and make a very complex object. But it was made from this thing, and this fits on a single piece of paper: the definition of the Mandelbrot set itself. This is the kernel, and that's what it produces. This is a very important concept that is pervasive across the different domains of complexity: the notion that you can have very, very complex behavior coming from a very simple kernel, given the appropriate ingredients. The kernel is nonlinear, and if you have the right value of c, then you get that behavior. This was typified in the opening talk when he said: look, it's not an Ising model, it's not a spin glass; it has these very particular rules, and that's what gives the complex behavior that I observe. So these are the important parts: what are the necessary ingredients? Is it only nonlinearity, or many degrees of freedom? And how can we look at simple kernels like this that make incredibly complex objects?
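The "fits on a single piece of paper" claim can be made literal. Here is a minimal sketch (mine, not a slide from the talk): the entire Mandelbrot kernel is one short function iterating z → z² + c, and zooming the same few lines forever yields unboundedly rich structure.

```python
def in_mandelbrot(c, max_iter=100):
    """The whole kernel: iterate z -> z**2 + c and test boundedness."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:          # once |z| > 2 the orbit provably escapes
            return False
    return True

# Spot checks: c = 0 stays bounded forever, c = 1 escapes quickly,
# c = -1 falls into the cycle 0, -1, 0, -1, ...
print(in_mandelbrot(0j))        # True
print(in_mandelbrot(1 + 0j))    # False
print(in_mandelbrot(-1 + 0j))   # True

# A coarse ASCII rendering of the set from the same tiny kernel.
for row in range(11):
    im = 1.2 - 2.4 * row / 10
    line = "".join(
        "#" if in_mandelbrot(complex(-2 + 2.6 * col / 30, im)) else "."
        for col in range(31)
    )
    print(line)
```

The Kolmogorov-complexity point is exactly this gap: a program of a few hundred bytes versus the petabytes of detail it can generate.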
My background is also in atomistic modeling, and a nice way of bringing Kolmogorov complexity into atomistic modeling is this: we have the Dirac equation and the many-body quantum mechanics problem, which is a computationally hard problem; its Kolmogorov complexity is really, really high. And then you win Nobel Prizes: first you win a Nobel Prize for getting the Dirac equation; fine, then you get a Nobel Prize for making the Dirac equation much easier to solve, which is density functional theory; and then you have things like empirical potentials below that. The point is that the Kolmogorov complexity is high up here and gets simpler coming down, the computational complexity is coming down, and I would say the complexity is coming down. Okay, so that's my background, alongside things like Jared's protein folding. Alright, I think what's more provocative is to scale this up: this is not complex, this is more complex, and then reality is very, very complex. But the same theme is there. You have the Standard Model Lagrangian, and if we figure out how to put it together with general relativity, then you get the whole universe. This is the kernel; that's the object produced. The Kolmogorov complexity of the universe is incredibly higher than that of the Mandelbrot set or the modeling I was showing you before. What's very interesting is that Seth Lloyd, back in 2001 or so, computed the computational capacity of the universe and estimated something like 10^120 ops. That's incredible.
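Lloyd's number is less mysterious than it sounds. Here is a back-of-the-envelope reproduction (my rough input numbers, not Lloyd's exact ones): the Margolus-Levitin theorem bounds a system's operations per second by 2E/(πħ), so multiply the mass-energy within the horizon by the age of the universe.

```python
import math

# Rough assumed inputs, order-of-magnitude only:
HBAR = 1.05e-34          # J*s, reduced Planck constant
C = 3.0e8                # m/s, speed of light
MASS_UNIVERSE = 1e53     # kg, approximate mass within the observable horizon
AGE_UNIVERSE = 4.3e17    # s, roughly 13.7 billion years

energy = MASS_UNIVERSE * C**2                    # total mass-energy, ~1e70 J
ops_per_second = 2 * energy / (math.pi * HBAR)   # Margolus-Levitin rate bound
total_ops = ops_per_second * AGE_UNIVERSE

print(f"~10^{math.log10(total_ops):.0f} ops")    # same ballpark as Lloyd's 10^120
```

With these crude inputs the estimate lands around 10^121, the same ballpark as the 10^120 figure quoted in the talk.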
That's a lot of bits and ops. Also interesting to think about in the context of computation is the Bekenstein bound, if anyone's heard of that: for a given volume of space and the energy and mass in it, how many bits can it store? We're really, really far away from the Bekenstein bound with our current computational capabilities; the bound leaves room for something like a million bits in a hydrogen atom. That's crazy. Okay, I'm just trying to set the stage. Here's an example of one of the papers that's kind of a seed paper for this track: quantifying the rise and fall of complexity, in this case Kolmogorov complexity, in a closed system. They call it the coffee automaton. What's very interesting about this study is the simulation they did. We don't have to get into the details, but this axis is the time step, entropy is in blue, and in green you see that complexity has a local maximum along the way. I don't want to go into more detail than that, but this is the kind of thing we could get into in the session. In fact, I got into something similar back in my master's work. This is turbulence on the surface of a tank, and over here we have the initial distribution of, say, some particles you dump on the surface; as time goes on, the entropy actually decreases and the complexity definitely goes up. I didn't measure what they measured in that paper, but you get the idea: in these systems that are not in equilibrium, the entropy doesn't need to increase; in this case it actually decreases, and the complexity definitely goes up. So that's one of the ideas we can play with in the session. Another one, related to reality as a computation, is along the lines of Jared's work with deep learning: why does deep and cheap learning work so well?
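The "million bits in a hydrogen atom" figure can actually be checked. A quick sketch (my arithmetic, using textbook constants and the Bohr radius as a rough choice of atomic size): the Bekenstein bound says the information in a region is at most I ≤ 2πRE/(ħc ln 2).

```python
import math

HBAR = 1.055e-34      # J*s, reduced Planck constant
C = 2.998e8           # m/s, speed of light
M_H = 1.674e-27       # kg, mass of a hydrogen atom
R_BOHR = 5.29e-11     # m, Bohr radius taken as the atom's size (rough choice)

E = M_H * C**2        # rest mass-energy of the atom
bits = 2 * math.pi * R_BOHR * E / (HBAR * C * math.log(2))

print(f"{bits:.2e} bits")   # a few million bits
```

This comes out around 2 to 3 million bits, consistent with the claim in the talk, and it makes vivid how far a hydrogen atom's actual information content (a handful of quantum numbers) sits below the bound.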
So one idea is that it works so well because reality itself is composed of a sort of finite list of functions that are important, like waves and exponentials and so forth. We could also get into this, although it starts to overlap with what you're doing; but it's a very provocative paper. Another thing I want to point out, which I think is pretty cool, is that what drives all this really interesting complexity in the geosphere is actually a very simple thermodynamic principle at the highest scale. You have the Sun at around 5,000 Kelvin, and for black-body radiation the entropy goes like the energy over T. So the entropy of the sunlight the Earth absorbs is much lower than the entropy of the radiation the Earth re-emits at its much lower temperature, and that difference, the change in entropy, is what drives all this and everything else. The point is that thermodynamics at the highest scale sets the stage for everything we're talking about. I wanted to put that slide in because it took time to make and I think it's nice; it's poignant, isn't it? Okay, so I think that's really cool; that's why the slide is there. Going further: that was one scale, the highest scale, above the geosphere. Inside the geosphere, at the molecular level, you've got situations like kinesin. These guys are walking inside the cell, on the scaffolding of the microtubules, and they're just able to do their job well enough to beat the thermal fluctuations in the environment; they can do work corresponding to something like two or three times kT. They just get the job done, and then the whole cell works really well. But thermodynamics is playing a major role here, and it's not even equilibrium thermodynamics; it's non-equilibrium. So that's another line of research we could go into. Another really interesting connection is the possible link between intelligence and entropy.
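Back to the Sun-Earth slide for a second: the entropy bookkeeping there is a one-liner. A minimal sketch (the temperatures below are standard textbook effective values, my choice rather than the talk's): radiation entropy flux scales like (4/3) times energy over temperature, so re-emitting the same energy at Earth's low temperature multiplies the entropy.

```python
T_SUN = 5800.0      # K, effective temperature of sunlight (textbook value)
T_EARTH = 255.0     # K, Earth's effective emission temperature (textbook value)
FLUX = 1.0          # energy per unit time, arbitrary units (cancels in the ratio)

# Entropy flux carried by thermal radiation goes like (4/3) * energy / T.
s_in = (4 / 3) * FLUX / T_SUN      # entropy arriving with sunlight
s_out = (4 / 3) * FLUX / T_EARTH   # entropy leaving as infrared

print(f"entropy out / entropy in = {s_out / s_in:.1f}")  # ~22.7
```

Roughly a factor of twenty more entropy goes out than comes in; that export of entropy is the thermodynamic budget that everything in the geosphere runs on.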
This is a paper by Wissner-Gross on causal entropic forces. Systems subjected to these causal entropic forces tend to reach a state of maximum potential, maximum possibilities. One of their examples is a pendulum, which they get to stand up on its end due to these causal entropic forces, and what's interesting about that is that it's the point of maximum possibility: it could go either way, it's unstable, and so forth. And then Jeremy England at MIT has a similar line of papers on the statistical physics of self-replication, all of it driven by entropy. So it's almost as if to say: in an entropic universe, are you bound to get intelligence, provided the universe is big enough and comets don't blow everything up and that sort of thing? In other words, given a universe that starts from low entropy, how could we not see intelligence? So this is very interesting work. Oh, interjecting real quick about me: I also do quantum computing, and just to show you how many connections you can make with thermodynamics and complexity: quantum computing is about using qubits in a quantum computer, and I'm using this stuff to try to figure out how to solve hard problems faster than classical computers, like the traveling salesman problem. There's this notion of computational complexity, problem complexity in terms of P and NP if you've heard of this stuff, and then the question of how many problems can be solved efficiently in BQP, the class for quantum computation. What's interesting is that Leonard Susskind, who's at Stanford, recently made the conjecture that nature is the fastest scrambler of quantum information, and he uses a quantum circuit model to do the analysis. So again, it's like everything in complexity is thermodynamics. That's another line of research we could get into, but there isn't much time.
I don't want to go over; I was at the end anyway. So I just gave you six things, I guess, that have connections to thermodynamics, to systems in equilibrium and out of equilibrium. I tried to give you my definition of complexity, because I think it sets an even ground for all these uses of "complex" and "complexity." I think we all roughly know what everyone means when they say it, but to me this makes things very concrete.