We will continue with the second lecture by Professor Norman Yao from Harvard University. Okay, good. Yeah, so this lecture will continue where we left off last time in discussing Floquet phases of matter, and in particular we're in the midst of talking about time crystals. So just a little bit of a recap. Last time I started by trying to really make sure that everyone is on the same page; I realized many of you have thought about physics like this before. But I wanted to really introduce, especially given that time crystals aren't the most familiar phase of matter, the language of spontaneous symmetry breaking as the essence of what defines a phase of matter. And I wanted to connect to Anatoly's lecture, and in particular I emphasized this picture of thinking about spontaneous symmetry breaking, or the stability of a phase of matter, as a form of ergodicity breaking. You need to remember which of the two states of the ferromagnet you're in if you want to be a stable ferromagnet. And we ended the last lecture with a goal, and the goal was that we'd like to discuss possible realizations, especially in the context of modern research, of stable or rigid discrete time translation symmetry breaking, or TTSB, in many-body systems that could be either classical or quantum mechanical. And of course you'll remember that the discrete spontaneous breaking of time translation symmetry is something we really think about as a subharmonic response on top of, for example, a discrete time step or a Floquet Hamiltonian.
So today we'll go into the next section. In particular, related to the question about stability and rigidity (I don't remember exactly where the question came from), we'd like to elevate this notion of just having a subharmonic response, which is not the most uncommon thing in nature, I would say, to being able to actually discuss this type of discrete TTSB as a time crystalline phase of matter. So that's where we'll start today. Up to this point, we've mainly been discussing the spontaneous breaking of time translation symmetry in the language that you guys, I think, are probably most familiar with, and certainly in the language of physics. And what I mean by that is that we've always been thinking about there existing some Hamiltonian, and in the Floquet context, we were thinking about Hamiltonians that were periodic in time. So we had some periodically driven system. But I'd like to emphasize that at some level, from the perspective of discrete time translation symmetry breaking, you don't really need to start in the language of Hamiltonians, or think about time evolution dynamics that are intrinsically governed by Hamiltonians. Rather, all we really need is some discrete time update rule. The reason I'm saying words like this is that I'd like to try to connect, ultimately, what we'll talk about, which is thinking about time crystals as phases of matter, to a much more general framework, and in particular, to the framework of nonlinear dynamical systems, something you've already been hearing about from both myself and Anatoly.
And in particular, I'll try to help answer the question which, as I emphasized yesterday, is the most common question that I get asked at talks, which is: is some particular type of subharmonic response a time crystal? Usually asked as: why is such-and-such a particular type of subharmonic response not a time crystal? The problem is that usually I've never thought about the particular thing that lives in the ellipsis over here, but I'll try to give some framework that helps you recast what is important for thinking about time crystals as a phase of matter, and for distinguishing them from more general and generic subharmonic responses in nonlinear dynamical systems. And to do that, I want to adopt a slightly more general notation. I promise that I will come back to the notation of Floquet Hamiltonians that we're familiar with, but I want to start by adopting a slightly more general notation. In particular, the dynamics don't have to be generated by Hamiltonian dynamics, which are much more specific. Let's consider just some discrete time update rule. This discrete time update rule will take the state of a system, let's call that state x, and it'll simply evolve it: x goes to φ(x). And x here is really the state of the system. If you were thinking about this in the quantum Hamiltonian case, this would maybe just be some ket ψ. But here, I'm really trying to use notation that's very standard in the nonlinear dynamical systems community, because I think that's where a lot of these ideas and concepts originated. What this means is that, in this slightly more general notation, at time step t the state of the system is simply given by x(t) = φ^t(x(0)), that is, φ applied t times to some initial state x at time t = 0. Super simple. We've just changed the notation slightly.
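To make that notation concrete, here is a minimal sketch in Python (my own illustration, not something written on the board): a generic discrete time update rule φ is just a function, and the dynamics are its repeated application.

```python
def iterate(phi, x0, t):
    """Apply the discrete time update rule phi to the state x0, t times:
    returns x(t) = phi^t(x(0))."""
    x = x0
    for _ in range(t):
        x = phi(x)
    return x

# A toy update rule: flip the sign of the state each time step.
# The orbit then repeats every 2 steps, a simple subharmonic response.
flip = lambda x: -x
orbit = [iterate(flip, 1.0, t) for t in range(4)]  # [1.0, -1.0, 1.0, -1.0]
```

The point of the sketch is only that nothing here refers to a Hamiltonian; any rule for producing the next state from the current one defines the dynamics.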
And the reason why this is super nice at some level, and I think why it's helpful to use this notation, is that it really emphasizes, there was a question about this last time, that from the perspective of time crystals or time translation symmetry breaking, I don't need any other symmetry of the system. I don't need to have spatial translation symmetry. I don't need to have spin rotation symmetry, phase rotation symmetry, nothing else. And this discrete time update rule really highlights that there's only one thing happening in the system: the only thing that exists is that you apply φ at each time step. As promised, I want to just very quickly reconnect to the language of Floquet Hamiltonians that you're familiar with. All I mean by this is that for Floquet systems, we're thinking about some continuous time evolution. But I emphasize that from the perspective of symmetry breaking, we're just thinking about the discrete breaking of an already discretized time translation symmetry. And what that means is that if I only want to think about this map φ, it really describes what's called the stroboscopic dynamics of my Floquet system. So what this means is that if I had a classical Floquet system, this particular discrete time update rule φ is furnished by literally doing the obvious thing: integrating Hamilton's equations over one period of the drive. Writing this out slightly more mathematically in the quantum mechanical case, φ, this discrete time update rule, is given by what people oftentimes call the Floquet unitary in the quantum case, which is simply the time-ordered exponential U_F = T exp(−i ∫₀^T H(t′) dt′).
This is the generator of the time dynamics over one period of the drive; we are only going to care about the discrete breaking of an already discretized time translation symmetry. So this φ emphasizes this kind of stroboscopic point of view, and it is oftentimes called the Floquet unitary. So at this point, using the notation of dynamical systems, it's very easy for us to define the notion of discrete TTSB. In particular, I say that discrete TTSB is the following. The dynamics of a system exhibit m-fold discrete time translation symmetry breaking if there exists a local observable O that exhibits periodic oscillations out to infinite times. Again, there was a question last time: I'm always talking about the thermodynamic limit, at least for these definitions. We'll talk about lifetimes and how those lifetimes scale at the end a little bit. So we say that the dynamics exhibit m-fold discrete TTSB if there exists a local observable O which exhibits periodic oscillations out to infinite times for a measurable volume of initial conditions x. At some level, given what we've talked about already, this is super intuitive. You already have a good feeling for this, but just so that we're mathematically precise, I'm going to write this down as taking the limit as τ goes to infinity, averaging over many oscillations, and evaluating this observable on φ applied m·n times to a state x: lim_{τ→∞} (1/τ) Σ_{n=1}^{τ} O(φ^{mn}(x)). And we say that there is discrete time translation symmetry breaking if this is not the same as the observable evaluated in another orbit of the m-fold symmetry breaking, which would again be an average: lim_{τ→∞} (1/τ) Σ_{n=1}^{τ} O(φ^{mn+p}(x)).
That is, they are not equal for any p with 0 < p < m. Here, the orbits are the different sectors of the discrete TTSB. So let me just unpack this super simply. All I'm doing with the average over here is making sure that, if I'm measuring the observable, I'm not just adding up a bunch of numbers; it's just normalizing things, so you don't need to worry about that much. But what I'm saying is that there are m orbits, like m ground states, for example, if we were thinking about spontaneous symmetry breaking in the previous lecture. We're saying that when you average over each of these periods, going to the m-th time, the 2m-th time, the 3m-th time, and then look at the (m+1)-th time step, the (2m+1)-th time step, the (3m+1)-th time step, et cetera, there are m distinct orbits, and for 0 < p < m the observable never comes back to itself. We'll draw a very simple picture for this in just a second, but I just want to make sure that we write down the formal definition. It'll be much, much more clear in just one second. And I'd like to emphasize again that, from the perspective of ergodicity breaking, a time crystal must remember which of the m initial conditions it is in. And again, I'm going to draw a picture right now that clarifies this. This picture will come up towards the end of the next lecture, where we talk about different types of time crystals in, for example, spin chain systems. But again, as a very, very simple pictorial representation, we might imagine that we're looking for a time crystal in a one-dimensional spin chain, just as a particular space where we're looking.
And we might say that it turns out that there's some complicated rule φ, and at the end of whatever this complicated rule φ does, essentially what happens is that the state simply flips from up to down. Every spin flips from up to down. So at some level, the formula over here is just saying that this clearly period-doubles. This is clearly the example of m equals two. It's just saying that if I measure a local observable like σ^z, on every even period it does not equal the local observable on every odd period. Because the only p over here that's between zero and m is one; m is two. So it's just distinguishing even and odd periods. And if you had m-fold symmetry breaking, there would be m different symmetry sectors, and this is just saying that there's an observable that distinguishes each of those: it doesn't come back to itself until the m-th time step. That is the definition of a subharmonic oscillation. But in some sense, just intuitively, we'll find that it's actually very, very hard to furnish a map that does this. Intuitively, all we're saying with the definition here, from the perspective of ergodicity breaking, is that, much like the ferromagnet needs to remember whether it's up or down, the time crystal needs to remember whether the dynamics are up-down-up or down-up-down. These are effectively the two different symmetry-broken states: in your zeroth period, were you up or down? So on even periods, are you always up, or on even periods, are you always down? It should be very, very clear. Yeah, please. Wait, sorry, what's the question exactly? So in general, as we'll see, it will not be dependent on the initial condition. For example, imagine that for each of the spins you pick randomly up or down and then start with that initial condition. I mean, I haven't told you how to make the map yet.
But in principle, you could imagine a map that simply flips every single individual spin from up to down. So say the initial condition is up, down, down, up, up, down; at the next time step, it'll simply be the opposite. Again, I haven't explained all the non-trivialness of getting this map and making it stable. But in principle, at least from the perspective of the map, it's the map that matters, not the state that matters. And for real time crystals, in the end, you want to have this behavior for some finite volume, some finite ensemble, of initial states. In certain cases, people claim that you'll have this behavior for every initial state you could start the system in. In other cases, you'll find that people claim you'll only be able to see this behavior for some finite subset of initial states; but nonetheless, it's not a fine-tuned specific initial state, it's some finite volume. So it'll always be a property, at some level, of the system. And which initial states undergo, or correspond to, this time crystalline phase depends on the particular nature of the ergodicity breaking. Does that help? Great. Yeah, please. Right. For example, you could imagine a situation where, you know, I'm not going to draw it super well, but you could have a map that takes everything to being like this, and then a map that goes to, what's the next one, everything like this, I guess, and then basically comes back to itself, for example. Yeah, that's right. That's right. Exactly. I'm just trying to write down the definition very intuitively. Good. Yeah, please. It doesn't matter, actually. I think, in principle, I could have had the value just keep going up, but somehow the average of the local observable should be of bounded norm.
So it's easier to think about, but in principle, it wouldn't matter; I don't really need to have this. Why not evaluate on a non-zero measure? Exactly. Exactly. Yeah, there are scars. Yeah, it depends who you talk to. You know, the field is yet young, I would say. So, depending on who you talk to, yeah. Okay. So in principle, for a lot of what we'll discuss now, I won't take L to infinity; I'll look at single degrees of freedom. But in principle, when I talk about infinite times in the definition, I want to think of most physical systems as having taken the thermodynamic limit, because if you have a finite system, as Anatoly said, at some point, at L²/D if you're diffusing, you hit the boundary and stuff starts to happen. So that's what I mean when I say the thermodynamic limit. But you'll see that for a lot of it, I will actually start with very, very finite numbers of degrees of freedom. Okay, excellent. Good questions. It turns out that this language and this specific formula go under a slightly different name in the nonlinear dynamical systems community: it's called asymptotic periodicity. There is a very nice paper by Lasota, Li, and Yorke in the Transactions of the American Mathematical Society from 1984 that summarizes a lot of the progress up until the 80s on asymptotic periodicity. But now I'd like to make a note that, in principle, if we only require discrete TTSB according to the definition that we have in order to define a time crystal, then in principle even dynamical maps, even discrete time update rules on a single degree of freedom, would qualify.
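Looping back to the formal definition for a moment, the orbit-averaged diagnostic can be written in a few lines of Python (a sketch of mine, with an all-spins-flip rule standing in for whatever complicated φ one actually has; this toy rule satisfies the definition but, as we'll see shortly, has none of the required stability):

```python
def flip_all(spins):
    """Toy update rule phi: every spin flips each period (so m = 2)."""
    return [-s for s in spins]

def orbit_average(phi, x0, m, p, n_max=100):
    """Time average of a local observable (here sigma^z on site 0) along
    the stroboscopic subsequence t = m*n + p, n = 1..n_max."""
    x = x0
    for _ in range(m + p):          # advance to t = m + p
        x = phi(x)
    total = 0.0
    for _ in range(n_max):
        total += x[0]
        for _ in range(m):          # step forward by one m-period
            x = phi(x)
    return total / n_max

x0 = [1, -1, -1, 1]                 # some initial spin configuration
even = orbit_average(flip_all, x0, m=2, p=0)   # average on even periods
odd  = orbit_average(flip_all, x0, m=2, p=1)   # average on odd periods
# even = +1 while odd = -1: the two orbit averages differ for the only
# p with 0 < p < m, so the definition of 2-fold discrete TTSB holds.
```

The two averages playing the role of the two symmetry sectors is exactly the "up-down-up versus down-up-down" memory described above.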
So just to give a little bit of a roadmap for how I'm trying to address the problem: I'm going to try to build up examples of time crystals, or examples of discrete time translation symmetry breaking, and for each of them, I'll try to give an obstruction. And as we keep obstructing and obstructing and obstructing, we'll see that it's actually extremely hard, in the context that we're most interested in, namely Hamiltonian Floquet systems, to get this type of physics in the first place. And ultimately we'll see that the obstruction can be understood in the language of ergodicity breaking, in the language of how difficult it is in a generic many-body system to break ergodicity. But at this point, if we just require the definition of this form, then in fact you could consider perhaps the most boring map ever, which is just the map that takes φ(x) = −x, and at least from the perspective of the definition over here, that would be perfectly satisfied. There is a local observable, the sign of x, which is different depending on whether I'm looking at even or odd periods. But clearly we don't want to think about that as a time crystalline phase of matter. And again, this is very related to the question from yesterday about the need for rigidity or stability. The question of discrete TTSB becomes much more non-trivial when we require a key property that we usually think about when we discuss the statistical mechanics of phases of matter, which is namely stability or rigidity. To illustrate that the super-trivial discrete time update rule we just wrote down doesn't have this, immediately note that the oscillations of this map φ(x) = −x are totally not stable: they would dephase under essentially any perturbation.
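As a quick numerical sketch (mine, not the lecturer's) of just how fragile this trivial map is: multiply it by 1 − ε and the oscillation amplitude decays on a time scale of order 1/ε.

```python
eps = 0.01
phi = lambda x: -(1.0 - eps) * x    # the perturbed trivial map

x, amps = 1.0, []
for t in range(1000):
    amps.append(abs(x))
    x = phi(x)

# The amplitude decays as (1 - eps)^t ~ exp(-eps * t), so the nominal
# "time crystal" oscillations are damped out beyond t ~ 1/eps = 100 steps.
```

After 100 steps the amplitude is down to roughly e⁻¹ of its initial value, and by a few hundred steps the subharmonic signal is essentially gone.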
For example, we can imagine changing the map φ(x) just a tiny bit, multiplying it by 1 − ε, where ε is small. We immediately see that the oscillations, the nominal time crystal oscillations, become damped beyond a time scale that scales as 1/ε, the inverse strength of the perturbation. Super trivial. You all, for sure, should be feeling that this is very basic; I'm just trying to build things up step by step. Yeah, so in this case, think about it like this: when we talk about Floquet systems or quantum systems, there will be some observable associated with the state, but for this case the state is just a real number. I'm thinking about the much more general setting of dynamical systems. So what this means is that in order to really tease out the non-trivialness of a time crystal phase of matter, it's natural to add a few requirements, requirements that are in some sense the essence of the stability of phases of matter. The first is that we'd like there to be a well-defined thermodynamic limit, related again to the questions we've had about what exactly we mean by infinite times. And in order to have that, I'd like my discrete time update rule, whether in a physical setting or a dynamical systems setting, to be generated from local interactions. This means that there's a well-defined notion of dimensionality or a coordination number. I'd like my discrete time update rule φ to be generated from local interactions such that there is indeed a well-defined thermodynamic limit to be taken. These could be real numbers sitting on a two-dimensional lattice interacting with each other. They could be classical harmonic oscillators sitting on a three-dimensional lattice interacting with each other. These could be quantum spins sitting on a one-dimensional lattice interacting with each other.
But I just want there to be a well-defined notion of how to take a thermodynamic limit: I want my discrete time update rule φ to be generated from local interactions such that there is a well-defined thermodynamic limit to be taken. And second, this notion of rigidity: I'd like to require that the discrete breaking of the time translation symmetry which we defined over there is robust, so it does not get destroyed. It is robust to any small locality-preserving perturbation of the update rule φ or of the state. Again, this is just putting into words a sense that you should already have about phases of matter, which is that they shouldn't be infinitely fine-tuned. It shouldn't be that they only work for a specific initial state, or that they only work for one particular update rule. You want to have some finite, quote-unquote, radius of convergence for this physics. Good. So it seems like I've now endowed the definition of a time crystal with everything that we want. We have the definition of the symmetry breaking that underlies the time crystalline order, the notion that we'd like to have a thermodynamic limit, and the requirement of stability. Somewhat surprisingly or not, it turns out, as we will see in just one second, that even this is not quite enough to zoom us in on the most non-trivial cases. So even furnishing the very natural notion of a time crystalline phase with symmetry-breaking order, a thermodynamic limit, as well as stability and rigidity, it turns out, is still not quite enough to get us to the most non-trivial cases that we will discuss. Yeah, please. Yeah, okay, fine. Good, good. I perhaps shouldn't say that. There's a limit, a well-defined limit, where I take the number of degrees of freedom to infinity, but yes, point taken.
The thermodynamic limit is a very natural notion if we do start talking about physics, but given that I've generalized to dynamical maps, perhaps I shouldn't call it that. The question is not really about the possibility; I'll emphasize this a bit. It turns out it's actually much, much easier to find time crystals in non-local models. The question, at some level, is why it is so much easier. That much-easierness is related to the fact that if you have very, very non-local stuff, you don't have ergodicity in the first place; there's a very mean-field description of what happens. So I'm trying to zoom in on the cases that are toughest, but in fact, at some level, once you allow for non-local interactions, time crystals abound, and you'll see that time crystals abound even in that case. I guess you can take the limit of N going to infinity, but if you have all-to-all interactions, is it really thermodynamic? Think about the spectrum of a many-body Hamiltonian where you have all-to-all interactions, and ask whether or not that spectrum is extensive. It's not; it's super-extensive. So is there really a thermodynamic limit? There's a limit where I take the number of particles, of degrees of freedom, to infinity, but whether it's thermodynamic or not, I'm not so sure. So, even after furnishing the notion of a time crystal with all of these requirements that seem very, very natural, it turns out, at least in the nonlinear dynamical systems context, that in some sense there have been time crystals, or asymptotically periodic many-body systems, for a very long period of time. And the example that I'll very quickly work through together with you is how one might naturally think about the logistic map as a particular example of an old-school time crystal. So what is this map?
All I'm saying is that I'd like to consider a slightly more complicated nonlinear dynamical map, or dynamical update rule. And the update rule is perhaps one that I suspect many of you are familiar with; it's one that you would have seen in any undergraduate nonlinear dynamics class. It's the logistic map, which takes x to r·x·(1 − x). For those of you who have seen this type of map, one of the classic phrases associated with the logistic map is that it exhibits a so-called period-doubling route to chaos. We won't be so interested in the chaos part of it; you've already heard about that a lot from Anatoly. But we will be interested in the period-doubled part of it. Let me draw a picture that essentially explains what people mean when they say that there exists a period-doubling route to chaos. If I plot x against r, approximately accurately, at least what I was getting from Mathematica last night, or actually very early this morning, I'm just going to draw the picture first, and then I'll explain exactly what it means. So essentially what one finds when one applies this particular dynamical update rule to a system, just a number, is that for any given value of the parameter r, ultimately, at late times, φ just takes x between stable values. You start with whatever initial condition you like, and you keep applying φ many, many times. At some point, for a particular value of r, you'll find that the value x just hops between two particular values whenever you apply the map φ. Just between them. So literally, you start with x wherever it is, and you evolve: x goes from, say, 0.1 to 0.5 to 0.7 to 0.3, whatever.
It keeps evolving, and at late times, say by time 10 or 20, it'll lock in, hopping between two values, something like 0.5, 0.8, 0.5, 0.8, 0.5, 0.8. So that's what I mean by back and forth: literally, the value goes back and forth between these two stable values. So in fact, this means that no matter what, you will always end up with period-doubled oscillations. It turns out that as you increase r and undergo many, many bifurcations, this is a natural route to get to the chaotic regime of the logistic map. But again, that's not what we're interested in; we are interested in this finite regime of parameter space. And in particular, you can immediately tell that unlike the map that I had over here, φ(x) = −x, where the oscillations get damped if you have a perturbation, in some sense this period-doubled behavior is clearly rigid. This type of discrete TTSB that the logistic map exhibits is clearly rigid in the sense that you can change r or x a little bit. Literally define an observable, call the observable "is it above 0.75 or not?". I can vary r a little bit, I can perturb it by some epsilon, do whatever I want to it, but I will always still have that type of oscillation. So it's not something that's yet... Yeah, sorry. So for any value of r within this regime, sorry, actually, for any value of r within the period-doubled regime, yes, that's exactly right. So again, I need a finite radius of convergence. If I take r right here, then for sure you're right: if there are little perturbations to the left, then it's unstable. But I just need a finite radius of convergence, so there's clearly some finite regime where it works, and I just don't want to consider the boundary of that particular regime if I'm asking about a finite ensemble.
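To make the rigidity claim concrete, here is a small numerical sketch (my own; the particular values of r are illustrative choices inside the period-doubled regime): the late-time orbit locks onto a 2-cycle, the "is it above 0.75?" observable alternates, and both survive a small change of r.

```python
def logistic(r):
    """The logistic map x -> r * x * (1 - x) at parameter r."""
    return lambda x: r * x * (1.0 - x)

def late_time_orbit(phi, x0, burn=10_000, keep=4):
    """Discard a long transient, then record a few successive states."""
    x = x0
    for _ in range(burn):
        x = phi(x)
    out = []
    for _ in range(keep):
        out.append(x)
        x = phi(x)
    return out

# At r = 3.2, inside the period-doubled regime, the orbit locks onto a
# stable 2-cycle regardless of the (generic) initial condition:
orbit = late_time_orbit(logistic(3.2), 0.123)
above = [x > 0.75 for x in orbit]    # the "is it above 0.75?" observable
# 'above' alternates between True and False: a subharmonic response.

# Perturbing r a little does not kill the period-2 oscillation:
orbit_pert = late_time_orbit(logistic(3.25), 0.456)
above_pert = [x > 0.75 for x in orbit_pert]
```

Contrast this with the damped φ(x) = −(1 − ε)x example: here the perturbed map still shows clean period-2 oscillations at late times, which is the rigidity being claimed.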
Because I'm picking a particular value of r. The chaotic regime of the logistic map is at larger values of r; I'm focusing on a particular regime of r. Yeah, I shouldn't have said for any given value of r: for any given value of r that's far from the chaotic regime, you have this type of behavior. So that's a way to get different periods, actually. There are different maps; there are some classic problems of whether or not you could get three-fold symmetry, I think odd symmetries, in particular three, whether you could get maps that essentially had orbits that were three-fold symmetric. It was an open question for many years, and then I think solved in the affirmative. But yes, that's exactly right. Good. So I'm just trying to build up different physical systems, related to discrete time update rules, that seem to satisfy what we want with respect to the definition of a time crystal. You might say, okay, so clearly this is discrete TTSB, and it's rigid in some way. But you might object that this is still the physics of a single degree of freedom. There is no, okay, I should not say thermodynamic, well, I'll say thermodynamic still, but you know what I mean now, no proper thermodynamic limit where we take the number of degrees of freedom to infinity in some meaningful way, because really here we have a single degree of freedom. Good. But it turns out that this objection is very, very easy to overcome. And the idea, as you might imagine, is literally to put logistic maps on some lattice: I take a 1D lattice or a 2D lattice and couple a bunch of logistic maps.
You have logistic maps defined on degrees of freedom x_i, where you can now take i from 1 to infinity, put them on a lattice, and let them interact with each other. These coupled logistic maps are called coupled map lattices within the nonlinear dynamical systems community, and they exhibit the same exact phenomenology: the same exact period-doubled phenomenology, the same exact rigidity, the same exact stability. So they also exhibit what we would define as rigid discrete TTSB. And a beautiful reference on this is from Kunihiko Kaneko, in his review article in Progress of Theoretical Physics, from 1984 again. 1984. And it works fine in one dimension. So this should somehow feel wrong at some level, right? Because you're used to thinking about symmetry breaking and thinking, oh, there are upper critical dimensions, lower critical dimensions, there are Mermin-Wagner theorems, fluctuations are going to kill me. But we'll see why that doesn't happen in just one second. So in some sense, what I would say, if I were to summarize this, really going to this point: there is a form of discrete symmetry breaking in zero-dimensional systems, one-dimensional systems, 2D, 3D, 4D, any dimension. There is discrete symmetry breaking in these types of nonlinear dynamical maps, and it has been known for a very long time. The point is that, if you elevate yourself to coupled map lattices, you have stable, rigid, many-body, discrete symmetry breaking. Here we're interested in the breaking of a time translation symmetry, but you might as well just think of discrete spontaneous symmetry breaking in general. There is stable, rigid, many-body discrete time translation symmetry breaking that has been known for a very long time. What gives?
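Before answering that, here is a minimal coupled map lattice in the spirit of Kaneko's work (a sketch of mine; the diffusive coupling form and the parameter values are illustrative assumptions, not taken from the lecture): logistic maps on a 1D ring, each site mixing in the images of its neighbours, with the period-2 response surviving in the many-body setting.

```python
import math

def cml_step(xs, r=3.2, eps=0.1):
    """One update of a diffusively coupled logistic lattice on a ring:
    x_i -> (1 - eps) f(x_i) + (eps/2) (f(x_{i-1}) + f(x_{i+1})),
    with local map f(x) = r x (1 - x)."""
    n = len(xs)
    f = [r * x * (1.0 - x) for x in xs]
    return [(1.0 - eps) * f[i] + 0.5 * eps * (f[i - 1] + f[(i + 1) % n])
            for i in range(n)]

# Start near a uniform state and relax through a long transient.
n = 64
xs = [0.5 + 0.001 * math.sin(2 * math.pi * i / n) for i in range(n)]
for _ in range(5000):
    xs = cml_step(xs)

one_step = cml_step(xs)            # configuration after one more period
two_steps = cml_step(one_step)     # ... and after two

# Each site hops between two values: applying the map twice returns the
# same configuration (period 2), while a single step visibly changes it.
```

The same period-doubled, rigid behavior as the single map, now with a genuine notion of locality and a limit of infinitely many degrees of freedom.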
I've been emphasizing that there's a modern research topic on Floquet phases of matter, the poster child of which is perhaps time crystals. And yet, given the definition of a time crystal — I want to discretely break a time-translation symmetry, I want a well-defined thermodynamic limit, and I want stability and rigidity — I'm telling you that within the nonlinear dynamical systems community, asymptotic periodicity has been known to do all of that for a very long time. What gives? This relates to an earlier question about the difference between the Hamiltonian systems we're used to considering for physical systems and the types of update rules I'm considering in the context of dynamical maps. A very important mantra, which I hope we'll keep for the rest of this set of lectures, is that the surprise, or the challenge, or the non-trivialness of finding a discrete time crystal depends very strongly on what I will call the dynamical class of the update rule in question. How hard, or unusual, or surprising it is to find something that looks like a time crystal and satisfies the three properties we've been emphasizing depends very much on the dynamical class that the map belongs to. Okay, so I should unpack that a little. What do I mean by dynamical class? I think it's easiest to illustrate with a couple of examples. You might imagine that you have a many-particle classical mechanical system with closed dynamics. I would argue that that dynamical class is governed essentially by Hamilton's equations. I can also imagine taking my classical many-particle system and making it an open system — clearly a different dynamical class.
Now it's what people call an open system. What I mean by an open system, for example, is that I've coupled it to a finite-temperature bath, so all of the stuff you heard from Anatoly about the fluctuation-dissipation theorem applies. In this case, people usually say that this dynamical class is the Langevin class. If you were to write down the differential equations that govern closed classical many-particle dynamics, you'd write down Hamilton's equations; if you were to write down the differential equations of the dynamical class that describes an open, finite-temperature classical system, you'd write down a Langevin equation. Similarly, for quantum mechanics, closed quantum dynamics are governed by what kind of dynamics? Anybody? There we go — unitary dynamics. Perfect. And if we have open, finite-temperature quantum dynamics, we would naturally write down Lindbladian dynamics, a Lindblad equation. This is clearly not an exhaustive list of all the dynamical classes you can have, but for the types of physics we think about with many-particle interacting systems, it summarizes most of what we care about: closed systems that are just evolving by themselves, and open, finite-temperature systems, in both classical and quantum settings. And you can already, I suspect, guess what I'm about to say, which is that it is in fact impossible to map the dynamics of a logistic map onto any of these dynamical classes. The logistic map belongs to a different dynamical class, and that dynamical class is often called contractive dynamics.
I've named it for you — I've told you it's a different class, called contractive dynamics — but I haven't yet told you why it is relatively expected (people have thought about this for a long time) to find stable discrete TTSB in contractive dynamics, while, as I will emphasize, it's much, much harder to find it in any of these physically motivated dynamical classes. The simplest way to understand that is to first understand why people call it contractive dynamics in the first place. To understand the origin of the nomenclature, let's contrast it with Hamiltonian dynamics — you can already anticipate what's coming from my description of the logistic map over here. There's an immediate way to see that you will not be able to write down a Hamiltonian that effectively implements the logistic map. The reason — one way to say this — is that if you think about Hamiltonian evolution, you will remember from your classical mechanics class that there is Liouville's theorem. And what does Liouville's theorem tell you? It tells you that the volume of phase space is always conserved under Hamiltonian dynamics. By contrast — and here you'll immediately see the naming convention — the logistic map takes any finite volume of phase space and essentially always contracts it to a single point. You can already see that in the description we gave when we were talking about the picture of the logistic map: it doesn't matter what you start with for x. You start with an ensemble of different x's.
At the end of the day, they'll always end up on those two values. The logistic map takes any finite volume of phase space and contracts it to a point, as does a coupled map lattice. So if you wanted to connect this to physical systems, you might say that the logistic map is an example of a system that simply doesn't satisfy fluctuation-dissipation: you have dissipation — hence contractive dynamics — but without any fluctuations. This immediately gives us some intuition. We've already emphasized that, for dynamical maps in general, it's not that hard to write down something with an order parameter that looks like it has discrete TTSB; what's much harder is to guarantee stability. But here the stability is inherited from the contractive property of the logistic map. The way I would put it is this: stable, discrete, spontaneous time-translation symmetry breaking is relatively easy to find — there's been on the order of a hundred years of research on this — in purely dissipative, contractive dynamics, because the contraction prevents any would-be fluctuations from destabilizing the time-translation symmetry breaking. Very simple, right? It's intuitive: you have a dynamical system undergoing some evolution, and even if you built in fluctuations at the level of a finite ensemble — you chose a finite phase-space volume of initial states — it doesn't matter, because contractive dynamics always contracts it down to a single point. So at this point, I hope we're starting to zoom in on the question, right?
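[Editor's aside: the contraction-of-an-ensemble statement can be checked directly. This sketch — with an illustrative choice of r and of the ensemble — evolves many initial conditions and counts how many distinct final values survive.]

```python
# Start an ensemble of initial conditions spread over (0, 1) and evolve
# each with the logistic map. Every trajectory collapses onto the same
# two-point orbit: the map contracts phase-space volume (dissipation
# without fluctuations), in contrast with Hamiltonian flow, where
# Liouville's theorem conserves it.

def evolve(x, r=3.3, steps=2000):
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

ensemble = [0.05 * k for k in range(1, 20)]      # x0 = 0.05, ..., 0.95
finals = sorted(evolve(x0) for x0 in ensemble)

# Collect the distinct values the ensemble ends up on (up to round-off):
distinct = []
for v in finals:
    if not distinct or abs(v - distinct[-1]) > 1e-6:
        distinct.append(v)
# 19 initial conditions, at most 2 surviving values: the volume contracted.
```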
We started off just saying that, in principle, we had these three requirements. We went to the very general story of nonlinear dynamical systems, and we emphasized that there you can already get something that seems to satisfy all the definitions — but there's a very good reason for that. It doesn't make it any less interesting; it's just that there is a very good reason why that type of stable time-translation symmetry breaking is natural in this particular class of dynamical systems. What we're interested in — what I would say modern research topics are interested in — is whether you can get discrete TTSB in any of these four classes. And it turns out that's very hard, and the reason it's very hard is the ubiquity of ergodicity. Good, so let's follow our nose and continue — okay, yes, are there any questions? We're going very methodically. Yeah, please. You just can't. The point is that one of the defining properties of Hamilton's equations is that they conserve phase-space volume, while one of the defining properties of the logistic map is that it does not. So there's just no way to write down a Hamiltonian version of the logistic map, because of that very different property. In that case, you can start to think about contractive dynamics because you do have dissipation in those systems — but usually dissipation comes with fluctuations, and the logistic map, again, is an example of dissipation without any fluctuations. So it also doesn't naturally map onto any finite-temperature physics you might think about. It's really just very different. Yeah, please. Yeah, I don't want to say "open," because I'm going to reserve "open" for the finite-temperature context; here the dissipation comes from the fact that it's a contractive system. Exactly. Sorry, was there an additional question?
I didn't — sorry, I apologize, I cut you off. No, that's right, that's exactly it: you don't expect symmetry breaking to be this easy in those classes, because all of them basically satisfy some conservation of information, and here you just don't have that. Can you add fluctuations to the logistic map? Absolutely, at some level. And in some sense, if you add fluctuations that ultimately satisfy something like fluctuation-dissipation, then that brings you back over here, and you won't be able to have stable discrete TTSB anymore. But you can add fluctuations that don't — actually, it's not even immediately clear. It's believed that you can add certain types of fluctuations, with certain correlations, that don't look like finite temperature and that still allow the logistic map to have something like asymptotic periodicity. What is certainly known, I would say, is that by the time you get to something that does satisfy fluctuation-dissipation, all bets are off. But it's believed that you can add fluctuations of a certain sort that take you away from having no fluctuations whatsoever while still leaving you with contractive dynamics. If you reach the point where you have enough fluctuations, added in just the right way that the volume of phase space is preserved, then you're back here. It could be that with fluctuations the contraction isn't as fast as in the noiseless logistic map, and you can never contract all the way to a point — right, exactly. No quantum here: the logistic map, I want you to think about purely on the classical side of things. Yeah, exactly — I just don't think about it. I've never "turned on h-bar" for the logistic map, because turning on h-bar, to me, means I have a classical limit; maybe I want to perturbatively turn on h-bar and do WKB or something like that. But I'm telling you, there's no classical Hamiltonian version of this thing.
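[Editor's aside, returning to the fluctuation question for a moment: the claim that weak noise need not destroy asymptotic periodicity is easy to illustrate numerically. The additive-uniform noise model below is my illustrative choice, not from the lecture.]

```python
# Logistic map with weak additive noise,
#   x_{n+1} = r x_n (1 - x_n) + xi_n,   xi_n uniform in [-a, a].
# The orbit no longer hits the two period-2 values exactly, but it still
# alternates between two narrow clusters: the subharmonic response
# survives weak fluctuations while the dynamics remain contractive.
import random

random.seed(1)
r, a = 3.3, 0.002
x = 0.41
for _ in range(1000):                    # transient
    x = r * x * (1 - x) + random.uniform(-a, a)

evens, odds = [], []                     # sample by parity of the step
for n in range(200):
    x = r * x * (1 - x) + random.uniform(-a, a)
    (evens if n % 2 == 0 else odds).append(x)

gap = abs(sum(evens) / len(evens) - sum(odds) / len(odds))
spread = max(max(evens) - min(evens), max(odds) - min(odds))
# gap >> spread: two well-separated clusters, visited on alternate steps.
```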
So it's hard for me to even think about the quantum analog, because I'd like to start with at least some well-defined classical, many-body, Hamiltonian version of the logistic map and then turn on h-bar — and then you could maybe ask, oh, what's the quantum nature of that classical Hamiltonian simulation of the logistic map? But the first step doesn't exist. So it's hard for me to imagine what it means to have a quantum logistic map. I suspect that if you Google "quantum logistic map," somebody has thought about it, but it doesn't feel very natural to me. Okay, let us continue. At this stage we're going to keep following our nose, and we're going to try to build a Hamiltonian time crystal. As I've tried to be very consistent about, I will be agnostic as to whether I'm quantum or classical. Again, at some point I will talk about ergodicity-breaking strategies, and there we will have to distinguish quantum from classical. But at the moment I'm talking about Floquet Hamiltonians, and I'd like to build a Floquet Hamiltonian time crystal that lives within these four dynamical classes, without it mattering whether I'm quantum or classical. It turns out that even in Hamiltonian systems — I've emphasized how in logistic maps, or contractive dynamics, you can get this kind of thing — even in Hamiltonian systems there are classic examples (not classical, just classic) of subharmonic oscillations. And it turns out that perhaps the most well-known such example is the so-called parametric resonance.
And this will immediately relate — I think it was your question yesterday — to the question: don't you expect to get subharmonic behavior when the driving frequency is about two times the natural frequency of something? That is exactly what I would call a parametric resonance. And to make sure you have a picture in your minds — or if you go literally look it up on YouTube — a classic, very pretty example of a parametric resonance is that of Faraday waves. We'll start with Faraday waves and then ultimately move to Anatoly's favorite single nonlinear oscillator, but Faraday waves are prettier to start with. The setup is the following: take an open container of water, so imagine you shake, i.e., periodically drive, a container of water that has a liquid-air interface. I should zoom out for one second to make sure we're road-mapping. We defined at the beginning of the lecture the three requirements for a time crystal as a phase of matter. Then we said that whether you can find systems satisfying those three requirements easily or not depends on the dynamical class of the update rule, or of the equations of motion, that you have. And I gave the example of a very broad class of dynamical update rules called contractive dynamics — of which the logistic map and the van der Pol oscillator are examples — where indeed you satisfy all three. But the problem, as I emphasized there, was that these are not really Hamiltonian systems; they are contractive dynamics. So what I'm doing now is going to Hamiltonian systems, and we will see that for few-body systems in the Hamiltonian context it is in fact possible to satisfy all of the requirements we want — but only for few-body systems.
There will be a very, very strong obstruction when we get to truly many-body systems, and that verges onto the modern research, which is about finding new ways, in these many-body systems, to get Floquet phases of matter that are themselves built on the breaking of ergodicity. That's where we're going. So at this stage, you should expect that I'm going to show you Hamiltonian systems that look like they almost satisfy everything, but ultimately, in the many-body limit, they will not — and I'll explain very intuitively, in two or three different ways, why they do not. So: you shake a container of water that has a liquid-air interface, driving it vertically and periodically at a driving frequency omega_d. What can happen — this is really very pretty stuff — is that surface waves can develop that oscillate at a subharmonic frequency omega_d / m. In fact — and this is related to a question asked after lecture — you can in general get rational subharmonics of the form (n/m) omega_d. The question was whether you can only get something like period doubling or period quadrupling, i.e., frequency omega_d over some integer; at least in these contexts — and it will also be very natural when we start talking about slightly more complicated scenarios — you can get rational subharmonics quite easily as well.
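[Editor's aside: the parametric instability behind Faraday waves — a single mode driven at twice its natural frequency, as derived below in the lecture — can be previewed numerically. All parameter values are illustrative.]

```python
# A single (linearized) mode obeying a Mathieu-type equation,
#   q'' = -(w0**2 + d * cos(wd * t)) * q,
# driven at wd = 2*w0: a tiny seed displacement grows exponentially
# while oscillating at wd/2 -- the exponentially growing subharmonic.
import math

w0, d = 1.0, 0.3          # natural frequency and drive strength
wd = 2.0 * w0             # drive at twice the natural frequency
dt = 0.001

def accel(q, t):
    return -(w0 ** 2 + d * math.cos(wd * t)) * q

q, v, t = 1e-6, 0.0, 0.0  # tiny seed displacement
for _ in range(int(200.0 / dt)):      # velocity-Verlet integration
    a1 = accel(q, t)
    q_new = q + v * dt + 0.5 * a1 * dt * dt
    v += 0.5 * (a1 + accel(q_new, t + dt)) * dt
    q, t = q_new, t + dt

envelope = math.hypot(q, v / w0)      # rough amplitude measure
# envelope has grown by orders of magnitude from the 1e-6 seed
```

The textbook growth-rate estimate for this equation is roughly gamma ≈ d / (4 w0), so over t = 200 the seed grows enormously; in the real fluid this blow-up is cut off by the nonlinearities.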
So we have a very physical system — one that should clearly be described by some physically sensible dynamical class — that also looks like it has a subharmonic observable: the height of the surface wave as a function of time. The intuition was known for a long time, but the microscopic, mathematical understanding was not complete. I would say it was understood really well by Benjamin and Ursell in 1954, although the physical phenomenon itself had been known for many, many years before that. Benjamin and Ursell understood the nature of the subharmonic oscillations as the existence of a linear subharmonic instability that is regulated by nonlinearities — let me unpack what each of these words means. In fluid dynamics there are many complicated equations, but if you're thinking about incompressible fluid dynamics and trying to understand surface waves, you might naturally imagine solving the Euler equations, which are of course nonlinear differential equations. What Benjamin and Ursell did was linearize the Euler equations, and when they did, they eventually obtained the Mathieu equation. For the Mathieu equation, precisely when the driving frequency is near two times the natural frequency of an oscillator, you have an instability toward a subharmonic oscillation: a solution of the differential equation that is subharmonic. But it turns out that solution has an exponentially blowing-up coefficient, and the role of the nonlinearity — remember, they linearized the Euler equations, but the Euler equations are themselves nonlinear — is to regularize that blow-up. We'll write this down in equations now, but that is
the story in words. So what they did was linearize the Euler equations, and what they found is that for each Fourier component of the surface height, the problem reduces to a differential equation of the form q_k'' = -(omega_k^2 + delta_k cos(omega_d t)) q_k. Without the driving term, this is linear — just an oscillator. Here q_k is the amplitude of the surface wave at wave vector k, delta_k is the effective driving amplitude on that wave vector, omega_d is the driving frequency, and omega_k is the natural frequency. What they showed — which you can almost immediately see, and can certainly work out for yourself — is that when the driving frequency is near two times the natural frequency of the system, you get a parametric resonance. I should emphasize, for those who have thought about dynamical systems, that this is an example of the Mathieu equation. What Benjamin and Ursell figured out is that when you drive near two times the resonance frequency, there exists an exponentially growing but subharmonic solution of the differential equation, of the form q_k(t) ~ e^(gamma t) cos(omega_d t / 2 + theta): e^(gamma t) is the exponential growth, and cos(omega_d t / 2 + theta) is the subharmonic corresponding to period doubling. So we have a period-doubled solution in a Hamiltonian system. Again, we're changing the nature of the complexity at each step — we started with nonlinear dynamical maps that were contractive, and now we're in Hamiltonian systems — but we're always looking for subharmonic oscillations and trying to understand whether their manifestation should be thought of as a time-crystalline phase of matter in the standard stat-mech picture. But you can immediately see that e^(gamma t) is exponentially blowing up, so
that is a very unphysical solution. In fact, Benjamin and Ursell showed what happens when you really account for the role of the nonlinearity. There is a good part — which is what they analyzed — and a bad part, which is how I understand the obstruction we will ultimately face to having a many-body version of a time crystal in such a system. The good, which Benjamin and Ursell showed: if you go back and add in the nonlinearity of the Euler equations perturbatively, what the nonlinearity does is regulate the exponential blow-up. The bad, also sort of alluded to in that paper: the nonlinearity immediately starts to couple different k-modes. In the linear approximation, I just write down independent Mathieu equations for my different wave vectors k; as soon as I have nonlinearity, all my different k-modes are coupled. And of course, we've emphasized the need to think about generic ensembles of initial conditions — and any generic initial condition will have some weight, some energy, in higher k-modes, for example. I want you to think intuitively of the thing that destabilizes discrete TTSB as fluctuations generated by this type of nonlinearity. So there is a tension in everything we'll talk about from now on: there will be interactions — nonlinearities — that are important for stabilizing things (in this case, regulating the exponential blow-up), but they'll also hurt, because you want to see, for example, purely period-doubled oscillations in a particular k-mode, and if you have lots of weight in other modes that are also evolving, those act, via the interaction, as an effective noisy force on the mode in which you're looking for stable TTSB. That's the specific language in this context; in the other contexts we'll discuss, the language will be a little bit different, but
the concept will always be the same. The idea here is that the coupling to these other, higher k-modes looks effectively like a noisy force on the mode where one is trying to stabilize TTSB. Combining the good and the bad now begs the obvious question. We've gotten to Hamiltonian systems — so we're very happy with respect to our tree over there; we're thinking about the right type of dynamical class — and we've seen that there are literally physical phenomena that look like subharmonic oscillations. But now there's a question about the role of the nonlinearity: does infinitely long-lived (in the thermodynamic limit) discrete TTSB survive when we treat this type of parametric resonance properly? Let me emphasize again: at some level this is already clearly a many-body problem, in the sense that there's lots of water, lots of air, lots of stuff happening. But parametric resonances are most easily understood — and we will understand them in just a little bit — for single degrees of freedom, for single nonlinear oscillators. The question is: from the perspective of parametric resonances, which are extremely well known and well understood, including how stable they are for single nonlinear degrees of freedom, does discrete TTSB survive when we treat such a parametric resonance as a genuinely nonlinear, many-body problem? I also want to emphasize something that will be very important, and that I feel quite strongly about. Given that you've seen things like Faraday waves, and you can see them on human time scales, whether or not the questions of stability that we're going to explore — it's a very concrete question, and the community, myself included, is asking it, so it's fine — are really ultra-important from, for example, an experimental
perspective, I'm not sure. What I mean by that is the following. I've emphasized, even in our definition, that the discrete TTSB should have an infinitely long lifetime, in the sense that as you take the thermodynamic limit — the system size L to infinity — I would like the lifetime to diverge, for example exponentially in the system size. This is what happens in the transverse-field Ising model, in your magnet: for any finite-size system, if you wait exponentially long, your magnet will flip from up to down and forget its initial condition, but that happens on a really long time scale that goes as e^L. So here we'd like to ask the same thing: can you get TTSB that survives in a genuinely nonlinear, many-body, classical or quantum system, with a lifetime that diverges as e^L, like your ferromagnet's does? I'll emphasize, though, that whether your refrigerator magnet has a lifetime diverging as e^L or merely some other astronomically long time scale wouldn't matter for sticking your picture on the fridge. That's what I'm emphasizing here: although I will explain the obstruction that prevents Faraday waves from truly having that type of exponentially long-lived TTSB, at the level of the subharmonic phenomenon it's certainly there. So — I started a little late — we will attempt to answer this question by working through it in the context of single, and then coupled, nonlinear oscillators, a system you have already seen a lot in Anatoly's lectures, although not with parametric driving. And in fact, just as a preview, I will tell you that the answer is the following. Strictly for a single body — so
I wouldn't call it a phase of matter anymore, because it's a single degree of freedom — but for a single oscillator, and, although less rigorously proven, still expectedly for a few oscillators, the answer is yes: you will get rigid TTSB that survives for a lifetime that is essentially infinite. This will ultimately be a consequence of something you've heard before: the KAM theorem of Kolmogorov, Arnold, and Moser. For the many-body case of many coupled, parametrically driven nonlinear oscillators, the answer will be no — at least as I will explain it to you, in the language of the obstruction being ergodicity. I think this is as good a spot to stop as any; we'll continue after lunch by working through this. Thanks.