Yeah, good afternoon everyone. I have the pleasure of introducing our speaker today, John Goold. John is from Ireland and he obtained his PhD in 2006 from University College Cork. 2010. 2010, okay. And then he moved to Singapore for his first postdoc, and then back to Europe with a Marie Curie fellowship in Oxford for three years. Then you were five years in Italy at ICTP in Trieste. Then you went back to Ireland in 2018 with a fellowship from the Royal Society, and in the same year he obtained an IRC Starting Grant. He's currently an associate professor at Trinity College Dublin, where he's also the director of the Master in Quantum Science and Technology and the head of the research group in thermodynamics and energetics of quantum systems. John is an expert in thermodynamics, both classical and quantum, which I think we will talk about today, and he's also an expert in many-body physics and quantum and classical simulation. Thanks a lot, Alberto. So yeah, thanks very much to the institute here for inviting me. I was supposed to come last year but some stuff got in the way. I've never been to Denmark before, so I'm enjoying myself here. I was telling Alberto yesterday that every Irish person around my age knows Aarhus from a poem by Seamus Heaney, the famous Irish poet who won the Nobel Prize in 1995. He wrote a poem called "The Tollund Man", which is about these preserved men, I think, that are in the museum here in Aarhus, so I was thinking about that on the bus ride from the airport, where he talks about the flatness of Jutland in the poem. But anyway. So yeah, I'm going to tell you all a little bit about something which I call the quantum Joule paddle bucket.
It sounds a bit weird, but actually it's quite natural, mainly in the context of pure-state thermodynamics. But maybe I can just start off by telling you a little bit about my own group and what we're interested in, in case there's anybody else here who's interested in any of these topics and maybe wants to meet me to talk about any of them, since I'm here until tomorrow evening essentially. So we're a group that's interested in the interface between, I would say, thermodynamics, many-body physics and quantum information, loosely. On the one side we're interested in the energetics of quantum information processing: as we move into an era where quantum technologies are being developed at a rapid pace, there are lots of interesting questions surrounding the physics of those devices, or at least, as theorists, you can construct physics inspired by those devices. Then, I guess something which bridges the gap with people like Alberto is that I'm also very interested in stochastic thermodynamics as applied to quantum systems. While classical stochastic thermodynamics is a very well-defined topic, there's still fairly good scope for development in the quantum domain, mainly because the presence of coherence in the energy eigenbasis and the invasive nature of quantum measurements pose problems from the interpretation perspective when you try to construct such a theory. And lastly, we're interested in something that I want to talk about today, which is the notion of emergent thermalization and transport in many-body systems, both closed and open: how do we understand the emergence of thermodynamic quantities from quantum evolution? So it's the fundamental question of the emergence of statistical mechanics. And recently as well, as you'll see in the second part of the talk, the other thing that we became interested in, mainly just out of curiosity, is quantum simulation in general. So those are the kinds of topics we're interested in. What I'm going to talk about in the first part is something which is actually motivated by 19th-century physics: it's a Joule paddle bucket, which I'm going to tell you about in a second. Because it's a colloquium and I'm assuming basically everybody is from a very different background, I'll give a fairly pedagogical (at least in my mind) introduction to eigenstate thermalization, and then tell you a little bit about how we're using a dephasing thermometer to pose interesting questions regarding the emergence of temperature. This work in the first instance was done by Mark Mitchison, who was a postdoc in my group in Dublin but has now joined the faculty as an assistant professor; Marlon Brenes, a PhD student who graduated last year and has now moved to Toronto; Archak Purkayastha, who actually spent a little bit of time in Aarhus before moving on to a faculty position in India (he was a postdoc at the time); and Alessandro Silva from Trieste in Italy. And although he wasn't involved in this work, a lot of the stuff on eigenstate thermalization I learned directly from Marcos Rigol, who's one of the world experts in the computational physics of closed systems and the emergence of thermalization; I spent some time in Santa Barbara in 2018 and picked up a few things from him that we transferred across to our own work. Okay, so what do I mean about motivation from 200 years ago? I'm always interested in the history of thermodynamics, and at the turn of the 19th century, believe it or not, we didn't really know what energy was. There was no real concept of what energy was at all; they didn't have any good theory for it. The prevailing idea at the time, pre-discovery of the first law of thermodynamics, was that heat was a kind of fluid, which they called the caloric, and it flowed from hot
to cold, and they didn't really see that you could convert mechanical work into heat yet. The first person who noticed that there might be something more to it than the caloric theory was a man called Rumford. If you've ever been to Munich, you know the English Garden, which he established. He was an American, actually, and he was tasked with organizing the Bavarian army at the end of the 18th century, and he noticed that a huge amount of heat was generated while they were boring holes in cannons; they were making cannons by boring a hole. He got very excited about that, and he devised an apparatus in which he took these cannons, immersed them in water, and developed a thermometer with which he could actually read off the temperature of the water changing as they bored the hole in the cannon. And he wrote to the Royal Society with his findings. He actually was able to make the water boil, which was a big surprise for everybody who was around Munich at the time, because if you think about that era, the only time they would have seen boiling water was really when they lit a fire. So this man was just boring a hole in a cannon and boiling water around his apparatus; it was a big deal. And you can see he wrote to the Royal Society and he starts to say, look, maybe the caloric theory is not right; heat is probably a form of motion. It was completely discredited. You go 25 years later, and Carnot wrote the paper on the heat engine; it was actually written in the language of the caloric, which is an incorrect theory, but the right answer came from it. Somebody who knew about Rumford's work (I read the paper recently) about 50 years later was James Joule. And James Joule designed the defining experiment of, I would say, 19th-century thermodynamics, the paddle bucket. It's an ingenious idea. He starts off his work by quoting Rumford, that heat could be a form of motion, and he sets out an operational way of measuring that. You have some thermally isolated container with water in it, and then you have an apparatus which allows you to churn paddles; the paddles are driven by a falling weight, so you can measure the work done, and there's also an in-situ thermometer from which he reads off a temperature change. So he was able to confirm that in an isolated system you could really convert mechanical work into heat. The first law of thermodynamics was born out of Joule's experiment. He makes a big deal in the paper of saying that he could read off his mercury thermometers to three decimal points of accuracy; I'm not sure how much I believe him there, but in any case you appreciate it now. In the referee report, actually, he originally called it the conservation of energy, but they said that was too radical, and it was taken out; the referee said, look, this is too radical to call this the conservation of energy. Well, the rest is history, and thermodynamics was born in a sense from this experiment. So that's what the paddle bucket experiment is, and we were interested in this story, but now trying to interpret Joule's original experiment from the perspective of the Schrödinger equation. How can we recover everything that Joule saw just assuming quantum mechanics and the Schrödinger equation? So no ensembles; this is an isolated system, so I'm not assuming any heat bath on the outside. How can we get this from a closed system? And so this is the resulting paper here, and it actually leads you to the idea of how you take the temperature of a pure quantum state, which I'm going to tell you about. This is connected to the thermometry business, which has become quite fashionable at the intersection between the thermodynamics and quantum information communities. Essentially, measuring temperature is a
metrological problem; it's like a parameter estimation problem in quantum mechanics. So let's break down the original experiment by Joule into three stages. The first stage is the ability to perform and measure work in thermal isolation, so defining that. The second part, which is a bit non-trivial, is the notion of relaxation: inside this thermally isolated container you have to imagine some sort of ergodic postulate whereby the fluid re-equilibrates, or thermalizes, to a new temperature which corresponds to the initial energy density put in by moving the paddles. And then there's the important in-situ thermometry. So we want to try to get those three components into a quantum mechanical setting. Number two is the tricky one for quantum mechanics, and it's an old story: how do you define thermalization in an isolated quantum system? That question is as old as quantum mechanics itself. It was posed by the likes of Schrödinger and von Neumann; you see these papers in '27 and '29 where people were trying to construct a sort of ergodic hypothesis in quantum mechanics. If you open up a classical statistical mechanics textbook, the ergodic hypothesis is almost assumed at the start. There are some rigorous arguments, but they're rare, and they're phase-space arguments, and of course in quantum mechanics we don't have the luxury of a well-defined phase space. So this is a problem: how do you actually define thermalization? Since those papers, I would say the question remained a bit academic in the 20s and 30s, which is kind of strange; in the end you can just open the window and you thermalize to the temperature of the room. But it started to become much more important with the development of AMO physics, whereby we now have examples of many-body systems which, unlike in condensed matter physics, are thermally isolated from phonons for time scales that would previously have been unprecedented. They're so well isolated that, to a good approximation and up to some long time scale, you can model those systems of particles as obeying a Schrödinger equation. And the first experiment, down at the bottom here, that led to a resurgence of interest in this old question of why or how quantum systems thermalize was this experiment in the Dave Weiss group at Penn State, and actually it showed the opposite. They trapped a one-dimensional gas of bosons; the underlying Hamiltonian of such a system is called the Lieb-Liniger model, bosons interacting with a delta-function interaction in one dimension, and it's known to be integrable by the Bethe ansatz, which means essentially that the system has an extensive set of conserved quantities. And they noticed that if they did a quench, so they put a Bragg pulse in and then tracked the dynamics of the density, the momentum distribution that they measured after a long time was not thermal. People said, okay, what's this about? Actually it makes sense that it's not thermal, because it's an integrable system with an extensive set of conserved quantities: it might be equilibrating, but it's not equilibrating to a standard statistical mechanics ensemble. Many other examples exist where this integrability can be broken. There's this dipolar Newton's cradle experiment whereby they can tune the system between being integrable and not being integrable, so they can observe standard thermalization. And now you can see very well that if you perform quenches on generic many-body systems and look locally at an observable, that observable appears at long times as if it has thermalized at a temperature set by the initial energy density induced by the quench. I'll explain this in
more mathematical terms next. So the question is: what is the mathematical apparatus, or framework, to think about this type of internal thermalization? Again, this is really thermalization inside a many-body system in the laboratory, an internal temperature that emerges. Of course there's a temperature of the room and a temperature of the apparatus, but I'm really asking, locally, what does the temperature of an observable look like if you assume the universe, or your experiment, is evolving according to the Schrödinger equation? So the standard way of thinking about thermalization in isolated systems is the following. Assume a large system. Assume you have some Hamiltonian which is generic in the sense that it doesn't have any conserved quantities other than the Hamiltonian itself; let's just assume there are no degeneracies. And take some initial state. That initial state can be trivial if you like, but not an eigenstate of the Hamiltonian, so it's some superposition in the eigenbasis, and of course these coefficients are just the overlaps between the energy eigenbasis and the initial state. You're going to assume that that state has an extensive energy, which is reasonable in statistical mechanics: the energy scales like N, the number of particles. But you also want to assume that there are sub-extensive energy fluctuations in the initial state; in other words, the distribution of these coefficients, the energy distribution of the initial state, is sharply peaked around the average energy. So again, these are the three assumptions: a large system with a generic spectrum, an initial state with extensive energy, and sub-extensive energy fluctuations. Such a state you might generate from a quench, or from an arbitrary preparation in a simulator, or whatever. And now what you want to do is time evolution according to that complex Hamiltonian. So you do some time evolution, and very quickly, if your system is generic, your initial product state, for example, will become entangled. Typically, ergodic systems are not special systems: they typically have a half-chain, or half-system, entanglement entropy which grows roughly linearly in t and then saturates to something like a volume law, scaling with the system size. So if I look at a local observable in such an evolution, I see behavior like this: if I do a numerical experiment or a real experiment, I'm going to see the observable oscillate, and as I make the system larger I might see these oscillations start to settle down around some average value. Just write out the expectation value of the observable, and you see that you can split it up into the diagonal and off-diagonal matrix elements of the observable in the energy eigenbasis; so it's reasonable to assume that this type of behavior is dictated to you by the behavior of that observable in the energy eigenbasis of the Hamiltonian. So what is this dashed value? Essentially, the dashed value is nothing but the time average of the signal, which is the expectation value of the observable as a function of time. If you take a time average of O(t), what you can see is that you kill off the phases, so the off-diagonal contribution goes away, and you can accurately describe the average of O as a sum over n of |c_n|² O_nn, where the O_nn are the diagonal matrix elements in the energy eigenbasis. This is known as the diagonal ensemble. The diagonal ensemble is equivalent to dephasing your state in the energy eigenbasis, which is equivalent to an infinite-time average, so the expectation values of O here correspond to expectation values in this diagonal ensemble, whose eigenvalues are the overlaps of your initial state with the energy eigenbasis of the Hamiltonian. Okay, is it all clear for
now? Any questions? Good. So the question now of thermalization is the following: we know that in the thermodynamic limit, the time average of O, unless there's something very peculiar happening, will be described by the trace of O against the diagonal ensemble. But the question is: when can you replace that diagonal ensemble, which contains a huge amount of information, with one of the ensembles of statistical mechanics, say the microcanonical ensemble, which is the most fundamental one? When can I get rid of all of that extra information in the diagonal ensemble and just replace it with a thermal expectation value? If we didn't have such a postulate in classical statistical mechanics, there wouldn't be much point in doing things; you need to make some simplifications. So when is it reasonable to do such a thing, and how do we understand that? Well, one very popular way of understanding it is called the eigenstate thermalization hypothesis. It's basically a beefed-up version of random matrix theory with an energy dependence. The seeds lie in work by Michael Berry, and the formulation that I'm going to talk about, which is widely accepted, is by Mark Srednicki. I actually talked with him, and he said that the reason he decided to come up with it was that he was teaching quantum statistical mechanics to graduate students and had no way of explaining thermalization, so he wanted to work backwards and come up with an explanation. So this is a hypothesis; there's no rigorous proof. It is a hypothesis on how the matrix elements of local observables in chaotic systems behave. You see that there are two components, the diagonal component and the off-diagonal component, and two different energy scales: the average energy, which is (E_n + E_m)/2, and the frequencies, E_m minus E_n. You see that the diagonal is postulated to be a smooth function of the energy
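To fix notation: the ansatz being discussed, in the form usually attributed to Srednicki (the symbol names below are the conventional ones, not necessarily those on the slide), reads

```latex
O_{mn} \;=\; O(\bar{E})\,\delta_{mn} \;+\; e^{-S(\bar{E})/2}\, f_O(\bar{E},\omega)\, R_{mn},
\qquad \bar{E} = \frac{E_m + E_n}{2}, \qquad \omega = E_m - E_n,
```

where O(Ē) and f_O(Ē, ω) are smooth functions of their arguments, S(Ē) is the thermodynamic entropy at energy Ē, and R_mn is a random variable with zero mean and unit variance.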
and the off-diagonal contributions, which I'll come to a bit later. These have a prefactor which is the exponential of minus half the entropy at energy E, and that is something which goes away quite quickly as a function of system size, because it's one over the square root of the density of states. You have another smooth function on the off-diagonal, which we'll see later governs the correlation functions of the system, and you have this R_nm, which is a random variable with zero mean and unit variance. So how does this explain thermalization? That's the question. This is a postulate for how the matrix elements of generic systems behave in the thermodynamic limit, or for large systems. Well, to see how it works, think about the expectation value of the observable in the diagonal ensemble. The eigenstate thermalization hypothesis tells you that the diagonal matrix elements of the observable in the energy eigenbasis become a smooth function of the energy, and that means we can Taylor expand them about the average energy. So E here is the average energy, and that average energy is set by the only conserved quantity in the system, the energy of the initial condition, which is constant in time. So what you do is perform a Taylor expansion of O_nn, which according to the eigenstate thermalization hypothesis is a smooth function of the energy, and put it into the diagonal ensemble prediction. Look at the first-order term: I can pull the derivative out and I'm left with the sum over n of |c_n|² E_n, which is the average energy, minus the average energy: zero. So you're left with something which is basically the expectation value of the operator at energy E, plus a term involving the second derivative of the observable with respect to the energy. And notice that ΔE² is of order N, but the second-derivative factor is of order 1/N², so this term is a 1/N correction. In the thermodynamic limit, you see that if the eigenstate thermalization hypothesis is true, if all that you require is that the diagonal matrix elements of the observable in the energy eigenbasis become a smooth function of the energy, then that's actually enough to say that the diagonal ensemble prediction is equivalent to the microcanonical prediction, which is the value of the observable at energy E; and that energy is set by the energy density of the initial condition. The other thing that you want to show is that the actual time evolution should remain close to the average for most times, because you could have that the average agrees but there are wild fluctuations. That's taken care of in the ETH, the eigenstate thermalization hypothesis, by the fact that the off-diagonal matrix elements carry this prefactor e^{-S(E)/2}: the entropy is extensive in N, and the exponential kills them off. So if you want to show that the time-averaged temporal fluctuations of the actual observable about the average value die away as you make N large, you can use some basic bounds to show that these fluctuations are bounded by something exponentially small in the entropy, so they're small as you increase the system size. You'll see a lot more of this explained in detail in this nice review here. And the last thing that I'd like to say about the eigenstate thermalization hypothesis concerns this function here: what is f_E(ω)? f_E(ω), which we call the spectral function, is really where the magic of the ETH takes off, and what it tells you is something about fluctuations, if you're now not focusing on expectation values of the observable but on two-point functions. So think about the connected correlator on an eigenstate. What you can see is that the spectral
function, or rather the modulus squared of the spectral function, will give you back the correlation functions that you know very well from condensed matter physics, like the symmetric response function or the Kubo response function. But the cool thing is that the ETH tells you it's sufficient to evaluate these correlation functions on eigenstates at energy E, with an energy which matches the energy of your initial condition in the time evolution. So not only have you got thermalization at the level of the observable; what the ETH also imposes is that the fluctuation-dissipation theorem holds at the level of a single eigenstate. This is somehow less studied and observed than observable thermalization, but it's an extremely important feature. How can you extract these things? How do you study them? Well, you're dealing with a very complicated many-body system with no particular symmetry, so you're pretty much limited to exact diagonalization, and luckily enough, to study matrix elements in the energy eigenbasis you just do ED. If you want to extract the smooth function of the energy, f, which dictates the behavior of the correlation functions, you can do so by essentially binning the matrix elements in frequency windows and continually shifting the window. This allows you to extract the modulus of f squared up to an energy-dependent prefactor; it's just a numerical procedure, and it's pretty straightforward when you think about it. What do they look like? This is computed in a spin chain with some integrability-breaking terms, which I'll get to in more detail. These are the matrix elements as a scatter plot, and here's the running average of this thing: you just bin it into different frequency windows, and this black line is the smooth function on the off-diagonal matrix elements. What you can do then is take that and build the standard response functions of whatever operator you're interested in. So this, for example, is in a staggered-field model: this one is the symmetric response function, this one is the Kubo response function. Obviously these things are interesting also because you can measure them experimentally. The temperature that you get here is the temperature that you would compute in the microcanonical ensemble, so the inverse of the derivative of the entropy with respect to the energy. You can use this to study things like the Fisher information, because the Fisher information can be recast in terms of response functions; I don't want to talk about that, it's work that we've done before. But what I'd really like to do is go back to this idea of the Joule paddle bucket. We're going to put a non-integrable many-body system that fulfills the ETH as our water in Joule's paddle bucket, and we're going to perform an in silico experiment, numerically, exactly like he did, on a computer. So this is the model: the Heisenberg model in a staggered magnetic field. You can see this field in the z direction breaks the integrability; it's known to break all of the conserved quantities of the XXZ chain, it's known to be diffusive, it obeys the ETH, et cetera. If you look at the matrix elements of this model as a function of the energy density, the main plot shows the diagonal matrix elements as a scatter plot for increasing system size; you can see that as you increase the system size the scatter shrinks, and this black line is the microcanonical prediction for a given observable A. So that's what you do when you do a scatter analysis of matrix elements. On the right-hand side you see, on a semi-log plot, the extraction of f_E(ω) at different energies E; different E's correspond to different thermalization temperatures. And what you notice is that the higher-frequency part is totally
insensitive to changing the energy density, but the low-frequency part of the correlation function is what's sensitive to changing the temperature. This is going to form the basis of our thermometry scheme in a little bit. So that was just to show you that the ETH is fulfilled naturally in this model. Now what we're going to do is start in the ground state of this model, and we're going to turn on our stirring, our paddles, by modulating a single spin in the center of the chain. We're just going to pump energy into the system by periodically modulating one of the spins. So you see here a plot of the energy as a function of time in what we call the preparation stage; this is stirring the paddle. You start at the ground state energy, and you increase the energy just by stirring for different lengths of time. If I stop the modulating field at any point in time, I fix the energy with which I'm then going to do free evolution. What's shown down here is the energy distribution for a given preparation corresponding to an average energy of minus 8J, where J is the hopping, which means I've stopped somewhere here; and that gives me different preparations in time here. If you want, the black line is lower temperature, and as you go towards zero energy, that's higher temperature. It's a bit confusing, but the density of states is roughly Gaussian, peaked around zero energy, and the high-temperature states are in the center of the spectrum. You can see the higher-temperature states rapidly equilibrate; the lower-temperature ones take a bit of time. What's shown here as the dashed line is the prediction from the diagonal ensemble, so this gives the equilibration time for this observable: when this red line touches the dashed line, that's the equilibration time. And how do we know that corresponds to thermalization? Well, we can compare different initial energy preparations against the microcanonical prediction. What's shown here in the black dots are predictions from the time average against the microcanonical prediction, and you can see statistical mechanics works very well. The cool thing is, of course, remember that in this experiment we're never leaving the assumption of a closed system; we're just looking locally at some observable. If you want to call that an open system, fine, but globally you're just evolving according to the Schrödinger equation. And what I'm showing here on the right and left is just the behavior of the correlation functions, to show that the fluctuation-dissipation theorem is also holding. So this is the preparation: I'm performing mechanical work by stirring the system, measuring the work distribution, and then allowing for relaxation; and that relaxation, or thermalization, depends on how long you've stirred for. So that's the idea of this paddle bucket. The last thing to think about is how you can do in-situ thermometry. There are a number of different ways you can do that, but we choose Ramsey interferometry. What we do is take this time-evolving state, this many-body system that's evolving in time; we know that it has equilibrated and thermalized to the correct temperature, and we want to try to read out that temperature, which is set by the energy you've pumped in. So we couple in an ancilla qubit which couples in a spin-dependent way: when the ancilla is in the state up, it kicks the many-body system, and when it's in the state down, the system doesn't see it at all. So if you want, it's a Ramsey interferometer, and we choose an interaction with our ancilla which commutes with the system Hamiltonian, which means it preserves the energy of the ancilla, so really you're just affecting the off-diagonal matrix
elements of the ancillary right and from that defacing signal yeah we want to try to read out the temperature right so it sounds like a bit I mean you know fast forward but like it's something that's done fairly routinely I was only talking to Jan about it today in ultra cold gases so this is an experiment from Rudy Grimm with impurities in in in Fermi gases and they can do this Ramsey interferometry scheme by means of tuning fresh back resonances so they can really read out the defacing of embedded impurities in gases right so the question is how is the defacing yeah connected to the temperature that you're equilibrating to so I'm probing an equi I'm probing an equilibration temperature here so in order to do this you have to do a little bit more work right so again here's the Hamiltonian this is the Hamiltonian of my dual paddle bucket if you want my spin chain this is the interaction Hamiltonian of my of my of my spin and I prepare my ancillian plus and what I do then is I notice that the off diagonal matrix elements when I trace out the many body system are just given to you by this sort of loch mid amplitude whereby you have an evolution corresponding to a perturbed evolution here psi zero is not the initial state of the many body system it's actually the state after sort of equilibrates because you're trying to probe the emergent temperature scale and what you can do then is you can assume that the ancillas weak in weak coupling you don't have to but if you do you can do a little bit more you can perform a cumulant expansion of this object and what you see is in that long times yeah this thing is dominated by the zero frequency component of the Fourier transform of the auto correlation function of the interaction Hamiltonian right so if you assume for example that the interaction Hamiltonian is coupling via sigma z this thing probes the zero frequency component of the Fourier transform of that correlation function and remember that it's precisely the low 
frequency part of the correlation function that I showed you which is sensitive to the energy, right, to the temperature. So what you can see is that chaotic systems generically display finite DC noise, and that leads to a pure exponential decay with a temperature-dependent rate. You can find details in some of these papers. So how good is this? When you make a thermometer, you want to have some very well-defined property of your system that you can gauge the temperature from; think about a mercury thermometer. What good is a thermometer if you don't know how the physical properties of your probe vary with temperature? So you first have to make a model. What we find is that although the exponent of the exponential decay of the dephasing is linear in the energy density, this translates to a non-linear dependence on the temperature, which is a little bit disappointing; what you'd be hoping for is a linear dependence on the temperature, to really make a good thermometer. However, all is not lost. The problem is we can only do numerics in one dimension, because these are very complicated exact diagonalizations, but you can make a hydrodynamic argument, which I don't really want to go into and which was first made to us by Alessandro Silva, showing that in three dimensions at weak coupling the exponent would actually be linear in temperature; it's just a peculiarity of the way the hydrodynamics works in one dimension that doesn't give you linear behavior with the temperature. So what are the pros? This works for generic ETH-obeying systems in arbitrary thermal preparations, and it does not require fine-tuned energy levels like other thermometry schemes. The cons are that it requires knowledge of the temperature-dependent DC noise.
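As a toy illustration of this dephasing readout, here is a minimal numerical sketch, assuming a random-matrix Hamiltonian `H` and coupling operator `V` standing in for the spin chain and the interaction (all names, sizes and parameters here are illustrative assumptions, not the actual code behind these results): the ancilla coherence is the Loschmidt amplitude between the unperturbed and the weakly perturbed evolution.

```python
import numpy as np

# Toy sketch of the dephasing readout: the ancilla coherence is a
# Loschmidt amplitude nu(t) = <psi0| e^{+iHt} e^{-i(H + lam*V)t} |psi0>.
# H and V are random Hermitian matrices standing in for the chain
# Hamiltonian and the system part of the interaction (hypothetical).
rng = np.random.default_rng(0)
d = 64            # toy Hilbert-space dimension
lam = 0.1         # weak ancilla-system coupling

def rand_herm(d):
    a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return (a + a.conj().T) / 2

H = rand_herm(d)
V = rand_herm(d)

psi0 = rng.normal(size=d) + 1j * rng.normal(size=d)
psi0 /= np.linalg.norm(psi0)   # stands in for the equilibrated state

def evolve(Hm, t, psi):
    # e^{-i Hm t} |psi> via exact diagonalization (fine at toy sizes)
    e, U = np.linalg.eigh(Hm)
    return U @ (np.exp(-1j * e * t) * (U.conj().T @ psi))

def loschmidt(t):
    return np.vdot(evolve(H, t, psi0), evolve(H + lam * V, t, psi0))

ts = np.linspace(0.0, 5.0, 21)
signal = np.abs([loschmidt(t) for t in ts])  # what the Ramsey sequence reads out
```

The magnitude of `signal` is what the Ramsey sequence reads out; per the argument above, in a chaotic chain its long-time decay rate is set by the zero-frequency noise, which is the temperature-dependent quantity being exploited.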
To make a thermometer, you need to know how the property of some part of your system behaves with temperature; that's how you make a thermometer, you really need to have a good model. And the open questions are: we didn't really explore the precision bounds, with the Fisher information in a metrological sense or anything like that; what is the role of measurement back-action, because when you actually measure the ancilla you're probably doing something to the gas it's immersed in; and can you improve the readout by using multiple probes? I actually have a few more slides, I don't know how much time I have left. Okay, fine, let me quickly tell you another thing, about pure state preparation. So that was thermometry; there's another useful application of pure state thermodynamics. We recently were doing some experiments with Nathan Keenan, a PhD student in my group who is actually employed by IBM. The advantage of having somebody employed by IBM is that they have access to the full stack, so we're using this coupled-transmon device called Montreal to do some interesting experiments, or simulations, of a spin chain model. So what's the idea? Again we take this Heisenberg chain, and this time I'm not breaking the integrability, so it's really an integrable system, and at high temperatures the transport properties of the Heisenberg model are extremely well characterized. It's known, for example, how spin is transported as a function of this anisotropy (spin is conserved in the model): when delta is less than one you have ballistic transport, when delta is greater than one you have diffusive, and at delta equal to one it's known that this model has this funny superdiffusive transport, which means you get a current which scales like one over L to the z minus one, where z is three over two, or a spin-spin correlation that goes like
one over t to the one over z, and these exponents are connected. So it was known that the transport behaves in this funny way. Now what is superdiffusion? Superdiffusion means that at finite size it looks like diffusion, but in the thermodynamic limit it looks ballistic, right? Just like subdiffusion, which is also anomalous diffusion, is the opposite: it looks diffusive at finite system size, but in the thermodynamic limit the transport goes to zero. So it's been conjectured recently, by Žnidarič, Prosen and other people, that the origin of this superdiffusion is that the correlation functions in this model fulfill an equation called the Kardar-Parisi-Zhang equation, which describes surface growth in classical statistical physics. It's not proven analytically, it's a conjecture: this funny exponent is coming from some underlying emergent universality at high temperatures at this particular point in the phase diagram. And by now there's actually considerable experimental evidence, quite recently, that this is the case. There was a traditional pump-probe experiment in neutron scattering where they measure essentially S(omega) at low frequencies, and they try to extract the exponent from the fit at high temperatures, and they see something quite close to this 3/2 value. And in Immanuel Bloch's group, in cold atoms, they prepared initial domain walls and looked at the evolution of the domain walls to extract this exponent from the population transfer, and they also get close to this Kardar-Parisi-Zhang exponent. So we were thinking: is there a way of doing this on an intermediate-scale device, one of these coupled-transmon, if you want, early-stage quantum computers? Would we be able to see something like that exponent? At first glance you think this is crazy, because transport requires quite large system sizes and long time scales, which are
precisely the type of thing that's not very easy to do with a quantum computer. But you can do some tricks, so we play a funny game. We take the device and we use this qubit that we know is good, good in the sense that the readout errors are very small; we know that just from looking at the numbers on the device. The first thing we do is like a Joule paddle bucket, except now for the infinite temperature we just make a random circuit. We apply a random layer of single-qubit gates on all qubits, chosen from root-X, root-Y and T, then we do a layer of CNOTs in a particular sequence, then we repeat, not applying the same single-qubit gate that we applied in step one, and we keep going. What you can do is plot the entanglement entropy, in theory, of half this device, and you can see that you saturate the maximum possible value after around 20 layers. So it's quite good as a randomizer; it's not true randomness in the sense of Haar, but it's enough to generate this big superposition in the computational basis, which is then reflective of very high temperatures when you do dynamics. What you see here is that we leave one of the qubits completely untouched while we randomize everything else. This is actually a shot of the magnetization following the circuit on the actual device, and you get this magnetization profile where one of the qubits is polarized and the rest are in this mess; that's actually off the hardware. The cool thing about this procedure, as we know from numerical work, is that you can use a completely random state to compute a correlation function at infinite temperature; this is the typicality trick. So what we're doing is using this as an initial state, and then we're going to do a propagation via Trotterization on top.
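The randomizing preparation just described can be sketched in plain numpy at a toy size (8 qubits and depth 20 here are illustrative assumptions, not the Montreal device settings): random layers drawn from {root-X, root-Y, T} plus alternating CNOT layers on every qubit except qubit 0, which stays polarized.

```python
import numpy as np

# Toy sketch of the randomizing preparation: layers of random
# single-qubit gates from {sqrt(X), sqrt(Y), T} plus CNOT layers on all
# qubits except qubit 0. Plain numpy statevector, not the device code.
rng = np.random.default_rng(1)
n = 8

SX = 0.5 * np.array([[1 + 1j, 1 - 1j], [1 - 1j, 1 + 1j]])   # sqrt(X)
SY = 0.5 * np.array([[1 + 1j, -1 - 1j], [1 + 1j, 1 + 1j]])  # sqrt(Y)
TG = np.diag([1.0, np.exp(1j * np.pi / 4)])                 # T gate
GATES = [SX, SY, TG]

def apply_1q(psi, g, q):
    psi = psi.reshape((2,) * n)
    psi = np.moveaxis(np.tensordot(g, psi, axes=([1], [q])), 0, q)
    return psi.reshape(-1)

def apply_cnot(psi, c, t):
    psi = psi.reshape((2,) * n).copy()
    sub = [slice(None)] * n
    sub[c] = 1                      # control = 1 subspace
    ax = t if t < c else t - 1      # target axis index after fixing control
    psi[tuple(sub)] = np.flip(psi[tuple(sub)], axis=ax)  # X on target
    return psi.reshape(-1)

psi = np.zeros(2 ** n, complex)
psi[0] = 1.0                        # |00...0>
last = [-1] * n
for layer in range(20):
    for q in range(1, n):           # leave qubit 0 untouched
        g = int(rng.integers(3))
        while g == last[q]:         # don't repeat the previous gate choice
            g = int(rng.integers(3))
        last[q] = g
        psi = apply_1q(psi, GATES[g], q)
    pairs = [(1, 2), (3, 4), (5, 6)] if layer % 2 == 0 else [(2, 3), (4, 5), (6, 7)]
    for c, t in pairs:
        psi = apply_cnot(psi, c, t)

def half_entropy_bits(psi):
    s = np.linalg.svd(psi.reshape(2 ** (n // 2), -1), compute_uv=False)
    p = s[s > 1e-12] ** 2
    return float(-(p * np.log2(p)).sum())

rho0 = np.abs(psi.reshape(2, -1)) ** 2
z0 = rho0[0].sum() - rho0[1].sum()  # untouched qubit stays fully polarized
S = half_entropy_bits(psi)          # the scrambled part approaches maximal entropy
```

Here `z0` checks that the untouched qubit stays polarized while `S` shows the rest of the register scrambling, mimicking the polarized-qubit-in-a-mess magnetization profile described above.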
we can exploit the kicked XXZ model. If I think about continuous-time evolution, I need lots of Trotter steps to mimic it, but actually what's known is that the Kardar-Parisi-Zhang universality also holds in discrete time, so if I go to first order in Trotter I get something which is like a Floquet problem, known as the kicked XXZ model. If you want, it's just a brickwork circuit where I have two-qubit gates acting on even and odd sites. The advantage is that you can take arbitrarily large Trotter steps and you define a kind of Floquet problem. So we're going to apply this circuit, which represents the discrete-time or kicked XXZ model, on top of this initial random preparation, and we see if we can extract something from the device. Again, just to summarize what we're doing, and I'll finish in a second: we create a random circuit on L minus one qubits, leaving one qubit alone; we then take the XXZ model and break it down to first order in Trotter, which defines the Floquet problem; each Trotter step on two qubits is a circuit like this, it's well known how you simulate the XXZ model; and then we just start hitting this random preparation with Trotter steps. Now, I still don't understand why this random state preparation works so well, as you'll see in a second, but my feeling is that if something is messed up enough, the noise locally on the device doesn't really affect it too much. If you think about a random state locally, if I trace out a few qubits and look locally, I'm going to see something equivalent to the identity, so such a preparation is in some sense invariant under local dephasing channels, except for qubit one. So we do this with 21 qubits, so we don't use all the qubits of the 27-qubit device, and this is a plot of the measured correlation function.
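One Floquet step of this brickwork can be sketched as follows, again at a toy size with illustrative `tau` and `delta` (a sketch of the idea, not the device circuits): the two-site gate is exp(-i tau h_xxz) applied on even then odd bonds, and it conserves total magnetization, the charge whose transport is being probed.

```python
import numpy as np

# Sketch of one Floquet step of the kicked XXZ model: a brickwork of
# two-qubit gates u = exp(-i*tau*h_xxz) on even then odd bonds.
# n, tau, delta are toy values, not the 21-qubit device run.
n = 8
tau = 0.5      # the point of the Floquet picture: tau may be large
delta = 1.0    # isotropic (superdiffusive) point

# two-site XXZ Hamiltonian XX + YY + delta*ZZ in the basis |00>,|01>,|10>,|11>
h = np.array([[delta, 0, 0, 0],
              [0, -delta, 2, 0],
              [0, 2, -delta, 0],
              [0, 0, 0, delta]], dtype=complex)
e, W = np.linalg.eigh(h)
u = W @ np.diag(np.exp(-1j * tau * e)) @ W.conj().T

def apply_2q(psi, u, q):
    # act with the 4x4 gate u on neighboring qubits (q, q+1)
    psi = np.moveaxis(psi.reshape((2,) * n), (q, q + 1), (0, 1))
    shp = psi.shape
    psi = (u @ psi.reshape(4, -1)).reshape(shp)
    return np.moveaxis(psi, (0, 1), (q, q + 1)).reshape(-1)

def floquet_step(psi):
    for q in range(0, n - 1, 2):   # even bonds
        psi = apply_2q(psi, u, q)
    for q in range(1, n - 1, 2):   # odd bonds
        psi = apply_2q(psi, u, q)
    return psi

def total_z(psi):
    # total magnetization, conserved by every XXZ gate
    p = np.abs(psi) ** 2
    bits = np.array([bin(b).count("1") for b in range(2 ** n)])
    return float((p * (n - 2 * bits)).sum())

psi = np.zeros(2 ** n, complex)
psi[1 << (n - 2)] = 1.0            # a single flipped spin on qubit 1
m0 = total_z(psi)
for _ in range(10):
    psi = floquet_step(psi)
```

Tracking how a single-site perturbation like this spreads under repeated `floquet_step` calls is, in miniature, how the density profile is read out in the scheme above.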
The black line is a t to the minus two-thirds power law, which is what you expect at the superdiffusive point; the gray line is classical Trotter evolution on a classical computer; and the green line is what we got from the device. So pretty good: without any error mitigation, it's following what you'd expect. We can then break the integrability by adding a staggered magnetic field, which amounts to another layer of gates in the z direction, and then you'd expect this two-thirds exponent to go to the diffusive exponent of a half, and what you see is again something which looks pretty good. Now you could argue that maybe it's not exactly that, that it's curve fitting, whatever, but it's promising, and it's usually pretty difficult working with these devices to get results out of them, so we were very happy about that. We are currently pushing the boundaries on this a little bit further, looking at slightly different discrete-time models and using these sorts of tricks to extract some exponents, etc. So again: you apply this randomized preparation on all qubits except one; this represents a density perturbation on a kind of homogeneous background, which then evolves under the Trotter evolution; and you check how that density profile evolves with time, which gives you information on the correlation function, according to the theory. So that's the idea. The conclusions for this part, and in general, are that we simulated the local infinite-temperature spin-spin autocorrelation function of the XXZ model for 21 qubits; we could simulate for long enough to get some hydrodynamic information, which was actually surprising to us; and it seems that discrete time and the random state preparation are crucial. We're currently working on pushing that out to larger system sizes using error mitigation, and on some other models; that's what we're doing at the moment. So with that I'd like to thank you, and I'll happily take any questions
from anybody