Thank you very much, Ali, both for the introduction and for inviting me here. It's really great to see this packed audience of young, enthusiastic people, and I hope that I will meet your expectations. What I want to say is: please stop me. I really enjoy being interrupted. What I don't enjoy is if we get to the end of this hour and a half and you feel that you have understood nothing. So please, if you feel that something is not clear, that I'm going too fast, stop me. I have put all the didactic part at the beginning of the talk, and at the end I present a few applications. It's not important that you get through the whole of the applications, but it is important that you understand the main concepts behind this technique. So please interrupt me as much as you wish. The outline of the talk goes like this. We will start with an introduction to what I mean when I speak about nuclear quantum effects and what is the standard technique to model them in condensed matter. Then I will discuss some techniques that I've been working on since my PhD to obtain this kind of modeling with much, much reduced computational effort, because, as you will see, these are really demanding calculations. And then I will present some applications that deal specifically with water. So, OK, getting started. What do I mean by nuclear quantum effects? I am certainly not speaking about anything that has to do with nuclear physics. Here we are happily considering nuclei to be positively charged point particles; we really don't care much about the inner structure of a nucleus.
All that we care about is the fact that in most simulations, in particular in molecular dynamics (I know you had an introduction to molecular dynamics at the beginning of last week from Professor Scandolo), everything is Newtonian mechanics: you perhaps solve the Schrödinger equation for the electrons quantum mechanically, but then you use the forces computed by solving the Schrödinger equation to evolve the dynamics of the nuclei classically, as if the nuclei were classical particles. This is a fantastically good approximation for simulating plutonium, uranium, some very heavy elements, but if you are dealing with something like hydrogen it is not such a good approximation, and you can recognize this very simply, just by comparing, for the different vibrational modes of your molecule or material, the quantum of harmonic energy ħω and kBT, the quantum of thermal energy so to speak. Whenever this ratio is larger than one, it means first of all that zero-point energy fluctuations will be at least as important as thermal fluctuations, and second that excitations of the ionic degrees of freedom in your system will be more complicated. These two effects are a manifestation of the quantum nature of the nuclei, and if you look at a realistic system, just liquid water, you can see first of all that you can recast this expression to see that all the modes with a frequency in excess of 200 inverse centimeters will be more and more quantized, and then, if you look at the vibrational spectrum of liquid water, you can see that 95% of the ionic degrees of freedom exhibit some degree of quantumness, which is completely ignored if you do molecular dynamics with the nuclei as classical particles, using Boltzmann statistics, which is fully classical. So you can ask yourself: how do I see this from experiments?
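Before that, a rough numerical illustration of the ħω/kBT criterion just described. The mode frequencies below are approximate textbook values for liquid water, not numbers taken from the talk; kBT at 300 K corresponds to about 208.5 inverse centimeters.

```python
# How quantized are water's modes at 300 K? hbar*omega / kB*T > 1
# signals strong quantum behavior. All values in inverse centimeters.
KB_T_CM1 = 208.5  # kB*T at 300 K expressed as a wavenumber (approx.)

modes = {
    "hindered translation": 60,
    "libration": 600,
    "HOH bend": 1650,
    "OH stretch": 3400,
}

for name, wavenumber in modes.items():
    ratio = wavenumber / KB_T_CM1
    print(f"{name:>20s}: hbar*omega/kBT = {ratio:.1f}")
```

Only the lowest-frequency intermolecular motions come out with a ratio below one; everything else is substantially quantized, which is the point being made.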
Do we have any way to quantify how important these effects are just based on experimental observation? A very easy way to do that is to look at the physical properties of isotopically pure water. The point is that, for a given Born-Oppenheimer potential energy surface, the more massive a nucleus becomes, the smaller the frequency, and the less quantum the behavior. So you can look, for instance, at the heat capacity or at the pH of H2O, D2O and, when you have it, also T2O, and extrapolate on an appropriate scale, and you can see two things. The first is that, of course, tritium is more classical than hydrogen, but it's still strongly quantized. The second is that if you extrapolate all the way to classical water, you find huge differences between the physical properties of real water and the physical properties that water would have if it behaved classically. As I like to say, being an Italian, this is a tragedy: 30% longer to boil water for making pasta. But this is even more worrying, because liquid water, if it were classical, would have the pH of a dilute solution of bleach, so it probably wouldn't be something very refreshing and healthy. You can see all sorts of quantum effects, and typically the way to experimentally get a handle on how important the quantum nature of the nuclei is, is by looking at how the properties change upon isotope substitution. However, always bear in mind that deuterium is not classical: if you see a large isotope effect, the effect you would get by completely neglecting the quantum nature of the nuclei would be even larger. OK, so I hope that I convinced you that these effects are not small corrections; they are really important. The other question is how we include them in a simulation. What do we need to do on top of classical MD?
Now, if you plan on solving the Schrödinger equation for the nuclei, you had better forget it. People have been doing this, but it is very, very hard to go beyond systems with a handful of degrees of freedom. So if you want to study something like a defect in the condensed phase, liquid water, anything like that, this is completely hopeless because of the exponential increase in the complexity of the problem. And you always have to consider that the problem for the nuclei sits on top of the problem for the electrons, so if you want to treat both the electrons and the nuclei quantum mechanically, you can basically forget it. Luckily, if you are interested in statistical properties, that is, in properties that are time-independent, and there are many such properties, Debye-Waller factors in the condensed phase, something like pH, the relative stability of two phases, melting points, there is this whole set of properties for which you don't care about dynamics, then you actually have a, well, not very cheap, but at least not exponentially scaling way to include the quantum nature of the nuclei in a simulation of atoms in the condensed phase. What you need to do is, rather than describing the partition function of your system quantum mechanically, use the isomorphism that maps this partition function onto a classical system, which is what we call a ring polymer. Basically all you have to do is simulate a classical system described by this Hamiltonian. If you look at how this Hamiltonian is made up, there are P copies of the system. For each copy we have a classical Hamiltonian, potential and kinetic energy, and then we have an additional term that corresponds to springs joining together the atoms in adjacent replicas. If you do this, then as you increase the number of replicas, the statistical properties of this classical system map exactly onto those of a quantum system of distinguishable particles.
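To make the structure of that Hamiltonian concrete, here is a minimal sketch of its configurational part for a single one-dimensional particle, using the common convention β_P = β/P and spring frequency ω_P = 1/(β_P ħ). The function name and units are my own choices for illustration, not the talk's notation.

```python
import numpy as np

def ring_polymer_potential(q, beta, hbar, mass, V):
    """Configurational part of the ring-polymer Hamiltonian (1 particle).

    q : array of P bead positions; bead P couples back to bead 1.
    V : callable giving the physical potential V(q).
    Convention: beta_P = beta / P, omega_P = 1 / (beta_P * hbar).
    """
    P = len(q)
    beta_P = beta / P
    omega_P = 1.0 / (beta_P * hbar)
    # Harmonic springs between adjacent replicas (periodic via np.roll)
    springs = 0.5 * mass * omega_P**2 * np.sum((q - np.roll(q, -1))**2)
    # The physical potential, evaluated once per replica
    physical = np.sum(V(q))
    return springs + physical

# Example: 8 beads of a harmonic oscillator all at the same position,
# so the spring term vanishes and only the physical potential remains.
q = np.full(8, 0.5)
U = ring_polymer_potential(q, beta=1.0, hbar=1.0, mass=1.0,
                           V=lambda x: 0.5 * x**2)
print(U)  # 8 * 0.5 * 0.5**2 = 1.0
```

When the beads spread out, the spring term penalizes the spread on a scale set by ω_P, which is exactly what encodes the quantum delocalization.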
But for anything that happens at room temperature, indistinguishability of the nuclei is really something you don't have to worry about. If you start thinking about liquid helium temperatures, then you worry about this, but if you are interested in the physics that goes on down to, say, 50 Kelvin, that's not a concern. The problem is that the number of replicas that you need has to be larger than this ratio, ħω/kBT. If you go back to water, consider how you would describe this system: the stretch is at 4,000 inverse centimeters, and kBT corresponds to 200 inverse centimeters, so the bare minimum is something like 20 copies. In each copy you have to compute the potential, so your simulation has already become 20 times more expensive. So let me try to give you... I really want to give you a general idea and not so much get into the details, but you might get the feeling that this comes a little bit out of thin air, so I want to show you how this expression for the path integral partition function comes about. One can derive it from Feynman path integrals, showing the equivalence between the propagator in real time and the propagator in imaginary time, but it's actually much, much easier to see where it comes from in this way. You start from the quantum mechanical partition function in the position representation: I'm just taking the Boltzmann operator and tracing it over all the possible positions. Then you could be foolish enough to try to write the exponential of the Hamiltonian as the product of e to the minus beta V times e to the minus beta T, the kinetic energy operator. Now q is an eigenstate of the potential, so this is just a number when it acts on the bra, and then you are left with the diagonal element of the kinetic energy Boltzmann operator, which you can compute easily by going to the momentum representation, and you get the classical partition function.
So clearly something is wrong here, because the quantum mechanical partition function is not the classical partition function. Can you tell me where the mistake was? They don't commute, exactly, so you can't do that. I find this nice because you see that, in equilibrium statistical mechanics, 100% of the difference between quantum statistics and classical statistics comes from the fact that position and kinetic energy don't commute. If they commuted, you would get classical statistics. What you can do, however, is notice that if you take a high enough temperature, you go towards the classical limit, and this actually becomes not such a bad approximation. More formally, you can do the following trick. You take the Boltzmann operator and formally write it as the Boltzmann operator at the inverse temperature beta divided by P, raised to the power P. This is still just an identity. But now this is a Boltzmann operator at a much higher temperature, P times higher, so it's not such a bad idea to do a factorization like this. This symmetric Trotter splitting is a bit better than the simple product form, but it's the same principle: the leading error term goes down as one over P squared, so it converges. Now the only thing you need to do is take this expression and plug it back into the trace. When you plug it into the trace, you have this to the power P bracketed between q1 and q1, and you can introduce P minus 1 closure relations. So you have e to the minus beta over P, V over two, times e to the minus beta over P, T, times e to the minus beta over P, V over two, all to the power P, computed between q1 and q1. This is what you need to compute.
But then you can say, OK, this is really a product of P of these terms, and in between each of them you can insert an integral over qn. So you get an expression that looks like this; here you had the integral over q1, and so on. Now you see that your problem has become to compute this kind of matrix element, and this is actually very simple, because q1 and q2 are both eigenstates of the potential energy again, so all the potential operators become just numbers, and you are left with computing the off-diagonal elements of the kinetic energy Boltzmann operator. You just go to the momentum representation, and you end up with a harmonic term in the difference between qi and qj: this would be q1 minus q2, q2 minus q3, q3 minus q4, and so on. When you put everything together, this is just something that corresponds to a classical average over this Hamiltonian. Easy. The problem is that this is pretty darn expensive: doing this on top of an ab initio electronic structure method for getting the potential and the forces very quickly becomes prohibitively expensive. So, in order to show you how you can actually do this much more easily, or cheaply at least, I need to introduce something else. I need you to step back a little and go back to classical mechanics. You have seen last week how molecular dynamics is basically just integrating Newton's equations. However, if you want to do statistical mechanics based on classical MD you have a problem, because if you integrate these equations the total energy is conserved; the total energy of your system will not change. But in pretty much any experiment you can think of simulating, your experimental colleague will not tell you "oh yeah, I've run this experiment at ten millielectronvolts of total energy"; they would tell you "I was running at 200 Kelvin".
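Putting the steps of the derivation into symbols, with β_P = β/P, the cyclic index q_{P+1} ≡ q_1, and ω_P = 1/(β_P ħ), this is the standard primitive factorization, which should match the slides up to notation:

```latex
\[
Z = \operatorname{Tr}\, e^{-\beta\hat H}
  \approx \int dq_1\cdots dq_P\,
  \prod_{j=1}^{P}\big\langle q_j\big|\,
  e^{-\beta_P\hat V/2}\,e^{-\beta_P\hat T}\,e^{-\beta_P\hat V/2}
  \,\big|q_{j+1}\big\rangle ,
\]
with the free-particle matrix element
\[
\langle q|\,e^{-\beta_P\hat T}\,|q'\rangle
 = \Big(\frac{m}{2\pi\beta_P\hbar^2}\Big)^{1/2}
   e^{-m(q-q')^2/2\beta_P\hbar^2},
\]
so that, up to a constant prefactor,
\[
Z \propto \int dq_1\cdots dq_P\,
  \exp\Big\{-\beta_P\sum_{j=1}^{P}
  \Big[\tfrac12\, m\,\omega_P^2\,(q_j-q_{j+1})^2 + V(q_j)\Big]\Big\}.
\]
```

The exponent is exactly the ring-polymer potential from before: P replicas feeling the physical potential, joined by harmonic springs of frequency ω_P, and the error of the approximation vanishes as one over P squared.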
Experimental conditions are typically constant temperature conditions, particularly at the size scale that you can afford to simulate. So you need to modify these equations in order to allow for fluctuations of the total energy that are consistent with Boltzmann statistics. You probably have seen something of this with Sandro last week. One possibility is to use a Langevin equation. The Langevin equation is something that dates back to the beginning of the 20th century and was originally introduced to model Brownian motion. It is a modification of Hamilton's equations in which you introduce a friction term, just minus gamma times p, which models the viscous drag exerted by a fluid on something like a pollen particle moving around in a drop of water, and then a noisy force term that instead models the stochastic collisions with the particles of the fluid. Now, not only is this a fantastic model for Brownian motion, but it is also very useful because these interactions with a virtual bath make your system change its total energy, and the balance between the friction term and the noisy force term leads to a distribution of the fluctuations of the total energy which is fully consistent with Boltzmann statistics. You will see over and over again that next to each of these Langevin equations I will always have something like this: there is always a relation between the friction term and the noisy force term. This is what is called a Markovian equation, which basically means that there is no history dependence: the time derivative of the momentum only depends on the instantaneous value of the momentum. Since there is no memory on the side of the friction, the fluctuation-dissipation theorem prescribes that there should also be no memory on the side of the noise, so the noise is not correlated in time.
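As a minimal illustration of such a Langevin thermostat, here is a sketch using the BAOAB splitting, where the friction factor and the noise amplitude are tied together exactly as the fluctuation-dissipation theorem requires. The splitting scheme, parameter values, and units are my own illustrative choices, not taken from the talk.

```python
import numpy as np

def baoab_step(q, p, force, dt, gamma, kT, mass, rng):
    """One BAOAB step of Langevin dynamics for a 1D particle.

    The O part is an exact Ornstein-Uhlenbeck update: the friction
    factor exp(-gamma*dt) is matched, via fluctuation-dissipation,
    with noise of variance mass*kT*(1 - exp(-2*gamma*dt)).
    """
    p += 0.5 * dt * force(q)                       # B: half kick
    q += 0.5 * dt * p / mass                       # A: half drift
    c = np.exp(-gamma * dt)                        # O: friction + noise
    p = c * p + np.sqrt(mass * kT * (1.0 - c * c)) * rng.standard_normal()
    q += 0.5 * dt * p / mass                       # A: half drift
    p += 0.5 * dt * force(q)                       # B: half kick
    return q, p

# Thermostatted harmonic oscillator: total energy now fluctuates,
# and the kinetic temperature <p^2/m> settles at kT.
rng = np.random.default_rng(0)
q, p, acc, n = 1.0, 0.0, 0.0, 100000
for _ in range(n):
    q, p = baoab_step(q, p, lambda x: -x, 0.1, 1.0, 1.0, 1.0, rng)
    acc += p * p
print(acc / n)  # close to kT = 1
```

If you drop the noise term but keep the friction, or vice versa, the sampled temperature drifts away from kT, which is the balance the speaker is pointing at.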
A problem with this framework is the following. If you want to use it to model Brownian motion, fine. But if you want to use it to simulate your system at constant temperature, and you want to do this efficiently, or together with Car-Parrinello molecular dynamics, or together with some fancy technique, you don't have much leeway to play around with: the only thing you can tune is the value of the friction. Since this is potentially a way to modify your simulation so that you can get your answer more quickly, back during my PhD I started playing around with the idea of using a generalized version of this equation. And if you stare at this equation, it has a number of very nice features. The nicest feature is that it is linear. Linearity is always what you strive for, because it makes everything easier to deal with, both numerically and analytically. So we don't want to give up the fact that the relation between the derivative and the momentum is linear, but what I am prepared to give up is the fact that there is no history. We can generalize this by introducing a memory kernel. You see, it's the same structure, but now the time derivative of p depends on the past history of the momentum: here I am integrating the history of the momentum back from time t to minus infinity. And, as I was anticipating, because the fluctuation-dissipation theorem must hold in order to have Boltzmann statistics, the noisy force is now correlated in time. This is so-called colored noise: whereas the Fourier transform of a delta-function memory is a constant, so the power spectrum of white noise is flat, here the noise spectrum has structure.
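In formulas, the generalized Langevin equation just described, together with its fluctuation-dissipation condition, can be written as follows. This is one common convention, with the kernel acting on the momentum; the talk's slides may differ in normalization:

```latex
\[
\dot q = \frac{p}{m}, \qquad
\dot p = f(q) \;-\; \int_{-\infty}^{t} K(t-t')\, p(t')\, \mathrm{d}t'
\;+\; \zeta(t),
\]
with the colored-noise fluctuation-dissipation condition
\[
\big\langle \zeta(t)\,\zeta(t') \big\rangle \;=\; m\,k_B T\; K\!\left(|t-t'|\right).
\]
```

Setting K(t) = 2γδ(t) recovers the plain Markovian Langevin equation with white noise, which is why that case has only the single knob γ to tune.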
Here you can play with the spectral distribution of your noise, and this is extremely useful. I don't know if any of you is familiar with Car-Parrinello molecular dynamics, but in Car-Parrinello molecular dynamics the idea is that you let the electrons move on a completely different time scale than the nuclei. If you try to combine this with Langevin dynamics it breaks down, because the noisy force, which contains all possible frequencies, acts on the nuclei; the nuclei start moving at unphysically high frequencies and disrupt the adiabatic decoupling which is at the basis of Car-Parrinello molecular dynamics. But if you have the freedom of playing around with the spectral distribution of the noise, you can introduce a high-frequency cutoff, so that you can use your Langevin dynamics without disrupting the adiabatic decoupling. And this is just one of the many applications of this technique. But what really makes it powerful is the fact that I don't actually implement this equation directly. Can you imagine implementing this on a computer? You would need to generate noise with a prescribed correlation, which is doable but messy, and then you would have to store all the past history, and at any instant of time, when you want to compute the friction term, you would need to integrate back over a possibly infinitely long history. This is really not something you want to do. What you can do instead is map that dynamics onto something that looks like this. So what have I done here? You probably still recognize your good old Hamiltonian dynamics terms, and now there are these s variables. These are just fictitious degrees of freedom; they are not physical, but they model the degrees of freedom of a virtual bath. So now my bath is described by these degrees of freedom, which have nothing to do with the physics of my system, but, as I will show you in a second, they somehow carry the memory for your dynamics. So formally this is a
Markovian differential equation: there is no explicit dependence on the past, and at each instant in time, in order to get the time derivatives, I only need to know the state of the system at that time. So this is much more handy. Also, the only nonlinearity in all of this is at the level of the force; everything else is linear. You can see that for the harmonic oscillator the force will be linear too, this will just become minus omega squared q, and this is cool, because then you can solve everything analytically, and you can ask yourself how your system will behave subject to this non-Markovian dynamics without the need to run a trial simulation: you just analytically predict how quickly your system will sample phase space, how much the dynamics of your system will be perturbed, and so on. Just to give you perhaps a clearer idea of how it is possible to represent a history-dependent dynamics simply by extending the size of the phase space, consider this example. Think of a system in the q-p plane, and two trajectories that meet at a point. If your system were not history-dependent, this would be impossible, because when you are at that point, if there is no history dependence, your system has only one possible direction to go: if the dynamics doesn't depend on how you got there, you can't go in two different directions. But now, if you think of these two trajectories as the projection onto the q-p plane of a dynamics in a higher-dimensional space, the two trajectories can correspond to different values of the additional degrees of freedom. The full lines are actually Markovian: when I'm at that point with a different value of s, I go in a different direction. But when I look at the dynamics from the point of view of the q-p subspace, it looks as if the dynamics were non-Markovian. And you can actually show
this formally, and it's very simple; there is this beautiful book by Zwanzig that shows it very clearly. You do it the other way around: you start from this expression, and then you say, but then, assuming that the memory decays, I can compute the time history of the s's by integrating from minus infinity to time t, and since this is formally linear it's a very simple differential equation to solve. The only step which is a little bit tricky, well, look it up in Zwanzig, it's not too complicated. Once you have solved for s, you can plug it into the expression for p, and you end up with an equation for the dynamics of p only, which contains a memory kernel and a history-dependent noise. So now you might be wondering what this has to do with quantum effects, and I'm getting there. All along I have been maintaining that in order to get Boltzmann sampling you must have a fluctuation-dissipation relation between the memory kernel of the friction and the autocorrelation function of the noise. However, within the formalism by which you write an equation like this, there is nothing that forces you to enforce that relation. So what happens if you break the fluctuation-dissipation theorem? You can regard a system in which you have broken the fluctuation-dissipation theorem as a system coupled to two different heat baths, and you could design, for instance, the hot bath to couple preferentially to the fast degrees of freedom and the cold bath to couple preferentially to the slow degrees of freedom. When you let this combined system run, what you obtain is a stationary state in which fast modes are hot and slow modes are cold. So you don't get a proper Boltzmann distribution, where the whole system is equilibrated at the same temperature, but a stationary state in which different vibrational modes equilibrate at different temperatures. The point is that, since everything here can be predicted
analytically, you can play around, you can mess around with the elements inside this matrix, predict what the response of your system would be, say "oh, I don't like it", change it, and keep iterating until the response becomes the response that you like. And you can do this in a very precise manner. For instance, this is called a delta thermostat: the idea here is that I run a dynamics and I want to keep all the vibrational frequencies of my system stuck at zero Kelvin, except for a very narrow range of frequencies that I set to a finite temperature. If you look at how the corresponding dynamics looks, you see that as I target different frequencies, the dynamics of my system automatically adjusts to selectively excite the normal modes that correspond to that frequency. So when I am in the librational band I only excite librations, when I get to the bending band my dynamics only excites bendings, and when I get to the stretching band I start exciting only the stretches. OK, let me try to do something that was not planned, just because we have time. So what am I doing here? I am taking a specific form of this matrix: I can write some specialized forms of this A matrix, which enters the friction part of the memory kernel, in such a way that I can write the corresponding memory kernel, for instance in this case, as a sum of Lorentzian functions. So here I am showing you the Fourier transform of the memory kernel that corresponds to a given dynamics, and I can change the center of this peak in the memory function of the noise, I can add multiple peaks, I can add a white-noise component to it. So here I am just showing you that I can instantaneously obtain the memory kernel given a certain form of the matrix. But then you can also do more. If I take this noise and I apply it to the dynamics of a harmonic oscillator of frequency 1, what will be the
dynamics of the oscillator? You don't need to run a simulation to find out; you basically only need to compute something analytical that tells you how the oscillator responds. So this is a log-log scale, this is frequency 1, and this is the velocity-velocity correlation function of that oscillator. In the absence of noise I would get a delta function at frequency 1, that is, at log-frequency equal to 0. If I increase the white-noise contribution, I broaden the response of the oscillator, like a damped harmonic oscillator, but you see how I can predict very quickly how the system will respond. I can also do something else. Right now I am preserving the fluctuation-dissipation theorem, and so if I look at the mean potential energy of the oscillator as a function of its frequency, it's a constant. But if I break the fluctuation-dissipation theorem, the mean potential energy of my oscillator as a function of frequency is modulated, and I can adjust it to obtain a sharp peak at the position where I want it to be. So I can basically design, a priori, a Langevin dynamics that sets all the normal modes around 2000 inverse centimeters to a temperature of 100 Kelvin, while all the other normal modes of the system are frozen. Did that clarify at all? So this is sort of a cute example, but it's not particularly useful, at least in my opinion. This, however, can actually be used to model nuclear quantum effects. Why?
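Before answering that, here is what the extended-variable dynamics from a few slides back looks like in practice: a bare-bones numerical sketch of a Markovian embedding with a single auxiliary momentum s. The drift matrix A, the parameters, and the crude Euler-Maruyama integrator are my own illustrative choices, not from the talk; the check is that, when the noise covariance is matched to the drift as fluctuation-dissipation requires, the physical momentum still samples the Boltzmann distribution.

```python
import numpy as np

def gle_step(q, p, s, dt, A, kT, force, rng):
    """One crude Euler-Maruyama step of a Markovian-embedded GLE.

    The extended momentum vector (p, s) feels a linear drift -A @ (p, s)
    plus Gaussian noise of covariance kT*(A + A.T)*dt, which enforces
    the fluctuation-dissipation theorem (unit mass throughout).
    For A = [[a_pp, a_ps], [-a_ps, a_ss]], eliminating s recovers the
    memory kernel K(t) = 2*a_pp*delta(t) + a_ps**2 * exp(-a_ss*t).
    """
    x = np.array([p, s])
    noise = np.linalg.cholesky(kT * (A + A.T) * dt) @ rng.standard_normal(2)
    p, s = x - A @ x * dt + noise
    p += force(q) * dt          # the physical force acts on p only
    q += p * dt
    return q, p, s

# Harmonic oscillator at kT = 1 with an antisymmetric p-s coupling:
# the stationary state should still give <p^2> ~ kT.
A = np.array([[1.0, 1.5],
              [-1.5, 2.0]])    # illustrative drift matrix
rng = np.random.default_rng(1)
q, p, s, p2 = 0.0, 0.0, 0.0, 0.0
nsteps = 200000
for _ in range(nsteps):
    q, p, s = gle_step(q, p, s, 0.01, A, 1.0, lambda x: -x, rng)
    p2 += p * p
print(p2 / nsteps)  # close to kT = 1
```

Breaking the match between the drift and the noise covariance is exactly the knob the speaker uses to hold different frequency ranges at different effective temperatures.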
Well, I really love the harmonic oscillator, as you will see. If you solve the statistical mechanics of a harmonic oscillator classically, the probability of getting a certain value of q is a Gaussian distribution centered at zero. And if you solve the quantum mechanical problem for the same oscillator, you still get a Gaussian at any temperature. I mean, the ground state of course is a Gaussian, but if you use a Boltzmann-weighted combination of the excited states of a harmonic oscillator, magically, you get a Gaussian at any finite temperature. So the point is that the only difference between the two cases is the amplitude of the fluctuations. If you were able to simulate your classical oscillator at a higher temperature, you would obtain precisely the same fluctuations that you have in the quantum case. So from the point of view of the harmonic oscillator, modeling nuclear quantum statistics is just a matter of changing the temperature. However, there is a problem; I mean, why haven't people done this? The problem is that the effective temperature that you have to use is frequency dependent. In short, if the frequency is very small, your system behaves classically, so the target temperature should be the classical temperature; if the frequency is very high, you basically only have zero-point energy, so the temperature should be ħω/2 divided by kB, more or less, give or take a factor. As a function of frequency, you need to obtain this kind of frequency dependence of the effective temperature that you enforce on different normal modes. But the point is that with this colored-noise machinery you can enforce precisely this kind of frequency dependence, automatically. So you don't need to know the vibrational spectrum of your material or your molecule; you just design a generalized Langevin equation that gives you this kind of frequency-dependent fluctuations, and bam, in the harmonic limit you get all the
normal modes right. So you might say, whatever, if I have a harmonic system I don't need to run molecular dynamics. But what is remarkable is that this idea works exceedingly well also for very anharmonic systems. Here I'm showing you a quartic double well, which is pretty darn anharmonic, as a function of the height of the barrier. The dashed black lines tell you what the average potential, kinetic and total energy should be, computed by solving the Schrödinger equation and evaluating the finite-temperature density matrix. Classically, the mean kinetic energy would be 0.5, you have equipartition, and since the system is anharmonic the mean potential has this shape, but it's very different from the quantum case. If you just take this generalized Langevin equation, fitted to give the correct result in the harmonic limit, and apply it to this system, you get the red dots, and it's for free: it's basically as expensive as classical molecular dynamics, but it gives you a pretty good approximation to quantum statistics. This actually, surprisingly, also works very well in the condensed phase, in anharmonic problems that are not so trivial. So you can, for instance, look... do you know what a radial distribution function is? Perhaps not everyone of you. Basically, what I'm doing here, for those of you who don't know, is sitting, in this case, on an oxygen atom in water, and asking myself: what's the probability that I have a hydrogen atom at a distance of one?
Oh, a lot, because there is the covalent bond, which is about one angstrom long; if I'm sitting on an oxygen, it's very likely that I will have two hydrogens one angstrom away from me, and this is sort of trivial. What is not so trivial is that at longer range you have a distribution that is not constant, and that you can measure, by neutron scattering for instance. You can see that, particularly for the OH stretch, classically you would have a very sharp peak, but when you switch on quantum effects, and the fully converged path integral molecular dynamics would be the black line, you get a much broader peak, because you have zero-point energy and much larger fluctuations. And if you just apply this generalized Langevin equation thing, it gets very much closer to the quantum result, for free. So it's not bad. However, you have to be a little bit careful; the situation is not as perfect as I make it sound, because in the cartoon I presented earlier I had assumed that the different normal modes were completely decoupled: a system which is perfectly harmonic, where the different normal modes don't speak to each other. But in any real system you will have anharmonic couplings between different degrees of freedom, and what you are effectively trying to do, if you stay within this cartoon picture, is keep your high-frequency modes hot and your low-frequency modes cold; if they can speak to each other, they will tend to equilibrate to the same temperature. So what's happening in a real system such as water is that there is a fight going on between the thermostats, which try to maintain a temperature differential between different normal modes of your system, and the intrinsic anharmonic cross-talk between normal modes, which instead tries to make the temperature the same everywhere. So what you can do, which is brutal, is just make the coupling to the thermostats very, very strong, so that they
win, and, as I showed you, this sort of works: water is extremely anharmonic, the lifetime of a stretch is very short compared to anything you would have in a solid, and still you can get something decent just by making the coupling to the thermostats very strong. However, my take on all of this is that if you care about nuclear quantum effects, which are important but not the most important effect in the world, you might as well get them right. So I was not completely satisfied with this, because you are sort of patching things up without knowing how large your error is. I started thinking: OK, wait a second. On one hand we have this quantum thermostat, which is exact in the harmonic limit, but in any real system it will be approximate, and I don't know how large the error will be. On the other extreme of the spectrum I have path integrals, which are very accurate, as accurate as I wish, but too expensive. So perhaps what I can do is put a fraction of colored noise on top of a path integral simulation, and hope that in doing so I will keep the systematic convergence that is guaranteed by path integral molecular dynamics, but make it faster: somehow things will be exact in the harmonic limit. And actually this can be done, and I just want to give you an idea of how. I actually urge you, if you have some time this evening, to go and do this derivation yourself, because it is a good way to convince yourself that path integrals work. So again, if you solve the quantum mechanical problem for a harmonic oscillator, you find that the mean fluctuations as a function of frequency should look like this; this is the same thing as the temperature curve I showed you before, only converted into fluctuations of q squared. If instead you do path integral molecular dynamics, what you have is a system with a chain of springs with periodic boundaries, and since I gather that most of you are condensed matter
physicists, this is your beloved phonons in 1D: you have periodic boundaries, you have your frequencies, and on top of that you have an external harmonic potential, which is the physical V. I can't really go back all the way to the path integral Hamiltonian, but if you remember, we had the sum over the P replicas of V(q_i), I mean just the configurational part (the kinetic energy, who cares), plus the spring terms, with periodic conditions implied, so that replica i+P is the same as replica i. So this is basically your chain of oscillators, and if you choose a harmonic potential, V is just one half m omega squared q squared, where omega is the physical frequency of your oscillator. You see, this potential is a diagonal term, so it just shifts all the frequencies of the closed chain of oscillators: if omega_{P,k} = 2 omega_P sin(pi k / P) is the dispersion of the free chain, with omega_P the frequency of the springs, then the actual normal mode frequencies of the ring polymer in the physical potential are given by omega_k squared = omega squared + omega_{P,k} squared; you just shift the dispersion up by omega squared. Now, since the potential is diagonal it doesn't change the eigenstates of the dynamical matrix, and since the normal mode transformation for a chain of springs is a unitary transformation, you can compute the expectation value of q squared for the ring polymer analytically, just by transforming to normal mode coordinates. Each normal mode has the frequency above, and classically the expectation value of q squared for an oscillator of frequency omega is proportional to one over omega squared. So in the end you get that the value of q squared for a finite number of replicas P is just a sum of terms one over omega_k squared, with omega_k given as above. Now you rush to your favorite handbook of series and look at what this becomes when you take the P going to infinity limit; basically the trick is that in that limit you can linearize the sine, and then
you can work out how that looks, or you can feed this to Mathematica, and the limit for large P is precisely what it should be. No surprise here. However, now you can ask yourself this question: let's not take the P going to infinity limit. If I don't take that limit, this will not be exact. By the way, if you have ever seen a quantum Monte Carlo talk, they typically speak about world lines and exchanges. Here, if you stay within this ring polymer picture, so let me go back, the average path integral talk would show you something like this: you have multiple copies of your system that are connected by springs, so it's fair to show it like this, you have parallel universes, and the same particle in adjacent parallel universes is connected by springs. The distinguishability here comes from the fact that the springs connect the particle with the same index in adjacent parallel universes. If you want to have indistinguishability, the chain of springs for this hydrogen atom, for instance, would switch to this hydrogen atom, go around, and then close back onto that hydrogen atom. And actually you can build the partition function for a system of indistinguishable particles as a sum of partition functions: one with springs that close onto the same particle, one term that comes from two particles mixing up together, one part coming from three particles mixing up together, and so on. You get bosons by summing them all with the same sign, and fermions basically by using a plus sign when you have an odd number of particles in the combined ring polymer and a minus sign when you have an even number. Clearly, the sign problem in this picture comes from the fact that you have a partition function which is a sum of oscillating terms. The point is that, both for bosons and for fermions, the statistical weight of the polymerized ring polymer partition functions is very, very small; you have to go down to very, very low temperature before this actually matters. So, OK, we were here.
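If you want to do the suggested exercise numerically rather than with a handbook of series, here is a minimal sketch; the unit choice hbar = kB = m = 1 and the parameter values are mine, just for illustration, with the spring frequency taken as omega_P = P/(beta*hbar) so that P = 1 reduces to the classical result:

```python
import numpy as np

# Units with hbar = kB = m = 1 (a convenience for this sketch).
hbar = kB = m = 1.0
T = 0.25                 # low temperature: strongly quantum regime
beta = 1.0 / (kB * T)
omega = 2.0              # physical frequency; beta*hbar*omega = 8 >> 1

def q2_pimd(P):
    """Classical <q^2> of a P-bead ring polymer in a harmonic well.

    Normal-mode frequencies: wk^2 = omega^2 + 4*omega_P^2*sin^2(pi*k/P),
    with omega_P = P/(beta*hbar); each mode contributes kB*T/(m*wk^2).
    """
    omega_P = P / (beta * hbar)
    k = np.arange(P)
    wk2 = omega**2 + 4.0 * omega_P**2 * np.sin(np.pi * k / P) ** 2
    return float(np.sum(kB * T / (m * wk2)))

# Exact quantum result: <q^2> = hbar/(2 m omega) * coth(beta*hbar*omega/2)
q2_exact = hbar / (2 * m * omega) / np.tanh(beta * hbar * omega / 2)

for P in (1, 2, 4, 8, 32, 128):
    print(P, q2_pimd(P))
print("exact:", q2_exact)
```

With P = 1 you recover the classical fluctuation kB*T/(m*omega^2), and as P grows the sum converges to the coth expression, which is also the target fluctuation that the quantum thermostat tries to enforce mode by mode.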
Then you can ask yourself: OK, but I want to have this relation not only in the limit of P going to infinity, but for any value of P. I know that for P equal to one, by using a quantum thermostat, that is, a frequency-dependent temperature, I get this relation to within the accuracy of my fit with just one replica. So why can't I do the same with two replicas? And you actually can. You just write the temperature-dependent fluctuations like this, and you say: OK, I want to find the frequency-dependent fluctuation that I need to enforce to obtain this for any finite number of replicas P. This is basically a functional equation, because, you see, hidden in here there is the physical frequency omega, and this is the frequency of one of the normal modes of the ring polymer, which contains both the ring polymer frequency and the physical frequency. So the physical frequency enters, hidden, inside here, and you get this functional equation. But you can solve it, and then you really can get the best of both worlds. Take for instance the quantum kinetic energy of a hydrogen atom in water: it's large. Thermal energy at room temperature is 25 meV, and the quantum kinetic energy of a hydrogen atom in water at room temperature (I mean the difference with respect to the classical kinetic energy) is about 5 times the thermal energy; it's big. If you want to converge that using conventional PIMD, you need at least 32 replicas, but if you use this colored noise on top of it, with 4 or 6 replicas you are perfectly on top of it. And you see, the point is that whereas with the quantum thermostat I had just one point, and the only way of knowing how wrong I was would have been to converge path integrals, which kind of defeats the purpose, here I can systematically increase the number of replicas and see that the result doesn't change anymore, so I am converged. So you can save more or
less a factor of 6 in CPU time, which really makes it possible to do things that wouldn't have been possible otherwise. But before I go on, do you have any questions up to this point? Because now I am getting more into the applications, so this might be a good moment to have them. Yes, I am doing it now. Honestly, I am doing this more as a curiosity; it is one of those things I am doing just because I can, not because I think it is very important. You know that normally, if you go beyond the primitive approximation, you start having commutators, and commutators, in the simplest forms of higher-order expansions, bring a modulus squared of the force into the Hamiltonian. That is OK if you are doing Monte Carlo, but if you want to do MD, you are sampling that partition function by classical dynamics, so you need the derivative of the Hamiltonian with respect to q, and when you take the derivative with respect to q of the modulus squared of the force you get a term that contains the Hessian, so it is very impractical. People have sampled using Trotter and then reweighted using the difference between the two terms, but the statistical properties are really bad. So, I mean, I am doing it because why not. It is probably going to be useful for doing things at very, very low temperature, but then at very, very low temperature boson or fermion statistics start to become important, and that is much, much harder to cope with within this colored-noise framework. So it is doable, but I don't think it is going to be a game changer. So let's move on to applications. The system where it is perhaps most interesting to work on nuclear quantum effects is probably water, because it is important (I think there is no need to discuss it: I am thirsty and I can drink it, and that is something that makes it very interesting), and also because it has a huge amount
of hydrogen in it. As you have seen at the beginning, more than 90% of the vibrational modes of water have some degree of quantumness at room temperature, so it is a system where you might expect quantum effects to be very strong, and in some cases they are. But look at some other physical properties, say the change in melting temperature when going from H2O to D2O: it is four Kelvin. Four Kelvin is nothing, barely more than one percent of the melting temperature; you will never ever get an empirical potential for water that gives you that kind of accuracy, except by chance. So for a system which is so strongly quantized, how come many physico-chemical properties change so little when you go from hydrogen to deuterium? The surface tension of water changes by one part in a thousand between H2O and D2O. It is a puzzling question. Based on purely theoretical arguments, starting back in around 2010 let's say, people began to suggest that the reason why this happens is a competition between quantum effects in water that go in opposite directions. To put it very, very simply: zero-point energy along the stretching modes will tend, since the stretch is anharmonic, to make the bond a little bit longer, increase the dipole moment of the water molecule, and make the hydrogen bond stronger. However, zero-point energy in the librational degrees of freedom will make it easier to break a hydrogen bond, and so the hydrogen bond becomes weaker. The idea is that the two components almost perfectly cancel out, so that even though each component is very strong, the overall effect is almost negligible. An interesting way to look into this is the particle momentum distribution. So now I am not looking at the position representation but at momenta, and this is interesting because, as all of you know, based on classical mechanics you would expect a distribution which is just e to the minus p squared over 2 m kB T, so a
spherical distribution which effectively depends only on the mass and on the temperature. Now, there are these people, they actually work at the Rutherford Appleton Laboratory, close to Oxford, where they have a neutron scattering beamline with very high energy neutrons, so they can do Compton scattering off nuclei. They have a time-of-flight measurement apparatus, so they can measure how much the velocity changes upon inelastic scattering, and they can infer the distribution of velocities of the nuclei that the neutrons have interacted with. Now, if the nuclei behaved classically, these people would be using an instrument worth tens of millions of dollars to measure the temperature of their sample. But the point is that the particle momentum distribution is not Boltzmann, and it's not even spherical. If you look at the particle momentum distribution for a proton in water, it is much more spread out along the covalent OH bond than in the orthogonal directions, and this is no rocket science: you just have larger zero-point energy in that direction, and so you have an anisotropic particle momentum distribution. What is nice is that you can actually measure this anisotropy experimentally, and so you can get a direct experimental measure of this competition of quantum effects. You can measure the particle momentum distribution, actually the three components of this kinetic energy tensor, in the liquid and in the solid phase (it's a long story, but anyway), and you can do the same in a simulation. What you see is that the high-energy component goes down when you freeze, the low-energy component goes up when you freeze, and the intermediate component changes very little. So the point is that even though these components individually change by quite a bit in going from the liquid to the solid, the total kinetic energy changes very little. And you can actually link the change in kinetic
energy between the two phases for a given isotope to the change in melting temperature upon isotope substitution. And the nice thing is that you can make the experiment. The experiment is a nightmare; I mean, they have to use simulations in the error analysis, because you have to account for multiple scattering of the neutrons, so you have a sort of self-consistent problem to solve in order to get the actual signal. The error bar in the experiment here is a mystery, but it's pretty large. Still, at least qualitatively, you get the same trend: the highest-energy component goes down, leading to a much smaller change in the total kinetic energy. Very closely connected to this, you can look at another purely quantum mechanical effect, something called isotope fractionation. If you do classical statistical mechanics and you look, for instance, at the probability of having an atom in one phase or in another, this can be related to the partition function, and in classical statistical mechanics your partition function factorizes exactly: the momentum part changes only in a trivial way when you go from one phase to another, so you really don't care about it. What you care about is the configurational part of the partition function, and the mass enters nowhere there. So the statistical properties of hydrogen and deuterium should be identical if they behaved classically, but they don't. One way to see this very clearly is that if you have two phases in equilibrium with each other, say liquid and vapor for water, you will have a different concentration of deuterium in the liquid and in the gas phase. This is actually more or less how they separate heavy water from light water: it's a multi-stage process, but at a certain point they do fractional distillation of water and separate out the component which is richer in heavy hydrogen, and it depends on
the fact that you have a difference in concentration between the liquid and the vapor. This is used a lot in geochemistry; they actually use it to figure out where a wine was produced, because the concentration of deuterium in the wine depends on the year and the location, so it's also a way to do a sort of wine forensics. And if you followed the Rosetta mission to the comet, there was all this discussion about how much deuterium there is in comet ice compared to the water on Earth, and the question is: does the fact that it's different mean that the water on Earth doesn't come from comets, or does it come from the fact that in 4 billion years comets have lost light hydrogen relative to heavy hydrogen? All of these things can actually be computed, and this is very nice because it is typically a very delicate balance, very small energy differences, but it's something that experimentally you measure basically by mass spectrometry, and it's very easy to separate hydrogen and deuterium, so they can measure the concentration of deuterium with as many digits of accuracy as you could desire. So this is nice for two reasons: first of all, it's a direct test of an effect which is purely quantum mechanical, and also, if you are interested in doing simulations of matter in the condensed phase, it is a fantastic way to benchmark models, because you have a very, very delicate effect that you can measure very accurately, there is no debate about the experiment, it's very, very clean, and you can also compute it by simulation. What you need to do is sort of nice: you have to do a thermodynamic integration, so you basically have to run many different simulations of your system in which you virtually change the mass of one of your particles towards the mass of deuterium, and you can show that, by computing the kinetic energy for the different values of the mass and integrating, you get
something that corresponds to the ΔG for the exchange between the two phases. You can do this for ab initio simulations of water, for instance. I don't know if you have been exposed to ab initio calculations, but people always argue about which functional they used, and here you can actually take the different flavors of density functional theory and compare them with experiment. These are tiny energy differences: here we are at room temperature, so kB T is 25 meV, and these are a small fraction of kB T, really a few meV of difference in free energy between the different phases. But you can compute it, you can measure it experimentally, and so you can, for instance, understand something about why PBE (a popular version of DFT, if you know what PBE is) doesn't work, while the most expensive versions work somewhat better. And then, I find it really nice that people are now starting to look at this. The difficulty when you want to look at quantum effects is that you need something which is really selective for quantum effects, and you can get very subtle but interesting effects. Here, for instance, you can do something called sum-frequency spectroscopy: basically, you have two infrared lasers, you scatter out something at the sum of the two frequencies, and you detect that. Now, in order to have a non-zero transition dipole for the sum-frequency process, the environment has to be non-centrosymmetric, and it has to be non-centrosymmetric on the length scale of the wavelength of the light, so you need an environment which is non-centrosymmetric on the length scale of hundreds and hundreds of nanometers. What's nice about this experiment is that if you do it for a liquid, you only get a signal from the surface, because in the bulk, on a length scale of hundreds of nanometers, you are homogeneous and isotropic and centrosymmetric, and the only place where you have a broken symmetry is at the surface.
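As an aside, going back for a moment to the thermodynamic integration over the mass: you can check the machinery end to end on a toy problem, because for a harmonic oscillator both the free energy and the quantum kinetic energy are known in closed form, and the derivative relation dA/dm = -(E_kin)/m can be verified directly. A sketch with made-up parameters (hbar = kB = 1; the force constant and masses below are illustrative, not taken from the talk):

```python
import numpy as np
from math import log, sinh, sqrt, tanh

# Toy check of the mass thermodynamic integration, dA/dm = -<E_kin>/m,
# on a single quantum harmonic oscillator with frequency w = sqrt(k/m).
hbar = kB = 1.0
T = 0.1
beta = 1.0 / (kB * T)
kspring = 4.0            # force constant of the toy "bond" (illustrative)
mH, mD = 1.0, 2.0        # toy "hydrogen" and "deuterium" masses

def ekin(mass):
    # Quantum kinetic energy (half of the total energy, by the virial theorem)
    w = sqrt(kspring / mass)
    return (hbar * w / 4.0) / tanh(beta * hbar * w / 2.0)

def helmholtz(mass):
    # Exact free energy of the quantum harmonic oscillator
    w = sqrt(kspring / mass)
    return kB * T * log(2.0 * sinh(beta * hbar * w / 2.0))

# Thermodynamic integration: Delta A = -Int_{mH}^{mD} <E_kin(m)>/m dm
m = np.linspace(mH, mD, 2001)
g = np.array([-ekin(mi) / mi for mi in m])
dA_ti = float(np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(m)))  # trapezoid rule

print(dA_ti, helmholtz(mD) - helmholtz(mH))  # the two numbers agree
```

In the real calculation the kinetic energy is of course not analytic: you measure it with a path integral estimator at a few intermediate masses and do the same integral, and the ΔG of exchange between two phases is the difference of two such integrals.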
And so you can detect a signal which comes only from the surface of your sample. These guys in Boston were doing experiments measuring the signal from the stretching peak in mixtures of H2O and D2O, and what they were seeing was that as you decrease the fraction of H2O the OH stretch signal goes down, and as you increase the fraction of D2O the OD stretch goes up. Not very surprising. What's surprising is that the ratio of the intensities of the two peaks was not consistent with the concentration in the bulk. The problem they had was that these are complicated experiments, so the scatter of the experimental data was pretty much all over the place. The red and the blue points are basically the estimated surface concentration, well, the excess or depletion of hydrogen and deuterium at the surface, and they were not so sure they could publish this, because the results were all over the place, even though you could clearly see a trend in which, for a half-and-half mixture, you had somewhere between a 1 and 6 percent change in concentration. So we could actually run a simulation for this, and the nice aspect of studying it from a computational point of view is that you can really go and look at individual atoms. They have very good sensitivity for the surface, but we can look, for instance, at the very, very first surface layer, and then, just by looking at the orientation of a water molecule and its position relative to the surface, we can distinguish hydrogen atoms that stick out of the surface, hydrogen atoms that point down, and hydrogens that lie flat on the surface, and we can count how often we find deuterium or hydrogen in each place. You can see that the only place where there is a difference with the bulk is really just the OH bonds that stick out of the surface. The point is that we can compute this with very, very high accuracy, and so theoretically we had these
black lines, these black dots, basically; the error bar is tiny, and we could show that the simulation points were lying perfectly on a Langmuir adsorption kind of curve, so basically a curve with a single parameter could describe perfectly the concentration dependence of the surface excess. So they could fit their noisy data to the same curve, and, just out of luck, they got something that falls almost perfectly on top of theory. I mean, the perfect agreement is clearly fortuitous, but I think this is a nice story of how simulation could give validation and a reference model for an experiment which was very, very hard to interpret based on the experimental data alone. So, final thing, before we take a break and go to the computer room: I just want to show you briefly other kinds of quantum effects that you can detect in water. This is particularly interesting because it shows how much quantum effects can change our understanding of something as common as water at the atomic level. If you look at a simulation of classical water, so ab initio molecular dynamics but with the nuclei treated classically, you can look at all the hydrogen bonds that you find and ask yourself: what is the angle of the hydrogen bond, and what is the position of the proton along the oxygen-oxygen distance? This is what is called the proton-transfer coordinate, the difference between the covalent bond length and the hydrogen bond length. If you do your simulation classically, this is always negative: your hydrogen is always closer to the oxygen it is covalently bound to than to the other one. But if you switch on quantum effects, the picture changes quite a lot, and you start having some extreme quantum fluctuations that bring the proton closer to the acceptor oxygen than to the oxygen it was covalently bound to. And this is actually coupled to electronic fluctuations, so here I'm basically looking at
the electronic structure close to a hydrogen bond: on the y axis I have a measure of whether an electron pair is more like a bond or more like a lone pair. If you do the simulation classically, it's very clear that some electrons are bond-like and some electrons are lone-pair-like, but if you switch on quantum effects, well, you increase the fluctuations along the proton-transfer coordinate, and you also start having electrons that don't really know what they are. Now, these kinds of quantum effects in room-temperature water don't lead to any major observable effect, because once your proton has quantum mechanically hopped onto the nearby water, the energy to separate the two charges is too large, so it goes back. However, if you perturb the hydrogen bond network of water, for instance by increasing the pressure, the interplay between quantum effects and this classical perturbation of the hydrogen bond network can lead to huge effects. These are simulations at 10 GPa and 750 Kelvin, so we are basically in supercritical water, and still quantum effects matter a lot. Here I'm just analyzing the simulation to see how many neutral waters I have, compared to H3O+ or OH-. If I look at the classical simulation and count how many ionic pairs I detect, pretty much all the snapshots of my simulation are fully neutral; however, you see there are a few atoms jiggling around, because the increasing pressure is bringing the oxygen atoms closer to each other, and that makes it easier for the proton to delocalize. Now, if you also put nuclear quantum effects on top of that, it becomes very easy for the protons to move around, and your system actually becomes superionic: by the end of the simulation all the protons have exchanged with each other. There is also a small effect on the density, which is interesting because it's counterintuitive: normally, if you switch on quantum effects
you would expect the density to go down, because the de Broglie wavelength of your particles is different from zero, so they swell up; but actually you can see that the quantum simulation has a slightly higher density than the classical one. Then you can also try to look at how quantum effects couple to an excess proton at room temperature. Here, in a classical simulation, this is showing how much the presence of a proton (you have an acid, so you have an excess proton) perturbs the hydrogen bond network of water; basically, beyond 3.5 ångströms from the proton, the perturbation is gone. If you switch on quantum effects, well, everything gets more blurred, because you can have larger quantum fluctuations, but you also see that you continue to have an impact on the degree of delocalization up to much longer distances, and this is not a purely electrostatic effect: it really has to do with the interplay between the hydrogen bond network and quantum effects. Now I just want to briefly introduce what we will do in the lab after the coffee break. You might ask yourselves: OK, you have introduced path integrals, colored noise, how are we going to do this? I have my favorite ab initio molecular dynamics code, CP2K, ABINIT, Quantum ESPRESSO (I suppose Quantum ESPRESSO is quite popular in Trieste). Well, it's a lot of work to implement all of this stuff into your favorite code, so rather than suggesting that you should do that, what I did was to make a Python code that communicates with your ab initio atomistic simulation code, the one that you love most, using internet sockets. The two basically run independently: you have to make very, very small changes to your electronic structure code, and you get all of this automatically. And, you know, I discussed this fitting, this thing where you can play around with the form of the noise; you don't have to do this in order to get
the parameters that you need to simulate quantum effects in a certain system: there is a website where you just go, select which kind of colored noise you want, pick the code you want to use it with, and you get the stuff you need to paste into your input to get it to run. So let me just thank the people who were involved in this. This is really the work of almost 5 years, starting from the second half of my PhD up to now. It started back in Lugano during my PhD, with Giovanni, who is now at SISSA and was working mostly on the classical sampling problem with colored noise; then in Oxford, with David, we worked on combining this with quantum effects; and now in Lausanne we are refining it, putting the final touches, higher-order propagators and so on. And Ali, who works here at ICTP, is also involved in many of the projects, along with many other people. So, to wrap up, I hope that you walk away with the feeling that nuclear quantum effects are something to consider: they might or might not be important for your problem, but it's generally worth checking. The standard way is path integral molecular dynamics, which is exponentially better than solving the Schrödinger equation but still very demanding. If you want to make it faster you can use colored noise: if you just want a qualitative idea you can use the quantum thermostat, and if you want to be systematic you can combine path integrals and colored noise, and you can get this fairly easily. As you will see, it's not too hard to run. So thank you a lot for your attention, and let me know if you have any questions.
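A footnote on the socket idea mentioned for the lab: the point of the design is that the path integral engine owns the nuclear positions while the electronic structure code is reduced to a force server, so it needs almost no modification. Here is a toy sketch of that pattern in Python; the wire format below is invented for illustration (it is not i-PI's actual protocol), and a harmonic force routine stands in for the real electronic structure code:

```python
import socket
import struct
import threading

import numpy as np

def recv_exact(conn, nbytes):
    """Read exactly nbytes from a socket (recv may return partial data)."""
    buf = b""
    while len(buf) < nbytes:
        chunk = conn.recv(nbytes - len(buf))
        if not chunk:
            return buf
        buf += chunk
    return buf

def force_driver(conn, nat, k=0.5):
    """Stands in for the electronic-structure code: receives positions,
    sends back forces for a toy harmonic potential V = k/2 * |q|^2."""
    while True:
        raw = recv_exact(conn, 8 * 3 * nat)
        if len(raw) < 8 * 3 * nat:
            break
        q = np.array(struct.unpack(f"{3 * nat}d", raw)).reshape(nat, 3)
        conn.sendall(struct.pack(f"{3 * nat}d", *(-k * q).ravel()))

nat = 2
engine_end, driver_end = socket.socketpair()
threading.Thread(target=force_driver, args=(driver_end, nat), daemon=True).start()

# The "path integral engine" side: ship positions, get forces back.
q = np.array([[1.0, 0.0, 0.0], [0.0, -2.0, 0.0]])
engine_end.sendall(struct.pack(f"{3 * nat}d", *q.ravel()))
forces = np.array(
    struct.unpack(f"{3 * nat}d", recv_exact(engine_end, 8 * 3 * nat))
).reshape(nat, 3)
print(forces)  # equals -0.5 * q for this toy potential
engine_end.close()
```

The engine only ever sees positions in and forces out, which is why swapping one electronic structure code for another, or attaching one force client per replica, requires no change on the path integral side.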