Welcome. I'm going to try a slightly different setup and take you through the equations by drawing and writing on my iPad. It's going to be a little bit of chicken-scratch writing, I'm afraid, because by default I can't really rest my hand on the iPad without it dropping out of presentation mode. The idea is that I will try to derive the Boltzmann distribution for a general large system like the one you see on the screen, and we want to use this to make sense of what's on the left versus the right here, and to talk about order, entropy and everything. This is not really hard; it's actually very simple math. You're going to need to understand the exponential function and Taylor expansions, but that's it. The thing that might be slightly confusing is that we can't really assume anything about the system, so the part that is possibly the hardest is this slide, where we're going to need to introduce some sort of general observations. Observation is the key word here, because I'm interested in my small part of the system. This could be a protein in a test tube, it could be a small lab setup at AlbaNova, or it could be a brick in the classroom; anything, really. This part does not exist in isolation, but is part of a larger universe, the rectangle here. I'm going to use a special word for the universe: I'm going to call it a thermostat. I only care about energy here, and if I assume that my small part can exchange energy with the rest of the universe, well, exchanging energy is intimately related to temperature control, and that's simply why I call it a thermostat, because energy will usually flow from high temperature to low temperature; but that's a parenthesis. So the properties my small system will have are interconnected with the large system. If I have a total amount of energy, capital E, in the universe, and I put a small part of that, epsilon, in my small system, then by definition the rest, E minus epsilon, has to be in the thermostat.
If there are lots of ways of making this division, then there are lots of microstates that realize it, and it is likely that I will observe one of them. On the other hand, if there are only a few ways of doing it, for instance if I try to put virtually all my energy into my small system, there will be fewer such microstates available, and it's going to be less likely to see them. I bet that sounds abstract to you, so I think we're going to jump right into the equations. Remember what I said: the equations frequently help you rather than confuse you. We're going to need a couple of definitions, though. When we talk about probabilities, the easy way is to say that the probability of something happening is proportional to the number of states that represent the thing happening, whatever the thing is. The thing here is what I have in the figure: I want to put energy epsilon in my small system, which corresponds to putting energy capital E minus epsilon in the large system. Let's introduce a completely arbitrary letter for the number of microstates, capital M. That is a function of the energy in general, so I will write it as M_therm(E minus epsilon), and for now you will just have to buy that the probability of this happening, that is, the probability of my small system having energy epsilon, is proportional to whatever this number M_therm(E minus epsilon) is. In principle we don't know more, but well, we know a little bit. This is a number of states, so unless you have infinite energy or something, there will be at least one state; it has to be countable, and we can say that it's larger than zero. If it's larger than zero, I'm allowed to completely arbitrarily introduce a definition: I can take the logarithm of any positive number, so let's, just for convenience, take the logarithm of this function M and multiply it by a constant.
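Written out, the two statements so far are:

```latex
% Probability that the small system holds energy \epsilon:
P(\epsilon) \;\propto\; M_{\mathrm{therm}}(E - \epsilon)

% Definition (a pure relabeling for now; k is only a unit-conversion constant):
S(E - \epsilon) \;=\; k \,\ln M_{\mathrm{therm}}(E - \epsilon)
```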
Now, you don't know what that constant is; it's just unit conversion, and for the whole expression I will use the letter S. Of course you know that this is going to be the entropy, but pretend you don't for now. Because M is a function of E minus epsilon, S will in general also be a function of E minus epsilon, and for now that's pretty much all we know; k is just a unit constant. Let's see where that takes us, shall we? This is what you had on slide 15, I think, of lecture 4. I will take you through it. The very first equation here is just a recap of the last equation from the last slide: S is k multiplied by the logarithm of the number of states. We don't know anything about that function, but when we don't know anything about a function, it's usually a great idea in physics to start looking at its series expansion. I'm going to need a clean sheet to do that, and I don't have sheets here, but at least I have clean electrons. They're green: I've reused the electrons, it's environmentally friendly. For a general function, let's call it f, I can expand around an arbitrary point x0. So if f is a function of x, I can approximate f close to the position x0 by looking at a small displacement delta x from x0. If I do a Taylor expansion around x0, that is f at the point x0, plus the displacement delta x multiplied by the first derivative, df/dx. I could use parentheses and write out the argument here, but to follow the Finkelstein book, which loves the physics notation, I use a vertical bar instead; that means I evaluate the function at x0, so it's no longer a function but a value. Then I should divide this by one factorial, but one factorial is one, so I'm not going to do anything more there. Then we have a second-order term: delta x squared divided by two factorial, multiplied by the second derivative, d2f/dx2, taken at the position x0. And then we have a third-order term: delta x to the power of 3, divided by three factorial, times d3f/dx3 evaluated at x0.
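The expansion I'm writing out on the screen is the standard Taylor series:

```latex
f(x_0 + \Delta x)
  \;=\; f(x_0)
  \;+\; \Delta x \left.\frac{df}{dx}\right|_{x_0}
  \;+\; \frac{(\Delta x)^2}{2!} \left.\frac{d^2 f}{dx^2}\right|_{x_0}
  \;+\; \frac{(\Delta x)^3}{3!} \left.\frac{d^3 f}{dx^3}\right|_{x_0}
  \;+\; \cdots
```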
I hope you follow me, because I've run out of screen space now. In general you could of course say that you don't know anything, so you would just truncate this at the 0th, 1st or 2nd order term, but that's a bit dangerous, because remember we said that we wanted to prove that this is universally true, so I can't do something arbitrary; I need to prove which terms are needed and which are not. For a general function I can't do that, but in this case it's not a general function: I have a function S that depends on the energy E, and I would argue that both S and E are proportional to the size of the system. For the energy this is probably fairly easy: if I have one kilo of water under particular conditions, and you take two kilos of water instead, the total energy is going to be twice what I had from the start; hopefully you buy that. With S it's not quite as easy. Remember that the total S is the logarithm of the total number of states, and if I have many states, that has to do with permutations: if I have two parts in the system, the total number of ways I can organize it is the product of M1 and M2, so S is going to be the logarithm of M1 multiplied by M2 multiplied by M3, etc. But then I can just use the logarithm laws, so this corresponds to the logarithm of M1 plus the logarithm of M2 plus the logarithm of M3, etc., and that is what I wanted to show. So if you have two parts that are identical, the total S, which you don't yet know is the entropy, is simply going to be the sum of these, so S too will be proportional to the system size. Going back to the equation we had on the previous slide: both S and E are proportional to the system size, which is really cool, because that's going to give us something. If we now take the Taylor expansion up here, and instead of f we have S and instead of x we have E, then the zeroth-order term, well, I can't say a whole lot about that; that's in principle S, so
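The step from multiplying state counts to adding entropies, written out:

```latex
M_{\mathrm{total}} \;=\; M_1 M_2 M_3 \cdots
\quad\Longrightarrow\quad
S \;=\; k\ln\!\left(M_1 M_2 M_3 \cdots\right)
  \;=\; k\ln M_1 + k\ln M_2 + k\ln M_3 + \cdots
```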
that's going to be proportional to the system size. The first-order term: look at the derivative. In the numerator we have something related to S, and we divide by something related to the argument, which was E; so I have something proportional to the system size divided by something else that is also proportional to the system size, and that is going to be roughly constant, not proportional to the system size. Well, if it's constant, we might need that term too. Let's continue to the second-order term: at the top I have something roughly proportional to the system size, S, and in the denominator I have something proportional to the square of the system size, so this term is going to be proportional to one over the size of the system. And remember what I said on the first slide: we're going to look at properties of very large systems. So as the size of the system goes to infinity, this term disappears, that term disappears, and all the higher-order terms disappear too. We only have to keep the constant term and the first-order term, and in the limit of a large system this is going to be exact, not an approximation. With that, I think we can go back to the previous slide. What I've done here is use the equations from the last slide exactly; the only difference is that instead of x plus delta x it says capital E minus epsilon, and the minus sign changes things like so: we have the 0th order term, that's S taken at the value E, just as before; delta x is now negative, so it turns into minus epsilon instead of plus delta x; and then we have the value of the derivative dS/dE taken at the value E. That's exactly the 0th and 1st order terms of the series expansion. This might not look particularly beautiful, but if we now take the top equation here again, I would like to go back to the number of states, because the probability, remember, was proportional to this M. Let's try to solve for M. How do I solve for M? Well,
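So in the large-system limit the expansion truncates exactly:

```latex
S(E - \epsilon) \;=\; S(E) \;-\; \epsilon \left.\frac{dS}{dE}\right|_{E}
\qquad \text{(all higher-order terms vanish as the system size} \to \infty\text{)}
```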
I start with the first equation again: I divide both sides by k and then exponentiate, so M is the exponential of S divided by k. That's what I do here, but now I need it for M(E minus epsilon), which is the exponential of S(E minus epsilon) divided by k. The first step here is just the exact definition, inverted. In the second step, let's take the series expansion we had and put it into the equation. The first term in the argument of the exponential is S(E) divided by k; that is a constant term, so let's break it out into a separate exponential factor, which is going to make life easier. So we get a first factor, the exponential of S(E) divided by k, multiplied by a second factor, the exponential of minus epsilon times dS/dE, divided by the same k. This is beautiful. I know that you don't think it's beautiful yet, but this explains everything. The entire first part is just a constant, and we are physicists, so when it comes to a constant I will just scratch it out and say, yeah, whatever, it's a constant. Then I have some stuff remaining: the probability of observing a state was proportional to this number of microstates, and since I only say "proportional to", I can absorb the constant from the previous slide into the proportionality. So the probability is proportional to the exponential of minus the energy in the small part of the system, multiplied by this derivative dS/dE taken at the point E, divided by k (sorry, I may have been using kappa and k interchangeably here; they are the same constant). I know you don't think this looks beautiful either, but the point is that dS/dE is something natural; it comes out of the properties of the system. And if it's natural, let's just give it a name: call it 1 over T. That is just a definition, together with the unit constant k we had before. What this will give us is a very familiar equation: this
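Putting the pieces together:

```latex
M_{\mathrm{therm}}(E-\epsilon)
  \;=\; e^{S(E-\epsilon)/k}
  \;=\; \underbrace{e^{S(E)/k}}_{\text{constant}}
        \cdot\,
        e^{-\epsilon \left.\frac{dS}{dE}\right|_{E} / k},
\qquad
\frac{1}{T} \;\equiv\; \left.\frac{dS}{dE}\right|_{E}
```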
probability is going to be proportional to e raised to minus epsilon divided by kT: the Boltzmann distribution, which is exactly what we wanted to prove. And the cool thing here is that we have not assumed anything about a cylinder of gas or anything; we have just looked at general distributions of energy in a system and the properties of exponentials and logarithms. Quod erat demonstrandum.
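As a numerical sanity check, here is a small sketch (my own illustration, not part of the lecture): take a toy "universe" of N harmonic oscillators sharing Q energy quanta (an Einstein solid), count the thermostat microstates M_therm(Q − ε) directly with stars-and-bars combinatorics, and compare the resulting probabilities to the Boltzmann form exp(−ε/kT), where for this model 1/kT = dS/dE ≈ ln(1 + N/Q). The values N = 1000 and Q = 5000 are arbitrary choices.

```python
from math import comb, log, exp

# Toy "universe": N oscillators sharing Q indistinguishable energy quanta.
N, Q = 1000, 5000

def omega(q):
    """Number of ways to distribute q quanta over N oscillators
    (stars-and-bars)."""
    return comb(q + N - 1, N - 1)

# Probability that the small system holds eps quanta is proportional to
# the number of thermostat microstates with the remaining Q - eps quanta.
weights = [omega(Q - eps) for eps in range(6)]
probs = [w / weights[0] for w in weights]      # normalized to eps = 0

# Boltzmann prediction: P(eps) proportional to exp(-eps/kT), with
# 1/kT = dS/dE = d(ln omega)/dq ~ ln(1 + N/Q) for a large Einstein solid.
beta = log(1 + N / Q)
boltzmann = [exp(-beta * eps) for eps in range(6)]

for eps, (p, b) in enumerate(zip(probs, boltzmann)):
    print(f"eps={eps}: counted {p:.4f}  Boltzmann {b:.4f}")
```

The two columns agree to three or four decimal places already at this modest system size, and the agreement improves as N and Q grow, illustrating that the exponential form becomes exact only in the large-system limit.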