Let's turn now to kinetic theory. What is it? It's trying to understand macroscopic properties on the basis of a microscopic picture, in particular the random motion of molecules. Last week we saw that in classical statistical mechanics there is an equipartition principle. It says there's a contribution of ½τ to the internal energy associated with each accessible degree of freedom, where a degree of freedom means a variable on which the energy depends quadratically, and accessible means that the temperature is high enough so that this degree of freedom is excited. We talked about the example of a diatomic molecule. In addition to its center-of-mass motion in three dimensions, whose energy depends on the three components of momentum — px², py², pz² for each molecule in the gas — there is a vibrational degree of freedom, vibrating along the axis of the molecule, and rotational degrees of freedom about two axes. The diagram's getting a little indecipherable, but think of the molecule as a little stick, which has a nonzero moment of inertia for rotation about two different axes. So if everything is accessible, the internal energy per particle would be (3/2)τ — that is, (3/2)kT — associated with the center-of-mass motion: three quadratic variables for each molecule, px², py², and pz². Then plus another τ, because there are two rotational kinetic degrees of freedom — the kinetic energy of rotation about two different axes, or if you like the angular momenta l1² and l2² about two different axes — another two quadratic degrees of freedom. And then another τ associated with vibration, because now we have both a p² and an x²: the p² is the kinetic energy of vibration along the axis, and the x² is the restoring potential.
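As a quick numerical illustration of equipartition (not from the lecture — a sketch in assumed units where the quadratic coefficient and Boltzmann's constant are 1), sampling a single quadratic degree of freedom from its Boltzmann distribution gives an average energy of τ/2:

```python
import numpy as np

# Equipartition check: for a degree of freedom with energy E = (1/2) q^2 and
# Boltzmann weight exp(-E/tau), q is Gaussian with variance tau, so the
# thermal average of the energy is tau/2.  Units and tau are illustrative.
rng = np.random.default_rng(0)
tau = 1.7                                            # arbitrary temperature
q = rng.normal(0.0, np.sqrt(tau), size=1_000_000)    # Boltzmann-distributed q
print(np.mean(0.5 * q**2), tau / 2)                  # the two should agree
```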
So if everything is accessible, we have (7/2)τ for the energy per particle in the case of a diatomic molecule. [Student:] Why does px², py², pz² get three halves, while the others only get one? Well, everybody gets one-half τ — one-half τ per degree of freedom. This is three degrees of freedom, this is two, and this is two. And what does it mean to be accessible? When the temperature becomes low enough to be comparable to the level spacing of the quantum excitations, either rotational or vibrational, quantum effects become important. So if kT, that is τ, is small compared to ħ²/2I — ħ² over twice the moment of inertia — then rotation freezes out, and when the temperature is small compared to ħω, where ω is the vibrational frequency, vibration freezes out. For typical molecules, vibration freezes out at a higher temperature than rotation does. Staying with classical statistical mechanics, we can also talk about the distribution of velocities for particles in the gas. As we've discussed many times, when we count orbitals we have to integrate over position and momentum in three dimensions, and there's a power of 2πħ associated with each dimension as a normalization factor. If I integrate the position of a free particle over the whole box, there's a factor of volume; so I write a factor V, and then, assuming we have a rotationally invariant distribution of momentum, I integrate over the orientation of the momentum: the area of a shell in momentum space, 4πp² dp, divided by the normalization factor (2πħ)³. Now, the people in the 19th century who were trying to understand statistical mechanics didn't know about ħ, but they did have the idea that when we count states, we should count volume in (x, p) phase space. So they could write down a formula like this; they just didn't know how to normalize it.
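Here's a small check of this way of counting, not from the lecture: for a particle in a box the orbitals are labeled by positive integers (nx, ny, nz), and the number of orbitals with nx² + ny² + nz² ≤ R² should approach the volume of the corresponding octant of a sphere, (1/8)(4/3)πR³, as R grows — the lattice count and the phase-space volume agree up to surface corrections.

```python
import numpy as np

# Count positive-integer lattice points inside an octant of radius R and
# compare with the continuum (phase-space) estimate (1/8)(4/3) pi R^3.
R = 50
n = np.arange(1, R + 1)
nx, ny, nz = np.meshgrid(n, n, n, indexing='ij')
exact = np.count_nonzero(nx**2 + ny**2 + nz**2 <= R**2)
estimate = (1 / 8) * (4 / 3) * np.pi * R**3
print(exact, estimate)   # agree to a few percent (surface terms ~ 1/R)
```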
For a lot of purposes, the normalization doesn't matter. For non-relativistic particles, with momentum equal to mass times velocity, we can say the number of states per interval of speed is some constant — who knows or cares what — times a factor of volume, weighted by v², where v here is the magnitude of the velocity. That v² is coming from the p² in the density of orbitals, and when we normalize, the constant c drops out. I want to know the probability distribution p(v), chosen so that when I integrate p(v) over all speeds I get 1 — so the denominator is the normalizing integral. And then there's the Boltzmann factor: so far I'm just counting orbitals, but Boltzmann and Maxwell also knew that the probability of an orbital being occupied — they didn't use that word — would be suppressed by a Boltzmann factor, e^(−mv²/2τ), the kinetic energy divided by τ, and I should put that in the numerator. If we change to dimensionless variables, choosing x² = (1/2)mv²/τ, the normalizing integral becomes ∫₀^∞ dx x² e^(−x²): I'm multiplying by (m/2τ)^(1/2) twice here and once here, so I have to compensate with a factor of (m/2τ)^(−3/2). And this integral is just a number which everybody knows: √π/4. So when we divide by that factor, p(v) = (4/√π)(m/2τ)^(3/2) v² e^(−mv²/2τ). It's called the Maxwell distribution of velocities. It means that if we have an ideal gas, and I randomly pick a particle in the gas and measure its speed, and do that many times for many particles, I will find them distributed according to the Maxwell distribution. [Student:] Did you miss a v²? I sure did — thank you. He asked me if I missed a v².
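A minimal numerical sketch (assumed units m = τ = 1, not from the lecture) confirming the normalization and the standard mean speed ⟨v⟩ = √(8τ/πm) of the Maxwell distribution:

```python
import numpy as np

# Maxwell speed distribution p(v) = (4/sqrt(pi)) (m/2tau)^(3/2) v^2 exp(-m v^2/2tau),
# checked numerically on a fine grid, in units m = tau = 1.
m, tau = 1.0, 1.0
a = m / (2.0 * tau)

v = np.linspace(0.0, 12.0, 200_001)
dv = v[1] - v[0]
p = 4.0 / np.sqrt(np.pi) * a**1.5 * v**2 * np.exp(-a * v**2)

norm = np.sum(p) * dv        # should be ~ 1
mean_v = np.sum(v * p) * dv  # should be ~ sqrt(8 tau / (pi m))
print(norm, mean_v, np.sqrt(8.0 * tau / (np.pi * m)))
```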
Now suppose there's a small hole in the wall of the box, so that gas slowly leaks out. Each time a particle comes out through the hole, the distribution of the escaping particles will be biased, because faster particles come out more often per unit time — hence my extra factor of v. [Student:] Why is the correct weight v for the particles coming out through the hole? Because the number of particles coming out is going to go like the density of particles times a factor of the velocity: it's proportional to v times density. Think of this probability distribution as describing the density of particles in a certain speed window of width dv; the rate at which those particles come out in a given time is enhanced by a factor of the speed. So look at some little time interval, take all the particles that came out, and measure their speeds. [Student:] At least that has the right dimensions. Yes — and the point is that faster particles are more likely to come out in a given time interval. I'm just saying that there's a flux, and flux is proportional to number per unit volume times speed — flux meaning number of particles per unit area per unit time. We have a hole of some fixed area, and in each time interval, if I compare particles with two different speeds but the same density, the particles with the higher speed will be coming out at a rate per unit time which is enhanced by a factor of the speed. Then if I look at all the particles that came out and measure their speeds, I'll have a distribution enhanced by an extra factor of speed: weighted by v³ rather than v². So if I poke a hole and let particles leak out for a while, starting from a distribution which goes quadratically as a function of v, the particles that escape will be biased towards the higher end of the distribution.
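That extra factor of v can be checked numerically. This sketch (assumed units m = τ = 1, not from the lecture) compares the mean speed of the escaping beam, weighted by v³, with the mean speed in the bulk gas, weighted by v²; the ratio works out to 3π/8 ≈ 1.18.

```python
import numpy as np

# Effusion bias: the escaping beam is weighted by v^3 exp(-m v^2/2tau)
# rather than v^2 exp(-m v^2/2tau).  Compare the two mean speeds.
m, tau = 1.0, 1.0
v = np.linspace(0.0, 15.0, 400_001)
dv = v[1] - v[0]
boltz = np.exp(-m * v**2 / (2.0 * tau))

bulk_mean = np.sum(v * v**2 * boltz) / np.sum(v**2 * boltz)
beam_mean = np.sum(v * v**3 * boltz) / np.sum(v**3 * boltz)

print(beam_mean / bulk_mean)   # ~ 3*pi/8: escaping particles are faster
```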
So when particles escape, the mean speed of the gas left behind shifts down. Of course, if we just take away the high-speed particles, what remains is no longer a Boltzmann distribution, because it's no longer in equilibrium. But suppose we open the hole for a while, let some of the high-speed particles escape, and then close the hole. After a while, because of collisions among the particles, they'll redistribute the energy and come back to equilibrium. And after they re-equilibrate, or re-thermalize, the distribution will be shifted to the left: the particles will be cooler than before. They'll still have a Maxwell-Boltzmann distribution, but shifted to a smaller mean speed, hence a smaller temperature. So that's an example of cooling by letting particles escape, where the escaping particles are biased towards high speed, skimming off the upper end of the distribution. This evaporative cooling will be something to talk about in the context of reaching ultra-cold temperatures for dilute atomic gases. Now, a story which I think you've seen before, but which is fun to briefly revisit, is how we understand pressure from this microscopic point of view for an ideal gas. We have gas in a box, and we look at the collisions of the molecules with one of the walls of the box. Let's say this is the z direction and I'm looking at the wall which is normal to the z direction. Half the particles will be moving up and half will be moving down, so if I'm interested in the rate at which particles are bumping into the wall, I only consider the upward-moving ones, which means I take half of the density of the particles and multiply by the speed in the z direction to get the upward flux.
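Returning for a moment to evaporative cooling, here's a minimal numerical sketch (assumed units m = τ = 1; the speed cutoff is illustrative, and for simplicity it ignores the extra flux weighting of which particles escape): remove the fastest particles from a Maxwell-distributed sample and infer the re-thermalized temperature from the remaining mean kinetic energy via (3/2)τ′ = ⟨KE⟩.

```python
import numpy as np

# Evaporative-cooling sketch: draw 3D velocities from a Maxwell distribution,
# discard the fastest particles (as if they escaped), and infer the
# re-thermalized temperature of the remainder from (3/2) tau' = <KE>.
rng = np.random.default_rng(4)
m, tau = 1.0, 1.0
v = rng.normal(0.0, np.sqrt(tau / m), size=(1_000_000, 3))
speed = np.linalg.norm(v, axis=1)

kept = speed < 2.0 * np.sqrt(tau / m)              # let the fast tail escape
ke = 0.5 * m * np.sum(v[kept] ** 2, axis=1)
tau_new = (2.0 / 3.0) * ke.mean()
print(tau_new)   # < 1: the remaining gas is cooler
```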
When a particle bounces off the wall elastically, it gives an impulse to the wall which is twice the z component of its momentum: it has upward momentum mv_z before the collision and downward momentum mv_z after, so the change in momentum is 2mv_z. The pressure is the average force per unit area, and the force is the impulse absorbed by the wall per unit time: the impulse per collision times the rate at which collisions occur per unit area. So I can write the pressure as an average over the distribution of speeds in the gas: the density, which goes into the flux, times the mean value of mv_z² — because I have a 2mv_z in the impulse and a (1/2)nv_z in the upward flux. In other words, p = nm⟨v_z²⟩: the number of particles per unit volume times a factor of 2 times the mean kinetic energy of motion in the z direction. Equipartition tells me that classically this is a quadratic degree of freedom, with mean kinetic energy (1/2)mv_z² averaging to τ/2, so p = nτ. That's the ideal gas law, as interpreted from a microscopic point of view: the pressure arising from molecular collisions with the wall. What we'd like to do now is understand departures from ideal behavior — which we haven't talked about very much in this course — coming from the fact that the molecules actually do collide with one another, so the gas is not ideal. We're going to take a very naive point of view, but one which will be good for qualitative understanding and for order-of-magnitude estimates, which is to think of each one of the molecules as a billiard ball with some finite size, which can collide elastically with another billiard ball. The size is going to be important; maybe stupidly, I'll call it d — d for distance. Let d be the effective diameter of one of the billiard balls, and now let's consider
one of these billiard balls moving along, sweeping out a cylinder over time. Every once in a while there will be another billiard ball — not necessarily with its velocity oriented to collide head on, but such that there's a collision. When collisions occur, what I really should do is average over the impact parameters of the collision, but in our approximation we'll consider a collision to just be something that randomizes the direction of motion. So think of this cylinder being swept out by the moving billiard ball, and every once in a while the billiard ball forgets what its previous direction of motion was and heads off in a new direction: it moves for a while, then there's a collision, then it moves off in a new direction, then another collision, and so on. How often do collisions occur? That's determined by two things: one is the size of the particles, and the other is the typical separation between particles — the density of particles. We can think about it this way. If the molecule moves a distance L, the cylinder it sweeps out has a volume which is its cross-sectional area, πd², times the distance L that it travels. Is the particle going to bump into other particles when it moves a distance L? The volume per particle is one over the concentration, by definition, so the number of collisions is the typical number of particles contained in the volume of this cylinder: the swept volume divided by the volume per particle. Dividing by the volume per particle means multiplying by the concentration n, so the number of collisions is nπd²L. We'll call the mean free path the typical distance the particle travels before a collision occurs: setting the number of collisions to 1, the particle travels a distance given by the mean free path ℓ = 1/(nπd²), one over the concentration times πd². Let's now put in some
numbers: the concentration of an ideal gas at standard temperature and pressure — 22.4 liters per mole; that was quick — an ideal gas at 0 degrees Celsius and 1 atmosphere. Translating, that's a concentration of about 2.7 × 10¹⁹ particles per cubic centimeter, and what we find for the mean free path is of order 10⁻⁵ centimeters. So it's larger than the size of the molecules by a factor of maybe a few hundred or so, and it's larger than the mean distance between molecules too. In fact, up to factors like π and so on, the mean free path goes like the typical separation between molecules cubed — because that's just another expression for the volume per particle, one over the density — divided by d², what we call the cross-sectional area of a molecule. So if we compare the mean free path to the typical distance between molecules, the ratio goes like the separation divided by the size of the molecule, squared: the separation is large compared to the size of the molecule, and the mean free path is larger still. So when we speak about the behavior of physical systems over distance scales large compared to the mean free path: if we were following the motion of a particular molecule on that scale, it would get scattered many times, having its motion randomized over some characteristic distance, and its trajectory would look just like that. We call this the diffusive regime, and the behavior of a molecule which is scattered many times and walks around in random directions is called diffusion. It's important because it arises in many different physical contexts, of which we can consider a few. There's the diffusion of energy in, let's say, a gas where the temperature is not uniform: in some regions of the system the molecules are at higher temperature, in other regions lower. What happens? The energy diffuses: the higher-energy particles get scattered many times, and eventually the energy they carry gets spread out until the system equilibrates. The diffusion of energy is what we
call thermal conduction: it tells us how quickly heat moves from high temperature towards low temperature in the system. When we consider diffusion of electric charge, that's electrical conduction: it tells us how quickly electric charge flows from a region of high electric potential to low electric potential, with the electrons being scattered many times. Diffusion of momentum in a fluid governs what we call viscosity, the hydrodynamic resistance to flow. All of these are diffusive processes, and for all of them it's important that we're talking about a scale which is large compared to the mean free path, where microscopically the particles are scattered many times. Another context in which diffusion arises is mixing. Here's an example of mixing: I take a glass of water and put in a little drop of ink, and the ink starts to spread out — it looks less blue than before, and the region which is colored is bigger; I come back a few minutes later and it's spread out even more. Why is that happening? Because the ink consists of many molecules, which are being repeatedly scattered by water molecules, and that causes the distribution of ink molecules to gradually spread, giving rise to mixing. After a long time the ink will be uniformly distributed in the container. Many experiments with ink drops over the years, watching how fast they spread, reproduce something called Fick's law — something that's empirically observed to be true. What it says is: if I consider the flux of ink molecules — if I put down a surface somewhere in the water, the rate at which ink molecules flow across that surface per unit area per unit time — that's proportional to the gradient of the density, with a minus sign and a constant which I'll call capital D: J = −D ∇n. In the case of the ink, I started out with a high concentration of ink right in the center; near the edge of the droplet there's a big gradient in the density, and that makes the ink spread out, and that's how the ink
spreads out. D is called the diffusion constant, or sometimes the diffusivity. One thing we can do with this law of diffusion is combine it with the continuity equation. The ink molecules are conserved: there's some fixed number of ink molecules floating around in the water. So if I consider some little cell — a volume in the water — there are two things I can equate. One is the total flux of ink through the walls, integrated over the boundary of the cell: that's the total number of ink molecules leaving per unit time, obtained by integrating the flux over the boundary of the cell, and I can use the divergence theorem to write it as an integral over the volume of the divergence of the flux. And another expression for the rate at which ink molecules are leaving the cell: the density of ink molecules integrated over the volume is the total number of ink molecules in the cell, and the rate at which that number decreases is minus the integral of the time derivative of the density. Since I can equate these two for any cell, there's a continuity equation satisfied by the ink: the time derivative of the density of ink molecules equals minus the divergence of the flux, ∂n/∂t = −∇·J. But according to Fick's law, the flux can be written as minus D times the gradient of the density, so −∇·J becomes D times the gradient squared of the density. Our conclusion is that the density of ink molecules obeys a partial differential equation: n as a function of time and position satisfies ∂n/∂t = D ∇²n, which is called the diffusion equation. What we'd like to do is understand that equation from a microscopic point of view — how it arises from molecular collisions. Who was the first to explain the diffusion equation from a microscopic point of view? [Student:] Einstein. And in what year?
1905. And what Einstein realized is that we can understand diffusion by thinking of molecules making random walks, like this picture: each one of the ink molecules is actually walking around, turning in a random direction. Each ink molecule has no preference between going one way or another — it moves in an isotropic way. But if we have a gradient, where there are more ink molecules on the left than on the right, then if I look at some particular cut of the system into two parts, we'll have more ink molecules flowing from left to right than from right to left, because there are more molecules on the left side of the cut. And that will cause the distribution to smooth out and the gradient to go down — that's what the diffusion equation describes. [Student:] So the smoothing is not due to interactions? It's due to collisions, but microscopically each ink molecule doesn't care whether it goes left or right — it's as likely to go left as right. There are just more on the left, and therefore more going from left to right than from right to left across the cut. So if I have some initial distribution of ink molecules — let's say it looks like a Gaussian — what's going to happen? It's going to spread out, and that's what the diffusion equation describes. I want a simple model in which we can see how that flow from high density to low density results from decisions made by individual molecules, at the single-molecule level. It's a very oversimplified model, but it makes the point very nicely, and I'll stick with one dimension to start, since that's the easiest to discuss mathematically. So I've got a one-dimensional gas — not an ideal gas, the particles are colliding. Let's start with what a particular particle does, and let's imagine, although the system might really be continuous, making it discrete for purposes of analysis. I'm going to consider a particle which lives on a lattice, a set of
discrete points. There will be some time step which I'll call ε, and a lattice spacing which I'll call δ: δ is the spatial distance between neighboring lattice sites, and ε is the time step. Every time step ε, the particle decides to hop from its current position either to the site on the left or the site on the right. It always hops — always by one lattice spacing — but it's indifferent about whether it hops to the left or to the right: in each time step it flips a coin to decide which of its two neighboring sites it goes to. Imagine there's some probability distribution to begin with, describing where the particle is at a particular time. In each step that probability distribution will change a little, and we'd like to consider how the probability distribution changes over many time steps. We'll see that if we take the formal limit in which the time step goes to zero, we get the diffusion equation. Let's see how that works. In fact, you might recognize the model — it's from the first lecture, but with the notation changed to protect the innocent. I'm going to consider the position to be s times δ, where δ is the lattice spacing, and the time to be n times ε — I may regret calling it n; it's not the same thing as the density, but that's the notation I have in mind — where n is an integer. When n goes to n + 1, we advance by one time step; when s goes to s + 1, the particle moves to the site on the right, and when s goes to s − 1, it moves to the site on the left. Suppose we start at time zero, that is n = 0, at the origin, s = 0, with probability one. What does the distribution look like after many steps? Well, what is s going to be? It's the number of steps we make to the right minus the number we make to the left — that's where we wind up, since each time we go either left or right. So s, the net number of steps to the right we've
made, is N_right − N_left; it can be either positive or negative, with the particle winding up to the right of the origin or to the left. And the total number of time steps n is N_right + N_left, the total number of steps: some of them to the right, some to the left. So this is just like tossing a coin n times and asking what the excess of heads over tails is — a calculation we've done before. We know we get a Gaussian distribution for s after n flips, or n time steps, using the Stirling approximation, good when n is large: P(s, n) ≈ (1/√(2πn)) e^(−s²/2n). [Student:] So you're just describing a random walk? I'm just doing a random walk, and it's unbiased: I go with probability one half to the left or to the right. I can also describe this with a probability density on the line: if I want to know the probability that the particle is found in some interval whose length is δ times an integer, I multiply the probability density by the length of the interval, which brings in a factor of the lattice spacing δ. Then, since the time is nε and the position is x = sδ, I can write the density as 1/√(2π(δ²/ε)t) — that's just putting the δ inside the square root and writing n as t/ε, so that δ²n becomes (δ²/ε)t — and in the exponential I have e^(−(ε/2δ²)(x²/t)), again just writing s as x/δ and n as t/ε. Now I'd like to convince you that we can get this same formula in a different way: by writing down a microscopic equation that describes the evolution of the probability distribution, which in the end will become the diffusion equation. We start with a difference equation using this discrete model, and then take a limit in which the time step becomes small and the lattice spacing becomes small. So here's a difference equation
for the probability distribution, which depends on the integer time step n and the integer position s — the number of lattice spacings to the right of the origin when s is positive, to the left when s is negative. The observation we make is this: if you're at site s at time n, where were you at time n − 1? In each step you move either to the left or to the right, so if you're at site s at time n, then at time n − 1 you had to be either at s + 1, having then taken a step to the left, or at s − 1, having taken a step to the right — and those two things are equally probable. In other words, the probability that you wind up at site s at time n is one half — the probability that you jump to the right — times the probability that you were at site s − 1 at time n − 1, plus one half times the probability that you were at s + 1 at time n − 1: P(s, n) = (1/2)P(s − 1, n − 1) + (1/2)P(s + 1, n − 1). Because if you were at s + 1 at time n − 1, then with probability one half you jumped to the left and wound up here, and similarly if you were at s − 1. I hope you believe that equation. So now I'd like an equation for how the probability of being at s changed between time step n − 1 and time step n, so I subtract P(s, n − 1) from both sides. The amount by which the probability of being at site s increased between time step n − 1 and time step n is — putting in a 2 to account for what I subtracted — P(s, n) − P(s, n − 1) = (1/2)[P(s + 1, n − 1) − 2P(s, n − 1) + P(s − 1, n − 1)]. And you can convince yourself that's the same thing as one half of [the probability of being at s + 1 at time n − 1 minus the probability of being at s at time n − 1], minus one half of [the probability of being at s at time n − 1 minus the probability of being at s − 1]. [Student:] In the density p(x, t), why do you divide by δ? Why did I divide by δ here?
That's because this is a probability density: by that I mean that the probability of being between x and x + dx is p(x) dx, and the probability of being at a given integer site is δ times that amount, so I had to divide by δ. So now I want to take a, so to speak, continuum limit. In other words, I want to imagine that the probability distribution changes very smoothly: the characteristic scale on which the probability distribution is changing is much bigger than the lattice spacing, and the probability distribution changes just a little bit in each time step, because I've chosen the time step to be very short. If p(x, t) is smooth on scales Δx of order δ and Δt of order ε, then we can approximate the difference equation by a differential equation. Think of the difference on the left-hand side as the elapsed time in one time step, which is ε, times the time derivative of the probability distribution at site s, or at some particular position x: so the left-hand side becomes ε times the partial derivative of the probability distribution with respect to time — and strictly speaking I should say it's evaluated at position sδ on the lattice and at a time between step n − 1 and step n, since the difference best approximates the derivative at the midpoint. And what I have on the right-hand side I can think of in terms of the lattice spacing δ times the spatial derivative of the probability distribution: the difference between the probability at site s + 1 and the probability at site s is δ times ∂p/∂x evaluated at the midpoint, position s + 1/2 in lattice units, at time n − 1; and then I have minus δ times ∂p/∂x evaluated at s − 1/2, since in the second term I'm subtracting the probability at site s minus the probability at site s − 1. So this
difference of two first derivatives of the probability distribution, taken at two positions which are one lattice spacing apart, we can think of as another factor of δ times the second derivative with respect to position, now evaluated at the midpoint of the interval, which is s, at time n − 1. My higher-order corrections have to do with the fact that here I have time n − 1/2 and here I have time n − 1, which I'll be able to ignore when I take the limit ε → 0. So the probability distribution in this smooth limit satisfies a differential equation: the derivative of the probability density with respect to time equals δ²/2ε — dividing both sides by ε, with the one half putting a two in the denominator — times the second derivative with respect to position: ∂p/∂t = (δ²/2ε) ∂²p/∂x². That looks just like the diffusion equation, if I identify δ²/2ε as the diffusivity, the diffusion constant. [Student asks where the one half came from.] The one half came from the one-halves in the difference equation on the board over there — but I don't want to drive the cameraman crazy by walking over there again, so just take my word for it. So D = δ²/2ε. And let's talk about the density instead of the probability: they just differ by a constant factor, so the density obeys the same differential equation as the probability distribution, namely the diffusion equation. Now, in the continuum language, we can consider the case we considered before, where we started out at the origin: what is the solution to the diffusion equation if the probability density at time zero is just a delta function? If there's any justice — if this is really to be interpreted as a differential equation governing a probability distribution — the solution will be non-negative, as probabilities should be, and
normalized at all times. Well, if we look at the expression we got just by thinking about the unbiased random walk — the probability density as a function of position and time, written in terms of what we've now identified as the diffusivity D = δ²/2ε — the solution with this boundary condition, starting out at the origin, is p(x, t) = (1/√(4πDt)) e^(−x²/4Dt). You can check, just by differentiating with respect to x twice and with respect to t once, that this expression obeys the diffusion equation; it's the unique solution obeying our boundary condition that the probability distribution is a delta function at time zero, and it is normalized correctly at all times — you can check the integral — and non-negative. So our delta function starts out perfectly localized at time zero, then at a later time here's a broad Gaussian, and at a still later time a broader Gaussian. The variance of this Gaussian tells us the mean square value of x: ⟨x²⟩ = 2Dt, where D is the diffusion constant, which we now know in terms of our lattice model is δ²/2ε. So the width of the distribution grows like the square root of t: the root mean square value of x is √(2Dt). In terms of our naive microscopic model, we can think about it this way: think of each time step as the time the particle travels a mean free path before getting its direction re-randomized. After it moves a distance δ, it forgets what direction it was moving and moves off in a random direction, either left or right with probability one half — so you can think of δ as being like a mean free path. And how fast are the particles moving? Well, they hop a distance δ, to the left or the right, in time ε, so the speed is δ/ε. At least in our naive model, then, we can think of the diffusion constant from the point of view
The factor of one half we shouldn't take too seriously — it depends on the crude model we've chosen. What is of broader interest is that we can think of the diffusion constant as a mean free path times a speed: the mean free path is how far you travel before you forget what your initial direction of motion was, the speed is how fast you travel, and the product of the two is essentially the diffusion constant, which tells you how quickly the probability distribution spreads. So the formal limit that we took to get a diffusion equation is the limit in which delta goes to zero and epsilon goes to zero, while what we hold fixed is delta squared over two epsilon. Saying delta goes to zero means we're observing the behavior of this fluid on characteristic distance scales which are large compared to the mean free path, and over time scales which are long compared to the typical collision time epsilon. Now, something that's kind of weird about this formal limit: the speed delta over epsilon is twice the diffusion constant — which we're holding fixed — divided by delta, so the speed is actually going to infinity. Another way of saying it, maybe a better way, is to look at x as a function of t in our lattice model. x is hopping back and forth: we start out at the origin and go up, down, up, up, down, up, up, down, down, up, down, and so on, kind of at random. Now consider the behavior of that curve as we take the time interval between jumps to zero. What happens as delta and epsilon go to zero is that x of t is continuous but has no derivative — no first derivative. Another way of saying that: the typical excursion of x grows like the square root of t, and the square root of t is continuous as t goes to zero, but its first derivative is blowing up.
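To see the diverging speed concretely, here is a tiny sketch (mine) that refines the lattice while holding D fixed:

```python
# Refine the lattice: delta -> 0 with D = delta**2 / (2*eps) held fixed.
# The hop speed delta/eps = 2*D/delta then diverges.
D = 0.5
for delta in [1.0, 0.1, 0.01, 0.001]:
    eps = delta**2 / (2 * D)    # time step implied by holding D fixed
    speed = delta / eps         # equals 2*D/delta
    print(delta, eps, speed)    # speed: 1, 10, 100, 1000 (up to rounding)
```

Each tenfold refinement of delta makes the time step a hundred times smaller, so the instantaneous hop speed grows tenfold — the continuum trajectory is continuous but has no finite velocity.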
If you look with a microscope at the instantaneous position of the particle, with better and better resolution, it's always wiggling: you can't take a limit in which the resolution gets arbitrarily fine and the trajectory becomes smooth. Now, of course, that's kind of an artifact of the fact that we've taken this limit of delta going to zero. In practice there really is some cutoff at the collision time: if you finally look with good enough spatial and time resolution that you can see individual collisions, then you see the particle has a trajectory which is differentiable, except for the discontinuous change in the direction of the velocity at each collision. Of course we can also generalize to more dimensions, and in fact it's really the same story. Think about the three dimensional lattice we've already mentioned, where each position on the lattice has six neighbors. In d dimensions there are two d neighbors: there are d lines along which you can move, and along each of them you can move either forward or back — two possibilities along each axis. So if we do the same analysis that we've already done, it's exactly the same thing, except now in each time step the particle hops with equal probability in each of the two d possible directions — a uniform distribution over the two d possible moves. Before we had two ways, each with probability one half; in three dimensions there are six possible ways to move: up or down, left or right, forward or back. So we will again get a diffusion equation. If we take the same formal continuum limit, delta and epsilon going to zero as before, the probability distribution will obey: the time derivative of the probability equals the diffusion constant times gradient squared of P — there's a second derivative like this for each one of the three directions x, y and z — where the diffusion constant in d dimensions will be delta squared over two d times epsilon, instead of two epsilon in the denominator.
That's just coming from the hop probability being one over two d instead of one half for each of the directions. So in particular, in three dimensions, if you want to express the diffusion constant in terms of a mean free path and a speed, it's one sixth times delta times delta over epsilon — one sixth of a mean free path times a speed. Actually, in the book they write down a similar formula but with a one third instead of a one sixth; don't worry about it — both come from kind of crude models, and the important thing is that the diffusion constant, just as we found in the one dimensional model, scales like a mean free path times the speed of the molecules, and that determines how quickly probability spreads out. Probably the easiest way to think about it: in three dimensional diffusion the particle is independently doing a random walk in the x, y and z directions, each of which we can think of as described by flipping an unbiased coin many times. So the distribution we get after a long time, if we start out at the origin, is just the product of the one dimensional distributions we found earlier, one each for the x, y and z directions. In other words, if at t equals zero we have a delta function, normalized to one and supported at the origin, then at time t the probability distribution will be one over four pi D t, raised to the three halves power, times e to the minus r squared over four D t. That's just the product of the probability distributions I wrote down earlier for the x direction, the y direction and the z direction, so in the exponent I get x squared plus y squared plus z squared — in other words, the square of the total distance traveled. And again you can check, by plugging into this three dimensional form of the diffusion equation, that this is a solution. As in one dimension, the distance our particle travels in time t grows like the square root of t.
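A quick numerical sketch of the three dimensional statement (mine; parameters illustrative): hop to one of the six neighbors each step and check that the mean square distance is 6Dt, with D equal to delta squared over six epsilon.

```python
# 3D lattice walk: six equally likely moves per step; check <r^2> = 6*D*t
# with D = delta**2 / (2 * d * eps) and d = 3.
import random

random.seed(1)
delta, eps = 1.0, 1.0
D = delta**2 / (6 * eps)
moves = [( delta, 0, 0), (-delta, 0, 0),
         (0,  delta, 0), (0, -delta, 0),
         (0, 0,  delta), (0, 0, -delta)]

def msd3(n_steps, n_walkers=20000):
    """Mean square displacement <x^2 + y^2 + z^2> after n_steps hops."""
    total = 0.0
    for _ in range(n_walkers):
        x = y = z = 0.0
        for _ in range(n_steps):
            dx, dy, dz = random.choice(moves)
            x += dx; y += dy; z += dz
        total += x*x + y*y + z*z
    return total / n_walkers

t = 100 * eps
r2 = msd3(100)
print(r2, 6 * D * t)        # close to 100 vs exactly 100
```

The factor of one over two d in D and the factor of two d in the mean square distance cancel: every unbiased lattice walk accumulates mean square distance equal to the number of steps times delta squared, regardless of dimension.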
So if we regard the diffusion equation as describing the probability distribution for a diffusing particle, we can think of the path as one that is continuous but not differentiable at any point — the particle keeps changing its mind about which direction to go. Now, Einstein did another great thing in this 1905 paper — which was maybe only his third best paper from that year — and that was to derive the so-called Einstein relation, a relation between the diffusion constant and something called the mobility, involving the temperature. The mobility is a statement about the following: if a particle is diffusing through a medium and we apply a force to it, what terminal velocity does it reach? In other words, if we apply a force, the terminal velocity in the presence of that force is proportional to the force, and the constant of proportionality is called the mobility: terminal velocity equals the mobility times the force you apply to pull the object through the fluid. It's kind of a remarkable relation, because mobility, by this definition, characterizes dissipation — the resistance, the friction, that the particle encounters when it moves through the fluid — while the diffusion constant has to do with fluctuations, the way the distribution spreads out. And we'll see next time why the two are related.
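Getting slightly ahead of next time, here is a sketch (mine, not the lecture's derivation) of the relation D = mobility times tau on a biased version of our one dimensional lattice walk. The extra assumption I'm putting in by hand is that the hop probabilities under a force F satisfy detailed balance, p_right over p_left equal to e to the F delta over tau; for a weak force the mobility then comes out equal to D over tau.

```python
# Biased 1D lattice walk under a weak force F, with hop probabilities
# chosen to satisfy detailed balance: p_right / p_left = exp(F*delta/tau).
import math

delta, eps, tau = 1.0, 1.0, 1.0      # illustrative units
D = delta**2 / (2 * eps)
F = 0.01                             # a weak applied force

w = math.exp(F * delta / tau)
p_right = w / (1 + w)                # detailed-balance hop probabilities
p_left = 1 - p_right

v = (p_right - p_left) * delta / eps # drift (terminal) velocity
mu = v / F                           # mobility, from v = mu * F
print(mu, D / tau)                   # ~0.5 vs 0.5: the Einstein relation
```

Dissipation (the mobility) and fluctuations (the diffusion constant) come out proportional, with the temperature tau as the conversion factor — which is the content of the Einstein relation we'll derive properly next time.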