Alright, today let us formally solve the problem of a continuous-time random walk on a lattice in D dimensions. To be specific, take a hypercubic lattice; we can specialize to D = 1, 2, 3, and so on afterwards. So here is the problem we are going to solve: a continuous-time random walk on a hypercubic lattice in D dimensions. To recapitulate, we already know a lot about this, and what I would like to do is write down the formal solution. It is fairly straightforward to do for a class of continuous-time random walks governed by a renewal process in time. Then we will look at the diffusive behaviour, the first and second moments of the displacement, and understand how diffusion comes about in this setting. All the while we will keep the Markov case as a check on our calculations, because we must recover the known results for the Markovian random walk, which we solved completely. So here is the set-up. Let me denote by j a site of the hypercubic lattice in D dimensions, labelled by integers on the infinite lattice: j is shorthand for (j_1, j_2, ..., j_D), where each component is an integer. That is a lattice point, and by translational invariance we may start the walk at the origin; the walker then takes steps in time. The first question is: what is the probability of being at the site j after n steps? To avoid confusion with what happens in continuous time, let me put the step index as a subscript: P_n(j) is the probability that the walker is at j at step n, given that it was at the origin at step 0. We already know this quantity; it is some combinatorial factor, and it depends on whether or not there is a bias in the problem.
We already know what this quantity is: it is a multinomial coefficient. Remember that in one dimension it was a binomial coefficient, with factors depending on the bias to the right or to the left. We can include a bias in this problem as well. Suppose each direction i (any of the D Cartesian directions) carries bias factors α_i for a step in the +i direction and β_i for a step in the −i direction. Conservation of probability then requires Σ_{i=1}^{D} (α_i + β_i) = 1: each Cartesian component may have a bias, but the total probability of a jump out of a site into any one of its nearest-neighbour sites is 1. Each of these factors is a fraction, and the unbiased case is α_i = β_i = 1/(2D) for every i. In three dimensions, for example, there are six nearest neighbours, and the probability of jumping to each of them (right, left, up, down, and so on) is 1/6. So P_n(j) is a multinomial coefficient, and in fact we know it explicitly through its generating function. Call the generating function G_n(z_1, ..., z_D) = Σ_{j ∈ Z^D} P_n(j) z_1^{j_1} ··· z_D^{j_D}, where the sum runs over all integer values of each component of j. The random variable here is j: given the number of steps n, the components of j are random integers, and we want to know which site the walker ended up at.
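As a quick numerical check of the one-dimensional claim, one can simulate many biased n-step walks and compare endpoint frequencies with the binomial expression C(n, (n+j)/2) α^{(n+j)/2} β^{(n−j)/2}. This is only an illustrative sketch; the function and parameter names are mine, not from the lecture.

```python
import random
from math import comb

random.seed(1)

def endpoint_counts(n, alpha, trials):
    # n-step biased walk on Z: step +1 with probability alpha, else -1
    counts = {}
    for _ in range(trials):
        x = sum(1 if random.random() < alpha else -1 for _ in range(n))
        counts[x] = counts.get(x, 0) + 1
    return counts

n, alpha, trials = 6, 0.7, 200_000
counts = endpoint_counts(n, alpha, trials)
for j, c in counts.items():
    r = (n + j) // 2           # number of rightward steps
    exact = comb(n, r) * alpha**r * (1 - alpha) ** (n - r)
    assert abs(c / trials - exact) < 0.01
```

With 2 × 10^5 trials the sampling error per endpoint is well below the 0.01 tolerance used here.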
And this is the generating function; its coefficients, as we know, are multinomial coefficients. Remember that for one dimension it was (α z + β z^{−1})^n. In the general case, G_n(z_1, ..., z_D) = [Σ_{i=1}^{D} (α_i z_i + β_i z_i^{−1})]^n, because the quantity in square brackets is the generating function for a single step, which can be in any one of the 2D directions. So what you are asking for, when you expand this in powers of each of the z's, is the coefficient of z_1^{j_1} ··· z_D^{j_D}: that coefficient is P_n(j), a multinomial coefficient. Let us give the expression a compact name: G_n(z) = g(z)^n, where z is shorthand for (z_1, z_2, ..., z_D). I want to get the notation straight, which is why I am taking pains over this. In this sense the problem is trivial: once you know how to carry out a multinomial expansion, identifying P_n(j) is immediate. Now let us ask what happens when we put continuous time into this, and the way to do that is very simple.
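Because g(z)^n is a Laurent polynomial in each z_i, the coefficient P_n(j) can also be read off numerically by averaging g(z)^n z^{−j} over the unit circle (discrete Fourier inversion). A sketch for D = 1, with illustrative names; the closed form it is checked against is the binomial coefficient quoted above.

```python
import cmath
from math import comb

def coeff_of_zj(n, j, alpha, N=128):
    # Coefficient of z^j in (alpha z + beta/z)^n, obtained as the average of
    # g(z)^n * z^(-j) over N equally spaced points on the unit circle
    beta = 1.0 - alpha
    total = 0.0 + 0.0j
    for k in range(N):
        z = cmath.exp(2j * cmath.pi * k / N)
        total += (alpha * z + beta / z) ** n * z ** (-j)
    return (total / N).real

alpha, n = 0.4, 7
for j in range(-n, n + 1):
    if (n + j) % 2:
        continue                     # parity: n + j must be even
    r = (n + j) // 2
    exact = comb(n, r) * alpha**r * (1 - alpha) ** (n - r)
    assert abs(coeff_of_zj(n, j, alpha) - exact) < 1e-12
```

N must exceed the spread of powers (here 2n + 1) so that no aliasing occurs.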
Now I ask: what is the probability of being at the site j at time t? Since t is a continuous variable, I write it as an argument, P(j, t). This is obviously P(j, t) = Σ_{n=0}^{∞} P_n(j) w(n, t), where w(n, t) is the probability that exactly n jumps occur in the time interval (0, t); we allow for all possibilities and sum over n. Remember that the events "exactly 10 jumps in time t", "exactly 15 jumps in time t", and so on are mutually exclusive, which is why the terms simply add. I have thereby separated the spatial and temporal parts: all the time dependence sits in w(n, t), while the space part P_n(j) is a pure combinatorial factor. You can generalize this to other lattices; you do not need the hypercubic lattice. You only have to tell me the, quote unquote, generating function (or characteristic function) for a single jump event: on any lattice you can identify the nearest-neighbour vectors and write down the analogue of g(z), and since the jumps are independent of each other, you are then guaranteed that g(z)^n is the generating function for the n-step distribution. You could also ask what happens if the system remembers its previous jump and so on; all of that happens in time, so that input goes into w(n, t), while the jumps themselves remain uncorrelated in space. As it stands, this is a very general framework. Now, what can we say about the generating function for P(j, t)? I need a new symbol, so let us call it L(z, t) = Σ_{j ∈ Z^D} P(j, t) z_1^{j_1} ··· z_D^{j_D}. This is the continuous-time analogue of the earlier generating function: instead of P_n(j) I have P(j, t). Multiplying by z_1^{j_1} ··· z_D^{j_D} and summing over all the components of j, the spatial sum produces exactly G_n(z), which we know explicitly, so L(z, t) = Σ_{n=0}^{∞} w(n, t) g(z)^n. I have not specified anything about w(n, t) yet, but I have a formal expression of this kind. Now w(n, t) is the distribution of the random variable n, the number of jumps in a given time interval t, so we can define a generating function for that quantity as well: H(ξ, t) = Σ_{n=0}^{∞} w(n, t) ξ^n. (I do not want to use the letter z here, because z is reserved for the variables conjugate to the space coordinates.) Once this is done, it is clear by comparison that L(z, t) = H(g(z), t), because L is just this sum with ξ replaced by g(z). So in principle the problem is solved: give me the characteristic function g(z), which depends on the lattice structure (what the nearest neighbours are, and so on), and give me the statistics of the number of jumps in time t; I compute the generating function H, substitute g(z) for ξ, and that gives me the generating function for P(j, t), which is what I want to find.
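In the Markov case everything in this separation formula is explicit: P_n(j) is binomial (take D = 1) and w(n, t) is Poisson, as shown later in the lecture. A sketch that assembles P(j, t) = Σ_n P_n(j) w(n, t) and checks normalization and the mean displacement; the names are illustrative, and the Poisson form of w is assumed here.

```python
from math import comb, exp, factorial

def P_n(j, n, alpha):
    # 1D biased walk: probability of being at j after exactly n steps
    if (n + j) % 2 or abs(j) > n:
        return 0.0
    r = (n + j) // 2
    return comb(n, r) * alpha**r * (1 - alpha) ** (n - r)

def w_poisson(n, t, lam):
    # Markov case: the number of jumps in time t is Poisson with mean lam*t
    return (lam * t) ** n * exp(-lam * t) / factorial(n)

def P_jt(j, t, alpha, lam, nmax=150):
    # P(j, t) = sum_n P_n(j) w(n, t): space and time parts factorize
    return sum(P_n(j, n, alpha) * w_poisson(n, t, lam) for n in range(nmax))

alpha, lam, t = 0.65, 1.0, 3.0
probs = {j: P_jt(j, t, alpha, lam) for j in range(-60, 61)}
assert abs(sum(probs.values()) - 1.0) < 1e-9
# Mean displacement of the biased Markov walk: (alpha - beta) * lam * t
mean = sum(j * p for j, p in probs.items())
assert abs(mean - (2 * alpha - 1) * lam * t) < 1e-9
```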
So once you have this L, a power series in several variables, we can formally write down its inversion: P(j, t) is the coefficient of z_1^{j_1} z_2^{j_2} ··· z_D^{j_D}. The moment you have an expansion of this kind it is a Laurent expansion, because there are positive as well as negative powers of each z, and the coefficient can be written as P(j, t) = [Π_{i=1}^{D} (1/2πi) ∮ dz_i / z_i^{j_i + 1}] H(g(z), t), each integral running around a small circle about the origin in the z_i-plane; you do this for each of the z_i, and in principle you have the answer. You might instead think of differentiating a sufficient number of times with respect to each z_i and evaluating at z_i = 0, but the difficulty is that there are negative powers of z, so that trick is not easily applicable; the contour formula is the more reasonable one. So that is it: this solves the problem formally. But now let us see whether we can do something more, and in particular I want to say something about w(n, t). I want it to be given by a renewal process; that is what a continuous-time random walk is. In other words, there is a certain function, a waiting-time density, which specifies the distribution of the random interval between two successive jumps. The idea is that between times 0 and t the first jump may occur at some instant t_1, the second at t_2, and so on; the interval between successive jumps is specified by a waiting-time density which is the same for all the intervals. So we introduce a waiting-time (or holding-time) density, call it ψ(t). It says: if you start the clock at t = 0, the probability that the next jump occurs between t and t + dt is ψ(t) dt. Being a density, ψ(t) must be non-negative, and its integral from 0 to ∞ must be 1 (wait long enough and a jump is certain to occur); it is a normalized probability density function. These are the properties we require of it. The Markov case is given by an exponential waiting-time density, as I mentioned, and it is not hard to see that w(n, t) is then a Poisson distribution with intensity λ; but we will do it more generally, since ψ does not have to be exponential, it only has to satisfy these conditions. Because the waiting-time density is the same for all the intervals, this is called a renewal process with a common waiting-time density. I could generalize a little by giving the first jump one waiting-time density, the second another, and so on, but in the simplest instance we look at a common waiting-time density, not necessarily exponential. Then it is clear that in this geometry, with t_1 < t_2 < ... < t_n, the instant t_1 lies anywhere between 0 and t_2, t_2 anywhere between t_1 and t_3, and so on, and the probability takes the form of a convolution, because the intervals between successive jumps are the arguments of the ψ's. So it is immediately clear that if you take Laplace transforms and write w̃(n, s), with s the transform variable, you get a product: there is a factor ψ(t_1), a factor ψ(t_2 − t_1), because that is the time you waited for the second jump, and so on up to ψ(t_n − t_{n−1}); it is a convolution of all these factors, each t_i bounded above by t_{i+1}.
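The renewal structure is easy to simulate: draw successive waiting times and count how many jumps land in (0, t). With an exponential ψ the counts should be Poisson, as asserted below for the mean and for w(0, t). A Monte Carlo sketch with illustrative names:

```python
import random
from math import exp

random.seed(0)

def count_jumps(t, lam):
    # One realization of a renewal process with exponential waiting times:
    # accumulate waiting times and count the jumps that fit into (0, t)
    clock, n = 0.0, 0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            return n
        n += 1

lam, t, trials = 1.5, 2.0, 20_000
samples = [count_jumps(t, lam) for _ in range(trials)]
mean_n = sum(samples) / trials
frac_zero = samples.count(0) / trials
# Poisson predictions: <n> = lam*t, and w(0, t) = e^{-lam*t}
assert abs(mean_n - lam * t) < 0.1
assert abs(frac_zero - exp(-lam * t)) < 0.02
```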
So with n of these factors, it is immediately clear that the transform of the convolution is ψ̃(s)^n. But we have to be a little careful: that is not yet w̃(n, s). Ask what happens when n = 0: then you need the probability that no event at all has occurred in the time interval t. If we naively took ψ̃(s)^0, we would get w̃(0, s) = 1, and the inverse Laplace transform of 1 is a delta function, which is not what happens at all. So what is the correct w(0, t)? Nothing should happen in this interval, and a jump occurs with density ψ(t), so w(0, t) = 1 − ∫_0^t dt′ ψ(t′): the integral is the total probability that a jump occurs within time t, and 1 minus that is the probability that no jump occurs. What is the Laplace transform of this? The transform of 1 is 1/s, and the transform of the integral is ψ̃(s)/s (when you differentiate you multiply by s, when you integrate you divide by s). So the correct answer is w̃(0, s) = [1 − ψ̃(s)]/s. Check this against the case we already know, the Markovian case: ψ(t) = λ e^{−λt}, with λ some positive constant. This is guaranteed, and trivial to check, to give for w(n, t) the Poisson distribution, w(n, t) = (λt)^n e^{−λt}/n!.
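The no-jump probability w(0, t) = 1 − ∫_0^t ψ(t′) dt′ and its Markov special case e^{−λt} can be checked with a few lines of numerical quadrature. A sketch; the midpoint rule and the names are my own choices.

```python
from math import exp

def w0(t, psi, steps=200_000):
    # w(0, t) = 1 - integral of psi from 0 to t: probability of no jump
    # up to time t; the integral is done by the midpoint rule
    h = t / steps
    return 1.0 - h * sum(psi((k + 0.5) * h) for k in range(steps))

lam = 1.3
psi_exp = lambda u: lam * exp(-lam * u)   # Markov (exponential) waiting density
# Markov check: w(0, t) should equal e^{-lam t}
assert abs(w0(2.0, psi_exp) - exp(-lam * 2.0)) < 1e-8
```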
So what is w(0, t)? It is e^{−λt}, and it should come out of our formula: ψ̃(s) = λ/(s + λ) implies [1 − ψ̃(s)]/s = [s/(s + λ)]/s = 1/(s + λ), whose inverse transform is exactly e^{−λt}. So that checks; [1 − ψ̃(s)]/s is indeed the correct answer for w̃(0, s). Otherwise you end up with the inconsistency of a delta function at t = 0, as if the walker jumps immediately, which is not true. Now, what is the correct expression for w̃(n, s) in general? The convolution took into account the n jumps, but we must be careful: w(n, t) is the probability that exactly n jumps occur in the time interval t, so after the n-th jump nothing further should happen up to time t. If I call the probability that nothing happens in an interval φ(t), it is just w(0, t), and that factor also enters the convolution. The correct expression is therefore w̃(n, s) = ψ̃(s)^n [1 − ψ̃(s)]/s; and of course for n = 0 it reduces to the result we just checked. So we are all set: we have an expression for w̃(n, s), and therefore we know what H is. We had H(ξ, t) = Σ_{n=0}^{∞} w(n, t) ξ^n, so taking the Laplace transform, H̃(ξ, s) = Σ_{n=0}^{∞} w̃(n, s) ξ^n = {[1 − ψ̃(s)]/s} Σ_{n=0}^{∞} [ξ ψ̃(s)]^n, which is a geometric series.
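The geometric series can be summed in closed form, and in the Markov case the result collapses further; both steps are easy to confirm numerically. A sketch assuming the exponential ψ̃(s) = λ/(s + λ), with my own function names:

```python
def psi_tilde(s, lam):
    # Laplace transform of the exponential waiting density lam * e^{-lam t}
    return lam / (s + lam)

def h_tilde_series(xi, s, lam, nmax=2000):
    # H~(xi, s) = sum_n w~(n, s) xi^n with w~(n, s) = psi~^n (1 - psi~)/s
    p = psi_tilde(s, lam)
    return sum(p ** n * (1 - p) / s * xi ** n for n in range(nmax))

def h_tilde_closed(xi, s, lam):
    # The geometric series summed: (1 - psi~) / (s (1 - xi psi~))
    p = psi_tilde(s, lam)
    return (1 - p) / (s * (1 - xi * p))

lam, s, xi = 1.2, 0.8, 0.9
assert abs(h_tilde_series(xi, s, lam) - h_tilde_closed(xi, s, lam)) < 1e-12
# Markov consistency: with xi -> g(z), the closed form is 1/(s + lam(1 - g))
g = 0.7
assert abs(h_tilde_closed(g, s, lam) - 1.0 / (s + lam * (1 - g))) < 1e-12
```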
So that is trivially summed: H̃(ξ, s) = [1 − ψ̃(s)] / {s [1 − ξ ψ̃(s)]}. Therefore, finally, we have the Laplace transform of the generating function of P(j, t): replacing ξ by g(z), L̃(z, s) = [1 − ψ̃(s)] / {s [1 − g(z) ψ̃(s)]}. Quote unquote, all I have to do now is invert this Laplace transform and then invert the Laurent series itself; but in principle that is the solution, and it finishes the problem completely and explicitly. What we need to do now is to ask how to extract information from it. But first let us quickly check the Markovian case and see whether this works. With ψ̃(s) = λ/(s + λ), the numerator 1 − ψ̃(s) = s/(s + λ) and the factor of s cancels, leaving L̃(z, s) = 1/{(s + λ)[1 − λ g(z)/(s + λ)]} = 1/[s + λ − λ g(z)] = 1/[s + λ(1 − g(z))]. And of course we can invert this transform in time: 1/(s + constant), with the constant independent of s, inverts to e^{−(constant) t}. So L(z, t) = e^{−λt[1 − g(z)]} = e^{−λt} e^{λt g(z)}. In the case we are looking at, on the hypercubic lattice, this is e^{−λt} exp[λt Σ_{i=1}^{D} (α_i z_i + β_i z_i^{−1})]. The usual trick is to take out √(α_i β_i): write α_i z_i + β_i z_i^{−1} = √(α_i β_i) [√(α_i/β_i) z_i + √(β_i/α_i) z_i^{−1}]. And that is the generating function for the modified Bessel functions; recall, exactly as with the Skellam distribution, that e^{(ξ/2)(z + 1/z)} = Σ_{j=−∞}^{∞} I_j(ξ) z^j, where I_j is the modified Bessel function of integer order, and I_{−j} ≡ I_j. So the answer is a product: P(j, t) = e^{−λt} Π_{i=1}^{D} (α_i/β_i)^{j_i/2} I_{j_i}(2λt √(α_i β_i)); we should not forget the factor (α_i/β_i)^{j_i/2} sitting there. That is the solution to the biased random walk on the hypercubic lattice in the Markovian case, and the general formula reproduces it immediately. But you do not always have such a simple function to invert: unless ψ̃(s) is something simple like λ/(s + λ), inverting is not a trivial matter, and in general the solution can be much more complicated. In particular, we are interested in the asymptotic behaviour of the mean displacement and of the mean squared displacement, or the variance. We know by symmetry that if there is no bias in the problem, every α_i = β_i = 1/(2D), then the mean displacement should go to 0. Let us see whether that emerges from the general expression even without inverting the transform. What I want is any one component: ⟨j_i⟩(t) = Σ_j j_i P(j, t), whose Laplace transform is Σ_j j_i P̃(j, s). I need to pull down a factor of j_i; how do I do that from the generating function we already have?
Recall the generating function: L̃(z, s) = Σ_{j ∈ Z^D} P̃(j, s) z_1^{j_1} ··· z_D^{j_D}. To pull down one power of j_i, differentiate with respect to that particular z_i and then set all the z's equal to 1: the Laplace transform of ⟨j_i⟩(t) is ∂L̃/∂z_i evaluated at z_1 = z_2 = ... = z_D = 1. Now recall the formula L̃(z, s) = [1 − ψ̃(s)] / {s [1 − g(z) ψ̃(s)]}; we have to differentiate this and then set all the z-components equal to 1. Differentiating, ∂L̃/∂z_i = [1 − ψ̃] ψ̃ (∂g/∂z_i) / {s [1 − g ψ̃]²}. Note that g(1) = Σ_i (α_i + β_i) = 1, so after differentiating we set g = 1 and one factor of (1 − ψ̃) cancels between top and bottom. As for ∂g/∂z_i: the z_i-dependence of g has the form α_i z_i + β_i/z_i plus terms independent of z_i, so ∂g/∂z_i = α_i − β_i/z_i², which at z = 1 equals α_i − β_i. It is immediately clear, then, that the result is proportional to (α_i − β_i): the Laplace transform of ⟨j_i⟩(t) is (α_i − β_i) ψ̃(s) / {s [1 − ψ̃(s)]}. Indeed it should be, because if there is no bias in a given direction, that particular coordinate's average value must be 0; this is where that comes from.

Moreover, for a Markovian walk we know the mean displacement must be linear in t, and that emerges too. Note that ψ̃(0) = ∫_0^∞ ψ(t) dt = 1 by normalization. So we have to be careful: the factor 1 − ψ̃(s) vanishes at s = 0, and we must ask for the small-s behaviour of ψ̃(s). In the Markov case, ψ̃(s) = λ/(s + λ) = (1 + s/λ)^{−1} = 1 − s/λ + O(s²) near s = 0. Putting this in: the factor ψ̃ goes to 1 upstairs, while s[1 − ψ̃(s)] ≈ s²/λ downstairs, so the whole thing goes like λ(α_i − β_i)/s² for small s. What is the inverse Laplace transform of 1/s²? A ramp function: t. (Remember that t^α transforms to Γ(α + 1)/s^{α+1}, so large-t behaviour like t^α corresponds to small-s behaviour like 1/s^{α+1}; that is a useful thing to remember.) So ⟨j_i⟩(t) grows like λ(α_i − β_i) t as t → ∞. And the exponential form is not essential: as long as ψ̃(s) is analytic at s = 0, the leading term must be 1 by conservation of probability, a Taylor series exists, and the next term is of order s, so you immediately get this linear growth.

Now look at the diffusive behaviour. Strictly we should look at the variance, but let us keep things simple and look at the mean squared displacement, and switch off the bias: every α_i = β_i = 1/(2D), so g(z) = (1/2D) Σ_{i=1}^{D} (z_i + z_i^{−1}). We want ⟨j_i²⟩(t), that is, its Laplace transform. To pull down j_i twice, differentiate twice: if you differentiate once you get a j_i, and the second time a factor (j_i − 1), so ∂²L̃/∂z_i² at z = 1 gives the factorial moment ⟨j_i(j_i − 1)⟩ = ⟨j_i²⟩ − ⟨j_i⟩; and with no bias, ⟨j_i⟩ = 0. Also, no bias implies ∂g/∂z_i = 0 at z = 1, because differentiating g with respect to z_i gives (1/2D)(1 − 1/z_i²), which vanishes at z_i = 1. Therefore it suffices to take the second derivative of L̃, and that tells us the long-time behaviour immediately. Differentiating once puts ψ̃ (∂g/∂z_i) on top over [1 − gψ̃]²; differentiating a second time, the term that would bring down another factor of ∂g/∂z_i does not contribute at z = 1, and only the second derivative of g survives. But ∂²g/∂z_i² = (1/2D)(2/z_i³), which at z_i = 1 is just the number 1/D. Collecting everything, one factor of (1 − ψ̃) cancels as before, and the Laplace transform of ⟨j_i²⟩(t) is (1/D) ψ̃(s) / {s [1 − ψ̃(s)]}. What does this do for small s, which tells us what the inverse transform does at long t? The factor ψ̃(s) → 1 as s → 0, so that is harmless; everything hinges on 1 − ψ̃(s). If ψ̃(s) = 1 − c s + O(s²) is analytic at s = 0, then s[1 − ψ̃(s)] behaves like c s², and the inverse Laplace transform of 1/s² is t: normal diffusion, ⟨j_i²⟩ ∝ t. Everything therefore depends on the small-s behaviour of ψ̃, that is, on the large-t behaviour of the waiting-time density: if ψ is exponential, or anything whose Laplace transform is analytic at s = 0 with first correction of order s, you have diffusive behaviour. But suppose instead that ψ̃(s) = 1 − c s^β + ..., with 0 < β < 1, which is not analytic (like √s, for example). Then the whole thing goes like 1/s^{1+β}, and the mean squared displacement grows like t^β: subdiffusive. This is the famous anomalous diffusion, and it depends entirely on the long-time tail of ψ(t). What does a tail mean for the transform? If, for instance, ψ(t) went like 1/t^{3/2} at long times, then (recalling the correspondence between t^α and s^{−α−1}) the transform picks up a non-analytic piece of order s^{1/2}: ψ̃(s) = 1 − c√s + .... That immediately leads to subdiffusive behaviour: the mean squared displacement grows like the square root of the time rather than the time itself, so the root-mean-squared displacement grows like t^{1/4}, slower than normal. There is a huge literature on anomalous diffusion and its practical implications, and we will say a little more about it; but this is one mechanism by which a huge variety of behaviour emerges, and it is essentially exact: you solve the problem on an arbitrary lattice, and everything finally depends on the long-tail behaviour of the waiting-time density. If ψ(t) has a long tail, you have anomalous diffusion; if on the other hand it cuts off exponentially, its Laplace transform is analytic at s = 0 and you have normal diffusion. The process is still non-Markovian as soon as the waiting-time density is not exponential, but it leads to anomalous diffusion only when ψ has a long tail. Notice that we did all this without actually writing down a master equation for P(j, t), which we cannot do in most non-Markovian cases. We did it by the trick of writing down generating functions, treating the statistics of the jump times separately from the geometrical problem of finding out where you are on the lattice. We can go ahead with this and find first-passage-time distributions, recurrence properties, and so on; those are matters of detail. But I thought this would give you some idea of how anomalous diffusion arises, and where the non-analytic behaviour comes from. We stop here today.
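As a closing numerical illustration of the subdiffusion just described, here is a Monte Carlo sketch of an unbiased one-dimensional continuous-time random walk. The waiting-time density ψ(t) = (1/2) t^{−3/2} for t ≥ 1 is one convenient choice with a non-analytic transform, ψ̃(s) = 1 − c√s + ..., so ⟨x²⟩ should grow like t^{1/2}; all names and parameter values are my own choices.

```python
import random
from math import log

random.seed(42)

def ctrw_positions(t_obs, walkers=2000):
    # Unbiased 1D CTRW with Pareto waiting density psi(t) = (1/2) t^(-3/2),
    # t >= 1 (tail exponent 1/2, so subdiffusion is expected)
    results = {t: [] for t in t_obs}
    times = sorted(t_obs)
    for _ in range(walkers):
        clock, x, k = 0.0, 0, 0
        while k < len(times):
            u = 1.0 - random.random()            # u in (0, 1], avoids u == 0
            clock += u ** -2.0                   # Pareto(1/2) waiting time
            while k < len(times) and clock > times[k]:
                results[times[k]].append(x)      # position just before this jump
                k += 1
            x += random.choice((-1, 1))
    return results

t1, t2 = 1.0e3, 1.0e5
res = ctrw_positions([t1, t2])
m1 = sum(x * x for x in res[t1]) / len(res[t1])
m2 = sum(x * x for x in res[t2]) / len(res[t2])
exponent = log(m2 / m1) / log(t2 / t1)
# Anomalous diffusion: <x^2> ~ t^(1/2), so the exponent is near 0.5, not 1
assert 0.3 < exponent < 0.7
```

Replacing the Pareto draw with an exponential waiting time (for example `random.expovariate(1.0)`) turns this into the Markov case, and the measured exponent comes out near 1 instead.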