We were talking about the idea of a random walk dimension, and I had started mentioning that on structures like fractals the walk dimension can be other than 2. Let me show you with a simple example how this comes about, and after that we will go back a little and talk about the problem of recurrence in Markov chains in general, with application to random walks. Recall the scaling relation I mentioned: the square of the distance goes like the time to the power 2/dw, where either quantity could be the random variable and a suitable average is meant. For a given time you could ask for the mean square distance covered; or, for a given distance, you could ask for the mean time it takes to cover that distance. Either way, when you have a scaling relation of this kind, one says that the random walk dimension of that particular process is dw. For normal diffusion dw turns out to be 2, because we know r squared scales like t. A formal expression for dw can be found in the following way. Suppose you have two distances, one of unit length and one double that, and the corresponding mean times are t1 and t2. Then dw = log(t2/t1)/log 2. So: how long does it take to cover twice the distance? t1 is the time taken to cover some distance and t2 the time taken to cover twice that distance; take the ratio of the two, take the log, divide by log 2, and that gives you dw.
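As an aside, a minimal numerical sketch (not part of the original lecture): the definition of dw can be checked for the ordinary unbiased walk on a line by Monte Carlo, measuring the mean first-exit time from an interval of half-width L and from one of half-width 2L. The exact mean exit times are L squared and (2L) squared, so the estimated dw should come out close to 2.

```python
import math
import random

def mean_exit_time(L, trials=5000, seed=0):
    """Mean number of steps for an unbiased walker started at 0
    to first reach +L or -L."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x, t = 0, 0
        while abs(x) < L:
            x += rng.choice((-1, 1))
            t += 1
        total += t
    return total / trials

t1 = mean_exit_time(4)                 # exact answer: 4^2 = 16
t2 = mean_exit_time(8)                 # exact answer: 8^2 = 64
dw = math.log(t2 / t1) / math.log(2)   # estimated walk dimension
print(dw)                              # close to 2 for normal diffusion
```

On a fractal substrate the same two measurements would give a ratio t2/t1 different from 4, and hence a dw different from 2.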
So here is what we would like to do now: look at this problem on a hierarchical structure. We define a procedure for constructing a deterministic fractal, the Sierpinski gasket in two dimensions; there are generalizations in higher dimensions as well. Let me first tell you what the original motivation for the construction was. If you have a smooth curve, then any point on it has two nearest neighbours at a given small distance from it; but if the curve has a self-intersection, then at that point there are four such nearest neighbours. The question was whether you could find a curve, or some construction, in which every point has four nearest neighbours in this specific sense. Sierpinski's idea was the following. Start with an equilateral triangle (it does not have to be equilateral, but take it so for simplicity), divide it into four smaller triangles, and remove the middle one, an open inverted triangle. At the next stage do the same thing for each of the three triangles that remain, so the middle of each of those is now missing too, and so on. You end up with a structure with more and more holes in it, of all sizes. (At each stage there is also a scaling up, so that the figure looks bigger and bigger, or a scaling down; we do not care which.) What we are interested in is essentially a graph: I will focus only on the nodes of whatever remains and call that the Sierpinski graph in two dimensions; again there are higher-dimensional generalizations. So basically you start with this simplex, the triangle, break it into four smaller simplexes keeping three of them, each of those gets broken up in turn, and so on.

One way of constructing the graph is as follows. Start with a triangle of some size, and decorate each bond, each edge, of this triangle with a new site at its midpoint. So in generation 0 you have three sites, call them 1, 2, 3; in generation 1 you decorate the midpoints and have sites 1 through 6; in the next generation you have 15 sites; and so on. Calling the generations 0, 1, 2, 3, and so on, the number of sites increases exponentially with n, essentially like 3 to the n. The question is: what happens if you do a random walk on this? Blow up the triangle appropriately at each stage, so that the side of the smallest triangle is always unity. Then the question asked is: to go twice as far on this structure, how long does it take on the average? The graph is completely deterministic: given generation n minus 1, you get generation n by decorating the midpoint of every edge with a new site and adding those nodes to the graph. So start a random walker at site 1, jumping only to nearest neighbours at every step. How long does it take to go unit distance? In generation 0, sites 2 and 3 are unit distance away; if they are traps, then the walker's very first jump, with equal probability to either of them, ends the walk. The mean time taken to reach this trap or that trap is therefore T1 = 1, and that is the end of it.

Now ask what happens in the next generation. Again you start at site 1, but now you would like to go twice the distance, to either site 4 or site 6, the far corners: that is where the traps are in the scaled-up graph. The rule is the same: a random walker on this graph jumps from a site to each of its nearest-neighbour sites with equal probability; if the site has 2 nearest neighbours, with probability one-half each. But notice that apart from the three corner sites, every site of this graph has 4 nearest neighbours; the coordination number is 4 in general, except at the corners. So I start at 1 and ask: what is the mean time for the walker to hit either 4 or 6? Once it hits either of them, the walk is over. We can proceed exactly as we did for the linear lattice, except that we must now allow for the possibility that the walker gets stuck in the loops, wanders around for a long time, goes back to 1, and so on, before eventually hitting a trap. Recall that on a linear lattice, to go twice as far took just 4 times as long: starting at 0, the mean time to reach site J for the first time was J squared, hence 4 J squared to reach site 2J. What is it here? Let us write the equations down. With probability one-half the first jump takes you to 2, and with probability one-half to 3, and either way you have used up one time step; so T1 = (1/2)(T2 + 1) + (1/2)(T3 + 1). That is the mean time to hit the traps starting from site 1. But what about T2 and T3? Here a little symmetry helps: the two traps are situated completely symmetrically, so T2 must equal T3, and I do not have to write equations for everything. This implies T1 = T2 + 1. What is the recursion relation for T2 itself? From site 2 the possible places to jump to are 1, 3, 5 and 4. If you jump back to 1 you have wasted a time step, but you are back to the situation described by T1. So T2 = (1/4)(T1 + T3 + T5 + T4) + 1, the 1 being (1/4) times 4: one time step used up whichever way you jump. Now T3 is the same as T2, so we can replace it; T4 = 0 by definition, because 4 is a trap, so that term is gone; T5 belongs to a genuinely distinct site, and we need one more equation for it. From 5 you jump with probability one-quarter each to 2, 3, 4 or 6; the mean times from 2 and 3 are equal, which gives (1/2)T2; T4 and T6 are 0, and you go to those traps with total probability one-half; and one time step is used up in any case. So T5 = (1/2)T2 + 1. The "+1" in each equation is simply conservation of probability: the jump probabilities over all the nearest neighbours add up to 1, and one time step has been used up wherever you jump. So we have three equations for the three unknowns T1, T2 and T5, and you can solve this set of equations fairly trivially: T1 is in fact 5.

So on this graph, starting at the apex A, the mean time to reach either far corner B or C, that is, to go twice as far, is 5 times as long. At the next generation, to go from the apex to either far corner, which is now 4 unit steps away, the mean time will be 25. You have to work out the full random walk problem over all possible excursions of the walker; there are lots of loops in which the walker can get stuck for a very long time before eventually hitting B or C. The graph is finite and connected, so there is no doubt that the walk eventually hits one of them; but the mean time to do so is now 25. Each time you double the distance, the mean time is multiplied by 5. So what is d_walk on this structure? We found t2 = 5 t1, so dw = log 5/log 2, which is greater than 2. It immediately follows that on this structure r squared goes like t to the power 2/dw, and since dw = log 5/log 2 exceeds 2, r squared goes like a power of t which is less than 1. The process is sub-diffusive. You can ask: is this an artifact of starting at such a nice symmetric point? Start instead at some interior site and look at its four nearest-neighbour points. The mean time to go from there to any one of them is of course 1: the first step and you are off. Now ask for the mean time to reach the points twice as far away, on the graph scaled up by a factor of 2, and the answer will turn out to be 5 once again. I urge you to work it out on this graph: there are now more interesting boundary points, and there could be lots of meandering before you hit them, but you will again discover that the walk dimension is log 5/log 2.

By the way, what is the fractal dimension of this structure? Remember how this goes: you break the object into pieces of size epsilon and count the number of pieces N(epsilon) needed. Exactly as we saw for the triadic Koch curve, what is happening here is that one triangle is replaced by 3 triangles, each of size one-half of the original (the decoration puts a point at the midpoint of each edge). Therefore d_fractal = log N(epsilon)/log(1/epsilon) = log 3/log 2, with epsilon = 1/2. So the fractal dimension is, as you would expect, between 1 and 2: greater than 1, less than 2, because you are on the Euclidean plane. You could play this game not with triangles but with higher simplexes, for example tetrahedra in 3-dimensional Euclidean space: take a tetrahedron, place at each vertex a tetrahedron of half the size, and iterate; I will not draw it here. What would the fractal dimension of that be? In D Euclidean dimensions, with D greater than or equal to 2, the fractal dimension turns out to be log(D + 1)/log 2, which in 2 dimensions is the log 3/log 2 we found. And the walk dimension? That requires a little harder work: here the corner sites had 2 nearest neighbours, but in the more complicated graphs built from tetrahedra and so on you end up with a larger number of nearest neighbours, and you have to include all of that. It turns out that the walk dimension is log(D + 3)/log 2, and that is why it came out as log 5 here, with D = 2.
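As a sketch (my illustration, not part of the lecture), the three mean-time equations above can be solved exactly, and the result T1 = 5 checked by direct simulation on the generation-1 graph. The site labels follow the lecture's convention, which I take to be: 1 the apex, 2 and 3 the midpoints next to it, 5 the midpoint of the base, 4 and 6 the trap corners.

```python
import random
from fractions import Fraction as F

# Exact solution of the mean-first-passage equations
#   T1 = T2 + 1,  T2 = (1/4)(T1 + T2 + T5) + 1,  T5 = (1/2)T2 + 1
# (using T3 = T2 by symmetry and T4 = T6 = 0 at the traps).
# Substituting the first and third into the second: T2 = (5/8)T2 + 3/2.
T2 = F(3, 2) / (1 - F(5, 8))
T1, T5 = T2 + 1, F(1, 2) * T2 + 1
print(T1, T2, T5)                      # 5 4 3

# Monte Carlo check on the generation-1 Sierpinski graph.
neighbours = {
    1: (2, 3),            # apex: coordination number 2
    2: (1, 3, 4, 5),      # interior sites: coordination number 4
    3: (1, 2, 5, 6),
    5: (2, 3, 4, 6),
}
traps = {4, 6}

def hitting_time(rng):
    site, t = 1, 0
    while site not in traps:
        site = rng.choice(neighbours[site])
        t += 1
    return t

rng = random.Random(1)
trials = 50_000
mean_t = sum(hitting_time(rng) for _ in range(trials)) / trials
print(mean_t)                          # close to the exact value 5
```

The simulated mean sits near 5 even though individual walks can loop for a long time, which is the point of the renewal argument in the text.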
So for the Sierpinski fractal in D Euclidean dimensions you can write down the fractal dimension, you can write down the walk dimension, and so on. The essential point is that in these structures, with lots of nooks and crannies, it takes longer to cover a given distance than it does on a regular lattice, and that is why the motion is sub-diffusive. Even though this is a Markov process, you cannot write down the usual Gaussian solutions of the diffusion equation in the continuum approximation; this process is not going to lead to the conventional kind of diffusion in the continuum limit, but to a very peculiar sub-diffusive behaviour. We will talk a little about it when I discuss non-Markovian diffusion, in terms of what are called continuous-time random walks; we will come back to this graph and see what is needed in order to understand an arbitrary random walk on graphs of this kind completely. Let me go now to a topic which I should have covered when I talked about Markov chains, but which I have mentioned off and on, and that has to do with recurrence in Markov chains: when does the system come back to its starting point? In particular, we have been concerned with random walks on the linear lattice, and I have said things like: the probability of a return to the origin is not one if the random walk has a bias, and is equal to one when there is no bias at all. We need to establish this. I have already set the stage for it by writing the renewal equation for the first-passage time density, but let us do it in the language of discrete-time Markov chains, so that you have a clear understanding of where this comes from.
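Before leaving the gasket: the recursive construction of the graph and the site counts 3, 6, 15, … quoted above (growing like 3 to the n) can be sketched as follows. This is an illustrative construction of my own, with the outer corners placed at integer lattice points so that every midpoint taken during the subdivision is exact.

```python
def gasket_edges(n):
    """Edges of the generation-n Sierpinski graph in two dimensions.
    The outer triangle has corners (0,0), (2^n,0), (0,2^n), so all
    midpoints computed during the subdivision are exact lattice points."""
    def mid(p, q):
        return ((p[0] + q[0]) // 2, (p[1] + q[1]) // 2)
    def subdivide(a, b, c, depth):
        if depth == 0:
            return {frozenset(e) for e in ((a, b), (b, c), (c, a))}
        ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
        return (subdivide(a, ab, ca, depth - 1)
                | subdivide(ab, b, bc, depth - 1)
                | subdivide(ca, bc, c, depth - 1))
    s = 2 ** n
    return subdivide((0, 0), (s, 0), (0, s), n)

for n in range(4):
    degree = {}
    for e in gasket_edges(n):
        for v in e:
            degree[v] = degree.get(v, 0) + 1
    print(n, len(degree), sorted(set(degree.values())))
    # site counts 3, 6, 15, 42, growing like 3^n; for n >= 1 every
    # site has coordination number 4 except the three corners (degree 2)
```

The degree check confirms the statement in the lecture that the generic site has four nearest neighbours, with the corners as the only exceptions.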
So let us call this topic recurrence and first passage in Markov chains. I have in mind a Markov chain whose states are labelled by some integer j, with time discrete, labelled by n. We have been talking about the conditional probability P(k, n | j) of hitting a state k at time n, given that you started in state j at time 0. We will consider stationary Markov chains, so the time origin can be shifted as you please; time is measured in discrete units, n is the label for time, and k and j are state labels. So P(k, n | j) is the probability that, if you started at time 0 in state j, you are in state k at time n after a Markov sequence of transitions; for compactness let me write it in this form. Now it is clear that if I start at j and am at state k at time n, I must have hit state k for the first time at some time in between, either 1 or 2 or 3 and so on. And hitting it for the first time at a particular time is a unique event, in the sense that hitting it for the first time at time 3 excludes hitting it for the first time at time 2. This is not true of the events described by P itself, since you can be at k at several different times; but the first time I hit k, that is always a unique event. So I can decompose this probability P(k, n | j) by asking when I hit state k for the first time. Let f(k, n | j) denote the probability that, starting from j, I hit k for the first time at time n. Suppose I hit it for the first time at time 1: the contribution to P is f(k, 1 | j) multiplied by P(k, n - 1 | k), because it is a Markov process, it renews itself: after that first hit I start afresh at k, and in the remaining n - 1 time units I wander around and come back to k. This is the contribution from the event in which I start at j and hit k for the first time at time 1; but to this I must add f(k, 2 | j) P(k, n - 2 | k), because these two events are mutually exclusive.
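As a numerical sketch (my illustration, not the lecturer's) of the decomposition being built here: for the unbiased walk on a line one can compute P and the first-passage probabilities f independently, by propagating exact probabilities, and verify the renewal sum P(k, n | j) = sum over m of f(k, m | j) P(k, n - m | k) term by term. Here I take j = 0 and k = 1.

```python
from fractions import Fraction as F

def evolve(dist):
    """One step of the unbiased walk: split each site's mass in half."""
    new = {}
    for x, w in dist.items():
        for y in (x - 1, x + 1):
            new[y] = new.get(y, F(0)) + w / 2
    return new

N, K = 8, 1
dist = {0: F(1)}          # unrestricted walk from j = 0
p_from0 = [F(0)]          # p_from0[n] = P(K, n | 0); P(1, 0 | 0) = 0
distK = {K: F(1)}         # unrestricted walk restarted from K
p_fromK = [F(1)]          # p_fromK[n] = P(K, n | K); P(K, 0 | K) = 1
adist = {0: F(1)}         # walk from 0 with K made absorbing
f = [F(0)]                # f[n] = f(K, n | 0), first-passage probability
for n in range(1, N + 1):
    dist = evolve(dist)
    p_from0.append(dist.get(K, F(0)))
    distK = evolve(distK)
    p_fromK.append(distK.get(K, F(0)))
    adist = evolve(adist)
    f.append(adist.pop(K, F(0)))   # mass arriving at K is absorbed

for n in range(1, N + 1):
    assert p_from0[n] == sum(f[m] * p_fromK[n - m] for m in range(1, n + 1))
print("renewal identity verified up to n =", N)
print(f[1], f[3], f[5])            # 1/2 1/8 1/16
```

The assertion is exactly the mutually-exclusive decomposition of the text, with the m = n term carrying the factor P(K, 0 | K) = 1.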
So what I am doing is to decompose P(k, n | j) into mutually exclusive events according to the time of first hitting the final state: P(k, n | j) = f(k, 1 | j) P(k, n - 1 | k) + f(k, 2 | j) P(k, n - 2 | k) + ... + f(k, n - 1 | j) P(k, 1 | k), plus one more possibility, namely that I hit k for the first time at time n itself, which contributes f(k, n | j); and that exhausts all the possibilities. If you like, this last term carries a factor P(k, 0 | k), but that is identically equal to 1: the probability that I am at k at time 0, if I start at k at time 0, is 1 by definition, so I do not need to write it. This is the discrete analogue of the renewal equation that I wrote down earlier. Now of course the obvious thing to do is to define generating functions. Define pi_kj(z) = sum from n = 1 to infinity of P(k, n | j) z^n, and similarly for the first-passage probability, phi_kj(z) = sum from n = 1 to infinity of f(k, n | j) z^n: multiply by z^n and sum from 1 to infinity, the usual way to define a generating function. Remember that the notation is such that the second label is the initial state and the first is the final state; I retain that order throughout. Then the renewal relation implies pi_kj(z) = phi_kj(z) pi_kk(z) + phi_kj(z), the last term arising because the "first hit at time n itself" piece has no factor of P in it; in other words pi_kj(z) = phi_kj(z) [1 + pi_kk(z)]. (In the continuum case I took Laplace transforms and used the convolution theorem, because time was continuous; here time is discrete, so the generating function does the same job.) This gives phi_kj(z) = pi_kj(z) / [1 + pi_kk(z)], a relation between the generating functions. So if you solve the Markov chain equation for the conditional probability and find its generating function, then you have found the generating function for the first passage from j to k: you have found the distribution itself.

Now, when can you assert that first passage from j to k is a sure event? Remember that the f's at different n are mutually exclusive events. Not so for the P's: at different n those are not mutually exclusive, there will be lots of overlaps. But the statement that you go from j to k for the first time at time n is distinct from the statement that you do so for the first time at time n prime; the different n's are mutually exclusive events, and I used that in writing the renewal equation. First passage from j to k being a sure event, never mind when, means that the sum of all these f's over n must equal 1: at some time or the other the state is hit for the first time. So first passage from j to k is a sure event, probability equal to 1, if and only if the sum from n = 1 to infinity of f(k, n | j), which is phi_kj(1), equals 1: put z = 1 and you must get 1. Otherwise the first-passage time is not a proper random variable; its distribution is not normalized to unity. Now in Markov chains in which every state is connected to every other state, and there are no traps or cyclic sub-chains and so on, it will always happen that if you wait long enough, summing over all n, the state is definitely hit; we will come back to that. But now let us ask about recurrence. I start in some initial state and ask: do I ever come back to this state or not? Recurrence corresponds to setting k equal to j.

So consider recurrence of the state j. The relevant generating function is phi_jj(z) = sum from n = 1 to infinity of f(j, n | j) z^n, and recurrence is a sure event if and only if this equals unity at z = 1. We know that phi_jj(z) = pi_jj(z) / [1 + pi_jj(z)]. These are all sums of probabilities multiplied by z^n, so for z a positive real number every term is positive, and the right-hand side clearly looks like a proper fraction. How can it be equal to 1 at z = 1? Only if pi_jj(1) diverges; that is the only possibility. So recurrence of j is a sure event if and only if the limit of pi_jj(z) as z tends to 1 is infinity, in other words if and only if the sum from n = 1 to infinity of P(j, n | j) diverges. Only then will it happen. Let us check this out. Let the Markov chain be a random walk on an infinite linear lattice. This is a completely translation-invariant process, so we can call any site the origin. So let us look at return to the origin on a linear lattice, with sites ..., -2, -1, 0, 1, 2, ..., and let us look at a biased walk, the more general case: the probability of a jump to the right is alpha and the probability of a jump to the left is beta, where alpha + beta = 1. (To answer the question that was asked: yes, n = 0 is the starting point, with P(j, 0 | j) = 1 by definition, and the sums over n run from 1 onwards; that is the meaning of recurrence, you have to let time run on. The walker jumps at the end of every time step, so, for instance, going out on the first step and coming back on the second contributes to f(j, 2 | j). And on this lattice it is clear that you can come back to the origin only on an even step: f(j, n | j) vanishes for all odd n, which should emerge automatically. You cannot come back in 3 steps; you come back in 2 steps, or 4, or 6, and so on.) So we need to examine the sum of P(0, n | 0) over n. If this series is finite, then in the ratio phi_00(1) = pi_00(1) / [1 + pi_00(1)] both numerator and denominator are finite numbers and the ratio is not equal to 1; if on the other hand it diverges, we can be sure that return is a recurrent, sure event. So all we need to do is to compute this number.

So let us work out pi_00 for the biased random walk; without loss of generality I take j = 0, any site at all. pi_00(1) = sum from n = 1 to infinity of P(0, n | 0), where P(0, n | 0) is the probability that at time step n I am at the origin, having started from the origin at time 0. Recall the binomial result: to be at site j at time n you need (n + j)/2 steps to the right and (n - j)/2 to the left, so P(j, n | 0) = C(n, (n + j)/2) alpha^((n + j)/2) beta^((n - j)/2). This is valid provided mod j is less than or equal to n, because you cannot go further than n in n steps, and n - j is even, that is, j has the same parity as n. Setting j = 0, these conditions just say n >= 0, which we know, and n even; and P(0, n | 0) = C(n, n/2) (alpha beta)^(n/2). Since only even n contribute, put n = 2k, with k running over 1, 2, 3, ...; then pi_00(1) = sum from k = 1 to infinity of C(2k, k) (alpha beta)^k. (In pi_00(z) there is also a factor z^n = z^(2k); set z = 1 for now.) All we have to do is to sum this series and find out when it converges and when it diverges. A simple way is to find the large-k behaviour of the coefficient and do a ratio test. Write C(2k, k) = (2k)! / (k!)^2 and use Stirling's formula: (2k)! goes like (2k)^(2k) e^(-2k) sqrt(4 pi k) = 2^(2k) k^(2k) e^(-2k) sqrt(4 pi k), while (k!)^2 goes like (k^k e^(-k) sqrt(2 pi k))^2 = k^(2k) e^(-2k) (2 pi k). The k^(2k) cancels, the e^(-2k) cancels, the square roots leave 1/sqrt(pi k), and the 2^(2k) combines with (alpha beta)^k to give (4 alpha beta)^k. So the kth term finally goes like (4 alpha beta)^k / sqrt(pi k). Now do a ratio test: the ratio of the (k+1)th to the kth term tends to 4 alpha beta, since sqrt(k/(k+1)) tends to 1 and the power of k does nothing. So the series is convergent if 4 alpha beta < 1, and certainly divergent if 4 alpha beta = 1: then the geometric factor goes away, you are left with just 1/sqrt(pi k), and summing over k this diverges like the square root of the cutoff. Now what is the largest value that 4 alpha beta can have? Since beta = 1 - alpha, the largest value of 4 alpha (1 - alpha) is 1, attained at alpha = beta = one-half: the unbiased case. So if the walk is unbiased, you are guaranteed that return to the origin occurs; but if it is biased, whether alpha is bigger than beta or smaller, it does not matter, then 4 alpha beta < 1, the series converges, and the probability of a return to the origin is less than 1.
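This dichotomy can be seen numerically. As a sketch (my illustration): sum the series term by term, generating each term from the previous one via the ratio (2k)(2k-1)/k^2 times alpha beta, so that no huge factorials appear, for a biased and an unbiased walk.

```python
def pi_00_partial(alpha, checkpoints):
    """Partial sums of sum_{k>=1} C(2k,k) (alpha*beta)^k, using the
    term-to-term ratio (2k)(2k-1)/k^2 * alpha*beta."""
    beta = 1 - alpha
    s, term, out = 0.0, 1.0, {}
    for k in range(1, max(checkpoints) + 1):
        term *= (2 * k) * (2 * k - 1) / k**2 * (alpha * beta)
        s += term
        if k in checkpoints:
            out[k] = s
    return out

biased = pi_00_partial(0.7, {10, 100, 10000})
unbiased = pi_00_partial(0.5, {10, 100, 10000})
print(biased)      # saturates quickly at a finite value (here 1.5)
print(unbiased)    # keeps growing like sqrt(k): the series diverges
print(biased[10000] / (1 + biased[10000]))   # return probability < 1
```

For the biased case the finite sum, fed into phi_00(1) = pi_00(1)/[1 + pi_00(1)], gives a return probability strictly less than 1; for the unbiased case the divergence forces phi_00(1) = 1.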
We should like to find out what exactly it is like to find out you know in that case what is it which we will do in a minute so let us see so let us do the following let us first look at this case where the return to the origin is not probable with probability 1 the probability less than 1 find out exactly what this probability is and then we will come to the case where we have a return to the origin which is sure namely with probability 1 and try to find out what is the mean time that it takes to do so so let us first consider 4 alpha beta less than 1 it says if 4 alpha beta less than 1 ie the walk is biased when we still need to know what is pi not not of z pi not not of 1 rather of 1 this is equal to this guy 2kk alpha beta to the k summation k equal to 1 to infinity does anyone remember what this series is it is a binomial series we can play around with this but it is a binomial series I will just write the answer down this turns out to be equal to 1 over square root of 1 minus 4 alpha beta there are several ways of doing this you can remember it in many many different ways it is useful to remember what the general binomial theorem is in a form which is easy to apply to these series because what happens when you have negative index negative in non integer index etc is that a lot of minus signs cancel out etc but the way I remember it is slightly different than this what one way to remember it is like this write this as summation consider k equal to 0 to infinity by the way the k equal to 0 term has been left out here so this is this minus 1 okay but let me show you how this is remembered how I how I remember it so if you have 2kk and then something let us call it xi to the power k that is something like this then this is equal to a summation k equal to 0 to infinity 2k factorial that is gamma of 2k plus 1 so this is gamma of 2k plus 1 over gamma of k plus 1 that is k factorial another gap another k factorial and then xi to the k so let us write this xi to the k 
over k separate. Now recall what is called the duplication formula for the gamma function: Γ(2z) = Γ(z) Γ(z + 1/2) 2^(2z−1)/Γ(1/2), where Γ(1/2) = √π. Applied here, it turns the binomial coefficient into (2k choose k) = Γ(2k + 1)/[Γ(k + 1)]² = Γ(k + 1/2) 2^(2k)/[Γ(1/2) Γ(k + 1)]: one factor of Γ(k + 1) cancels, and the 2^(2k) combines with ξ^k to give (4ξ)^k. So the sum becomes Σ_k [Γ(k + 1/2)/(Γ(1/2) k!)] (4ξ)^k, and this is precisely the binomial series for (1 − 4ξ)^(−1/2). In general, for any constant a, (1 − x)^(−a) = Σ_k [Γ(k + a)/(Γ(a) k!)] x^k; that is an easy way to remember the binomial series, and here a = 1/2, so we get one over a square root. That gives us Π00(1): leaving out the k = 0 term, Π00(1) = (1 − 4αβ)^(−1/2) − 1, and therefore we know Φ00(1). It says Φ00(1) = Π00(1)/(1 + Π00(1)) = [(1 − 4αβ)^(−1/2) − 1]/(1 − 4αβ)^(−1/2) = 1 − √(1 − 4αβ). But what is the quantity under the square root? We can simplify it: with β = 1 − α, 1 − 4αβ = 1 − 4α(1 − α) = 1 − 4α + 4α² = (2α − 1)², so Φ00(1) = 1 − |2α − 1| = 1 − |α − β|, which is a useful way to remember it. So if there is a bias and α ≠ β, the total probability of a return to the origin, summed over all walks, is a number strictly between 0 and 1; but if α = β, the probability is 1. This right away tells us that on the infinite lattice a bias in either direction makes the return probability less than 1, by an amount depending on how biased the walk is; of course, if either α or β is 0, so that the motion is unidirectional, you are never going to come back: the probability is 0.

Now the question is: when the walk does return, that is, when α = β = 1/2, what is the actual distribution of the return time? If α = β = 1/2, a return to the origin is a sure event, but we need to find the distribution of the time, and as I said, you can return to the origin only on an even time step, so the distribution has support only at n = 2, 4, 6, and so on. So let us compute Π00(z). It is Σ_{k=1}^∞ (2k choose k) (z²/4)^k, since α = β = 1/2 gives αβ = 1/4, and the z^n inside becomes z^(2k) with n = 2k. By the rule we just derived, this equals (1 − z²)^(−1/2) − 1, the −1 appearing because the k = 0 term was left out. That gives an explicit formula in this case: Φ00(z) = Π00(z)/(1 + Π00(z)) = [(1 − z²)^(−1/2) − 1]/(1 − z²)^(−1/2) = 1 − √(1 − z²); the denominator cancels the square root, and that is it. If you now expand this in powers of z, the coefficient of z^n gives F(n), the distribution of the first-recurrence time. Of course n = 0 is not a recurrence, and indeed there is no constant term: put z = 0 and the 1 cancels out. So let us expand this as a power series and see what happens. This quantity
here is equal to 1 minus (1 − z²)^(1/2). Expanding, √(1 − z²) = 1 − z²/2 − z⁴/8 − z⁶/16 − …: the z⁴ coefficient is (1/2)(1/2 − 1)/2! = −1/8, the next term involves (1/2)(−1/2)(−3/2)/3! and again comes out negative, and so on, so every term after the 1 carries a minus sign, and in 1 − √(1 − z²) every term is positive: Φ00(z) = z²/2 + z⁴/8 + z⁶/16 + …. This tells you, for instance, that F(2) = coefficient of z² = 1/2, and F(4) = 1/8; you can compute the probability of first return at the end of every even time step, and at all the odd time steps there is no return to the origin. You can also check that the distribution is normalized to unity, and I do not have to check it independently: all I have to do is put z = 1 and note that the answer is 1, so the probabilities automatically sum to 1. So it is clear that it is normalized to unity in this case.

What is the mean time it takes to recur? It is the derivative at z = 1: the mean time to go from 0 back to 0, averaged over this distribution, is (d/dz)Φ00(z) at z = 1. And what is that equal to? It is infinity: differentiating gives z/√(1 − z²), and as z → 1 this blows up. So that checks out; we knew this process is null recurrent: it is recurrent, but null recurrent in the sense that the mean time for the return to happen is infinite. The first moment diverges, but the event does happen with probability 1.

We could also ask how this divergence arises as time increases. Remember that in the continuum case we discovered that the first-passage-time density from any point to any other point went like 1/t^(3/2) for the diffusion equation; multiplied by t, the mean time became something like ∫ dt/t^(1/2), so over a long time it grows like t^(1/2), and it diverges in that fashion. The question is whether there is a similar 1/n^(3/2) here, and if so, where it comes from. The answer is very straightforward: all we have to do is look at what the general coefficient does, since it came from the expansion of this function. Now work the identity backwards: we have (1 − z²) to the power plus 1/2, so by the same binomial series, with a = −1/2, the coefficient of z^(2k) must be proportional to Γ(k − 1/2)/[Γ(−1/2) Γ(k + 1)], where Γ(−1/2) = −2√π is just an irrelevant constant. Now, what does Γ(k + a)/Γ(k + b) look like for large k? Use Stirling once again: Γ(z + a) goes like (z + a)^(z + a) and Γ(z + b) like (z + b)^(z + b), and if a and b are finite while |z| is large, you are left with z^a on top and z^b below, so the ratio goes like |z|^(a − b). Here the ratio Γ(k − 1/2)/Γ(k + 1) therefore goes like k^(−1/2 − 1) = k^(−3/2). That is precisely the 1/t^(3/2) behavior of the continuum: k plays the role of time here, since n = 2k.
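All of this can be checked numerically. Here is a minimal sketch in plain Python (the function names are mine, not notation from the lecture): it sums the series for the biased return probability and compares it with the closed form 1 − |α − β|, reads the first-return probabilities off the expansion of Φ00(z) = 1 − √(1 − z²), and verifies the k^(−3/2) tail.

```python
import math

# Biased walk on the line: total return probability from the series
#   Pi_00(1) = sum_{k>=1} C(2k, k) (alpha*beta)^k,   Phi = Pi / (1 + Pi),
# compared with the closed form 1 - |alpha - beta| derived above.
def return_probability(alpha, terms=2000):
    beta = 1.0 - alpha
    pi, term = 0.0, 1.0
    for k in range(1, terms + 1):
        # ratio of successive terms: C(2k,k)/C(2k-2,k-1) = 2k(2k-1)/k^2
        term *= 2 * k * (2 * k - 1) / (k * k) * (alpha * beta)
        pi += term
    return pi / (1.0 + pi)

print(return_probability(0.2), 1 - abs(0.2 - 0.8))   # both close to 0.4

# Unbiased walk: first-return probability at step n = 2k, read off from
# the binomial expansion of Phi_00(z) = 1 - sqrt(1 - z^2):
#   F(2k) = C(2k, k) / ((2k - 1) * 4^k)
def first_return(k):
    return math.comb(2 * k, k) / ((2 * k - 1) * 4 ** k)

print(first_return(1), first_return(2), first_return(3))   # 1/2, 1/8, 1/16

# Partial sums: the total return probability creeps up to 1 (sure event),
# while the partial mean  sum 2k F(2k)  keeps growing without bound.
total, mean, f = 0.0, 0.0, 0.5
for k in range(1, 20001):
    total += f
    mean += 2 * k * f
    f *= (2 * k - 1) / (2 * k + 2)    # ratio F(2k+2)/F(2k)
print(total, mean)

# Tail: F(2k) ~ 1 / (2 sqrt(pi) k^(3/2)), the lattice analogue of the
# 1/t^(3/2) first-passage density in the continuum.
print(first_return(1000) * 1000 ** 1.5, 1 / (2 * math.sqrt(math.pi)))
```

With the cutoff at 20000 terms the total return probability has crept to within about half a percent of 1, while the partial mean is already in the hundreds and grows like the square root of the cutoff: sure recurrence with a divergent mean, exactly as derived above.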
And now, when you multiply this by a factor of k (the time) and sum, you are summing coefficients that go like 1/√k, and the sum diverges like the cutoff to the power +1/2. So it is exactly the same power-law divergence as we had in the continuum case, but this is an exact solution: it tells you exactly when the mean converges and when it does not.

So, to summarize: in one dimension, on a linear lattice, if you have a biased random walk, with a constant bias either to the right or to the left, then a return to any particular point is not a sure event; it is an event that happens with probability less than 1, no matter how long you wait. On the other hand, if there is no bias in the walk, then a return to any point is a sure event, as is first passage to any point from any point, but the mean time for it is infinite. Such a process is said to be recurrent, but null recurrent, when the mean time is infinite. Now you can ask what happens in higher dimensions: 2, 3, 4, and so on. The answer is known, and I will derive it next time, because we need a little bit of information about the Green function. It will turn out that in dimensions 1 and 2, first passage (in the case of unbiased walks) from any point to any other point is a sure event, but the mean time is infinite; in dimensions greater than 2, it is not a sure event: the total probability of return to any point, or of first passage to any point, is less than 1 even if there is no bias in the walk. We can establish this directly, and it is not very difficult to do, so we will do that next time. But again notice that the Markov property has been used very crucially: without it, all these statements go out of the window and the whole case would need to be reexamined; it is because of that renewal equation that we were able to make this very powerful statement.

Now I might mention: you could ask what happens on that Sierpinski graph if I go to an infinite generation, so that the number of points increases exponentially with the generation. What happens on such a graph? Well, if it is a finite graph, the walk is ergodic, in the sense that every point is visited infinitely often (unless you have traps, in which case of course it ends). But if you do not have traps and the graph is infinite, is it guaranteed that, starting at any one point, you hit every other point, any arbitrary point? This has to do, again, with a dimensionality being less than 2 or greater than 2, but it is neither the walk dimension nor the fractal dimension alone; it is a certain ratio of the two, which is called the spectral dimension, and that has to be less than or equal to 2 for the walk to be recurrent; otherwise it is transient. I will state what that dimensionality is next time.
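As a preview of that statement, here is a quick numerical note using the standard quoted values for the two-dimensional gasket (an assumption imported from the literature, not derived in this lecture): doubling the linear size of the gasket triples the mass, and multiplies the mean crossing time by 5, so by the d_w = log(t2/t1)/log 2 recipe from the beginning of the lecture the walk dimension exceeds 2.

```python
import math

# Quoted (not derived here) standard values for the two-dimensional
# Sierpinski gasket: doubling the linear size triples the mass, and the
# mean time to cross the doubled figure is 5 times longer.
d_f = math.log(3) / math.log(2)   # fractal (mass) dimension, about 1.585
d_w = math.log(5) / math.log(2)   # walk dimension, about 2.322 (> 2)
d_s = 2 * d_f / d_w               # spectral dimension = 2 ln 3 / ln 5

print(d_f, d_w, d_s)
print("recurrent" if d_s <= 2 else "transient")
```

The ratio d_s = 2 d_f/d_w comes out at about 1.365, below 2, so the unbiased walk on the infinite gasket is recurrent; and since d_w > 2, the walk is subdiffusive, with the mean square distance growing like t^(2/d_w), slower than ordinary diffusion.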