So, in the last class we looked at a nice property of an irreducible Markov chain that is positive recurrent. We said that it is positive recurrent if and only if there exists a probability vector pi which is a solution of the equation pi = pi P, right. In the last class we showed the existence of such a pi and we argued that it is unique, but one of you raised a concern about the uniqueness. The concern is this: the method we started with had a particular way of constructing vectors b_n, and we showed that as n goes to infinity this sequence converges to some gamma, with all components positive. We then took pi = gamma after normalizing, that is, after dividing gamma by the sum of its components. So for that method, the gamma we got is indeed the unique limit. But if you had started with another method, maybe you would have constructed another sequence and looked at its limit; what is the guarantee that it would have given you the same gamma? For uniqueness I want that, irrespective of what construction you use, when you look at the limit, that limit should be the same. So in a way, the last construction showed that for that one method the gamma obtained is unique; that was a kind of hint. Now we are going to show that gamma is the unique solution, whatever method you use.
So, now we are going to argue that if any other pi exists which also satisfies the relation pi = pi P, then it must necessarily be the case that this pi is exactly equal to our gamma. Let us try to argue that. From the last class it is clear that our candidate is a positive probability vector, because all the gamma_j are positive. Now, how to show uniqueness? Let pi be any probability vector with strictly positive elements, so that the pi_i sum to 1 and each pi_i > 0, and suppose it is a solution of pi = pi P. We are going to argue that this pi is nothing but gamma. If pi satisfies this relation, then by repeating the recursion we get pi = pi P^k for all k >= 1. Now, take this relation for k = 1, 2, ..., n and add all of them. On the left-hand side I get n pi, and on the right-hand side I get pi times the sum of the powers P^k; dividing both sides by n gives pi = pi (1/n) sum_{k=1}^{n} P^k. This quantity on the right-hand side, if I slightly reorganize it, I already know how to handle. We have already shown that the average (1/n) sum_{k=1}^{n} p_{jj}^{(k)} goes to 0 as n goes to infinity if the state j is transient (in that case the sum itself is finite, so dividing by n sends it to 0) or null recurrent.
But if the state is neither of these two, then this average goes to some constant, which we called gamma_j, right. So now just let n go to infinity. The left-hand side is constant; it is the right-hand side that changes with n, and we have already shown where that limit goes. There is a matrix multiplication here, but look at it component-wise, which is what we showed: take a specific component j. In the limit, every row of the averaged matrix is the same vector gamma, so the j-th component of the right-hand side becomes (sum_i pi_i) gamma_j; and since pi is a probability vector, the sum is 1 and you just get the constant gamma_j. So pi_j = gamma_j for every j; in vector form, pi = gamma. It is the same argument we did last time, except that I am writing it compactly for the vector here. So what we have just argued is that any pi satisfying this relation equals gamma, and gamma is a specific vector, the limit of our sequence. Now, what we started proving was only the "only if" part, the necessary condition: if my Markov chain is irreducible and positive recurrent, then there exists a pi which is a solution of pi = pi P, and it is unique. Now we want to show the "if" part, the sufficiency: if indeed such a relation pi = pi P holds for some positive probability vector, then my irreducible Markov chain is positive recurrent. How are we going to show that?
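To see this uniqueness argument numerically, here is a small sketch in NumPy. The two-state chain and its numbers are my own illustrative choices, not from the lecture; the point is that the Cesaro average (1/n) sum P^k converges to a matrix whose rows are all gamma, so any solution of pi = pi P collapses onto gamma.

```python
import numpy as np

# A small irreducible, positive recurrent chain (hypothetical numbers).
P = np.array([[0.3, 0.7],
              [0.4, 0.6]])

# Cesaro average (1/n) * sum_{k=1}^{n} P^k.
n = 2000
Pk = np.eye(2)
avg = np.zeros_like(P)
for _ in range(n):
    Pk = Pk @ P
    avg += Pk
avg /= n

# Every row of the average converges to the same vector gamma.
gamma = avg[0]

# Any probability vector pi with pi = pi P must equal gamma:
# pi_j = (sum_i pi_i) * gamma_j = gamma_j, since pi sums to 1.
pi = np.array([4/11, 7/11])    # solves pi = pi P for this particular P
print(np.allclose(avg[0], avg[1], atol=1e-3))   # rows agree in the limit
print(np.allclose(pi @ avg, gamma, atol=1e-3))  # pi * average -> gamma
print(np.allclose(pi, gamma, atol=1e-3))        # so pi = gamma
```

The averaging step matters: for a periodic chain P^k itself need not converge, but the Cesaro average still does, which is why the proof works with the averaged powers.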
So, we are again going to show this by contradiction. Suppose you have an irreducible Markov chain; it must be of exactly one of three types: transient, null recurrent, or positive recurrent. Assume pi = pi P holds and suppose the chain is transient or null recurrent. For either of these we know that (1/n) sum_{k=1}^{n} P^k goes to the zero matrix as n goes to infinity. Going back to the previous step, I have pi = pi (1/n) sum_{k=1}^{n} P^k, and applying the limit as n goes to infinity, this says pi = 0. But we assumed this vector pi to be strictly positive, so this right away gives us a contradiction. So it must be the last case: if this relation holds, the chain is positive recurrent. (Again, the relation pi = pi P^k is true for any k, so I can add over k = 1 to n and then let n go to infinity, exactly as before.) So now our proof is complete. So, if you end up with an irreducible Markov chain, how are you going to check that it is positive recurrent? First take the transition probability matrix and check whether the equation pi = pi P has a solution. If it has a solution and all the components of that solution are strictly positive, then you conclude that the chain is positive recurrent. Now, what we did was take one irreducible class and focus on that. What if my Markov chain has many communicating classes which are all closed, for example? How does this result extend? So, let us say this is my state space S, and it contains communicating class 1, communicating class 2 and communicating class 3.
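The check at the end of this argument can be sketched directly in NumPy: solve the linear system pi (P - I) = 0 together with the normalization sum(pi) = 1, and test positivity. The three-state chain below is my own illustrative example, not the lecture's.

```python
import numpy as np

# Check positive recurrence of a finite irreducible chain: solve pi = pi P
# with sum(pi) = 1 and test whether every component is strictly positive.
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])

m = P.shape[0]
# Stack the constraints pi (P - I) = 0 and sum(pi) = 1 into one system.
A = np.vstack([(P - np.eye(m)).T, np.ones(m)])
b = np.zeros(m + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)                        # the stationary distribution
print(np.all(pi > 0))            # strictly positive -> positive recurrent
print(np.allclose(pi, pi @ P))   # verifies pi = pi P
```

For this chain the solution comes out to (1/4, 1/2, 1/4); note the chain is periodic, yet pi = pi P still has its unique positive solution, consistent with the theorem (periodicity plays no role here).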
So, let us say my Markov chain decomposes into these three communicating classes, and let us say each one of them is a closed communicating class. Then what do we say? If that is the case, just focus on one class and the states in it; you can think of your Markov chain restricted to that state space as irreducible, because the entire restricted state space is one communicating class, and we have just studied this case. But suppose your Markov chain has multiple classes like this; how are you going to extend these results? Is it the case that if I look at the solution of pi = pi P on the entire space, that pi is going to be unique? Let us see. It could be unique on C1, C2 and C3 separately; that we have already shown. So let me take A1, which is the solution of A1 = A1 P1. Be watchful here: when I write P1, this is the transition probability matrix restricted to the states in class 1. For the entire chain there is one transition probability matrix of size |S| x |S|; this P1 is the portion of that matrix which corresponds to the states of class 1. That is again a transition probability matrix, because you have just restricted yourself to the particular rows and columns of a closed class. Similarly A2 = A2 P2 and A3 = A3 P3. All of A1, A2, A3 are, by what we have shown, unique, because each comes from focusing on one particular communicating class. Now, what can we say? Extend this as follows: define pi1 so that pi1 = pi1 P over the entire matrix. Now, what is this pi1? Pi1 is the same as A1, but in the places corresponding to states not in class 1, I am just going to append zeros. Are you getting a sense of what I mean by this? So pi1 is still a vector over the entire state space S: some portion of it is A1, corresponding to the states in that class, and I am appending zeros in the other places.
Similarly define pi2 and pi3; the positions where A1, A2 and A3 sit need not be contiguous, this is just my representation. If you extend like this, you can still check that pi1 is a solution of the equation pi1 = pi1 P. This is going to be the case because the transition probabilities from one closed class to another are zero: for the portions corresponding to the zeros in pi1, the corresponding cross entries of P are also zero. Just check that this is true for all of them. So now I have all three of these as solutions. Now, all of you know what a convex combination of vectors is. Take these three probability vectors and form their convex combination: pi = lambda1 pi1 + lambda2 pi2 + lambda3 pi3, where lambda1, lambda2, lambda3 are positive and lambda1 + lambda2 + lambda3 = 1. Will it be a probability vector? Yes, it is going to be a probability vector. So let us call this another pi: based on pi1, pi2, pi3, by taking their convex combination, I have obtained another probability vector on the entire Markov chain, and it satisfies the equation pi = pi P. But now, is this pi that satisfies the relation pi = pi P unique here? Not at all: change the convex combination and you get another one.
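The padding-with-zeros construction and the convex-combination claim can both be checked numerically. The following sketch uses a four-state chain with two closed classes of my own choosing (not the lecture's example):

```python
import numpy as np

# Two closed classes {0,1} and {2,3}; all cross-class probabilities are zero.
P = np.array([[0.3, 0.7, 0.0, 0.0],
              [0.4, 0.6, 0.0, 0.0],
              [0.0, 0.0, 0.9, 0.1],
              [0.0, 0.0, 0.2, 0.8]])

pi1 = np.array([4/11, 7/11, 0.0, 0.0])   # A1 padded with zeros
pi2 = np.array([0.0, 0.0, 2/3, 1/3])     # A2 padded with zeros

# Each padded vector still solves pi = pi P on the full state space...
print(np.allclose(pi1, pi1 @ P), np.allclose(pi2, pi2 @ P))

# ...and so does every convex combination lam*pi1 + (1-lam)*pi2.
for lam in (0.25, 0.5, 0.9):
    pi = lam * pi1 + (1 - lam) * pi2
    print(np.allclose(pi, pi @ P), np.isclose(pi.sum(), 1.0))
```

Since lam can be any number in (0, 1), this exhibits a whole continuum of solutions, which is exactly the non-uniqueness the lecture is about to state.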
So, take any lambda1, lambda2, lambda3 satisfying these conditions and you will get a different pi which also satisfies the equation. So on the whole state space the solution of pi = pi P need not be unique: there can be many pi's that solve it, and specifically these pi's can be obtained as convex combinations of the pi's we obtain on each of the closed communicating classes. Now let us look at an example. Let us say I have 5 states, labelled 0, 1, 2, 3 and 4, and let us draw the transitions. So, let us say I have a Markov chain like this. Think of the initial state as follows: you toss a fair coin; with probability half you enter state 1, with probability half you enter state 3. Once you enter state 1, you either remain in state 1 or go to state 2; you go to state 2 with probability 0.8, and similarly when you are in state 2 you go to state 1 with probability 0.2 or remain there itself. Now, into how many classes can I decompose this 5-state Markov chain, and what are those classes? Will 0 and 1 communicate? No, right: once you leave state 0 you can never return, so 0 and 1 cannot be in the same class, and similarly 0 and 3 cannot be in the same class. Can 0, 1 and 2 be in the same class? No; but 1 and 2 can be in one class, and 3 and 4 can be in one class. So what are the possible classes? Three classes: let us call {1, 2} class 1, {3, 4} class 2, and {0} has to be in a separate one. Classes 1 and 2 are closed, while {0} is not. So, fine, you can have a transition structure like this, and based on it you can see what the different classes are. Now, suppose you want to solve the equation pi = pi P for this chain.
It is still not one irreducible class, but suppose you solve pi = pi P and obtain some solution. In this case it looks like the solution is going to be (0, 1/10, 4/10, 1/10, 4/10); just check this, I am just putting it on the board. Now suppose you start your Markov chain with this distribution as your initial distribution. Note that so far I have just given you the transition matrix; I have not told you anything about the initial distribution, right. Suppose you solve this and start with the solution as your initial distribution. We already proved one property; what was that property? The distribution of X1, X2, X3, ... remains the same at every point, right. That in turn implies that if we start with this initial distribution, then my DTMC is stationary. You have already discussed what stationarity of a process means, ok. So if your Markov chain is such that you start with your initial distribution equal to this invariant distribution, then it is going to be stationary. Now, this is not an irreducible Markov chain here, right; the earlier statement was for the irreducible case. Here I am not assuming irreducibility; this is an arbitrary Markov chain with different possible sets of classes. You still take a solution of pi = pi P, provided that solution happens to be a probability vector; make it your initial distribution and run the chain, and then your Markov chain is going to be stationary. Of course, we have only ensured that pi is unique when I have an irreducible positive recurrent chain, but here the chain is not irreducible, and I have not even verified whether these states are recurrent; I have just said take a solution like this. So in that case I am not guaranteed to have a unique distribution.
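Here is a numerical sketch of this 5-state example. The transitions inside class {3, 4} were not fully specified in the lecture, so I ASSUME they mirror those of {1, 2}; with that assumption, splitting mass 1/2 to each closed class and scaling its local stationary vector (0.2, 0.8) gives one invariant distribution, and starting from it the chain's marginal never changes.

```python
import numpy as np

# The lecture's 5-state chain: from 0, go to 1 or 3 with probability 1/2 each;
# {1,2} is closed, with P(1->1)=0.2, P(1->2)=0.8, P(2->1)=0.2, P(2->2)=0.8.
# ASSUMPTION: the closed class {3,4} mirrors {1,2} (not stated in the lecture).
P = np.array([[0.0, 0.5, 0.0, 0.5, 0.0],
              [0.0, 0.2, 0.8, 0.0, 0.0],
              [0.0, 0.2, 0.8, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.2, 0.8],
              [0.0, 0.0, 0.0, 0.2, 0.8]])

# One solution of pi = pi P: mass 1/2 on each closed class, times its local
# stationary vector (0.2, 0.8); state 0 gets probability 0.
pi = np.array([0.0, 0.1, 0.4, 0.1, 0.4])
print(np.allclose(pi, pi @ P))   # pi is invariant

# Started from pi, the marginal distribution never changes: the DTMC is stationary.
dist = pi.copy()
for _ in range(50):
    dist = dist @ P
print(np.allclose(dist, pi))
```

Note pi is not unique here: any convex split of the mass 1/2 between the two closed classes, say (lam, 1-lam) instead of (1/2, 1/2), gives another invariant distribution.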
So, that is the point: if there are multiple solutions satisfying this relation, and you start with these different possibilities, then the probability that your Markov chain takes a particular state at a particular time is going to be different; it depends on which initial distribution you started with, and you can have multiple possibilities in case there are many such pi's. Now, let us try to see whether the opposite can happen: can you think of any case where pi = pi P has no solution at all? There will always be at least one solution of pi = pi P, right. Why? Because we already argued that P is a stochastic matrix, so P has 1 as an eigenvalue, and this pi is nothing but a left eigenvector corresponding to that eigenvalue; such an eigenvector will always exist. But the question is, will it add up to 1? And if it is not going to add up to 1, is there a way to make it add up to 1? Normalize it. Then you will always have one such pi satisfying the relation, and you can start with that. Again, though, think of a possible catch: fine, I have 1 as an eigenvalue, but could it be that for this eigenvalue the zero vector is the only possible solution? We would have to construct an example; in general I do not know under what condition that could hold. In a way, what this asks is: suppose pi = 0 is a solution (if the rank is not full there can be multiple solutions); the question is whether 0 can be the only one. Note we are not saying P = 1; the relation is pi = pi P, right.
So, let us write it as pi * 1 = pi P; this 1 is my eigenvalue. If pi is all zeros, the relation is trivially satisfied, but that is not the issue here. So think about this: is there any transition probability matrix for which pi = 0 is the only solution? In fact, forget transition probability matrices: just construct any matrix; if it is stochastic it will have eigenvalue 1. Is there such a matrix for which eigenvalue 1 can have only the zero vector associated with it? When we say eigenvector, by default it makes sense only when at least some components are nonzero, right; if that were not the case, the relation would be satisfied for any eigenvalue, trivially. So what we are asking is: given such a relation, is it always the case that some component pi_i is nonzero? As long as one component is nonzero, that is fine; I can come up with a vector which is not all zeros. So what is such a matrix, or is it the case that whatever P you start with, you will always end up with a pi in which at least one component is nonzero? Just think about this; I think we should be able to argue that whatever P you take, I will end up with a pi in which at least one component is nonzero, and because of that I can always discard the all-zeros solution. So it looks like pi here is an eigenvector, but we have to make sure that our definition of eigenvector is consistent, in the sense that it excludes the case of all components being 0; we just need to ensure that, ok, just check this. So now, what we have done is deal with the case of an irreducible class and how to say whether it is going to be positive recurrent, right.
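The eigenvector view can be sketched numerically: compute the left eigenvector of P for eigenvalue 1 (i.e., a right eigenvector of P transpose) and normalize it to sum to 1. The three-state chain below is my own illustrative choice. By Perron-Frobenius, a finite stochastic matrix always has a nonnegative, nonzero left eigenvector for eigenvalue 1, so the all-zeros vector is never the only solution.

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.3, 0.7]])

# Left eigenvectors of P are right eigenvectors of P.T.
vals, vecs = np.linalg.eig(P.T)
i = np.argmin(np.abs(vals - 1.0))   # locate the eigenvalue 1
v = np.real(vecs[:, i])

# An eigenvector is only determined up to scale (and sign);
# normalize so the components sum to 1.
pi = v / v.sum()
print(np.allclose(pi, pi @ P))    # a nonzero solution of pi = pi P exists
print(np.isclose(pi.sum(), 1.0))  # and it is a probability vector
print(np.all(pi > 0))             # here the chain is irreducible, so pi > 0
```

Dividing by `v.sum()` also fixes the sign, since the Perron eigenvector has all components of one sign for an irreducible chain.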
So, now, if the chain is not positive recurrent, that means I will not be able to find a solution of pi = pi P where pi is a probability vector with all components positive. And by the way, notice that we have also argued that if one component of the solution pi is positive, then all the components must be positive; it cannot be that only one component of the solution is positive and the other components are 0. We showed this when we discussed the properties of this equation: if one component is strictly positive, it means that all other components are also strictly positive. So how to check this in practice? For my DTMC to be positive recurrent, I want the solution of pi = pi P to be such that all its components are positive. When I have such a big matrix, the first thing I do is take an irreducible communicating class, try to find a solution of pi = pi P there, and see whether all its components are strictly positive; if they are, then you are done, you know it is positive recurrent. If you only happen to find solutions of pi = pi P with some component equal to 0, then you know the class is not positive recurrent; it has to be either transient or null recurrent. How to verify which one? So, next we are going to look for a condition under which we can say that the chain is going to be transient.