So, with the standard Markov property, we conditioned on a particular deterministic index and asked whether the future from that time onwards is independent of the past. Now, instead of conditioning at a deterministic time, we move to a random time. It is a slight technicality, but that technicality is important to handle, because most of the time we will be facing such cases. For example, if you go to a stock broker and you are interested in stock trading, you will not say "sell my shares on the 100th day"; you will tell him to sell your shares when their value goes above a certain number, because only then do you feel it is profitable for you. Likewise, in the aircraft example I gave you, you always keep track of your fuel level or some other parameter, and you want to ask what happens in the future once that parameter exceeds some threshold. So that is why we have to worry about conditioning on a random time, and why it is important to handle this technicality.

To understand it, we have to introduce a notion called the strong Markov property. We are going to say a Markov chain satisfies the strong Markov property with respect to a random time T if the following condition holds. You give me the whole trajectory of the Markov chain up to the random time T, and then ask about the future from that time onwards: the probability of being in state j_1 after s_1 more steps, in j_2 after s_2 more steps, and so on up to j_n after s_n more steps. That conditional distribution should equal the distribution of the same chain started afresh from the state X_T = i. In other words, you take the time T where the chain sat in state i as your new origin: from there the chain goes to j_1 in the next s_1 steps, to j_2 in s_2 steps, and so on, exactly as if the process had started right there.

So what the strong Markov property is telling you is this: if you condition up to a random time, then whenever the observations are given up to that random point, you can think of the future as the Markov chain starting afresh from that state. Contrast this with homogeneity: there the shift in time is deterministic, and no randomness is involved in the conditioning time, whereas here the conditioning is at a random time and we look at the distribution of the whole future starting from that point.
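Written out (this is my transcription of the condition described above; s_1 < s_2 < ... < s_n are deterministic step counts and i_0, ..., i_{T-1}, i is the observed history):

    P( X_{T+s_1} = j_1, ..., X_{T+s_n} = j_n | X_T = i, X_{T-1} = i_{T-1}, ..., X_0 = i_0 )
        = P( X_{s_1} = j_1, ..., X_{s_n} = j_n | X_0 = i ).

If this holds for T, the chain observed after time T behaves like a fresh copy of the chain started in state i.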
So what did the homogeneous Markov property tell you? It said that p_ij(n) = p_ij for all n: the probability of going from state i to state j at step n is the same irrespective of which step you are looking at. That is not this definition. Here the statement is: if you have given me all the observations up to the random time T, I can think of my process as starting afresh at that point into the future. This is a definition. If my Markov chain satisfies it, I am going to say it has the strong Markov property with respect to T, or call it a strong Markov chain.

So now, is it true that for any integer-valued random variable T this property holds? Let us look at some examples of random times T. Take my Markov chain X_n and fix a state j. I am focused on one particular j, and I define T to be the time of the first visit to j. Your Markov chain starts somewhere, and whenever it first hits state j, that time slot is the value of T. When it hits j depends on your transition probability matrix and also on your initial distribution, or on which point you start from. So T here is a random time. Let me index it by j, writing T_j, so that the dependence on j is explicit.

T_j is a map from Omega to the natural numbers. Note that T_j is simply a random variable defined on top of the Markov chain; I am not talking about its distribution right now, I am only defining it. Now I want to ask: for a particular sample point, when is the event {T_j = k} going to occur? If my random variable takes the value k on some sample point omega, it must be the case that X_0(omega) is not equal to j, X_1(omega) is not equal to j, all the way up to X_{k-1}(omega) not equal to j, and then X_k(omega) = j. That is, the 0th state is not j, the first state is not j, and so on up to the (k-1)th state, but the kth state is j. This is just what "time of first visit" means: if the first visit is at time k, then at the previous time indices k-1, k-2, all the way down to 1 and 0, the chain could be anything else but not j, and at the kth step it is j. I am not assuming anything here; I am just spelling out the meaning. Is this random variable clear to you? It focuses on one particular state, watches the chain, and records when it first hits that state.
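As a sanity check, here is a minimal simulation sketch of T_j in Python. The two-state transition matrix P and the helper names (sample_path, first_visit) are my own illustrative choices, not anything fixed in the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# An illustrative 2-state transition matrix (rows sum to 1); any chain would do.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def sample_path(P, x0, n_steps, rng):
    """Simulate X_0, X_1, ..., X_{n_steps} of a homogeneous Markov chain."""
    path = [x0]
    for _ in range(n_steps):
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

def first_visit(path, j):
    """T_j: index of the first visit to state j (None if j never appears)."""
    for k, x in enumerate(path):
        if x == j:
            return k
    return None

path = sample_path(P, x0=0, n_steps=50, rng=rng)
k = first_visit(path, j=1)
if k is not None:
    # The event {T_j = k}: X_0 != j, ..., X_{k-1} != j, and X_k = j.
    assert all(x != 1 for x in path[:k]) and path[k] == 1
```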
So our definition is: T_j is the time of the first visit to state j. If T_j has taken the value k on the sample point omega, then only in the kth round did the chain hit j, not before; and conversely, if that has happened, then T_j = k on omega. This is a definition, so the implication goes both ways.

Now, I can always define different random variables like this which are random times. Instead of the first visit to state j, I could say T_j is the time of the second visit to j: the chain keeps running, the first visit happens at some point, and then the time slot of the second visit is given as the value. Like that, I can define many random times. But the question is: is it true that for any random time this property is going to be satisfied? It so happens that no; the strong Markov property need not be satisfied by an arbitrary random time.

One clarification before the example: I do not yet know whether the condition holds for a given random time. If it holds, then I call my Markov chain strongly Markov with respect to that random time; whether it holds, and for which random times it holds, is a separate question. Right now I am just giving examples of random times.

To see the failure, let us look at another example. Let T be the time of the second visit to state j, and define my random time U_j to be one step before it: the chain is going to hit state j for the second time, and the slot just before that is U_j. Now I want to compute a probability: take some i and k, and ask for P(X_{U+1} = k | X_U = i and the whole history up to U). Picture the time axis: j occurs at some slot, then j occurs again at a later slot, and in any realization U is the slot just one step before that second j; the state at U is some i, and by this definition X_{U+1} is going to be j. So if k equals j, this probability is 1: that next state has to be j, with probability 1. And if k is not equal to j, the event cannot happen, so the probability is 0. The conditional probability is always 0 or 1. But is it going to be the same as what we want according to the definition?
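To see this 0/1 degeneracy concretely, here is a small sketch, reusing sample_path, P, and rng from the block above; the second-visit helper nth_visit is again my own name for illustration.

```python
def nth_visit(path, j, n):
    """Index of the n-th visit to state j along the path (None if fewer visits)."""
    count = 0
    for k, x in enumerate(path):
        if x == j:
            count += 1
            if count == n:
                return k
    return None

# U_j = (time of second visit to j) - 1: one step BEFORE the second visit.
for trial in range(1000):
    path = sample_path(P, x0=0, n_steps=200, rng=rng)
    t2 = nth_visit(path, j=1, n=2)
    if t2 is None:
        continue
    u = t2 - 1
    # The next step is forced by construction: P(X_{U+1} = j | history) = 1,
    # and P(X_{U+1} = k | history) = 0 for every k != j.
    assert path[u + 1] == 1
```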
So, according to the definition, we want this to be the probability that X_1 = k given X_0 = i, which is just p_ik. But p_ik can be anything instead of 0 or 1; I can construct a Markov chain where p_ik takes any value. So if I use such a random time, this conditional probability is not equal to p_ik, and at least with respect to this random time the strong Markov property, as we have defined it, is not satisfied.

So let us see what made the strong Markov property fail in this example. The random variable U_j in a way already anticipates what is going to happen in the next round: U_j is one step below my second visit, which means that given U_j, I already know the next step is going to land in j. It is anticipatory in nature. Whenever such a thing happens, the strong Markov property is not satisfied; whenever my random time is not anticipatory, maybe the strong Markov property is satisfied. So you already see that not every random time is going to satisfy the strong Markov property.

What about T + 1, one step after the second visit to j, with nothing happening in between: you have hit j for the second time, and the random time is the immediately next slot? Suppose you have been told that the state is i at that slot, right after the second visit to j. Then what is going to happen after this? If you want to apply the definition, you have been told that at this point your state is i, and you ask what the next state X_{T+2} is going to be; I cannot say anything right away with this, so I do not know whether the property is satisfied or not in this case. But when the random time was defined as one step before the second visit, it was clear what had to happen next, and the property fails. Okay, let us stop here then.
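Finally, putting the two cases side by side, here is an empirical sketch (not a proof), again reusing the helpers above. After the first-visit time T_j the chain sits in state j, so if the strong Markov property holds there, the next step should be distributed like row j of P; after the anticipatory time U_j the next step is a point mass at j, whatever P is.

```python
from collections import Counter

next_after_T, next_after_U = Counter(), Counter()
for trial in range(20000):
    path = sample_path(P, x0=0, n_steps=100, rng=rng)
    t1 = first_visit(path, j=1)
    t2 = nth_visit(path, j=1, n=2)
    if t1 is not None and t1 + 1 < len(path):
        next_after_T[int(path[t1 + 1])] += 1   # should follow row P[1] = [0.4, 0.6]
    if t2 is not None:
        next_after_U[int(path[t2 - 1 + 1])] += 1  # X_{U+1}: always state 1

total = sum(next_after_T.values())
print({k: round(v / total, 3) for k, v in next_after_T.items()})  # roughly {0: 0.4, 1: 0.6}
print(dict(next_after_U))  # only the key 1 ever appears
```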