Let us continue with some examples of Markov processes. To start, let us take up the example of the fluctuating bank balance of an individual. Even for people with fixed salaries there is some level of fluctuation in the bank balance, and the fluctuation is much larger for people who do not have fixed salaries, such as the self-employed. Every real model has complications, but just for understanding let us simplify. Assume that the bank balance cannot become negative, that there is a distribution of income and a distribution of spending, and that there are only two processes by which the balance fluctuates. The first is income, which in our problem we treat as the amount deposited in, or credited to, the account; the second is spending. We call the deposit random variable D and the spending random variable S, and the balance B is the accrued quantity: the net difference of deposits minus spending that remains after several months. The step index n counts months; let us say the bank balance is declared once a month, and we ignore interest and all other complications. The realization of D in the nth month is D_n, the realization of S is S_n, and the realization of the balance is B_n. If in the previous month one had a balance of B units, then depending on how much is deposited this month and how much is spent, the net difference adds to the balance, and it is constrained so that it can never become negative. So we assume a one-sided random walk in the balance, or money, space.
So in the (n+1)th month, or (n+1)th step, the balance is B_{n+1} = max(B_n + D_n - S_n, 0): the new balance is the old balance plus the amount deposited minus the amount spent, subject to the constraint that it is not allowed to go below zero. As n proceeds, n = 1, 2, ..., we are talking of a sequence of balances B_1, B_2, ..., B_n, B_{n+1}: a random walk of the quantity B along the money space, and we are now dealing with the transition from one step to the next. Before writing the transition probability, let us postulate that the person has an income distribution, or deposit distribution, P_d(x): the probability that he deposits an amount x in a month is P_d(x). If x were a continuous variable, P_d would be a probability density; but let us say x is defined in integer units, so P_d is a discrete distribution. I will call it the income distribution, assuming all income is deposited; by experience one can construct this distribution for oneself just by looking at all one's past income. Similarly one can construct a spending distribution P_s(x), the probability that he spends an amount x in a month. With these distributions we can easily write the probability that the balance transits to a value B_{n+1}, given that it was B_n in the previous month. Suppose he spends some amount k in that month; then, in order to end the month with the observed extra amount, he must have earned exactly that much more than he spent.
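The one-sided random walk B_{n+1} = max(B_n + D_n - S_n, 0) is easy to simulate. The sketch below uses hypothetical discrete deposit and spending distributions (the specific numbers are illustrative, not from the lecture) just to show the update rule in action.

```python
import random

def simulate_balance(n_months, deposit_dist, spending_dist, b0=0, seed=0):
    """Simulate B_{n+1} = max(B_n + D_n - S_n, 0) for n_months steps.

    deposit_dist and spending_dist are lists of (amount, probability)
    pairs over integer money units; both are illustrative assumptions.
    """
    rng = random.Random(seed)

    def draw(dist):
        # draw one amount from a discrete (amount, probability) list
        r, acc = rng.random(), 0.0
        for value, p in dist:
            acc += p
            if r < acc:
                return value
        return dist[-1][0]

    balance = b0
    history = [balance]
    for _ in range(n_months):
        d = draw(deposit_dist)   # amount deposited this month, D_n
        s = draw(spending_dist)  # amount spent this month, S_n
        balance = max(balance + d - s, 0)  # balance cannot go negative
        history.append(balance)
    return history

# hypothetical distributions in integer money units
deposits = [(3, 0.5), (4, 0.3), (5, 0.2)]
spending = [(2, 0.4), (3, 0.4), (6, 0.2)]
path = simulate_balance(12, deposits, spending, b0=5)
```

Because of the max(..., 0) in the update, every entry of `path` stays non-negative, which is exactly the one-sided character of the walk.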
So, given spending k, the probability that he earned k + B_{n+1} - B_n, summed over all possible k from 0 upward, gives the transition probability: P(B_{n+1} | B_n) = sum over k >= 0 of P_s(k) P_d(k + B_{n+1} - B_n). The positivity is maintained automatically, because these distributions are zero for negative arguments. So from a simple argument we have written down a transition probability: if there is a jump in the balance from B_n to B_{n+1}, that extra amount must be the difference between the deposit and the spending, and the formula is just that statement expressed in terms of the probability distributions. Notice that this transition probability does not depend on the history; it depends only on the two states B_{n+1} and B_n, not on any earlier values of B. Hence it falls within our definition of a Markov process. We can discuss several more useful examples to elucidate this concept of dependence on states and how to actually write an equation for the evolution of the state from step to step. I might use the word "time" on some occasions, but basically a step here is a discrete time. A very often cited example is the so-called weather problem. Weather is of course a very complex problem, but we reduce it to an oversimplified type in which we discuss only one dichotomous question: whether tomorrow is going to be rainy or sunny. If it is not rainy, it is sunny; we divide the whole gamut of the weather spectrum into a two-state situation where the weather is either rainy or sunny. Since the states are so clearly defined, we can easily note in a diary how the weather changes from one day to the next.
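The sum P(B_{n+1} | B_n) = sum_k P_s(k) P_d(k + B_{n+1} - B_n) can be evaluated directly. A minimal sketch, using the same hypothetical distributions as before (stored as dictionaries so that unlisted amounts, including negative ones, automatically get probability zero); this covers interior states B_{n+1} > 0, while the boundary state B_{n+1} = 0 would additionally collect all the deficits absorbed by the max(..., 0).

```python
def transition_prob(b_next, b_now, p_d, p_s):
    """P(B_{n+1}=b_next | B_n=b_now) = sum_k P_s(k) * P_d(k + b_next - b_now).

    p_d and p_s map integer amounts to probabilities; missing keys
    (including negative amounts) contribute zero, enforcing positivity.
    Valid for b_next > 0; b_next = 0 would also absorb all deficits.
    """
    return sum(p * p_d.get(k + b_next - b_now, 0.0) for k, p in p_s.items())

# hypothetical distributions, for illustration only
p_d = {3: 0.5, 4: 0.3, 5: 0.2}   # deposit distribution P_d(x)
p_s = {2: 0.4, 3: 0.4, 6: 0.2}   # spending distribution P_s(x)

# probability the balance rises by exactly 1 unit in a month:
# the deposit must exceed the spending by exactly 1
p_up1 = transition_prob(11, 10, p_d, p_s)
```

Note that `transition_prob` looks only at `b_now` and `b_next`, never at the earlier history, which is the Markov property stated in the text.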
So now my steps are days: day 1, day 2, day 3, and so on, with day 0 the very first observation; let us say it is sunny. I observe that the next day is also sunny, then it is rainy, then sunny, then perhaps rainy, and so on; I can keep noting this and develop a sequence. Let us assume further that there is a pattern in this randomness: the transition from today's state, sunny or rainy, to tomorrow's state does not depend on the history. Wherever I look in the sequence, the transition to the next state depends only on whether it is rainy or sunny today, not on the path. Suppose I make that assumption and see whether the data fit; I can always conclude later that they did not fit, or that they fitted fairly well and the simple model works. One always begins by assuming the Markovian character, and that is the advantage. Accordingly, we reduce this problem to a Markovian one and assume the process is Markov; here we genuinely have to assume it, because the real process is much more complicated. We therefore define the transition probability that it is sunny on the (n+1)th day given that it was sunny on the nth day, and call it P_SS. It does not depend on the previous history; it depends only on the two values S_n and S_{n+1}. It is the sunny-to-sunny transition. The probability that it rains tomorrow given that it is sunny today is then 1 - P_SS; we could write it as P_RS, "to R from S", because the weather must transit either to rainy or to sunny, so the two probabilities must total 1.
Similarly, I can define the probability of transition to S tomorrow given that it was rainy today. So far we have discussed the sunny-to-sunny and sunny-to-rainy transitions; now we introduce the rainy-to-sunny transition, denoted P_SR, "to S from R" (with an arrow one would write S <- R), and then the transition to rainy tomorrow given that it was rainy today, which is 1 - P_SR, namely P_RR. So we have four transition probabilities, which define this system. Stochastic examples often admit several compact and useful representations, which are worth learning. A simple one is a table: today the weather is either sunny or rainy, it transits to either sunny or rainy tomorrow, and we assign all the corresponding transition probabilities. For example, let the sunny-to-sunny entry be 0.4; the entry where tomorrow's weather is sunny and today's weather is rainy, that is to-sunny-from-rainy, we take as 0.3. The other two follow: the probability of transition to rainy from sunny is 1 - 0.4 = 0.6, and the probability of transition to rainy from rainy is 1 - 0.3 = 0.7. Written this way, the column sums (over tomorrow's state, for a fixed today) must always be unity, but the row sums need not be. An important thing to note about stochastic processes is that transition probabilities are always defined forward, from today to tomorrow, from step n to step n+1; one does not define a reverse transition probability. This table is one representation; we can also have others.
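The table of four transition probabilities, and the rule that each column must sum to unity while rows need not, can be checked directly. A minimal sketch of the weather table as a nested dictionary indexed "to tomorrow, from today":

```python
# Weather transition table, indexed P[tomorrow][today]:
# columns (fixed today-state) are probability distributions over tomorrow.
P = {
    "sunny": {"sunny": 0.4, "rainy": 0.3},  # to sunny from sunny / rainy
    "rainy": {"sunny": 0.6, "rainy": 0.7},  # to rainy from sunny / rainy
}

# each column must sum to one: tomorrow is certainly sunny or rainy
for today in ("sunny", "rainy"):
    col_sum = sum(P[tomorrow][today] for tomorrow in ("sunny", "rainy"))
    assert abs(col_sum - 1.0) < 1e-12

# the row sums carry no such constraint (here 0.7 and 1.3)
row_sums = {t: sum(P[t].values()) for t in P}
```

The column constraint is just conservation of probability: given today's state, something must happen tomorrow.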
Another representation of the same system is the circle picture we drew earlier: I draw a circle for the sunny state and a circle for the rainy state, and the transition probability from sunny to sunny, 0.4, labels an arrow from the sunny state back to itself. Here I do not have to index which state is today and which is tomorrow, because the direction of the arrow indicates it automatically. The arrow to sunny from rainy carries 0.3, the arrow to rainy from sunny carries 0.6, and the rainy-to-rainy arrow carries 0.7. With these representations we can write a propagation equation for how the probability of it being sunny or rainy evolves. For that we have to define an occupancy probability: as I mentioned, there are two types of probabilities, the transition probability, which is a conditional probability, and the probability of occupying a state. The probability of occupying the sunny state on the (n+1)th day is the probability of occupying the sunny state on the nth day times the probability that the weather remained sunny (the transition probability P_SS, given here as 0.4), plus the probability that it was rainy the previous day times the probability that it transited to the sunny state from the rainy state. It is basically an and-or construction: the probability equals (sunny today AND it stays sunny) OR (rainy today AND it transits to sunny).
That completes the equation for the probability on the next day. Similarly, the probability of finding a rainy day tomorrow is the probability that today is a sunny day times the transition to rainy from sunny, plus the probability that today is a rainy day times the transition to rainy from rainy. These are the propagation equations; this is a simple two-state problem, so we can write them out, but the reasoning remains the same even with many states. For mathematical purposes we can write the whole thing in matrix form. It is convenient to go over to numbers: denote the state S by 1 and the state R by 2, so that subscript 1 means S. The evolution equation is then the matrix whose elements are the transition probabilities, (P_SS, P_SR; P_RS, P_RR), multiplied by the column vector of occupancy probabilities: the occupancy probabilities W_n(1), W_n(2) at the nth step are transformed into W_{n+1}(1), W_{n+1}(2). The matrix is the transition matrix, or transition probability matrix, and the columns of occupancy probabilities are state vectors. One can perform this calculation to obtain tomorrow's probabilities given the transition matrix and the state of the system today. In compact matrix notation, P W_n = W_{n+1}: the state of the system tomorrow is the state of the system today operated on by the transition matrix. For the weather problem the transition matrix is P = (0.4, 0.3; 0.6, 0.7), and the weather is in either state 1 or state 2.
Once we have defined the problem in terms of a transition matrix, we can easily ask how to propagate the weather from today to tomorrow, to the day after tomorrow, and so on. Let our states be S and R, and let us start in state 1: today is certainly a sunny day, so the state vector is (1, 0), and we want the probability of it being sunny or rainy tomorrow from the transition matrix we have defined. From the transition elements, the probability of it being sunny tomorrow is the sunny-to-sunny transition, 0.4, times the probability 1 of being sunny today, plus the rainy-to-sunny transition, 0.3, times the probability of having been rainy today, which is 0; hence 0.4. Similarly, the probability of rain tomorrow is 0.6 x 1 (sunny to rainy) plus 0.7 x 0 (rainy to rainy), which is 0.6. Starting from a definite state simply reproduces a column of the transition matrix. However, when I now propagate to the day after tomorrow, the Markov assumption allows me to start afresh from tomorrow, and the same transition matrix operates on the new state vector, which is no longer (1, 0) but (0.4, 0.6). Doing the multiplication, the probability of sunny is 0.4 x 0.4 + 0.3 x 0.6 = 0.34, and the probability of occupying the rainy state R is 0.6 x 0.4 + 0.7 x 0.6 = 0.66; we are just multiplying the elements of the transition matrix by the previous day's probabilities. So one can march forward: calculate the elements of the state vector, take them as input for the next day's transition, and go on predicting the situation as the days progress.
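The march forward is just repeated application of W_{n+1} = P W_n. A minimal sketch, written out explicitly for the 2x2 weather matrix so that each product matches the arithmetic above:

```python
def propagate(P, w):
    """One step of W_{n+1} = P W_n for a 2x2 transition matrix."""
    return [P[0][0] * w[0] + P[0][1] * w[1],
            P[1][0] * w[0] + P[1][1] * w[1]]

P = [[0.4, 0.3],   # to sunny from sunny, to sunny from rainy
     [0.6, 0.7]]   # to rainy from sunny, to rainy from rainy

w0 = [1.0, 0.0]         # today is sunny for certain
w1 = propagate(P, w0)   # tomorrow: reproduces the first column, (0.4, 0.6)
w2 = propagate(P, w1)   # day after tomorrow: (0.34, 0.66)
```

Iterating `propagate` further predicts the weather probabilities arbitrarily many days ahead, each day's output becoming the next day's input.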
Formally, what we have done is treat this as a matrix equation, W_{n+1} = P W_n, and we can generalize it beyond two states. Suppose I have a system with many states and steps for today, tomorrow, and the day after. If there are many states and I want the probability of occupying a given state, I have to sum over all the states the system could have come from: the probability of transiting into a given state involves the transitions from every state. Let there be s states, with state index i running from 1 to s; the transition matrix is then an s x s matrix, and the occupancy probability of a state j is a sum over all states k from 1 to s: W_{n+1}(j) = sum over k of P_{jk} W_n(k), the probability that the system transits to state j given that it was in state k in the previous step, weighted by the probability of having been in state k. Read explicitly, it says the probability of finding the system in state j at step n+1 is the sum over all intermediate states from which the system could transit into state j. It is that verbal statement converted into a matrix equation; this is, in fact, the expanded version of the matrix equation, and it is made possible by the Markov assumption. In the next lecture we will see that this leads to a very important relationship between transition probabilities, called the Chapman-Kolmogorov equation, which is a cornerstone in developing differential formalisms for random processes and stochastic phenomena, especially continuous ones; it allows us to propagate the transition probabilities from one step to multiple steps. Thank you.
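The general s-state update W_{n+1}(j) = sum_k P_{jk} W_n(k) is a direct extension of the two-state code. A minimal sketch with a hypothetical 3-state chain (the matrix entries are illustrative; each column sums to one, as required):

```python
def step(P, w):
    """General s-state update: W_{n+1}(j) = sum_k P[j][k] * W_n(k)."""
    s = len(w)
    return [sum(P[j][k] * w[k] for k in range(s)) for j in range(s)]

# hypothetical 3-state transition matrix; every column sums to one
P = [[0.5, 0.2, 0.3],
     [0.3, 0.6, 0.3],
     [0.2, 0.2, 0.4]]

w = [1.0, 0.0, 0.0]     # start in state 1 with certainty
for _ in range(50):     # march 50 steps forward
    w = step(P, w)
total = sum(w)          # total probability is conserved at every step
```

Because each column of P sums to one, the components of w always sum to one, whatever the number of states: the matrix form preserves probability automatically.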