In the previous lecture, we transformed a random experiment into a stochastic process by ordering a sequence of random experiments into an increasing sequence of steps. This is the main difference between a stochastic phenomenon and a purely statistical description. Making this transformation opens up interpretations of actual real-world systems; more importantly, it allowed us to introduce the concept of transition probability, which is a unique characteristic of a stochastic process. From now on we take up various types of stochastic phenomena and their descriptions, but before doing that, let us consolidate our understanding of what constitutes a stochastic process. Generally, a stochastic process is characterized by a few ideas put together. The first is, of course, the random variable, which is common to any statistical experiment. One must identify the random variable, and by that we mean an actual physical variable: it could be, for example, the position of a particle, the size of a particle, or the price of a commodity; depending on the problem in question, one can choose any quantity. The second important concept in describing stochastic phenomena is that of states, what is called the state of the system. There is a system whose quantity or quality of interest is to be identified, and that quality defines the states of the system. These states are accessed by the random variable, and they can be discrete, as in the example of the lattice random walk, or continuous, as for example the actual position of a particle. A further distinguishing feature of a stochastic process is the concept of steps, or, if one is dealing with a continuous process, simply time. We will see these things as we go on. Finally, there is the unique ingredient: the concept of transition probability.
To make the idea more concrete: last time we saw that a transition probability is essentially a conditional probability. It states the probability that, given the system is in some state b at time step n, it transits to state a at the next time step n + 1. That probability governs, indeed defines, the rule of the process; it is a very key quantity. These four, I would say, are the key definitions or key issues in understanding or defining a stochastic process. Let us dwell a little more on states and what is meant by a state. When we discussed the random walk on a lattice, the states were those fixed jump positions: starting from the origin, +L, +2L, and so on, and −L, −2L on the other side. These fixed points in space can be called the states of the system, so that one can ask: what is the probability of transition from one lattice point to another? Since the walk is defined on the real line, there is in principle an infinity of states. In physics, for example, I can have energy levels as states, with particles undergoing transitions between various energy levels ordered by n; that can again be called a semi-infinite set of states. One can also have states defined on a circle, with a random walker walking on the circle: here the system is periodic and there is a finite number of states. So states can be infinite or finite. There are many other finite-state examples: if we reduce the problem to a dichotomous one, of the head-or-tail type, then there are only two states, as in coin tossing. If we want to describe the economic status of individuals, or the progression of their economic status, we may classify them simply as rich or poor, again dichotomous. Such a description cannot be complete, but one can start with an analysis like that.
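As a small illustration of the finite, periodic case mentioned above, the following Python sketch simulates a nearest-neighbour random walker on a circle of N sites. The number of sites, the equal-probability step rule, and the integer state labels are illustrative assumptions, not details from the lecture:

```python
import random

N = 8  # number of states on the circle (illustrative choice)

def step(state, rng):
    """One step of a nearest-neighbour walk on a circle of N sites.

    The walker moves to state - 1 or state + 1 with equal probability;
    arithmetic modulo N makes the state space periodic and finite."""
    return (state + rng.choice([-1, +1])) % N

rng = random.Random(0)
state = 0
for n in range(20):
    state = step(state, rng)

# However many steps are taken, the walker always remains inside
# the finite set of states {0, 1, ..., N - 1}.
print(0 <= state < N)
```

Because of the modulo operation, walking "off the end" of the state set simply wraps around, which is exactly what makes the set of states finite rather than the infinite set of the walk on the full line.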
Depending on the transitions from state to state, the whole process can be described in various ways. With a finite number of states we can always represent the transitions diagrammatically: say we have state b and state a. There can be a transition from b to a, a transition from a to b, a transition from a to itself, and a transition from b to itself. What is meant by a transition to itself? The transition from a to a I will denote by p_aa; when I toss a coin, or when I take a step, and the particle persists in state a, that can rightfully be called a transition from a to a. Similarly, p_bb is the transition from b to b, p_ab is the transition to a from b, and p_ba is the transition to b from a. Here we use, as much as possible, the same convention we used in defining conditional probability: the state to which the transition occurs is written first, and the state from which it occurs is written second, so p_ab is read as p(a given b). With this we have all the transition probabilities defined for this dichotomous, two-state process with states a and b. In reality a stochastic phenomenon may proceed over clusters of such states, and depending on the character of the transition probabilities one says that states are recurrent, transient, or absorbing; each of these can be explained by a diagram. Suppose I have a state 1, then a cluster of states which we label 2, 3, 4 and 5, and then another cluster of states, say 6, 7 and 8, with various transitions among them, of which I indicate a few with various probabilities.
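The four two-state transition probabilities above can be collected into a single rule for drawing the next state. A minimal sketch, in which the state labels "a" and "b" follow the lecture but the numerical probability values are invented for illustration:

```python
import random

# Transition probabilities p[(to, from)], following the lecture's
# convention that the destination state is written first.
p = {
    ("a", "a"): 0.7,  # p_aa: persist in a (illustrative value)
    ("b", "a"): 0.3,  # p_ba: transition to b from a
    ("a", "b"): 0.4,  # p_ab: transition to a from b
    ("b", "b"): 0.6,  # p_bb: persist in b (illustrative value)
}

# Out of each state, the probabilities of going somewhere must sum to one.
for s in ("a", "b"):
    assert abs(p[("a", s)] + p[("b", s)] - 1.0) < 1e-12

def step(state, rng):
    """Draw the next state from the current one using p(to | from)."""
    return "a" if rng.random() < p[("a", state)] else "b"

rng = random.Random(0)
print(step("a", rng))  # prints either 'a' or 'b'
```

The normalization check makes explicit a constraint the lecture uses implicitly: from any state, the system must go to some state (possibly itself) at the next step.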
Now suppose we call state 1 the set a, the cluster of states 2, 3, 4, 5 the set b, and the cluster 6, 7, 8 the set c. From the set b there may be a transition from one of its elements to state 1, but no reverse transition; we then call state 1 an absorbing state. Similarly, from b there could be a transition to one of the states of c, with no return transition; we will see its character shortly. Now let us see what happens if I place a particle in one of the states of set b at t = 0, or step n = 0. The particle would of course undergo internal transitions, moving among the states according to the probabilities; but since there exists a transition from state 4 to state 1, at some time in the course of this it must get out, either to state 1 or to set c. If it gets out to state 1, the set a, it is permanently absorbed, because there is no way for it to undergo any further change: it remains within state 1 itself. On the other hand, if it transits to set c, which consists of a cluster of very many states, it will keep undergoing internal transitions there, but it will never come back to b. In any case, a particle starting in set b will eventually be lost from it, and hence the states of b are transient states: they are occupied only for a limited time. The states within set c, to which the particle keeps returning, can be called recurrent states. This example demonstrates what an absorbing state is, how it connects to transient states, and what the character of a recurrent state is. A physical example I can give is this: suppose I have a box containing, say, an air medium, connected on one side to a liquid medium.
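The behaviour just described can be checked with a small simulation. In this sketch, state 1 is absorbing, states {2, 3, 4, 5} form the transient set b, and {6, 7, 8} form the recurrent set c. Only the connectivity follows the lecture's diagram; the particular reachable-state lists, and the uniform choice among them, are illustrative simplifications:

```python
import random

# next_states[s] lists the states reachable from s in one step
# (uniform choice among them, an illustrative simplification).
next_states = {
    1: [1],                  # absorbing: the only transition is to itself
    2: [3, 4], 3: [2, 5],    # internal transitions within set b
    4: [5, 1, 6],            # state 4 can leak out of b, to 1 or to set c
    5: [2, 3],
    6: [7], 7: [8], 8: [6],  # set c: a recurrent cycle, no way back to b
}

def run(start, steps, rng):
    """Evolve the chain for a given number of steps and return the final state."""
    s = start
    for _ in range(steps):
        s = rng.choice(next_states[s])
    return s

rng = random.Random(0)
final = run(2, 1000, rng)
# After many steps, a walker started inside the transient set b
# should no longer be found in b: it has been lost to state 1 or to set c.
print(final)
```

Starting the chain in state 1 instead shows the meaning of absorption directly: `next_states[1] == [1]`, so once there, the walker never changes state again.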
On the other side, let us say, it is connected to a solid: there is a solid interface, a liquid-air interface, and the rest is air. Suppose I place a Brownian particle in the air. The property of this system is that if the particle touches either the solid surface or the liquid surface, it is permanently lost: it is no longer able to come back to the air space. As the particle undergoes its random walk, in the course of the walk it may contact the liquid. Within the liquid it can continue to have Brownian motion, but, assuming it is completely non-volatile there, it will never come back to the air; it is permanently confined to the liquid, which is then equivalent to a set of recurrent states, while the air becomes a set of transient states. If instead it sticks to the solid, then most often the very first contact makes a very strong bond and the particle is simply absorbed there, with no recurrence; the solid surface is then an absorbing state. This is just a picturesque example of the three phenomena, but they can occur in any process under discussion; they are not limited to physical systems and can arise in finance, in prices, anywhere. Having come this far, there are two other characteristics which we will perhaps discuss later. One is the so-called stationarity of a system: we call a stochastic phenomenon a stationary process, as opposed to a non-stationary process, and we will explain the distinction through examples as the subject advances. To give an introductory idea, a stationary process is basically one in which the basic transition probabilities do not depend on time: they are invariant in time, which means the process occurs in the same fashion at whatever point in time we start observing the system. There are many possible paths, by virtue of the process being stochastic, but the constants of the system, such as the transition probabilities, remain constant in time.
This is one simple way to understand the concept of stationarity of a stochastic process. Given this background, a stochastic process is a process in which a random variable accesses different states, at different steps, governed by the transition probability; all four definitions enter in defining a process. There could be many ways in which the system transits from one state to another, and we can identify three ways of transition, or I would say of evolution, of a stochastic process. The first is that the transition to a future state from the present state, which goes without saying, depends on neither the states nor the steps. Successive tossing of coins is a simple example: the probability that a head or a tail turns up does not depend on at what point in the experiment we are deciding it, nor on whether a head or a tail fell previously. It is independent of the state as well as of the step. That is one way. The second way is that the transition depends only on the present state of the system. Previously it depended on no state at all; here we relax that and say it may depend on the present state, though still not on the step number. We will explain exactly what this means, but in brief: the particle is at some site, and in the random experiment we must decide what it does next, which site it accesses. Any probability we assign to that decision will depend only on the present state. It will not depend on all the past states through which the particle has passed, nor on at what step n, or at what time, we are discussing this transition. This is the so-called present-state case, case number 2.
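The first, fully independent case can be demonstrated numerically. This sketch simulates successive coin tosses and estimates the conditional probability of a head given the previous outcome; for independent tosses both conditional estimates come out near 1/2, confirming that the transition depends on neither the state nor the step. The sample size and tolerance are arbitrary choices:

```python
import random

rng = random.Random(42)
tosses = [rng.choice("HT") for _ in range(100_000)]

def cond_prob_head(prev):
    """Estimate P(head at step n+1 | outcome 'prev' at step n)."""
    following = [b for a, b in zip(tosses, tosses[1:]) if a == prev]
    return sum(1 for t in following if t == "H") / len(following)

p_after_h = cond_prob_head("H")
p_after_t = cond_prob_head("T")

# Both conditional estimates sit close to 1/2: knowing the previous
# outcome (the "state") tells us nothing about the next toss (case 1).
print(abs(p_after_h - 0.5) < 0.02 and abs(p_after_t - 0.5) < 0.02)
```

A case-2 (present-state-dependent) process would show a clear gap between `p_after_h` and `p_after_t`; the absence of such a gap is what independence of state means operationally.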
Similarly, we can have a third situation, in which the transition, meaning the transition probability, depends on the path: on the history, the past states accessed. For example, take the random walk. At some nth step the walker is in some state x_n, and we want it to make a transition; there are many possible next states x_{n+1}. By this point it would have reached the state x_n from some initial state x_1 via various paths. For description's sake I will draw these as continuous paths, although they are really disjoint, non-smooth paths; there could have been many types of paths, all of them leading to x_n, so the walker has a history. In the first case we said that the probability of transition to the next state, to x_{n+1} or beyond, depends on neither the present state nor the history; that was case 1, and coin tossing was the example. Case 2 says: forget the past history; the probability may depend on x_n, and perhaps also on the next state it is going to jump to. The probability of transition from x_n to x_{n+1} could, for example, depend on the difference between these two states: if numbers are assigned to the states, say they are energy levels, it could depend on the difference in energy between the states. In any case it would not depend on the past history; that is situation number 2. Situation number 3 is that the transition depends on the history, the path covered in order to reach x_n. As you can see, situation 3 is the most complex; there is no easy way of describing it, and we will not be dealing with this kind of process in this course.
We will specifically focus on situation 2, which is celebrated by being called a Markov process. Markov was a very famous statistical theorist; his proof of the central limit theorem, among other results, is well known, but his greatest contribution to this area is to have identified a class of processes for which we are able to carry forward the formulation of stochastic phenomena and solve them. Most processes in the world, in a very fundamental sense, could be non-Markovian; they have to be addressed differently, which is not to say they are insoluble, but we will not be dealing with them here. Our random walk problem is, of course, Markovian: the transition probability between any two states did not depend on the previous path, and a transition occurred from the present state m to a future state m′ only when m′ was one of the two nearest neighbors of m. This can be put mathematically as P(m′ | m) = (1/2) δ_{m′, m−1} + (1/2) δ_{m′, m+1}, where δ is the Kronecker delta. In random walk problems, transitions can also occur to sites other than nearest neighbors, and the process would still be Markovian, as long as the transition probabilities between any two states depended only on the two states and not on the history. For simplicity, let us use the notation x_n to denote the realization of a state x at the nth step. In general, the transition probability P between states could be a function of the history, and must then be represented with the full conditioning: the complete representation of a transition would be P(x_{n+1} | x_n, x_{n−1}, x_{n−2}, …, x_1), the probability of x_{n+1} given that the system was at x_n at the previous step, at x_{n−1} before that, and so on, having started from a point x_1.
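The nearest-neighbour transition probability written above can be transcribed directly into code. A minimal sketch of P(m′ | m) = (1/2) δ_{m′, m−1} + (1/2) δ_{m′, m+1}, with a hypothetical site index m = 3 used only for the demonstration:

```python
def kronecker(i, j):
    """Kronecker delta: 1 if i == j, else 0."""
    return 1.0 if i == j else 0.0

def P(m_next, m):
    """Nearest-neighbour random-walk transition probability:
    P(m' | m) = (1/2) delta_{m', m-1} + (1/2) delta_{m', m+1}."""
    return 0.5 * kronecker(m_next, m - 1) + 0.5 * kronecker(m_next, m + 1)

m = 3  # any site; 3 is an arbitrary example
# Only the two nearest neighbours are reachable, each with probability 1/2;
# staying put or jumping two sites has probability zero.
print(P(m - 1, m), P(m + 1, m), P(m, m), P(m + 2, m))  # -> 0.5 0.5 0.0 0.0
```

Summing `P(m_next, m)` over all `m_next` gives exactly 1, so the two Kronecker deltas by themselves guarantee a properly normalized transition rule.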
This is the complete statement of the transition probability in general. When we say that a process is Markovian, all we are saying, therefore, is that the probability that at step n + 1 the system accesses the value x_{n+1}, coming from x_n at step n, having invariably passed through x_{n−1} at step n − 1 and so on (the subscript automatically denotes the step, so I omit writing "at step n − 1"), having started with x_1, reduces from this complicated joint conditional probability to a simple function of only x_{n+1} and x_n: P(x_{n+1} | x_n, x_{n−1}, …, x_1) = P(x_{n+1} | x_n). The statement of a Markovian process is thus an assumption one makes about the nature of the transition probability and its dependence on paths; it is also therefore a characterization, and the processes which satisfy it belong to the Markovian class. It helps us immensely in formulating transition probabilities, and more precisely it helps us formulate the concept of a one-step transition probability. We will see these things in the next lectures. Thank you.
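The Markov reduction can be checked numerically for the nearest-neighbour walk. This sketch (sample size, seed, and tolerance are arbitrary choices) estimates the probability of stepping up, first conditioned additionally on the previous step and then unconditioned; for a Markovian walk the extra conditioning on history changes nothing:

```python
import random

rng = random.Random(1)
x = [0]
for _ in range(200_000):
    x.append(x[-1] + rng.choice([-1, +1]))  # simple symmetric random walk

# Conditioned on history: P(x_{n+1} = x_n + 1 | previous step was also +1).
steps_after_up = [x[n + 1] - x[n] for n in range(1, len(x) - 1)
                  if x[n] - x[n - 1] == +1]
p_given_up = sum(1 for d in steps_after_up if d == +1) / len(steps_after_up)

# Unconditioned: overall probability of stepping up.
steps = [x[n + 1] - x[n] for n in range(len(x) - 1)]
p_plain = sum(1 for d in steps if d == +1) / len(steps)

# Markov property: knowing the previous step (part of the history)
# does not shift the transition probability away from its plain value.
print(abs(p_given_up - p_plain) < 0.02)
```

For a genuinely history-dependent (case 3) process, such as a walk with momentum or memory, `p_given_up` would differ measurably from `p_plain`; the agreement here is the numerical face of P(x_{n+1} | x_n, …, x_1) = P(x_{n+1} | x_n).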