So, let us continue our discussion on the Poisson process. In the last class we introduced what a stochastic process is, then we introduced some properties, like what a counting process is and what we mean by independent increments, and based on these properties we defined a process called the Poisson process. Today we are going to look at some properties of this Poisson process and try to prove them; as you will see, these properties can be derived directly from the definition, and they are also very appealing in the sense that when you want to model counting, these are exactly the properties you desire. So, recall the definition: N is a Poisson process with rate lambda means that N is a counting process, N has independent increments, and if you take any two indices s < t, then the number of counts in that interval, N(t) - N(s), has a Poisson distribution with parameter lambda times the length of the interval, lambda (t - s). Today we are going to see the following equivalent characterizations. First, saying N is a Poisson process with rate lambda is equivalent to saying that the inter-count times U1, U2, ... are mutually independent and exponentially distributed with parameter lambda. Remember we have already defined the inter-count times: the time elapsed between two successive counts. For a random process the time between two successive counts is itself random, so I denote these as random variables U1, U2, and so on; they are identically distributed, mutually independent, and the common distribution is the exponential distribution with parameter lambda.
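To see the increment property concretely, here is a minimal simulation sketch; it is not from the lecture, and the rate lam, the endpoints s and t, and the helper count_in_interval are all illustrative choices of mine. The process is generated through its exponential inter-count times (statement B of the theorem), and the sample mean of N(t) - N(s) is compared against the Poisson mean lambda (t - s):

```python
import random

random.seed(0)
lam = 2.0        # rate lambda (illustrative choice)
s, t = 1.0, 3.0  # the interval (s, t]

def count_in_interval(lam, s, t):
    """Simulate one path via Exp(lam) inter-count times; return N(t) - N(s)."""
    clock, count = 0.0, 0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            return count
        if clock > s:
            count += 1

samples = [count_in_interval(lam, s, t) for _ in range(20000)]
mean = sum(samples) / len(samples)
# Theory: N(t) - N(s) ~ Poisson(lam * (t - s)), whose mean is lam * (t - s) = 4.
print(mean)
```

With this many sample paths the empirical mean lands near lambda (t - s) = 4, consistent with the claimed Poisson distribution of the increment.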
And this is the same as saying: if you take any tau > 0, then N(tau), the random variable in the process indexed at time tau, is Poisson distributed with parameter lambda tau; and if you take any n, then the conditional joint density of the first n count times, given that N(tau) = n, is n factorial divided by tau to the power n. Remember, earlier we defined small t1, t2, t3, where t1 was the first count time, t2 the second count time, and so on; but the counts themselves happen at random times, so I am now going to denote the count times as random variables with capital letters T1, T2, ..., Tn. So, we are saying A is the same as B, B is the same as C, and A is the same as C; each one implies the others, and today we are going to try to prove these implications. What the first equivalence is saying is: if I have a Poisson process, then the time between any two successive counts is exponentially distributed with rate lambda, and these inter-count times are mutually independent. Now focus on the second one; this density is conditioned on the event N(tau) = n. Suppose that in the Poisson process, by time tau, exactly n counts have happened; you condition on this, and now you ask for the joint distribution of the count times: the first count happened at time T1, the second count at T2, and the n-th count at Tn. That conditional joint density is n! / tau^n on the ordered set t1 < t2 < ... < tn.
So, notice that I am conditioning on the event that up to time small tau exactly n counts have happened, and the conditional joint density of the count times is n! / tau^n. Now, forget the numerator for a moment and focus on the 1 / tau^n part. If I look at n random variables and it happens that their joint density is 1 / tau^n, can you say something about what that joint distribution looks like? Just take n = 1: I am saying the density is the constant 1 / tau on the interval (0, tau); what does that correspond to? The uniform distribution, right. Now I am saying the joint density of n random variables is 1 / tau^n; what does this imply? There are n uniformly distributed random variables, each on (0, tau), and they are independent, so their densities are getting multiplied, 1 / tau into 1 / tau and so on. Further, I have to look at the ordering of these uniform random variables: I need to condition on the outcome of the first random variable being smaller than the outcome of the second, and the outcome of the second being smaller than the outcome of the third, and so on. If I condition on this ordering, I pick out one of n! equally likely orderings, so a factor 1 / n! appears in the denominator of the probability, and that is why an n! appears in the numerator of the density. So this n! is coming from the fact that I am asking for t1, t2, ..., tn arranged in increasing order. So, in a way, there are two distributions related to my Poisson process. One is the exponential distribution: the inter-count times are exponentially distributed. And if you look directly at the count times themselves and their conditional joint distribution, there is a uniform distribution coming into the picture here.
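The conditional-uniformity claim can also be checked by simulation. The sketch below is my own, not the lecturer's; lam, tau, n = 2 and the sample size are arbitrary choices. Given N(tau) = 2, the pair (T1, T2) should behave like the order statistics of two independent Uniform(0, tau) draws, so in particular T1 should have the mean of the minimum of two uniforms, which is tau / 3:

```python
import random

random.seed(1)
lam, tau, n = 1.0, 5.0, 2

# Rejection sampling: keep only the paths with exactly n counts in [0, tau].
first_times = []
while len(first_times) < 5000:
    times, clock = [], 0.0
    while True:
        clock += random.expovariate(lam)
        if clock > tau:
            break
        times.append(clock)
    if len(times) == n:
        first_times.append(times[0])

mean_t1 = sum(first_times) / len(first_times)
# Theory: the min of 2 i.i.d. Uniform(0, tau) variables has mean tau / 3.
print(mean_t1)
```

The empirical mean of T1 sits near tau / 3, exactly what the order-statistics-of-uniforms picture predicts.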
So, let us now try to prove this. To complete this proposition, or theorem, we need to show that each statement implies the others: A implies B, A implies C, B implies A, B implies C, C implies A, and C implies B. Instead of trying all of these, it is sufficient to establish a cycle: we will show A implies B, then B implies C, and then C implies A; that should be sufficient, right, because each statement then implies every other by going around the cycle. So, let us start by showing that A implies B. Assume N is a Poisson process. As soon as I assume that, all the defining properties are available to me, because that is the meaning of N being a Poisson process. Now take any n real numbers ordered as t1 < t2 < ... < tn, where t2 is larger than t1 and so on. Think of them on the time axis: here is t1, here is t2, all the way up to tn. Now take a small epsilon > 0, and around each point look at a small region: go slightly behind t1 to get t1 - epsilon, similarly go epsilon behind t2 to get t2 - epsilon, and so on up to tn - epsilon. The intervals I want to look at are (ti - epsilon, ti], open at the left endpoint and closed at the right endpoint. Now let us try to find the joint distribution of the count times: these t1, t2, ... are given numbers, while capital T1, the time when the first count happens, and the other Ti are random variables. So let us try to find the probability that Ti belongs to (ti - epsilon, ti] for each i.
So, what is this asking? When I take t1, I am asking for the probability that the first count happens in the interval (t1 - epsilon, t1]. The t1, t2, ... are given numbers; Ti is a random variable, and I am asking the joint question: the i-th count falls in its interval (ti - epsilon, ti], and I want this to happen simultaneously for all i from 1 to n. Now I want to express this event in such a way that I can apply my independent increments property. How can I rewrite this probability to exploit independent increments? Well, what I am basically asking is that the first count should happen in (t1 - epsilon, t1]; if that is the case, then no count should have happened before t1 - epsilon, and exactly one count should happen in (t1 - epsilon, t1]. Then, if I want T2 to fall in its interval, no count should happen in the gap (t1, t2 - epsilon], and the next count should happen in (t2 - epsilon, t2]; then T2 falls in its interval, right. So the joint event is the same as saying: nothing happens in the first gap, the first count happens in the first small interval, nothing happens in the next gap, the second count happens in the next small interval, and so on, with the last count happening only in (tn - epsilon, tn]. You see that I have now expressed this probability in terms of the increments of the process over disjoint intervals. Now I want to expand this. If N is a Poisson process, are all of these events independent? Yes, because I am looking at increments over disjoint intervals, and by the independent increments property of the Poisson process they are all independent. And next I am going to exploit the property that each increment is Poisson distributed.
So, I can write the joint probability as a product: the probability of the first event, times the probability of the second, and so on up to the last one; I have simply applied independent increments to factor the joint probability, right. Now, what is each factor? By the third property of the Poisson process, each increment is Poisson distributed with parameter lambda times the length of the interval. For the first factor I could as well write N(t1 - epsilon) - N(0), and I can take N(0) = 0. What is the length of this interval? It is t1 - epsilon, so this increment is Poisson with parameter lambda (t1 - epsilon), and the probability that it takes the value 0 is e to the power minus lambda (t1 - epsilon). Now, what is the probability that the next increment takes the value 1? The length of that interval is simply epsilon, so it is Poisson with parameter lambda epsilon, and the probability of the value 1 is lambda epsilon times e to the power minus lambda epsilon; there is a 1 factorial in the denominator, which we can skip. Like this you can keep writing all the factors: the last gap contributes e to the power minus lambda (tn - epsilon - t(n-1)), and the final small interval (tn - epsilon, tn], of length epsilon, contributes lambda epsilon e to the power minus lambda epsilon again.
So, now if I simplify all of this: how many factors of lambda epsilon are getting multiplied here? There are going to be n of them. And if you simplify the exponents, you see that everything telescopes: t1 appears once with a plus sign and later with a minus sign, and so on, and what it all collapses to is simply e to the power minus lambda tn. So the joint probability I was looking at is lambda^n epsilon^n e to the power minus lambda tn. Now let us take this probability that Ti belongs to (ti - epsilon, ti] for all i and divide it by epsilon^n; this is going to equal lambda^n e to the power minus lambda tn, and that is what I have shown. So, what am I doing here? For given t1, t2, ..., tn, I am looking, around each ti, at an interval of length epsilon; so what I am basically looking at is the mass of these random variables in a region of what volume? For example, take a simple case: this is my t1 and this is my t2; around t1 I have looked at an interval of length epsilon, and around t2 also an interval of length epsilon. What is this region going to be? I was basically asking for (T1, T2) to lie in a small square whose sides have length epsilon, so what is the area? Epsilon squared, right. If you extend this to n dimensions, the volume is epsilon to the power n. So this is what I was doing: I was asking for the joint distribution to put mass in a region of volume epsilon^n, and I divided that probability by epsilon^n, and what I got is lambda^n e to the power minus lambda tn.
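The telescoping simplification above is easy to check numerically. This is a sketch with arbitrary values of lam, eps and the t_i (none of the numbers come from the lecture): the product of the alternating "zero counts in the gap" and "one count in the epsilon-interval" factors should equal (lam * eps)^n * exp(-lam * t_n):

```python
import math

lam, eps = 1.7, 1e-3
ts = [0.5, 1.2, 2.0, 3.3]   # t_1 < t_2 < ... < t_n, arbitrary ordered values
n = len(ts)

prod = 1.0
prev_end = 0.0              # right end of the previous epsilon-interval
for t in ts:
    # zero counts on (prev_end, t - eps]: Poisson(lam * length) at value 0
    prod *= math.exp(-lam * (t - eps - prev_end))
    # exactly one count on (t - eps, t]: (lam * eps) * exp(-lam * eps)
    prod *= lam * eps * math.exp(-lam * eps)
    prev_end = t

closed_form = (lam * eps) ** n * math.exp(-lam * ts[-1])
print(abs(prod - closed_form) / closed_form)   # essentially zero
```

The exponents cancel pairwise just as in the lecture, so the two expressions agree up to floating-point rounding.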
Now, this epsilon I have chosen is arbitrary, and I can let epsilon tend to 0; even as epsilon becomes arbitrarily small, this ratio stays the same, lambda^n e to the power minus lambda tn. So if I take this ratio and let epsilon tend to 0, what does the ratio converge to? What is this ratio according to our definition? Remember how we defined the PDF and connected it to probability. We said that f(x) at a point x can be thought of, in the one-dimensional case, as the limit as epsilon tends to 0 of the probability that X lies in the interval (x - epsilon, x + epsilon), divided by the length of that interval. We said this when we tried to give an interpretation of what the PDF at a given point means. And we are dealing here with continuous random variables, right: the Ti are the count times, they can take any positive real value, so each one of them is a continuous random variable; yes, there are finitely many of them, but I am talking about each individual one. So, when we gave an interpretation for the PDF, we said that the PDF of a random variable X at a point x can be thought of as the probability of a small neighborhood of x, where that neighborhood is defined in terms of epsilon, normalized by the size of the neighborhood, and we can shrink the neighborhood by letting epsilon go to 0. So if you have this probability and let epsilon go to 0 in this fashion, that ratio is nothing but the PDF.
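Here is a one-dimensional numeric illustration of this PDF-as-a-limit interpretation; it is my own sketch, and the choice of an Exp(lam) distribution, the point x and the epsilon values are arbitrary. The ratio P(x - eps < X <= x + eps) / (2 eps), computed from the CDF, approaches the exact density as eps shrinks:

```python
import math

lam, x = 2.0, 0.7
cdf = lambda s: 1 - math.exp(-lam * s)   # CDF of Exp(lam)
pdf = lam * math.exp(-lam * x)           # exact density at x

for eps in (0.1, 0.01, 0.001):
    ratio = (cdf(x + eps) - cdf(x - eps)) / (2 * eps)
    print(eps, abs(ratio - pdf))         # the error shrinks with eps
```

The error decreases roughly like eps squared, so already at eps = 0.001 the ratio matches the density to several decimal places.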
Now, we can apply the same definition here, except that instead of the real line we are in an n-dimensional space; that is why, instead of dividing by epsilon, we divide by epsilon to the power n, which is the volume of the small region in n-dimensional space. Because of this, we can now write the joint density of the count times: f(t1, t2, ..., tn) = lambda^n e to the power minus lambda tn, provided t1 < t2 < ... < tn, and 0 otherwise. Are you convinced? This is the joint density of these random variables, and this is what we just derived using the properties of my Poisson process. It is clear that we used the second property here, independent increments, and then we used the third property, that the increments have a Poisson distribution. Did we use the first property anywhere, which said that the Poisson process is a counting process? We did use it, partly: one of the properties of a counting process is that N(0) = 0, right, and we used that. Being a counting process means much more, but we used at least part of that definition here. Now, this is not quite what we are after: we have come up with the joint distribution of the count times, but what I am interested in showing for property B is the distribution of the inter-count times. Now, how do we derive the joint distribution of the inter-count times from the joint distribution of the count times themselves? What property can we exploit? Is there a way you can think of? From this, I basically want to go and write down the joint distribution of the inter-count random variables.
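As a quick sanity check on the derived density (a sketch under my own choice of lam; not from the lecture), take n = 2: the function f(t1, t2) = lam^2 e^{-lam t2} on the ordered region 0 < t1 < t2 should integrate to 1. The inner integral over t1 is just t2 times the integrand, and a crude rectangle rule on t2 confirms it:

```python
import math

lam, h, T = 1.3, 0.01, 30.0   # rate, step size, truncation point (arbitrary)
total = 0.0
for k in range(1, int(T / h) + 1):
    t2 = k * h
    # inner integral over t1 in (0, t2) of lam^2 * exp(-lam * t2) dt1
    total += t2 * lam ** 2 * math.exp(-lam * t2) * h
print(total)   # close to 1
```

The sum lands within rectangle-rule error of 1, so the candidate density does integrate to 1 over the ordered region, as any joint density must.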
So, if there is a relation between these U's and these T's, I should be able to apply my transformation-of-random-variables technique and derive the density of the U's, right; we have already studied that. But the question now is: what is the relation between the U's and the T's? We already know it: Ui = Ti - T(i-1). So we know this relation; now, how do we write the joint density of the U's in terms of the joint density of the T's? What is it that we know about this? We know that when we write down the transformed density, we need to compute a Jacobian. So let me write the relation more explicitly: u1 = t1, u2 = t2 - t1, and so on; and we also know the inverse map, right: t1 = u1, t2 = u1 + u2, t3 = u1 + u2 + u3, all of this we know. So how do we write it then? We already know that f(u1, u2, ..., un) equals f(t1, t2, ..., tn) times the absolute value of the determinant of the Jacobian matrix of the transformation. So, can somebody quickly compute what this determinant is here? It is going to be 1. Why is that? The Jacobian matrix of this map is lower triangular with ones on the diagonal; you can just compute that, and whatever we need here has unit determinant. Now, how do I express the density? I have t1, but t1 is u1; and this distribution only depends on tn. What is tn in terms of the u's? tn is nothing but the sum of the ui's, right. Because of that, we can write f(u1, ..., un) = lambda^n e to the power minus lambda (u1 + ... + un) for any u1, ..., un > 0, and this is going to be 0 otherwise.
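The Jacobian claim can be made concrete. The map t_k = u1 + ... + uk is linear, so its Jacobian matrix is constant: lower triangular with ones on and below the diagonal. The determinant of a triangular matrix is the product of its diagonal entries, hence 1; the short sketch below (n = 4 is an arbitrary choice of mine) just spells that out:

```python
n = 4
# Jacobian of (u1,...,un) -> (t1,...,tn) with t_k = u_1 + ... + u_k:
# entry (i, j) is d t_i / d u_j = 1 if j <= i, else 0.
J = [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

# For a triangular matrix the determinant is the product of the diagonal.
det = 1
for i in range(n):
    det *= J[i][i]
print(det)   # -> 1
```

A unit Jacobian means the transformation preserves volume, which is why the density of the U's is just the density of the T's rewritten in the new variables.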
Now, from this, can we conclude what we wanted? I was looking at the joint distribution of U1, U2, ..., Un, and what I have finally shown is that for any u1, ..., un > 0 it can be expressed as lambda^n e to the power minus lambda (u1 + u2 + ... + un). What is this, basically? It is nothing but a product: lambda e to the power minus lambda u1, times lambda e to the power minus lambda u2, and so on. I can split it as a product of n such terms, and what is each term? It is the density of an exponential random variable with parameter lambda. So that is exactly the claim. And why am I saying independent? Because this joint density can be expressed as a product of n densities, each one corresponding to an exponential random variable with parameter lambda. So that is why the claim holds. So, as you see, if I have a Poisson process, that means nothing but this: if you look at the inter-count times, the collection of inter-count times are mutually independent and exponentially distributed with the same rate lambda. So whether I say I have a Poisson process with rate lambda, or I say I have a sequence of inter-count times that are mutually independent and each exponentially distributed with rate lambda, I am basically referring to the same Poisson process with rate lambda.
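To close the loop, the equivalence can be checked end to end by simulation (my own sketch; lam, t and the sample size are arbitrary): build the process from i.i.d. Exp(lam) inter-count times, statement B, and compare the empirical distribution of N(t) with the Poisson(lam t) pmf, statement A:

```python
import math
import random

random.seed(7)
lam, t = 1.5, 2.0   # rate and time horizon (illustrative choices)

def n_of_t(lam, t):
    """Number of counts in [0, t] when inter-count times are Exp(lam)."""
    clock, count = 0.0, 0
    while True:
        clock += random.expovariate(lam)
        if clock > t:
            return count
        count += 1

samples = [n_of_t(lam, t) for _ in range(50000)]
for k in range(5):
    emp = samples.count(k) / len(samples)
    theo = math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
    print(k, round(emp, 3), round(theo, 3))
```

For each k the empirical frequency sits within sampling error of e^{-lam t} (lam t)^k / k!, which is exactly the equivalence the lecture proves.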