Let me quickly go over the inter-arrival times that we discussed in the last lecture, where we obtained their distribution. We already showed that f_{X1}(t) = λe^(−λt), where X1 denotes the arrival time of the first event; so the first inter-arrival time is exponential with parameter λ. Now, to compute the distribution of X2, let us look at P(X2 > t | X1 = s). The idea is this: the first event occurred at time s, and X2 is the time from the first event to the second, and we are saying that this time exceeds t. If X2 > t, that means there is no arrival in the interval (s, s + t]. So P(X2 > t | X1 = s) = P(no arrivals in (s, s + t] | X1 = s). But the process has independent and stationary increments, so this probability is the same no matter where the interval of length t sits — it does not matter when the first event took place. This is the memoryless property we have already shown: the conditioning on X1 = s has no bearing on the probability, and the conditional probability equals P(no arrivals in an interval of length t) = e^(−λt).
So P(X2 > t | X1 = s) = e^(−λt). In terms of the distribution function, this probability is 1 − F_{X2}(t) = e^(−λt). Differentiating both sides gives −f_{X2}(t) = −λe^(−λt); the minus signs cancel, so f_{X2}(t) = λe^(−λt), and we have shown that X2 is also exponential(λ). Repeated use of this argument — essentially we are using the memoryless property — works for X3, X4, and so on. So we end up with this proposition: the sequence of inter-arrival times of a Poisson process consists of independent and identically distributed exponential(λ) random variables. Remember, we assumed independent increments and stationary increments for the Poisson process. In other words, the process probabilistically restarts itself: every time an event occurs, it starts afresh, with no memory of when the last event occurred. Now, a word about the parameter λ. Since each X_i is exponential(λ), E[X_i] = 1/λ. The theory of the exponential distribution places no restriction on λ beyond λ > 0, so λ can be any positive number. But note that a high λ corresponds to a small average waiting time: if λ is large, then 1/λ, which is the expected inter-arrival time, is small, and so the arrivals occur at short intervals.
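Before moving on, the memoryless property invoked above can be checked by simulation. A minimal sketch (the rate λ = 2 and the times s, t are arbitrary choices, not from the lecture): the estimated P(X > s + t | X > s) should agree with P(X > t).

```python
import random

random.seed(0)
lam = 2.0                # hypothetical rate, chosen just for this check
n = 200_000
samples = [random.expovariate(lam) for _ in range(n)]

s, t = 0.5, 0.7
# P(X > t), estimated directly
p_uncond = sum(x > t for x in samples) / n
# P(X > s + t | X > s), estimated among the samples that survive past s
survivors = [x for x in samples if x > s]
p_cond = sum(x > s + t for x in survivors) / len(survivors)

print(round(p_uncond, 3), round(p_cond, 3))  # the two estimates should agree
```

Both estimates hover around e^(−λt), which is exactly the statement P(X2 > t | X1 = s) = P(X2 > t).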
So when λ is high, the average waiting time between two consecutive occurrences is small: a large λ gives a small expected value 1/λ, so the inter-arrival times are small on average and the arrivals occur at short intervals. Conversely, if λ is small, then 1/λ is big, the average inter-arrival time is large, and the events occur at long intervals. For this reason λ is called the intensity of the process. Although λ can be any positive number, the modeling context constrains it. Think of earthquakes in Indonesia, say, and take one year as the unit of time; suppose I am counting the number of earthquakes over a span of 10 years. Then λ should not be large, because a large λ would mean 1/λ is small, which would say that earthquakes occur at short intervals of time. But we all know that earthquakes, unpredictable as they are, do not normally occur very often. So one has to be careful: this interpretation of λ gives you insight into how the choice of λ should be made when you go about modeling a process. In contrast, when a radioactive material emits particles, the intensity is high and 1/λ is small.
The radioactive particles come very fast, so if you count how often a particle arrives, the inter-arrival times will be small and λ will be high. That gives you the idea; you can look at many different examples and see how the value of λ reflects the average inter-arrival time. Similarly, look at the quantity λt: it is the expected number of events in a time period of length t, so λ is the mean arrival rate. Now, the numbers of events in two disjoint time intervals are independent — we just said the process has independent increments. So the number of arrivals between 0 and 1 is Poisson(λ), and between 1 and 2 it is again Poisson(λ). If you look at the interval (0, 2], the count becomes Poisson(2λ), because by now we have shown by several methods that if X1 is Poisson(λ) and X2 is Poisson(λ) and they are independent, then X1 + X2 is Poisson(2λ) — the parameters add. So in (0, 2] the number of arrivals is Poisson(2λ), and in general in (0, t] it is Poisson(λt); this is the whole idea. Of course t can be fractional, and the interpretation is the same.
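The claim that N(t) is Poisson(λt) can be checked by simulation: build the process from exponential(λ) gaps and verify that the count over (0, t] has mean and variance close to λt (equal mean and variance being the Poisson signature). The values λ = 3 and t = 2 below are arbitrary choices for the check.

```python
import random

random.seed(1)
lam, t = 3.0, 2.0        # hypothetical rate and horizon

def poisson_count(rate, horizon):
    """Count arrivals in (0, horizon] by accumulating exponential gaps."""
    n, clock = 0, random.expovariate(rate)
    while clock <= horizon:
        n += 1
        clock += random.expovariate(rate)
    return n

counts = [poisson_count(lam, t) for _ in range(100_000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(round(mean, 2), round(var, 2))  # both should be close to lam * t = 6.0
```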
Now, another thing: since we are talking of stationary increments, the arrivals over (0, t] are distributed in a uniform way. When I count N(t), the number of arrivals in the span 0 to t, the way the process is modeled, an arrival can occur anywhere in that period, and the best way to model that is that, given the number of arrivals, the arrival instants are uniform on the interval (0, t). Through an example I will try to make this concept clearer. After computing the distribution of the inter-arrival times, another quantity of interest is S_n = Σ_{i=1}^{n} X_i, the waiting time for the n-th event to occur. You are adding up X1, X2, ..., Xn, where X_n is the inter-arrival time between the (n−1)-th and the n-th event; so S_n is the total waiting time for the n-th event. On the time line, starting from 0, you have X1, then X2, and so on up to X_n; at that point the n-th event has occurred, and the total elapsed time is what we denote by S_n — the waiting time for the n-th event. For whatever process you are looking at — the time for a particular volcano to erupt, or for the n-th earthquake to occur — you can interpret S_n accordingly.
Now, from the independence of the X_i's and their being identically distributed as exponential(λ) — and in the last few lectures we have studied sums of independent random variables through convolutions, MGFs and so on — it immediately follows that S_n is gamma(n, λ): you are summing n i.i.d. exponential(λ) random variables, so the sum is gamma(n, λ). Therefore the density is f_{S_n}(t) = λe^(−λt)(λt)^(n−1)/(n−1)! for t ≥ 0. As I said, it always helps to be able to use the other tools we have developed, and of course the MGF of a gamma random variable has already been computed: M_{S_n}(s) = (λ/(λ − s))^n. (While writing the MGF of S_n in the lecture, the subscript S got written by mistake; it is the MGF of S_n, evaluated at s, which is (λ/(λ − s))^n.) Let us also look at this in an alternate way, which is interesting. Be very clear about the following equivalence: N(t) ≥ n if and only if S_n ≤ t. That is, if the number of arrivals by time t is at least n, then the time S_n of the n-th event is at most t, and vice versa. So when you want the distribution function of S_n, since the two events are the same, P(S_n ≤ t) = P(N(t) ≥ n).
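The identity just stated, P(S_n ≤ t) = P(N(t) ≥ n), can be checked numerically before we differentiate it: compute the Poisson tail directly and compare it with the gamma(n, λ) CDF obtained by integrating the density. A small sketch (the values λ = 2, n = 4, t = 3 are arbitrary):

```python
import math

lam, n, t = 2.0, 4, 3.0   # hypothetical parameters for the check

# P(N(t) >= n): one minus the first n Poisson(lam*t) probabilities
mu = lam * t
poisson_tail = 1.0 - sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(n))

# P(S_n <= t): integrate the gamma(n, lam) density by the trapezoid rule
def gamma_pdf(x):
    return lam * math.exp(-lam * x) * (lam * x) ** (n - 1) / math.factorial(n - 1)

steps = 100_000
h = t / steps
cdf = h * (gamma_pdf(0) / 2 + sum(gamma_pdf(k * h) for k in range(1, steps)) + gamma_pdf(t) / 2)

print(round(poisson_tail, 6), round(cdf, 6))  # the two numbers should coincide
```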
Since N(t) is a Poisson random variable with parameter λt, this probability is Σ_{i=n}^{∞} e^(−λt)(λt)^i / i!. Now differentiate both sides of this equation with respect to t. On the left-hand side you get the pdf of S_n; on the right, differentiate term by term. Each term contributes −λe^(−λt)(λt)^i / i! from the exponential factor, plus e^(−λt) λ^i · i t^(i−1) / i! = λe^(−λt)(λt)^(i−1)/(i−1)! from the power of t, after cancelling the i into i!. Take e^(−λt) outside the summation from n to infinity and write out the first few terms. For i = n you get −λ(λt)^n / n! and +λ(λt)^(n−1)/(n−1)!. For i = n + 1 you get −λ(λt)^(n+1)/(n+1)! and +λ(λt)^n / n!, and the latter cancels the first negative term. For i = n + 2 you get −λ(λt)^(n+2)/(n+2)! and +λ(λt)^(n+1)/(n+1)!, which cancels the previous negative term. You can see the pattern: the sum telescopes — the first negative term cancels against the next positive term, and so on — and everything cancels except the lowest-degree term, since the powers of t keep increasing.
So the only term left is e^(−λt) · λ(λt)^(n−1)/(n−1)!, which is the gamma(n, λ) density — the same as before. I wanted you to make use of this route too: once you have generated so many tools, it is always possible to prove a result in more than one way, and it gives you better insight if you can do that. From the gamma distribution, E[S_n] = n/λ and Var(S_n) = n/λ². Now we will prove some more properties of the Poisson process and then work out examples to show you how to use all this machinery that we have developed. For example, take a Poisson process {N(t), t ≥ 0} and suppose there are two sub-processes. If you remember, while discussing the joint MGF I talked about a Poisson process in which each event is counted with probability p and not counted with probability 1 − p, and I showed through the MGF that each resulting process is again Poisson. Call the two sub-processes type 1 and type 2, with a type 1 occurrence having probability p and type 2 probability 1 − p. The only correction I want to make is this: since we are talking of the arrival count over the interval (0, t], the parameter of N1(t) is λpt, and similarly N1(t) will be Poisson — this we showed through the MGF, that both will again be Poisson.
So the sub-processes satisfy: N1(t) is Poisson(λpt), and for the type 2 process the random variable N2(t) is Poisson(λ(1 − p)t). What I wrote in the lecture was without the t everywhere; that is the correction being made — otherwise I explained what we mean by these sub-processes in the lecture itself. Here I will prove the same result again using the defining properties of the process, since the MGF argument we already know. The setting is this: you may be considering, say, immigrants from another country, and the immigrants may be Hindus, Muslims, and so on. The total process of immigrants coming from the other country may be a Poisson process, and you may want to separate the arrivals into two streams — say Hindus and Muslims. Then a type 1 arrival (the immigrant being Hindu, say) occurs with probability p, and a type 2 arrival (the immigrant being Muslim) with probability 1 − p. So we will now prove the result in an alternate way. The proposition is: if {N(t)} is a Poisson process counting the arrivals up to time t, and there are type 1 and type 2 sub-processes — a type 1 event occurring with probability p and a type 2 event with probability 1 − p — then N1(t) is Poisson with parameter λpt and N2(t) is Poisson with parameter λ(1 − p)t. As I told you, we have already shown this result using the joint MGF, but let me do it through the definition.
We will show that {N1(t), t ≥ 0} satisfies Definition 2, which, remember, we said is the more easily verifiable one; so let us do it quickly. Since N(0) = 0 and N(t) = N1(t) + N2(t), we get N1(0) = 0: if the sum is 0, both terms must be 0. The other part is independent and stationary increments, which can also be easily seen: if I condition on N(t) = n, the arrivals still depend only on the length of the interval and are independent of what has occurred before — that is the memorylessness — so conditioning does not disturb the independent-increment or the stationary-increment property. Therefore N1(t) satisfies both. Now we want property 3, concerning P(N1(h) = 1). Break up the event {N1(h) = 1} according to the total number of arrivals: either N(h) = 1 or N(h) ≥ 2. So P(N1(h) = 1) = P(N1(h) = 1 | N(h) = 1) P(N(h) = 1) + P(N1(h) = 1 | N(h) ≥ 2) P(N(h) ≥ 2). I like this proof because we are able to show the result just from the basic definition of the process. Given that there is one total arrival, the probability that it is of type 1 is p; and since {N(t)} is anyway a Poisson process satisfying the definition, P(N(h) = 1) = λh + o(h). So the first term is p(λh + o(h)); and the second term is at most P(N(h) ≥ 2), which satisfies condition 4.
So the second term is o(h), and altogether P(N1(h) = 1) = λph + o(h). Remember, when we say a function is o(h), constant factors are allowed — what matters is that it involves higher powers of h, so as h becomes smaller it goes to 0 faster than h — and therefore the p gets absorbed into the o(h). So this satisfies the definition with λp as the rate of arrival: the probability of one arrival in a small interval is λp·h + o(h). The remaining condition, P(N1(h) ≥ 2) = o(h), is also satisfied, because N(h) = N1(h) + N2(h) implies P(N1(h) ≥ 2) ≤ P(N(h) ≥ 2) = o(h). A nice, simple proof, and I like it. Of course we have already done it through the MGF method, but that was for a general situation; here we are doing it for a Poisson process. The argument certainly extends to more than 2 sub-processes — of course the probabilities must add up to 1, which they will — and all the sub-processes will be independent. Now, how do I show that N1 and N2 are independent? By a similar argument you show that N2(t) is also Poisson with parameter λ(1 − p)t, and then you can use the joint MGF method to show that they are independent. So this method shows that N1(t) is Poisson(λpt) and N2(t) is Poisson(λ(1 − p)t); to show independence, use the MGF method. Now let us look at an example.
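The thinning result just proved can also be checked by simulation: generate the full process from exponential gaps, classify each arrival as type 1 with probability p, and verify that the type 1 count over (0, t] has mean and variance near λpt. The values λ = 4, p = 0.3, t = 5 below are arbitrary choices for the check.

```python
import random

random.seed(2)
lam, p, t = 4.0, 0.3, 5.0   # hypothetical parameters
trials = 50_000

type1_counts = []
for _ in range(trials):
    # generate the full Poisson process on (0, t] via exponential gaps
    clock, n1 = random.expovariate(lam), 0
    while clock <= t:
        if random.random() < p:   # classify each arrival as type 1 w.p. p
            n1 += 1
        clock += random.expovariate(lam)
    type1_counts.append(n1)

mean = sum(type1_counts) / trials
var = sum((c - mean) ** 2 for c in type1_counts) / trials
print(round(mean, 2), round(var, 2))  # both should be near lam * p * t = 6.0
```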
Suppose that people from Bangladesh migrate into the northeastern states of India at a Poisson rate of λ = 5 per day. The first question is: what is the expected time until the 15th immigrant arrives? You are asking for S15, and S15 = X1 + X2 + ... + X15 is gamma(15, 5) by the result we arrived at some time ago. Remember that the expected value of a gamma(n, λ) is n/λ, so E[S15] = 15/5 = 3 days. That is the expected time until the 15th immigrant arrives. Next: what is the probability that the elapsed time between the 15th and 16th arrivals exceeds 2 days? Here you are asking about X16, because X16 is the inter-arrival time between the 15th and the 16th arrival. So you want P(X16 > 2), which is e^(−2λ). This comes from the exponential distribution: integrating λe^(−λt) from a to infinity gives e^(−aλ). Since λ = 5, P(X16 > 2) = e^(−10). You can compute the value by writing it as (e^(−2))^5, and since e^(−2) ≈ 0.135, raising it to the 5th power gives about 4.5 × 10^(−5). Finally: if a Bangladeshi immigrant is Hindu with probability 1/10, what is the probability that no person of Hindu origin will migrate to the northeastern region in the month of March? This is just to show you the use of what we just discussed.
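The first two answers can be checked in a couple of lines; a minimal sketch using the formulas above (E[S_n] = n/λ for the gamma waiting time, and the exponential tail e^(−aλ) for the gap):

```python
import math

lam = 5.0                      # arrivals per day

# (a) expected time until the 15th immigrant: E[S_15] = n / lam
e_s15 = 15 / lam               # 3.0 days

# (b) P(X_16 > 2) = exp(-2 * lam) = exp(-10)
p_gap = math.exp(-2 * lam)

print(e_s15, p_gap)
```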
Here the relevant parameter is λpt, with λ = 5, p = 1/10, and t = 31 days, since March has 31 days. So the probability that no Hindu arrives in that period is e^(−λpt) = e^(−31/2), and you can compute this number. That is the whole idea; we will look at some more properties of the Poisson process and work out a few more examples. Now let us look at this example, where we compute the conditional distribution of N(s) given N(t) = n, with s < t: given that there are n arrivals in (0, t], what is the distribution of N(s)? Through all these examples I am just trying to familiarize you with the workings of the process and the machinery we are developing; in the process it makes the subject quite interesting. The question also asks: once you obtain the conditional distribution, do you recognize it? You are given N(t) = n and have to find P(N(s) = k | N(t) = n). Now, if there are k arrivals up to time s and n arrivals up to time t, then obviously the number of arrivals in the interval (s, t], of length t − s, is n − k — that is how the count up to time t makes up to n. And by now we know we only have to worry about the length of the interval, not where it sits. So you can write P(N(s) = k | N(t) = n) as the joint probability of {N(s) = k} and {N(t) − N(s) = n − k}, divided by P(N(t) = n).
By the independent-increments property, the joint probability for the disjoint intervals (0, s] and (s, t] can be written as the product of the two probabilities, and the denominator is P(N(t) = n). So the probability is [e^(−λs)(λs)^k / k!] · [e^(−λ(t−s))(λ(t−s))^(n−k) / (n−k)!] divided by [e^(−λt)(λt)^n / n!]. As long as it is clear that the conditional probability can be written this way and the numerator factors into these two probabilities, the rest is simplification. The exponential factors cancel: e^(−λs) · e^(−λ(t−s)) = e^(−λt), matching the denominator. The n! comes up to the numerator, giving n!/(k!(n−k)!) · (λs)^k (λ(t−s))^(n−k) / (λt)^n. Then λ^k · λ^(n−k) = λ^n cancels with the λ^n in the denominator. Writing t^n = t^k · t^(n−k), the t^k couples with s^k to give (s/t)^k, and the rest gives (1 − s/t)^(n−k). So the whole expression simplifies to (n choose k)(s/t)^k (1 − s/t)^(n−k). Now you can recognize this: if you treat p = s/t, this is the binomial probability of choosing k items out of n.
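The factorization above can be verified numerically. A small sketch (s, t, n chosen arbitrarily), looping over two different intensities to show the result does not change:

```python
import math

s, t, n = 2.0, 5.0, 10     # hypothetical values for the check

def poisson_pmf(mu, k):
    return math.exp(-mu) * mu**k / math.factorial(k)

for lam in (0.5, 7.0):     # two different intensities: the answer should not change
    conds = []
    for k in range(n + 1):
        # conditional probability from the Poisson factorisation
        cond = (poisson_pmf(lam * s, k) * poisson_pmf(lam * (t - s), n - k)
                / poisson_pmf(lam * t, n))
        # binomial(n, s/t) pmf
        binom = math.comb(n, k) * (s / t) ** k * (1 - s / t) ** (n - k)
        assert abs(cond - binom) < 1e-12
        conds.append(cond)
print("P(N(s)=k | N(t)=n) matches Binomial(n, s/t) for every lambda tried")
```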
That means you are asking for k successes in n independent trials with success probability s/t. The more important thing is that this conditional distribution is independent of λ: no matter what the parameter of the Poisson process is, the conditional probability depends only on the time lengths s and t. I am sure there are many more interesting implications of this result, but again, you can get it nicely just by using the definitions. Now let me take up an interesting optimization problem; here again the machinery is not very complicated. Suppose that items arrive at a processing plant in accordance with a Poisson process with rate λ. At a fixed time T, all items are dispatched from the system. The problem is to choose an intermediate time t ∈ (0, T) at which all items currently in the system are dispatched: the items that have arrived by time t get dispatched at t, and those arriving between t and T get dispatched at T — so they do not all have to wait until time T. The idea is to choose t so as to minimize the total expected wait of all items. So we have to write down the expression for the total expected waiting time and then see how to minimize it.
The choice of the intermediate dispatch time t has to be obtained from this optimization. The expected number of arrivals in (0, t) is λt, because the count is Poisson with parameter λt. And each arrival is uniformly distributed over the interval: remember, some time ago we discussed that, because of stationary increments, the arrival instants in a given period are uniformly distributed over it. A uniform variable on (0, t) has mean t/2, so an item arriving in (0, t) waits on average t/2 before being dispatched at time t — to be clear, it is the arrivals, not the processing, that are distributed uniformly in the interval (0, t). So the total expected wait of all items arriving in (0, t) is the expected number of items times the expected wait per item, λt · t/2 = λt²/2. The same reasoning holds for the items arriving in (t, T), which get dispatched at time T. Therefore the total expected wait is W(t) = λt²/2 + (1/2)λ(T − t)².
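The expression W(t) = λt²/2 + λ(T − t)²/2 can be minimized numerically over a grid as a sanity check; a minimal sketch with arbitrary λ = 3 and T = 10:

```python
# Numerically locate the minimiser of W(t) = lam*t^2/2 + lam*(T - t)^2/2.
lam, T = 3.0, 10.0           # hypothetical rate and dispatch horizon

def expected_wait(t):
    return lam * t**2 / 2 + lam * (T - t)**2 / 2

# evaluate on a fine grid over (0, T) and pick the minimiser
grid = [i * T / 10_000 for i in range(10_001)]
best = min(grid, key=expected_wait)
print(best)  # T/2 = 5.0
```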
So, I hope this part is clear right and this reasoning is because expected wait time into the expected number of items that arrive. So, that gives me the total expected wait time of all items arriving in time in the interval 0 t. So, to minimize that to find out the minimizing value of t I differentiate this expression with respect to t and I get lambda t minus because there is a minus here. So, 2 is gone. So, lambda t minus lambda capital T minus t is 0 and this gives me t equal to t by 2 as you would expect because the arrivals are uniformly distributed over the time interval. And just to make sure that this is the minimizing value you find out w prime t and w prime t will come out to be 2 lambda which is positive. So, therefore, this gives you the minimizing value. So, therefore, it says that you dispatch whatever items get processed in the middle of the time and then wait for the others to be processed and dispatch them at t. A simple which appeals to your reasoning also, but then through this machinery also we have arrived at this result right. So, before I begin the you know talking about queuing models I thought I will finish of the lecture on poison processes with this example on exponential distribution because it is somehow related and part of it right. And so and this would be the right place to talk about it because we have talked of exponential of the poison process and we have talked of people expected number of people in the system and so on. And then because the inter arrival times had we have shown that each of them were identically independently distributed as exponential random variables. So, I thought this would be also this can be part of it. So, here the whole idea is that and of course, this is a simple example on the memory less property of the exponential distribution. So, consider a railway booking counter that is run by two clerks. Suppose that when Mr. Sharma enters the system he discovers that Mr. 
Jain is being served by the clerk at one counter and Mr. Verma is being served at the other, so both counters are busy when Mr. Sharma enters the system. Mr. Sharma's service will begin as soon as either Mr. Jain or Mr. Verma completes service and leaves; then that clerk takes Mr. Sharma. Suppose the amount of time a clerk spends with a customer is exponentially distributed with mean 1/mu, that is, the parameter of the exponential distribution is mu. What is the probability that, of the three customers, Mr. Sharma is the last to leave? Mr. Sharma gets served only once one of the other customers has left. Say Mr. Verma leaves first, so Mr. Sharma goes to that clerk while Mr. Jain is still being served. The question then is: who leaves first, Mr. Jain or Mr. Sharma? But the exponential distribution is memoryless, so how much longer Mr. Jain will take is independent of how long he has already been at the counter; the remaining service time does not depend on the service already received. Therefore, it is equally likely that Mr. Sharma completes his service before Mr. Jain, or that Mr. Jain completes first.
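The memoryless argument can be made concrete with a small simulation. This is my own sketch (function name and the value mu = 1.5 are illustrative): by memorylessness, at the moment Mr. Sharma starts service, Mr. Jain's remaining service time is a fresh Exponential(mu) draw, just like Mr. Sharma's full service time, so each is equally likely to finish first.

```python
import random

def p_sharma_last(mu, n_trials=100000, seed=1):
    """Estimate P(Mr. Sharma is the last of the three to leave)."""
    rng = random.Random(seed)
    last = 0
    for _ in range(n_trials):
        jain_remaining = rng.expovariate(mu)   # fresh draw, by memorylessness
        sharma_service = rng.expovariate(mu)
        if sharma_service > jain_remaining:    # Sharma leaves after Jain
            last += 1
    return last / n_trials

p = p_sharma_last(mu=1.5)
# p should be close to 1/2, whatever the value of mu.
```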
I assumed that Mr. Jain is still in the system and Mr. Verma has left, but the argument works either way. So, by a very simple use of the memoryless property of the exponential distribution, the probability that Mr. Sharma leaves last is 1/2, because it is equally likely that Mr. Jain completes his service first or Mr. Sharma does. I thought this would round off the Poisson process and connect to the birth and death processes we have been talking of: people arrive at a service station, they are serviced, and we then ask about the average number of people in the system, the average waiting time, and so on. I would like to take this further. We have discussed the Poisson process as one arrival process under the conditions we laid down; now consider, for example, a service center where people arrive for service and people provide the service, and the service process is also random. When we combine these two, the resulting theory is known as queuing theory. For example, in a post office, if the average number of people arriving is large, one clerk may not be enough to serve everybody, and you want to know how big the facility should be, and so on. These are very interesting questions, though of course we will study them at a basic level. In a queuing process you want to compute the average waiting time of a customer, the average service time of a customer, the average number of people in the system at any time, and so on.
So, these are the interesting questions we want to answer, and therefore we will model the situation where people arrive for service, services are rendered, and people leave the system. We will study this whole setup in the next couple of lectures.