So far we have discussed the F distribution, convergence of random variables, consistency, and then order statistics. In the last class we talked about order statistics and discussed how to find the distribution of an order statistic when the random variable is discrete, that is, when it takes values in some finite or countable set. If you are interested in the distribution of the smallest value, the second smallest value, or the maximum value, we discussed one way to find that distribution. If you recall, we defined certain Bernoulli random variables, took their summation, and based on that did some computations. That idea can be extended even when your random variable is continuous. I did not discuss that, but the expression for the continuous case is given in the slides, and you can verify how to get it by going through the computations yourself, or refer to the book, where all the computations are given. So today we will go ahead and cover one more topic: generating random samples. We have actually touched on this earlier. Does anybody remember when? Yes, we tried to generate samples according to a given distribution using uniform samples, by applying the inverse of the given CDF. Now let us see if we can go a little beyond that. On that front, we are going to discuss two kinds of methods, called direct and indirect methods. The direct method is what we already know; within direct methods there are different possibilities, and one possibility we already know. Suppose I have been given X, which is discrete, or let us say X is continuous.
Now I have been asked to generate X such that it has CDF F. You know how to do this: start with a uniform random variable U and set X = F^(-1)(U). Once you do this, you already know that X has CDF F; we have already discussed the argument for that. Next, let us look at a specific example. Suppose I am told to generate a random variable that follows the exponential distribution with parameter λ. For this I already know the CDF I want: F(x) = 1 - e^(-λx). Everybody agrees this is the CDF of my exponential distribution. Now I need to find F^(-1). Suppose you call the value F(x) = u; then e^(-λx) = 1 - u, so the x that gives this particular u is x = -(1/λ) log(1 - u). So now I know my inverse function: F^(-1)(u) = -(1/λ) log(1 - u). All I have to do is take X = -(1/λ) log(1 - U). If you take this, we already know that X is going to follow the exponential distribution with parameter λ. What do I mean by this in practice? Suppose you have generated, say, 100 uniform samples u1, u2, ..., u100. Now I compute x1 = -(1/λ) log(1 - u1), x2 = -(1/λ) log(1 - u2), and so on up to x100 = -(1/λ) log(1 - u100).
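The inverse-CDF recipe just described can be sketched in a few lines of code; the function name and the use of Python's standard `random` module here are my own, not from the lecture.

```python
import math
import random

def sample_exponential(lam, n, seed=0):
    """Draw n Exponential(lam) samples by the inverse-CDF method.

    For U ~ Uniform(0, 1), X = -(1/lam) * log(1 - U) satisfies
    P(X <= x) = 1 - exp(-lam * x), the exponential CDF.
    """
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

samples = sample_exponential(lam=2.0, n=100_000)
print(sum(samples) / len(samples))  # sample mean, close to 1/lam = 0.5
```

As a sanity check, the sample mean should be close to the exponential mean 1/λ.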
So whatever new samples I got, x1, x2, ..., x100, they follow the exponential distribution with parameter λ. Fine, exponential was simple. Now let us see what else we can generate using this simple method. Is it possible to generate the gamma distribution with parameters n and λ, for some integer n and some nonzero λ? Given that I know how to generate samples from the exponential distribution, just as we did in this example, let us try to leverage that: if I know how to generate the exponential easily, maybe I should attempt to connect the gamma distribution with exponential distributions and exploit that relation. And we know such a relation exists: if X1, X2, ..., Xn are i.i.d. exponential with parameter λ, then we already know that their sum is gamma distributed with parameters n and λ. So one way to generate a gamma random variable is to generate n exponentially distributed random variables and add them. And how did I generate each exponential? From a uniform. So I am going to use exactly that: generate uniform random variables U1, U2, ..., Un, set Xi = -(1/λ) log(1 - Ui), so that each Xi is exponentially distributed with parameter λ, and then just add them; the sum X is gamma distributed.
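The sum-of-exponentials construction above can be written directly as a small sketch, assuming an integer shape parameter n; the function name is mine.

```python
import math
import random

def sample_gamma(n, lam, rng):
    """One Gamma(n, lam) draw: the sum of n i.i.d. Exponential(lam) draws,
    each generated from a uniform via the inverse-CDF method."""
    return sum(-math.log(1.0 - rng.random()) / lam for _ in range(n))

rng = random.Random(0)
draws = [sample_gamma(3, 2.0, rng) for _ in range(50_000)]
print(sum(draws) / len(draws))  # close to the Gamma(3, 2) mean, n/lam = 1.5
```

Note this only works for integer n, which is exactly the limitation discussed below for odd-degree chi-square.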
So, if I write this directly in terms of my uniform random variables, all I have to do is generate the uniforms, take the logs, scale by -(1/λ), and add them; whatever I get is gamma distributed with parameters n and λ. You see that I am basically playing around with the properties of functions of random variables to extract the desired distribution. Does anybody have any question on how to generate the gamma distribution? Now, once I know gamma, maybe I can also generate a chi-square distribution, because the gamma and chi-square distributions are related: a chi-square distribution with 2n degrees of freedom is nothing but a gamma distribution with parameters n and 1/2. So to generate a gamma distribution with parameters n and 1/2, what should I do? Exponential with what parameter? 1/2. I need to generate n exponentially distributed random variables with parameter 1/2 and simply add them; that directly gives me a chi-square distribution with 2n degrees of freedom. Such relations can be exploited to generate other distributions as well. Suppose you want to generate a beta distribution with parameters m and n; how are you going to do that? Again, first exploit a relation: take X = (sum of log Ui for i = 1 to m) / (sum of log Ui for i = 1 to m + n). That is, it is a ratio of two sums: in the numerator I take the sum of m uniform random variables after applying the log function, and in the denominator I take the sum of all m + n uniform random variables after applying the log function. We know that this transformation of the uniforms is beta distributed with parameters m and n.
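The beta construction can be sketched the same way; the reasoning step, which the lecture leaves implicit, is that -log(Ui) is Exponential(1), so the numerator is (up to sign) Gamma(m, 1), the denominator is Gamma(m + n, 1), and their ratio is Beta(m, n). The function name is mine.

```python
import math
import random

def sample_beta(m, n, rng):
    """One Beta(m, n) draw (integer m, n) from m + n uniforms.

    -log(U_i) ~ Exponential(1), so the first m terms sum to Gamma(m, 1)
    and all m + n terms sum to Gamma(m + n, 1); the ratio of the partial
    sum to the total sum is Beta(m, n).
    """
    logs = [-math.log(1.0 - rng.random()) for _ in range(m + n)]
    return sum(logs[:m]) / sum(logs)

rng = random.Random(0)
draws = [sample_beta(2, 3, rng) for _ in range(50_000)]
print(sum(draws) / len(draws))  # close to the Beta(2, 3) mean, m/(m+n) = 0.4
```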
So just by taking uniform random variables and applying this transformation, you readily get the beta distribution. Okay. This method is nice: we are exploiting the relation between one distribution and another, and as long as we can generate one of them from uniform random variables, we are done. For example, here the gamma distribution depends on the exponential, and I know how to generate the exponential using uniform random variables. So gamma, chi-square, beta, everything can be generated using only uniform random variables, and the uniform is something simple that we can assume is available to us; we then just use these properties to get the distribution we want. But does this trick work all the time? Maybe not. One simple example: suppose I want to generate X which is chi-square with n degrees of freedom, where n is odd. In this case X is nothing but gamma with parameters n/2 and 1/2, and if I have to write it as a sum of exponential random variables, I have to add n/2 exponentials. But when n is odd, can I do that? No, right? You cannot add a non-integer number of exponential random variables. So in that case this simple method does not work. Okay, and what is the relation between the chi-square distribution and the Gaussian distribution? Did we study that? Anybody recall? Say I want to generate a chi-square distribution with one degree of freedom. What is the relation between chi-square with one degree of freedom and the Gaussian distribution? If Z is normal, with what parameters? 0 and 1, then Z squared is chi-square with one degree of freedom. Now, this chi-square distribution with one degree of freedom I know I cannot generate with my gamma method, because the degrees of freedom is odd.
So, because of this, I cannot even generate the Gaussian distribution this way. Is this clear to all of you? Okay, now we need to overcome this; what other methods do we have? We will explore one other direct method now. By the way, in the previous direct method we always needed F inverse: if you want to generate X with CDF F, you need to set X = F^(-1)(U), so you need to find that inverse function F^(-1). But finding F^(-1) is not always an easy task. For something like the exponential it was easy because of its simple structure. In general, you need to invert an integral: suppose u = F(x), where F is given as an integral of the density, and I want to invert F. If you give me u, I need to find the x that gives me that u when I apply F; that means reversing the integration process, which can be very hard. When F was the exponential CDF this was easy, it was just 1 - e^(-λx). But you may not always be dealing with the exponential; you may have to deal with other distributions, and then this inversion may become hard. That is why we have to look for other methods beyond what we just discussed. There is one method called the Box-Muller method, which is also a direct method, and which is particularly useful for generating samples following the normal distribution. So let us see how this works.
Suppose you have two random variables U1 and U2 which are i.i.d. and uniformly distributed on (0, 1). Now define two values: R = sqrt(-2 log U1), a transformation of U1, and θ = 2π U2, a transformation of U2. Then define X1 = R cos θ and X2 = R sin θ. It so happens that X1 and X2 are i.i.d. and Gaussian with parameters 0 and 1. You can verify this; you have been doing this kind of exercise multiple times. Can you see why? R and θ are independent: they are functions of U1 and U2 respectively, and U1 and U2 are independent. You should be able to find the distributions of R and θ, and hence their joint distribution: you know the distribution of U1, R is a function of U1, so you can find the distribution of R, and likewise the distribution of θ. Now X1 and X2 depend on both R and θ, but R and θ are independent, so you should also be able to find the distribution of X1 and X2 here: apply the Jacobian method that we have discussed before, find the joint distribution, and when you compute the marginals you will see that both X1 and X2 are Gaussian distributed. Since you know the method, it is just about doing the calculations; I am skipping the details and you should verify this. So you see: with the previous method, generating a Gaussian based on uniforms was not feasible, but now we have another method, also using only uniform distributions, that still gives us the Gaussian distribution.
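The Box-Muller transformation just described can be sketched as follows; using 1 - U1 in place of U1 inside the log (both are uniform on (0, 1]) is my own small tweak to avoid log(0), since Python's `random()` can return 0.

```python
import math
import random

def box_muller(rng):
    """One pair of independent N(0, 1) draws from two uniforms."""
    u1, u2 = rng.random(), rng.random()
    r = math.sqrt(-2.0 * math.log(1.0 - u1))  # 1 - u1 avoids log(0)
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)

rng = random.Random(0)
zs = [z for _ in range(50_000) for z in box_muller(rng)]
mean = sum(zs) / len(zs)
var = sum(z * z for z in zs) / len(zs) - mean ** 2
print(mean, var)  # near 0 and 1, as expected for N(0, 1)
```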
It is just that we need to do the appropriate mapping: by transforming the uniforms in an appropriate way, we may get the desired distribution. But what is that mapping? That is the question that has to be solved. Here, Box and Muller figured out that if you take R and θ in terms of U1 and U2 like this, and define X1 and X2 accordingly, then this transformation actually gives what we want. It is not necessary that the same kind of transformation will work if tomorrow you want to generate, say, a binomial distribution; you may have to find another transformation suitable for that case. However, notice that here we are interested in generating a Gaussian, and the Gaussian is continuous.