So, in this course we will mostly focus on the four discrete distributions and the four continuous distributions we talked about. All of you should have the four discrete random variables we discussed at your fingertips, and the continuous ones too: uniform, exponential, Gaussian and Rayleigh. Maybe not Rayleigh so much, but Gaussian certainly should be at your fingertips. In addition to these there are other distributions: we already talked about the uniform distribution on a finite set of elements, and there are also the gamma distribution, the Weibull distribution and the Laplace distribution, which we will discuss as we require them. Otherwise, one can come up with many different distributions; all we need to ensure, in the case of a discrete random variable, is the following. Suppose some function assigns a probability p(x_i) to each value x_i of the random variable. It is a valid probability mass function as long as the sum from i = 1 to n of p(x_i) equals 1 and every p(x_i) is nonnegative; any probability vector you come up with whose entries are nonnegative and add up to 1 gives a valid probability mass function. Similarly, you can come up with some function f(x) that is nonnegative for all x, and as long as its integral equals 1 it is a valid probability density function for some random variable. You can come up with whatever you want; people have come up with many such distributions, and those with special applications are the ones called gamma, Weibull and Laplace. I will use them whenever we encounter them. Now, often when we have these random variables we will not be much bothered about the exact value; instead, we will be interested in what happens on average.
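The two validity conditions just stated (nonnegative entries that add up to 1) can be checked in a few lines. This is a minimal sketch; the helper name `is_valid_pmf` and the sample probability vectors are my own, chosen for illustration.

```python
# A vector (p_1, ..., p_n) is a valid probability mass function iff
# every entry is nonnegative and the entries sum to 1.
# (Hypothetical helper; a tolerance handles floating-point rounding.)
def is_valid_pmf(p, tol=1e-9):
    return all(pi >= 0 for pi in p) and abs(sum(p) - 1.0) < tol

print(is_valid_pmf([0.2, 0.3, 0.5]))  # a valid probability vector -> True
print(is_valid_pmf([0.5, 0.6]))       # sums to 1.1 -> False
print(is_valid_pmf([1.5, -0.5]))      # negative entry -> False
```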
For example, take this entire class. It is not necessary that all of you will get the same grade, but what matters to me is the average grade of this class; that defines the quality of the class. Maybe some individuals got an A and some of you got, let us say, a C or a D, but it is not the individual case that matters to us; it is the average grade of the class. In that case, what each one of you scores is a particular realization: I can treat the score of this class as some random variable, and with about 60 students, the values scored by you are the different realizations I am going to see. For me a particular realization is not so important; what is important is the average score. If, let us say, the current batch's average score is higher than the previous batch's average score, I may feel that I did a good job, or that you people did a good job, since the overall class performance is better. So, like that, it becomes useful to compare the overall behavior. That is where we will look into the expectation and the variance, which will help us understand the aggregate behavior rather than the individual behavior. So, now, when we have a discrete random variable we said that we have these point masses, and the expectation of that random variable is simply defined as the weighted sum of the realizations, with the weights being the associated probabilities. Notice that here my random variable X takes values x_1, x_2, ... with probabilities p(x_i), and what I am doing is multiplying each x_i by the probability p(x_i) with which it is taken, and summing over all possible values. Even though I have written the sum as i = 1 to infinity, it could be finite: if you have only n terms, the sum simply runs from i = 1 to n.
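The weighted-sum definition of expectation translates directly into code. The fair-die example below is an illustration I have added, not something from the lecture.

```python
# Expectation of a discrete random variable: E[X] = sum_i x_i * p(x_i).
def expectation(values, probs):
    return sum(x * p for x, p in zip(values, probs))

# Illustrative example: a fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6
print(expectation(values, probs))  # 3.5
```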
What is this DFT? Yes, this is something called the discrete Fourier transform, and the Laplace distribution arises in that context. Usually in signal processing, when you are processing speech signals, one has to deal with certain kinds of Fourier transforms; one of them is a special one called the discrete Fourier transform, and when we do that, a special distribution arises, which is called the Laplace distribution. All I am saying is that the Laplace is one kind of distribution which has applications in speech modeling. Like that, the Weibull is another kind of distribution, which finds applications in reliability and survival analysis. Some of you who are in mechanical engineering may study a course on reliability, perhaps later; there you will end up seeing a Weibull kind of distribution. And people in climate studies or civil engineering may be interested in studying the amount of rainfall accumulated in a reservoir, for example how much water gets accumulated during the monsoon season; that needs to be studied, we need some models, and maybe the gamma kind of distribution will find applications there. So, depending on the application, people have come up with different distributions, and I have just listed them instead of going through all of them. This is your expected value in the discrete case. Similarly, if I am given a probability density function, you know that when you go from discrete to continuous the summation changes to an integration and the probability mass function changes to a probability density function; that is what we are doing, and now you have the expected value of your random variable.
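Going from the sum to an integral, the continuous expectation E[X] = ∫ x f(x) dx can be checked numerically with a crude quadrature sketch. The exponential density, the integration interval, and the grid size below are my own assumptions for the sketch, not part of the lecture.

```python
import math

# Continuous case: E[X] = integral of x * f(x) dx, approximated here
# by a simple midpoint rule over [a, b].
def expectation_continuous(f, a, b, n=100_000):
    h = (b - a) / n
    total = 0.0
    for k in range(n):
        x = a + (k + 0.5) * h   # midpoint of the k-th subinterval
        total += x * f(x) * h
    return total

# Exponential(lam) density; its mean should be 1/lam.
lam = 2.0
f = lambda x: lam * math.exp(-lam * x)
print(expectation_continuous(f, 0.0, 50.0))  # close to 1/lam = 0.5
```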
Now the other quantity: we just said, what is the variance? The variance is how the value of a random variable varies around its mean; we want to capture how things behave with respect to the mean, and one way to capture that is the quantity called variance, which is defined like this. Let us say I have a random variable X. Is the expectation of X a random quantity or a constant? It is a constant. So can I define a new random variable X minus the expectation of X? Yes. Now what I want is to take the square of this and take its expectation, and that is exactly the variance. Again, how do we compute this? If it is a discrete random variable, you just find out this quantity. Let me ask you: let us say X takes values x_1, x_2, up to x_n, and when I do this operation, hypothetically assume the new variable Y takes values y_1, y_2, ..., y_n; that is, x_1 gets mapped to y_1, x_2 gets mapped to y_2, and x_n gets mapped to y_n. Now I want to understand: is there any relation between the probability of x_i and the probability that Y equals y_i? Will they remain the same or be different? They are going to remain the same, and here I have specifically assumed that there is a one-to-one mapping. So whatever the probability with which I get x_1, with the same probability I will get y_1, and so on. Because of this, if I want to find the expectation of Y, it is simply the sum over i of the probability of y_i times y_i; but the probability of y_i is nothing but the probability of x_i, and y_i is now (x_i minus the expectation of X) squared. That is what we have written here: the sum of p(x_i) times (x_i minus the expectation of X) squared. Similarly, in the continuous case, as I said, you replace the summation by an integration and the discrete probabilities by the associated density function.
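The definition Var(X) = E[(X − E[X])²] computes directly from the point masses. As a sketch (my own example values), a Bernoulli with parameter p should give the variance p(1 − p):

```python
# Var(X) = E[(X - E[X])^2], computed directly from the PMF.
def variance(values, probs):
    mean = sum(x * p for x, p in zip(values, probs))
    return sum(p * (x - mean) ** 2 for x, p in zip(values, probs))

# Bernoulli(p): values 0 and 1 with probabilities 1-p and p.
p = 0.3
print(variance([0, 1], [1 - p, p]))  # p*(1-p) = 0.21
```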
Now, I talked about different distributions, right? These are some discrete distributions we talked about, and there are some continuous distributions we talked about, and each of them comes with a certain mean and variance. For the Bernoulli distribution with parameter p, if you go back and compute its mean, it is simply going to be p, whereas its variance is going to be p(1 − p); this is straightforward, you can calculate it. And for binomial(n, p), recall that the binomial has two parameters, n and p; its mean is going to be np and its variance is going to be np(1 − p). So notice that the mean and variance depend on the parameters, and for the geometric the mean is going to be 1/p. So let us compute it for one of them: take X to be geometric. What are the possible values X takes in the geometric distribution? 1, 2, up to infinity. And what is the probability that X equals i? It is (1 − p)^(i−1) p: there are i − 1 failures, and after that there is a success, which means the success happened in the i-th trial. Now I want to compute the expectation of X. This is going to be the sum over i from 1 to infinity of i times the probability that X equals i, and that value is the sum over i from 1 to infinity of (1 − p)^(i−1) p times i. Can somebody simplify this now: p times the sum over i from 1 to infinity of (1 − p)^(i−1) i? What is this summation? If we write q = 1 − p, then i q^(i−1) is d/dq of q^i, so the sum can be written as d/dq of the sum of q^i. This is not exactly the case, because differentiation and summation cannot always be interchanged like this; some conditions have to be satisfied and you cannot do it blindly, but I have blindly interchanged differentiation and summation for the time being.
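The series E[X] = Σ i (1 − p)^(i−1) p can also be checked by simply truncating it and seeing that it approaches 1/p. The sample p and the truncation length below are arbitrary choices of mine.

```python
# Geometric(p): P(X = i) = (1-p)**(i-1) * p for i = 1, 2, ...
# Truncating the series E[X] = sum_i i * (1-p)**(i-1) * p
# should give a value close to the closed form 1/p.
def geometric_mean_truncated(p, n_terms=10_000):
    return sum(i * (1 - p) ** (i - 1) * p for i in range(1, n_terms + 1))

p = 0.25
print(geometric_mean_truncated(p))  # close to 1/p = 4.0
```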
Now, if you carry out this computation, the sum comes out to 1/(1 − q)², which is 1/p², and multiplying by p gives 1/p. So that is what we get. You can also find the variance; for the variance you need to do a little more computation because square terms are involved, so do work it out; like that you can do it. Some other things to notice: if I have a Poisson distribution with parameter lambda, its mean and its variance are both going to be lambda. And if I have an exponential with parameter lambda, its mean is going to be 1/lambda and its variance is going to be 1/lambda². And for the Gaussian distribution with parameters mu and sigma², mu corresponds to its mean and sigma² corresponds to its variance. So, like that. I want you to go through all of this table; even though we have put it here, you should work these out and make sure you are getting the right results at least once, so that it gets a little bit ingrained in your head and stays with you for the longer term. Do it at least once, even though it could be a bit manual and you may have to write some four or five pages of calculation. Fine. This is all about discrete and continuous random variables, their distributions, and the associated quantities like expectation and variance.
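As a quick check of one row of the table, the Poisson moments can be computed from a truncated PMF; using the recurrence P(k+1) = P(k)·λ/(k+1) avoids large factorials. The sample λ and the truncation length are my own choices for the sketch.

```python
import math

# Check the table entry for Poisson(lam): both the mean and the
# variance should come out equal to lam.  The PMF is built with the
# recurrence P(k+1) = P(k) * lam / (k+1), starting from P(0) = e^{-lam}.
def poisson_moments(lam, n_terms=200):
    pmf = []
    pk = math.exp(-lam)          # P(X = 0)
    for k in range(n_terms):
        pmf.append(pk)
        pk *= lam / (k + 1)      # P(X = k+1)
    mean = sum(k * p for k, p in enumerate(pmf))
    var = sum(p * (k - mean) ** 2 for k, p in enumerate(pmf))
    return mean, var

mean, var = poisson_moments(3.5)
print(mean, var)  # both close to 3.5
```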