We take another example from the book by Hogg, McKean, and Craig, Exercise 7.4.6. The statement of the question: let a random sample of size n be taken from a distribution of the discrete type. We consider a discrete distribution here so that, for completeness, you can also see how to handle discrete random variables. The probability mass function is f(x; theta) = 1/theta for x = 1, 2, ..., theta, and 0 elsewhere, where theta is an unknown positive integer. What we need to show is that the largest observation of the sample, say Y_n, is a complete sufficient statistic for theta. So for this discrete distribution we must show both that Y_n is sufficient and that it is complete for theta. Let us move further and solve this.

First we find the distribution of the largest observation. For a discrete random variable, the pmf of the maximum Y = Y_n is obtained from the distribution function: P(Y = y) = P(Y <= y) - P(Y <= y - 1). That is how we find the distribution of the largest observation in the discrete case. For n independent observations this becomes P(Y = y) = [P(X <= y)]^n - [P(X <= y - 1)]^n. The distribution function of a single observation is very simple here, P(X <= y) = y/theta, so substituting gives P(Y = y) = (y/theta)^n - ((y - 1)/theta)^n = [y^n - (y - 1)^n] / theta^n, for y = 1, 2, ..., theta. The joint probability mass function of the sample is P(X_1 = x_1, ..., X_n = x_n) = 1/theta^n for x_1, x_2, ..., x_n in {1, ..., theta}.
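As a quick sanity check (not part of the lecture), the pmf of the maximum derived above can be verified by brute-force enumeration of all theta^n equally likely samples; the function names below are mine.

```python
from itertools import product

def pmf_max_formula(y, n, theta):
    # P(Y_n = y) = [y^n - (y-1)^n] / theta^n, for y = 1, ..., theta
    return (y**n - (y - 1)**n) / theta**n

def pmf_max_bruteforce(y, n, theta):
    # Enumerate all theta^n equally likely samples of size n and
    # count those whose largest observation equals y.
    count = sum(1 for sample in product(range(1, theta + 1), repeat=n)
                if max(sample) == y)
    return count / theta**n

n, theta = 3, 5
for y in range(1, theta + 1):
    assert abs(pmf_max_formula(y, n, theta) - pmf_max_bruteforce(y, n, theta)) < 1e-12

# The formula also sums to 1 over y = 1..theta: the sum telescopes to theta^n / theta^n.
assert abs(sum(pmf_max_formula(y, n, theta) for y in range(1, theta + 1)) - 1) < 1e-12
```

The second assertion reflects the telescoping structure of y^n - (y - 1)^n, which is also what drives the completeness argument below.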
Now, by the factorization theorem: if the ratio P(X = x) / P(Y_n = y) equals some h(x) independent of theta, then Y_n is a sufficient statistic for theta. Here P(X = x) / P(Y_n = y) = (1/theta^n) / ([y^n - (y - 1)^n] / theta^n), and when we simplify it we get 1 / [y^n - (y - 1)^n]. This is free of theta, so Y = Y_n is a sufficient statistic for theta.

To show that Y is complete, we must show that E[g(Y)] = 0 for all theta implies g = 0 almost surely. The expectation is E[g(Y)] = sum over y from 1 to theta of g(y) [y^n - (y - 1)^n] / theta^n, and this must equal 0 for every theta = 1, 2, .... Taking theta = 1 gives g(1) * 1 = 0, so g(1) = 0. Taking theta = 2 gives g(1) * 1 + g(2)(2^n - 1) = 0; since g(1) = 0, the first term cancels out and we get g(2) = 0. Similarly, for theta = 3 the remaining term is g(3)(3^n - 2^n) = 0, so g(3) = 0, and so on: at each step the earlier terms vanish and the coefficient y^n - (y - 1)^n is strictly positive, so in the end the net result is g(y) = 0 for all y = 1, 2, .... This shows that Y is a complete statistic, which is what we wanted to prove.
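The step-by-step argument above is a forward substitution on a lower-triangular system of equations. A small sketch (my own illustration, not from the book) solves the system sum over y of g(y)(y^n - (y - 1)^n) = 0 for theta = 1, ..., T and confirms that every g(y) is forced to zero:

```python
def solve_g(T, n):
    # Forward-substitute the equations
    #   sum_{y=1}^{theta} g(y) * (y^n - (y-1)^n) = 0,  theta = 1, ..., T,
    # (the common positive factor 1/theta^n has been dropped).
    g = []
    for theta in range(1, T + 1):
        # Contribution of the already-determined g(1), ..., g(theta-1).
        partial = sum(g[y - 1] * (y**n - (y - 1)**n) for y in range(1, theta))
        coeff = theta**n - (theta - 1)**n  # strictly positive diagonal entry
        g.append(-partial / coeff)
    return g

# Every solution component is forced to zero, so g == 0 is the only solution.
assert solve_g(6, 4) == [0.0] * 6
```

Because the diagonal coefficients y^n - (y - 1)^n are never zero, the triangular system has only the trivial solution, which is exactly the completeness conclusion.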