Hello everyone, welcome to this lecture on Shannon's Theorem. At the end of this session, students will be able to describe what is meant by Shannon's Theorem.

Before starting with the actual session, let's pause the video and think about what is meant by channel capacity. Channel capacity is the maximum rate of mutual information: C = max I(X; Y), where I(X; Y) is the mutual information, and the maximum value of that mutual information over the input distribution is the channel capacity. Also remember that mutual information is the difference between entropies, I(X; Y) = H(X) - H(X|Y): the initial entropy H(X) minus the final, conditional entropy H(X|Y). The maximum value of that difference is the channel capacity.

Now let's start with Shannon's Theorem. We already saw that a communication system has a maximum rate of information, I_max, which is the channel capacity. Relating this channel capacity to the rate of information R, Shannon's Theorem states: if the rate of information R is less than or equal to the channel capacity C, then there exists a coding technique by which the output of the source can be transmitted over the channel such that the probability of error in the received message can be made arbitrarily small. This is the statement of Shannon's Theorem: if R ≤ C, then by using a suitable coding technique you can receive messages with a low probability of error at the receiver side.

The main feature of this result is that when R ≤ C, essentially error-free transmission is possible even in the presence of noise. Unfortunately, the theorem is not a constructive proof. Why?
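Before answering that, the capacity definition C = max I(X; Y) above can be made concrete with a small numerical check. Below is a minimal sketch, assuming a binary symmetric channel with crossover probability p; the function names and the grid search over input distributions are illustrative choices, not part of the lecture:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p == 0 or p == 1:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(px, p):
    """I(X;Y) for a binary symmetric channel with input P(X=1) = px
    and crossover probability p, using I(X;Y) = H(Y) - H(Y|X)."""
    py1 = px * (1 - p) + (1 - px) * p   # P(Y = 1)
    return h2(py1) - h2(p)              # H(Y|X) = H(p) for a BSC

# Capacity = maximum over input distributions; approximate by a grid scan.
p = 0.1
cap = max(bsc_mutual_information(px / 1000, p) for px in range(1, 1000))
print(round(cap, 4))   # ≈ 0.531 bits per channel use
```

For the binary symmetric channel the maximum is known to occur at P(X=1) = 0.5, giving C = 1 - H(p); the grid search reproduces that value.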
The theorem only states that such a coding method exists; it does not show how to construct one, and that is why it cannot be used directly to develop a coding method that reaches the channel capacity. So you cannot reach the maximum channel capacity just from this theorem, because the proof is non-constructive.

A negative statement of this theorem also exists. The positive statement, as we already saw, covers R less than or equal to C; but if R is greater than C, the errors cannot be avoided regardless of the coding technique used. That is: in the positive statement, if R ≤ C, then by using a suitable coding technique you can receive messages with a low probability of error; in the negative statement, if R > C, meaning your rate of information is greater than your channel capacity, then no matter which coding technique you use, you will still get errors in your received messages. That is the negative statement of Shannon's theorem.

To make this quantitative we have the Shannon-Hartley theorem, which states that C = B log2(1 + S/N). Here C is the channel capacity in bits per second, B is the bandwidth of the channel in hertz, and S/N is the signal-to-noise ratio. This indicates that with sufficiently advanced coding techniques, transmission at the channel capacity can occur with arbitrarily small errors in the presence of noise. So, using this equation, you can find the maximum channel capacity achievable with a low probability of error.

Now consider the case of a Gaussian channel. For no noise, S/N becomes infinite: in C = B log2(1 + S/N), if there is no noise then N is 0, so the ratio S/N is infinite.
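Setting the no-noise edge case aside for a moment, the Shannon-Hartley formula itself is easy to evaluate numerically. A minimal sketch, with an illustrative function name and example values not taken from the lecture:

```python
import math

def shannon_hartley_capacity(bandwidth_hz, snr_linear):
    """Channel capacity C = B * log2(1 + S/N) in bits per second.

    snr_linear is the plain power ratio S/N, not a dB value.
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative values: a 1 MHz channel at a linear SNR of 15 (about 11.8 dB).
capacity = shannon_hartley_capacity(1e6, 15)
print(capacity)   # 4000000.0 bit/s, since log2(1 + 15) = log2(16) = 4
```

Note that the SNR must be converted from dB to a linear ratio before being substituted into the formula, as the worked example later in the lecture does by hand.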
In that no-noise case the ratio becomes infinite, so the overall channel capacity becomes infinite, which is not possible. On the other hand, the channel capacity does not become infinite as the bandwidth approaches infinity either, because an increase in bandwidth also brings an increase in noise power. Because of these two opposing effects, there is a trade-off between channel capacity and bandwidth.

How do we handle that? For fixed signal power, in the presence of white Gaussian noise, the channel capacity approaches an upper limit as the bandwidth increases to infinity; that upper limit is called the Shannon limit. For a Gaussian channel with white Gaussian noise, the noise power spectral density is η/2, so the noise power becomes N = ηB, where B is the bandwidth. The capacity equation then becomes C = B log2(1 + S/(ηB)), where S is the signal power. To simplify this, use the standard limit that (1 + x)^(1/x) tends to e as x tends to 0: substituting x = S/(ηB), so that 1/x = ηB/S, we get C_∞ = lim as B tends to infinity of C = (S/η) log2 e, and log2 e is approximately 1.44, so C_∞ ≈ 1.44 S/η. This is the maximum channel capacity in the case of white Gaussian noise in a Gaussian channel. As I said, it gives the maximum information transmission rate possible for a system of given power but no bandwidth limitation.
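The approach of C = B log2(1 + S/(ηB)) to the Shannon limit (S/η) log2 e can be checked numerically. A minimal sketch; the signal power and noise-density values here are illustrative assumptions, not figures from the lecture:

```python
import math

S = 1.0      # signal power in watts (illustrative)
eta = 1e-3   # noise power spectral density in W/Hz (illustrative)

def capacity(B):
    """C = B * log2(1 + S / (eta * B)) for a white-Gaussian-noise channel."""
    return B * math.log2(1 + S / (eta * B))

shannon_limit = (S / eta) * math.log2(math.e)   # ≈ 1.44 * S / eta

for B in (1e2, 1e4, 1e6, 1e8):
    print(B, capacity(B))   # grows with B but never exceeds the limit
print(shannon_limit)
```

As B grows, the capacity increases monotonically but saturates near 1.44 S/η rather than growing without bound, which is exactly the trade-off described above.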
Now, let us see one example. For a standard voice-band communication channel, the SNR is 30 dB and the transmission bandwidth is 3 kHz; what will be the channel capacity? The given data also includes the identity 3.32 log10(A) = log2(A), which will be useful in the calculation.

The SNR is given as 30 dB, so first convert that dB value: 10 log10(S/N) = 30 dB, hence log10(S/N) = 3, so S/N = 1000. Now we have S/N, and we have the formula for channel capacity, C = B log2(1 + S/N). With S/N = 1000 just calculated, C = 3 log2(1 + 1000) kbps. To convert log base 2 into log base 10 we use the given identity: C = 3 × 3.32 × log10(1 + 1000). Working that out, you get the channel capacity as approximately 30 kbps. These are the references. Thank you.
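The arithmetic of this example can be verified in a few lines; a minimal sketch:

```python
import math

snr_db = 30.0
bandwidth_hz = 3e3                       # 3 kHz voice-band channel

snr_linear = 10 ** (snr_db / 10)         # 30 dB -> S/N = 1000
capacity_bps = bandwidth_hz * math.log2(1 + snr_linear)

print(snr_linear)                        # 1000.0
print(round(capacity_bps / 1e3, 1))      # 29.9, i.e. about 30 kbps
```

Using log2 directly instead of the 3.32 × log10 approximation gives 3000 × log2(1001) ≈ 29.9 kbps, consistent with the rounded 30 kbps in the example.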