Welcome back. What we will now study is the setting of a typical communication problem. The case I am going to refer to is what is called point-to-point communication. In point-to-point communication, we have information at a source, and we want to replicate this information at a destination. Let us denote the information at the source by a random variable S, and whatever is replicated at the destination by a random variable S hat. The goal is to get S and S hat to agree; we want them to come as close as possible. Now, how does the source send information to the destination? It is sent over a medium, and this medium is what is called a channel. The medium could be, for example, telegraph, post, email, sound waves, or the kind of medium you are using right now, the internet. All of these are various types of communication media, and the source is to be sent to the destination over such a medium. Now, you might ask: can you really send any source over any medium? For example, the source could be the letters of the alphabet, written pictorially in a book or on a blackboard, and the medium could be radio. If you want to send these letters over the radio medium, the letters themselves cannot directly go into it.
So what one has to do is somehow convert these letters into signals that can actually be sent as inputs into this particular medium. The medium comes with a definition of its allowable inputs and possible outputs: what are called channel inputs and channel outputs. These are predefined properties of the medium, or channel, itself. For example, if this is a radio channel, then it takes electromagnetic inputs and produces electromagnetic outputs. It cannot take pieces of paper on which the source is written as input, nor can sound itself be taken as an input to this medium. But if there is a mismatch between the space of the source and the space of channel inputs, then how does one even use this channel to communicate? What one needs is an adapter: a way for the source to talk to the medium, and for the medium to then talk to the final destination. In electrical engineering or power engineering, an adapter is the device that takes two different formats and makes them compatible with each other. In the language of communication, this particular device is known as an encoder. What the encoder does is take the source random variable and map it to a possible channel input. Let me denote this encoder by a function f. The source goes into the encoder, and the encoder maps it to a possible channel input; let us call that channel input X. Now, the channel has a characteristic amount of noise that it adds to the channel inputs.
When you send an input X, the channel produces a possible output, call it Y, with a certain probability that has been told to us. That is, when an input X is sent, it is not necessary that X itself comes out of the channel; some other output might come, and that output is denoted Y. Because the output could be something else entirely, it does not even have to be from the same space. For example, it is quite possible that you send an electrical impulse at one end, and what comes out at the other end of the channel is a sound or a radio wave; all of these are valid definitions of a channel, of a channel input, and of a channel output. Now, this output has the same problem again: the output of the channel may not be in the format you want for your final destination. At the destination you are looking to replicate the source, so you want the output S hat to be in that format, whereas Y, which comes out of the channel, is not in that format at all. So Y has to pass through another adapter. This second device is what is called a decoder; let us denote it by g. You have a decoder at the other end of the channel, which takes the output of the channel and maps it to a possible valid value for the destination. So this is the structure, or the setting, of a typical communication problem: you have a source S; that source is seen by an encoder, which produces a channel input X; that input passes through a channel and emerges as Y; Y is seen by a decoder g, which maps Y to an S hat. The goal of communication is to eventually get S and S hat to be close.
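The chain described above can be sketched in a few lines of code. This is a minimal illustrative simulation, not anything from the lecture itself: the concrete channel chosen here is a binary symmetric channel that flips its input bit with an assumed probability `p_flip`, and all function names and parameters are assumptions made for the sketch.

```python
import random

def simulate(f, g, p_flip=0.1, trials=10_000, seed=0):
    """Simulate the source -> encoder -> channel -> decoder chain
    for a uniform binary source over a binary symmetric channel."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        s = rng.randint(0, 1)            # source random variable S
        x = f(s)                         # encoder: channel input X = f(S)
        y = x ^ (rng.random() < p_flip)  # channel flips X with prob p_flip
        s_hat = g(y)                     # decoder: estimate S hat = g(Y)
        errors += (s_hat != s)           # count disagreements of S and S hat
    return errors / trials

# Identity encoder and decoder: S and S hat disagree roughly
# whenever the channel flips, so the error rate is near p_flip.
print(simulate(f=lambda s: s, g=lambda y: int(y)))
```

Changing `f` and `g` changes the error rate; the rest of this lecture is about why choosing them well is the whole problem.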
We can measure this closeness through a function; let us call it d. This function d maps the pair (S, S hat) to a real number, or let us say to the interval from 0 to infinity. So d is a measure of the distance between S and S hat; in the communication language it is called distortion. It measures how far S and S hat are from each other. For instance, if S and S hat are real-valued random variables, d could be the function d(S, S hat) equal to the norm of S minus S hat squared. This is one way of measuring the distance between S and S hat. If you want S and S hat to agree exactly, then d could be the indicator of S not equal to S hat, which takes value 1 when S is not equal to S hat and value 0 only when S equals S hat. This is another way of measuring the distance: when S and S hat are exactly equal you get distortion 0, and otherwise you get distortion 1. So there are many different ways to define a distortion between the source and the destination. This distortion is part of the problem definition; it is part of defining what we mean by the quality of communication, and indeed by communication itself, by recovery of the source at the destination. So then what is the problem of communication? Given this diagram, the problem of communication is to come up with the right choice of encoder and decoder. And what do I mean by the right choice? You want to choose the encoder and decoder so that you get as little distortion as possible.
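The two distortion measures mentioned above are easy to write down concretely; this is just a direct transcription of the definitions, with hypothetical function names.

```python
def squared_error(s, s_hat):
    """d(s, s_hat) = |s - s_hat|^2, for real-valued source and estimate."""
    return (s - s_hat) ** 2

def indicator(s, s_hat):
    """Indicator distortion: 1 when s != s_hat, 0 only when they agree."""
    return int(s != s_hat)

print(squared_error(3.0, 2.5))  # 0.25
print(indicator('a', 'a'))      # 0
print(indicator('a', 'b'))      # 1
```

Note the difference in character: squared error grades partial agreement, while the indicator accepts nothing short of exact recovery.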
So the problem of communication is to minimize the expectation of d(S, S hat) over all functions f and g. If you take d, for example, to be the norm of S minus S hat squared, and for simplicity assume these are scalars, then this expectation becomes the expectation of (S minus S hat) squared, and we would be talking of minimizing it over f and g. Now let us take this a little further. You may already have noticed some resemblance to the kinds of problems we have studied. To establish that resemblance, let us think about this problem in the frameworks we already have. What do we have here? The first entity, the encoder, is simply a function: it maps S to X, your source to a possible channel input. The second entity, the decoder, maps Y to S hat. So f and g are just functions. It is clear that this is a stochastic decision problem: there is a random variable S; there is an encoder and a decoder, which are functions mapping random variables to what you could call actions; and based on those actions we receive a cost, encoded in the function d. So the typical communication setup is really a stochastic decision problem, although that is not how it is normally studied; people do not normally study it as a stochastic decision problem.
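For a small enough problem, this minimization over all pairs (f, g) can even be done by brute force. The sketch below, under the same assumed setup as before (uniform binary source, binary symmetric channel with flip probability `p_flip`, indicator distortion), enumerates all four encoders and all four decoders from {0, 1} to {0, 1}, represented as lookup tables, and computes the exact expected distortion of each pair.

```python
from itertools import product

def expected_distortion(f, g, p_flip):
    """Exact E[d(S, S hat)] for a uniform binary source over a binary
    symmetric channel, with indicator distortion 1{s != s_hat}.
    f and g are length-2 tuples used as lookup tables."""
    total = 0.0
    for s in (0, 1):                    # P(S = s) = 1/2
        x = f[s]                        # encoder output
        for flip in (0, 1):             # channel noise realization
            p = p_flip if flip else 1 - p_flip
            y = x ^ flip                # channel output
            s_hat = g[y]                # decoder output
            total += 0.5 * p * (s_hat != s)
    return total

# Exhaustive search over all encoder/decoder pairs.
p_flip = 0.1
maps = list(product((0, 1), repeat=2))  # (value at 0, value at 1)
best = min((expected_distortion(f, g, p_flip), f, g)
           for f in maps for g in maps)
print(best)  # the best pair leaves exactly the channel's flip probability
```

Of course, for any realistic source and channel the space of functions is far too large to enumerate; the point of the structural view developed in this lecture is to reason about the optimal (f, g) without such a search.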
But once we look at it in the framework we have defined, it really is a stochastic decision problem. And now that we know this, we can start asking sharper questions. For example, what are f and g? They are the functions chosen to define your encoder and decoder. Another way of looking at them is to say that the pair (f, g) together forms a policy, and the communication problem is a stochastic decision problem in which we need to find the optimal policy: finding the optimal encoder and decoder is really about finding the optimal policy. So we are already seeing a stochastic control analog building here: there is a policy to find, and there is a cost. The question then is: if there is a policy, what is the information structure in this problem? Let us go back and look at it. At the end of the day, the goal of the communication problem is to reproduce the source at the destination. The reason for reproducing the source at the destination is that the source is not already known there; all this paraphernalia has been introduced precisely because the source is not known at the destination. What this means is that the entity producing S hat, the entity at the destination, does not actually have knowledge of S. So S is not known to g. What, then, is known to g? Well, g is standing at the other end of the channel.
Think of the channel as a medium, such as a radio, with no other way of talking to the source. The decoder is compelled to look only at what comes out of the medium and map that to a destination value. So the only thing the decoder actually knows is the value of Y; the information of g is Y. What does the encoder know? The encoder obviously knows only the source: it has seen the source and mapped it to the channel input; thereafter the channel has taken over and produced its output, and the decoder sees that output at the other end and produces its estimate S hat. So the information of f is S; the encoder knows only S. Now let us see what this actually means. We have a policy with two decision makers, or two controllers. The first-acting controller sees S. The second-acting controller sees Y. Now, how is Y produced? Y is produced from X, and X itself is f(S); in other words, Y depends on f(S). This means that the information of the second-acting controller depends on the action of the first-acting controller, but the second-acting controller does not have access to the information of the first-acting controller: g's information depends on f(S), yet g does not have access to S, the information that was used to produce f(S). What does all of this mean? I guess it is evident by now that this problem has a non-classical information pattern. The communication problem has a non-classical information structure, and indeed this is exactly what makes communication a problem in itself.
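The dependence of the decoder's information on the encoder's action can itself be made concrete. In the same assumed binary setup as earlier (a hypothetical binary symmetric channel with flip probability `p_flip`), the sketch below estimates the distribution of Y under two different encoders: the decoder's observation is shaped by which f the first controller chose, even though the decoder never sees S.

```python
import random

def fraction_of_ones(f, p_flip=0.1, trials=20_000, seed=1):
    """Empirical P(Y = 1) under encoder f, for a uniform binary
    source sent over a binary symmetric channel."""
    rng = random.Random(seed)
    ones = 0
    for _ in range(trials):
        s = rng.randint(0, 1)            # source, unseen by the decoder
        x = f(s)                         # first controller's action
        y = x ^ (rng.random() < p_flip)  # second controller's information
        ones += y
    return ones / trials

# The decoder's information depends on the encoder's action:
print(fraction_of_ones(lambda s: s))  # near 0.5; Y varies with S
print(fraction_of_ones(lambda s: 0))  # near p_flip; Y reveals nothing of S
```

This is the non-classical pattern in miniature: the first controller's choice of f shapes what the second controller observes, while the second controller never gets the first controller's information S.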
This non-classical information structure is also the reason why communication is in fact a problem. We need to communicate because the source is not known at the destination; and once the source is not known at the destination, whatever encoded messages we send are shaped by the actions we choose at the source. Those actions affect the information at the other end, but the entity at the other end, your decoder or receiver, does not have access to the information you had when you produced your actions. As a result, this is a problem with a non-classical information structure. Coming back to the communication setting: all communication problems are really problems of non-classical information structure. Although they are not usually studied in that way, it is implicit that such problems have in them a non-classical information pattern, which is why they merit special attention. We will see this a little more deeply in the next class. Now that we know the problem has a non-classical information pattern, we can start asking sharper questions. For example, is the information structure static or dynamic? If it is static, then maybe we can solve the problem, because we already know a little about static information structures. If it is dynamic, then we can ask questions about the dual effect: is there such a thing as a dual effect in this particular problem? All of these questions can be asked once we view the communication problem as a stochastic control problem. Another way of viewing it as a stochastic control problem is to think of the encoder and decoder as a team: a team of players that are trying to communicate, whose cost is given by the function d.
This team comprises two agents, the encoder and the decoder, and between them they have a non-classical information structure. This is the view we will take forward for the remaining part of the course, and as we get to the end of the course, I hope it is becoming evident how all these various problems, stochastic control problems, problems from organizational structures, and now problems from communication, are really all instances of one common theme: they are stochastic decision problems under various information structures. So I will see you for the rest of the course.