Welcome back to the NPTEL course on game theory. In the last session we introduced zero-sum and non-zero-sum games, and I also gave a formal definition of a game. Let me recall what exactly a game is. In a game there are 2 players, with strategy sets X and Y. Player 1's payoff is a function pi1 from X x Y to R, and player 2's payoff is a function pi2 from X x Y to R. We also introduced the Nash equilibrium (we will come back to the name Nash later). It is a pair of choices (x*, y*) in X x Y satisfying the following: pi1(x*, y*) = max over x in X of pi1(x, y*), and similarly pi2(x*, y*) = max over y in Y of pi2(x*, y). Let us understand this notion once again: when player 2 fixes his strategy to y*, player 1's choice x* maximizes player 1's payoff; similarly, when player 1 fixes his strategy to x*, player 2's choice y* maximizes player 2's payoff. This is what is known as a Nash equilibrium. Let us now specialize this to zero-sum games. Recall that in a zero-sum game pi1 is nothing but -pi2. Suppose (x*, y*) is an equilibrium. Then pi1(x*, y*) = max over x in X of pi1(x, y*), and pi2(x*, y*) = max over y in Y of pi2(x*, y). Since pi2 = -pi1, the second equality is the same as -pi1(x*, y*) = max over y in Y of -pi1(x*, y). Now let us elaborate. The first equality says pi1(x*, y*) >= pi1(x, y*) for all x in X; this is exactly the consequence of that equality. The second equality says -pi1(x*, y*) >= -pi1(x*, y) for all y in Y.
This is basically coming from the second equality. Now if we rewrite these two inequalities, pi1(x*, y*) comes in the middle. When we fix x*, y* is a maximizer of -pi1 and hence a minimizer of pi1; multiplying the second inequality by -1 gives pi1(x*, y*) <= pi1(x*, y) for all y in Y. And the first inequality tells us pi1(x, y*) <= pi1(x*, y*) for all x in X. So together, pi1(x, y*) <= pi1(x*, y*) <= pi1(x*, y). In a sense, this pair of choices (x*, y*) is such that when we fix y*, x* maximizes pi1, and at the same time when we fix x*, y* minimizes pi1 over Y. In one direction it is maximizing and in the other direction it is minimizing. In calculus such a point is known as a saddle point, and here we also add the word equilibrium: (x*, y*) is called a saddle point equilibrium. So in a zero-sum game it is not really necessary to consider pi2; everything is in one payoff. This essentially tells you that in a zero-sum game player 1 tries to maximize the utility function and player 2 tries to minimize the same utility function; the difference is only in their choices. Thus a zero-sum game is a triplet (X, Y, pi), where X is player 1's strategy set (I keep using the words strategy, choices and so on, but strategy is also used in a different sense), Y is player 2's strategy set, and pi from X x Y to R is the utility of player 1. It also determines the utility of player 2, namely -pi, because player 1 is maximizing pi and player 2 is minimizing it. A saddle point equilibrium is a pair (x*, y*) in X x Y such that pi(x, y*) <= pi(x*, y*) <= pi(x*, y).
So (x*, y*) is a pair of choices of players 1 and 2 respectively such that pi(x, y*) <= pi(x*, y*) <= pi(x*, y), and this is true for all x in X and y in Y. This is the definition. From now onwards we will concentrate on zero-sum games; we will come back to non-zero-sum games later, and for some time we will discuss zero-sum games. Now let me mention one of the most important results, the minmax theorem. Let me state the theorem. Let X and Y be compact and convex subsets of some metric spaces. I am stating it in a very general form, but we will consider restrictions: I will introduce compactness, convexity and all these notions in the context of R^n, and we will discuss that, but first let me state the theorem. Let pi be a function from X x Y to R which satisfies the following: pi as a function of x, for each fixed y in Y, is a concave function (recall that in x we are maximizing); similarly, pi as a function of y, for each fixed x in X, is a convex function. Then the minmax theorem says there exists (x*, y*) in X x Y such that pi(x, y*) <= pi(x*, y*) <= pi(x*, y) for every x in X and y in Y. In other words, (x*, y*) is a saddle point equilibrium for the game (X, Y, pi). So the game with strategy sets X and Y for players 1 and 2 respectively, and with utility pi for player 1, has a saddle point equilibrium under these assumptions: X and Y are compact and convex, pi is concave in the x variable for each fixed y, and pi is convex in the y variable for each fixed x. Now let me spend some time introducing these notions. What is convex?
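On a finite game, the chain inequality pi(x, y*) <= pi(x*, y*) <= pi(x*, y) can be checked mechanically: an entry is a saddle point exactly when it is the maximum of its column (player 1's direction) and the minimum of its row (player 2's direction). Here is a small sketch; the 3x3 payoff matrix is a made-up example for illustration, not one from the lecture.

```python
# A made-up 3x3 payoff matrix pi for player 1 (rows = player 1's choices,
# columns = player 2's choices); player 1 maximizes pi, player 2 minimizes it.
pi = [
    [4, 2, 5],
    [3, 1, 6],
    [8, 7, 9],
]

def is_saddle_point(pi, i, j):
    """pi(x, y*) <= pi(x*, y*) <= pi(x*, y) for all rows x and columns y."""
    column_max = max(row[j] for row in pi)   # best player 1 can get vs column j
    row_min = min(pi[i])                     # best player 2 can force vs row i
    return pi[i][j] == column_max == row_min

saddles = [(i, j) for i in range(3) for j in range(3) if is_saddle_point(pi, i, j)]
print(saddles)   # [(2, 1)] : pi[2][1] = 7 is its column's max and its row's min
```

Note that the search only has to compare finitely many entries here; the point of the theorem below is that the same conclusion holds for suitable infinite strategy sets.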
We restrict all our sets to subsets of R^n, some Euclidean space. Of course these notions can also be extended to general metric spaces and general topological spaces, but we will restrict to R^n only. So what is a convex set? A set X subset of R^n is convex if the following happens: for any x, y in X and any lambda in [0, 1], the point lambda x + (1 - lambda) y is in X. That is, take any two points x and y in X and any scalar lambda in [0, 1]; then lambda x + (1 - lambda) y should be in X. What is the geometric meaning? The points lambda x + (1 - lambda) y, as lambda ranges over [0, 1], form the line segment joining x and y. So the condition says: any two points of the set can be joined by a line segment lying completely inside the set. Now if you look at a set with a dent, and take one point on each side of the dent, part of the line segment joining them falls outside the set, so such a set is not convex; whereas a rectangle is a convex set, and similarly any disc is a convex set. Convex sets have a lot of interesting properties, but we will not go into all those details. Next, let us look at compactness. Compactness is a very important thing in many ways. Because we are restricting X to be a subset of R^n, in R^n a set is compact if and only if it is closed and bounded. This requires some real analysis; we assume that people are already familiar with some aspects of real analysis, so they will already understand compact sets, but we will only discuss some properties of them. The most important characterization is that every sequence in a compact set has a convergent subsequence whose limit lies in the set. So if X is a compact set, from any sequence in X you can always extract a convergent subsequence.
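The segment condition defining convexity can be tested pointwise. As a small illustration (the disc and the annulus below are my own examples, not sets from the lecture), this sketch samples lambda in [0, 1] and checks whether lambda x + (1 - lambda) y stays inside the set:

```python
def in_disk(p):
    """Unit disc in R^2: a convex set."""
    return p[0] ** 2 + p[1] ** 2 <= 1.0

def in_annulus(p):
    """Ring 0.5 <= |p| <= 1: not convex (it has a hole in the middle)."""
    return 0.25 <= p[0] ** 2 + p[1] ** 2 <= 1.0

def segment_stays_inside(member, x, y, steps=50):
    """Check that lambda*x + (1-lambda)*y is in the set for sampled lambdas."""
    for k in range(steps + 1):
        lam = k / steps
        p = (lam * x[0] + (1 - lam) * y[0], lam * x[1] + (1 - lam) * y[1])
        if not member(p):
            return False
    return True

# Two points that lie in both sets, on opposite sides of the origin.
x, y = (0.8, 0.0), (-0.8, 0.0)
print(segment_stays_inside(in_disk, x, y))     # True
print(segment_stays_inside(in_annulus, x, y))  # False: the segment crosses the hole
```

Sampling finitely many lambdas can only refute convexity, of course; it does not prove it, but it makes the geometric picture concrete.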
This is the most important property of compact sets; in fact one can show it is equivalent to compactness in finite dimensions, but we will not go into those details. For us, compactness essentially means the set is closed and bounded, and this is all we will use. The next thing we require is convex functions. Take a function f from X to R, where X is a convex subset of R^n. Then f is a convex function if it satisfies the following: for any x, y in X and lambda in [0, 1], f(lambda x + (1 - lambda) y) <= lambda f(x) + (1 - lambda) f(y). Let us try to understand this geometrically. Draw the graph of f, take a point x and a point y, and consider the point lambda x + (1 - lambda) y between them. The value of the function there is f(lambda x + (1 - lambda) y), while the corresponding point on the chord joining (x, f(x)) and (y, f(y)) has height lambda f(x) + (1 - lambda) f(y). Convexity says that for every lambda the chord lies above the graph: lambda f(x) + (1 - lambda) f(y) is always at least f(lambda x + (1 - lambda) y). That is exactly the definition of a convex function, stated geometrically. In fact, if you look at the set of points lying above the graph, that is known as the epigraph of f: the set of points (x, r) such that f(x) <= r. The epigraph is a convex set if and only if f is a convex function. More details can be found in a course on optimization, which you can see later. So now we have almost all the necessary ingredients: we have introduced compactness, convex sets, and convex functions.
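The chord inequality can also be checked numerically on sampled points. Here is a small sketch (the test functions t^2 and -t^2 are my own illustrations) that verifies f(lambda x + (1 - lambda) y) <= lambda f(x) + (1 - lambda) f(y) on a grid:

```python
def is_convex_on_samples(f, xs, lambdas, tol=1e-12):
    """Test the convexity inequality on all sampled pairs and lambdas.
    A True answer is only evidence, not a proof; a False answer is a
    genuine counterexample to convexity."""
    for x in xs:
        for y in xs:
            for lam in lambdas:
                lhs = f(lam * x + (1 - lam) * y)
                rhs = lam * f(x) + (1 - lam) * f(y)
                if lhs > rhs + tol:
                    return False
    return True

xs = [i / 10 for i in range(-30, 31)]   # grid on [-3, 3]
lambdas = [i / 10 for i in range(11)]   # lambdas in [0, 1]

print(is_convex_on_samples(lambda t: t * t, xs, lambdas))   # True: t^2 is convex
print(is_convex_on_samples(lambda t: -t * t, xs, lambdas))  # False: -t^2 is not
```

The failure for -t^2 is exactly the epigraph picture: the chord joining two points on an upside-down parabola dips below the graph.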
Next we need to introduce concave functions. What is a concave function? f is a concave function if and only if -f is a convex function: a function is concave if and only if its negative is convex. If we want to write the condition formally: f(lambda x + (1 - lambda) y) >= lambda f(x) + (1 - lambda) f(y) for all x, y in X and lambda in [0, 1]. Then we call f a concave function. Now we have introduced the necessary definitions, but why this convexity? The most important property of convex functions is this: any local minimum of a convex function is also a global minimum. Look at a convex function like a parabola: there is only one point where the function has a minimum, and it is also the global minimum; there are no other local minima. If a function had a local minimum here and another, higher local minimum there, it could not be a convex function, because the chord between the two points would dip below the graph. All these properties you can find in any optimization course. This is the crucial fact that we use. So what we will discuss next is the minmax theorem. Let me recall it: we assume X and Y are compact convex subsets of R^n, and pi is a function from X x Y to R, concave in x for each fixed y and convex in y for each fixed x. Then there exists (x*, y*) in X x Y such that pi(x, y*) <= pi(x*, y*) <= pi(x*, y) for every x in X and y in Y. We will specialize X and Y to some specific classes of sets; in fact this general statement is not the first version of the theorem that was proved. We need to introduce some notation in order to explain all that, so let us go to that.
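The "local minimum = global minimum" property can be seen on a grid. Here is a small sketch (both test functions are my own illustrations): for the convex function every grid local minimum attains the global minimum, while the non-convex one has a second local minimum strictly above it:

```python
def local_minima(f, xs):
    """Indices i where f(xs[i]) is no larger than at either grid neighbour."""
    vals = [f(x) for x in xs]
    return [i for i in range(1, len(xs) - 1)
            if vals[i] <= vals[i - 1] and vals[i] <= vals[i + 1]]

xs = [i / 100 for i in range(-300, 301)]   # grid on [-3, 3]

def convex_f(t):
    return (t - 1) ** 2                    # convex: unique minimum at t = 1

def wavy_f(t):
    return (t * t - 1) ** 2 + 0.5 * t      # non-convex: two local minima

for f in (convex_f, wavy_f):
    global_min = min(f(x) for x in xs)
    gaps = [f(xs[i]) - global_min for i in local_minima(f, xs)]
    print([round(g, 3) for g in gaps])     # convex case: every gap is 0.0
```

For the non-convex function, one of the printed gaps is well above zero: that local minimum is a trap that a convex function can never have.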
Before going there, let us understand one small thing. The X and Y here are assumed to be convex sets, which in particular means they are infinite sets. But look at the matching pennies game. Here X = {H, T}, with only two choices, and Y = {H, T} again, and we have pi(H, H) = 1 = pi(T, T), while pi(H, T) = -1 = pi(T, H). Now, what is the saddle point? A saddle point equilibrium means that if player 2's strategy is fixed, player 1's choice maximizes pi, and if player 1's strategy is fixed, player 2's choice minimizes pi. Does such a pair exist? For example, can (H, H) be a saddle point? If player 2 has fixed H, then player 1's best choice is to play H; so H is certainly player 1's best response to player 2's H. But what about player 2's H? If player 1 has fixed H, player 2 would certainly not like to play H, because then he pays 1; instead he would deviate to T, where pi(H, T) = -1, so he gains 1. So (H, H) is not a saddle point equilibrium. In fact no pair is a saddle point equilibrium here. So how do we resolve this? Here is a game which admits no saddle point equilibrium. But go back to the theorem statement: the theorem guarantees a saddle point equilibrium. If you look at it, there are two assumptions on the strategy sets, compactness and convexity; and of course another assumption, which we should not miss, is that pi should be concave in x and convex in y, which is also necessary. Now look at this game: X has only two points and Y has only two points.
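The claim that no pair works can be verified by exhaustive search; the sketch below checks all four pure pairs of matching pennies against the saddle point condition:

```python
# Matching pennies: pi is player 1's payoff (+1 if the coins match, -1 if not).
# Player 1 maximizes pi, player 2 minimizes it.
choices = ["H", "T"]
pi = {("H", "H"): 1, ("T", "T"): 1, ("H", "T"): -1, ("T", "H"): -1}

saddle_points = []
for xs in choices:
    for ys in choices:
        best_p1 = max(pi[(x, ys)] for x in choices)   # player 1's best vs ys
        best_p2 = min(pi[(xs, y)] for y in choices)   # player 2's best vs xs
        if pi[(xs, ys)] == best_p1 == best_p2:
            saddle_points.append((xs, ys))

print(saddle_points)   # [] : no pure pair is a saddle point equilibrium
```

Whichever pair you fix, one of the two players gains by deviating, so the list stays empty; this is exactly why mixed strategies are needed below.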
In fact, if I want, I can identify X with {0, 1} and Y with {0, 1}, and then write pi from X x Y to R as pi(0, 0) = pi(1, 1) = 1 and pi(0, 1) = pi(1, 0) = -1. We can easily verify now that these are subsets of the real numbers, and that X and Y are not convex. They are of course compact, because they are discrete sets with only two points: closedness is automatic and boundedness is also automatic, and hence they are compact. But convexity is the big issue. Because of this non-convexity, there is no saddle point. So how do we handle it? Look at this game again: say you are the second player and I am the first player. Neither of us would want to commit to either H or T, that is, 0 or 1. The best thing for me to do is to sometimes play 0 and sometimes play 1. What do we mean by that? That requires the introduction of mixed strategies. What is a mixed strategy? A mixed strategy for player 1 is a probability distribution on X, and similarly a mixed strategy for player 2 is a probability distribution on Y. So let me define what this is. What is a probability distribution on X? We will only discuss discrete sets, basically finite sets; if X is a continuum then it requires measure theory and probability theory, and we will not go into those technical details, so we only look at discrete sets. Suppose X is a finite set, say {x_1, x_2, ..., x_m}. What is a probability distribution on X?
A probability distribution mu on X assigns mu(x_1) = a_1, mu(x_2) = a_2, ..., mu(x_m) = a_m, subject to the following conditions: a_1, a_2, ..., a_m are all greater than or equal to 0, and a_1 + a_2 + ... + a_m = 1. That means I am choosing x_1 with probability a_1, x_2 with probability a_2, and x_m with probability a_m. So instead of deterministically picking one of x_1, x_2, ..., x_m, what I am now going to do is choose one of them randomly: x_1 with probability a_1, x_2 with probability a_2, and so on up to x_m with probability a_m. This randomization has the following effect: at any point of time when you are playing this game, you do not let the others know what you are going to choose. You choose randomly; you can choose any of them, and the distribution you follow is given by this mu. So in fact we write either mu or simply the vector (a_1, a_2, ..., a_m), which is a vector in Delta_m. What is Delta_m? Delta_m is nothing but the set of (lambda_1, lambda_2, ..., lambda_m) such that all the lambda_i are non-negative and they sum to 1. This is the probability simplex in m dimensions. Such a vector is a mixed strategy on X, and similarly we can define one for player 2. So we stop now, and we will continue in the next session.
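To close, here is a small numerical sketch of the mixed-strategy definition above. The particular set {x1, x2, x3} and the weights a = (0.5, 0.3, 0.2) are an arbitrary illustration; the code checks membership in the probability simplex Delta_m and draws pure choices according to the mixed strategy:

```python
import random

def is_in_simplex(a, tol=1e-9):
    """Check that a = (a_1, ..., a_m) lies in the probability simplex Delta_m."""
    return all(ai >= -tol for ai in a) and abs(sum(a) - 1.0) <= tol

def sample_mixed(points, a, rng):
    """Draw one pure choice from points according to the mixed strategy a."""
    u, cum = rng.random(), 0.0
    for point, weight in zip(points, a):
        cum += weight
        if u < cum:
            return point
    return points[-1]   # guard against floating-point round-off

points = ["x1", "x2", "x3"]
a = [0.5, 0.3, 0.2]               # hypothetical weights, just for illustration
print(is_in_simplex(a))           # True
print(is_in_simplex([0.5, 0.6]))  # False: entries sum to 1.1

rng = random.Random(0)            # fixed seed so the run is reproducible
draws = [sample_mixed(points, a, rng) for _ in range(10000)]
print(draws.count("x1") / 10000)  # empirically close to a_1 = 0.5
```

Playing a mixed strategy in the matching pennies game means exactly this kind of randomized draw on {H, T}, which is what the next session builds on.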