Hi folks, this is Matt again. I'm going to tell you a little bit about the Shapley value now, which is one of the most prominent ways of dividing up the productive value of a society, that is, of some set of individuals, among its members. The basic idea in coalitional or cooperative games, when it comes to allocating value, is to have some notion of what the right way to do things is; we might even say, in quotes, what's the "fair" way for a coalition to divide up its payoff. That's obviously going to depend on how we define fairness, and the literature has basically taken axioms as the primary way of expressing the desired properties of rules for dividing things up. So what we're going to do is fix some set of axioms or properties that we want to satisfy, and then see what that gives us. The Shapley value is based on Lloyd Shapley's idea that members should receive payoffs that reflect their marginal contributions. So basically, look at what a person adds when we add them to a group; they should get something that reflects their added value to the society. Okay, so what's the tricky part about this? Let's take a quick example that will give us an idea of why we have to be careful in doing this. Suppose that everybody together in a society can generate one, but that if we're missing any member of society, we get zero. So this is, say, a committee, and the whole committee has to be present in order for it to do anything; if it's missing any of its members, it can't decide on anything. So in this situation we've got v(N) = 1, and v(S) = 0 if we're looking at any S that's smaller than N.
So in this situation, what's true? If we take any individual out of this group, their marginal contribution is one, right? Everybody is essential for generating this one, so everybody's marginal contribution to the society without them is one. And in this situation, we can't pay everybody their full marginal contribution toward the grand coalition. So we're going to have to think about some way of weighting contributions in order to come up with something reasonable. Obviously, for this particular game, it would be reasonable for everybody to get 1/n of the value. But in situations where there might be asymmetries in terms of who contributes which value, we're going to have to think fairly carefully about how this should be weighted. Okay, so Shapley's axioms are going to give us a handle on this; let's take a look at those. The first idea is a very simple one, one which pretty much any rule you would think of in these settings is going to satisfy: symmetry. If we think of two different members of society, say i and j, and they contribute the same thing to every possible coalition in which they could be a member, then they're completely interchangeable. That is, if we're looking at some coalition that has neither i nor j in it, adding i to that coalition gives exactly the same value as adding j to that coalition. If they're interchangeable, then they should get the same allocation of value. So if ψ is the way we're dividing up the value from some coalitional game, we should be giving the same thing to i as to j when they're completely interchangeable. This is a fairly uncontroversial axiom; it captures a basic notion of fairness that if individuals are completely equivalent, they should get equivalent payments. Okay, next axiom: dummy players.
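The committee example can be checked numerically. Below is a minimal Python sketch that computes the Shapley value by brute force, averaging each player's marginal contribution over all orderings in which society could be built up (the permutation view developed later in this lecture). The helper name `shapley` and the set-based encoding of coalitions are my own choices for illustration.

```python
from itertools import permutations
from math import factorial

def shapley(players, v):
    """Average each player's marginal contribution over all |N|! orderings."""
    phi = {i: 0.0 for i in players}
    for order in permutations(players):
        coalition = frozenset()
        for i in order:
            phi[i] += v(coalition | {i}) - v(coalition)  # i's marginal contribution
            coalition |= {i}
    nfact = factorial(len(players))
    return {i: total / nfact for i, total in phi.items()}

# The committee game: worth 1 only when every member is present.
members = [1, 2, 3]
v = lambda s: 1.0 if s == frozenset(members) else 0.0
print(shapley(members, v))  # every member gets 1/n = 1/3
```

Since all members are interchangeable here, symmetry alone already forces the equal split; the brute-force average agrees.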
I'm sure everyone has had some experience with people like this. What's the idea? It's a situation where you add a person i to a coalition, and they add absolutely nothing: no matter what S we look at, if we add i to that S, we get the same value as without that individual. So basically the person contributes nothing, no matter what coalition we're looking at. The axiom is then: if an individual is a dummy player, we give them nothing. Now, on one hand this is a fairly reasonable axiom; if somebody contributes absolutely nothing, there's no reason they should get anything. On the other hand, this depends very much on the perspective you're taking. If we're thinking about a society, it could be that i contributes nothing for reasons beyond i's control: something happened, they had an accident, or for some particular reason they're unable to function. Society might still want to allocate something to those individuals. So it really depends on what the time perspective is, whether we're thinking about social insurance, and so forth. But nonetheless it's a fairly intuitive axiom, and it's going to be a fairly powerful one in what it delivers. The next axiom is additivity. This one you might think of as more about the process of allocating value. Suppose we're looking at a cooperative or coalitional game that separates very nicely into two different parts: you've got one cooperative game, you've got another one, and then we think about what you get when you sum the two together.
And the idea is that if we're looking at two different cooperative games, then allocating in the summed game should give the same thing as allocating according to the first game, allocating according to the second, and adding those two allocations up. So if we're looking at a cooperative game where the value of any coalition is just what it gets under the first game plus what it gets under the second game, then the way we allocate value should be how we allocated things under the first game plus how we allocated things under the second game. What this means mathematically is fairly obvious; how you interpret it, and what the story is for why you might want to satisfy an axiom like this, is a little harder. You could think of it this way: maybe society produces according to v1 one day and according to v2 the next day, and if what it produces the second day doesn't depend on what it produced the first day, then we should be able to allocate the fruits of the first day's production, and then allocate again on the second day. Since those things don't interact at all, we should be able to do that separately, and what an individual gets is just the sum of the two. So you can think of a fairly logical story for this kind of axiom. Okay, so what do we get from these three axioms? The Shapley value. Let's have a look at exactly how it's defined. The Shapley value is going to be built on marginal calculations: what does individual i add to coalitions that don't contain i? We compare the coalition with i to the coalition without i, and take a peek at how much extra that generates.
Then what we're going to do is weight that by the different possible ways in which we could have come up with this marginal calculation, and divide through by all the possible ways we could have done this. We make sure we average over all these things so that everything sums up to the full value. Okay, that's the Shapley value; we'll dissect it in more detail in a moment. And what's the theorem? The theorem is that if we look at a coalitional or cooperative game, there's a unique way of dividing up the full payoff of the grand coalition (so we're making sure everything gets divided) that satisfies symmetry, dummy, and additivity. If you put those three axioms together, there's only one way to do it, and that way is the Shapley value. That's a pretty powerful theorem. There's a fairly elegant, fairly intuitive proof of this; we're not going to go through it in detail, but we'll go through some explanations. You can find the proof fairly easily in a number of different places; there's a nice book by Osborne and Rubinstein, for instance, which is free online, which has a proof of this. Okay, let's have a peek at the actual value in terms of how it breaks down, and then we'll look at some examples. The formula for what individual i is getting looks a little daunting, but the intuitions are fairly simple. We're thinking of marginal contributions; how do they come about? What we're going to do is think of all the different possible ways we could build society up. For instance, we could build society up by first adding person 1, then adding person 3, then adding person 2; that would be one order in which we could build the society up.
We could also have built it up by first adding person 2, then adding person 3, then adding person 1. So for a three-person society there's a whole series of different orders in which we could go about building things up, and according to each one of these orders we'll have a different set of marginal contributions along the way: here, first person 1 contributes something, then person 3 adds their production, then person 2 adds theirs, and so forth. So we end up with these different contributions, and that's what the formula is going to capture. We look at these different sequences, and the first thing we calculate is, as we went along the sequence, what did i add when they were added? Next, we weight this by the number of different possible ways we could have built up the coalition S before i was added, which is |S| factorial. Then we also weight it by the number of different orders in which we could add the individuals who haven't been added yet, after i has been added: there are |N| − |S| − 1 people still left, and taking the factorial of that gives us how many different orders we can still add people in. So we weight by that. Then we sum over all possible coalitions that could be there before i, and divide through by the total number of different orderings we could have over the people in the society. Okay, so that's the Shapley value. In terms of understanding this, think of the ways we could build up a three-person society: we could add person 1 first, then 2, then 3; or 1 first, then 3, then 2; or 2 first, then 1, then 3; or 2 first, then 3, then 1; or 3 first, then 1, then 2; or 3 first, then 2, then 1.
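The weights just described, |S|!(|N| − |S| − 1)!/|N|!, can be recovered by brute force: enumerate all 3! orderings of a three-person society and count how often person 1's marginal contribution is taken over each prior coalition S. A small Python sketch (the variable names are my own):

```python
from itertools import permutations
from collections import Counter
from fractions import Fraction
from math import factorial

players = [1, 2, 3]
i = 1
counts = Counter()
for order in permutations(players):
    pos = order.index(i)
    before = tuple(sorted(order[:pos]))  # coalition S built before i arrives
    counts[before] += 1

nfact = factorial(len(players))  # 3! = 6 orderings in total
for S in sorted(counts):
    # fraction of orderings in which i's marginal contribution is v(S ∪ {i}) − v(S)
    print(f"S = {S}: weight {Fraction(counts[S], nfact)}")
```

The empty coalition and the coalition {2, 3} each show up with weight 1/3, while {2} and {3} each show up with weight 1/6, matching |S|!(|N| − |S| − 1)!/|N|!.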
So we could have done this in six different orders. If we want to figure out what person 1 adds when we add them: in the first ordering, it's v(1); in the second, also v(1); in the third, they're adding v(12) − v(2), since 2 was already there; in the fourth, they're getting v(123) − v(23); in the fifth, v(13) − v(3); and in the sixth, again v(123) − v(23). Each one of these terms gets weighted by one sixth. So the v(1) terms get a total weight of one third, the v(123) − v(23) terms also get a total weight of one third, and the other two terms, v(12) − v(2) and v(13) − v(3), each get a weight of one sixth. That gives us the total Shapley value, and tells us what person 1 should be getting in this setting. Now let's take a look at a simpler example, with just two individuals, and try to figure out exactly what the Shapley value gives. Two people form a partnership. Person 1 alone generates production of 1, person 2 alone generates production of 2, and they say, wow, let's get together and form a partnership, we can do better than we can separately: together they generate a total value of 4. So this is nicely superadditive; we get a higher value with the two together. And at the end, they try to figure out how to divide the 4 between them. In this case, we could have added person 1 first and then formed {1, 2}, or added person 2 first and then formed {1, 2}; there are only two different ways we could have built society up. If we're trying to figure out what to give person 1 out of this: added first, they would get v(1), which is 1.
Added second, they would get v(12) − v(2), their marginal contribution when they're added second, while v(1) is their marginal contribution when they're added first. What's v(12) − v(2)? It's 4 − 2 = 2. Each of these gets a weight of one half, because there are two orderings. So we're adding one half of 1 plus one half of 2, and we get 1.5 = φ(1). You can check that φ(2) is then going to be 2.5. So the Shapley value gives us that if these are the contributions people are making, the right amount is 1.5 for person 1 and 2.5 for person 2. Each of them is getting a value that takes into account what these coalitional values are in dividing the 4; they don't just split the 4 fifty-fifty, they do a different calculation, which comes out at 1.5 and 2.5 in this case. Okay, so what about the Shapley value overall? It allocates the value of a group according to marginal calculations, and it's captured by some very simple logic and axioms. And you can think of other axioms, other fairness ideas or other properties you'd like your rule to satisfy, and those will make different kinds of predictions. Next we'll take a look at the core, which uses a different kind of logic than the Shapley value for making predictions about how a society should divide up its value.
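The two-person partnership can be checked directly. A short Python sketch of the hand computation above (the dictionary encoding of the game is just one convenient choice):

```python
# Partnership game: v(1) = 1, v(2) = 2, v(12) = 4.
v = {frozenset(): 0.0, frozenset({1}): 1.0,
     frozenset({2}): 2.0, frozenset({1, 2}): 4.0}

# Two orderings (add 1 first, or add 2 first), each weighted 1/2.
phi1 = 0.5 * (v[frozenset({1})] - v[frozenset()]) \
     + 0.5 * (v[frozenset({1, 2})] - v[frozenset({2})])
phi2 = 0.5 * (v[frozenset({2})] - v[frozenset()]) \
     + 0.5 * (v[frozenset({1, 2})] - v[frozenset({1})])
print(phi1, phi2)  # 1.5 2.5
```

Note that the two shares sum to v(12) = 4, so the grand coalition's payoff is fully divided, as the theorem requires.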