Everyone, thanks for having me. So yeah, as I said, I'm going to be talking about some joint work with my supervisor on a standard problem in economics, the common prior problem, which I'll explain. Basically all the previous work on this problem has been done in situations where you assume full support of priors: you assume that people put positive probability on each possible state. I'm looking at situations where that's not necessarily the case, trying to extend the existing results. The broad framework, a bit similar to what Matthew talked about yesterday, is that there is some set of states of the world. We call this set omega, but the name doesn't really matter. Each state of the world is considered to be a full description of the world. In the standard scenario, each player has some prior beliefs over the set of states, before-the-fact beliefs. Then each player receives some private information, which we represent through partitions, which again I'll talk about. And then, through a process of Bayesian updating, they produce posteriors about the world. This is a situation of incomplete information, and it's a pretty standard setup. Here is the example we're going to be working with for basically the entire talk: a really nice, simple example. We have a set of six states of the world, A through F. And we can think about partitions. We have a red player here, and the red player has this partition. What the partition means is that they receive some sort of signal, and the signal tells them either the state is A or C, or the state is E or F, or the state is B or D. But they don't get any extra information beyond that.
They can't tell the difference between E and F, and they can't tell the difference between B and D. They have no way to tell. That's an example of a partition. We can also think about the green player, who has a different partition over the state space. This person, if they receive this signal here, knows for sure that the state is F. But otherwise there are still elements of uncertainty. So it's possible to know the state for sure, but not necessarily. What I'm going to be working with is, in some sense, the reverse question: suppose we don't know what the priors are, and we only observe each player's posteriors, their after-the-fact beliefs once they have their private information. We then want to think about which priors are consistent with these observed posteriors. In particular, what we're really interested in is whether there is some single prior that is consistent with every player's posteriors. We see each player's posteriors; could they all be coming from the same prior? There are very strong results if that is the case. So let's look at our example, where I've just written down some posteriors. Numbers in red are for the red player. If the red player receives the signal that they're in this cell, they think A happens with probability one half and C happens with probability one half. If the green player is informed they're in F, they think they're in F with probability one, perhaps not surprisingly. If the green player receives the signal that the state is C, D, or E, then they think it's C with probability 0.3, D with probability 0.7, and E with probability zero. We allow posteriors to be zero, and priors as well.
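To keep the running example concrete, here is a minimal sketch in Python of the six states, the two partitions, and the posteriors just described. This encoding is my own, not the paper's; the cells not read out explicitly in the talk (red on E-F and B-D, green on A-B) are filled in with the uniform values the later worked examples rely on.

```python
# Sketch encoding of the running example (names and encoding are mine).

STATES = ["A", "B", "C", "D", "E", "F"]

# Each player's partition: a list of information sets (cells).
PARTITIONS = {
    "red":   [{"A", "C"}, {"E", "F"}, {"B", "D"}],
    "green": [{"A", "B"}, {"C", "D", "E"}, {"F"}],
}

# Posteriors conditional on the cell containing the true state.  Each
# state lies in exactly one cell per player, so a flat map suffices.
POSTERIORS = {
    "red":   {"A": 0.5, "C": 0.5, "E": 0.5, "F": 0.5, "B": 0.5, "D": 0.5},
    "green": {"A": 0.5, "B": 0.5, "C": 0.3, "D": 0.7, "E": 0.0, "F": 1.0},
}

# Sanity check: posteriors sum to one within every information set.
for player, cells in PARTITIONS.items():
    for cell in cells:
        assert abs(sum(POSTERIORS[player][s] for s in cell) - 1.0) < 1e-9
```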
So if these are the posteriors, we can think about priors that could have generated them, and here are some examples. It could be that the red player (this mu is a prior) thought, before the fact, that each state was equally likely. If they thought each state was equally likely and then received the information that they're actually in A or C, this element here, they work out: well, A and C were equally likely, so I should put them as equally likely in my posterior. Similarly for the rest. But they might not have thought that. This other prior here is also consistent. It could be that the red player was absolutely certain, before the fact, that the state of the world was not A and was not C. Then they got their private information saying the state was A or C: they were mistaken. And so they put down some posterior here. These posteriors are still considered consistent with that prior; the thing that happened was simply something totally unexpected. We have similar options for the green player. All we need is consistency: the priors, applied on these partitions, have to be consistent with the posteriors. We can put zeros on things; that's not an issue. Now, this diagram here is fine, but it's not super nice, so let's get rid of a lot of extraneous lines and make it much neater. All I've done is draw a line of the appropriate color between states in the same partition cell. Now it looks like a nice little graph with colored lines. You can set this up as a formal graph, but it's just a nice little graph. It's lovely; I think it's lovely. And with this graph, we can do nice graph things. We can talk about paths.
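The two kinds of consistent prior just described can be checked mechanically. Below is a sketch (my own code, with my own names) of Bayesian updating on a partition: within a cell of positive prior mass, the posterior is the prior renormalized over that cell; a zero-mass cell constrains nothing, since anything observed there was totally unexpected.

```python
# Sketch: which priors generate the red player's posteriors? (encoding mine)
from fractions import Fraction

RED_CELLS = [{"A", "C"}, {"E", "F"}, {"B", "D"}]

def posteriors_from_prior(prior, partition):
    """Bayesian update: renormalize the prior within each cell.
    A cell with zero prior mass constrains nothing (marked None)."""
    post = {}
    for cell in partition:
        total = sum(prior[s] for s in cell)
        for s in cell:
            post[s] = prior[s] / total if total else None
    return post

# The uniform prior generates red's posteriors: 1/2 on each state of
# each two-state cell.
uniform = {s: Fraction(1, 6) for s in "ABCDEF"}
assert posteriors_from_prior(uniform, RED_CELLS)["A"] == Fraction(1, 2)

# A prior certain the state is neither A nor C is *also* consistent:
# the cell {A, C} has zero prior mass, so red's 1/2-1/2 posterior there
# (or any other posterior) is unconstrained.
skeptical = {"A": Fraction(0), "C": Fraction(0),
             "B": Fraction(1, 4), "D": Fraction(1, 4),
             "E": Fraction(1, 4), "F": Fraction(1, 4)}
assert posteriors_from_prior(skeptical, RED_CELLS)["A"] is None
```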
We can talk about cycles. So we can talk about a path which goes from A along the red line to C, and then along the green line to E. That's a path. We can talk about a path that goes from F along the red line to E. And we can talk about cycles, which behave as you'd expect cycles to behave: any closed path, say from A down to B, B across to D, D up to C, C back to A. We call that a cycle, and we write minus C for the same cycle in the reverse direction. So C goes this way, and minus C goes this way. We're going to use these notions, but they're just the normal notions of cycles and paths. The trick is that we put values on these paths and cycles: a function that assigns each of them a value. So if we think about the path that goes from A to C to E: it goes from A to C along the red line, so we pick up the posterior at the endpoint, this red 0.5. Then we go from C to E along the green line, so we pick up this green 0. The value is 0.5 times 0, which is 0. That's how we work out values for paths, and for cycles as well. For the path from F to E, we go from F to E and pick up this red 0.5, and that's the end of the path, so that's the entire value of the path. We can also compute values of cycles. We go from A down to B along the green and pick up this 0.5; across along the red, this 0.5; up the green, this 0.3; and back along the red, this 0.5. And we get 0.0375, apparently.

[Audience: does the color of the line you use matter?] Yes, the color that you choose matters. [So you just pick one?] You pick one, yeah, and it'll depend. Suppose there were also a red link here, say: then the path that goes from A to B along the green is different from the path that goes from A to B along the red line. So the color of the line certainly matters, and in the proofs it becomes very important. And we'll see that for the value of a cycle, the direction we take matters.
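The path-value rule above can be sketched in code. In this reconstruction (encoding mine), a path is a start state plus a sequence of (color, next-state) steps, and its value is the product of the traversing player's posterior at each arrival state.

```python
# Sketch of path and cycle values on the running example (encoding mine).
PARTITIONS = {
    "red":   [{"A", "C"}, {"E", "F"}, {"B", "D"}],
    "green": [{"A", "B"}, {"C", "D", "E"}, {"F"}],
}
POSTERIORS = {
    "red":   {"A": 0.5, "C": 0.5, "E": 0.5, "F": 0.5, "B": 0.5, "D": 0.5},
    "green": {"A": 0.5, "B": 0.5, "C": 0.3, "D": 0.7, "E": 0.0, "F": 1.0},
}

def path_value(start, steps):
    """Value of a colored path: the product, over each edge, of the
    traversing player's posterior at the edge's arrival state."""
    value, here = 1.0, start
    for player, nxt in steps:
        # Each edge must lie inside one of that player's cells.
        assert any(here in cell and nxt in cell
                   for cell in PARTITIONS[player]), (here, player, nxt)
        value *= POSTERIORS[player][nxt]
        here = nxt
    return value

# The examples from the talk:
assert path_value("A", [("red", "C"), ("green", "E")]) == 0.0  # 0.5 * 0
assert path_value("F", [("red", "E")]) == 0.5                  # positive

# The cycle A -> B -> D -> C -> A and its reverse:
forward = path_value("A", [("green", "B"), ("red", "D"),
                           ("green", "C"), ("red", "A")])
reverse = path_value("A", [("red", "C"), ("green", "D"),
                           ("red", "B"), ("green", "A")])
```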
Because we're taking the posterior at the arrival point only, the direction matters. Or, the direction might matter, I should say. So those are values. Now that we've got all this machinery, we can actually talk about some results. If we define omega 0 as the set of states where some player has a zero posterior, where some player says, "I'm putting posterior zero on that state," then any state with that property has to have zero prior: any common prior that exists has to be zero at that state. This is the simplest result. If green has a posterior of zero at E, green puts zero posterior here, then any common prior has to be zero at E. It's hopefully reasonably clear why this has to be the case: if they had a positive prior there, they would immediately have a positive posterior there. So there's not much to that one. The next one, I think, is the novel one that people didn't think about. I feel this question has been ignored because people thought the first condition was the only thing to check, but it's not; there's an extra thing to worry about, which involves positive paths. A path is positive if the value of the path is positive, not surprisingly. We let omega sub P be the set of states with some positive path to one of these zero-posterior states. Then the common prior there also has to be zero. The lemma says: if I can draw a positive path from my state omega to some state that I already worked out has prior zero, then omega has to have prior zero as well. For example, we saw that the path from F to E along the red line has a positive value; it's a positive path. So the lemma says the prior at F also has to be zero: if there is a common prior, it's zero at F.
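These first two conditions can be run mechanically on the example. A sketch of my own (names and encoding mine): collect the zero-posterior states, then repeatedly zero out any cell in which some player puts positive posterior on an already-zeroed state, which is exactly what a one-edge positive path forces; iterating handles longer positive paths.

```python
# Sketch of the first two lemmas on the running example (encoding mine).
STATES = {"A", "B", "C", "D", "E", "F"}
PARTITIONS = {
    "red":   [{"A", "C"}, {"E", "F"}, {"B", "D"}],
    "green": [{"A", "B"}, {"C", "D", "E"}, {"F"}],
}
POSTERIORS = {
    "red":   {"A": 0.5, "C": 0.5, "E": 0.5, "F": 0.5, "B": 0.5, "D": 0.5},
    "green": {"A": 0.5, "B": 0.5, "C": 0.3, "D": 0.7, "E": 0.0, "F": 1.0},
}

def zero_prior_states():
    # Lemma 1: a state some player gives zero posterior must have zero prior.
    zero = {s for s in STATES
            if any(POSTERIORS[p][s] == 0.0 for p in PARTITIONS)}
    # Lemma 2: if prior(x) = 0 but player p puts positive posterior on x,
    # then p's whole cell containing x must have zero prior (every cell-mate
    # has a one-edge positive path to x); iterate to a fixed point.
    changed = True
    while changed:
        changed = False
        for p, cells in PARTITIONS.items():
            for cell in cells:
                if (not cell <= zero
                        and any(x in zero and POSTERIORS[p][x] > 0
                                for x in cell)):
                    zero |= cell
                    changed = True
    return zero

# Here: omega 0 = {E} from green's zero posterior, and the positive
# red path from F to E then drags F in as well.
```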
And this one is hopefully not too bad either. We already worked out that the prior at E has to be zero. And we know from these 0.5s here that the red player considers E and F equally likely, in posterior terms and therefore also in prior terms. So the prior at F has to be zero too. To get this in general, you take an inductive argument along the positive paths.

[Audience question about the information sets.] Exactly, it depends; you normally draw the complete graph here, and yes, it's still unique. [But couldn't that be someone else's information set, rather than green's?] No: green can only receive the signal that they're in A or B, or in C, D, or E, or in F. Those are the only signals, and you can't have overlapping cells; these are partitional information sets. You can build a model where they do overlap, but that's a different model with different results.

And the last lemma, as I said, is all about these cycles. We say a cycle is inconsistent if the value of the cycle in one direction is different from the value of the cycle in the opposite direction. And this is the last thing that can be a problem: if some state is part of some inconsistent cycle, then any common prior has to be zero there as well. I'm going to prove at least part of this. We did the work earlier: the cycle going one way has value 0.0375, and going the other way, 0.0875. So the points A, B, C, and D are part of an inconsistent cycle, and if there is a common prior, it has to be zero at all of these points as well.
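The inconsistency of that particular cycle is easy to verify mechanically; a rough sketch, again with my own encoding of the example:

```python
# Sketch: check that the cycle A -> B -> D -> C -> A is inconsistent,
# i.e. its value differs from the value of the reverse cycle (encoding mine).
POSTERIORS = {
    "red":   {"A": 0.5, "C": 0.5, "E": 0.5, "F": 0.5, "B": 0.5, "D": 0.5},
    "green": {"A": 0.5, "B": 0.5, "C": 0.3, "D": 0.7, "E": 0.0, "F": 1.0},
}

def value(start, steps):
    """Product of the traversing player's posterior at each arrival state."""
    v = 1.0
    for player, nxt in steps:
        v *= POSTERIORS[player][nxt]
    return v

cycle     = [("green", "B"), ("red", "D"), ("green", "C"), ("red", "A")]
neg_cycle = [("red", "C"), ("green", "D"), ("red", "B"), ("green", "A")]

forward_value  = value("A", cycle)      # 0.5 * 0.5 * 0.3 * 0.5
backward_value = value("A", neg_cycle)  # 0.5 * 0.7 * 0.5 * 0.5
inconsistent = forward_value != backward_value  # flags A, B, C, D
```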
So in this simple example model, we've seen through the three conditions we looked at that any prior has to be zero at every one of our states. And that's just not possible, right? Because it's a prior: a probability distribution. Even allowing the prior to be zero in places is not enough here; we still get a situation where there's no possible prior. Certainly there's no possible strictly positive prior — we already knew that from existing results — but this model goes a bit further and says there are no priors of any sort, even allowing zeros. I'll generalize this statement in one second. But first I want to quickly mention part of the cycle lemma, because I can do part of it graphically. The lemma says that if a state is part of some inconsistent cycle, then any prior has to be zero everywhere on that cycle. So suppose we have some inconsistent cycle, and suppose there's some point on it, call it C, where the prior is zero. (Showing that there is such a point requires a bit more work, but suppose there is one.) Then I have a path from A to C going one way around the cycle, and another path from A to C going the other way. If both of these paths had value zero, then the cycle would be consistent, because the value of the cycle would be zero in both directions. But the cycle is inconsistent, so at least one of these paths is positive. And a positive path to a zero-prior state means, by our previous lemma, that A also has prior zero. So that motivates at least the first part of this result.
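Putting the three conditions together for the example confirms the conclusion. A tiny sketch (the two flagged sets below are the ones worked out in the talk: E and F from the zero-posterior and positive-path conditions, and A through D from the inconsistent cycle):

```python
# Sketch: no state of the running example survives all three conditions,
# so no common prior of any support exists (sets as derived in the talk).
STATES = {"A", "B", "C", "D", "E", "F"}
ZERO_POSTERIOR_AND_PATH = {"E", "F"}        # first two conditions
INCONSISTENT_CYCLE_STATES = {"A", "B", "C", "D"}  # third condition

candidate_support = (STATES - ZERO_POSTERIOR_AND_PATH
                            - INCONSISTENT_CYCLE_STATES)
# Empty: every state is forced to prior zero, and a probability
# distribution cannot be zero everywhere.
```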
In any case, we get to the main theorem, which generalizes, or gives the actual statement of, the fact we showed earlier. If we call omega Y the set of states that don't have any of the three problems we talked about, then a common prior exists if and only if omega Y is non-empty: if and only if there are some states left over after we've forced all the problem states to have zero prior. If there's anything left over, we can put weight on those states and get a prior, and exactly then. And in that case, there's a common prior with support equal to this set, and it's the largest possible support: the unique maximal support for any common prior in this problem. Most of the proof, or one direction of it, is the three lemmas we showed; the other direction is some modifications of my co-author's earlier work, but it's mostly these three lemmas. That's the main theorem, I guess. We also get a nice little one-way proposition in the other direction, which says that if all cycles are consistent, then there must be a common prior. This comes out if you do a bit of work: if the cycles are all consistent, then there necessarily has to be a common prior as well, which is a different, additional statement, but it relies very heavily on the previous theorem. And that's it, really; I don't want to prove anything more. Thanks.

[Audience question about what non-existence means.] It's sort of: something's gone wrong with your priors or information sets. Something a bit weird has happened, though I don't necessarily like to say it like that. The point is that you get very, very strong results.
If there is a common prior, then you get very strong things: you can't possibly trade; you get no-trade results. So non-existence of common priors means you necessarily can do things like arbitrage. If you think about this as a trading problem, you can definitely get arbitrage happening. So there are those sorts of things, I guess.

[Audience question.] Yeah, it's not a question I've asked or thought about, really. There is some minor result you can get in that direction, which is that if you have more links on the graph, then you're more likely to have problems getting a common prior. So ideally you want people to have more information: more refined information reduces the chance of there being problems, in a certain sense. More links are bad. But beyond that, I don't really know. Thank you.