So it's my honor and privilege, for the sixth Alpha Seminar of the year, to receive a colleague from not that far away, from the sister university, the big sister institution. Sylvia will discuss with us "Measure in cosmology: conventions versus paradoxes". She will speak for at most one hour; after that we'll take a short five-minute break, and then we will start the discussion with a comment from Dr. Sandro from UCVAC. You have the floor.

Thank you. Hello. I have been working on infinitesimal probabilities, and for this seminar I will try to look at this work through the lens of conventions in measure theory and probability theory. I will discuss two case studies related to cosmology, where infinite spaces of possibilities pop up more or less naturally; that is the combination of words in the title. But I want to start by motivating why I'm interested in looking at paradoxes. If you look at the history of mathematics, and even now, many mathematicians want to avoid paradoxes, for obvious reasons. But there are a few exceptions, people who are really interested in them, and one of them is Torricelli. He was working on the theory of indivisibles developed by Cavalieri, a theory that was well known for generating paradoxes. The way Torricelli approached them was not to try to avoid them; he actually developed new versions of these paradoxes to study what was going on. I found a description of his work saying that he was doing mathematical experiments: a bit like in the lab, where you put a system under extreme stress, he was pushing logic to the extremes, and thereby he tried to learn something about the system under study, in this case the continuum. So that is my approach to paradoxes.

But obviously, even in the contexts where I'm talking about them, many people don't like them. A very vocal person who wrote about this was the physicist and probabilist E. T. Jaynes. He has some very colorful quotes about his dislike for what I am doing: he calls it a Pandora's box of useless and unnecessary paradoxes. I also like the way he turned it into a verb, because he talks about "infinite-set paradoxing"; he calls that a morbid infection in probability theory, and, to make it worse, in the last quote he says it infects all of mathematics. And then he says that it is the careless use of infinite sets, and with them infinite quantities and infinitesimals, that generates most paradoxes. Here I am in complete agreement: indeed, the keyword is careless. Let's not do that; let's do it carefully, right?

As for the context in which these paradoxes popped up and drew me in: these paradoxes of probability theory are indeed affecting other fields, and I have two case studies from cosmology. The first part of my talk will be about the context of eternal inflation, where a relatively straightforward paradox from probability theory pops up. The second part will be about a more recent, more speculative kind of model; it is actually about the problem of the arrow of time in cosmology, and the paradox there is more subtle and maybe also more interesting. In terms of the payoff of analyzing probability in these contexts for cosmology, I will warn you already that it is limited in the first half. But I still present it that way, because it allows me to introduce a lot of things that should also help us in the second part, and I do think there is more payoff there.
Regarding the first context, eternal inflation: I definitely don't consider myself a specialist in this; I had one course, long ago, on Big Bang physics, and the problem was already there. So I will introduce it to you, but I will focus on the part where the measure comes in and where probability starts to play a role. Just as background knowledge: why are cosmologists even considering inflation, and what is it? They observe a lot of things in the contemporary universe, for instance the fact that it is very big, that it is still expanding, and that in many senses it is very homogeneous. The list of observations can all be explained if we postulate that in the very young universe there was a period of very rapid, in fact accelerated, expansion; that is what the word inflation refers to. There are toy models that try to describe how that could happen physically, and typically, when physicists have to explain something like that, they postulate a field. In this case there is an inflaton, a type of field, and it is assumed that at the start of the universe, or in the very young universe, this field is not in its most stable situation: it is in an equilibrium, but an unstable one. The idea is that the field will naturally go to its true stable equilibrium; those are some of the terms that pop up in this discussion. It was discovered very early on, already in the eighties, that if you assume such a field exists, together with a number of seemingly plausible assumptions about the model, then the rate at which regions decay to the stable value, so that inflation stops there, is actually slower than the rate of expansion of the regions that are still inflating. So the picture you get is of a very large structure, something larger than what we usually think of as the universe, where local regions have already stabilized, so inflation ends there, but these regions are patches, bubbles, or pockets in a larger structure that is itself still accelerating in its expansion. That is eternal inflation. The idea is that what we usually consider as the universe, as everything there is, is only one of those pockets, and what we interpret as our Big Bang was in that sense only a local phenomenon, local to this pocket, while the larger structure keeps expanding at an accelerated rate.

Now in this structure a measure problem pops up once you start asking questions about the probability that our universe would have certain properties. The question is what a good measure, and ultimately a good probability measure, over these pocket universes would be, and in this type of model there will be infinitely many of them. And then we find Alan Guth, in a review already from 2000, so also not recent, who says things like: anything that can happen in such a structure will happen, anything is possible. What he means is that any physical possibility will be realized in one universe or another, since there are infinitely many of them, and also that the fraction of universes with a particular property is a ratio of infinity to infinity, and that is meaningless. That is the measure problem in this context.
He even introduces a model system that is very well studied in the context of probability theory and connected to a well-known paradox there: he says, let's talk about the integers, and from the context it is clear he means the positive integers. He compares the problem of assigning probabilities to drawing a random integer from the natural numbers to the problem of assigning a measure to these properties of pocket universes, and he suggests that there is no right way to assign probabilities in the model system, and therefore no real answer to the measure problem in cosmology. When I looked at this, I had the impression that the way he presents the problem is not exactly adequate, or at least that it is not, to a probabilist, a perspicuous account of what is going on. So that is why I want to talk about the fair infinite lottery for a bit, just as a model system, without the connection to cosmology; we can connect it back at the end.

What is the usual presentation of the problem? It is a sort of thought experiment. Imagine there is such a thing as a fair lottery on the set of natural numbers, where by fair we mean that every natural number has a probability equal to any other. Now we can wonder what that equal probability would be. One option is to assign zero probability to individual tickets, but that would mean that the sum of the probabilities of all the tickets is zero, and in probability theory there is an assumption called countable additivity which says that such a sum, the sum of the individual probabilities of all possible outcomes, should equal the probability of the whole sample space, which is one. Zero can never be one, so that is not an option. The other option is to assign some non-zero probability to the individual tickets, but then if you compute the sum it will be infinite, and you cannot normalize that. So there is an inconsistency between the assumption of a fair lottery on the natural numbers, countable additivity, and normalization, the assumption that the probability of the whole sample space is one.

To see where that analysis comes from, we have to look at conventional probability theory and what its assumptions and axioms are. Conventional probability theory goes back to the 1930s, to Kolmogorov. There are four axioms and a sort of preamble, so there are conventional choices already at the very start. For instance, it is assumed that probability values are real numbers, and that they are non-negative, so zero or positive; and the probability of the whole sample space, which I indicate by Omega, and which in this example is the set of natural numbers, is one. Then there are two assumptions about additivity. One says that if you have two possible events that don't overlap, that are mutually exclusive, then the probability of their union equals the sum of the individual probabilities. Those axioms are enough for the finite case. For the infinite case there is another conventional choice: Kolmogorov said, for mathematical convenience, let us assume that this property generalizes to the countably infinite case. It cannot be made to work in the fully general infinite case, but when there are countably many non-overlapping events, we assume the same additivity rule.
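To put the pieces just described in one place, here is a compact restatement in symbols (my notation, not the speaker's slides):

$$P(A)\ge 0,\qquad P(\Omega)=1,\qquad P(A\cup B)=P(A)+P(B)\quad (A\cap B=\emptyset),$$
$$P\Big(\bigcup_{n=1}^{\infty}A_n\Big)=\sum_{n=1}^{\infty}P(A_n)\quad\text{for pairwise disjoint } A_1, A_2, \ldots$$

For a fair lottery on $\Omega=\mathbb{N}$ with $P(\{n\})=p$ for every $n$, countable additivity forces $\sum_{n}P(\{n\})=P(\mathbb{N})=1$; but $p=0$ gives $0$ on the left and $p>0$ gives $\infty$, so fairness, countable additivity, and normalization are jointly inconsistent.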
There is a sort of coherence among these assumptions: we are using the real numbers, and the type of limit you need to talk about countable additivity is exactly the type of limit operation used to construct the real numbers out of the rationals. But as we have seen, there are cases where these assumptions lead to paradox, namely when we have a fair lottery on the natural numbers: uniformity over the singletons while the sample space is itself countably infinite. You might assume, because there is a countable notion on both sides, that these would work well together, but this is exactly the case where it breaks down. I would say there is a mutual inconsistency between this type of model system and the conventional choices in the probability theory we are using. That is not how Guth wrote about it, and some other authors do the same thing: they say that a fair lottery on the natural numbers is an intrinsically paradoxical, intrinsically inconsistent notion, that you cannot even think about it in the right way without implicitly making errors. My impression is that they take standard probability theory as unquestionable, and from that viewpoint you of course conclude that the problem lies with the model system. There are reasons to think it is more subtle, because there are alternative theories, even more than the ones on the slide, and all of them, or at least the most interesting and well-developed ones, do allow us to say something about the model system.

So let's start with the most obvious theory you could consider if you take seriously the worry that you cannot describe such a lottery: simply drop the final axiom, the one about additivity in the infinite case, and require only finite additivity. In that case you can even be more liberal about which events you can describe: there will be fewer, in fact no, non-measurable sets. So what is the sigma-algebra here? It is the algebra of all subsets, so there is a unique choice, no additional conventions, which is nice. In this theory it suffices to assign probability zero to individual outcomes. Another way to arrive at this type of theory goes via something used in number theory called natural density: you look at initial segments of a subset of the natural numbers, you extend your viewing window, you count how many of the natural numbers up to that point are in the event of interest, normalized by the size of the window, and you take the limit. If you do that for singleton events, the density is zero; it is one half for the even numbers, for instance, and one for the natural numbers themselves. So this is consistent with the finitely additive axioms; it is a different way to arrive at the same conclusion.
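To make the natural density idea concrete, here is a small computational sketch (my own illustration, not from the talk; the function names are mine):

```python
from fractions import Fraction

def window_density(event, n):
    """Fraction of the numbers 1..n satisfying `event` (an indicator function)."""
    return Fraction(sum(1 for k in range(1, n + 1) if event(k)), n)

evens = lambda k: k % 2 == 0
ticket_42 = lambda k: k == 42

for n in (10, 1_000, 100_000):
    print(n, float(window_density(evens, n)), float(window_density(ticket_42, n)))
# The window densities of the evens tend to 1/2; the density of any singleton
# tends to 0, matching the finitely additive assignment of probability zero
# to each individual ticket.
```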
Finitely additive probability was famously defended by de Finetti, specifically because of this problem of a fair distribution over the natural numbers; others have arrived at the theory for other reasons, but in this context de Finetti is the most interesting. Of course, let's look at Jaynes's response. He was not impressed: he asked, do we really believe that an infinite number of zeros can add up to one? Obviously there is no axiom in finitely additive probability that says this is the case, but Jaynes had the impression that you are answering yes while seeming not to. I think he was still using intuitions from the conventional theory to judge the new proposal, so I am not entirely convinced by his hesitation. De Finetti had another response. He said: indeed, saying that adding infinitely many zeros gives one is absurd, I agree, but it would be true if you substituted actual infinitesimals for the zeros.

That leads me to another proposal, the one I have worked on the most myself: what if we take that seriously, and assign an infinitesimal number to the individual outcomes as their probability? To do that we cannot stick to the conventional choice of the real numbers as the value set for probabilities; we need a value set that includes infinitesimals. This brings me to a proposal I have worked on, called non-Archimedean probability theory. We try to stay as close as possible to the conventional theory of Kolmogorov, but we had to make a number of changes. The first is already in the preamble: we allow a set of values that includes the real numbers but can be bigger. We don't fix in advance exactly which set it is, but it will always be a non-standard model of the reals, such that it includes infinitesimals. For the first axiom we actually require more than Kolmogorov does, and this is one of the additional motivations for the theory: we do not allow probability zero for a possible event, so for any non-empty set we require the probability to be strictly positive, which is not the case in standard probability theory. We keep the normalization axiom and we keep finite additivity. The most technical axiom, and the biggest change in this theory, is how additivity works in the infinite case. I will not explain it in detail; I just want to show that it looks more complicated, and that is a direct consequence of the difference between doing things conventionally and doing them differently: in the Kolmogorov case we could use the standard limit operation, so I didn't have to include the definition of the limit in the axiom, whereas here we had to introduce an
alternative limit operation, what we call a non-Archimedean limit, on the fly, to explain how the axiom works. What we are doing is assigning to sequences like these natural densities a limit, but a limit that takes values in a set that includes infinitesimals; it cannot be the standard limit operation, because that by definition only gives real values. And this limit works in such a way that if we sum up all the infinitesimals assigned to individual tickets, they sum exactly to one, not to one minus an infinitesimal or some other number. So there is again coherence between the different assumptions.

For clarification: this curly R, you said it has to be a non-standard model of the axioms for the real numbers, so it's not just any R, it's the hyperreals, I guess?

Yes, it is a field of hyperreals. How big this non-standard model has to be depends on what the sample space is, so we cannot fix it in advance; but the way the axiom works, if you look at the consistency proof, it contains a recipe for constructing the specific R once you know the sample space. That is how it comes in.

So in this case we can assign a particular infinitesimal to singleton events, and I have called it one divided by alpha. I cannot introduce everything related to this idea here, but there is a theory developed by Vieri Benci for assigning sizes to infinite sets, which also uses, for instance, the hypernatural numbers. It is more fine-grained than cardinality: it is sensitive even to removing one element from a set. You can think of the type of probability we are defining with these axioms as a sort of normalized numerosity function: the numerosity of the natural numbers is alpha, and one divided by alpha is the probability of a single ticket. In terms of the idea of a natural density, we actually follow that idea; only the limit operation we impose on it is different, so instead of zero the singleton gets a value infinitely close to zero, which is crucially different, and in our notation it is one over alpha. For the whole sample space the probability is one, of course, by the normalization axiom. But the infinite sets that are not co-finite start to show strange behaviour. For the even numbers you would expect probability one half, but it is a bit more subtle: it actually depends on whether alpha is even or not, so you have to make further assumptions. If alpha is even, it is one half; if not, the probability is slightly smaller, because the way we define the natural numbers we start at one, so the odds get a head start, and the evens can end up losing half an infinitesimal even in the non-standard limit. It gets worse for some other infinite sets. Here it is an infinitesimal difference between the possible assignments, but the sets that are non-measurable in the standard theory are, on our proposal, sets with a finite range of possible values. So I also think of this as measuring how pathological these sets are, and even the evens are, so to speak, infinitesimally pathological. That is the non-conventional approach.
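Collecting the values just mentioned in symbols, as I understood them from the talk (the exact treatment of the evens depends on further assumptions about $\alpha$): writing $\alpha$ for the numerosity of $\mathbb{N}$,

$$P(\{n\})=\frac{1}{\alpha},\qquad P(\mathbb{N})=1,\qquad P(\text{evens})=\frac{1}{2}\ \text{or}\ \frac{1}{2}-\frac{1}{2\alpha},$$

and, under the non-Archimedean limit, $\sum_{n\in\mathbb{N}}P(\{n\})$ comes out as exactly $1$, not $1$ minus an infinitesimal.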
There is another, easier solution you could consider, easier than the one I just showed. There is an axiom I haven't said much about yet, the normality assumption, and in terms of conventions there are actually two components in it. One is really a convention: why is it one, and not ten or a hundred? That is arbitrary. The real restriction the axiom imposes is that the measure is not allowed to be uniformly zero, because then it would not be normalizable. But what if we drop normalizability? That would also help us in the infinite case. Then we can use the extended real numbers, so we can assign quasi-probability plus infinity. This is strange if you are used to working with normalized probabilities, but it is not impossible. You could, for instance, assign measure one to each singleton; that is again arbitrary, you could say ten or a half or whatever, but some fixed non-zero number, call it one. Then of course the measure of the whole set of natural numbers will be infinite, and so will that of the even numbers, so you lose a lot of resolution in that sense, but for all the finite sets you keep the usual resolution. Again you can think of the natural density approach: what you drop is just the normalization factor, which diverges, and that is now allowed. But it is weird, right? You cannot distinguish these infinite sets in terms of their measure. You could, however, consider conditional probabilities; I realize I haven't properly introduced them yet, but if you do that, and you condition appropriately, you get back the probabilities of the finitely additive theory. So it is interesting to see that the conditional approach restores the fine-grainedness you lost in the infinite case, but it comes at the price of again giving up countable additivity.

Okay, so as I said, the payoff for cosmology is limited in this case. I don't think we really get past the measure problem in cosmology by using a different probability theory, but I do think this clarifies where the problem is exactly. I don't think Guth is right in suggesting that there is no way to assign probabilities to the model system of the integers: there are different approaches, if you are willing to give up some conventions, and if you are a pluralist about these theories you can set them side by side, and as a whole picture I think it gives a lot of insight into what is going on in the model system. But if you now want to apply this to pocket universes in eternal inflation, you end up at the question: what would the natural order be in which to consider these pockets? To use the idea of a natural density in one way or another, whether in the conventional theory or an unconventional one, there was always this move of looking at initial segments and going to the limit, and that requires an order. The natural numbers have a canonical order, you use it, and that gives you the answers. But in cosmology, the way these models work, the pockets are outside of each other's light cones, so there isn't really a way to fix a natural order, and I think that is the real problem. If that is the analogy, fine, but then the problem doesn't really come from probability theory itself; that is my contention.
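The order dependence is easy to demonstrate; here is a small sketch (my illustration, with a hypothetical reordering): enumerate the naturals so that every even number is followed by two odd numbers, and the window "density" of the evens drops from 1/2 to 1/3, even though the set of outcomes is unchanged.

```python
from fractions import Fraction
from itertools import count, islice

def reordered_naturals():
    """Yield every natural number exactly once, in the pattern
    even, odd, odd: 2, 1, 3, 4, 5, 7, 6, 9, 11, ..."""
    evens, odds = count(2, 2), count(1, 2)
    while True:
        yield next(evens)
        yield next(odds)
        yield next(odds)

def even_window_density(n):
    """Fraction of even numbers among the first n terms of the reordering."""
    window = islice(reordered_naturals(), n)
    return Fraction(sum(1 for k in window if k % 2 == 0), n)

print(float(even_window_density(300_000)))  # ~0.3333, not 0.5
# Natural density depends on the enumeration order; pocket universes come with
# no canonical order, which is the real obstacle in the cosmological case.
```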
Moreover, it is actually not clear that a countably infinite lottery is the right model system to begin with; I find very few authors who remark on this. Cosmologists assume there is a countable infinity of pocket universes, but they also say the whole forms a fractal structure, and I think many fractal structures naturally lead to a non-denumerable collection, a multiverse, I don't know what to call it, whole of pocket universes. So maybe they are even focusing on the wrong side of the problem: if it is an uncountable collection, the problem I have been discussing all this time doesn't really arise; the main issue is the order that is lacking.

If you lost track, this is actually a good moment to rejoin, because I will now introduce a different problem; I hope you can use our investment from the first half to understand what kinds of alternatives there are for dealing with it. The problem being addressed is the arrow of time in cosmology. There is actually a lot that statistical mechanics helps us understand about the arrow of time, at least the one driven by entropy: it helps us understand why entropy seems to increase spontaneously towards the future. The answer, or a large part of it, was given by Boltzmann, and it has to do with the idea that there are many more ways to have high entropy, and thereby to appear disordered, in the kinds of systems we are used to (it may not apply to other systems): there are more microstates corresponding to high-entropy states than to low-entropy states, and therefore, if the dynamics in principle makes it equally probable to reach any state, you will end up in one of the high-entropy states simply because there are more of them. The usual picture drawn is a phase space of possible physical states with some dynamics on it; there is a very tiny region, actually always drawn too big, that corresponds to the low-entropy states we see around us, and there are many more paths that lead to higher entropy than paths that keep you inside that region. So that is roughly the answer to where the entropic arrow comes from. But it doesn't really answer the question in the context of cosmology, because it only tells you what happens if you start from a low-entropy condition. We seem to see in our past a low-entropy young universe, and this mechanism alone doesn't explain that: if anything, starting from what we know today, we would also expect in the backward direction that we came from a high-entropy state, because there are so many of them. So something is missing, and what is usually done, implicitly or explicitly, is simply to take this low-entropy initial condition as given; that is why entropy would indeed increase. This has been called the past hypothesis: we just postulate it. But the question is whether we really have to, or whether there is another way. In recent years there have been proposals called central time models. The physics, the dynamics of how states evolve into each other, is exactly the same as in standard statistical mechanics; only one other assumption, which is always in place in statistical mechanics, is given up, namely the idea that everything happens inside a bounded volume. In statistical mechanics it is understandable that the assumption was there from the beginning, because the theory was about gases and boxes and how you connect them, and engines; there is always a well-defined laboratory setting with a finite volume. But in cosmology it is not clear what the box would even be
or what a proper volume would be. So some authors started from this point and said: what if you have a classical gas in a box, but no box? What happens then is that the collection of particles can keep expanding, which means that the entropy can keep increasing without bound. The way these authors write about it is that any actual value of the entropy can be considered low, because there is only a finite interval of ways it could be lower, but an open-ended interval of ways it could be bigger. That already sounds a bit like the drawing of a fair lottery on the natural numbers: I expect a gigantically large number, but once I tell you what it is, you can always say it is low, because there are only finitely many ways it could have been lower and infinitely many ways it could have been bigger, and that holds no matter what was drawn. So there is a similar spirit in this proposal. In these models you have an arrow of time that goes up in one direction; if you extrapolate backward, entropy goes down to a certain point, which is not postulated by hand, and from that point on it goes up again in the other direction. That central point would look a bit like a big bang: from there, the entropy increases in two branches of the universe. So that is the proposal; if you think of it in Boltzmann's terms, with the drawing I had at the beginning, you expect spontaneously that higher-entropy regions will be visited, and here that goes up all the way, without bound.

Now, the paradox related to this issue. In this context Guth, again, the same author who worked on inflation and identified the eternal inflation measure problem, said that for this kind of model the probability distribution should be normalizable as well; he just insists on conventional probabilities, basically, and therefore we cannot have a uniform probability distribution on the positions of these particles in an unbounded volume, something like that; it is not mathematically possible. But I think that again only follows because he takes standard probability theory as the only framework in which to think about it. And he even introduced a paradox; he didn't name it, but I recognized it as what is usually called the two-envelope paradox, at least in the form in which I will tell it to you. Imagine two envelopes, and imagine, although we have been here in the first half, that there is such a thing as a fair lottery on the natural numbers; maybe not smart to assume it exists, but let's do so. Two drawings have taken place and the results have been put in these two envelopes. Now the first envelope is opened, I tell you the number in it, and we ask: what is the probability that the number in the closed envelope is bigger, or what is the probability that the one you saw is the biggest one? You can make the reasoning I already indicated: there are only finitely many ways the number in the other envelope can be smaller, infinitely many ways it can be bigger, and all these individual ways are equally probable because the lottery is fair. So it seems that the probability, given that you know what is in the first envelope, that yours is the biggest one should be zero, assuming for the moment we have some way of doing this with standard probabilities. But that is the case no matter what the number is.
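Written out (my paraphrase of the fallacious reasoning; no countably additive $P$ actually supports it): with $X$ and $Y$ the two draws,

$$P(X\ge Y\mid X=n)=0\quad\text{for every } n\in\mathbb{N},$$

and symmetrically $P(Y\ge X\mid Y=m)=0$ for every $m$, while intuition suggests $P(X\ge Y)$ should be about $\tfrac{1}{2}$. The paradox lives in the slide from "zero for every conditioning value" to "zero unconditionally".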
I didn't tell you the number, and you could already have run this reasoning, so it seems that the conditioning plays no role, and that the probability that your number is the bigger one tout court is zero. Of course something is wrong, because I could have told the same story opening the other envelope first, and then we would have concluded that the same probability is one; and maybe you even have the intuition that by symmetry this probability should be one half. That intuition is not really needed to set up the paradox, but in any case something has gone badly wrong, and of course you know what it is: the assumption of a fair lottery on the natural numbers is inconsistent with standard probability theory, and I have been pretending to use it, which cannot really work.

The way this probability paradox works was analyzed very clearly, again, by de Finetti, and he connected it to a property that is a theorem in standard probability theory but in none of the alternatives I discussed: the property that would allow us to take the step from "this holds conditional on every member of a partition" to "this holds unconditionally". There is a theorem in standard probability theory that, if you could get the setup started at all, would license that move; but it is not a theorem in any of the alternatives, and that is what is going on. To explain this more clearly, I do have to tell you how conditional probability is defined, because it plays a crucial role. There are more complicated cases where this definition does not suffice, but if the probability of the conditioning event is non-zero, which we will assume for the definition, then the probability of an event A given B is the probability of their intersection divided, renormalized if you like, by the probability of the conditioning event.

Now I can tell you what conglomerability is, maybe by an example. Say I am interested in the probability that it will be more than 10 degrees tomorrow; that is the event. What I know is the probability that it will be more than 10 degrees given that it is sunny, and the probability that it will be more than 10 degrees given that it is clouded; say those two conditional probabilities are 0.3 and 0.4. Now my question to you: could it happen that the unconditional probability that it will be more than 10 degrees is, for instance, 0.6? No. We expect the unconditional probability to lie somewhere in the range between 0.3 and 0.4. There is a theorem of standard probability theory, the law of total probability, that tells you it is a weighted average of the two, and there is another theorem that tells you more directly that it has to lie in that interval; that is, in this case, finite conglomerability. It basically means that unconditional probabilities are sane, in the sense that they lie within the range of the conditional probabilities, where the conditioning events form a partition of the sample space.
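In the standard notation (my summary of the definitions just given): for $P(B)>0$,

$$P(A\mid B)=\frac{P(A\cap B)}{P(B)},\qquad P(A)=\sum_i P(A\mid B_i)\,P(B_i)$$

for a partition $\{B_i\}$ of $\Omega$. Since the weights $P(B_i)$ are non-negative and sum to one, $P(A)$ is a weighted average, so $\inf_i P(A\mid B_i)\le P(A)\le\sup_i P(A\mid B_i)$; in the weather example, $P(A)$ must lie between $0.3$ and $0.4$.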
You could also ask, naturally, whether this still works in standard probability theory when the sample space is countably infinite, and the answer is yes: in the proof you need normalization and countable additivity, so countable conglomerability is indeed a theorem. But general conglomerability is not: if you have a non-denumerable partition, the unconditional probabilities can be insane in that sense; they can lie outside the range of the conditional probabilities. So even in standard probability theory there is a sense of non-conglomerability.

Now that I have explained this, I can tell you what we were actually doing in the thought experiment with the two envelopes. There are two outcomes, two natural numbers, and they span my sample space: the first outcome on this axis and the other one on that axis. When I asked you to imagine opening the first envelope and learning its number, what you were doing was carving up the sample space into a partition of vertical strips: the event that the first number is one, two, three, or any given number. That is indeed a partition of the sample space. If I then ask for the probability that the first number is the bigger one, what you are looking at, inside the strip, is a finite part where the first one is bigger and an open-ended part where the second one is bigger. That is why we ended up with the intuition that the probability of the first being bigger, given the strip, is zero; and the other way around you make horizontal strips, and that is the other partition. But the problem is that there is no way to get this started with standard probability theory, so you can never apply de Finetti's theorem.

So what happens in the alternative theories you have now heard of? With finitely additive probabilities you actually agree that the conditional probabilities are all zero, but that is not inconsistent with the unconditional probability being, for instance, one half, because there is no theorem forcing you to stay in the range of the zeros. Likewise for the non-normalizable option: each strip has a finite measure, some finite number in each case, but the event in the unconditional case has infinite measure, so again you go outside the range. And with the non-Archimedean solution you also end up at one half in this case, or rather one half times one minus an infinitesimal; here the value is actually well determined, and the correction arises for a different reason, because you have to exclude the diagonal where the two numbers are exactly equal. But again it does not lie in the range of the infinitesimal conditional probabilities. So we end up with non-conglomerability in all of these.

Now, and maybe I will tell this part more or less qualitatively, here is what I find interesting, and where something may be added for cosmology: you can say a bit more in the non-Archimedean case. What I discovered is that there are partitions on which you do get conglomerability; not this one, but you get conglomerability when the sets in the partition are finite. There can be infinitely many of them, so it is an infinite partition in that sense, but the members have to be finite, and there is an additional condition: they have to be related, in a well-defined way, to how the non-standard limit operation is defined. So it is not that you have arbitrary conglomerability, but you can get conglomerability on partitions of arbitrary cardinality, as long as the sets in them are finite and aligned with the limit operation.
So there is something there that saves a sense of conglomerability in the infinite case, something that is absent from many of the other alternatives, and maybe that can be used in applications. I am skipping the slide where I show this formally, but it corresponds to what I just said qualitatively.

Now, are cosmologists aware of this? Interestingly, it seems that some of them are, in a sense. I found one particular paper that says very similar things, even though the authors don't use this vocabulary; they don't seem to know de Finetti and the history of probability, but they have rediscovered this, and I think they draw very similar conclusions, on my reading. There is a toy model, in a paper from 2016, for a central time model, so exactly the kind of case I started from to motivate looking at this paradox. The sample space looks like this: there is an axis you can think of as a sort of time axis, and an axis for something like entropy, or proportional to entropy, and the sample space lies within this parabola. You can relate it directly to central time models: a universe in this model, in terms of how its entropy evolves in time, is one of these parabolic trajectories, and it could be any of them. What they found is that if you ask what the probability is that such a universe is close to its central time, the answer depends on how you model it exactly, in the sense of what you take picking out a universe to be: something at a certain level of entropy, or something that stays on the same trajectory. Depending on that, you get conditional probabilities with non-overlapping ranges, and that already shows that in this case the unconditional probability cannot always lie in the range, since the ranges don't overlap. So they rediscovered this, and it is a more interesting case, because it is directly tied to a physical model system, and it involves non-uniform probabilities, which is also a difference from the two-envelope paradox. The partitions, I tried to indicate them with the colours, are, on the one hand, these vertical strips, and on the other hand these parabolic segments, and these two carvings give the incompatible conditional probabilities. What I want to point out is that you immediately see that the vertical strips have infinite area; and I told you that with non-Archimedean probabilities you can have an infinite partition, but the members of the partition have to be finite. So in the theory with infinitesimals, the partition into these infinite strips will never give you conglomerability, but it turns out that the partition into parabolic segments does. It seems, then, that there is an additional argument for taking the predictions, the values, that come out of that partition as more physically relevant than those from the other strips, and in this case I think there is more going on than convention; I think there is a mathematical reason behind it.

To end where we started, the last word: what did Jaynes say about non-conglomerability? For him it was a paradox run out of control: we have institutionalized it, it is now taught as truth, so many unwary minds have been trapped, it has
gone public, so to speak. He thought we really should not allow non-conglomerability; but of course there never was general conglomerability in the first place, so there is no option but to accept it, and I myself think it is not as bad as he thought. My final slide is an overview of the different theories we saw, the conventional one, the finitely additive one, the non-normalizable one, the non-Archimedean one, and which combinations of assumptions you can consistently have. There is no package in which you can do everything you would intuitively expect at the start: perfect additivity doesn't really exist in most theories, perfect conglomerability doesn't exist, and in each case there is a different combination of assumptions and types of model systems you can handle. There is one additional option that is not a quantitative theory, Norton's infinite lottery logic; in the paper behind this presentation I discuss that one as well. A lot of these issues don't really apply to that theory, because it doesn't try to assign numerical values to events, it only orders them, and it has a lot of nice properties; so I did want to at least mention that there is another option. That is the overview of what we have seen. Thank you, and let's discuss.

To start the discussion, let's give the floor to Peter Tiesel, and after that, free discussion, for everybody online as well.

Thank you. First of all, I really enjoyed reading the paper, something very different for me. Secondly, I'm no expert in the foundations of probability or the science and philosophy of eternal inflation cosmology, so I'm not going to offer much of a comment; instead I will raise a couple of questions that came up while I was listening to your talk, mostly for clarification. First, just a brief comment about paradoxes, because I really loved the cartoon you showed, with the little infinity sparkling in the character's eyes. I've seen that sparkle in your eyes, and I'm sure you can see it in mine when I talk about time travel paradoxes. I fully agree that paradoxes captivate, fascinate, provoke, and seduce, and most importantly they arouse our curiosity; they can stimulate and motivate us in our research. When I teach about time travel paradoxes, I usually distinguish two kinds: weak and strong paradoxes. Weak paradoxes typically go away on closer inspection, whereas strong paradoxes pose much deeper problems: they indicate that something is deeply flawed in our understanding, by exposing some form of inconsistency or logical contradiction. So they are very different. For example, among time travel paradoxes, I think most information paradoxes are weak: very puzzling, surprising, and counterintuitive, but not logically inconsistent in the way the strong paradoxes are, most famously the grandfather paradox. So I was wondering what the situation is in probability theory, and it seems to me that there maybe most paradoxes are weak; the field seems particularly rich in weak paradoxes, because we are just pretty bad probabilistic thinkers and very prone to probabilistic fallacies. I was thinking of Simpson's paradox, the Monty Hall problem, and many, many others. But I think the strong paradoxes are much more interesting, and there I think of the Sleeping Beauty problem, Newcomb's problem, and clearly now the two
paradoxes that you mentioned, the infinite lottery paradox and the two-envelope paradox. When you try to resolve these paradoxes, I think with weak paradoxes it leads to greater clarity and more precision in your own thinking, but with strong paradoxes you really need radical conceptual innovation, possibly leading to entirely new research programmes, as I think we have seen here. There is a quote I really like, that the thorniest paradoxes have a way of blooming into beautiful theories. And that brings me to my first question. With many paradoxes, certainly strong paradoxes, there are various ways of evading them, which means that multiple alternative solutions can be proposed, as we have seen with the paradoxes here. For example, with the infinite lottery paradox, you showed it is inconsistent with Kolmogorovian probability theory, but, contrary to Jaynes and apparently Guth, as far as I understand you don't take it to be intrinsically paradoxical; you take it to be jointly inconsistent with Kolmogorov's axioms, and you offer four alternative responses to the paradox. Given that these responses are all very different, and yet each internally consistent, you have argued for a pluralistic stance: allowing multiple formalisms, each with their own virtues and vices, each with a certain range of applicability, not only to make sense of the paradox of the fair infinite lottery itself, but also when these mathematical methods are used in cosmology. So I wondered if you could elaborate a bit more on this pluralistic stance. What exactly does the pluralism entail? It appears to me that you are arguing for methodological rather than epistemological pluralism, and during your talk I increasingly got the feeling that you treat these much like Torricelli's mathematical experiments: pushing things to the extreme to get more clarity on certain problems, whether in the foundations of mathematics or in science. Would you agree?
Thank you for these very nice comments and the very interesting additional quotes on paradoxes, which I quite like. A few things. I do think the paradoxes I discussed are, in your categorization, mainly strong paradoxes, but it is also interesting to see that authors like Jaynes still insist that a paradoxical flavour remains even after the offending axiom has been removed; that is the weak paradox that stays in the air, so to speak. About the pluralism: I think it comes from my experimental background. When you have a thought experiment or a model system, like the fair lottery on the natural numbers, I see it as a system you can try to study with mathematical methods, or in some cases other philosophical methods, and it is only if you study it with more than one method that you can distinguish between properties of the system you set out to study and artifacts due to the theory you are using. I really do think of them as artifacts, analogous to imaging artifacts: in an experimental context, too, you need more than one instrument to confirm that a property is really there; only here the "there" is in mathematics rather than in the empirically accessible world. So that is the way I approach it. I do think it is a methodological pluralism, though I haven't really thought through what that label means exactly; for me it is also connected to a certain view of what mathematics is, and even how it should be taught, so it has a lot of consequences, but it starts from this experience of thinking about paradoxes and what they teach you. I think I forgot the actual question in the meantime.

It was exactly that point: what kind of pluralism do you have in mind, especially since the theories seem to give very different answers to certain questions depending on which formalism is adopted, and that seems hard to me.

Maybe the interesting thing I didn't really emphasize in this talk is that there is also connective tissue between these theories. For instance, if you start by assigning infinitesimals, there is a function called the standard part function that you can apply, which rounds off the infinitesimals, and what you then get is finitely additive, real-valued probability. So the intuition de Finetti had, that you can substitute infinitesimals for the zeros, can be made to work: if you take the long way around, you can actually show he was right; you can make sense of the intuition that was there. In many cases, when I had a result after a lot of work, I could find authors who had said that very thing much earlier, de Finetti among them; these intuitions were already there. I think I am just extremely slow: I really have to show myself that it works by doing all the intermediate steps, but people thought about it before me, and they would not be surprised by the results. I am making that connective tissue more tangible, more explicit.

Can I ask another question? It is a rather pointed question, but I got lost reading about Norton's view, which is maybe not discussed as much in the paper. As far as I follow, Norton takes a lottery to be fair when a condition of label independence is met: basically, the chance of an outcome should remain unchanged under permutations of the outcome labels. In your treatment of the fair infinite lottery, I
gather the labels actually matter: you take fairness as the consistent, and in fact weaker, requirement that each individual outcome has the same chance. Since you are, in a sense, violating, or at least denying, label independence, Norton has claimed that you are not actually addressing the problem but changing it, whereas Parker, and I think you follow him in your paper, says it is Norton who is changing the problem. So I wanted to know where you stand on label independence: do you think the requirement is too strong, or do you think it should, as Norton says, be generally adopted, given that it certainly holds for a fair finite lottery? What's the deal there?

It's a very spot-on question. Maybe let me briefly say something about Norton's proposal for the others. Norton has argued on several occasions that many people are too swift to start assigning probabilities, for instance in applications of the principle of indifference; he thinks there are cases where we should simply refuse to assign probabilities, because you need certain types of information to know that the probabilities are equal. So it is not surprising that in this case, too, he looks at it from that angle and says that the kind of knowledge we have in cosmology about these pocket universes is too weak to assign quantitative probabilities; instead, he says, we can say something about probability orderings. From that viewpoint, and also given the point that we don't really have a good way to order these universes, I think his proposal is very well designed for the problem at hand. In that sense he is not changing the problem in cosmology; he addresses it. He says we don't know anything about the order of these pocket universes, there is no physical reason to impose one, so if we want to talk about probability orderings, at least, without introducing more information than is there, we need label independence: it shouldn't matter how you label the universes. And what does fair mean then? It means that however you permute the labels on a set, the probability ordering stays the same. You then get a much weaker, probability-like theory, a non-quantitative one, but one with orderings, and it still allows you to make certain distinctions, even among infinite sets. So I think it is very interesting. But if you start from the problem of a fair lottery on the natural numbers, I would say he is the one changing the problem: what he describes is a fair lottery on an arbitrary countably infinite set that doesn't come with native labels. If you have such a set, whatever it was, and only afterwards impose the labels of the natural numbers, those labels cannot be used to argue for the approach via natural densities, which depends on the given order. So the problem he is addressing is much more underspecified than the model problem as Guth states it: when you know it is the integers, I think you can take the order for granted; pocket universes don't have one. In that sense Norton is really addressing the measure problem, rather than introducing another model system in which stronger assumptions hold. That is the way I would answer the question of who is changing the problem; of course it depends on where you started and what the problem is. But then, if it is meant for the cosmological problem, you end up with some kind of theory of probability orderings.
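To make the two fairness notions explicit (my formalization, not Norton's or the speaker's exact wording): for a lottery on a countable set $\Omega$,

$$\text{weak fairness:}\quad P(\{m\})=P(\{n\})\ \text{for all } m,n\in\Omega,$$
$$\text{label independence:}\quad P(\pi(A))=P(A)\ \text{for every permutation } \pi \text{ of } \Omega \text{ and every event } A.$$

Label independence is strictly stronger: natural density satisfies weak fairness but is not permutation invariant, as the reordering sketch earlier already showed.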
These are non-quantitative, just comparative. What you could also do is start from a theory that has label dependence, which all the ones we discussed have, and then do something supervaluationist: any order could be the right one, so how much can you still say? And what you can then say is exactly the things Norton says, so in that sense he goes there directly. But again, I am also interested in reaching at least that conclusion the longer way: even though you impose the weaker fairness condition within a stronger way of modelling the system, with more structure, you can consider sets of probability assignments and ask what they all have in common, and that is exactly what Norton's theory already gives you.

Thank you for the talk. I have two questions. First, as was said before, I'm a very poor physicist, so I'm not sure I understood why the central time models are less conventional than the past hypothesis, because it seems that taking the lowest entropy point in the middle is as much a postulate as saying that the past was in a low-entropy state, right? I didn't understand that part. And the open-ended question would be: when you introduced your non-Archimedean probabilities, you used the hyperreals; I was wondering whether introducing infinitesimals in other ways, say via smooth infinitesimal analysis, would change your results, or what you get out of it.

Okay, thank you. I always have trouble remembering the previous question by the time the second is asked, so remind me: what was the first one? How the central time models are less conventional. Yes. Actually, I think I may have made an error of equivocation there, because what is definitely true is that these models are not widely adopted; they are considered extremely speculative, and very few people think this is the way to go, so in that sense they are unconventional. But if you think of convention more as a philosopher should, as I ought to have done, you may actually be right: they remove an assumption, so in that sense they don't carry extra conventional load; it is just that they are not well developed yet, and there will be other conventional choices in them that we are not even aware of because they haven't been worked out. So I think it is fair that you pointed that out; I shouldn't have phrased it that way. Then, other ways of introducing infinitesimals: the alternative you mentioned was smooth infinitesimal analysis. Those are the nilpotent ones, right, the infinitesimals that give zero when you square them. When I first started considering infinitesimals, I didn't know much about any of this; I encountered that approach and very soon ruled it out as a solution for my case, exactly for situations like this: if an infinitesimal gives you the probability of an outcome in one lottery on the naturals, and you then run two independent lotteries, you would like to square it, and if the square is identically zero there is no hope of the additivity working out. Since then I have actually become less sure that there couldn't be another way of using intuitionistic approaches for these kinds of problems, but I will definitely not be the one to do it; I think I have been brainwashed by teaching myself these other methods, so I may simply overlook certain possibilities. But I remember that this was the reason I thought: okay, let's not try this, but try something else.
This is very close to the last question. I am always confused about the division between the philosopher and the mathematician, because they seem to talk about different things. I presume they all believe, maybe not correctly, that there is a structure to this universe, and it is because we are here that we are asking questions about it. When such a question fails, there seem to be two ways it can fail: it is a bad question within Kolmogorov's framework, or it is not a Kolmogorov question at all. And you seem to say that it could be both. In your discussion with Norton, it could be a bad question even though the sigma-algebra is fine, the case where we have a good formalism but are just confused about the questions; or it could be that one of the axioms is not there, and then it is not the right framework for the discussion. So: why the natural numbers? Where did that come from? Why did they say that the structure of the universe must be analogous to the natural numbers? You discussed very well what we can do if we have the natural numbers, and how we can get beyond the paradox. But first, why natural numbers? Is it fractal, like you said, sets with boundaries of fractional dimension? I have no idea how to put that into a Kolmogorov framework. So how does it start?

Very good question, and strangely enough I ran into this discussion from the other way around, because I wrote a paper in which I suggested that maybe this is the wrong model system to begin with, and the reviewer, who I think comes from physics, objected: how could you even consider that it would be non-denumerable, when everyone is assuming this? So then I started looking for sources. I thought I had found one that considered this fractal structure, that had a model system for it and addressed it more directly; but in that paper the authors were considering paths in time, and of course those are non-denumerable even if the pocket universes are denumerable. They were in fact pointing out that you can work more easily with Kolmogorovian probabilities if you have this non-denumerable space, so they assigned probabilities to the paths, and they skipped over the question of whether the space would already be non-denumerable if you only look at the pocket universes. I then found only one other paper, or maybe it was even the same one, that tried to say something about the fractal dimension of this model system, and from that I could argue that it would then be non-denumerable. But there was only one paper that actually looked into it. So it is a model system that is apparently very close to hand for physicists, and they don't even question why they are using it. I agree that it should be questioned; it doesn't seem to fit very well with the other things they say about the structure.

It seems to show that they made two mistakes at once: the mistakes that apply to the fair lottery on the natural numbers, which you analyzed very carefully and for which you have solutions, or possible solutions, that one can discuss; and the fact of starting with such a model in the first place and only then discussing the problem. And eternal inflation has been around for a long time, some forty years, and this is still missing.

I agree, but obviously when I point it out, the reviewer says it is not publishable, so I don't know what they would say. I think it is a community effect: they all start from this picture, and these pocket universes, you can put labels on them, and if you can put labels on them, they must be like the natural numbers.
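For readers puzzled by the denumerability point: a standard set-theoretic fact (our gloss, not the cited paper's notation) is that a branching structure can have countably many nodes yet uncountably many complete histories.

```latex
% Take an infinite binary tree whose nodes stand in for pocket universes:
\[
|\,\text{nodes}\,| = \aleph_{0},
\qquad
|\,\text{paths}\,| = 2^{\aleph_{0}} > \aleph_{0},
\]
% since each complete path is a function \mathbb{N} \to \{0,1\}. So paths
% in time form a non-denumerable space even when the pocket universes
% themselves are denumerable.
```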
Other questions?

First of all I would like to thank you for this excellent talk; it was really inspiring, the content and the formal aspects of it, and the style of doing philosophy in general, which is what I try to do as well: push things a little bit and see what can give us solutions or breaking points. I am not sure I have a real question worthy of the talk, and I certainly don't know enough cosmology for that, but maybe this has to do with the labeling issues. It seems to me that any solution to this natural-number lottery that ends up assigning the even numbers probability one half should be wrong; there seems to be something deeply wrong about it. I mean, not as a possible solution, but as the single solution. What is wrong, I guess, is that a lottery should not care about the order. There is an order on the natural numbers, but a lottery by definition almost forgets the order; otherwise it is not a fair lottery, I would say. And you can easily reorder the natural numbers so that the even numbers don't look at all like half of the cases; it is just because of how we go through them. So if the axioms give you one half as the only answer, that doesn't seem right; it should be among the solutions, I guess. And then, with this notion of being among the solutions: if you drop axioms, it is natural that you get many more possibilities, not as the unique possibility but just as possible models of the axioms. So shouldn't we, even for the purposes of physics, sometimes just refrain from looking for actual probability values while still talking about probability? This is a problem that does not have a single solution, and it is interesting that it has a set of solutions; then we can compare several probability assignments, not because each has one value, but because one set of solutions is a subset of another, or something like that. So what would you say about the idea that all these paradoxes have to do with our desire to find single solutions, and that the interesting results you get by giving up some axioms of standard probability theory actually point to a kind of incompleteness, as we are used to in set theory? We should accept this incompleteness in many other fields too: not unexpressible, but just not fixed; no reason to expect either a proof or a disproof; it is just among the possibilities, the technically possible, not the metaphysically possible.

I think there were different phases in that. The first part was really a defense of the notion of fairness, and there we are really on the same line, I guess; that's why
I think you would completely agree with how we approach it: you really start from questioning what a fair lottery on the natural numbers even means, and that part was very much aligned. Then there was a phase where you said one half should be among the possibilities. Maybe I am being a bit too closed-minded now, but one way to interpret that would be to use one of the theories I presented, not take the original labeling seriously at all, take all the labelings on a par, and apply the theory to each of them; they then exist side by side, and the probability of the even numbers ranges from 0 to 1, with one half among the values. Then, and maybe some people would really be against what I do now, my natural inclination is to go one level higher: all of these labelings are equally reasonable, so you assign an equal measure over them, maybe not a probability, and if you then coarse-grain, the probability of an even number will still be one half. But not because at the first order I only looked at one half: because I averaged over all the possible labelings. So in this particular case I would still end up at one half, but with a lot of spread around it; the whole range runs from 0 to 1. I would be applying the formalism in such a way that there are infinitely many cases, in fact non-denumerably many, of how to apply it, and I average over them, so to speak; I still end up at one half, but for a different reason than at the first order. But maybe, given what you said at the beginning, you don't want a theory like that. I think that is what you were saying: you don't want to apply any one of these theories, and you don't want some kind of supervaluation over them either; you really want to reconsider the axioms. Then, though, I don't know how to get started, because with Norton you drop assigning numbers altogether; it is really only an ordering, he gives the outcomes names to talk about them, and one half doesn't make sense there. So I am not sure you can get anything in between those two options; maybe there is something, but at the moment I cannot imagine what it would look like.

There is something weird about this averaging, though. It is a kind of meta-probability, I think that is what you used, which is very interesting, but it is not probability, right? So it might be a good strategy to deal with this incompleteness, but it does not take away the incompleteness, and we can perfectly well have both: just say that these are all interpretations of the axioms we all accept for some reason, and then study them, as we do in set theory nowadays, where we are interested in several universes and compare them, and they are all set theories to some extent. I would say that for probability you likewise have several interpretations of the probability axioms, once you get rid of some axioms that are maybe not ideal, and then you can indeed use some meta-probability theory to make sense of this diversity of possibilities. That does not mean that you have to get one half as the probability of an even number; it just means, well, I find it difficult to think about this meta-probability, I should think more about it.
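A finite toy version of this averaging move (our illustration, not the speaker's formalism): treat every labeling of 1..N as equally reasonable, record the apparent frequency of evens among the first k labels under each, and look at the mean and the spread.

```python
import random

# Each trial draws the first k labels of one random labeling of {1,...,N}
# and records the apparent frequency of even numbers among them.
N, k, trials = 10_000, 500, 2_000
freqs = []
for _ in range(trials):
    first_k = random.sample(range(1, N + 1), k)   # first k labels of one labeling
    freqs.append(sum(x % 2 == 0 for x in first_k) / k)

print(sum(freqs) / trials)         # mean: ~0.5, but only as an average
print(min(freqs), max(freqs))      # spread: individual labelings scatter around it
```

The mean lands near one half exactly as described: not because any single labeling is privileged, but because the average is taken over all of them, with visible spread.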
It does not mean that it is one half; instead we have an uncertainty, and we can measure this uncertainty, but that is at the level of epistemic probability, which is what you are then interested in; very interesting, but a step higher, wouldn't you say? And with the infinitesimals you didn't get to one half either, did you? You said it actually depends on some choices, on the ultrafilter on the natural numbers that you use.

That is very interesting, and I would actually consider it a good thing: don't try to get to one half, there is no reason to go for one half. Although it is interesting that in numerosity theory there is a convention to assume that alpha is divisible by any natural number; then the answer would be exactly one half, for that reason, because alpha is then itself even. So it is also interesting that there is this push toward particular ultrafilters; it almost becomes like a scholastic discussion: what is the divisibility of the numerosity of the natural numbers? I am also more inclined to say: let's allow all the possibilities; it is an epistemic thing we really cannot settle, beyond what you should expect to be able to answer with this kind of theory. So sometimes I am on the same page as you.

I don't know about the other case that you introduced, and sorry for having to step out so much, it is beyond my control. I wanted to ask about Norton and chance: what does he want us to do? Is his proposal that these are all badly formed questions and people should go home and work on something else? Is that really the move, that there is just no asking how we apply probability to a fair lottery on the natural numbers? Is his opinion that it is simply a bad question, or what does he want to say?

It is a strange thing, because when I started looking into it I initially thought he was defending a theory like this one, since he does not want to put countable additivity in as an axiom; but then he seems to assume you get it naturally. It is a very strong intuition he has, so obvious to him that it is even more basic than an axiom when he talks about it. From that perspective, certain questions are intrinsically inconsistent: the intuition is so basic that you don't even put it in as an axiom, and it is clearly crazy to think the question makes sense. That is what seems to be happening, because in his probability theory he only has finite additivity explicitly, and the rest comes from physical intuition, which I think means that he was taught standard measure theory long ago, forgot that it is a convention, and became part of the system. That is how I read what happens there; of course, that is not what the philosopher himself says, so maybe I read it the wrong way, but it is the only way I can explain what one reads there.

Sure, yeah, it is super, it is cool. But no other questions are coming, so: I don't know much about probability paradoxes; I know much more about the semantic and set-theoretic paradoxes, the possible solutions people have proposed for those, and the attempts to unify such paradoxes, because they look quite similar. The probability paradoxes maybe do not fit that, but is there any hope in not starting from ZFC at all, but from something weaker that can allow for inconsistent or at least self-membered sets and so on, without the paradoxes? People have tried such things. Would that strategy be worth trying out: keeping maybe the usual Kolmogorov axioms, but assuming a less paradox-prone set theory in the background?
That is a very interesting suggestion; I haven't explored it at all. I did put ZFC in as a background assumption; I wanted to make it explicit, because it is definitely there. But I didn't go that far back: I didn't go back beyond the real numbers, and those still live in ZFC, obviously. And I don't really have a clear intuition here. It is related to the non-measurable sets that you always have; I mean, you don't need the fair-lottery paradox to get those. So I think it would be a natural move to consider alternative set theories to start from; people will have worked on it in that context, but it is not something I have a clear view on, and whether it would have any impact in this case, I should check; I don't know.

There is a price to pay, of course: you get different weaknesses. It is not that you cannot do serious mathematics in these alternatives, but you get some weaker results; they certainly do not give you the strength of New Foundations. So the question, of course, is what you lose.

Yeah, and the question is also whether it would fit these problems in cosmology. Since we seem to have so little grip on what the structure even is, maybe a very weak theory would actually be a good thing: you don't have to impose structure that isn't actually there, or that you don't know about, and so you don't introduce what I think of as artifacts of the theory. It could be very natural to start from the actual problem the cosmologists have and build a theory from that point. That is not what I did, but maybe that is what it could look like, and then you would expect a very weak theory, because you have very little to start from anyway.

A comment, as we are in the last stretch of time. You discussed the direction of time in classical statistical mechanics, and you made me even more confused than the ordinary problem already does; your talk was very clear, but it now seems to me that the problem is worse than I thought. Let me just state it, and then you can correct me. In classical statistical mechanics, with Boltzmann for example, we tried to put probabilities on phase space, and there are a lot of problems with that; that is why we have this nice picture of ergodic systems, another way to put probability on it that seems much more intuitive: the time you spend in a region, where you have to go everywhere in phase space, given infinite time. But in a situation where we have no idea what our universe is like, isn't it implicit in this approach that you can circumscribe the phase space? If the universe is expanding, the phase space is growing all the time. You may have a standard approach to this, but it seems that our intuition of probability here is tied to spending time in a certain zone, and that was all designed around the notion: I have a phase space, I move around it, I spend a certain fraction of time in each region. Now I don't even have a fixed phase space, because with this expanding universe the phase space keeps increasing. So do we need to imagine a phase space of possible universes, not the phase space of my universe but the phase space of possible universes and expansions, which would be a second-level notion of phase space? Or am I simply confused about the phase space of an inflationary theory?
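The "time spent in a zone" intuition the questioner invokes is the standard ergodic identification of time averages with phase averages; schematically (our gloss, not part of the talk):

```latex
% Ergodic identification: \mu an invariant probability measure on a
% fixed phase space \Gamma, f an observable, x_t the trajectory.
\[
\lim_{T \to \infty} \frac{1}{T} \int_{0}^{T} f(x_{t})\, dt
\;=\;
\int_{\Gamma} f \, d\mu
\quad \text{for } \mu\text{-almost every initial condition.}
\]
% An expanding universe offers no fixed \Gamma with a normalized
% invariant \mu, which is exactly where this identification loses
% its footing.
```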
I haven't actually spent a lot of time looking at how the standard approach handles that. But it is something I do want to get back to for the toy model I showed you, the one by Goldstein and Collins, the central-time model. They mention that it is a non-ergodic theory, which is already interesting, and what was not clear to me, and what I still want to figure out, is whether this non-ergodicity is coupled one to one to their notion of fundamental probability or not. I literally have a paper on my desk with this question, to figure out the relation, so I will think about it more; there is definitely something interesting to explore there. But what you asked about, how the expansion of phase space is handled in the normal approach, I don't know exactly how they do it.

So because it is non-ergodic, you have to put in a probability, or impose a measure, by hand. And do they explain what they are using as a measure?

They don't call it a probability measure, because it is not normalizable; I have called it a quasi-probability, and since all the conditional measures look like conditional probabilities, that seems a fair name for it. But it is definitely non-ergodic in their presentation. Sorry that I can't answer directly, but there is something there: there is a cut between the classical setting, where we hope the world is in that sense a good model, and this one, where we have to find another way in.

Even in economics they are now working with non-ergodic models; maybe we have to live with them.

They haven't gone that far, as far as I understand. I teach one class to economists, about causality in the social sciences, and as far as I understood they have a much less ontological notion of probability, because a big part, not all, but a big part of the history of economics was an empiricist reaction against probability. So yes, they work with all kinds of models, but they would not present this as coming from probabilistic reasoning, whereas even Boltzmann, who may have believed the world to be deterministic, took probability quite seriously, at least as a reasonable approximation.

Now I am getting confused thinking about this central time, because it looks very much like what Boltzmann proposed, in the sense that with Boltzmann there is this maximum-entropy state and you get all these fluctuations away from the maximum; if one goes deep enough, it can create a whole universe, and you also get the two arrows of time. That is very similar. But since they say that entropy can keep on increasing, there is no maximum, and there is no Boltzmann-brain problem. So where is the big bang in this model, and how can there be any central time? Obviously I am getting confused.

The idea is this, and it is where the uniform-probability problem came in: if you initialize such a universe, you want to distribute the positions and the momenta randomly over the possible systems, and you then look at the time evolution. The resulting curve, no matter where you initialize it, at an arbitrary point that has no physical meaning, will always look like that: there will be a minimum in the curve. So you don't postulate it: however you initialize the system, the curve will have a minimum, and that minimum is the central time, and it will look like the big bang of that universe. That is how it works.
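A minimal numerical caricature of that answer (ours, not the model from the talk): initialize free particles with independently random positions and momenta; the spread of the configuration, a crude stand-in for entropy, is then quadratic in time, with a minimum at an arbitrary central time and growth in both temporal directions.

```python
import numpy as np

# Free particles with random positions and momenta: the configuration
# spread var(x0 + v*t) is a parabola in t, so it has a minimum at some
# arbitrary central time and grows in both time directions, giving two
# opposed arrows of time out of one typical initialization.
rng = np.random.default_rng(0)
x0 = rng.normal(size=1_000)            # initial positions
v = rng.normal(size=1_000)             # momenta, drawn independently

for t in np.linspace(-5.0, 5.0, 11):
    spread = np.var(x0 + v * t)        # grows roughly like t**2 from the minimum
    print(f"t = {t:+.1f}   spread = {spread:7.2f}")
```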
Okay, that clarifies it. These models with two time directions out of one big bang: someone was defending that at a conference last year, and people liked it because it is a way to get rid of the past hypothesis, of the question of why the past was special. But they draw an analogy with quantum mechanics, and it is only an analogy; does it work in cosmology? Because in cosmology it is not really a big bang, it is just a smaller state with two directions.

They hope that it will fit, I think. There are also models in relational mechanics, but I am not so familiar with them, so I don't know.

Do we have time for one more? This is, I guess, maybe in the spirit of Peter's question about set theory, although much less sophisticated, because I don't know the theory you are working with. It strikes me that an important way of putting a normally uncontested assumption back into question, in a lot of what you are doing, is this turn to numerosity theory instead of cardinality. A nice general way to phrase a lot of these responses would be: we were just always putting the wrong thing in the denominators of all of these ratios; of course we lacked the ability, we just didn't know how to talk about it. So, partly just because I am completely ignorant: tell me a little more about this, and how you see that assumption, because it strikes me as a potentially really important and super interesting move that could be useful all over the place, for all kinds of things, if there is a much more fine-grained and interesting way to think about the sizes of really, really big sets. That is really cool.

I think it is, yeah. The way I usually introduce numerosity theory is that for finite sets you can compare sizes in at least two ways, and they are consistent with each other. One: if you have two sets and you can put their elements in one-to-one correspondence, you conclude that they are equal in size. But it is also the case, for finite sets, that a strict subset of a set has a smaller size. Those two principles together are not consistent in the infinite case; it is actually, I think, a definition of infinity that an infinite set can be put into one-to-one correspondence with a strict subset of itself. You all know that. But if you want to make sense of the notion of size in the infinite case, you are then facing a choice: do you base size on one-to-one correspondence, or on the principle that a strict subset has a strictly smaller size? You could go either way; historically we have always gone one way, and it is already fairly counterintuitive that there are different sizes of infinity in the sense of cardinality. But there is this other option: can you make sense of a size-subscript-two, which is now called numerosity, which takes as an axiom that the numerosity, whatever else it may be, is strictly smaller for a strict subset? There is by now an axiomatic theory; to prove the axioms consistent, you use methods from non-standard, non-Archimedean mathematics, so you can think of numerosities as hypernatural numbers coming from non-standard models of the natural numbers. And they give you very plausible, intuitive size assignments if you take the strict-subset principle as the guiding notion of size. It is also not a choice against cardinality, in the sense that, again, there is connective tissue: if the numerosity is the same, the cardinality is the same, while the same cardinality does not necessarily mean the same numerosity, obviously; and there are all kinds of rules for how big the difference in numerosity can be, basically, for two sets to still have the same cardinality.
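In symbols, for the evens E and odds O (the numerosity notation n(.) is ours; alpha is assumed divisible by 2, per the convention mentioned earlier in the discussion):

```latex
% Cardinality identifies E with \mathbb{N}, numerosity does not:
\[
|E| = |\mathbb{N}| = \aleph_{0},
\qquad\text{yet}\qquad
\mathfrak{n}(E) + \mathfrak{n}(O) = \mathfrak{n}(\mathbb{N}) = \alpha,
\quad
\mathfrak{n}(E) = \tfrac{\alpha}{2} < \alpha .
\]
% The part-whole principle (E \subsetneq \mathbb{N} gives a strictly
% smaller size) survives, at the price of ultrafilter and convention
% dependence for the exact values.
```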
So I don't see it as being against cardinality: you can really do both, and it is interesting to do both. But on the numerosity side it is a recent theory, and a lot of very intuitive questions are still undetermined, simply because nobody has worked them out. For the natural numbers and their subsets I think it is all developed, but as soon as you go bigger than that, not many people have worked on it. So you get what I think of as counterfactual mathematics: had somebody taken this route earlier, what would mathematics now look like? We don't know. In any case, at this moment very few people are developing it; there are a few mathematicians in Italy working on it, and not very young ones, so I hope somebody takes it up sooner rather than later, because I do think there are more cool applications; it is rather underappreciated.

We do have to stop: it is 21:00, so we could take at most another nine minutes, or we could continue the discussion informally. So thank you; it is hard material to present clearly, and this was very, very nice.