So, this is joint work with Kose John and Franz Hinzen, both at NYU Stern, and Kose is also here, by the way, in case anybody wants to ask him questions too. Despite some of the math in the appendix, I think this paper is actually pretty straightforward. The motivation comes from the obvious fact that Bitcoin has been around for over a decade, but it's still not particularly widely adopted; it faces, as we say, limited adoption. And we thought it would be worthwhile to think about whether the fact that it faces limited adoption after all these years is due, at least partially, to the fundamental economic structure of the way it's set up. That's essentially the research question here. And even simpler than the research question is the key takeaway, which is: yes, at the highest level, we find that the underlying economic structure of Bitcoin is responsible for its limited adoption. In particular, you shouldn't necessarily expect that limited adoption is going to suddenly disappear. Now, within the paper, there are two main results that make this point. We have a number of supplementary supporting results, but there are really two to focus in on. At a high level, the way to think about the first one is really just looking at Bitcoin as constructed. We show theoretically that Bitcoin as constructed is subject to limited adoption, in the sense that limited adoption is an equilibrium outcome. So you shouldn't be particularly surprised by it in reality, and you shouldn't expect it to change very much. Now, one reaction sometimes is that perhaps a way around this is that Bitcoin has a maximal transaction rate, and perhaps relaxing that transaction rate would enable us to overcome the problem. So we look into that, and it does not actually resolve the problem.
You still get limited adoption, even if you allow the transaction rate of Bitcoin to, for example, increase with demand for the system. And before I get into the model, let me try to give you the basic intuition for both of these results at a high level. So for the first result, which again is that with Bitcoin as currently set up you should expect limited adoption as an equilibrium outcome: think for a moment about how you would overcome limited adoption in the first place. At a minimum, you need more people to actually use the system, by definition of overcoming limited adoption. But what would happen if a lot more people actually tried to use the system? Well, you'd get a lot more congestion, and that congestion would endogenously cause fees to go up. This is a pretty straightforward point, and it's not materially different from the fact that, for example, when a lot of people want to use Uber, you get surge pricing; congestion causes fees to go up endogenously. But remember that the mining market is competitive and characterized by free entry, and these fees are revenues for the miners. So this increased usage, which generates increased fees, is increased revenue, which is going to induce entry into the mining market. What that means is you have more computers trying to solve this proof-of-work puzzle, which means that the network topology of Bitcoin is going to be more dispersed, and it's going to take longer to communicate information across the network. That is to say, it's going to increase the network delay of the system. And part of what we show is that when the network delay of the system goes up, it actually becomes harder to generate consensus, which of course is core to Bitcoin making any sense.
And if it takes longer to generate consensus, then ultimately users have to wait longer for their transactions to settle than they otherwise would have. And as has already come up, the whole value proposition of a payment system from a user perspective is speed. You're not interested in waiting a long time, whether it's for traditional queueing reasons or for something like consensus to be achieved. And so indeed, you end up getting limited adoption, and the system isn't really well set up to absorb increased usage. Now, again, one way to react to this is to say, well, maybe you can alleviate some of that congestion by allowing the transaction rate to, for example, increase when there's more demand in the system. But that turns out not to work, and I'll spend a fair amount of time in this presentation discussing why. For now, let me just say that essentially the reason is that an increased transaction rate leads to persistent forking. It leads to more confusion, more disagreement, and therefore a longer time to generate consensus, and still limited adoption. Now, the thing I want to emphasize here is that if you look closely at these results, which at this point I'm only stating at a high level, what's driving them is really what differentiates Bitcoin from something like a centralized payment system. What's really key is this need for consensus, which would not exist in a centralized payment system, and also the endogenous network delay. Hopefully that'll become clear as I move on. So let me actually get into the model. The model has two types of agents, users and miners. The miners are those that store and update the ledger, as we all know, and the users are just people who are interested in having payments made on the system.
And since we're studying Bitcoin, of course, we're going to model Bitcoin, but since we're studying adoption, we also need to model an alternative system so that there is an adoption decision. We do that in a fairly reduced-form way. First, as far as the miners go, it's pretty straightforward, so I won't spend too much time on this. The key thing to know, which is standard, is that the miners optimally process fees in descending order. So, for example, if Catherine is going to pay a higher fee than I am, then she's going to receive service before I do. And the mining sector is competitive, so the number of miners is endogenous and determined by a free-entry condition. As for the users, we use a standard preference structure, the one used in, for example, Huberman, Leshno, and Moallemi. The user preference structure has three pieces. The first is that users get an exogenous positive utility from using Bitcoin, characterized by this R next to the max. But other than that, as we say, it's a payment system, and so users don't like waiting. That enters the user's utility in the second expression, which is the wait disutility. We allow users to be heterogeneous in their disutility sensitivity, with c_i corresponding to user i's disutility per unit of time. The total disutility is that sensitivity multiplied by the actual expected wait, which is given by this W function. You can see it's a function of f and f minus i, which are the fee that you pay and the fees that everybody else pays. So users don't like waiting, and how do they alleviate their waits? Well, they can pay higher and higher fees. But of course, they also don't like paying fees, which is the last and third term in the user preference structure here.
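To make the trade-off in this preference structure concrete, here is a minimal numeric sketch. The wait function and all parameter values below are my own illustrative assumptions (in the paper, W is endogenous and depends on everybody's fees): utility is R minus c_i times the expected wait minus the fee, and a user would stick with the system only if the best achievable utility beats the outside option of zero.

```python
# Illustrative sketch of the user's problem (my parameterization, not the
# paper's): utility U_i(f) = R - c_i * W(f) - f, where W is an assumed
# decreasing wait function. Higher fees buy shorter waits, but fees are costly,
# so the user's optimum is interior.

def wait(f, a=10.0):
    """Assumed wait function: higher fees buy priority, so waits fall in f."""
    return a / (1.0 + f)

def best_utility(R, c_i):
    """Grid-search the user's optimal fee and report the resulting utility."""
    fees = [0.01 * k for k in range(0, 2001)]  # candidate fees in [0, 20]
    return max(R - c_i * wait(f) - f for f in fees)

# A patient user gets positive utility and would use the system; a very
# wait-sensitive user cannot do better than the outside option of zero.
patient = best_utility(R=5.0, c_i=0.2)
impatient = best_utility(R=5.0, c_i=5.0)
print(patient > 0, impatient > 0)  # → True False
```

Under these made-up numbers, even the impatient user's best fee leaves utility negative, which is the sense in which wait-sensitive users drop out.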
So essentially, users don't like waiting and users don't like paying fees. The more fees they pay, the less they wait; the fewer fees they pay, the more they wait. So there ends up being an interior optimum, which users would pay if they were to use Bitcoin. But we allow that there is an alternative. We basically normalize this outside option to zero and say that if the optimal fee generates a utility that is negative, then the user abandons Bitcoin and uses something else. That's the sense in which adoption is modeled in this paper. Now, one question that's always important to think about when examining a theory paper is: what's new, and where are the results coming from? The truth is, in this context, the key thing to keep your eye on is this endogenous wait function. Because, relative to, for example, Huberman, Leshno, and Moallemi, we model the consensus process more thoroughly, and that's going to be crucial to driving our results. To dig into that, I need to talk a little bit about how we model the blockchain. As a first point, I think everybody here is probably already familiar with this: the proof-of-work puzzle is just a bunch of Bernoulli trials, and thanks to the Poisson limit theorem, we know that when you have a lot of trials with very low success probability being done over a short period of time, that's approximately Poisson, which gives rise to the Poisson process. So people tend to model the block production rate for an individual miner as following a Poisson process, and that's also what we do. But the key distinction from papers like Huberman, Leshno, and Moallemi is in the next step, which is to say: okay, we agree that individual miners' block-finding processes are approximately Poisson.
But what we ultimately care about is the aggregate block production process, because that's what determines what actually ends up on the ledger. What most papers say here is: well, you have a bunch of miners, each of whose block production processes are Poisson and independent, and the sum of independent Poisson processes is also a Poisson process. So the aggregate block production rate is just a Poisson process whose parameter is the sum of the individual miners' parameters. And we're going to point out that that's not exactly right. Why not? In a word, forks. A fork is an instance in which you have, say, two blocks that are not on the same chain. What that means is that it's not the case that the sum of all the blocks produced by every miner equals the total number of blocks on the ledger, because it's possible that two blocks produced by two different miners are inconsistent and correspond to a fork. Therefore, the total number of blocks on the chain can be strictly less than the sum of the blocks produced by the individual miners. We're going to incorporate this into our analysis, and it's going to be very key for driving our results. And I should say here, particularly since I'm presenting virtually at Toulouse with Catherine, Matthieu, and Christophe here with their paper, The Blockchain Folk Theorem, which also looks into forks: in their paper, the forks are driven by miner incentives. We're going to abstract completely from that in our analysis, for two reasons. One is that we think they did a good enough job of studying that, so we don't see the point in also studying it. But secondly, we're purposefully going to make as generous assumptions about miner incentives as possible when thinking about whether Bitcoin can achieve widespread adoption.
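As a stylized illustration of the point that blocks produced is not the same as blocks on the chain, here is a small simulation. This is my own simplification, not the paper's model: treat aggregate block production as Poisson, and say a produced block only makes it onto the consensus chain if the gap since the previous block exceeds the propagation delay; otherwise the two blocks fork.

```python
# Stylized simulation: aggregate block *production* is Poisson with the summed
# rate, but blocks found within the propagation delay D of the previous block
# fork, so the consensus chain grows strictly slower than total production.
import random

random.seed(0)

def chain_fraction(total_rate, delay, n_blocks=100_000):
    """Fraction of produced blocks that survive onto a single chain."""
    on_chain = 0
    for _ in range(n_blocks):
        gap = random.expovariate(total_rate)  # inter-arrival time of production
        if gap >= delay:                      # block propagated before the next arrived
            on_chain += 1
    return on_chain / n_blocks

# With zero delay every block makes the chain; with a positive delay some
# production is "wasted" on forks.
print(chain_fraction(total_rate=1/600, delay=0.0))   # Bitcoin-like: 1 block / 600 s
print(chain_fraction(total_rate=1/600, delay=10.0))  # assumed ~10 s propagation delay
```

The second number comes out just below one, matching the intuition that at Bitcoin's slow block rate forks are rare but not absent; raising the rate or the delay pushes it down quickly.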
Because, of course, if it can't achieve widespread adoption even when you make generous assumptions about miner incentives, then it certainly isn't going to be able to do so when you introduce the potential for miner incentives to create problems. So we abstract from those sorts of miner-incentive-driven forks. The sorts of forks we're going to be concerned with are forks that arise essentially for technical reasons. Which is to say, we explicitly model network delay, the time that it takes to propagate information through the network, and we model the forks that arise from it. To be a little more concrete, let me use an example. Imagine the Bitcoin network has only two miners, say me and Catherine. I produce a block at t equals zero, and she produces a block at t equals two. Now, do these two blocks correspond to a fork? Are they inconsistent with each other? Well, if we're trying to coordinate, in the sense that each of us will accept any block so long as it's consistent with the ledger we already have, then the answer is: it depends. And it depends on network delay. If you assume that information is instantaneously propagated through the network, so that the block I find at t equals zero is instantly transmitted to Catherine, then when she solves her puzzle at t equals two, it will be the puzzle corresponding to a blockchain that includes my block; the blocks will be consistent, and it won't be a fork. However, if it took, for example, five seconds for me to transmit that information to her, then the puzzle she solved at t equals two does not incorporate my block, and so her block is necessarily going to be inconsistent with mine, and mine with hers.
And what's even worse is that she's going to think her block came first, because she didn't hear about my block until t equals five and she produced hers at t equals two, whereas I'm going to think my block came first, right? So we have, if you like, an honest disagreement, and that's where the forks in our model arise. But let me be a little more formal about this, which we do in more detail in the appendix. Imagine you start from a point of consensus, where everybody in the network agrees on the state of the ledger. At some point, of course, somebody's going to produce a block, and you can ask: if everybody's trying to coordinate in the sense I was describing, what's the probability that the network uniformly accepts that block, so that we don't have a fork? Well, it's just repeating the exercise I sketched out with two people, except with more. And it turns out that the probability of agreement on that block is exponentially decreasing in two quantities. One of those quantities is network delay, which hopefully is straightforward: the longer it takes to propagate a block through the network, the more time transpires over which somebody could produce a block without knowing about your block, and could therefore end up disagreeing with you. But what's more important, particularly for that second main result I want to focus on, is that the probability of agreement on a block is also decreasing in the transaction rate of the blockchain. That means that if you speed up the blockchain, you're going to increase disagreement. I think that's an important point, and it is key to driving our second main result. But it's also worth pausing for a moment to think about whether it's coming out of something parametric in the model or whether it's really arising from the fundamental structure.
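The comparative statics just described can be written in a stylized closed form. I'm only matching the exponential shape mentioned in the talk, not reproducing the paper's exact formula: starting from consensus, a block is accepted network-wide only if no competing block is found during its propagation window, so with an assumed aggregate block rate lam and network delay D, agreement happens with probability roughly exp(-lam * D), decreasing in both quantities.

```python
# Stylized agreement probability (assumed form, not the paper's exact result):
# P(agreement) = exp(-block_rate * delay), i.e. the Poisson probability that no
# competing block arrives while this one propagates through the network.
import math

def p_agreement(block_rate, delay):
    """Probability that no competing block arrives during propagation."""
    return math.exp(-block_rate * delay)

base = p_agreement(block_rate=1/600, delay=10)        # Bitcoin-ish parameters
slower_net = p_agreement(block_rate=1/600, delay=60)  # worse propagation delay
faster_chain = p_agreement(block_rate=1/60, delay=10) # 10x the transaction rate
print(base, slower_net, faster_chain)
```

Both a slower network and a faster chain push the agreement probability down, which is exactly the two comparative statics in the result: agreement is exponentially decreasing in delay and in the transaction rate.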
And if you think about it for a moment, I think you realize that this comes very directly out of the way Nakamoto envisioned the whole system working. Why do I say that? Well, the whole point of the proof-of-work puzzle as devised, which is why it's a useless puzzle that you solve by brute force, is that nobody in the network is supposed to get an advantage in solving the puzzle relative to anybody else. An implication of that fact is that if you want to speed up the blockchain, you necessarily have to speed up every miner in the network. In practice, of course, everybody's going to try to use the best computational power available, so miners are going to be approximately equally likely to solve the puzzle; you cannot have one miner sped up while another is not. You have to speed up every miner to speed up the network. And what is the implication of that? Well, if you speed up the network, you will get blocks faster, but you will also make it essentially impossible to propagate blocks through the network fast enough to avoid disagreement. One way to think about it: you want the first block to come faster, but the moment that first block arrives, you suddenly want everybody to stop so that it can be propagated through the network without disagreement. But you can't just turn it off; you can't just slow it down. In particular, one of the reasons you can't is that the rest of the network doesn't even know the block has been discovered. The problem is that you're not able to convey the block fast enough, and the reason you're not able to is precisely because you sped the system up. So there's a built-in self-defeating aspect that undermines the ability to speed up the blockchain. We model network delay in the paper in a fairly parsimonious way, so that we can capture arbitrary market structures and formally state the definition of an equilibrium.
And we show that an equilibrium exists and is essentially unique. Here's a restatement of our first result and some intuition, which I'm going to skip over due to time. I want to focus on the second result, which basically says that even if you allow the transaction rate of Bitcoin to vary with transaction demand, then, so long as Bitcoin remains decentralized, it is necessarily going to experience limited adoption. Digging in a little further, and using some of the structure I just explained for the forking probabilities: within the model, the endogenous expected wait time, that W of f and f minus i on the user preference slide, decomposes into two parts. One is what we're calling the expected traditional wait time, and the other is what we're calling the expected consensus wait time. One of the key points here is that if you had a centralized system, this thing we're calling expected consensus wait time would not exist. What would you have? Essentially, there are a number of users who are in some sense higher priority than you are, and you have to wait for them to get processed before you get processed, right? That's basically how it would work in a centralized system, and that is still part of the story here. And that makes it very easy, in effect, to overcome things like limited adoption. Why? Well, for example, if you have 10 people ahead of you in line and the system processes maybe 10 per second, that's a one-second wait, and that's probably fine. And if suddenly there are 10,000 people in front of you because demand has exploded, then in a traditional centralized system you could just speed up the system to process 10,000 per second, and you'd still have a one-second wait. That would solve the problem: just speed up the thing and it'll be fine. But here you have the second term, what we're calling the expected consensus wait time.
This is, for example, a specific parameterization of the model, but the main point is more generally true; this version is just very clean. The expected consensus wait time is essentially exponentially increasing in the transaction rate of the blockchain. And it's not an accident that the agreement probability was exponentially decreasing while this is exponentially increasing; those are very closely tied together. The point, essentially, is that you get exponentially more disagreement as you speed up the blockchain, because the blocks are coming so fast that there's just no ability for the system to converge. It does eventually converge with probability one, but the time it takes increases exponentially. So in fact, if you do the exercise you might do in a centralized system, speeding up the blockchain by taking the transaction rate to infinity, that first term does vanish, but the second term explodes, and the whole thing ends up being self-defeating. If you try to speed up the blockchain too much, you're just going to get persistent forks, essentially exponential confusion; the system will become unusable, the wait times will be too large, and you'll still get limited adoption. Now, we actually have another supporting result that I think really hammers this home. Because we model network delay fairly parsimoniously, you can also turn it off in the model, which is to say you can assume that everything is instantaneously propagated. And indeed, if you assume that everything is instantaneously propagated, then limited adoption basically disappears. The intuition is fairly straightforward: if you speed up the blockchain and you don't have to worry about consensus, then you remove all the congestion, so the fees go to zero. And because the blockchain is moving so fast, your waits are also going to zero.
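The wait decomposition just described can be sketched numerically. The functional forms and parameters here are my own assumptions chosen to mirror the talk's shapes, not the paper's parameterization: a traditional queueing term that shrinks in the transaction rate, plus a consensus term that grows exponentially in it, so total wait has an interior minimum and speeding the chain up without bound is self-defeating.

```python
# Stylized wait decomposition (assumed forms): traditional wait falls like
# 1/lam as the transaction rate lam rises, while the consensus wait grows like
# exp(delay * lam), so total wait is minimized at an interior rate.
import math

def total_wait(lam, queue_load=100.0, delay=2.0):
    traditional = queue_load / lam     # queue drains faster at higher rates
    consensus = math.exp(delay * lam)  # forks make consensus exponentially slower
    return traditional + consensus

# Scan transaction rates and find the wait-minimizing one.
rates = [0.1 * k for k in range(1, 100)]
waits = [total_wait(r) for r in rates]
best = rates[waits.index(min(waits))]
print(best, total_wait(best))  # interior optimum: neither very slow nor very fast
```

Pushing lam toward zero blows up the traditional term and pushing it toward infinity blows up the consensus term, which is exactly the sense in which taking the transaction rate to infinity does not eliminate waits the way it would in a centralized system.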
So everybody essentially ends up adopting, which is in line with what you would get if you tried this exercise of speeding up the blockchain in, for example, Huberman, Leshno, and Moallemi. But a key point here is that network delay is a fact of reality, and it's particularly relevant in a case like Bitcoin, where you have an endogenous network size. And the whole need for consensus is foundational to Bitcoin and, more generally, certainly to permissionless blockchains. So what's really driving these results are very core features of Bitcoin, at least as compared to centralized systems, and I think they're important limitations to internalize when we think about these sorts of systems. The goal here is not necessarily to be particularly pessimistic, I should say. The point was really more to highlight that certain technical features, like network delay, do have meaningful economic implications. And particularly given that most of what's being studied in practice now is not Bitcoin but alternative protocols, let's say more interesting ideas, it is important to keep in mind some of the limitations that arise due to things like the need for consensus, and to think about how those limitations vary across newer protocols. So the hope for this paper is that it provides a formalization of this limitation that can then be used when we think about more modern protocols, to see whether we're actually overcoming this key problem of limited adoption. One way to put it: you wouldn't want a doctor prescribing a solution to a problem without first having diagnosed the problem properly, right? So the goal here is to diagnose it in the case of Bitcoin.
And while it's not really something you can overcome in the current Bitcoin scheme, that doesn't mean that, for example, Algorand doesn't overcome it, or something like that. So, you know, we think more than anything, this is a useful formalization to take forward when we think about more modern blockchains. So with that, I'll stop here. Thank you.