What is your second choice for scaling if Lightning doesn't work out? Yeah, we've got a plan B question in the house. Where are you? So I have a plan B, a plan C, a plan D, plans all the way through Z and beyond. Then we start with AA, AB, etc. It's a base-26 system. We have many plans. Lightning is not the only bet. There are other mechanisms for doing second-layer networks. We are also looking at, and I'm interested in, the research being done by developers on optimizing transaction capacity by batching signature verification and aggregating signatures (techniques called batch verification and signature aggregation), alongside Schnorr signatures, Taproot, and Tapscript, which are also great mechanisms for compressing the scripts used in Bitcoin scripting. On top of that, there are other optimizations in the use of network bandwidth being applied today with protocols like Erlay, and more and more and more. The innovation continues. Lightning was only envisioned in 2013. It's now deployed, and it's growing every single day. And if it doesn't work, we'll do something different, something better. And we don't have to do just that; we're going to try every scaling mechanism possible. There are no easy answers. If you've been told there are easy answers, people are lying to you, and they're probably getting very rich while lying to you at the same time. You may notice that and think about it.

Miners who secure the network are, in future, going to rely more and more on transaction fees for their reward. We also have an indication that Bitcoin probably needs a layer-two solution like the Lightning Network. If a lot of these microtransactions are taken off onto a layer-two solution, what effect could this have on the security of the network, with miners getting less reward? That's a great question. First of all, you've got to understand that this is a natural, market-based evolution that is happening in real time.
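As an aside, the plan-naming scheme described above (A through Z, then AA, AB, and so on) is the same bijective base-26 numbering used for spreadsheet columns. A minimal sketch of it, assuming 1-based plan indices:

```python
def plan_label(n: int) -> str:
    """Convert a 1-based plan index into a letter label:
    1 -> 'A', 26 -> 'Z', 27 -> 'AA', 28 -> 'AB', ...
    This is bijective base-26, the spreadsheet-column scheme."""
    label = ""
    while n > 0:
        # Shift to 0-based before dividing, so 26 maps to 'Z' rather than rolling over.
        n, rem = divmod(n - 1, 26)
        label = chr(ord("A") + rem) + label
    return label

# Plans A..Z cover indices 1..26; plan 27 is the first two-letter plan, "AA".
print(plan_label(27))  # -> AA
```

The `plan_label` name is illustrative, not anything from the talk; the point is simply that the labels never run out.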
The reliance on fees versus the coinbase reward, or block subsidy, that calculation of which is more important, doesn't happen at some point 100 years in the future. It happens today. It happens every day, when miners do a profit-and-loss calculation in order to decide how much more equipment to deploy, which equipment to turn off, which equipment is no longer profitable, and which equipment is now profitable and should be used more aggressively. They allocate their energy commitment based on this profit-and-loss calculation. And the contribution of fees relative to the block subsidy has already reached a 50-50 point at moments of very, very high utilization of the network. So this is already happening.

When you move transactions to an off-chain network, that has a multiplier effect. The multiplier effect is not to have no transactions on the base layer. It simply means that for every one transaction on the base blockchain, we can do perhaps tens of thousands of transactions on the second layer without incurring additional fees. But you still need anchoring transactions to open and close channels. Increasingly, what we're going to see is a hybrid environment, where your wallet will choose what to do based on the circumstances: the level of fees, the size of the payment you want to make. It will do intelligent routing, just like we do least-cost routing in telephony, or shortest-path routing on the internet. Your wallet will make decisions about which layer-two networks to use, because there won't be only one, and about what ratio of on-chain versus off-chain to use.

Now, keep in mind that because of a technology called splicing (splice-in and splice-out), you can actually combine things within a transaction, almost like the CoinJoin we talked about before. Some inputs of the transaction can be on-chain redemptions of coins you hold on-chain, and some of the inputs can be channel-closing inputs from channels you're closing in order to rebalance your layer two.
At the same time, on the output side, some outputs can be direct on-chain payments, perhaps for larger amounts where doing it on-chain is more effective, and some of the outputs will be channel-opening outputs that open new channels to better balance your activity on layer two. There will not be one set of transactions for on-chain payments and another set of transactions for opening and closing channels. Those things will happen simultaneously. There will be plenty of transactions on the network.

In fact, layer two is only the first step in a scaling journey that has to involve hundreds of steps: infrastructure improvements, new algorithms, and new ways of balancing the essential requirements of security, scalability, and decentralization in a way that allows us to bring this universal basic finance technology to everyone. We cannot solve this with one magic bullet, any more than you can solve the scalability problem on the internet. Because the moment you solve the current scalability challenge on the internet, it opens the door for every application developer to say, huh, I think we can do 4K video now, and then you have a new scalability problem. The internet gets completely flooded and nothing works, so you solve that, and as soon as there's more capacity, the app developers go, VR 4K video, cool. And then you have a scalability problem again. I've been watching that play out since 1989, when I didn't have enough bandwidth for text email, and then I did, but not enough for sending attachments, and then I did, but not for high-resolution images, and then I did, so I could do voice calls, and so on, etc., etc. This is a continuous development.

The engineers in this room will probably understand me when I say: if you give me a fiber connection that offers me 100-gigabit internet to my home, the first thing I do is call the internet service provider and say, did you really mean that?
Like, can I actually use it? "We can assure you, sir..." This has happened to me; it was a one-gig fiber connection. "Yes, we can assure you, sir, that you can use it." I'm not sure you understand me correctly. I will actually use it. I'm an engineer. I can use it. "Absolutely, sir, no problem at all." One optimized router and switch later, plus two virtual servers, and I pegged that connection for 12 months at one gig down, one gig up, 24 hours a day. You give me a 100-gig connection? I'm like, excellent: the Internet Archive needs a local mirror, and Wikipedia should probably be replicated, in every language possible, on my little server at home. You give me a terabit connection? I'm like, here we go: 4K VR video for the world. I can do Netflix from my house; not watch it, serve it to everyone.

Any engineer who looks at a capacity limit is immediately thinking: what could I do if it was doubled, tripled, quadrupled? We think Moore's law. There is no way we can solve blockchain capacity once and for all. The moment you solve it, I think microtransactions. The moment you solve that, I think picotransactions in microseconds. The moment you solve that, I think machine-to-machine payments with picotransactions in microseconds, happening 24 hours a day on all of the IoT devices. We will always have a scaling problem. What's really interesting to me as a milestone is: when do we get this scaling to the level where we can start onboarding more and more of the other six billion? And on that note, I'm going to end it today. Thank you so much.