Yeah, so I'm going to be talking about the KZG ceremony, or how I learned to stop worrying and love trusted setups. Five months ago, I think I wasn't the only one who stopped using the chain, or started using it very sporadically; demand just vastly outstrips the supply of block space, and this is not a future we can continue into for Ethereum. We need to greatly change how we're doing things to scale our capabilities, and L2s are the promising way of doing this. For multiple years now we've been hearing about the rollup-centric roadmap and how L2s are going to provide us with what we need, but L2s are only part of what's required. Scaling comes in two variants for Ethereum. There is scaling compute: L2s take us from what Ethereum is right now, which is effectively a single-threaded, very slow computer, to a massively multi-threaded, much faster processor. The problem is that L2s on their own don't have a way of storing their data. Enter proto-danksharding, also known as EIP-4844. The idea is that we enable incredible amounts of data for these L2s to use, just for them, such that if they're providing the compute, here's the storage to match it, and we scale Ethereum up to what we need.

So what does proto-danksharding look like? We have these notions of blobs. A blob is some arbitrary data; from the protocol's perspective we don't really care what it is, but to the rollups it's basically going to be compressed transactions. We then use this device called a KZG commitment to compress down this data and only store a small reference to it on chain. Once the block proposer has proposed this data, the validators check that this data is available: they check that they have access to it and that the network has propagated it, and if so, they vote that they've seen this data, and that also gets put on the blockchain.
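The "compress a whole blob down to a small on-chain reference" idea can be sketched in a few lines. This is an insecure toy where the setup's "points" are plain integers mod a small prime (the real scheme hides them as BLS12-381 curve points, and `tau` would never be known to anyone); only the shape of the computation is meaningful:

```python
# Toy sketch of committing to blob data with a trusted setup.
# The "points" here are plain scalars mod r and hide NOTHING; this is
# purely to show a whole blob compressing to one small commitment.
r = 7919                    # toy group order (a small prime)
tau = 123                   # the setup secret (visible only because this is a toy)
N = 8                       # real blobs use 4096 field elements
setup_powers = [pow(tau, i, r) for i in range(N)]  # stands in for [tau^i] points

def commit(blob):
    """KZG-style commitment: evaluate the blob, read as polynomial
    coefficients, at tau: sum(blob[i] * tau^i)."""
    assert len(blob) <= N
    return sum(b * p for b, p in zip(blob, setup_powers)) % r

blob = [5, 0, 2, 9, 1, 1, 4, 3]   # toy "compressed transactions"
print(commit(blob))               # one small number references the whole blob
```

The point of the construction is that the commitment is constant-size no matter how large the blob is, and (in the real, curve-based version) supports the succinct evaluation proofs that rollups need.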
So from the blockchain's perspective, we know that this data was around; there was an opportunity at some point for all the rollups and rollup users to download the data they care about. The validators then save the data locally to their own disks, and it gets stored. But this has the problem Ethereum has right now, which is that we're storing incredible amounts of data again, and the idea behind danksharding is that we're scaling this way beyond Ethereum's current capabilities. For the people who are already struggling to run their nodes, this would be unacceptable. So the final step is: two weeks later, we throw away the data.

It's not that we promise to keep this data around forever; that's not what proto-danksharding guarantees. Proto-danksharding guarantees that at some point in the past this data was available for people to download, and should they have wanted to, they could have. It is then on the rollups and the users to store the data they're interested in. If you're interested in certain transactions, you could store just your transactions; rollup sequencers, well, they'll be storing everything. But it's just no longer the chain's responsibility to store this. You can host it on IPFS, it doesn't really matter; just that validators no longer need to store it. So effectively they have this sliding window of the past two weeks of data that they need to look after.

Some may ask: why KZG? There's a very full table here; I'm not expecting you to take all of it in. The current solution we use for these kinds of commitments on chain is Merkle trees, but as you can see, there's lots of red in the sharded Merkle tree column on the left. What we really care about is the top line.
If we're trying to enable rollups, and lots of L2s are zkEVMs or ZK rollups, they care about being able to prove things about this data inside the rollup, and as you can see from that top line, if we stick to our current mechanisms it's quite literally not possible. The only realistic alternative would be IPAs, but there are several other trade-offs that they make. So KZG is a very promising-looking thing, but it has this one big red negative, as you see over there, and that's the trusted setup. That's what I'm speaking to you about today.

Trusted setups. This, I hope, should make everyone a little bit nervous. Trust is not a thing we like to have in the system; the motto is trust, but verify. Ideally you shouldn't need trust at all. I'd like to convince you today that in this case, this is something you should be able to trust, and that you can verify lots of things about it.

So the idea is that collectively, as an Ethereum community, and hopefully including many of those outside our community, we're going to summon a secret. There's no single person or entity, at any scale, that will be able to know what the secret is, but together we will have generated it. It runs on a so-called 1-of-N trust assumption. The idea is that we need one person not to have actively been malicious. It's not that we need one person to be good; we need one person not to be bad, not to have actively pulled apart the code and tried to figure out how they could corrupt this. We just need one person not to have done that.

So what does this look like? The ceremony software is going to be written by people. People make mistakes, so there will be bugs.
This is part of the plan. We have multiple implementations that people can use, and you can write your own implementation. The idea is that even if there is a bug in one of the implementations, there are still a lot of contributions made through the other implementations that keep us safe. There will be malicious people; the blockchain setting is very adversarial. Theoretically, if you could collect all of the secrets that people contributed, it would be worth a lot of money, because you could do a lot of bad things; so in some sense there's some value behind this. Not really, though, because you'd really need to get all of them; falling one short is not enough. But there will be some people who will try to be malicious and try to spread FUD, so we lose a few more of these contributions. The idea is that there will still be many contributions which won't have this problem, and by many contributions I really mean a great scale; we're not even close yet. My low-bound goal for the number of contributions is 8,192.

This graph is something I personally care about. It's a bit silly, but hopefully the ceremony we're running will be the first ceremony ever where there are more participants than there are powers. Normally you have these very big, hard-to-run ceremonies with one or two people contributing; here we have a ceremony that's very easy to run, and ideally tens of thousands of people getting involved, ensuring the security of all of this, making sure those candles stay lit.

A lot of this came down to the design decisions of the ceremony. It's designed to be quick and easy for everyone here, no matter what your level of technical skill is.
There's an easy option: there's a website you can go to, click contribute, and it will walk you through it in about five minutes; very easy to run on your computer. If you want to write your own implementation, the specifications have been kept to an absolute minimum, even at the cost of some efficiency in the setup, just to ensure that it's very easy to do. And in other ways, too, we've tried to keep this ceremony as minimal as possible.

The ceremony is also updatable, which will mean something to the cryptographers; for the rest of us it means that in ten years or so, when our community has progressed, when hopefully I'm less relevant to the protocol because we're less capturable, and we've grown further and adopted many more people, we can build upon the previous ceremony and update it, so that we'll have even more people keeping those candles alight.

So how does it work? It sounds like some big scary math, but in reality it's not. The idea is that each individual generates a secret, and they mix their secret in with everyone else's. In this example, the first person mixes in the number three, the second person mixes in the number five, and the third person mixes in the number two, so the combined secret is 3 × 5 × 2 = 30. The idea is that this brown-colored 30 is the mix of all the colors, all the secrets, from the various people that came before; it represents all of those secrets, and these people generated their numbers randomly. But there's a problem with this as it stands: anyone can read it, right? It's just the number 30. That's not a great secret.
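The mixing step from the example can be sketched in a couple of lines (a toy with the plain numbers from the slide; the real ceremony never handles the combined secret in the clear):

```python
# Toy secret mixing: each participant multiplies their own random secret
# into the running product, so the result depends on every contribution.
def mix(secrets):
    acc = 1
    for s in secrets:
        acc *= s
    return acc

combined = mix([3, 5, 2])
print(combined)  # 30, readable by anyone: exactly the problem described above
```
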
So we need to encrypt these, and we encrypt them by, quote-unquote, "hiding them in the exponent": we turn them into elliptic curve points. This allows us to still perform the math I was describing; it is quite literally multiplication, but done in this encrypted form, so that no one can read the overall secret.

There's one other thing, and that's the role of the sequencer. The sequencer's job is to do two things. One is to decide who's going next: there are thousands of people who want to participate, so how do we decide who the next contributor is? The sequencer basically holds the files, and when someone asks and the files are available, it passes them to the next person to contribute. The second thing it does is verify these contributions. We need to be sure that, in the contributions people give back to the sequencer, someone hasn't tried to delete some of the points or done some weird, crazy math that breaks it. So the sequencer double-checks that the files it gets back are still valid files, so that the ceremony can keep progressing.

This sounds like the sequencer has incredible amounts of power. This is not true. Basically, the only thing the sequencer can do is censor. It could lie and say the file you gave it was corrupt or invalid. It could also just not give you a turn: you would log in with your Ethereum account and it could say, sorry, your Ethereum account is not valid, we've got certain validity conditions and it doesn't meet them. Against this, we have signatures which hold the sequencer accountable for its role, so should it try to do these kinds of things?
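"Hiding in the exponent" can be sketched with modular exponentiation standing in for elliptic curve points (tiny, insecure toy parameters, chosen for illustration only; the real ceremony uses BLS12-381 points, but the algebra is the same):

```python
# Toy "hiding in the exponent": publish g^s mod p instead of the secret s.
p = 2**13 - 1   # a small prime modulus (toy choice)
g = 7           # a base element (toy choice)

def hide(s):
    """Encrypt a secret by putting it in the exponent."""
    return pow(g, s, p)

def mix_in(hidden, my_secret):
    """Multiply my secret into the combined secret WITHOUT ever seeing it:
    (g^s)^t = g^(s*t)."""
    return pow(hidden, my_secret, p)

combined = hide(3)               # first participant hides 3
combined = mix_in(combined, 5)   # second mixes in 5
combined = mix_in(combined, 2)   # third mixes in 2
assert combined == hide(30)      # same as hiding 3 * 5 * 2 = 30 directly
```

Each participant can update the hidden value, and everyone can check consistency, yet recovering the plain secret from `combined` would require solving a discrete logarithm.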
We'll have a proof that we can publicize to show that this is happening, and we can decide how to proceed from there. So realistically, the sequencer doesn't have any weird and funny powers. It is just a server that runs a very simple algorithm; it verifies things and passes them on. In the end, it'll be up to the community to verify that the sequencer did its job correctly. There's a transcript file that gets spit out of the ceremony at the end, which contains this final (encrypted) secret, and we can verify that what the sequencer did along the way was correct and valid.

I lied a little bit earlier when I said that's how it works; there's a little more complexity. So how does it really work? You might have heard of something called the powers of tau. These are the powers: it's not actually a single secret, we store multiple copies of the secret. The secret is a number, by the way. We store the secret, the secret squared, cubed, et cetera, all the way up to the secret to the power of 2^12, all as these encrypted points. When it's your turn to contribute, the sequencer will pass you this file, and you come up with a secret, obviously with the help of your computer, and do some computations. You calculate your own personal local secret, or entropy, you calculate the powers of it, and you do basic multiplication: you multiply each power of your secret with the corresponding elliptic curve point given to you by the sequencer. By the linear properties of elliptic curve multiplication, we end up with a result showing that your secret has now been incorporated into the overall secret, ready to pass on to the next person. You take these powers and hand them back to the sequencer, who verifies that what you did was correct and valid, and that you haven't tried to cheat the system. So when can you participate?
We should be launching a testnet very shortly, ideally in the next few days. We're fairly certain that most of the cryptography is locked down; that's pretty reasonable, and we've already had a first audit on it. But there are a bunch of UX bugs that we need to sort out, and part of the testnet is to get people to participate and help make this the best process it can be. We've got a final audit launching in early November, which will hopefully conclude soon after; should anything arise in that audit, we'll squash those bugs, review, and then hopefully launch the ceremony by the end of November.

As for EIP-4844 itself, I don't want to short-circuit lots of people's discussions as to when that's going to ship. Ultimately it's up to the client devs and community demand, but as the ceremony is a prerequisite, we'll basically be running it up until that point, which is a minimum of two months, but obviously it will run longer should it take longer to get to EIP-4844.

There are grants. I'm sure many of you have heard "never roll your own crypto"; I'm here to tell you that you should roll your own crypto. All that's needed to implement this from scratch is elliptic curve multiplication. Maybe some of you have heard of this, or had it explained to you on a whiteboard; it really is a relatively simple thing to do, and you have quite a lot of time to do it. So if you are going to write your own implementation, it doesn't need to be super optimized.
You don't need to read all the latest papers. So if you've ever wanted to understand what's really going on under the hood of elliptic curves, this would be a great way of doing it. Alternatively, you could use an existing elliptic curve library, and on top of it implement just the client side of things: the secret generation, how you sample that randomness, and how it gets passed on to the library.

The second category of grants is for crazy randomness generation and contribution. Maybe you'd like to go on some weird vision quest that somehow produces the secret, and you can prove to other people it was done in an interesting way; or you want to use your local ant farm and measure the locations of ants, or something weird, to sample randomness. The point is something that is very hard to bias, such that you can convince other people that your contribution was valid and that you generated randomness in a novel and interesting way. Functionally these are less important, because your computer should generate this randomness for you; we already rely on our computers' ability to generate randomness for all of the encryption and cryptography used within this ecosystem and the wider web. But it's even more interesting, and we can make even more assurances, if you start doing things like this.

And finally, as I'm sure some people have seen from past ceremonies, you can destroy the hardware used to generate the secret. What's a little different about our ceremony is that the secret only exists in the RAM of your computer for a few seconds, 30 or 40 seconds or so. It's not something that gets stored to disk that you need to get rid of later, so destroying your computer is much less necessary. But should this be something you want to do, or should you think you can convince more people that the ceremony is secure by doing it,
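As a taste of what a from-scratch implementation involves, here is a sketch of double-and-add scalar multiplication over a tiny textbook curve (toy parameters, not BLS12-381, which is what the ceremony actually uses; the curve y² = x³ + 2x + 2 over F₁₇ with generator (5, 1) is a common teaching example):

```python
# Toy elliptic curve scalar multiplication via double-and-add.
# Curve: y^2 = x^3 + 2x + 2 over F_17 (textbook toy, NOT BLS12-381).
P_MOD, A = 17, 2
G = (5, 1)   # a point on the curve, of order 19

def add(P, Q):
    """Affine point addition; None represents the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                    # P + (-P) = infinity
    if P == Q:                         # tangent slope for doubling
        lam = (3 * x1 * x1 + A) * pow(2 * y1, P_MOD - 2, P_MOD)
    else:                              # chord slope for addition
        lam = (y2 - y1) * pow(x2 - x1, P_MOD - 2, P_MOD)
    lam %= P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mul(k, P):
    """Compute k*P by scanning the bits of k (double-and-add)."""
    result, addend = None, P
    while k:
        if k & 1:
            result = add(result, addend)
        addend = add(addend, addend)
        k >>= 1
    return result

assert scalar_mul(19, G) is None       # G has order 19 on this curve
assert scalar_mul(2, G) == add(G, G)
```

A real ceremony client would do exactly this kind of scalar multiplication, just over a 381-bit prime field and with proper subgroup and validation checks.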
Reach out to myself or Trent, and you should be able to see a bit of shouting on Twitter about what good venues for grants are. I think those are the major points I wanted to talk on, so are there any questions?

Q: Why 2 to the power of 12?

A: Ah, we actually have several ceremonies. The ceremony consists of four sub-ceremonies: 2^12, 2^13, 2^14, and 2^15, which are basically all the sizes of data blobs we could ever see needing for danksharding. 2^12 basically gives us one-megabyte blocks in expectation, which could grow up to two megabytes; we think that's a reasonable target and should be fine with current technology, but we give ourselves the ability to scale, what's that, 16 times over, should we need more growth in the future. And if we need anything beyond that, we'll rerun the setup. There's also an optimization we made which is pretty fun and interesting: instead of having one large setup and then using subsections of it, we do several smaller setups, and then we can skip the low-degree proofs, because these setups are disjoint. You don't have more powers should you need them, so the low degree is enforced by simply running out of points.

Q: Sounds like the mic's live. Hello. If I understand correctly, everyone who's participating in the ceremony would have the chance to see what the current secret is. So is there a chance that if you were the last one picked in the sequence, you would have a higher chance to kind of ruin the whole process?

A: Not really, and that's because we don't operate over the secret in plain text. We operate over the secret encoded as an elliptic curve point. In my example where the secret was 30,
we don't actually use 30; we use 30 times the generator of the curve, so some elliptic curve point. When I give you the powers, I actually hand you these elliptic curve points, which means you can't see what the secret is. If you try to build something on top of it, it's useless to you; it just looks like a random elliptic curve point. That's the same security assumption we depend on for the rest of Ethereum's signatures, et cetera. Okay, thank you.

Q: Hello, okay. You mentioned the quantum vulnerability. What is the mitigation for that, and also for BLS signature aggregation? And when is that going to be a problem?

A: When is it going to be a problem? I wish I knew; I think with a lot of money we'd be able to answer that question. If we want to switch away from this to less quantum-vulnerable things, we would probably have to switch to hash-based commitments, Merkle trees. The issue of those not being easily verifiable inside SNARKs would go away, because SNARKs would also be broken; depending on what your commitment schemes are, you'd have to tailor all of that. But we'd have to see quite a big re-architecting of this entire system, so it's not something we have a built-in backup for, like we do for many of the other schemes. In answer to your BLS aggregation question: for the signatures we're using in Ethereum right now, under the hood, alongside your private keys there's also a Lamport key, a Lamport signature key pair, so we can always revert to that to execute a one-time transfer should that need to happen.
So there's more resilience built into that already.

Q: And you don't think it's going to be within the next ten years that quantum will happen?

A: I don't think so, or at least not at a rate where all of a sudden we have to throw everything away. The good news is we probably won't need a trusted setup in the case that quantum computing arrives, so we should just be able to switch out the commitment scheme we're using. But yeah, I don't see it happening in the next ten years, and if it does, I don't think it'll happen overnight.

Q: Thank you. I was wondering: on a testnet, what kind of tooling are you using to exercise these adversarial conditions?

A: Do you mean adversarial against the sequencer, or against the setup in general?

Q: On a testnet, if you wanted one of the participants to, say, hide a secret or not make something available, how easy is it to do that on these live testnets?

A: The thing here is that there's basically just one validity condition. Okay, I'm already on the next person's slides; never mind, the clicker is not working. There's basically one validity condition, or two checks, we need to do to assert that the update has been done correctly. So it's not that there's some weird subset of collusion that can break things; the question is whether we've built the sequencer in a way such that it can't be DoSed or crashed, which we can use fairly standard testing infrastructure for. And when you're submitting these files, we can try to corrupt them and that kind of thing, but that's going to be rejected immediately by the sequencer. So it's a bit different from the standard testnets we see for blockchain networks, because the adversary looks very different; it's more of a web2 adversary attacking the sequencer directly than what would happen otherwise.
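The checks mentioned above can be sketched as pairing-style equations. This is a deliberately insecure toy in which a "point" is a plain scalar mod r and the "pairing" is just multiplication (the real checks use BLS12-381 pairings); only the shape of the two checks is meaningful, and `check_updated` is an especially loose stand-in for the real non-erasure check:

```python
# Toy sketch of the sequencer's validity checks on a contribution.
r = 101                      # toy group order

def pairing(a, b):
    """Stand-in for a bilinear pairing e(a*G1, b*G2)."""
    return (a * b) % r

def check_powers(g1_powers, g2_tau):
    """Check 1: the points really are consecutive powers of one tau,
    i.e. e(P[i+1], G2) == e(P[i], tau*G2) for every i."""
    return all(
        pairing(g1_powers[i + 1], 1) == pairing(g1_powers[i], g2_tau)
        for i in range(len(g1_powers) - 1)
    )

def check_updated(old_first_power, new_first_power):
    """Check 2 (loose sketch): the contributor mixed in a nonzero secret,
    rather than erasing the setup or leaving it untouched."""
    return new_first_power != 0 and new_first_power != old_first_power

tau = 7
powers = [pow(tau, i, r) for i in range(5)]        # [1, tau, tau^2, ...]
assert check_powers(powers, tau)                   # honest file passes
assert not check_powers([1, 7, 50, 343 % r], tau)  # tampered point fails
```

Because these two checks pin down everything that matters, "adversarial testing" of contributions reduces to corrupting files and confirming the sequencer rejects them.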
Q: Okay, makes a lot of sense. Thanks very much for this. For smaller projects, if we're talking about a separate project, would you still trust the trusted setup, or would you think about different strategies?

A: Trusting this setup, or their own setup?

Q: If you were working on a ZK project now that is thinking, should we do a trusted setup, or should we go with something else?

A: It's hard to know. I think a lot of that comes down to the execution: things like how many participants you've had come on board, or how many different implementations you have. We already have a few floating around; I think we could get something like 10 or 12 different implementations. The good news is that once this is up, there's more infrastructure for smaller projects to leverage; in terms of just extending these powers, that would be great. But intrinsically it's very much a case-by-case thing. I don't think trusted setups in general are safe; I think there are ways of executing them that are safe, and some of the things I touched on are important to doing that.

Q: Thanks. Have you considered starting with an already established setup and basically adding your randomness into it? And if you're not doing this, why not?

A: Yeah, there's been a lot of discussion on that. Our setup is done over the BLS12-381 curve, and there have been setups done on it before, but not that many. Anyway, remember the optimization I referenced earlier, where we skip the low-degree proofs because we run out of powers: we don't have to check that a polynomial is of degree less than 2^12, because there is no 2^12-plus-first point.
We literally run out. That optimization falls away if we build on those other existing trusted setups, and it matters to the safety of the KZG commitments and the way we're using them in 4844. So for us, building on an existing setup breaks things. While it would be a good base from the perspective of not knowing what the secret is, for the security properties we care about in our application it doesn't add anything; then it's more of an argument about nothing-up-my-sleeve numbers, and it's much easier just to start with the generator, which is what we're going to be doing.

Oh, yes, it's a sequential process. Sorry, the question was: do you run it in sequence, and how do we handle Sybil attacks? So yes, we do run it in sequence. Because of the way things are encrypted, the way the math works out, it fundamentally has to be a sequential process; we can't combine two parallel ones later. Our Sybil protections are two-fold. One is that you need to authenticate with Sign-In with Ethereum, and we check certain properties of your account. A fresh account is not going to work, but an account from which you've sent even one or two transactions in the past should be good enough. The alternative, should you not want to use Sign-In with Ethereum because you're only Ethereum-adjacent or something like that, is GitHub, where we verify that in the past you've made a commit to some repo. Sybil attacks at low orders, like some people contributing five times or so, don't really matter, because they just mix in five sets of secrets. But if we extend this out, it does start to matter: if everyone managed to Sybil, and everyone was the same person replicated, that would be bad. So we don't need perfect Sybil prevention, just moderately so.