Hello, I want to talk about source verification. First the motivation: why do we need it at all? Life is hard enough in these times of COVID, so why bother with this whole source verification thing? Because it adds important context for making a signing decision. Currently, what we often see is basically YOLO signing: what you see there is not meant for humans, it's meant for machines, but we often just have to deal with it, because currently there aren't many options to avoid it. The contract metadata has been a feature of Solidity since 0.4.7, which was released back in 2016, ages ago. It's basically a CBOR-encoded metadata hash plus the Solidity compiler version. What we are most interested in is the metadata hash. It has existed for a long time, but often it was not accessible: it's either an IPFS URL or a Swarm URL (only Swarm in the beginning, now also IPFS). The problem is that people didn't really publish the content or have the option to pin it, so things were lost, and when I wanted to access it from a wallet, for example, I couldn't resolve it anymore. That's what we want to fix. So how can we publish metadata? One wonderful thing is that Remix now has this "Publish to IPFS" option when you deploy. I want to demo that in between, so that we later see it with fresh data: I take a random word from the Internet and hope it's nothing bad, so we don't get censored by YouTube. I compile this contract, deploy it, publish to IPFS, confirm, and everything's wonderful. Now let's see what happens under the hood when we do that. Before, we had a simple life: there was a deployer, which is often just you, and the blockchain you deploy to. Now things get a bit more complex.
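To make that metadata tail tangible: solc appends a CBOR-encoded blob (containing e.g. the ipfs or bzzr hash) to the runtime bytecode, followed by a two-byte big-endian length, so it can be sliced off without a full CBOR parser. A minimal sketch with entirely synthetic bytes (the fake opcodes and the zeroed digest are made up, not a real contract):

```python
# Sketch: locating the CBOR-encoded metadata that solc appends to runtime
# bytecode. The last two bytes encode the length of the CBOR blob
# (big-endian), so we can slice it off directly.

def split_metadata(runtime: bytes) -> tuple[bytes, bytes]:
    """Return (code, cbor_metadata) for solc-style runtime bytecode."""
    cbor_len = int.from_bytes(runtime[-2:], "big")
    # layout: code | cbor blob | 2-byte length
    cbor = runtime[-(cbor_len + 2):-2]
    code = runtime[:-(cbor_len + 2)]
    return code, cbor

# Synthetic example: fake opcodes + a fake one-entry CBOR map + its length.
fake_code = bytes.fromhex("608060405260043610")
fake_cbor = b"\xa1dipfs\x58\x22" + b"\x12\x20" + b"\x00" * 32  # placeholder digest
runtime = fake_code + fake_cbor + len(fake_cbor).to_bytes(2, "big")

code, cbor = split_metadata(runtime)
```

The recovered `cbor` blob is what points at the metadata file on IPFS or Swarm; in a real wallet you would then resolve that hash to fetch the metadata.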
But you don't have to deal with most of that complexity; that's what we deal with. Now the deploy doesn't only go to the blockchain, but also publishes to Swarm or IPFS. Then the monitor picks it up: it watches for new contract creations on the chains, fetches the data from Swarm/IPFS if it is available, verifies it, indexes it, and pins it. The pinning is important, because it solves the availability problem we had before. And then there is one more thing, a manual input for the verifier, mainly for old, historical contracts. You can currently find it at verification.komputing.org and verify contracts there: you specify the network, specify the address, add the files, and you're done. Even easier now is the Remix plugin: in the source verify plugin you also specify the network, and the files are already selected, so it's a bit easier. You can also publish from there, but usually you don't need to, because when you click "Publish to IPFS" that is already done. So how can we access this metadata? One way is via IPNS, since we publish it via IPNS. The problem is that IPNS is currently very slow; I've had an issue open about that for a long time. Happily, with go-ipfs 0.5.0, which was just released this week, resolving IPNS records should finally get faster. I tried it out and it isn't yet, but the IPFS maintainers say more people need to migrate to 0.5.0 for the change to really take effect in the real world. So all of you who run IPFS nodes, please update, then this gets faster. Another option is to access it via ENS.
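A rough sketch of what that monitor does for each new contract creation, with the IPFS fetch, the compiler, and the pinning all stubbed out as in-memory dicts (every name, hash, and byte string here is hypothetical; a real implementation talks to an Ethereum node, IPFS, and solc):

```python
# Hypothetical sketch of the monitor's verify-and-pin loop.

IPFS_STORE = {"QmFakeHash": '{"sources": {...}, "settings": {...}}'}  # stubbed IPFS
PINNED = {}

def fetch_metadata(ipfs_hash):            # stub for an IPFS gateway call
    return IPFS_STORE.get(ipfs_hash)

def recompile(metadata_json):             # stub for invoking solc with metadata settings
    return b"\x60\x80\x60\x40"            # pretend this is the compiled runtime bytecode

def verify_and_pin(ipfs_hash, onchain_bytecode):
    metadata = fetch_metadata(ipfs_hash)
    if metadata is None:
        return "metadata unavailable"     # nothing was published, or nobody pinned it
    if recompile(metadata) != onchain_bytecode:
        return "mismatch"
    PINNED[ipfs_hash] = metadata          # pinning is what solves the availability problem
    return "verified"
```

The key design point from the talk is the last step: once verified, the content is pinned and indexed, so it stays resolvable even if the original publisher disappears.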
The only problem with ENS is that updating is slower, because you have to pay transaction fees, but resolution is much faster, and we get more decentralization that way: with IPNS you just have one key, with ENS you can have multiple keys. You can also access it via HTTPS, but we want it decentralized, so that's basically the fallback option. So how can we use this metadata? First of all, in WallETH there is now a warning if there is no metadata, to push people to publish it, and I hope other wallets also do that in the future. That's the negative side; the positive side is that if we do this, we can use NatSpec, for example. Here you see "Votes for proposal 23 with a score of 5". I want to show that with the contract we deployed before. You can see it's confirmed on Etherscan and the metadata is published, so by now our verification server should have picked it up. I copy the URL and start the emulator, and then we make a transaction to this address. Now we can add an action; we just have one function here, so we have one action, and we get a warning that we cannot estimate the gas, because we don't have a fallback function. So now we have the score and the proposal. Let's say we give it a score of eight, because it's eight o'clock, and 42 is always a good number. We do that, and we vote for proposal 42 with a score of eight. Another cool thing you can then get, currently only enabled for developers (there is a developer setting), is direct access to the source code; I really want to see more source code of the contracts I'm interacting with. So that's basically that. The next thing you can do with the source verify plugin is to load the contracts at an address directly into Remix and play with them.
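The NatSpec rendering shown in the demo ("Votes for proposal 42 with a score of 8") boils down to substituting the decoded transaction arguments into the function's @notice template. A toy sketch of that substitution step (the backtick syntax and argument names are illustrative; real tooling such as radspec supports full expressions, not just plain parameter names):

```python
import re

# Sketch of how a wallet might render a NatSpec @notice for a pending
# transaction: substitute decoded call arguments into the notice template.

def render_notice(template: str, args: dict) -> str:
    """Replace `name` placeholders with the decoded argument values."""
    return re.sub(r"`(\w+)`", lambda m: str(args[m.group(1)]), template)

notice = "Votes for proposal `proposalId` with a score of `score`"
message = render_notice(notice, {"proposalId": 42, "score": 8})
# message == "Votes for proposal 42 with a score of 8"
```

This is exactly why the metadata matters for signing decisions: without the verified metadata (which carries the NatSpec), the wallet can only show raw calldata.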
A really cool new feature is that you can directly debug a transaction, and it downloads the source code; I want to show that too. I have to use another browser profile for this, because otherwise it would be quite pointless. That's here. Okay, and then we have to select. We could also trigger the debugging directly from inside there, but currently that's not possible, because we need a provider; that's why we need to switch to Injected Web3. Then we enter the transaction hash, it downloads the sources and compiles, and you have the sources here. Wonderful, that works fine.

The next part of the presentation is more about what we really need to improve. Currently, NatSpec and radspec both have "spec" in their name, but neither really has a good specification. It basically says "use JavaScript", but it also lacks information, for example on how to access things in the contract with it. There is not much specification, and radspec currently only has a JavaScript implementation, plus third-party documentation that is basically abandoned. There are some insights about what is currently actually used from NatSpec and radspec; you can find that on contract-matrix at komputing. The good thing is that because it's not much used currently, people also don't use all the features. And, this might be something to discuss in the session afterwards, perhaps we could make a simple specification first and then extend it. It could also be nice to have, as part of the package manager metadata, a field specifying what is actually used, because currently I have to guess whether something is radspec or NatSpec; I can only see whether certain keywords are used.
I can know that something is radspec, but currently it's a guessing game; in the end it should be part of the metadata, I think. That could be part of EthPM; Nick will talk about it directly afterwards. It's a nice project that standardizes metadata, and now it will come together with source verification and become one standard together.

There are some open tasks. Quite important is that we add the whole thing not only to Remix as the IDE, but also to all the CI pipelines, so that publishing is done automatically. Truffle plugins, please; it should really be automatic. A DAppNode integration would be really nice, and we need to get more wallets to support it. I hope more wallets then also warn their users; I think it's their duty to warn users when they interact with an unverified contract, because that is fucking dangerous.

And we need to decentralize more aspects. We tried to decentralize a little bit more: before there was only Etherscan, and I think we went a step further than Etherscan now, but there is still a bit of centralization left, we need to get rid of it, and we have ideas for that. We need to go step by step.

How you can really help: for example, if you have a DAppNode, install the IPFS pinner and pin the verified contracts, so that we solve this availability problem and, if we go down, we are not the only ones pinning this content. Please pin this content, for example with the IPFS pinner or with your own scripts; that would also work. Then there's what we really should do with the ENS setup: currently we basically set the contenthash directly on the resolver.
It would be much nicer if we had a DAO: we just propose new content, and then there is some strategy where several people in an organization need to propose and agree on the new content. Importantly, they all verify the content of the contracts, see that it's valid, and only then publish it to ENS, so that less trust is needed; currently we could publish badly verified contracts. You would see it in the end, but not everyone can, for example the wallets don't want to recompile all of them.

So, some open challenges that are a bit harder. Branding and awareness: we need to raise more awareness, and currently we just call it verification.komputing, but that's only because there was no name in between; if you have a nice name or branding ideas, that would be really nice. We do have a logo now, since last week; very, very good logo.

Hardware wallet support would be really nice, but it's challenging because of the air gap to the hardware wallets. There are some ideas, for example that you regularly load signed content onto the hardware wallet; the Trezor Model T, for example, has an SD card. There are ways, but it's a harder problem.

Localization is a big issue, and that also ties into mutability: the metadata you publish is immutable, but what we could do is include links to where you can get translations or updates, because for interfaces you sometimes just want to update. It's complex, and we need to find strategies to do that without losing the guarantees we create here.

Credits: Christian is now the main driving force on this, thanks for that. Christopher and Eddie, wonderful work; Mark also. And big up to the Remix team, because they integrated it very fast and wonderfully, which made it easy to test.
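The proposal-and-approval strategy for ENS updates described above could be modeled roughly like this (an in-memory toy: the member names and threshold are hypothetical, and a real version would be an on-chain DAO or multisig calling the resolver):

```python
# Toy model of the proposed ENS update flow: a contenthash change only
# goes live after `threshold` members have independently re-verified the
# new content and approved it.

class ContenthashDAO:
    def __init__(self, members, threshold):
        self.members = set(members)
        self.threshold = threshold
        self.approvals = {}           # proposed hash -> set of approvers
        self.contenthash = None       # currently published hash

    def approve(self, member, new_hash):
        if member not in self.members:
            raise ValueError("not a member")
        # each approver is expected to have verified the content first
        self.approvals.setdefault(new_hash, set()).add(member)
        if len(self.approvals[new_hash]) >= self.threshold:
            self.contenthash = new_hash   # the resolver update happens here
        return self.contenthash

dao = ContenthashDAO(["alice", "bob", "carol"], threshold=2)
dao.approve("alice", "ipfs://QmNewIndex")
dao.approve("bob", "ipfs://QmNewIndex")   # second approval publishes it
```

The point of the threshold is exactly the trust reduction mentioned above: no single operator can push a badly verified index to ENS on their own.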
We have the source-verify repo: join there, grab some issues, create some new ones, try it out, or close issues and submit pull requests, as always; we are happy to see that. And join the conversation: we have a Gitter room, which you know by now through the summit; we are at ethereum/source-verify. We also have a video call each month, so ask to join that too if you want to contribute something. That's it from me; I think we have about five minutes for questions.

Yes, thank you. I would be very interested to learn more about the logo; I didn't know you had a logo yet.

I redid all those screenshots recently, because the logo just appeared there. It's this source verify logo. It may sound trivial, but it's quite important that people know about this, that there is a little bit of branding, and that people become aware that we are overcoming this issue that nobody really cares about currently. You basically need the stamp "this thing is verified", so that you get a bad feeling when you interact with an unverified contract, and you really should. I like the term "YOLO signing" a lot.

All right. If you have any questions in the room, raise your hand now using the raise-your-hand feature. Yeah, Mr. Chico.

I'm wondering if you have attempted to crawl Etherscan and get all of the sources from there.

Contract-matrix, for example, is doing that, but there comes this legal problem that Christian mentioned. It would be really easy to import; I even have it locally here because I look some things up, but I don't want to just import it into this. That has basically two reasons: the legal issue, and that Etherscan also injects things into the metadata, so we cannot really recompile.
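To make the recompile-and-compare point concrete: a byte-for-byte ("full") match includes the metadata hash that solc appends to the bytecode, while a comparison that strips that tail (which is what comparisons ignoring the metadata hash effectively do) only proves the executable code matches, not the comments or file paths. A sketch with synthetic bytecode, and sha256 standing in for the real IPFS/Swarm hash of the metadata JSON:

```python
import hashlib
import json

def strip_metadata(runtime: bytes) -> bytes:
    """Drop the CBOR metadata blob plus its trailing 2-byte length."""
    cbor_len = int.from_bytes(runtime[-2:], "big")
    return runtime[:-(cbor_len + 2)]

def match_level(onchain: bytes, recompiled: bytes) -> str:
    if onchain == recompiled:
        return "full match"        # code AND metadata hash identical
    if strip_metadata(onchain) == strip_metadata(recompiled):
        return "partial match"     # same code; comments/paths may differ
    return "no match"

# Synthetic runtime bytecode: code + one-entry CBOR map + 2-byte length.
code = bytes.fromhex("6080604052")
tail_a = b"\xa1dipfs\x41\x01" + (8).to_bytes(2, "big")
tail_b = b"\xa1dipfs\x41\x02" + (8).to_bytes(2, "big")

full = match_level(code + tail_a, code + tail_a)
partial = match_level(code + tail_a, code + tail_b)

# Why the tails differ: changing nothing but a source path changes the
# metadata JSON, hence its hash (sha256 here as a stand-in).
def meta_hash(path: str) -> str:
    meta = {"sources": {path: {"hash": "0x0"}}}  # hypothetical minimal metadata
    return hashlib.sha256(json.dumps(meta, sort_keys=True).encode()).hexdigest()

paths_matter = meta_hash("/home/alice/Vote.sol") != meta_hash("contracts/Vote.sol")
```

This also illustrates the path problem raised in the Q&A below: identical source contents under different paths produce different metadata hashes, so only a partial match is possible without the original metadata file.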
I want to say: I tried recompiling the stuff from Etherscan, but it failed for a lot of the contracts. And the interesting thing is, Etherscan says "fully 100% match" on the website, but it's actually never a 100% match, because they ignore the metadata hash; they never compare the metadata hash. The cool thing about the metadata hash is that it's a hash anchor: it anchors the actual source that was used to compile the binary that was then deployed. On Etherscan you can add arbitrary comments, rename variables and so on, and it will still say 100% match. That's the case here.

Okay, we have a couple more questions. Leo was next.

I just wanted to say that the logo looks like the Solidity and the Vyper logos put together.

Might be that this is the case, I don't know. Then we have Richard.

The last time I checked, this metadata file also contained, to some extent, my local paths, and it was not always consistent: sometimes absolute paths, sometimes relative paths. This made it impossible for me to verify a contract that I uploaded before, from a computer that I have since reset, because I don't have the same paths anymore, I don't have the setup from that time anymore. So doing it for contracts from the past is really hard; if I upload a contract now, I can do it.

If you have the metadata file, then the paths are all in there; you just need the metadata file. The compiler doesn't care about the actual path of the file; it just takes the path you send it and uses that, so it's always about how you call the compiler.

Yeah, but I need the metadata file to create the same hash again, right? And I don't have the metadata file.
I have the exact source from that point, but I don't have the metadata file, which has some nuances I cannot reproduce, and therefore I cannot get it verified.

That's an issue. But it's not just about the paths; it's about any settings whatsoever.

The path was the most obvious one for me, the first thing you see happening there.

And that's why we really need to get this into all the CI pipelines, so you don't have to do it retrospectively. I think it's important for really often-used contracts that we are able to verify them again. And once it's automatic, it's even easier than with Etherscan: you don't have to tick checkboxes, it's just part of the pipeline you agreed to at some point. The only problem that Truffle had, for example: they didn't want to do it by default, because some might not want to publish their sources, I don't know why. But big thumbs up to Truffle, they store the metadata file. So we took that full path, with a bit of a downside. But we have more questions. Next up is Alex.

It was more a comment on what Chris said about Etherscan. They have been removing the metadata for quite a while, I think ever since it was introduced, because it also had the experimental flag in it, which was just causing confusion. Recently, I think after EthCC, I verified one contract on Etherscan: I just grabbed the sources from the repo and verified it myself; it wasn't my contract. I realized only after doing it (I did talk to the author) that I could have added a comment into the source, because it wouldn't have been detected; I could have said anything I wanted there. And this wouldn't be the case if the metadata were considered, so it wouldn't be the case for source verify.

Okay, and another question from Lucas.
Okay, my question is: isn't having a verified source without the metadata better than having no verified source at all? Isn't there some value in that? Because then at least I know that this source code generates the exact same bytecode.

Sure, but the hope is that publishing and storing the metadata just becomes part of the deployment workflow.

No, I totally get that, but I see some value in it for legacy code.

We are actually also thinking about adding source code to this repository that doesn't match the metadata but matches everything else. But then the problem is: what is the correct one? What if someone else uploads other code that also compiles to the same bytecode? Do you store both of them, or just the first one? What do you do?

Yeah, but at least one source code that compiles to the same bytecode is definitely helpful. All the old contracts are still interesting to look at, and leaving them out just because we don't have the metadata doesn't make sense to me.

This is the exact problem Etherscan has, because whoever verifies first is able to introduce comments as they wish. There you do have the option to reach out to Etherscan, since it's the authority on this, to replace the code. But if you decide to only store the first one, then it's up to whoever does it first to make the source look however they wish, as long as it compiles to the same binary.

Okay, there's one more comment from Cairo on Gitter, and then let's move on to the next session after that: "The problem is, I do not want the source code of a project in testing to become public while we do testing on e.g. Ropsten. Once we go to mainnet we may be fine with the source being public right away. I'm afraid to publish like that."

It's the same problem on GitHub: often people are just too afraid to publish something, like "just look at the others". It's no shame to publish. Why not?
I don't see a reason not to publish. Publish early, publish often, because otherwise you often get the problem that people say "yeah, we will publish it later", and then people can only look at it later and tell you later that there's a problem. I don't think that's a problem; publish as soon as the first line is there. But that's my personal opinion.