Thanks so much, Brian, for that amazing introduction. My brother-in-law raises chickens, so now I have a new fun fact to tell him. Well, it's a pleasure to be here, and I'm really excited. This is my first time interacting directly with the Hyperledger community, so I just wanted to say how excited I am to be here, and I hope to begin what I anticipate will be a longer conversation with many of you about ways we could potentially work together at the Starling Lab. I'm here this morning to talk about a case study that we launched earlier this year, looking at misinformation and trying to link some of the important new prototypes we've created into the Hyperledger ecosystem. We learned a lot in that process, so I'm excited to share the results with you. To begin, I want to talk a little bit about the Starling Framework. This is an initiative that began at the USC Shoah Foundation and Stanford's Department of Electrical Engineering, where we're using tools and principles to empower organizations to securely capture, store, and verify human history. That task seems obvious; who wouldn't want to do those types of things? And yet we realized that the human rights community, historians, and others who work very passionately to prevent all sorts of conflict had not yet fully grasped the power of the decentralized internet, even though they were the constituency with potentially the most to gain from these tools. And so we set a course to bring these tools into that community. Today I'm going to give you a little context for our work with this case study, show you the prototype that we actually built with Hyperledger, and then talk about some of the key learnings that emerged. To begin, I want to start by giving you a sense of the people who have joined us on this road.
The context for this project is the USC Shoah Foundation, which began 25 years ago and has been working hard on the prevention of genocide. Its mission began with documenting the testimony of survivors of the Holocaust, an effort started by Steven Spielberg after he made Schindler's List, and it has since expanded to include nine genocides in its collections. The Foundation has also expanded to work on contemporary conflicts, both the rising tide of hate speech and violence and the challenges that have emerged with misinformation. It was a great partner to pair with Stanford's Department of Electrical Engineering, which has a long history of working on innovation and has spent extra effort in the last couple of years working on ethics and how ethics can be brought directly into code. And so, with a host of industry partners, we began with one photograph. This was taken on January 20th, 2017, and published at 2:02 PM on the Reuters newswire. It was meant to be a fairly unremarkable statement of fact: here was the inauguration of Donald Trump happening at 12:01 PM, and eight years before that, from the same vantage point on the Washington Monument, the inauguration of President Barack Obama. What should have been very clear and precise documentation of that day instead turned into something of a circus, as the politicization of this photograph took off and sparked a debate about trust in our media. The dispute was that people with different political viewpoints, and indeed the administration, believed that the photographs that were not theirs were intentionally framed to show a smaller crowd. From their view, the "alternative fact," as they called it, was that the crowd they saw was in fact the largest in history for a presidential inauguration.
Now, far be it from me to get into the politics of this, but when you look at what this meant for the news media, specifically for our partners at Reuters, this was obviously a very disquieting trend, because their photographers were there with the intention of being objective and simply documenting what was going on, as a wire service does. And so we got to thinking: how could we use new tools and technologies to secure information from capture, then in storage, and finally through the process of verification that all news organizations go through? How could we use new cryptographic tools to create what we thought of as an end-to-end signal flow? What that meant was that we were going to use open source tools, create a root of trust in hardware, use some of the great new Web3 technologies that are here and ready to take on this type of challenge, and of course keep the entire framework decentralized. So let me show you a little bit about how we put together this prototype. We started off with a mobile phone that has a couple of new features on it, though the majority of new smartphones actually share most of these capabilities, and we know them well: things like GPS, radio signals, the gyroscope, and the time of day. Those are now easy to establish on a mobile phone. We can take that metadata and pair it with every photo that comes off the camera sensor. The pairing of the photograph and the metadata can then be hashed, and the hash can be signed in a secure enclave, which is available in most new mobile phones. With that, you now have a secure file and hash. We then took a unique content identifier, a CID, on that record, again covering both the pixels and the metadata, and we distributed it.
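The capture step above can be sketched in a few lines; everything here is illustrative rather than the actual Starling implementation, and the secure-enclave signature is stood in for by an HMAC over a device secret, since a real enclave would sign with a private key that never leaves the hardware:

```python
import hashlib
import hmac
import json

def capture_hash(image_bytes: bytes, metadata: dict) -> str:
    """Hash the pixels together with their paired sensor metadata."""
    # Canonical JSON (sorted keys) keeps the hash stable across
    # re-serialization of the same metadata.
    meta_blob = json.dumps(metadata, sort_keys=True).encode()
    return hashlib.sha256(image_bytes + b"\x00" + meta_blob).hexdigest()

def sign_capture(digest: str, device_key: bytes) -> str:
    """Stand-in for a secure-enclave signature over the capture hash."""
    return hmac.new(device_key, digest.encode(), hashlib.sha256).hexdigest()
```

The important property is that changing either the pixels or any metadata field changes the hash, so the pairing itself is what gets anchored downstream.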
We put it on a distributed storage system that allowed us to place it in a variety of different environments: the for-profit cloud, the nonprofit cloud, educational environments, even devices like Raspberry Pis. We found ways of disseminating that information. Then, using advanced cryptographic proofs such as Filecoin's proofs of spacetime, we were able to prove every day that the image had not been manipulated in object storage. And the network underwriting all of this was decentralized. The learning there is that as you continue to decentralize this work, the more people that join, and are able to join, that network, the more secure that network is. We then continued on to our expert review, and this is a really important step; this is where Hyperledger comes in. As experts review the information coming in, one of the things they do is create their own attestations about the validity of that information. What was important to us in that process was that we were able to use a decentralized approach to take that information and put it on a database, which in this case was underwritten by Hyperledger Fabric. Think of it almost as middleware: it allowed us to take that information and start to disseminate it more broadly. First, the metadata was put onto a public ledger, which in this case was the Hedera Hashgraph. That seamless integration between Hyperledger and Hedera was really powerful for us to experience, both in its performance and in the ease with which the technology came together. And secondly, we were able to put it on a decentralized cache.
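To make the attestation idea concrete, here is a minimal sketch of a hash-chained review log; the field names and verdict values are invented for illustration and are not the actual Starling or Fabric schema. Each record carries the content's CID, the reviewer's verdict, and the hash of the previous record, so the review history itself is tamper-evident:

```python
import hashlib
import json

def make_attestation(cid: str, reviewer: str, verdict: str,
                     timestamp: str, prev_hash: str) -> dict:
    """Build a hash-chained attestation record for a piece of content."""
    record = {
        "cid": cid,            # content identifier of the photo + metadata
        "reviewer": reviewer,  # identity of the verifying expert
        "verdict": verdict,    # e.g. "authentic" or "disputed"
        "timestamp": timestamp,
        "prev": prev_hash,     # hash of the previous attestation in the chain
    }
    blob = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(blob).hexdigest()
    return record

def verify_chain(records: list) -> bool:
    """Check that each record's 'prev' matches the prior record's hash."""
    for earlier, later in zip(records, records[1:]):
        if later["prev"] != earlier["hash"]:
            return False
    return True
```

Writing records like these to a ledger is what lets later readers audit who vouched for a photo and in what order.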
This was really important so that Hyperledger wasn't pinged every time the information needed to be verified; instead, we could take snapshots of what was in Hyperledger Fabric and put them out, not just to any cache, but to a decentralized cache that we built with the GUN database, which allowed for rapid performance on this mutable data. The combination of the public ledger created with Hedera and the public cache created with GUN was a really important complement to the Hyperledger solution. And finally, we have the presentation of this information. It was essential to figure out a way to allow end consumers to see all of this. So we worked with Adobe and their Content Authenticity Initiative to create a way to inject all of this information directly inside of photographs. You can see here that the record establishes the photographer, the date, and the location; then we have our signatures coming from Reuters, and of course the links back to the decentralized web. What that meant was that every photograph was no longer just a collection of pixels; it was also a container for all of these additional metadata points and the links back to the decentralized web, all contained within the photograph. To us, that is the ultimate form of decentralization, because if all of this resides within the object, then as the object moves around, no matter where it is, you're able to continually establish its credibility, and that authenticity can be pinged against the various ledgers that are out there. That was a really powerful example for us. So we put it into action with Reuters.
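As a rough illustration of that container idea: the real CAI/C2PA work embeds a signed manifest inside the image file format itself, but a simplified stand-in that appends a manifest to the image bytes shows the principle that provenance travels with the object (the marker and field names here are invented for the sketch):

```python
import json

# Invented delimiter for this sketch; C2PA actually uses structured
# boxes inside the image format rather than a trailing blob.
MARKER = b"\x00PROVENANCE-MANIFEST\x00"

def embed_manifest(image_bytes: bytes, manifest: dict) -> bytes:
    """Attach a provenance manifest so it travels with the pixels."""
    return image_bytes + MARKER + json.dumps(manifest, sort_keys=True).encode()

def extract_manifest(container: bytes):
    """Recover the pixels and the manifest from a container."""
    pixels, _, blob = container.partition(MARKER)
    return pixels, (json.loads(blob) if blob else None)
```

However the file is copied or re-hosted, the manifest rides along and can be checked against the ledgers.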
The prototype began with work we did during the California primary, where we tested out all the technology and up-leveled from where we had initially anticipated, incorporating a hop from a professional-grade DSLR camera to the phone acting as a notary; a variety of different techniques were explored to secure that link from the professional-grade camera to the phone. Finally, we were able to incorporate it directly into the Reuters newswire system. Here you see FotoWare, which is the tool they use for their digital asset management. As we look at this photo, you can click on the tab and see all of the keys that were created with Starling. If I went in as an editor and made a modification to this information, say, changing a caption, as I'm doing in this demo, and we look at the timestamp at which the webhook pushes this update out to the system, we can then jump into our managed solution for Hyperledger Fabric, which was the IBM Blockchain Platform. We can zoom in and see, seconds later, there it is, residing within Hyperledger Fabric. We can then go out to a Kabuto explorer, which shows how this information was syndicated onto the Hedera Hashgraph, and indeed, seconds later, there it is again. And we can see that the contents of the information, as they sit on the public ledger, are encrypted. So we have an ordering system and the marks of authenticity, but we retain security and encryption on the public network. A couple of firsts were achieved here: hardware-based encryption with a solution built on the new Canon APIs for professional-grade cameras.
We had an end-to-end signal flow built entirely for Web3, a FotoWare integration that was the first with Hyperledger Fabric, and a cryptographic UX pioneered by the folks at Adobe, The New York Times, and Twitter with the CAI. All of that was available to us. We then set out on a second pilot, conducted during the 78 days between the election and the inauguration. I don't think anyone sitting there on January 20th of this year could have anticipated what happened in the days leading up to that inauguration. No matter what side of the aisle you sit on, the events that occurred on January 6th, among other days during that transition period, are troubling, and specifically because of the attacks that were waged directly on the media. Again, there are reasonable conversations to have about opposing viewpoints on where we are in American politics. But surely we can all agree that the fourth estate is critical and that strengthening the work of journalists is something we should all have common cause to support. In some ways, if you look at what happened four years before, you can draw some pretty straight lines between those types of violence and the questioning of the media, and ask: is there a better way? Is there a way, as we built with this archive, that we could start to use digital tools to restore trust rather than tear it down? That's precisely what you can see in the prototype's final result, an archive available on our website. There you'll find the immutable records of those 78 days and the photographs that document them. And now let's come to our learnings.
We certainly learned a lot during that time. If you look at even the removal of Donald Trump from Twitter, which happened during those 78 days, it unveils both the possibility of restoring civics and thinking about responsibilities on online platforms, and, on the flip side of that coin, it also reveals the centralized control that exists within those platforms. There is a lot to consider here; these are thorny issues. So I want to surface some learnings, and specifically to put them through the prism of Hyperledger Fabric. First, when we look at what we learned in terms of the technology stack, there are some really important lessons about how things had to come together. What we learned is that we had a lot of hashes to deal with: hashes coming directly from authentication at the source with the cameras, hashes arriving every 24 hours from protocols like Filecoin, and hashes coming out of each transaction every time someone verified something. What we needed was a solution that could take all of those hashes, with all of their cryptographic integrity, and link them back to the object, to the photo itself. And what better way to do that than Hyperledger Fabric, because its advantage is that you can maintain the principle of decentralization as you create those intelligent links. You can therefore have an end-to-end system with both decentralization and intelligence, linking all of these various forms of decentralization together while maintaining the maximum amount of control, because what was special about the Hyperledger Fabric solution was that you could flexibly establish peers to define a zone of trust.
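A tiny sketch of what that linking might look like; this is not actual Fabric chaincode, just an illustrative registry keyed by the photo's CID that aggregates the capture hashes, storage proofs, and verification hashes described above:

```python
class HashRegistry:
    """Link every hash produced along the pipeline back to one photo."""

    def __init__(self):
        self._records = {}  # cid -> list of {"kind", "digest"} entries

    def link(self, cid: str, kind: str, digest: str) -> None:
        # kind might be "capture", "storage-proof", or "verification";
        # these category names are invented for the sketch.
        self._records.setdefault(cid, []).append(
            {"kind": kind, "digest": digest})

    def history(self, cid: str) -> list:
        """Return every linked hash for a photo, in insertion order."""
        return list(self._records.get(cid, []))
```

In a real deployment this state would live on the ledger and be updated by transactions, so the full cryptographic history of each photo stays queryable by its CID.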
The hybrid approach we used here, combining permissioned and permissionless systems, public and private ledgers, really allowed us to maintain, for the long term, the choice to expand our zones of trust: from internal peers that could exist within Reuters and then, as they felt comfortable, out to external peers. And that is the essence of a solution you can use to fight misinformation. I say that because this process of creating an entire web of knowledge around each photograph can really only happen if you're able to make these choices about peers and start to bring more and more people into the process. Examples of those types of peers could include the Poynter International Fact-Checking Network, which can contribute all the work it's doing to validate information as it comes in. They could be standards organizations like the ClaimReview project, or the C2PA, which has been proudly taken on by the Linux Foundation and is working on standards for authentication. When you add all of this up, it allows a system in which individual organizations can build and do their own fact-checking, rather than centralizing all of that work and all of that authenticity onto one single ledger. That would be a horrible, Orwellian result, some sort of single ledger of truth out there. Instead, with solutions like Hyperledger Fabric, every organization can maintain its own ledger and establish the peers and zones of trust it believes it can successfully maintain, and then the collection and interoperability of all of that information is truly the solution: creating overwhelming amounts of information that can create confidence.
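One concrete pattern behind that hybrid of permissioned and permissionless ledgers is to keep the plaintext record on the private side and publish only an opaque, salted commitment to the public chain, which can later prove ordering and integrity without revealing content. A minimal sketch, with illustrative names (this is the general commitment pattern, not the specific scheme the prototype used):

```python
import hashlib
import json
import secrets

def commit(record: dict) -> tuple:
    """Publish only a salted commitment to the public ledger.

    The plaintext record and the salt stay on the permissioned side;
    anyone holding both can later prove the public entry matches.
    """
    salt = secrets.token_bytes(16)
    blob = json.dumps(record, sort_keys=True).encode()
    digest = hashlib.sha256(salt + blob).hexdigest()
    return {"commitment": digest}, salt

def verify_commitment(public_entry: dict, record: dict, salt: bytes) -> bool:
    """Check a private record against its published commitment."""
    blob = json.dumps(record, sort_keys=True).encode()
    return public_entry["commitment"] == hashlib.sha256(salt + blob).hexdigest()
```

This is what lets a newsroom expand its zone of trust gradually: the public network carries proof that a record existed and hasn't changed, while the record itself is disclosed only to chosen peers.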
I'm not sure if these videos are playing, so I'm going to go ahead and proceed, but I want to conclude with some ideas about our sense of responsibility within the community, because this is oftentimes lost in these technology discussions. I think there's a shorthand now within the blockchain community in which consensus is somehow equated with automation. If you look at DAOs, decentralized autonomous organizations, we're blending the idea that creating consensus is enabled by cryptography with the goal of creating more and more automated processes, until maybe the entire thing is somehow autonomous in itself. I don't think that's true. I think that's actually a very problematic approach to consensus, because when you really think about it, consensus requires a lot more work. It requires communication. It requires choice, so that you can bring your information wherever you so choose. And finally, it requires community, in which we can establish a set of common values. Maybe in aggregate we are actually quite diverse and may not agree on all things, but there can be ways of establishing smaller forms of consensus that eventually contribute to larger ones. That is what underpins the ability to create trust, because without all of these other activities, it doesn't matter how much automation you have, you're still going to have disputes. Humans are involved in this process; it's a messy process. And so we as a community need tools that enable these forms of communication, choice, and community. That is not a theoretical debate. When you look at the 20 years of success within the open source community, we can see this is exactly what is needed to create successful open source projects. One of my favorite papers that discusses all of this was put out by the Mozilla Foundation. It's great bedside reading, as I like to joke.
I mean, it's quite dense, but honestly, if you haven't read it, it's worth it. It talks about frameworks for open source governance and the archetypes that exist within the open source community. I think the blockchain world has a lot to learn from the last 20 years of how open source communities have actually done their work: how they pursue the various goals of open source creation, how they form those communities, and how they govern them. It also reveals that there is still so much work to do. Even 20 years later, open source has not been a silver bullet that solves everything, just the same way that decentralization is not going to be some sort of panacea. Algorithmic bias, privacy issues, the need for human intervention, the challenges of private platforms, and finally, of course, diversity: none of those things are solved on their own. Look at the open source community as an example. There is still so much work to be done to create diversity and inclusion within that community. You can see from the stats that even though the majority of people within the community believe it is inviting, and I think that ethos is important to cherish, the numbers tell a different story, which should really be an inspiration for all of us to bring more people into this technology. So, in short, the reason for creating multiple ledgers and links between them is that, from a security standpoint, you want these diverse cryptographic features; you want the performance characteristics that come from using different types of chains, each of which has its own sweet spot in creating trust. That allows for diverse methods of preservation and, finally, of course, diverse views.
Because that indeed is the solution to fighting disinformation: if we can bring overwhelming amounts of evidence from the most diverse views, that's the best chance we have of creating real and lasting consensus about how we all want to be part of this society, and of course of strengthening journalism, which, as I mentioned, is truly something none of us should take for granted. So with that, I'll pause here and turn it back over to Brian.

Thank you, Jonathan. Stick around, there were a few questions asked in the community that I wanted to get to. One of them was about the public blockchains that you're using as the other side of this. Could you mention which protocols those are?

Yeah, in this case we were using, outside of Hyperledger Fabric, the Hedera Hashgraph and Filecoin as the two main chains. We also did some experiments with IOTA and Ethereum, but they weren't included as part of this prototype. We're big fans of a lot of these different projects.

Yeah, yeah. And it sounds like you're using IPFS in there as well.

Yes.

Filecoin, okay. So what happens if, and this was another question that was asked, there's a GDPR-style request to have some content deleted from one of these? How does that affect your use of these technologies?

When you look at some of the authorities on archiving in the European Union, which is obviously subject to the GDPR, there is a big difference between the destruction of information and the availability of information. What the GDPR really mandates is that there is choice for the consumer; but within responsible archiving work, there are still rights afforded to archivists to keep information as they see fit, for historical preservation reasons or other mandates such as journalism. So we've been looking at those issues and trying to understand how it's possible to continue with some forms of persistent storage.
The persistent hashes of that storage are also very valuable, in that if different choices are made in the future, say, to bring information back into the public fold and the public sphere, you can still have those public forms of authentication. So those are important to keep around. But as I think the questioner is signaling, these are by no means easy issues, and they therefore require all of these protocols to take measures so that you can both add information and remove references to that information in a responsible way. And specifically with the distributed storage techniques we were using, content addressing with IPFS and cryptographic storage with Filecoin, those projects have large teams at work thinking about accountability at both the technical and ethical level.

Yeah. Well, I really appreciate the time you've been able to spend with us, allowing us to take a deep dive into this. I also really appreciated your mention of the importance of remembering the role of humans in the governance of these systems and the design of these applications. I'm definitely on team human myself, as I've said at another time this conference. So once again, thank you, Jonathan. And let's shift gears now to