My name is Dawn Song. I'm the founder and CEO at Oasis Labs, and also a professor at UC Berkeley. Today I will talk about decentralized federated learning. First, as we all know, data is the new oil. Data is really helpful for making better decisions and gives us a lot of insights. However, a lot of this data is also very sensitive, and handling sensitive data poses a lot of challenges. For example, we continue to see really large-scale data breaches where attackers compromise and steal sensitive information about hundreds of millions of users in a single attack. We also continue to see users losing trust in companies, as users' data may be mishandled, abused, and so on. So as we handle sensitive data, there are a number of challenges we need to address. One is that the infrastructure may be untrusted: attackers may have compromised the compute infrastructure. The application itself may also be untrusted, because attackers may have launched malicious applications, or the applications may have vulnerabilities. And some applications may have undesirable data leakage. To address these problems, at Oasis Labs we have been developing new solutions. We have built a next-generation blockchain smart contract platform with greater scalability and privacy protection, which enables scalable smart contract execution and also enables, for the first time, confidential smart contracts. On top of the platform, we have built SDKs to enable privacy-preserving analytics and machine learning. So today I'll talk about some new research that we have done at UC Berkeley and Oasis on decentralized federated learning. First, let's look at a motivating example in fraud detection. When banks provide financial services, for example giving out loans, they need to detect fraud. Typically, each bank uses its own data to build a machine learning model for fraud detection.
As we know, it would be much better if these banks could actually share their data to build one fraud detection model instead of individual ones, because with more data the machine learning model can be more effective. However, this cannot be done today, because there are a lot of privacy concerns, since the data is sensitive, and there may also be misaligned incentives and so on. So how can we address this problem? There are a number of alternative solutions we can consider. One alternative is something called federated learning. Federated learning is an approach that has seen some adoption and deployment at Google and Apple, for example. With federated learning, users' data stays stored locally on their mobile phones, and a central server at Google or Apple coordinates the different local devices to collectively train a machine learning model in a distributed fashion. Federated learning takes an iterative approach: in round i, the server has the current version of the global model. This model is distributed to the local devices, and each local device uses its own local data to update the model, producing what we call gradient updates. These gradient updates are then sent back to the central server, which aggregates them to create the next version of the model, and the process continues. The advantage of this approach is that each user's data is stored locally and is never uploaded to the cloud or the central server. But it still needs the central server to coordinate the federated learning, and there are, of course, a number of challenges when we try to use this type of approach to address the problem that I just mentioned.
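The iterative round just described can be sketched in a few lines. This is a toy illustration of federated-averaging-style training, not the actual system from the talk; the function names (`local_update`, `server_round`) and the least-squares objective are assumptions made for the example.

```python
import numpy as np

def local_update(global_model, local_data, lr=0.1):
    """Each client computes a gradient update on its own data
    (here: a toy least-squares objective) and shares only the
    update, never the data itself."""
    X, y = local_data
    grad = X.T @ (X @ global_model - y) / len(y)
    return -lr * grad

def server_round(global_model, client_datasets):
    """The coordinator only aggregates the clients' updates to
    produce the next version of the global model."""
    updates = [local_update(global_model, d) for d in client_datasets]
    return global_model + np.mean(updates, axis=0)

# Four clients, each holding a private shard of synthetic data.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(200):          # iterative rounds of federated averaging
    w = server_round(w, clients)
```

After enough rounds, the jointly trained `w` recovers the underlying model even though no client ever shared its raw data, which is the core idea the centralized coordinator exploits.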
For example, when different banks want to collaborate, one question is who should actually host the central server. Ideally, you don't want to rely on any central party, so you don't want anyone hosting the central server. Relying on a central server can also create issues: the server may become malicious, or it may gain an unfair advantage. In contrast to this federated learning model, which relies on a central server to coordinate distributed devices to jointly train a machine learning model, another alternative solution is what's called a peer-to-peer solution. In this case, the different devices form a peer-to-peer network; there's no central server. Each device communicates with its neighbors through the peer-to-peer network and, again through an iterative approach, the devices together update the machine learning model, utilizing the local data on the different devices. The advantage of this approach is that we don't need to trust any central server, because there simply is no central server. The challenge, however, is that in this peer-to-peer setting there are significant overheads for communication and so on, especially when the number of clients becomes large, and there are also more privacy challenges in this setting. So in our recent work, we proposed a new paradigm that we call decentralized federated learning, where we combine essentially the best of both worlds: a decentralized setting where we don't rely on any central server, while at the same time achieving much better efficiency and much stronger privacy protection than the peer-to-peer setting.
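To make the peer-to-peer alternative concrete, here is a minimal sketch of gossip-style model averaging, where each node repeatedly averages its model with a neighbor so consensus emerges without any central aggregator. The line topology, pairing schedule, and scalar models are illustrative assumptions, not details from the talk.

```python
import numpy as np

def gossip_round(models, pairs):
    """One synchronous gossip step: each listed pair of neighbors
    replaces both of their models with the pair's average."""
    new = models.copy()
    for i, j in pairs:
        avg = (models[i] + models[j]) / 2.0
        new[i] = avg
        new[j] = avg
    return new

# Five nodes on a line graph, each starting from a different local model.
models = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
for step in range(200):
    # Alternate disjoint neighbor pairings so every link gets used.
    pairs = [(0, 1), (2, 3)] if step % 2 == 0 else [(1, 2), (3, 4)]
    models = gossip_round(models, pairs)
```

Every node eventually converges to the mean of all local models, but notice the cost: many pairwise exchanges per round, which is the communication overhead that becomes painful as the number of clients grows.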
The way we do that is that instead of using a central server, we use a blockchain smart contract platform to coordinate the distributed machine learning across the different local devices, the different clients, which hold the private data. With this new paradigm, we achieve a number of advantages. One, again, we enable decentralization, so we don't need to trust any central party for the operations. Two, it is much more efficient and can really scale to a large number of participants compared to the peer-to-peer solution. We can achieve very high utility: the accuracy can be comparable to the centralized model. We achieve strong privacy, protecting the clients' data privacy. And by using a blockchain smart contract platform, we have a built-in incentive mechanism. So now let's take a look at how this approach works. First, let me give you a strawman solution. As I mentioned, one way to achieve decentralized federated learning is to use a blockchain smart contract platform to coordinate the different clients in training the machine learning model in a distributed, decentralized way. One strawman approach is to simply set up a smart contract on the blockchain platform which performs the model aggregation, as in the federated learning setting I mentioned earlier; essentially, this model-aggregation smart contract performs what the central server would do. The advantage here is that it can be very efficient and also provides strong integrity guarantees in this decentralized setting, where we don't need to rely on trust in a central server. However, the challenge with this approach is that there can be privacy concerns, because on a typical blockchain smart contract platform, the blockchain data is public.
And even though in this case the clients are not sending local data to the blockchain platform, the gradient updates used to update the machine learning model can still contain sensitive information. So the question is how we can do this in a more privacy-preserving way. To address these challenges, we propose HiveMind to enable this new paradigm of decentralized federated learning. HiveMind utilizes a number of privacy-preserving technologies, including differential privacy, where we actually train a differentially private machine learning model, and secure aggregation, where the model updates sent to the blockchain smart contract platform are in an encrypted form and we use secure aggregation to do the model aggregation. And of course, in this case, also to protect privacy, we ensure that the model itself is encrypted. Again, we take an iterative approach. In particular, we have enabled this decentralized federated learning on the Oasis blockchain platform. As I mentioned earlier, the Oasis blockchain platform enables confidential smart contracts, where the smart contract state is kept in an encrypted form, and we utilize secure computing to enable secure execution of the smart contracts while the state stays encrypted. So with HiveMind, we utilize this confidential smart contract primitive. In this iterative approach, at each iteration, the machine learning model that we are trying to train can be viewed as part of the state of a confidential smart contract, which we call HiveMind. At each round, this encrypted model is sent to the local devices, with some key management, so the model gets decrypted on the local devices. Each client then uses its own local data to update the local model and computes the gradient updates.
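One standard way to realize secure aggregation of this kind is additive masking, where each pair of clients agrees on a shared random mask that one adds and the other subtracts: individual updates then look random, but the masks cancel in the sum the aggregator computes. This toy sketch assumes pairwise shared randomness (modeled here with one seeded generator) and is only illustrative of the idea, not of HiveMind's actual protocol.

```python
import numpy as np

def masked_updates(updates, seed=0):
    """Apply pairwise additive masks: for each pair (i, j), client i
    adds a shared random mask and client j subtracts the same mask."""
    rng = np.random.default_rng(seed)
    n = len(updates)
    masked = [u.astype(float) for u in updates]
    for i in range(n):
        for j in range(i + 1, n):
            mask = rng.normal(size=updates[0].shape)
            masked[i] += mask   # client i adds the pairwise mask
            masked[j] -= mask   # client j subtracts it
    return masked

updates = [np.array([1.0, 2.0]), np.array([3.0, -1.0]), np.array([0.5, 0.5])]
masked = masked_updates(updates)
# The aggregator (here: the smart contract's role) only sees masked
# values, yet their sum equals the true sum because the masks cancel.
aggregate = np.sum(masked, axis=0)
```

Each `masked[i]` on its own reveals essentially nothing about `updates[i]`, while `aggregate` equals the plain sum of all updates, which is exactly what the model-aggregation step needs.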
The client also adds a differential privacy perturbation to the gradients, so that in the end we can train a differentially private model. These gradient updates, with the differential privacy noise added, are then encrypted and sent to the Oasis blockchain, to the HiveMind smart contract. The HiveMind smart contract then combines the gradient updates and the differential privacy perturbations through secure aggregation and computes the next iteration of the updated model. So again, by taking this iterative approach, we can use the HiveMind confidential smart contract to coordinate the distributed machine learning training across the different devices and enable decentralized federated learning. Now I'm going to give you an example of an actual application, in the domain of anomaly detection. In this case, we can use the HiveMind smart contract to train a machine learning model for anomaly detection, and in particular we look at the application of system log anomaly detection. As computers run programs, they often produce system logs, and these system logs contain very useful information; one can train machine learning models to go over the system logs to try to detect anomalies. One way to do system log anomaly detection using deep learning is to train a sequence model and then use that sequence model to go through the system logs to detect anomalies. In our case, we used real-world system logs and trained and updated the model through this decentralized federated learning on our platform. We enable strong privacy protection while at the same time achieving high utility.
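The local perturbation step is commonly done in the DP-SGD style: clip each gradient update to a fixed norm, then add Gaussian noise calibrated to that norm. The sketch below assumes that recipe; the clip norm, noise scale, and function name are illustrative choices, not parameters from the talk.

```python
import numpy as np

def privatize(grad, clip_norm=1.0, noise_std=0.5, rng=None):
    """Clip the gradient to bounded L2 norm, then add Gaussian noise,
    so any single client's contribution is differentially private."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    return clipped + rng.normal(scale=noise_std, size=grad.shape)

grad = np.array([3.0, 4.0])          # L2 norm 5.0, so it gets clipped
noisy = privatize(grad, rng=np.random.default_rng(42))
```

Clipping bounds the sensitivity of each update, which is what makes the Gaussian noise give a formal privacy guarantee; the noisy update is then what gets encrypted and sent on for secure aggregation.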
And in many settings, the utility is essentially close to the centralized model. To summarize, we have proposed a new paradigm called decentralized federated learning, where we don't need to rely on or trust any central party, and we provide much greater efficiency for training a distributed machine learning model than the peer-to-peer setting, with strong privacy protection and built-in incentive mechanisms. This decentralized federated learning has been enabled by the Oasis blockchain platform, with its unique capabilities for greater scalability and privacy protection. As I just demonstrated, the Oasis network is the first blockchain smart contract platform that can enable this type of decentralized federated learning solution. We hope to encourage everyone to build new applications on the Oasis network. We recently launched our Devnet 2.0, and we also released a number of new developer tools with a new developer experience to make it really easy to build applications on top of the platform. In particular, we are also developing new data privacy APIs to make it easier for developers to build privacy-preserving applications. We are inviting node operators to join our network, we'll be starting a bug bounty, and we'll also be launching a public testnet with a staking competition. The staking competition will be coming later in Q4, and we really welcome everyone to join. Again, we look forward to having people try out the network and build new applications that couldn't be built before. Thank you.