Great, thanks. So this is a quick overview of the tech we've been exploring at Fermi, and then some food for thought on how to apply trusted computing to scientific research, down to the data generation and acquisition phase.

We've heard that in recent years there have been lots of hacks and incidents of surveillance compromising user privacy, mostly due to the centralization of big tech companies. This came out in Bloomberg last month: supposedly the Chinese military added chips to Supermicro servers. Supermicro manufactures servers that Apple, Amazon, and other big tech companies buy and deploy in their clouds. Apple refuted all of it, so we're not entirely sure whether it's fake news, but the point is that, according to Bloomberg, a super tiny chip was inserted into the servers and used to secretly add instructions to the CPU, which supposedly allowed China to eavesdrop on cloud communications.

Obviously the big tech companies have their own solutions regarding privacy. Apple has been pushing the envelope here on the consumer side, both in hardware and in software. On consumer devices they have the T2 chip in MacBooks and the A7 chip and onwards in iPhones, iPads, and the Apple Watch. What these processors have is a coprocessor that includes a Secure Enclave, which is basically a hardware filter acting as a barrier between the information and everything else, even the CPU, the main processing unit in your device. In this case it's used mostly for Face ID and Touch ID data, so the biometric information you give to your phone is never accessed, even at the hardware level. So this is a fairly secure system.
This is the main architecture. Even main memory can't access the biometric data you provide when you unlock your phone.

On the software side there's a technique called homomorphic encryption. Apple in particular is using something a step behind it called differential privacy, which is about protecting datasets at the data generation event, on the device itself, by adding statistical noise before the data is actually sent. The problem is that recent research, especially this year, has shown that naively anonymized datasets can be de-anonymized, so simple anonymization is not the most secure method we have these days.

The step beyond that is homomorphic encryption, a cryptographic approach where you can do computations on the encrypted data itself. The problem is that it's still mostly an academic approach, so it's very slow, though this table shows when these use cases could become practical, especially in health care, within one or two years. The point is that data is already encrypted in transit, with SSL/TLS, and at rest, in databases; the open problem is how you encrypt data efficiently while it's actually being computed on, in memory.
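To make the differential-privacy idea concrete, here is a minimal sketch of the Laplace mechanism for a counting query. The function names are my own, and a real on-device deployment involves much more machinery; this just shows the core trade of accuracy for a provable privacy guarantee:

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential draws is Laplace-distributed.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(records, predicate, epsilon: float) -> float:
    """Answer "how many records satisfy predicate?" with epsilon-DP.

    A count has sensitivity 1 (adding or removing one person changes it
    by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [23, 35, 41, 52, 67, 29, 44]
# Smaller epsilon means more noise and more privacy per query.
print(private_count(ages, lambda a: a > 40, epsilon=1.0))
```

The de-anonymization results mentioned above mostly target naive anonymization, such as dropping names or coarsening fields; differential privacy instead gives a quantified bound on what any single record can leak, at the cost of accuracy.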
With homomorphic encryption you don't have this issue, because the computation happens on the encrypted data itself. But it supports only very basic operations, like addition and multiplication, so it's still a research area. Here's a practical use case; I think in the talks before we saw a similar scenario for health care data. In practice, the client encrypts the blood sample values under the public key and sends the ciphertexts to the server; the server does its processing directly on the encrypted values, needing only the public key; and when the server sends the result back, the client decrypts it with its own private key. Everything happens on encrypted data.

The big tech companies have their own approaches to this. Microsoft and IBM released a paper a while ago called Pinocchio, which is an approach to trusted, verified computation. It's not just an academic effort; it's a much faster approach. It's also the concept behind zk-SNARKs, the zero-knowledge proofs well known in the cryptocurrency space; Pinocchio is a different implementation, but it uses the same underlying idea of verified computation. It's fast enough that you can take a program, run it on an untrusted server, say, the hacked Supermicro server from before, and then check the output against a proof, all in software. Then there's SideTrail, another verification effort from Amazon doing similar work. And in the blockchain and cryptocurrency space, these are the main efforts in this area.
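The client-encrypts, server-computes, client-decrypts flow above can be sketched with a toy additively homomorphic scheme. This is textbook Paillier with deliberately tiny hard-coded primes, insecure and for illustration only; it shows a server adding two values it can never read (needs Python 3.9+ for `math.lcm` and modular-inverse `pow`):

```python
import math
import random

# Toy Paillier keypair with small fixed primes -- INSECURE, demo only.
P, Q = 293, 433
N = P * Q
N_SQ = N * N
G = N + 1                       # standard generator choice
LAM = math.lcm(P - 1, Q - 1)    # private key

def _L(x: int) -> int:
    return (x - 1) // N

MU = pow(_L(pow(G, LAM, N_SQ)), -1, N)  # private decryption helper

def encrypt(m: int) -> int:
    """Encrypt m < N under the public key (N, G); randomized via r."""
    r = random.randrange(1, N)
    while math.gcd(r, N) != 1:
        r = random.randrange(1, N)
    return (pow(G, m, N_SQ) * pow(r, N, N_SQ)) % N_SQ

def decrypt(c: int) -> int:
    return (_L(pow(c, LAM, N_SQ)) * MU) % N

# Homomorphic property: multiplying ciphertexts adds the plaintexts,
# so the "server" computes a sum without ever seeing 42 or 58.
a, b = 42, 58
total = decrypt((encrypt(a) * encrypt(b)) % N_SQ)
assert total == a + b
```

Fully homomorphic schemes extend this from addition to arbitrary circuits, which is exactly where the performance cost mentioned above comes from.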
You have CryptoNote, the protocol behind the cryptocurrency Monero, which uses ring signatures to protect transactions. Just last month they took research coming out of Stanford called Bulletproofs and actually implemented it on their blockchain, which reduces transaction sizes a lot; so this is research happening really fast and being deployed in the real world.

After that we have zero-knowledge proofs, probably the most interesting development in cryptography. You have a prover and a verifier, and, say, at the data generation phase, you'd like to prove that you actually own some data without revealing the data itself; that's the zero-knowledge part. The cryptocurrency Zcash implements this for financial transactions.

One step after that is TEEs, trusted execution environments. You have something similar implemented in hardware wallets, but in particular there's Project Enigma from MIT, which uses this technique to create "secret contracts" on Ethereum. The trusted execution environment they use is an implementation from Intel called Intel SGX: an isolated environment inside the CPU where developers get reassurance that the code being executed cannot be tampered with, even at the hardware level. That's how they create privacy-preserving smart contracts on Ethereum.

The latest advancement in the space is zk-STARKs, coming out of StarkWare, another cryptocurrency company. It's essentially the newest iteration of zero-knowledge proofs, but instead of asymmetric cryptography and a trusted setup, zk-STARKs use symmetric primitives, specifically collision-resistant hash functions, which makes them much faster, among other benefits.
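The prover/verifier structure is easy to see in a toy Schnorr proof of knowledge of a discrete logarithm: the prover convinces the verifier it knows the secret `x` behind the public value `y` without revealing `x`. Tiny parameters, and the verifier's random challenge is simulated locally, so this is a sketch of the interaction, not a secure implementation (real zk-SNARK/STARK systems prove arbitrary computations, not just one exponent):

```python
import random

# Toy group: p = 2q + 1 with q prime; G generates the order-q subgroup.
P, Q, G = 467, 233, 4

def prove(x: int, y: int):
    """One round of Schnorr's proof of knowledge of x where y = G^x mod P."""
    k = random.randrange(Q)
    t = pow(G, k, P)           # commitment
    c = random.randrange(Q)    # challenge (normally chosen by the verifier)
    s = (k + c * x) % Q        # response; reveals nothing about x by itself
    return t, c, s

def verify(y: int, t: int, c: int, s: int) -> bool:
    # Accept iff G^s == t * y^c, which holds exactly when s = k + c*x.
    return pow(G, s, P) == (t * pow(y, c, P)) % P

x = 57                # the secret ("the data you own")
y = pow(G, x, P)      # the public statement
assert verify(y, *prove(x, y))
```

A prover who doesn't know `x` can only pass by guessing the challenge, so repeating the round drives the cheating probability toward zero; that amplification-by-repetition is the basic trick behind all proof-of-knowledge protocols.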
As I mentioned, hardware wallets have a similar implementation: a secure element inside that acts as a hardware filter between your private key and the execution environment, similar to what Apple does in its phones.

So all of this is the state of the art in cryptography and trusted computing right now. The idea of applying it to scientific research is about going from bits to atoms: applying it to companies like Transcriptic. This is a Bay Area company that aims to be the AWS, the Amazon cloud, of biology. What they do is put all the wet lab work, pipetting and the other wet lab processes of biology, inside containers and automate it behind APIs. So you create an account as a biologist, and then with code you automate all the work you would have had to hire fifty researchers to do in the past. The problem is that there are still missing pieces in the data acquisition and data generation phase: how do you trust that they are actually performing correctly?
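For a flavor of what "experiments as code" looks like, here is a small protocol in the spirit of Autoprotocol, the open JSON format that came out of Transcriptic. The exact fields below are simplified and should be checked against the real specification before any use; treat them as illustrative:

```python
import json

# An illustrative, Autoprotocol-style protocol: declare the labware ("refs")
# and a sequence of machine-executable instructions. Field names are
# simplified from the real spec.
protocol = {
    "refs": {
        "sample_plate": {"new": "96-flat", "store": {"where": "cold_4"}},
    },
    "instructions": [
        {
            "op": "pipette",
            "groups": [
                {"transfer": [
                    {"from": "sample_plate/A1",
                     "to": "sample_plate/B1",
                     "volume": "20:microliter"},
                ]}
            ],
        },
        {"op": "seal", "object": "sample_plate"},
        {"op": "incubate", "object": "sample_plate",
         "where": "warm_37", "duration": "30:minute"},
    ],
}

# The serialized JSON is what would be submitted to the cloud lab's API.
print(json.dumps(protocol, indent=2))
```

Because the experiment is now a machine-readable artifact, it can be versioned, diffed, and re-run exactly, which is also what makes the trust question above tractable: a signed, attested execution of this blob is something cryptography can actually talk about.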
The actions you tell them to run, right? Obviously, if you have your own research team, you have your own eyes to check all that. But if you outsource and delegate possibly mission-critical research to something like this, which can be on the other side of the world, how do you trust that the research and the data have not been tampered with or compromised, that they haven't changed the process you delegated to them? So the main idea is to apply all the previous techniques, including trusted computing at the hardware level, on these chips in these manufacturing facilities, to really go from bits to atoms and complete the pipeline of automated, AI-driven scientific research.

Let's see if this works. This is just an overview of their lab. There's a robot that automates the pipetting in this case, and you can code all of that and save tons of time. There are still a few missing pieces; the final vision would be integrating all these processes into one pipeline where you can go from your Jupyter notebook and your wet lab notebook to this.

The next video:

The fundamental way in which we conduct wet lab research really hasn't changed that much since the days of Pasteur. The present actually looks a lot like the past. As I like to say, there's much more voodoo in science than anyone is willing to admit; we want to eliminate that as much as we can. People would print out pictures and tape them into their notebooks and expect that to be the way that data is tracked and passed along. My grandmother was actually a chemist back in the 40s.
I actually spent some time reading her thesis, and it was hard to tell whether the work was done in the 40s or today. Most organic chemists think of themselves more as artists than actual scientists; you get things that look more like food recipes, "a dash of this and a dash of that", and that's very hard to reproduce. Finally someone sat down and said: why don't we just take the experiment and turn it into code? The system we generated is an emergent property of thinking about how to move science forward in the most effective way possible. The Emerald Cloud Lab allows anyone with a laptop to direct experiments from anywhere in the world. It also automatically parses and stores information, so you never have to go searching through old notebooks; all the data is tied together. A robot can do a lot of scientific techniques better than humans can, and all of that can be described in code and in a database. It's very exciting to be a part of that. There's no need for people to be in the lab to actually run the experiment or to have access to the data; it's all available to anybody. We noticed this when we started going into the lab in the morning and there was no one in it, yet all these things were running. Because everything is standardized, you have all the information about how you carried out the experiment the first time, so you can very quickly reproduce that data over and over again. If an experiment is fully reproducible and fully encapsulated, you can abstract over it and build on top of it. The ECL allows us to answer much bigger questions in biology that we wouldn't otherwise be able to answer. It allows you to act as the architect of the science rather than carrying out the experiments yourself, because everything is automated. We're not doing this just to do it.
We're doing this because we think the way science has been done is not rigorous enough. One of the ways this is going to change the life sciences is by making each individual scientist much more powerful; you're only limited by the experiments you can dream up. There's a great quote about standing on the shoulders of giants, and that's how we make progress in science, but I think that undersells it a little bit: we should be standing on all of each other's shoulders.

Yeah, that's it. Questions?

Hello, over here. A lot of science, I think, involves not putting things into code but observations that arrive in a serendipitous manner. How would it work in a situation where you divorce the scientist from the experience of natural processes and it's all pre-programmed and predetermined? How do you envisage discovery in this kind of setup?

Yeah, definitely, that's a good point. This is mainly meant to automate the second step of the scientific process. The first step, which you mentioned, is the idea generation or hypothesis phase; after that you go on to verify the hypothesis and continue with the process. In this example, for wet lab biology, you would verify and maintain your experiment using these techniques instead of going through the process we know, like hiring ten PhD researchers if you don't already have your own lab. This could open the door to a lot of citizen science, which today is mostly limited to data crunching. You have initiatives like BioCurious in the US and iGEM, so citizen scientists can also do wet lab research on their own, but it's very, very limited, right?
So if you have this kind of pipeline completed, it would open doors, yeah.

Sure, sure, but that's exactly the point. Well, this would also tackle the reproducibility crisis, in this specific case for wet lab experiments. As they mentioned in the video, chemistry still has processes that look like food recipes, right? So if another researcher wants to reproduce an experiment, it's very hard these days in terms of the actual wet lab work. With this, things should be standardized to the extent that reproducibility just comes after you click another button and it's done.

Yeah, that's absolutely right. In fact, that's also the missing piece: these are two companies in Silicon Valley right now, and for this vision to work you would need at least a lot of such facilities around the world, in a more decentralized way. Absolutely, yeah.

Hi Max, you had that slide with the boxes going from bottom left to bottom right with all this fancy encryption. I was wondering if you thought there was a box not on the slide, to the bottom left: the lightweight cryptography that lets you couple the sensor, or the lab itself, into this infrastructure. Do you have any thoughts on the lightweight encryption we can actually use on the devices?

Yeah, for lightweight stuff it's hard. You can't use homomorphic encryption there, so things like trusted execution environments are more practical. Big companies created this kind of technology for cloud computing, but it's supposedly efficient enough to be explored for mobile clients too.

Okay, further questions? Otherwise, let's thank all the speakers once again.