Welcome, everybody. This is great. I did an identically titled talk with different slides 18 months ago and there were about 10 people in the room, so it's fantastic that you all turned up today. We're going to talk to you today about "When the Going Gets Tough, Get TUF Going," and we'll find out more about what that acronym is as we go along. My name is David Lawrence. And my name is Ashwini. We're both security engineers at Docker.

So today we're going to talk about where we're coming from with signing: a little look back at what signing used to be and where we are with the state of digital signing; some specific challenges in software distribution, because a lot of what we talk about in terms of signing things has added requirements and properties we want to achieve when distributing software; then how TUF solves these problems, and Notary, which is an implementation of TUF, The Update Framework; and then, assuming we have time, a very, very brief demo at the end.

So, looking back. Passing messages around securely is really not a modern-day problem; it has existed for a long time. In the old days, the way to pass news or messages was that the kings of the kingdoms would take a message, write it down on paper, seal it, and give it to a messenger, and the messenger would then get on a horse or a carriage or whatever was the norm at the time and carry it to the right recipient.

Now, there are two assumptions being made here. The first one is that we implicitly trusted the messenger to do the right thing and not mess with the message. The second one is that the seals the kings put on the messages were very special. The king's seal was sort of a mark of the king or the empire, and a good protection for making sure the integrity of the messages remained intact was that you would simply imprison or punish, or whatever the criminal proceedings were at the time, the people who tried to copy the seal. And it's actually a very real problem; it was then, and it still is now: how do you tell if a stamp is real, or a stolen one, or a lost one that a random person found?

Moving on, though: we translated the concept of seals to signatures in the modern world, and we evolved to this idea of asking people to sign documents, associating someone's signature with a mark of their identity. But this also is not a perfect solution, because what if I sign your name? How can we tell whether it was in fact your signature and you yourself signed it? That sort of brought about the need for witnesses. A witness is basically a second person who could watch you signing a document, thereby establishing that you made your mark and the integrity of the signature is valid.

But wait: how do you trust the integrity of a witness? Witnesses can be bad, and a witness could just be a friend you asked to say something in court. So that established the need for notaries. A notary public is basically a person we've entrusted to be a witness, and we implicitly trust them to be good at being a witness and to do the right thing. I'll hand it over to David to tell us what happened when we all started using computers.

So in between notary publics and people just signing documents, we actually had these really interesting machines, up on the top left. That's an autopen, famously used by presidents, and it's literally a device for forging signatures. You want the president to sign some legislation? If you can get access to his autopen, you can get whatever you want passed.

But signing in the digital world really came into its own, and really didn't exist, before we developed asymmetric crypto. So, just a quick show of hands:
Who really has a good handle on asymmetric crypto? Okay, that's good; it's probably about half the room.

So, the idea with asymmetric cryptography, for those who aren't so familiar: you generate a key that is actually a pair of data points. One of these you make public, and one you keep private and secure it as best you possibly can. Any operation done with one of those keys can be reversed with the other key. If you want to encrypt something to somebody, you take their public key, you encrypt the data, and you send it to them; they can use their private key to decrypt it. Signing, in a very simple sense, uses the private key to encrypt the information, and then you send out both the plaintext and the encrypted form. A person can use your public key to do the decryption operation, match the two pieces of data, and confirm that you actually had possession of the private part of the key.

The most well-known signing, or I should say asymmetric crypto, implementation is probably GPG, GNU Privacy Guard, and this gives us a number of very desirable properties. Confidentiality: as we said, we can use these keys to encrypt things, but in the world of signing, encryption isn't specifically what we're interested in. That may be a requirement for a specific use case, but we're not going to address it explicitly. More importantly, we get integrity: we can check that a message hasn't been tampered with by using GPG signing.

Through the web of trust we also get some degree of authenticity. The idea of the web of trust is that you meet people, you verify their identity, and you use your private key to sign their keys; in doing so, somebody can then determine that there is a trusted relationship between you and that person, at least as far as identity is concerned. The more people who have verified your identity by signing your keys, the more likely that identity is legitimate to a random person who looks up your key in a directory online.

Of course, this is transitive. If I have no connection with somebody who sends me a message, but maybe Ashwini does, I have to decide whether I trust that person based both on my relationship with Ashwini and also maybe on asking Ashwini, "Hey, how well do you know this person? Should I actually trust them?" So authenticity isn't perfect here, but we do have some mechanism for getting trust between two parties who don't necessarily know each other directly in the real world.

And then we also have non-repudiation, which in classical signing is a highly desirable property. This means that somebody can't come along later and say, "No, no, that wasn't me, I didn't sign that thing." Of course, as we mentioned with losing seals, you can also lose your GPG keys, and the lost-stamp or lost-key problem is a real issue: I may not be able to repudiate the signature itself, but I can say, "I lost my key; whatever was sent to you wasn't from me, somebody else had my key."

The main problem with this is revocation. Revocation around keys is, in general, frequently based on blacklists, and blacklists are bad. They're straight-up bad. There may be cases where you have to use one, but you should always prefer a whitelist. The reason for this is that blacklists fail open: if you fail to get an update to the blacklist, or something never goes into the blacklist in the first place, you continue to trust something you should not be trusting anymore.

So let's continue by looking at some of the interesting lifecycle properties of software that go beyond just classical signing. Software is very, very special and has some very interesting properties.
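Before leaving classical signing behind, the sign-and-verify round trip described above can be sketched with a toy example. This is purely illustrative, with tiny made-up numbers; real signing uses vetted crypto libraries, large keys, hashing, and padding schemes.

```python
# Toy RSA-style signing: tiny primes for illustration only.
# Real systems use vetted libraries, large keys, and proper padding.
p, q = 61, 53
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent (shared with everyone)
d = pow(e, -1, phi)            # private exponent (kept secret)

def sign(message_digest: int) -> int:
    # "Encrypt" with the private key: only the key holder can do this.
    return pow(message_digest, d, n)

def verify(message_digest: int, signature: int) -> bool:
    # Anyone can reverse the operation with the public key (n, e)
    # and compare against the digest they computed themselves.
    return pow(signature, e, n) == message_digest

digest = 1234                  # stand-in for a hash of the message
sig = sign(digest)
print(verify(digest, sig))       # True
print(verify(digest + 1, sig))   # False: message was tampered with
```

The second check failing is exactly the integrity property: change the message, and the signature no longer matches.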
For example, software gets old. If I sign an email, it'll still have the same meaning years later, but software needs constant updates, because as software ages we usually find vulnerabilities, and vulnerabilities require fixes. Those fixes are then usually released as a new version, and as people download the new version and update their installations, we have to deal with incompatibility issues. And the thing about complex software systems is that trying to update one component might lead you into a chain of dependencies that subsequently require updating.

So this brings us to interesting distribution properties, because once you're done updating your software, releasing it and pushing those updates to consumers is in itself a big, complex task. And as a user, downloading and updating software components securely is not very straightforward. For example, even if I'm downloading an update over TLS, and the handshake is secured and everything is great, that doesn't protect me from compromised servers, and servers get compromised all the time.

Do y'all see the picture on the right side? Wells Fargo stagecoaches were the TLS of the Wild West. They protected money in transport, but there was nothing stopping an evil bank manager from putting fake money on the coach. Similarly, if the server, or the server's owner, is in a location where they can be compelled to send you bad data, can you still trust the source?

Moreover, this doesn't give us any protection from expired mirrors. If I'm trying to download an update from a node that thinks it has the latest version but is really out of date, the update I get will appear to be the latest update, but it won't really be, and unless I actually go back and manually check the versions and do all the matching, there will be no way for me to tell that by default. So, what do we do?
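One way to frame the gap here: TLS authenticates the connection, not the content. A minimal sketch of the complementary idea is checking downloaded bytes against a digest pinned out of band. This is just the raw ingredient, not TUF itself, and the script contents below are made up for illustration.

```python
import hashlib

# A digest pinned out of band, e.g. carried in separately trusted metadata.
# Even a compromised mirror can't produce different bytes that match it.
GOOD_SCRIPT = b"#!/bin/sh\necho 'installing...'\n"
PINNED_SHA256 = hashlib.sha256(GOOD_SCRIPT).hexdigest()

def verify_download(data: bytes, expected_hex: str) -> bytes:
    """Return data only if its SHA-256 matches the pinned digest."""
    actual = hashlib.sha256(data).hexdigest()
    if actual != expected_hex:
        raise ValueError("digest mismatch: refusing to use download")
    return data

verify_download(GOOD_SCRIPT, PINNED_SHA256)          # passes silently
try:
    verify_download(b"#!/bin/sh\nrm -rf /\n", PINNED_SHA256)
except ValueError as err:
    print(err)  # tampered content is rejected
```

Note that a pinned digest alone still doesn't solve freshness: a stale mirror can keep serving an old, correctly hashed file, which is exactly the expired-mirror problem just described. TUF layers signed, expiring metadata on top of this idea so the pins themselves stay fresh.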
It sounds like the going is getting tougher and tougher. So, as the title of the talk said, let's get TUF going. TUF is The Update Framework, developed by Justin Cappos, who is somewhere in the room here (excellent work!), and his team at the NYU Tandon School of Engineering, and I'll get the name right: The Update Framework is a holistic solution to securing the distribution of your software updates, and really of any digital content, over the internet.

So, what is it at its core? Strap in, because this is where we get into the nitty-gritty. Traditionally, we would sign individual packages. If you look at something like the Python Package Index, they still do this: every single publisher of a package is expected to sign that package themselves. This creates an enormous headache for consumers. If I have a hundred dependencies that I'm managing from the Python Package Index, then, if I'm fortunate, they will all be signed; in reality, only about five percent of the packages in there are actually signed. But assuming they all are signed, I have to go and find every single public key of every single publisher, and there is no standard way for them to get those to me so that I can install them onto my CI systems or my production servers to verify every single piece of content I'm going to download.

Now, some people have already improved on this by having the actual repository manager sign the packages, sometimes as an additional signature and sometimes, as with Apple iOS applications, by just replacing the signature. But you're still typically signing individual packages.

So, going beyond this again: what TUF does, and what some of the more forward-thinking package managers have done, is start to sign the entire collection. We mentioned blacklists earlier: if I had a bad package in the old world, I might have to blacklist a key or blacklist a package. When I sign the collection, I now have a whitelist: only packages that should be in that collection actually get signed in.

Now, what TUF does that's really useful: I sometimes still want to get the signature from the publisher. So in TUF I can define what are called delegations, which allow me to segment out part of the namespace within the repository and say, you know what, I have a different person I want to be responsible for this little group of packages here, and, adding to this whitelist, I'm actually going to take that person's key and sign it in. Now, as a consumer, you're no longer going and getting a random key off the internet. Even though you don't know the person who manages this subset of packages, you can generally assume there's probably a reasonably trusted relationship between the person who owns the repository and the person who's publishing the packages. It may just be an online account, but that's more than you going to a random website based on somebody's name and finding a public key.

Additionally, because I have a whitelist, I can remove things. I'm no longer tied to out-of-date servers possibly continuing to serve bad metadata; I can start taking things out of my whitelist as they get updated or deprecated or whatever the situation happens to be. And it's guaranteed that these updates will make it out to people, because TUF implements expiry times on every single piece of metadata. So I can guarantee that my updated whitelist, with packages removed, packages updated, and delegations and new signing keys added, will get out within a specific period of time.

How do we actually manage all of this? Well, TUF has four core roles that it's necessary to understand. The root role is responsible for anchoring all of your trust, and we'll see how that gets bootstrapped in a few slides. Your targets key is responsible for signing packages into the repository and is also the head of all of your delegations. Your snapshot key is used to sign an inventory of the other content in the repository. And your timestamp key is used to produce a very small piece of metadata on a very frequent cadence that just tells people: has there been an update you should go and get?

Now, these four keys are taken and put into a file that we call the root.json, and all of these roles have JSON files associated with them that take their namesake keys. The root.json is itself signed, self-referentially, with the root key. This starts to give us a hierarchy: we trust this root key because we're going to bootstrap this root.json in some manner, and from the root key and that signed root.json I have transitive trust in my targets, snapshot, and timestamp keys.

My delegation keys all chain as a tree, which can be arbitrarily nested, underneath my targets key. So I may say I have a delegation key in a role, Ashwini has a delegation key in a role, Jane Doe has a delegation key in a role. We may each be restricted to signing only certain packages within the broader repository, and we can also further delegate; this tree can go any number of levels down.

Now that I have this hierarchy, we can use it to treat our keys as a whitelist as well. If I need to change one of these keys, I can just change it at that point in the tree, sign in a new record with the parent key that I'm chaining that trust off, and eventually this goes all the way back up to my root key.

The root key is interesting, because obviously I don't have any implicit chaining of trust down onto my root key, so how would I go about changing that? Well, TUF defines how to do this. I take my existing root file, the one I've already got bootstrapped. I generate a new root file. I generate a new key to replace the old one.
Then I sign my new root file with both my old key and my new key, and this gives me a chain of trust from the data I know to the data I'm trying to acquire.

The keys here have different security profiles. As you might guess, the frequency with which you need to use a key dictates where you're going to have to store it: if you need to use a key more frequently, you need it in a more accessible place, and the TUF keys scale reasonably well in this regard. Your timestamp key is probably held online somewhere, likely on a server that doesn't get any direct access from the broader internet, but it's going to need to timestamp automatically on a regular basis, so it has the weakest security properties. At the other end of the scale is your root key, which you should need very infrequently. You typically only need it to re-sign your root.json file when it's about to expire, or when one of your other keys has been compromised. So you can store it somewhere like a bank vault, and ideally put it onto signing hardware, something like a YubiKey, bringing it online only when you actually need to do signing operations with it.

Furthermore, if you want to add additional protection on top of this, TUF supports a really cool idea: thresholding. For any of the roles in my system, I don't have to trust simply one key; I can define a set of keys and then require some subset of them to create a valid piece of data. So imagine you were worried about our situation from earlier, where maybe the owner of the repository can be compelled to publish bad data. I could generate ten keys for signing a given role, distribute them to ten different countries around the world, and require that, say, four of them have to sign to make any piece of data valid. This makes it very difficult for somebody to be compelled, and even if one person doesn't implement good security practices, the compromise of their individual key is not enough to publish bad updates.
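The thresholding rule can be sketched as a simple counting check over distinct authorized keys. This is a toy: real TUF metadata carries key IDs and actual signatures to verify, whereas here `signed_by` just stands in for the set of keys whose signatures have already checked out.

```python
# Toy threshold check: metadata for a role counts as valid only if at
# least `threshold` *distinct* keys from the role's authorized set
# produced good signatures over it.
def meets_threshold(authorized_keys: set, threshold: int, signed_by: set) -> bool:
    valid_signers = authorized_keys & signed_by  # ignore unknown/rogue keys
    return len(valid_signers) >= threshold

# Ten hypothetical keys distributed around the world, threshold of four.
role_keys = {f"key-{i}" for i in range(10)}
THRESHOLD = 4

print(meets_threshold(role_keys, THRESHOLD, {"key-0", "key-3", "key-5", "key-9"}))  # True
print(meets_threshold(role_keys, THRESHOLD, {"key-0", "key-3", "key-5"}))           # False: only three
print(meets_threshold(role_keys, THRESHOLD, {"key-evil", "key-0", "key-3", "key-5"}))  # False: rogue key ignored
```

So a single compromised key, or even three, cannot publish valid metadata on its own, which is the property being described.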
So for people who need high-security environments, this is an incredibly powerful feature.

Now, we talked about how we get the initial bootstrapping of trust. Much like how you'd get your CA certificates, there's a single download point that you trust at one point in time, and you download it over TLS whenever you download something that's going to use The Update Framework. Purely as an example here (as far as I know, Canonical is not using TUF): when you go and download your Ubuntu ISO, you would get an initial root.json file as part of that download, and every subsequent update you do to your TUF repository is bootstrapped with that initial root.json you downloaded.

So let's have a look at what the actual update flow looks like. From my pinned root.json, I'm going to go and download my timestamp file. I don't have any other information about my timestamp file other than that it should be small, so I limit how much data I'm going to download. I verify that it's signed with the timestamp key I have from my root.json. Now, my timestamp file contains a checksum of my snapshot; remember, my snapshot is my inventory of everything else in the repository. So from my timestamp I go and download my snapshot and verify the checksum to make sure I have integrity, and I also verify the signature of the snapshot based on my root.json file.

From my snapshot, first I'm going to make sure that my root.json is up to date, because my snapshot will tell me if there has been an update to the root.json. If there has been, I'm going to go and download the updated root.json, run through that key rotation check we just saw to see if any keys have changed, then replace my existing pinned root.json with the new one and start this update flow again.

When I eventually get back to my snapshot file, I then look up the checksum of my targets file, go and download it, verify its integrity based on the checksum, and verify the signature of my targets file. Then, using that key hierarchy we had, my targets file is going to tell me which delegations I have and which keys are associated with those delegations, while my snapshot is going to give me all of the checksums of the delegation files. I'll use those pieces of information to go and download all of the delegations, verify their checksums, and verify their signatures, until I've downloaded the entire tree of my TUF repository.

So what does all of this work give us in terms of improvements to our software distribution pipeline? Let's take a look at the improvements TUF actually brings into the picture. First of all, whitelisting: one of the best things about the TUF model is that everything is based on whitelists. We are inherently trusting only valid things, and all the invalid things are omitted by design, by default; and not just updates to your packages, but also updates to the keys you use to facilitate those updates.

This leads us to improved key management. As seen earlier, TUF provides a robust model for rotating and managing keys without consumers having to manually configure or set up anything, and the updates are pushed to users in a timely manner as soon as they're published. This is pretty awesome: these timeliness guarantees solve the expired-mirror problem I mentioned earlier, where the server you were asking for updates didn't really know what the latest version was and was giving you the answer it thought was right, but which wasn't really right. All the mirrors and servers can be expected to have the latest updates for consumers.

So now that we've talked about what TUF is and how it works, you might be thinking: how can I use it? It is true that The Update Framework needs a lot of work, from the spec to signing, to get to the state where you can just sign things. But here's the good news.
We've done all the hard work for you. The screen you see is basically Notary, a Golang implementation that we at Docker built; it was recently accepted as a CNCF project, it's open source, and it is ready for use.

Let me quickly go through the architecture of Notary to get an idea of how it all comes together. Notary has a client library for integration with your use case, as well as a CLI. This CLI, or client, talks to a Notary server, which deals with storing and serving the signed metadata, and it validates an update when it is published to make sure it doesn't break anything. The signer does the timestamping; timestamping involves signing keys, and those keys are encrypted and stored in a database. Well, that's a lot of high-level overview; let's actually see this in a demo.

Okay, so who's done this at some point in their life? Come on, admit to it. I think there are a few people too embarrassed to put their hands up in here. We're all in the same boat; we're all bad people, and Obama's displeased with us.

So let's have a look at how we can... man, I should have closed some things before we started the talk. Let's have a look at how we can improve on this. Oh, it's not switching; I think I have to stop presenting. Yeah. And now nothing. Ah, Windows. All right, can everybody see that okay? All the way at the back? All right, great.

So, let's do our little curl command. Totally trustworthy. And... yeah, that's not what I expected. Fortunately, this is my website.
I keep a backup of it. But before I run an update, let's actually initialize a TUF repository, sign the correct copy into the TUF repository, publish that, verify that it now blocks the download of the bad data, and then do our update and verify that we're getting the correct script.

So, we have this Notary CLI. It comes with many, many commands for key management and for dealing with changelists; Notary operates on a sort of changelist concept, where it stages changes before eventually publishing them. In particular, we're going to be looking at this command, verify, which allows us to pipe through small amounts of data, or use files if we're dealing with larger amounts, to make sure that something is good and as we expect.

So we're going to initialize a repo, and we're going to call it trustworthy.com, because that's the website we're initializing it for. This is going to ask me for some passwords. Notary will attempt to reuse a root key by default; you can also give the CLI flags to tell it to use specific root keys. Then it's going to go and generate those targets and snapshot keys we had in the slides, so it's going to ask me to enter the password twice for each of them, just to confirm. And note it didn't ask for a timestamp key. (We want to make sure we have time to get through the demo, but there's also a salon tomorrow, which is another great chance to dig in if you have time.) So, it didn't ask for a timestamp key because it reached out to my Notary server and requested a timestamp key for our trustworthy.com repo. Note that all of these keys are ECDSA with the P-256 curve by default, if anybody's interested.

So I'm now going to add my awesome script to this repo. This is the name I want the target to have within the repository, and then I'm going to give it the copy from my backup, to make sure I'm getting the correct copy of it. So that's been staged for publishing. Let's publish this.

Okay, it's going to ask me for my targets key so I can actually sign that new package, or that new script, into my list of targets that I want to be available. And then it's going to ask me for the snapshot key so I can update the inventory. Note that the Notary server and signer also support rotating the snapshot key to the server, which is really useful in the case where you have lots of delegations to manage.

All right, this says it's been published. Let's list it and just make sure it's in there. All right, perfect. Let's run that verify command. And... can you tell I was testing this out earlier? Fantastic: it tells me that the data I'm trying to download does not match the data in my trusted collection. So let's restore from my backup and then just run this again.

So it printed the script, and we can actually pipe the end of this to bash and it will execute. (Bash gives a really unpleasant error message in the case of Notary denying the script.) But we can pipe it to bash, and it runs my script, and it's done a verification: it's made sure that the data we downloaded matched the awesome.sh target from the trustworthy.com repository that we were trying to download.

Now, you can extend this out. There is an integration called Docker Content Trust, built into the Docker CLI, that uses Notary for signing Docker images. It can be used against any Docker registry. If you happen to be a Docker enterprise customer, that comes with instances of the Notary server and signer that you can already operate against, and it also comes with policy management, so you can actually use these signatures to gate deployments going out to production; all useful features for high-security environments.

All right, so, wrapping up. I think we'll have a minute for questions. As I mentioned, Notary is an open-source library; it's part of the CNCF, and if you all thought this looks really cool, we invite you to come contribute.
There are a bunch of things we're looking into for the future of Notary, and it would be great to see you all tomorrow at 11:10 a.m. in meeting room 4A. So, yeah, thank you. I'm not sure how much time we have for questions. About four minutes? Okay, cool.

[Audience question] So, we support other algorithms if you bring your own keys, though really, when I say other algorithms, I mean RSA. And, to pat myself on the back, I feel like we did a reasonably good job of abstracting the ciphers we support from the rest of the code base, so if anybody wants to add additional ciphers, it is relatively straightforward. Certain bits of the command line, like the key generation commands, definitely let you configure which cipher to use, and then you can tell it to use specific keys for specific things. We started off with ECDSA and RSA, with RSA as the default, and it's just so slow: every time you download, you're doing at least four signature verifications, which takes meaningful time when you're using RSA keys.

[Audience question] You could do that by setting up your own signing. So actually, there's something really interesting, and you should come to the salon tomorrow: The Update Framework has an update mechanism called, not TUF enhancement proposals, but TUF Augmentation Proposals, TAPs. There is a proposal, which I don't think has been merged yet, for multi-repository trust. So you could actually have PyPI as, like, the official source of packages and then have your own trusted subset, so that nobody can insert a package into your subset that isn't also in PyPI, and you get that sort of bidirectional trust. So, that's not in Notary; it's still being debated in TUF, I think. Oh, it's been accepted? Yeah. I think this gentleman was next.

[Audience question] So, the architectures. Fortunately, Docker images at least, which I believe still conform to the OCI spec and are intended to, are actually a Merkle tree. The manifest, or the multi-arch manifest (there's now something called the manifest list), contains checksums of the layers, and the layers are tar files that you go and download. So we sign whatever the top-level item is, whether it's a multi-arch manifest or a single-architecture manifest, and from there you have checksums of everything else you need to go and download. Specifically, the metadata that's attached contains everything you need to make the signing work.

There's about one minute left if anybody has any more questions. Otherwise: salon tomorrow, 11:10, meeting room 4A, a great chance to get more information. Justin will be there, we'll be there. So, yeah, hope to see you tomorrow.