For those of you who, like me, have teenagers, you know the story of trust: they will never clean up their room, whatever they tell you. And that's basically also part of this talk. Nerd alert — that slide was from a previous conference, but I guess here no alert is needed; everybody here qualifies. To narrow the scope down: we're really only going to focus on getting things from source to production. So not running in production, because that's a whole separate field of dealing with tampering and so on. I'm not going to tell you scary stories about security, because that's all too easy; you all know it's important, I don't have to explain that to you. I'm not talking about compromised hardware, because that's another vector — nation-state attacks — I'm not going there either. I'm not talking about social engineering, tricking you into doing stuff you don't want to do. I'm not talking about physically bypassing access to your server rooms. I'm trying to compress the talk as much as I can. I'm not talking about key management, although it's important, and here at FOSDEM you probably want to manage that stuff as well. Not talking about hardening your servers. Okay, so we're almost there. What I am going to talk about is tampering: chasing unauthorized changes. I leave this slide in here because of promise theory, for those who know it. The word "trust" appears a lot in there, but it's trust through verification, and that's going to be the crux of this talk: how do we verify all this stuff? I don't have an answer for everything, so this is a research talk. It's going to be a bit all over the place with the things I found, and I hope that in the next iterations of the talk I'll get better at providing solutions as well.
So I went as far as I could with what I know right now from my research. We're going to start with the laptop. I know it's a curse to put a Mac up here at an open-source conference, but that's a lot of the reality. And we already have to trust the hardware vendor; we have to trust Apple; we have to trust the operating system. So I'm going to take that as the starting point: the developer's laptop, with the certificate authorities on there, with everything in there, and kept updated. I start from there. These days a lot of development is really assembling more and more libraries. If you're doing development on a Mac, there's the App Store, there's Docker, npm if you're doing Node, Homebrew — because Apple doesn't provide all the tools you actually need. It's nice to have the community, but it means we depend on a lot of other libraries before we can even start our job. So the first thing we trust is all these external dependencies, and most of them are based on some form of TLS verification, a chain of trust. Well, I don't have to explain to this crowd about configuration, ciphers, certificates — all the stuff we assume is happening underneath, because we're using HTTPS and it should all magically be taken care of, right? Obviously there are leaked or rogue CA certificates, but that's another story; we leave that to the pros. So I'm taking the example of a Node app that we're trying to push to production. Node by default supports a lot of ciphers — it has to, because it's quite general-purpose. So we're trusting it, but we can get better at locking this down. It's as easy as setting some environment variables and saying which ones
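As a minimal sketch of that idea — the suite names below are illustrative, not a recommendation, and this assumes a Node version that accepts `--tls-cipher-list` inside `NODE_OPTIONS` — the cipher list can be pinned through the environment:

```shell
# Restrict the TLS cipher suites Node will offer, instead of accepting
# everything it supports by default (suite names are illustrative).
export NODE_OPTIONS="--tls-cipher-list=ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384"
echo "$NODE_OPTIONS"
# node server.js   # would now negotiate only the suites listed above
```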
we don't want. So it's a simple thing you can do if you don't want to just rely on Node doing its default thing. And I have to keep taking my glasses on and off, I'm sorry. What's interesting is that we think about certificates being validated. Yes, most of the time the library actually validates whether the certificate has expired, but it doesn't look at revocations. There are CRLs and so on that you can check against, but in several of the libraries that's simply not available. Let's Encrypt has made this somewhat better, but we're still not there yet. If you want to go a step further: you've checked that the certificate is valid, you've looked at the revocation status; the next step is asking what other certificates got created for your domain. Facebook provides a nice interface, built on Certificate Transparency logs, for checking whatever certificates were issued for your domain. That's a way of detecting rogue certificates being issued under a domain that you control. You can put a webhook on it, monitor your domain, and get notified about certificates being created there. And if you want to go one more step, you can put a CAA record in DNS, which says by which authority a certificate for this domain may be issued — because maybe there's some dodgy CA out there that would issue a certificate for your domain, and here you can specify in DNS which CAs you trust. I'm not saying this is widely used, but it's one of the trust verification mechanisms we can use. So we've gone a little deeper on HTTPS and validation than usual.
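For a hypothetical domain, such CAA records could look like this in a zone file (domain and CA are placeholders; `dig +short CAA example.com` shows what a domain currently publishes):

```
; only the named CA may issue certificates for this domain
example.com.   IN  CAA  0 issue "letsencrypt.org"
; and disallow wildcard issuance altogether
example.com.   IN  CAA  0 issuewild ";"
```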
Yes, we have HTTPS and we're happy — there are a couple of things we can do there. But obviously the next problem is DNS, right? Because before we can make the HTTPS request, we have to make a DNS request, and I'm sorry to say that one still isn't solved. Many would think DNSSEC is the answer, but if you look at a few of the major players, like npm and GitHub, they don't have DNSSEC enabled. That was a surprise to me. I was thinking: DNSSEC, everybody should be doing it. I know it's hard, but I would have expected these companies to be doing it. Even curl doesn't check it — there's a six-year-old open pull request for it. My trust in curl has gone way down. I'm not saying it's not a good tool, but obviously it's people, and the more you look at things under the hood, the scarier it gets. Okay. So we got HTTPS out of the way, we kind of trust that, and DNS, we just assumed that trust as well. And then the next thing you do is what Homebrew tells you: run this Ruby script, fetched from GitHub, on your laptop. How many of you have done that? Yeah, scary, right? I don't know why we don't seem to be able to do better there. There's a tool I found, pipethis, that lets you actually view whatever is being piped into bash, so you can review the commands before they're executed — it gives you some visibility into what's happening. But again, we just take it for granted, right? So Homebrew: let's do it. Node is a little better, in the sense that you can at least do the download with curl and check the validity of the binaries; they have GPG signatures, which is nice. Okay, I'm just going to go back to this slide. But it still is a whole ceremony to go through, right?
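The curl-pipe-bash pattern can at least be split into download, review, verify, execute. A sketch of that flow, with a local stand-in for the real download (in practice the first line would be something like `curl -fsSL -o install.sh <url>`, and the checksum would come from the project's release page rather than being generated on the spot):

```shell
# Stand-in for: curl -fsSL -o install.sh https://example.com/install.sh
printf 'echo hello from the installer\n' > install.sh

# 1. Review what it will actually do
cat install.sh

# 2. Verify against the published checksum (generated here only for the demo)
sha256sum install.sh > install.sh.sha256
sha256sum -c install.sh.sha256

# 3. Only then execute
sh install.sh
```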
I rarely do it myself, but it's one of those things that should almost get baked into the tooling in the future — a step that just says: I downloaded it, it's validated, all the checks ran. Currently, across all the package repositories, there's no standard way; everybody does it differently, you have to figure it out yourself, and sometimes it's not even well documented, so you have to spend a lot of time to actually verify the stuff. Docker was an interesting one. I wanted Docker Desktop on my laptop, and I could not find a checksum anywhere on a website that would let me verify the DMG I was downloading. The only way I could verify it was to install it on my Mac through the App Store, and then I could verify the signature — that part was documented. But then again, we go through the chain of trust where we have to trust Apple. So that was a little surprising to me, that we couldn't just take the Docker binary and verify it directly. Okay. So I've got Node, I've got Docker and Homebrew, and now I'm going to install some libraries, right? If you look at the libraries: how do you decide that a library is secure, right?
Oh, if a million people are using it, it's secure. If it's been around a long time, it's secure. I haven't really found a good way. Obviously you check multiple signals, but there is no single good way to verify a library. There are a lot of vulnerability scanners that will report CVEs and such on something, but if it's a library that isn't well known, or something new, it's really hard to verify — unless you download or compile everything from scratch yourself and go from there. npm install is interesting, because like many package managers it also runs scripts when you install, so it can change a lot of stuff on your laptop. You can, for example, run npm install with --ignore-scripts, so they don't run, get a preview of the scripts it was going to execute, and only then do the real install. It's a next layer where you can verify what's actually going on underneath. It's very clunky, of course. And then there was the advertising in npm — who remembers that happening? Nobody? For a while, when you did npm install, instead of only showing the packages it installed, it would show advertising in your CLI. There was a lot of debate about it, but it does show you what these package managers can do to your system if somebody wants to misbehave. So you do the scanning and you find all the bugs. One of the problems, obviously, is all the dependencies in those libraries: a bug might be, say, a DoS vector, but you're only using the library internally — so even with all the scanner tools it's really hard to figure out where to spend your patching time.
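A sketch of that flow. The package and its postinstall script are faked locally here so the example is self-contained; in practice you would run `npm install --ignore-scripts` first and then inspect `node_modules/<pkg>/package.json` the same way, before letting the scripts run (e.g. via `npm rebuild`):

```shell
# Pretend this package just arrived via: npm install --ignore-scripts
mkdir -p node_modules/example-pkg
cat > node_modules/example-pkg/package.json <<'EOF'
{
  "name": "example-pkg",
  "scripts": { "postinstall": "echo this could be arbitrary code" }
}
EOF

# Review the lifecycle scripts npm would otherwise have executed silently
sed -n '/"scripts"/,/}/p' node_modules/example-pkg/package.json
```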
So that's one of the problems: prioritization. You can take the scanning of libraries a step further and do it at the proxy level. Something like JFrog Xray lets you do the scanning when packages come in through an upstream repository, and block the download if they aren't verified, so you don't have to do it on your laptop. Obviously you then have to trust JFrog Xray, but it's another layer of control you can put in place. This slide was just to point out that the only thing Homebrew actually does is check the checksum of the tarball it downloads; the recipe — the brew formula itself — is not signed. And the maintainers' answer was basically: we're an open-source project, we don't have time for that. So it's kind of interesting that for something we use so much on our laptops to install stuff, this is the state we're in. Node is a little better: they sign their releases and they have checksums. You can see these are all manual steps you need to go through, but at least you can verify it — you can get their GPG keys and check the stuff. So that's nice. And obviously, when you commit things yourself, you can sign your commits with your own GPG key and push them to Git. And I found an interesting project that does multi-signing of a commit — not just one person — so you can say: before this goes to production, multiple people have to sign that commit.
So that's another way of verifying things. If you're using GitHub or something similar, the fingerprints of the official host keys are documented, so when you push, you can verify those as well, instead of just saying "yes, I trust this on first use" and pushing to the server. Otherwise you could be pushing to anything that presents an SSH key at that point — but here you can verify it's the official one. That brings me to the principles of TUF. Who has heard of TUF? A few people. It's known mostly in the Docker sphere, and it's spreading towards more applications and package distributions. TUF stands for The Update Framework. They made some principles and guidelines for people pushing things to repositories. Separation of duties: multiple keys, not one key that can push everything. Thresholds: much like the multi-signed Git commits, multiple people — depending on the step — have to sign off, not just one.
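For GitHub that comparison would be `ssh-keyscan github.com | ssh-keygen -lf -`, checked against the fingerprints GitHub publishes in its documentation. A local, offline illustration of the fingerprinting step itself (the key generated here is a throwaway):

```shell
# Generate a throwaway host key and print its fingerprint -- the same
# "SHA256:..." format you compare against a provider's documented values.
ssh-keygen -q -t ed25519 -N '' -f ./demo_host_key
ssh-keygen -lf ./demo_host_key.pub
```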
So that builds consensus, much like the certificate story I talked about earlier. Revocation of keys is really important, so it's built into the framework that keys can be rotated fast, to make sure there's no delay. Offline keys: the root keys don't need to be active all the time, so they can sit in offline storage, and the day-to-day trust moves to shorter-lived keys. And no need for key sharing: everybody can be delegated their own key, depending on the job they do. So look it up: TUF, The Update Framework. I'm rushing over it here, but it's an interesting read as a concept. And this is what's actually used by Docker Notary when you want to push images that are signed and secure — those principles are used by Docker Notary. You create a key, you sign, very much like with any of the package repositories, and then you can inspect whether it was signed correctly, like you would with a Git commit, but now for a Docker image. This actually helped contain a major breach at Docker Hub: because certain steps required multiple people, when one of the keys got compromised they could recover, since these principles were in place in the setup. So we've talked about Node code, or Ruby, or whatever code; we've talked about Docker images. But there's also something you can do to verify, for example, JavaScript libraries in the browser: you can publish a checksum and have an integrity check when the code gets loaded in the browser — Subresource Integrity. So it's checksumming all the way down; we can verify more and more, and that's a good thing. And I'm taking it even further.
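Subresource Integrity boils down to putting a base64-encoded SHA-384 of the file into the script tag; the browser then refuses to execute the file if it no longer matches. A minimal sketch with openssl (filename and contents made up):

```shell
printf 'console.log("hello");' > lib.js

# base64(SHA-384) of the file: the value SRI expects after "sha384-"
HASH=$(openssl dgst -sha384 -binary lib.js | openssl base64 -A)

echo "<script src=\"lib.js\" integrity=\"sha384-$HASH\" crossorigin=\"anonymous\"></script>"
```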
We've talked about code, but — and this is probably the simplest example — in the days when I was running Apache, `apachectl -t` was my friend for validating the config. I'm not saying that's trust, but at least it gives you some idea of what's happening. And if you want to take that further, there's now something like tfsec, which can check your Terraform code and flag things that shouldn't be happening. I think that's getting powerful, especially for cloud configs, that we can do that kind of verification as well. Taking it one step further — I'm just going up and up — what if I don't trust my laptop every time I do something? I think Jessie Frazelle coined the idea of running every application inside a Docker container on the desktop; that was a 2015 blog post. Since then there have been a couple more: Visual Studio Code lets you develop inside containers, and we're getting better at things like remote debugging. Fedora has a similar project. And something like Qubes OS goes a step further: the way I understand it, they give every application its own network stack, its own everything, and everything stays separate and ephemeral. It's actually used by journalists, to make sure that when they're putting things in the news, they have a safe environment where they can edit material and be sure it hasn't been tampered with. So one of the points is: I know a lot of people say servers are cattle, not pets, and I think we're heading in the direction where the same will be true for desktops. When I said that, I heard developers screaming: I can't give up my autonomy! And it's like: yeah, I've heard that before, ten years ago, in the server world. Giving pets names.
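The same "lint before you trust" move exists for most formats: `apachectl -t` for Apache, `terraform validate` or `tfsec` for Terraform. Even plain shell has it, which makes for a runnable illustration of the pattern:

```shell
# Create a deploy script, then syntax-check it without executing it
printf 'echo deploying\n' > deploy.sh
sh -n deploy.sh && echo "syntax OK"   # -n parses only, runs nothing
# Same idea, other formats: apachectl -t, terraform validate, tfsec .
```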
We were just discussing that — Pokémon names or whatever. So: we've got the code verified, we've pushed it to the Git repo, and now hopefully the CI kicks in. But what's interesting is that this is often outside our control. So it's "who watches the watchers" — who trusts the people we trust? This is a nice paper to read if you want to know more. I don't know if anybody knows that project, but in a fun way it shows that anything can happen in your CI system without you knowing it. And that makes you think: if you use a SaaS solution, how do we get that same trust? We can't verify their binaries, we can't inspect everything; we pay them, or we use them, and we just assume they're secure. That's a big leap of faith. I don't have a perfect solution for it, but they are obviously getting better at exposing what they do. They show you the images your builds run on; they share that officially. But we have no idea where they're actually running those images. So it's better, but still. Some offer a way to run things on-prem: they do the same orchestration, but it runs on your machines, so at least you get audit logs and can see what commands get executed. It's not perfect. Some will officially announce that there was an audit of their code — that's how, more and more, they're building trust. Some allow you to limit IP addresses: say your build system is remote and it has to push things into your environment; then at least you only allow the IP addresses of the systems that should be pushing to you. And here's a nice little trick: if you use any of the AWS CLI tools, you can change the user agent, and you can have an IAM policy that says only this user agent is allowed.
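An IAM policy condition of that flavor might look like this — the agent string is the made-up "secret", and note that `aws:UserAgent` is documented by AWS as spoofable, so this is obscurity on top of credentials, not a replacement for them:

```json
{
  "Effect": "Deny",
  "Action": "*",
  "Resource": "*",
  "Condition": {
    "StringNotEquals": { "aws:UserAgent": "deploy-bot-8f3a2c" }
  }
}
```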
So you can turn that into an almost-random key — security by obscurity — that says: even if somebody has the AWS credentials, they can't actually use them unless they also know the random string you put in there. I'm not saying this is secure, but it's one more step. The people from the Bitcoin world went a step further. They asked: what if multiple people compile it, and we verify that we all get the same result? If most people get the same result, we assume it's the correct one, and that's the checksum we publish. So not just one checksum — because your own toolchain might be infected and quietly changing things right there. That gave me a nice idea: we sometimes talk about multi-cloud, so what about multi-CI? What about using two CI systems, then verifying that both produced the same thing? Again, I'm not saying this is actually being done, but it gives you some inspiration. But this brings us to the hard part: reproducible builds. Which isn't that easy, because say you have something like a timestamp in your banner or your header, or really whatever; the next time you compile, the binary will be different. Or the compiler flags, or anything like that, can change it. Debian has spent a lot of time making packages and builds reproducible. Node, if I build it twice on the same laptop, with the same compiler and the same settings, will not give me the same checksum. That's something I found interesting to see. We'd think a checksum settles it — that's what we do. One more step further: what about the operating system itself?
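The timestamp problem is easy to demonstrate, and `SOURCE_DATE_EPOCH` is the convention the reproducible-builds effort (driven largely by Debian) standardized to fix it. A toy "build" that embeds a time:

```shell
# Two "builds" that embed the wall clock: the checksums differ every run
printf 'built at %s\n' "$(date +%s)" > build1.bin
sleep 1
printf 'built at %s\n' "$(date +%s)" > build2.bin
sha256sum build1.bin build2.bin

# Fix: take the timestamp from SOURCE_DATE_EPOCH instead of the clock
export SOURCE_DATE_EPOCH=1600000000
printf 'built at %s\n' "$SOURCE_DATE_EPOCH" > build3.bin
printf 'built at %s\n' "$SOURCE_DATE_EPOCH" > build4.bin
sha256sum build3.bin build4.bin   # now identical
```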
It turns out — this was from a FOSDEM talk — that to build a Linux nowadays, you have to have a Linux. So somewhere along the way we lost the ability to build a Linux system completely from scratch. Sometimes you see bootstrap stages one, two, three when compiling GCC; some of that bootstrapping is binary-only, and people are still working on getting that into source form as well. So it's sometimes hard to verify all of it. If you want to have a look at this — I think there was a talk earlier today on it as well — it's all the dependencies of the libraries; it's not as easy as it looks. People are even working on a bootstrappable assembler, just to have a from-source, verifiable compiler to make this happen. I'm going to skip that slide. All right. So we said: make it repeatable, make it repeatable. And then the attackers say: fantastic, now I can identify every binary, because if I look at the checksum, I know exactly which binary it is. So then one company said: well, you know what, we're going to build a wrapper that randomizes the binaries again when running in production, and we map them back onto the static checksums. So we keep going around and around, but we're getting better. Checksumming is one thing; now think about all the dependencies. I think this is where Google's Bazel comes in: instead of reproducible builds, which rebuild the same binary, they're working on the concept of hermetic builds, where all the dependencies you need in the build — for your Docker image or whatever — are explicitly specified.
It's not just an apt-get update or something. And not only your dependencies, but also your compiler toolchain: Bazel will download a known set of compilers, so that everybody on your team uses the same compilers and produces the same installation. And that brings us to attack surface. Quite often the discussion around containers is: there's so much inside the container, and so many vulnerabilities. So people are going in the direction of: what if we only need one binary in the container? Distroless. That makes it a lot harder to find an attack vector in there. And this is what some people are working on: once we know all the dependencies, we have a bill of materials — an ingredient list, basically: this code has been built with these dependencies, and so on. There are many tools that let you reverse-engineer Dockerfiles, because it's sometimes not that easy to know what's inside an image. You can take multiple approaches, from forensics to reverse engineering — you can look that up later. Some interesting stuff there. But really, it's stupid that we throw away all the metadata while building and then have to reverse-engineer it again.
At a past company I worked on video, and it's almost like having a DVD with subtitles — I know that's getting old — where you have to reverse-engineer the subtitles from the image: that's just stupid. in-toto goes one step further. It specifies not only the dependency sources, but also the steps taken during the build — every step of the build gets signed — so you can verify what happened during the build as well. Then Grafeas, in the container ecosystem, can use all that metadata to decide whether something should run in production or not, based on what went into the image. When a new vulnerability comes out, they can say: this artifact was built with that component, and we now know there's a severity-one issue, so you have to do something about it. That's something that can be integrated there as well. But obviously it never ends, and we just keep going. So, I'm getting toward the end of my talk. I know all the tools I've shown you are hard to use, right? Sometimes I think about it like this: we fought so hard to get the right to vote, and then people say, well, I don't want to vote. And it's a bit like that here. We fought so hard for the freedom of using and sharing libraries and so on, and I think we have to get better at the verification step, because that's one of our duties too, to make our ecosystem more secure. And I think a lot of it has to do with transparency, and with building consensus — not one group saying yes or no. So I think that was basically my talk. I don't know if it's time for questions. Okay, anyone? [Audience question] Mm-hmm. I know there's quite some debate about whether it's useful or not.
Yeah, but I think if you automate it, that's one of the problems, because — from what I understand — the argument is that once you automate the verification step, it's no use anymore: no real verification is happening. I don't know if you have another idea on that. [Audience: for instance, refusing to build something that isn't signed properly.] Yeah, but sometimes you have to, I guess. Oh, okay — I'll repeat the second question: is there something that can prevent a build from happening if the commit hasn't been signed? I think that would come down to doing the verification and then returning an exit code. I haven't come across any specific tool for it. Thank you. [Audience question] Yeah, okay, so that's another talk, like I said — because I think that's actually more dangerous. It is more dangerous, but it's another aspect of looking at things. Here I wanted to stay in prevention mode, not in — sure. [Audience question] Yes, but the visibility of knowing what is running in production — and I'm not talking about intrusion detection or logging or auditing — if you know what is running, you can at least check the vulnerability databases, with some of the tools that do that, and you're better informed about whether you need to patch for a zero-day as fast as possible. I'm not saying you don't have to protect your servers anymore, but the visibility is what I've tried to show here: if you verify and make things visible, at least you can see it. If you don't know it, you just live in blissful ignorance — which might be good as well. One more.
Yeah. [Audience question] Yeah, that is what I think most of the distros' package managers are using. [Audience question] It depends — it could or it could not, right? If I have an S3 repository with my binaries here and my checksums there, on the same bucket, and I have access, then yes, I can alter both. So that's not really a solution. But if I have the checksums on a website, on a different system, and my binaries over here, then yes, it makes sense. No, but it depends on how you define "same source": is it the same company, another company, the same website, the same server? So it depends — the more people can see it, the better it gets. Yes. Okay, I'm sorry, that's it.