 After receiving his PhD in mathematics from the University of Notre Dame, Dr. Miller was a computer hacker at the National Security Agency for five years. Since then, as a consultant, he has worked with Twitter's information security team, has won the Super Bowl of computer hacking four times, and has found countless vulnerabilities in Apple products, including their laptops and phones. Currently the Senior Security Engineer at Uber's Advanced Technology Center, Dr. Miller has made waves within the field of automotive security for his work alongside his research partner, Chris Valasek. He has spent considerable time revealing weaknesses in vehicle systems, which led to the recall of 1.4 million vehicles. He's been featured on leading media outlets, including CNN, The New York Times, USA Today, and Forbes. And now he's here at ARM TechCon. Please join me in welcoming Dr. Charlie Miller. Hey, everyone. Thanks for having me here. I'm going to talk about car hacking today with you. So that was a great introduction, but I'll tell you a little bit more about my buddy Chris, who's in this picture. He did all the car research with me, and if you look really closely in the background of this picture, you'll see a Toyota Prius that we just bricked, and it's stuck on the road. That was Chris's way home, and that's why he's looking kind of unhappy in this picture. So what am I going to talk about? I'm going to tell you about some pretty cool research we did on cars, but we weren't the first people. So I'm going to walk you through some of the other research that's been going on in the world of automotive security, and I'll talk about cars in general and why, at least at this point, they're inherently insecure. 
I'll talk about the Jeep, of course, since I know a lot about that, and then I'll talk about some other car hacks that you might have seen in the news lately, and we can talk about those and how they compare to what we did, and sort of the future of automotive security. All right, so car hacking basically started, at least publicly, back around 2010. These researchers, academic researchers from the University of Washington and the University of California, San Diego, released a paper in 2010 called Experimental Security Analysis of a Modern Automobile, and basically what they did was they took a car and they plugged into the OBD-II port, the port that's federally mandated — it's underneath where your feet are in the car — and they were able to send messages to the car and make it do things physically. So they were able to do things like control the brakes, and control silly things like windshield wipers or locks or something like that. And so it was sort of the first paper that made you think, oh my God, there's a bunch of computers in my car, right? At that time, at least, I didn't even know that there were computers and networks in cars, right? It was like, this is a thing that gets you from point A to point B. So this was a pretty cool paper, but it was not well received. This is one of the authors of the paper responding on Twitter to some comments. The academic world and the computer security world, their response was, well, sure, if you're plugged into a car physically, you can control the car, right? It's just like, if I had physical access to a car, I could cut the brake lines or, you know, plant a bomb or whatever, right? So the general response was, this is interesting, but in the end we don't really care. And so this guy's response, and the academics' response in general, was like, oh yeah? Well, we'll show you — we can do this remotely. And so they did. The very next year they released another paper. 
This one was called, in typical academic fashion, A Comprehensive Experimental Analysis of Automotive Attack Surfaces. And what this was was the same car, but this time they attacked it remotely. And then they were able to send the messages to control things like the brakes and so forth. So they actually showed three different ways that they were able to get code running on the car. The first was through Bluetooth. Almost any car nowadays has a Bluetooth connection between, like, your phone and the car. And this is what allows you to do hands-free calling or, you know, play music from your iPhone in your car or whatever. But the point is, there is communication from the outside world going into the car through this Bluetooth interface. They found a buffer overflow, and they were able to exploit this through the Bluetooth interface and get code running on the radio of the car. And then they were able to control things like the brakes. Another way they did it — and this is kind of clever — they took a CD and put a malicious MP3 file on it. And if you put the CD into a regular radio, it would just sound like music. But if you put it into the radio in their car, there was a vulnerability, and they exploited it to get code running. You hear these stories about dropping USB sticks outside of a company to try to hack into their systems; well, you could just drop CDs outside of a car lot or something and hope someone would put one in their car. So the Bluetooth was kind of cool, but the bad thing about the Bluetooth attacks is you have to be within, you know, 20 or 30 meters for it to work. But the one that they found that was the most devastating was one through OnStar. OnStar is a system in certain cars that allows you to call for help, or allows the OnStar people to, you know, do something to your car, like unlock your locks if you lock your keys in or whatever. 
And so they were able to find a vulnerability in the OnStar system where they could basically dial into the car remotely from anywhere and take control of it. So that was pretty scary. And the way they did it — I don't have a video of this, but it was really funny — there was, you know, a cellular modem essentially in the head unit, in the OnStar system. And they found the phone number. They essentially dialed it up with a real phone, and they had a speaker, and they put the phone up to it, and it's, you know, making these modem noises, and they were able to compromise it that way. It's like right out of a TV show. Anyway, here is a video of them and their attack. They exploit the OnStar vulnerability and then send messages to the brakes, so they can control the brakes. And you'll notice the academics are a little more safe than us: they're on an actual track with a helmet and everything. My research is not quite as safe. So they're able to stop the car, which is pretty cool. And one of the other things they could do is lock up individual brakes. So you could imagine locking up just the front left brake of a car, and that would probably make you do something pretty dangerous, I don't know. So I read these two papers, and I was blown away. I thought this was really cool. It's something I wanted to do. And the problem was that they had basically done everything, right? Usually when you read a paper and you want to continue the research, you're like, okay, what did they not do? I'm going to finish up what they did. But they did everything — from no access, to remote compromise, to controlling systems of the car. They did it all. And so the only sort of minor criticism you could have is that they didn't release any details. They were under the belief that it was too dangerous to release the details of their attack. So they didn't say, like, what bugs did they exploit? What were the vulnerabilities? 
What messages did they send to the components? They didn't even say what kind of car it was, right? So from the perspective of someone who wanted to start researching car security, it didn't help me at all, right? They didn't have any code. They didn't have anything. In fact, here are some excerpts from their paper, and you'll see on the left there are these, you know, bytes that show what the messages were, and they X out the middle of the bytes. So you can't just go reproduce their work, for example. And then if you watch the videos of the talk that they gave, they actually block out the car, so you can't see what kind of car it is. And you can see, like on the bottom right, there's this big black blob that they're going to remotely exploit. Although we found out it was a Malibu, because if you're a super car nut, you can tell by little tiny details what kind of car it was. But anyway, so Chris and I got together and we're like, we're going to do the same thing. Well, there were two sorts of questions, right? Was it just that one car that was vulnerable, or is it all cars? So we're going to do a different car. And then we also want to get people involved, so we're going to release all of our findings, all our tools, and all that kind of stuff. Luckily, at this time there was this thing called DARPA Cyber Fast Track. It was a program that DARPA set up to try to encourage small research projects like ours, instead of ones from, like, Lockheed Martin or something like that. And so they gave us a little bit of money so we could get a car and do some research. And we did. We got a Toyota Prius — the one that we showed bricked on that slide earlier. And then we got a Ford Escape. And the reason we picked those cars was, we wanted to have a car that had auto-parking capability. 
Because the car that the academics used didn't have any kind of computer control of the steering. So we wanted that. And it had to be the cheapest car possible, because it's government money. So we basically went to a car dealer and we're like, yeah, do you have a Toyota Prius that auto-parks? They're like, yeah, yeah, what kind of color are we talking about? I don't care. Well, do you want it to have XM radio? I don't care. Just show me a Prius, and if it can auto-park, I'm going to buy it. And it was the easiest sale ever for that car dealer. So anyway, we bought these cars and we basically replicated the research that the academics had done. If we were plugged into the car, we could send messages to make it do things. So we could control things like the brakes and the windshield wipers and locks and all this kind of stuff. And what this showed was that now we knew of at least three cars we could do this to, if you count the academics' work. And so the consensus was, okay, this is not just a one-car problem; this is an industry-wide problem. And the other thing that we did that the academics didn't do was control the steering — it's not that the academic researchers couldn't figure it out, it's just that their cars didn't actually have computer-controlled steering, but we purposely bought cars that did, and not surprisingly, we could make those cars do things. So here's a video of us with a Fox News reporter, and we're driving on the highway, and we could kind of jerkily control the steering. It was really inconsistent, but we could do it sometimes. And this guy really made a huge mistake. So we were driving on the road, and like I said, we would do the exploit, and sometimes it would work, sometimes it would take like 30 seconds to work or whatever. And we're in the back seat, Chris and I, and he's in the front seat, and he's like, so you guys can control the steering? And we're like, yeah. And he's like, okay, show me. 
So Chris is typing on the computer and he hits it, and I look up and the Fox News guy takes his hands off the wheel, and I'm like, put your hands on the wheel! But it's too late. The car got out of control. So anyway, the Fox News reporter crashed, and I've never been as scared as that. But not unlike what happened with the academics, the response wasn't what we wanted. We wanted everyone to be like, oh my God, cars all have these problems where you can control the physical systems. But the response was exactly what they got, which is, yeah, sure, if you're plugged in, you can do anything. And it's like, oh my God — the academics already showed you can do it remotely. Do we have to redo everything they did? And yeah, we did, because you get responses like this from Toyota, who are like, oh, well, we really only protect the outside — like, they don't believe in a layered security approach. They didn't say exactly that, but basically they did. They believed they could just keep the outside traffic out. And just to show even more that they didn't understand what was going on, they talk about how they have state-of-the-art electromagnetic R&D facilities. It's like, well, this isn't caused by some random fluctuation of an electromagnetic field; it's a malicious attack. But anyway, that was the response that we got. So we're like, fine, we're gonna do it remotely too. Unlike the academics, who did it the next year, it took us a couple of years. But we were able to do it against a Jeep. This was now like a year and a half ago. So we were able to remotely compromise a Jeep. Again — and I'm gonna go into a lot of details about this — this would affect the car anywhere in the United States. You didn't have to be nearby the car. And the result was that Fiat Chrysler had to do a recall of 1.4 million cars, which was pretty crazy. And, like, I don't know exactly what that means as far as the company, Fiat Chrysler, goes. 
Like, I don't know how much it cost, but some people have said that it cost $14 billion to do that recall. So it's like, you could have hired a security consultant for a lot less than that. I would have worked for a tenth of that. I would have fixed it. And then, besides just picking on them, there were other good things that came out of it. There's been some talk about passing some laws to manage this, and it led, really recently, to these cybersecurity best practices for cars that they want companies to follow. So good things came out of this as well. But then, if you want to take a step back: Chris and I released everything — the vulnerability, the exploit, the code, the tools. Everything we did, we released, because we want to get other people involved, because car research doesn't scale if we rely on Chris and Charlie to do all the work. I only have one car, right? And the academic researchers, they didn't release anything. And so, I don't know who was right — who had the right approach. But the one piece of data that I will point out is that some of the bugs that the academic researchers reported took GM five years to fix. For us, we gave Chrysler nine months of advance notice and told them, hey, on this day we're gonna release no matter what you do. And they kept saying they were working on it, but they never really did anything that we could see. But then, as soon as the Wired article came out where we showed we could remotely compromise this car, within a week it was fixed — or at least a recall happened, I should say. So, in my opinion, that was the approach that got the most people protected the quickest. All right, so we didn't stop there — that was in 2015. The next year, this last summer, we continued our research. In 2015 we were able to remotely compromise the car, but we had only limited control of what we could do to the car. 
We could control the steering, but only if you were driving really, really slow. And there were some other restrictions — like, we could control the brakes only if you were driving really, really slow. So we wanted to see how far we could have taken the attack, and we showed that we could actually control the steering and the braking at any speed. So here's a video of that. This is me controlling the steering of the car. Again, we're on a test track, just like the academic researchers — no wait, this is actually a school parking lot, sorry. So anyway, we're able to control the steering of the car no matter how fast it's going. So you can imagine the end-to-end attack of compromising something and following it all the way through to controlling the steering system. It's pretty scary stuff. So there's a question: why do we all have these cars that are vulnerable, right? Why did it take these academic researchers, or me and Chris or whatever, to point this out and get the industry thinking about it? And it goes back to a problem that we have a lot in security, which is that we build these systems, and for a long time we don't think about security because we don't need to. And then all of a sudden we flip some switch and security becomes important — but we can't flip a switch and turn security on. Cars did this too. At one point in time, cars were just these things with wheels and engines, and they took you places, right? But then we started adding features to them — more and more things. Like windshield wipers, right? How do those work? Well, you've got this little stick on your steering column, and you turn it on, or a button or something, and there's a wire that runs to a little motor that turns your windshield wipers. Okay — how else would you do it? It makes perfect sense. And then you wanna add more things, like turn signals. 
So to do that, again you've got this little stick on your steering column, and then there are wires that run out to the four corners of your car that control little lights that turn on if you wanna turn to the left or the right or whatever. And again, you keep adding more and more features — you've got a sensor that detects how fast your wheels are going, and then there's a wire that runs up to your speedometer and moves a little needle or whatever on your speedometer. So all these by themselves are fine, but what happens is, as we add more and more features, you end up with wires running all over the car. And this is bad for a few reasons. One is that it's extra weight, right? You've got all this copper wire running through the car. The car company has to buy that wire, and that costs a lot of money — it doesn't cost a lot for one car, but it costs a lot for the million cars or whatever they're making. And also all that weight makes your car less fuel efficient. So there are a lot of reasons they don't wanna have wires running everywhere. So the car companies all got together and they came to an agreement that they would all support this solution — this was back in the '80s — called the CAN bus. And the CAN bus is just a way that, instead of running wires from every component to every component, you just run one set of wires to the components, and they can all share that wire. And it's just an agreed-upon protocol so they can all talk to each other. And the way this protocol works — it's called CAN — is it's basically eight data bytes and an identifier. And it's just broadcast to everybody. And so if you care about a particular ID as a component — like if you're the speedometer, you care about how fast the car is going — then if you see a message that has the identifier for how fast the car is going, you pay attention to it and you set the speedometer. 
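That broadcast-and-filter model can be sketched in a few lines. This is a minimal illustration of the CAN idea described above — an identifier plus up to eight data bytes, broadcast to every node — not real automotive code: the ID 0x0F1 and the speed encoding (big-endian 16-bit value in 0.01 km/h units) are made-up examples, since real identifiers and encodings vary by manufacturer.

```python
# Minimal sketch of the CAN broadcast model: every frame is an identifier
# plus up to eight data bytes, and each node filters for the IDs it cares about.
from dataclasses import dataclass

@dataclass
class CanFrame:
    can_id: int      # 11-bit identifier (29-bit in extended CAN)
    data: bytes      # 0-8 payload bytes

SPEED_ID = 0x0F1     # hypothetical "vehicle speed" identifier

class Speedometer:
    """A component that listens on the shared bus for one ID and ignores the rest."""
    def __init__(self):
        self.displayed_kph = 0

    def on_frame(self, frame: CanFrame):
        if frame.can_id != SPEED_ID:
            return   # not our ID: ignore it, like every node on the bus does
        # assumed encoding: big-endian 16-bit value in 0.01 km/h units
        raw = int.from_bytes(frame.data[:2], "big")
        self.displayed_kph = raw / 100.0

def broadcast(frame: CanFrame, nodes):
    """The bus itself is just 'everyone sees every frame'."""
    for node in nodes:
        node.on_frame(frame)

speedo = Speedometer()
broadcast(CanFrame(SPEED_ID, (8850).to_bytes(2, "big")), [speedo])
print(speedo.displayed_kph)  # 88.5
```

Notice there is no sender field and no authentication anywhere in the frame: any node that can put a frame with the right ID on the wire is believed, which is exactly the weakness the talk keeps coming back to.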
If you get some other message you don't care about, you just ignore it, right? And this was fine — again, like I said, this was back in the '80s. It didn't really matter that it doesn't have encryption, it doesn't have authentication or anything like that. It didn't matter. It's just trusted things talking to trusted things. But then time went on, right? And we started adding a bunch of stuff to cars. We basically added two types of things to cars. One was, we started connecting cars to the outside world. I already mentioned that Bluetooth connects you to the outside world. Telematics, like OnStar, connects you to the outside world. But there are other things too. Like, my Jeep had Wi-Fi. That means my Jeep was a Wi-Fi hotspot, which was cool — like, when my kid wanted to sit in the back seat and, you know, play Wii or something — not Wii, but, yes, I'm so old — but anyway, play with their stuff on the internet, they could do that from the back seat. But that also meant that my Jeep was on the internet, and my Jeep could be attacked through that Wi-Fi hotspot too. So there's Wi-Fi in some cars. But most cars also have these wireless tire pressure monitoring systems. This is a sensor that's in your tire, telling the car how much pressure is in your tire, so that you know if you're getting a flat. And so there are all these outside signals coming into the car. And each of these outside data sources is a place where code can have vulnerabilities, and that can lead to compromise from the outside world. Well, that's fine — if you just compromise some piece and then you can lie about how fast I'm going, it's not the end of the world. But the other thing that we added to cars recently, which then makes this an actual problem, is all these safety features. There's this pre-collision system, for example, in my Jeep and a lot of cars. 
And the way that works is, you're driving, and it's got this little radar or something on the front. And if it detects that you're getting too close to something and you're gonna hit it, it'll apply your brakes for you to save you, which is great — it's gonna save a lot of people. But the problem is, that means there's a computer somewhere that can turn on your brakes, right? Likewise, there's a feature — well, you already heard that I was interested in auto parking. This is a feature where you press a button and the steering wheel turns and it'll park you into a parallel parking spot. That's really convenient if you don't know how to parallel park very well. But it does mean that there's a computer somewhere that can control your steering wheel. And there are other ones, like adaptive cruise control. That's like regular cruise control, except instead of having to hit your brakes every time you're about to hit someone, the car will slow down for you. So there are computers that are in charge of how fast you're going, whether your brakes are on, and your steering wheel. And so now you can, again, imagine chaining all this together, where you've got input from the outside world that could compromise some component, which then can talk to other components. And those other components may control features of the car that affect physical safety. So, end to end, it's kind of a nightmare. And then you get to the point where cars even have web browsers in them sometimes. And it's like — I've worked in computer security for 10 or 15 years; we don't know how to secure web browsers. So let's not put them in cars, right? Can we all agree on that? Car security is bad enough without adding a web browser. So this is a BMW that has a web browser. I'll talk about a web browser in a Tesla later on in the talk. That becomes problematic. 
All right, so cars basically suffer the same issues as all these Internet of Things type devices, which is that they were designed as trusted components and everything was great. And then we were like, okay, I want my trash can to be on the internet now for some reason — and oh no, there's a security issue, right? Shocking. So it's the same thing: as soon as you open up something that was designed as all trusted components to the internet, there can often be a problem. So let's talk about the Jeep in more detail, since I know a lot about that. This is what the inside of my Jeep looks like. And you can definitely tell that thing in the middle is a computer. That's the thing that has the navigation system, and it has the radio and all that kind of thing. I mean, it looks like a computer. But as I mentioned, there are actually like 40 or 50 computers in your car. There's a computer that controls the anti-lock braking system. There's one that controls the airbags. There's one that controls, I don't know, the transmission, right? So there are all these computers, and they're all talking to each other. But the one in the middle is called the head unit. That's the main one that's important, because that's the one, at least in the Jeep, where most of the data from the outside world goes. And as you can see here, it wasn't actually made by Chrysler. A lot of the things in cars aren't made by the actual car company; they buy them from suppliers. The actual head unit here was made by a company called Harman. So, as I mentioned, my car has Wi-Fi, which is pretty cool. This is what it looks like. We looked at this, and I found a vulnerability in the way that it processes data from the outside world. 
And just to show you the kind of security it had: when Chris and I did this project, we were like, I don't know, what's it gonna take — like a year — to find and write a remote exploit and send messages to a Jeep or whatever? So I figured it was probably gonna take, I don't know, three or four months to find and write an exploit for this thing. It turned out I poked around for like three weeks and found this one, and the exploit-writing part, which I thought would take months — because that's how long it takes to write an exploit against, like, an iPhone or a MacBook or whatever — took maybe five minutes. It was trivial. There was an outside interface facing the internet that had a method called execute that you could call from the internet. And the way this execute method worked was, you would give it a command and it would execute it. So I don't know if that's even an exploit — it's a feature, not a bug. Anyway, getting code running remotely on this head unit was really easy. So I'll show you what you could do just with that — not even talking about sending messages to other components, just what could you do if you exploited that head unit? Well, the first thing — oh, and I might mention, here's the one tie-in to ARM technology: that head unit ran an ARM chip. But it wasn't ARM's fault; it was the code that was written on top of it. It was an ARM chip running QNX, and it wasn't even QNX's fault either — it was the code that Harman had added on top of it that had the problem. Anyway, you could query that head unit and ask what the GPS information was for the car. So you could follow the car as it drove around on the street, which was kind of fun. And here are some of the other things you could do. So here's me — that's Chris talking here, and there I am. This is just showing we can remotely do the attack. You can see, it's like a magician — you have to show: 
There's no wires, it's all wireless. "Hey Charlie, get this car some style." So you can control what the screen shows — that's me and my buddy Chris in our super fly jumpsuits. Excellent. Turn that up. So you can control the radio station, you can control the volume — you can just turn it up to, like, my actual volume here. And then you'd think these buttons and these dials actually do something, but those are just inputs to the computer, and we can ignore those inputs. So he can't turn the radio down or change the station or anything. He's stuck with the ARM beat. But he's like, well, we'll just reboot the car and then that'll fix it, right? That's what you do for Windows computers. But no, for cars it doesn't actually work that way. We still had control of the car even after a reboot. So these are attacks against the head unit. This is really fun, and if you want to attack my head unit, go for it — it's fun, and we can have a gentlemen's agreement to stop there. The danger is really when you start to send messages to the components, which I'll get to in a minute. But before I do that, I just want to talk about why we could attack a car anywhere in the United States. And the reason was because inside that head unit there was a cellular modem, and it was on the Sprint carrier network. And Sprint's smart: if I just sat down at my computer at home and tried to scan the internet and find vulnerable cars like Jeeps, I wouldn't be able to, because Sprint has a firewall and they don't let inbound traffic from the outside internet into their Sprint network. So it wouldn't have worked. But what they do allow is for one Sprint device to talk to another Sprint device. And so we bought a Sprint phone, and that Sprint phone could then scan Sprint's network and find all the vulnerable cars. 
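As an aside, that execute method described a minute ago is worth a quick sketch, because it shows why no memory-corruption exploit was needed at all. This is a deliberately simplified, hypothetical illustration of the anti-pattern — a network-facing RPC method whose argument goes straight to a shell — not the actual Harman service, which used a different transport and interface.

```python
# Simplified sketch of the anti-pattern: an RPC handler named "execute"
# (the method name is from the talk; everything else is illustrative)
# that passes an attacker-controlled string directly to a shell.
import subprocess

def handle_rpc(method: str, arg: str) -> str:
    if method == "execute":
        # THE vulnerability: the caller's string is run as a shell command.
        # No bug to exploit -- command execution is the advertised feature.
        result = subprocess.run(arg, shell=True, capture_output=True, text=True)
        return result.stdout
    return ""

# Any remote caller who can reach this interface gets arbitrary
# command execution on the device:
print(handle_rpc("execute", "echo pwned"))
```

If an interface like this is reachable from a carrier network, "writing the exploit" really is a five-minute job: the attack is just calling the method.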
And so we tethered — we used that phone as the internet connection for our laptop — and then we could just scan the Sprint network and find vulnerable vehicles, and we could have exploited them if we wanted to. So that was a big mistake on their part. And at this point, we had already told Chrysler a long time ago, but they kind of weren't telling us anything. They wouldn't tell us which cars were vulnerable, which years were vulnerable, that sort of thing. So, okay, we'll just find out for ourselves — we'll just scan the internet and find all the vulnerable cars. And so we did. We wrote the scanner, and it would scan, and if it found a vulnerable car, it would ask what the VIN number was, and from the VIN number you can find out what kind of car it was. So we have a scanner, and the output is things like, you know, 2013 Dodge Ram 1500 Longhorn. Wow, crazy. So we scanned and scanned, and these are all the cars that we found that were vulnerable. And one of the interesting things to point out is that at the time, Chrysler was saying it was only 2014 cars that were vulnerable, but as you can see, if you actually look at the data, we found 2013 cars and 2015 cars that were vulnerable. And when we talked to them about that, their response was something to the effect of, oh, that was just a screwup at one factory or something. So this is a case where you really gotta look at the data and not just trust what companies say. These are some of the cars we found that were vulnerable. And — like, I'm a good guy, I did this research for good — but the one time I was tempted was when I found that Viper. It's like a $120,000 car, and I could have just changed the radio station so easily. But I didn't do it. It was tempting, though. All right, so at this point, we could remotely exploit any of those cars' head units and change the radio station or whatever. But what else can we do? 
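Before moving on: the VIN step in that scanner works because a North American VIN encodes the model year in its 10th character, which is how a raw scan result turns into "2013 vs. 2015" data. Here's a small sketch of just that step — the year table is the standard position-10 letter code (partial), but the sample VIN itself is made up, and real decoding of make and model needs the manufacturer's WMI tables.

```python
# Decode the model year from the 10th character of a 17-character VIN.
# Partial year table shown; 'I', 'O', 'Q', 'U', 'Z' and '0' are never used.
MODEL_YEAR = {"A": 2010, "B": 2011, "C": 2012, "D": 2013,
              "E": 2014, "F": 2015, "G": 2016, "H": 2017}

def vin_model_year(vin: str) -> int:
    if len(vin) != 17:
        raise ValueError("VIN must be 17 characters")
    return MODEL_YEAR[vin[9].upper()]  # position 10, zero-indexed as 9

# A made-up but format-valid VIN with 'D' (= 2013) in position 10:
print(vin_model_year("1C4RJFAG5DC123456"))  # 2013
```

So a scan only has to pull one string off each reachable head unit to build exactly the kind of year-by-year vulnerability table the talk describes.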
So, if you look at the wiring diagram — this is our version of a wiring diagram for the car — the radio itself is connected straight to the bus, the CAN bus, that all the important components are on. So it seems like it should be really trivial to finish up and send the messages that we want. It turned out it wasn't. It was a lot harder. Inside the head unit, there are actually two chips. The one on the left is the ARM chip. The one on the right is a V850 chip. And the ARM chip itself couldn't actually send CAN messages. It wasn't allowed to; there was no direct connection to the CAN bus. And so probably there was an engineer at Chrysler who was like, oh yeah, our cars can't be hacked, because the outside connections don't have direct access to the CAN bus. And he was right — or she. But he or she actually wasn't right either, because there's a connection between the two chips, right? The chip that I have code execution on and the chip that can send on the CAN bus are able to talk to each other. And it turns out that the chip on the left can reprogram the chip on the right. So you can remotely attack a car, reprogram that V850 chip, and reprogram it to, I don't know, send whatever CAN messages you want. That's exactly what we did. And this was another mistake that Chrysler made: they should have only allowed signed firmware images from Chrysler. They could have verified that all that code came only from Chrysler, but they didn't. So we changed it to do whatever we wanted. The bad thing is, if you're exploiting something, usually if you just restart the car, it'll fix itself if you mess something up. But if you're reprogramming the firmware of a chip and you mess that up, it's really bad. And that's what happened a few times. I would screw up reprogramming that second chip, brick it, and then my head unit wouldn't work anymore. And then what are you gonna do, except go to the dealer? 
So here I am at the dealer. Three times I was there with my busted head unit. And it was a real lemon, I mean, it broke all the time. But they were very nice. They fixed it up and got me back on the road. And then eventually I figured out how to reprogram it without messing it up. So thank you, Chrysler, and they were none the wiser. So eventually we could remotely attack the car, reprogram the second chip, and then send CAN messages to the other components of the car. So here's an example of what we could do. We could make the brakes not work. My buddy Chris here, he's gonna try to press the brakes in a second. So he's pressing the brakes, and you can see we're not actually stopping. Now, this attack actually only works if you're going quite slow. It's still scary; if you were at a traffic light or something, you would roll out into the middle of the intersection. But it's not like you're driving down the highway and the brakes won't work, which would be even worse. So that was good on their part. They had extra logic programmed into the brakes not to allow these kinds of messages if you were driving faster than, I think, five miles an hour. And they had the same sort of mechanisms for the steering. They had things built in so you can't turn the wheel while you're driving down the highway; it assumes you'd only do that while parking or driving slowly. But like I said, in that year from 2015 to 2016, we figured out ways to bypass all that. And so here is the worst-case scenario for the Jeep: driving down a country road, and I take over the steering from Chris. He tries to keep it on the road, but he can't. So we actually crashed on the side of the road there, which, this is one of the funny things about doing car hacking, we didn't really want to do that, but it was kind of cool that it happened, right?
So we were stuck. We were on this farm road, and there's this ditch on the side of the road, and our car was at about a 45-degree angle, stuck in the ditch, and we couldn't get out. I'm like, now what do we do? And this cop came by after a while, and he's like, you guys didn't go in the corn, did you? I was like, no, no, we're just in the ditch. And he's like, okay, and he just took off. So yeah, thanks. But then eventually some other person came by who pulled us out with his pickup truck, which was nice. The other funny part of that story is this road, I don't know if you can really see it in that video, but it's like five miles of perfectly straight road with nothing around. There's no reason you should end up in a ditch on this road. And so the guy's like, how'd you end up in a ditch? And we're like, ah, it's kind of a long story. And he's like, gotcha, and totally respected our privacy and discretion there. So the first sort of takeaway from all this is: why the heck are things that talk to the outside world connected to the CAN bus? Let's just separate those, and that'll solve all our problems. Well, the problem is, if you do that, you lose a bunch of features you might want. One of these features is called speed-compensated volume. What that means is, as you drive faster, the volume on your radio will naturally turn up, to compensate for the extra wind noise and engine noise and stuff like that. And the way that works is your radio has to know how fast you're going. And the way it knows how fast you're going is it reads messages off the CAN bus. So if you want this feature, you gotta have it connected. That feature's not that great, I could live without it, but there are other features people really do want. And if you live somewhere that gets cold, you probably want this one.
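To make the speed-compensated-volume point concrete, here's a sketch of the kind of frame handling the radio has to do. The arbitration ID (0x244) and the encoding below are hypothetical; every manufacturer packs vehicle speed into CAN frames differently, which is exactly why this feature forces the radio onto the bus.

```python
# Sketch of why the radio must sit on the CAN bus for speed-compensated
# volume: it has to parse vehicle-speed frames. The frame ID and the
# scaling here are hypothetical, not any real manufacturer's encoding.
import struct

SPEED_FRAME_ID = 0x244  # hypothetical arbitration ID carrying vehicle speed

def handle_frame(can_id: int, data: bytes, base_volume: int) -> int:
    if can_id != SPEED_FRAME_ID:
        return base_volume  # not a speed frame; volume unchanged
    speed_kph = struct.unpack(">H", data[:2])[0] / 100  # hypothetical scaling
    boost = int(speed_kph // 40)  # say, +1 volume step per 40 km/h
    return base_volume + boost

# 120 km/h encoded as 12000 bumps the volume by 3 steps
print(handle_frame(0x244, struct.pack(">H", 12000), 10))  # -> 13
```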
Remote start: you want to be able to turn on your car from your iPhone while you're sitting inside, and let it warm up. The way this works is you press a button on your iPhone, and it calls up to some cloud service that then calls down to the car, or the car may call the cloud service, whatever. But eventually, the thing in the car that's on the internet has to be able to tell the engine to turn on. And the way it does that is with the CAN bus. So not only does the radio, the head unit, have to be able to read CAN messages, it needs to be able to send CAN messages too, or you lose features like that. And the one feature that I really like, and don't want them to take away, is the one where you're in reverse and you turn the steering wheel, and it draws these little lines on the screen that show you exactly where you're going. Super cool feature. But the bad news is, it means the head unit has to know what your steering wheel is up to. And the way it knows that is by reading the CAN messages the steering wheel is sending. So as long as we want these kinds of features, and we're gonna get more and more of them, right, we're gonna get even more connected to the world. There are gonna be vehicles communicating with other vehicles about things like traffic conditions, and vehicles talking to traffic lights and other infrastructure. So we're gonna get more and more connected, we're gonna have more and more of these features, and it's just not really an option to disconnect things. We need to figure out ways to actually secure things while still letting them be connected. All right, so that was our work and the academics' work. So what about other things you may have seen in the news? Let's talk about those, so that when you leave here, the next time you see a headline about car hacking, you'll know exactly what they're talking about.
So one that was kind of cool is this guy Samy Kamkar's OwnStar, a play on OnStar. Do you guys know Samy? He's a pretty cool researcher. He did the thing called the MySpace worm. Do you remember that? I'll tell you what that is real quick. So MySpace, I don't know if you guys even know what that is anymore, it was a thing like Facebook. But it had this feature on the side of your screen that said, my hero is, and you would put, you know, Def Leppard or whoever your hero is. And he found this cross-site scripting bug, so if you visited his page, it would change your page to say my hero is Samy. And then if anyone visited your page, their page would change to say my hero is Samy. And so after a very short amount of time, that's how exponential things work, everyone's page said my hero is Samy, which was really awesome, I thought. But MySpace didn't think it was awesome, and he actually got arrested and went to jail for that, which was a bummer. But now he's out and doing good-guy research, so good for him. Anyway, what he did for this OwnStar thing, and we see a lot of this kind of thing, is this: car hacking is really hard. What I described to you that we did, if you count all the way back to the Ford Escape work, was essentially four years; just on the Jeep alone, we spent two years. But something that people like me are really good at doing very quickly is looking at mobile apps. And I mentioned mobile apps sometimes let you press a button and turn on your car or something. He found a vulnerability in the mobile app for this OnStar service: it didn't validate SSL properly or anything like that. So if he saw you press the button on your app while he was on the same network, he could see your password or credentials go over the wire, and then he could use them later.
Now, he couldn't send arbitrary messages on the CAN bus or anything like that, but he could do whatever your app was allowed to do. He could remote start your car, he could lock or unlock it, he could honk the horn, that sort of stuff. So the vulnerability wasn't with the car itself, it was with the mobile app. So that's kind of interesting: it's car hacking, but you're limited to whatever the mobile app can do. And the good thing about this is, you couldn't really fix the Jeep very easily, but you can fix this pretty easily, because you just push a new version of the app out to everybody. I can tell you, the way they ended up fixing the Jeep was kind of a nightmare. They did the recall, but people don't really respond to recalls that much. They also mailed out USB sticks to people, so they could update their cars themselves, in theory. And then what they also did, which actually fixed the problem the best, was they had Sprint change things so that one Sprint device could no longer talk to another Sprint device. And once they did that, even if a car was still vulnerable, no one could attack it anymore. So that was the fix that actually worked. Anyway, that's just to show you how hard it is to fix cars when they're busted, as opposed to mobile apps. All right, so another thing that's a problem is these OBD2 dongles. You can buy these dongles for different reasons, fleet management, or for your insurance company: you plug one into your car, and it sees if you're driving too fast, or if you brake too hard, it knows that, and it phones home to Progressive or whoever and says this guy needs to have his insurance rates looked at. But of course this means you've got something on the internet, it has to phone home to Progressive, and it's plugged directly into the same interface that both we and the academic researchers originally plugged into.
And it's probably not written by someone who's as good as, say, Chrysler, right? And so some researchers, some of the academic guys again, plus some guys from other companies, looked at three different dongles, and they were all vulnerable, and they allowed you to do whatever you want. So just like for us: once you exploit that thing, you can send CAN messages, and it's over. You can start messing with the brakes and steering and all that. So this is really bad. Of all the things people ask me, like, hey Charlie, how do I protect my car, I'm scared my car's gonna get hacked, there's not really anything you can do. It's not like you can download antivirus or something. The one thing you can do is: don't put these in your car, right? It's not worth saving 50 bucks on your insurance to have your car vulnerable to attack. Other things you may have heard about: so about the same time that Chris and I were doing the first Jeep attack, some guys came out and said they hacked a Tesla. But they didn't actually hack the Tesla remotely. What they did was they took a Tesla, ripped it apart, and got the firmware off it. And they found passwords that would allow you to log into it. And they found this other physical port that, if you plugged into it, you could log in. So with physical access, they could plug in and control things. They couldn't send messages on the CAN bus, that was something they didn't know how to do, but they could do whatever the computer could do. So it's the equivalent of the first step of our attack, right? We could control the radio, so you could change the radio station. They could control the windows and the locks, because there were buttons on there to do that. But they couldn't, for example, control the brakes.
So they'd done basically step one of the two-step attack that's necessary to do the really crazy, scary stuff. That's a good start. But it just goes to show you: when you read a headline that says Tesla hacked, you really need to understand the different kinds of car hacking, and how they matter, to know that this isn't the same threat as what the academics did, or what we did against the Jeep. But these next guys did do the whole thing. This is pretty recent, maybe a month or two ago: these researchers from China were able to hack a Tesla legitimately. What they did was they found a vulnerability in the head unit, in the web browser. Teslas come with web browsers, and if you surf to a malicious page in that browser, they're able to get code execution. Now, this isn't as dangerous as ours, because with the one Chris and I did, we could just scan the internet, find cars, and attack them. It didn't require the user to do anything except have the car on. Here, you have to get the user to do something, so it's not as scary in that sense. But the thing they did do was that second step. They did exactly what we did: they reflashed a gateway module, which then allowed them to send CAN messages, and they were able to do things like control the braking. I don't know if they did the steering, but they probably could have. So they did an attack just like ours, except that theirs required user interaction. Now, just to show the difference: when Jeep fixed our vulnerability, they essentially just shut down the services that had the vulnerabilities, which is fine. But as far as I know, they never fixed that second piece, the code on the second chip. So if I could find some other way to hack into a Jeep, I could still do that reprogramming and send the messages.
What Tesla did, which was better, was they actually changed that second piece so you have to do code signing, right? So now these Chinese researchers, even if they find another web browser vulnerability, which I'm sure they can, because web browsers are full of vulnerabilities, they can no longer reprogram that gateway, because it only accepts code signed by Tesla. So that was a much better fix. And the other thing Teslas do that's really cool is they can auto-update over the air. Jeep had no way to fix all the vulnerable cars; they had to send out recalls that said, hey, bring your car in, get it fixed. But Tesla can just push out updates over the internet to the cars and fix them all overnight if they want to. And that's what they did: they pushed out this fix that required code signing to all their cars. Pretty cool. Some other things you might have heard about in the news: there was this hack against Nissan Leafs. Again, this was more of a mobile-app kind of vulnerability. There was a mobile app, and you authenticated to a server, and then you could do things like turn on the heater or heat the seats, things like that. The problem here wasn't an SSL issue like Samy's. It was that the password, the authentication to the server, was the vehicle identification number. And if you know anything about cars, you know that if you want to find the VIN of a car, you just walk up to it and look through the windshield; it's written right there. So if you saw a Nissan Leaf, you could just read the VIN, and then you could log in as that owner and, for example, turn on the heated seats. Which is sort of fun, right? The only problem is the Nissan Leaf is an electric car. So if you were to turn on someone's heated seats overnight, when they woke up the next morning, their car wouldn't start, because it's dead, out of battery.
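The broken design behind the Leaf story is easy to sketch. Everything below is hypothetical (the endpoint and parameter names are made up, not Nissan's real API); the point is just that the only "credential" in the request is a value anyone can read through the windshield.

```python
# Sketch of the flaw: the request carries no password or token. Knowing
# the VIN, which is printed under the windshield, *is* the authentication.
# The endpoint and parameter names here are made up for illustration.
from urllib.parse import urlencode

API_BASE = "https://telematics.example.com/api"  # hypothetical endpoint

def build_climate_request(vin: str) -> str:
    query = urlencode({"VIN": vin, "command": "heater_on"})
    return f"{API_BASE}/climate?{query}"

# Anyone who walks past the car and reads the VIN can build this URL.
print(build_climate_request("SJNFAAZE0U1234567"))
```

Contrast this with Samy's OwnStar case: there the credential was real but leaked in transit; here there was effectively no credential at all.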
So that was basically a denial of service against the car. It wasn't physically dangerous, right? You couldn't send messages to the brakes or anything like that, but it's sort of fun. And I think we're gonna see more and more of these kinds of attacks; they're relatively easy to find. You might have read articles about all these cars getting stolen, right? High-tech hackers are stealing cars. And at first, just like you, I read this headline and I'm like, whoa, what did they do? They remotely attacked the car and unlocked it and turned it on? That sounds really cool. But if you read the details of what it is, it's not as cool. The way these attacks work is someone who works at, like, a dealership gets the software that they use to program keys. So if you lose the key to your car, you can go to the dealer and they'll give you a new key that works, right? They have software that will program keys for you. So someone gets a hold of that software, they go to your car, they find out what the VIN is or whatever, they have a blank key, they program the key with that software, and then they can just open up your car and drive away. So that's how that works. It was really sort of a feature of the software that you could do that. And they keep that software under lock and key; it's very protected. Like, I had all the software you could get for Jeeps, and they don't let me, for example, get the software that lets you program keys. Only the dealers can do that. But obviously, there are a lot of dealers, and you can't trust them all. So this wasn't really hacking, I would say. It's more like just stealing software or something, I don't know. Anyway, it wasn't quite as sexy as I thought it was gonna be. One more thing: there's some research that came out recently about heavy trucks.
So the messages that you send on the CAN bus to, say, control the brakes or the steering, they're different for every car. If you figure out a way to control a Jeep's steering, for example, and you try that against a Ford, it's not gonna work. All those messages are different. And we know this because Chris and I have looked at a Toyota, a Ford, and a Jeep: totally different messages every time. But heavy trucks are different. If you happen to have a Mack truck or something, and you look at all the messages being sent to the brakes or the steering or the windshield wipers or whatever, and then you compare that to a Peterbilt, or to some other kind of truck, those are all the same, because the heavy truck companies have gotten together and agreed on a standard. And that's really cool if you want to build a truck, because you can pull an ABS system out of one truck and stick it in a completely different truck and it's gonna work, because all those messages are the same. Whereas if you took a Ford ABS and put it in a Jeep, it wouldn't work at all. So that's really convenient for them, but from an attacker's perspective, it makes things a lot easier too. Because one of the things that limits the threat, when someone says, oh, Russian attackers are gonna stop all our cars or make all our cars crash, I say, well, probably not, because every car is so different it would be so hard to figure out how every single car works. If it's just one guy, then maybe they're gonna attack one particular kind of car. But with trucks, it's a different story. If you figured out some message that would make one truck crash, it would work against all the different trucks. So that's pretty scary.
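The standard the truck makers agreed on is SAE J1939 (the heavy-vehicle CAN standard; the talk doesn't name it). The reason one message works across a Mack, a Peterbilt, or a school bus is that the Parameter Group Number (PGN) identifying each message is packed into the 29-bit extended CAN ID the same way on every truck. A sketch of that decoding:

```python
# Extract the SAE J1939 PGN from a 29-bit extended CAN identifier.
# Layout (MSB to LSB): priority (3 bits), EDP+DP (2), PDU format (8),
# PDU specific (8), source address (8).
def j1939_pgn(can_id: int) -> int:
    pf = (can_id >> 16) & 0xFF   # PDU format
    ps = (can_id >> 8) & 0xFF    # PDU specific
    dp = (can_id >> 24) & 0x03   # data page bits
    if pf < 240:                 # PDU1: destination-specific, PS is an address
        return (dp << 16) | (pf << 8)
    return (dp << 16) | (pf << 8) | ps  # PDU2: broadcast, PS extends the PGN

# CCVS (Cruise Control / Vehicle Speed) broadcast from source address 0x00
# decodes to PGN 65265 on any J1939 truck, regardless of manufacturer.
print(j1939_pgn(0x18FEF100))  # -> 65265
```

That cross-make uniformity is exactly the double-edged sword described above: the same property that lets an ABS unit move between trucks lets one crafted message target every truck.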
And this is something we knew about, but these researchers actually got trucks and school buses and stuff and proved it. So that was kind of cool. Okay, so that's basically all the car hacking that you will have seen in the news. You're up to speed; you're as good as I am on this now. The lessons you can take away are that cars were always insecure, but it didn't matter, because they weren't connected to the internet or to the outside world. As soon as we connected them, and allowed input from the outside world, from attackers, for example, to be processed by them, that really opened up the security risks. And then, when we also added in physical control features, that whole chain could be pieced together to cause genuinely dangerous situations, where cars could get hacked and actually crash or something. The other thing is, if you read an article that says such-and-such car got hacked, you need to read the details to know exactly what they're talking about. Are they talking about controlling the heated seats? Are they talking about controlling the steering? Those are totally different things. And the worst types of attacks are, of course, the ones where you can actually send CAN bus messages, because then you have a lot of control over the physical features of the car. So basically, we're not in a great place right now, and the reason Chris and I did this research is that we don't want to wait until cars are getting hacked and crashed, because it takes so long to design a car. We don't want to wait four years, right? We're trying to get ahead of it now, to get car manufacturers to think about security and design it in from the beginning. And we don't really know if it's working. Car companies don't talk to me; they don't really talk to any of us about the security of their cars. They just say, trust us, they're secure. We take security seriously.
The biggest things I want to see are, first of all, I want them to actually be working on security. And second of all, I want there to be more transparency into how they design their cars to be secure, and what kinds of attacks they consider. Microsoft does this, and I would love to see white papers written by car companies saying exactly how their systems are designed for security and what attacks they considered. Then I could spend a weekend reading their white papers instead of spending two years tearing apart their cars. Anyway, hopefully things will get better. We're not in good shape now, but I have hope for the future. Thanks.