The next talk is "Data Exploitation", presented by Privacy International. We will have three speakers, leading with Christopher Weatherhead. Yeah, a big applause for him. Our next speaker is Éireann Leverett, and we will have a third mystery speaker who will be introduced during the talk. As for Privacy International: you are famous for suing GCHQ, and winning. So I leave you with these wonderful speakers, and a big applause for them.

Afternoon everyone. Oh, I just lost my microphone. Yeah, I'm still going. Welcome, all genders. Just for a quick bit of audience participation: how many of you have actually heard of Privacy International? All right, so that's about an eighth of you. That's not too bad. So I'd just like to introduce the esteemed Éireann Leverett. I'm Christopher Weatherhead; I've been working for PI for nearly three years, something like that. We're going to talk about one of our key programmatic areas, which is data exploitation, and we'll take you through what we're going to cover.

So, we can introduce ourselves a little bit more, too. Chris has been working on connected cars since I've known him. Part of this talk is going to be, as we said, data exploitation; part of it's going to be Internet of Things, but we want to keep it fairly broad. So this is Chris's chance to show off some of his hardware reverse engineering of connected cars, but we've also done some work on data privacy with connected cars. I myself did industrial systems for about ten years, and then recently started working with Privacy International. I was a Mozilla Open Web Fellow there: they paid for me to work with Privacy International and do work with them without being directly part of the organization, which was great for me, because I got to meet Chris, right? Do you want to say anything about yourself? I was going to say: at the end we're going to talk through some principles that PI is putting together. They're a work in progress.
So any feedback is very welcome. Cool, let's get started. I'm briefly going to go through who we are and how we work, a little bit about PI generally, and I'm going to talk about the data exploitation programme in totality. I'm going to do a little bit about cars; it's not going to be particularly deep, but hopefully you'll find it interesting. Then a little bit about various things around Gmail and Roomba, and medical devices. Then we're going to do the associative power of data, and then we're going to talk about the principles, and finally there's a bit of a call-out for you, because we could use a little help, mainly on the tech side of things.

So, one of the things to say about data exploitation is that it's a brand new programme, and we wanted to capture the way that companies are using our data. Theoretically we give our consent for them to use our data, but then they sell it on to third parties, or they go bankrupt and the data gets sold to someone else, and you get problems like Cambridge Analytica and election hacking and all these other sorts of things. A lot of these companies are using data in aggregation: you might be giving up your data individually, but when they fuse it with everybody else's data, you get a wildly different effect in terms of how that data can be used, and how much money can be made from it. So we want to capture some of the issues around that, and some of the advocacy work that we do. Having been more of a hacker for most of my life,
I had never really worked with lawyers before; I was sort of afraid of them, for the reasons you would imagine. But working at Privacy International, I started to see all the ways that legal advocacy and impact litigation could change things in this space. So that's part of the data exploitation platform as well.

Just on the point of these three examples here: Spaceteam, Gmail and Roomba. Spaceteam is a game that I love. I doubt many of you have heard of it, but you play it on your phone, you shout out silly instructions to each other, and you try to perform them at the same time. But when we were downloading the app just the other day, it asked for permissions to the contacts, to the microphone, to the location, and also to screenshots. And this is ridiculous: maybe the screenshots permission is justifiable, but all the rest is totally unneeded to play the game. So that's a good example of data exploitation where you're giving up your data in a simple way: over-privileging of apps. Gmail, of course: most of you will already know that for many years they were scanning inside Gmail messages for keywords to give you better advertising, for various values of "better". And of course Roomba was doing something similar, mapping people's houses via the cleaning robots they sell. We want to capture that; we don't want to pick on those particular companies, we want to capture it as a general issue, and show how the wider population can fight back against it. Anything else here? No? Then we'll go to the next slide.

So PI is split across three major programmatic areas. The one we're best known for is our work around surveillance, where we've taken GCHQ to court. We do litigation in the European courts; we've done quite a lot in local courts; we've done interventions in the States; we did an amicus brief for Apple versus FBI. It's definitely where we're best known.
It's mainly around the proliferation of surveillance technology, and surveillance generally. We've also done a bit of research that falls into the next category, which is building a global movement. We have 15 partners worldwide, and we try to get privacy onto the local agenda in the countries where our partners work. Most of them are in the global south: Latin America, Africa, and the Far East. That's very important, because trying to be a global, international organization with 20 people is sort of unachievable in many ways. By working with local partners, we have a much better idea of what's going on on the ground, both in terms of legal changes and political changes. Some of those countries don't yet have laws protecting privacy in any way, shape or form, so we're able to work with them to advocate for those laws at an early stage.

So when we talk a little later about data exploitation, it's good to be cognizant of the fact that although in Europe we have quite good data protection laws, in much of the world there really isn't anything. And even here in Europe, we're overly focused on GDPR and a personally-identifiable-information kind of approach; we don't necessarily look at, say, the mapping of houses as an issue of personal privacy, right? And our final programme is data exploitation, which broadly fits into two overarching categories. On the one side:
There's what we call "data in the wings", which we'll go into in a minute. It's the data your device has which you can't see, but which others can access and use (particularly law enforcement, but others too); you don't necessarily even know it's there. On the other side, there's data that's inferred from data you have given, in ways you weren't expecting: machine learning algorithms, artificial intelligence, that sort of thing, where someone takes the data you've given them and tries to create new knowledge from it.

And that's also a problem with data synthesis. It's one thing when you give up your data to one company, and then again to another company: you sort of assume the two will never be associated. But of course there's an entire brokerage industry around data, and when someone starts to associate your phone data with your car data with your health data, you get a much more invasive picture of an individual or an organization. Next one.

So, how does PI do its work? As Éireann said, we're quite a small organization, only about 20 of us, so we actually have a matrix structure.
So we have our three programmatic areas, and across them we've got a legal team; a research and investigations team, who often do research in places like the global south on governments and others who are deploying surveillance; a communications team; an advocacy and campaigns team, who generally try to raise the privacy agenda all around the world; and then we've got the tech team, which is us. The better team, obviously. Yeah, I've learned to respect the legal team and the advocacy team too, but we have a nice internal rivalry about how we get things done, and personally I like the way that works.

So the tech team will end up supporting the legal team on these amicus briefs. I personally had never filed an amicus brief before. We get involved in some of the core cases going on around the world, like Apple versus FBI or the Playpen case. There, we looked into ways that Tor might have been, not necessarily compromised, but how it was possible to track some of the users, and how that warrant was probably violated by targeting, while notionally not targeting, 8,000 different computers around the world in something like 60 jurisdictions. So we get involved with the legal team, and then they write up the amicus brief and send it off as advice to the judges in the court case. By having an organization structured like this, you get a lot of different viewpoints and a lot of different experience across all the programme areas, rather than saying, oh, the stuff around algorithms is entirely a tech issue.
Actually, no: it's an issue across multiple strata of the organization. And the advocacy team is made up mostly of human rights activists and researchers, so they'll publish an individual report on, say, Tanzania or Kenya or Egypt, and they'll focus for six months on trying to uncover what's going on, including the sale of zero-days to various government agencies around the world, that kind of thing. So that's a bit of background.

So, what is data exploitation? Well, we've covered the outline; she's gone to the next slide. Sorry, do you want to move on? Okay, yeah, sorry. So this is an example of the "data in the wings" I mentioned earlier. On the top we've got the laptops that had the Snowden docs on them, from the Guardian. I think my colleague did a really good talk on this at CCC, 2014 or 2015, one of those. Anyway, when the laptops the Snowden docs were on were at the Guardian, GCHQ turned up, came around, and destroyed a load of the chips on the boards. And a lot of them were what we thought were pretty benign chips: keyboard controllers, that kind of stuff. But clearly GCHQ thought there was some way that classified information from those laptops, which had handled Snowden's documentation, could be in those chips. So potentially the government knows something about the hardware we use every day that we don't. That's just a single example. We could always just ask them: is anyone from GCHQ in here today? Want to give us some feedback?
No? Worth a try. On the other side we've got the field test mode on an iPhone, which lets you see the data about all of the 3G or 4G towers around you, the cellular data. Again, if you know the code you can get the information, but most people don't know what the code is; it's sort of hidden. And there are probably other things inside your iPhone which you can get at if you know the secret code, but which you feel you probably ought to know about anyway. The cell towers you connect to are essentially location data, and that is personal to you. So it's a little bit worrying.

And the final one here is the Roomba. Do you want to talk about the Roomba? I think everybody knows about the Roomba case, right? You must have seen it in the press; it was a thing that happened last week. It was announced, or it was found out, that the little robot that goes around vacuuming your house is actually mapping your house while it's doing that. Which is always nice to know, that something is looking around at how big your house is. And that of course has an effect on, say, people selling properties: you could buy this data and use it to determine property values, whether properties are getting larger or smaller in a particular region over a period of time, that kind of thing.

So that's what we're trying to capture with data exploitation. We view the hacker community as a particularly empowered voice in this debate. There are many people in here who can reverse engineer devices, and we'll show you a bit of reverse engineering of a couple of different devices in just a moment. But the point is that not everyone can do that. So how do we extend that ability, to run Wireshark on a router, see what data is going out, and then challenge the company?
Legally, how do we extend that to a wider audience, to build an advocacy movement that helps protect us and our data in the future? One final point on the Roomba while we're at it: I assume the people who bought the Roomba didn't know their house was going to be mapped. They bought it because it was a vacuum cleaner, and it's doing something completely separate from what they originally thought it would do. And we could challenge that particular issue in multiple ways. It's not just that companies use the data in unexpected ways; it's also that they can break the devices, right? Everybody remembers the Nest situation, where you buy a Nest, you're able to use it for home automation, and then they threaten to brick it because the product has been shut down. It used to be that when you bought a dishwasher, it just continued to work; it wouldn't be shut down or altered, and the data wouldn't be used remotely in some other way. That's something that's changing, and we'll talk about some of our approaches to solving that. Cool.

So, I'll give you a little bit of a spiel about cars. Before I get too into it: this is from the 2017 Nissan Qashqai handbook, on their connected car. I just love this one because of the surrounding stock imagery. Like, what does any of this have to do with cars? It's great. But their car is a really interesting one in that it has integrated Facebook and TripAdvisor, and this is more and more common, obviously, because people want a social network on the go. I think. I don't know, I'm not on a social network. I heard that from the car companies: I went to a conference with Daimler, and they said there's a large block of people on the west coast of the USA who buy cars and literally say things like: I don't want to have social death when I get in my car, I want to be able to tweet, I want to be on Facebook.
I want to be able to share my location with my partner as I travel around, so they know what time I'll be home. So that seems to be a bit of a market force for them. And they now offer a service to delete all the data from your car, on their particular models, but no one uses it. So when you sell your car you could delete all of this data, but a lot of people never take advantage of that service, and plenty of companies, as far as I know, don't even offer such a service. Yeah, deletion is a real issue in cars, because often they just refer to it as a factory reset, which doesn't necessarily delete anything; it just marks the data as if it weren't there. And sometimes the factory reset is, functionally, something where you have to take the car to a dealer who is authorized to reset it. It's a real pain in the arse.

The side effect of this is that the car now has Facebook data or TripAdvisor data or Google data on it, and it's probably not the most secure environment for that data, as we'll talk about in just a minute. So here's a brief guide for people who aren't particularly familiar with reverse engineering or what it is. It's basically taking stuff apart and seeing what's inside it. It's quite informative: you can look up all the chips and see what they do, and if you're happy to do some testing, you can probe the interconnects and see what they connect to. It can show whether a device is secure or insecure in some respects. So, cool, I'll probably talk a lot about reverse engineering one way or another. On my desk in my office I've got loads of bits of car; it's quite the collection. At the moment,
I've been looking at two BMW telematics units from a 1 Series, and I've got two Range Rover telematics units, one from a 2016 Evoque and the other from a 2013 Freelander. Pretty premium cars, all of them, and they're fascinating in how much crap they have on them, basically. Let's have a quick look at them, shall we? So this is the Land Rover. Well, Jaguar Land Rover; when these were made they were the same company. I think Jaguar is now owned by Tata; I'm not sure if Land Rover was also bought by Tata.

A few of you might be thinking: well, it looks like a computer. And it is. It is a computer, and it's in your car, except this computer uses a system-on-chip. So there's a SoC on here, a Renesas SoC, and one of the features of this SoC is that it's for automotive use, and it's got a bus called the CAN bus. The CAN bus is connected to everything else, and the CAN bus isn't encrypted or anything; you can send arbitrary data across it. And this CAN bus is connected to a 3G modem. So that's nice, that's cellular. It's also connected to its own Ethernet controller; the car has its own Ethernet network, because, you know, you want that in your car. Yeah, why not? It's got a Wi-Fi chip, because if you can't get on hard-wired, you might as well get on Wi-Fi. The whole unit doesn't turn off when the car does; it stays on all the time, because it could be collecting data. And it's got quite a lot of storage for an embedded system: this one has, I think, about 300 to 400 megabytes, which is quite a lot. If you look in the handbook, some of that storage is used by its emergency recovery system, which is basically a black box, in the same way that you have flight recorders in planes.
You have a flight recorder in your car that checks why an airbag was deployed, but it also checks whether you are driving quite aggressively: heavy acceleration, heavy braking. That's all logged, and although this is currently not connected to an insurer, an insurer would find that information very interesting. There actually was an insurer in the States who wanted to plug USB devices into cars to keep track of your driving, and then, depending on your driving over the course of a month or two, they would offer you a lower price on your insurance. But then a friend of mine did some reverse engineering of that USB device and found he could load it with exploits. So they were actually expanding the attack surface of the car at the same time as trying to offer you better insurance, which is kind of crazy.

This Renesas chip has also got USB 3, because obviously, why wouldn't you want USB 3 in your car? And it's got full video out. So it is, essentially, a computer in your car. Just out of curiosity, have you sent a +++ATH0 over the CAN bus, just to see if it disconnects the modem? No, I haven't, sorry. Guess that's a bit old-school, yeah. Anything else you wanted to say about this? Does anybody know what the unknown test points down at the bottom are? Any other car hackers here? Any car hackers? I know they're here at the camp, but all right, anyway. Okay, cool. Do you want to go to the next one?
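For anyone who wants to poke at a bus like this themselves, it helps to see how little there is to a CAN frame. The sketch below assumes Linux's SocketCAN wire layout for a classic frame; the 0x123 arbitration ID and payload are made up for illustration, and on real hardware you'd normally use tools like candump or the python-can library rather than hand-packing bytes. The point to notice is that there is no authentication field anywhere in the frame.

```python
import struct

# Linux SocketCAN packs a classic CAN frame into 16 bytes:
# a 32-bit ID field (flag bits at the top), a 1-byte DLC,
# 3 bytes of padding, then 8 data bytes.
CAN_FRAME_FMT = "=IB3x8s"
CAN_ID_MASK = 0x1FFFFFFF  # the low 29 bits carry the arbitration ID

def pack_frame(can_id, data):
    """Build a raw frame as you would write() it to a SocketCAN socket."""
    if len(data) > 8:
        raise ValueError("classic CAN carries at most 8 data bytes")
    return struct.pack(CAN_FRAME_FMT, can_id, len(data), data.ljust(8, b"\x00"))

def unpack_frame(raw):
    """Return (arbitration_id, payload) from a raw 16-byte frame."""
    can_id, dlc, data = struct.unpack(CAN_FRAME_FMT, raw)
    return can_id & CAN_ID_MASK, data[:dlc]

# Nothing in the frame identifies or authenticates the sender:
# any node on the bus can emit any arbitration ID it likes.
frame = pack_frame(0x123, b"\x11\x22\x33")
```

That absence of a sender field is exactly why "any arbitrary data across the CAN bus" is the security-relevant statement here: priority on the bus is decided by the ID alone, not by who sent it.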
I think one of the important points is that Chris has been working on this for about two years. So having each individual person do this, protect their own data and try to understand all this stuff, is kind of untenable. What we're hoping to do is make sure that this kind of research eventually makes it to the public, so it can be used for legal advocacy, so you can go to the companies and get back the data that we know they're gathering. Yeah, because I can only do one car every now and again, when I've got time, and there are thousands of models of cars, and most of the systems are proprietary. And it's pretty low-hanging fruit: it's literally opening the thing up, having a look inside, and working out what all the chips are and, hopefully, how they connect. But even just knowing what they are is very helpful.

We also want to change the debate a little. There's nothing wrong with the stunt hacking that's been going on for safety reasons; it's fantastic research, and it shows us what we should be concerned about. But there are precious few people doing this for privacy reasons as well, and I think both of those things need to be done at the same time. Cool, your slide. Oh, you're going to skip through a couple of the animations, sorry.

Yeah, so this is the BMW's unit. Keep going. This is a good one, because the American version of this unit has to go through FCC regulation, so I actually get a nice block diagram, if you go forward another slide. Which is great. And I wonder if anyone recognizes any of the chips in that block diagram? Anyone? That is, effectively, an iPhone 4. So yeah, this is in the BMW, and it's essentially the same power chip and the same modem as in an iPhone 4, and it's connected to the same Samsung memory as an iPhone 4.
So you've basically got an iPhone in your car without even knowing about it. And this extension board, this daughterboard, connects to this slot in the bottom corner here. The model of the car I had was probably a low-spec one, unfortunately, because the empty solder contacts are for GPS. But again, all of it is aggregated in one chip and passed around the CAN. It's quite low on the privacy scale. Cool.

So we talked about looking in cars, hacking cars a little bit. Maybe, yeah, tell them about the fun stuff we got to do last week. Yeah, okay. So we've recently started having a look at rental cars, because rental cars are really interesting examples of data exploitation. When you buy a car and put data on it, it's your car, your data; you have some control over it. When you rent a car, you've only got it for a couple of days, and you put your location data in it, or you connect your phone to the car and make a call through it or whatever. And as soon as you give the car back, if you don't wipe that data, no one wipes that data. Well, at least that's what we think: no one wipes that data. So you have the personal information of loads of people just driving around. And we're hoping you might be able to give us a hand with this. We're looking for anyone who is going to rent cars, particularly in Europe, because we are a UK charity, and although the UK is still part of Europe, it's not always seen that way.
It's not seen as being the So if you're in if you're in Europe you're renting a car, please can you get in contact with me because I'd like to talk to you about it Yeah, some of the things we found is that as soon as you collect your connect your phone to the car All of your missed calls or made calls or whatever are suddenly available to people Even if you've walked away from the car a certain distance Some of the places that you've been in the past Are there stored in the car and of course theoretically you can delete all this But the user has to know to delete all this and we sat down and read through all of the license agreements on this particular Vehicle that we were looking at and it took us an hour Just scrolling through the menus to read all of the license agreements and understand all of the software that was in The car that we were renting for a day. So It's kind of an untenable position in the long run for users All right, we're gonna change gear a little bit pun intended nice puns. Thanks. I practiced that So, yeah, I want to talk to you a little bit about IOT a little bit about Mariah and then we'll move on to medical data And it's gonna go a little bit quickly. 
This is from a paper by an academic and his students at the University of Cambridge, and what they were trying to do is map vulnerabilities over time. I think it's a great graph, because it's sort of what I had in my head as a pen tester when I was at IOActive. These are just known vulnerabilities; we're not talking about zero-days or anything new. This is simply: do we know whether a vulnerability exists in this particular version of this particular phone on this particular network? Different phone networks send you updates and patches for your phone, and for the apps, at different rates, and this basically shows that at any given time for Android, there are always some exploits left that we know about, or at least some vulnerabilities if not exploits, available to us as attackers. So the patching rate on the phones is just not high enough to maintain some assured level of security. I think it's a fantastic graph that illustrates that, and you can go and see some of their work on Android: they have a project called Device Analyzer, where they gathered this data with people's consent.

I also want to talk to you a little bit about Mirai. Did you want to say something about Tesla before we move on? No? Okay, next one. So this is going to go a little bit quickly. Lots of people have heard of Mirai, right? You know about the Mirai case? I'd like to use it as an example of pollution. Do you want to give them a quick introduction?
Okay, so Mirai was a botnet used primarily for DDoS, as far as we can tell. Essentially it was infecting IP cameras, CCTV cameras, digital video recorders, set-top boxes, these kinds of things, and then it was using those devices to perform DDoSes around the world. So essentially some vendor comes along and sells you a product, and it does something you don't think about, say poorly configured DNS or SNMP, and that can be used in turn to perform DDoS attacks on other people, either because it gets exploited or because it acts as a reflector for those protocols, right? Though actually, Mirai's problem was default passwords. Yes, essentially we're talking about admin/admin, right? So there's some discussion we can have about vulnerabilities and liability in a moment or two.

But this is counts of Mirai infections by country, and I know many of the people who were working these individual incidents at any given time. This is an IPv4 map, which is actually kind of horribly difficult to read, my apologies, but it shows roughly the source and destination addresses, not for the DDoS, but for the traffic in general. So we'll move on. This is what a Mirai bloom looks like; this is what incident responders essentially have to deal with. Here you have the countries, with the counts at different times over a six-week period. You can see Brazil wasn't doing too badly, and then suddenly they had a spike in infections, and then they worked very hard to clean them up and got a little bit better. That's what it looks like for incident responders over the course of a few weeks: constantly trying to deal with thousands of these things, because companies have plenty of reasons to ship you default passwords, for usability, right? So, next slide.

All right, so here's my crazy, subversive idea. Has anyone heard of the general Product Liability Directive here in Europe? Yay, thank you. Have you read it? Okay, good.
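Before we dig into the liability question, it's worth making the admin/admin point concrete: the entire "exploit" in Mirai was a short hardcoded table of factory credentials, tried in turn over telnet. A minimal sketch of that logic follows; the credential pairs are a handful from the published Mirai source as best I recall (the real table had roughly sixty, chosen by weighted random pick), while `first_working_login` and the callback are illustrative names of mine, not Mirai's.

```python
# A few of the factory credential pairs the leaked Mirai source carried.
DEFAULT_CREDS = [
    ("root", "xc3511"),
    ("root", "vizxv"),
    ("admin", "admin"),
    ("root", "default"),
    ("admin", "password"),
]

def first_working_login(try_login, creds=DEFAULT_CREDS):
    """Return the first (user, password) pair the device accepts, else None.

    `try_login` stands in for one telnet login attempt; the real bot
    spoke the telnet protocol to ports 23 and 2323.
    """
    for user, password in creds:
        if try_login(user, password):
            return (user, password)
    return None

# A mock device that shipped with admin/admin and was never reconfigured.
device = lambda user, password: (user, password) == ("admin", "admin")
```

No buffer overflows, no zero-days: a for-loop over a password list was enough to build one of the largest botnets seen at the time, which is why the usability-versus-default-password trade-off matters.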
So these two articles are particularly interesting to me. We've been having some discussions inside the computer security community about whether the Product Liability Directive could be used if someone is hacked and the device stops working in a functional sense. Here we need to talk about IoT devices, rather than everyday general-purpose computing. Could someone sue the manufacturer, because the product is no longer working? It used to be that if you had a dishwasher and it flooded your kitchen, you could sue them because it flooded your kitchen, right? Well, what happens if someone hacks the dishwasher and the dishwasher floods your kitchen? Can you still sue them? So there are discussions going on in Europe about whether this is a good idea and whether it's possible.

So we looked into the Product Liability Directive, and it says quite clearly that the EULA does not prevent liability for an IoT device manufacturer: "the liability of the producer arising from this Directive may not, in relation to the injured person, be limited or excluded by a provision limiting his liability or exempting him from liability". So an end-user license agreement that tried to do that would be violating Article 12, essentially. The other important thing is in one of these two clauses, I believe the first one: the liability of the producer shall not be reduced when the damage is caused both by a defect in the product, i.e. a vulnerability, and by the act or omission of a third party, i.e. a hacker. So we're exploring whether liability could be divided up between manufacturers and attackers, and whether that would have an effect on the quality of products in the future. So let's move on.
This is a modern example of computers in medical rooms: in hospitals, in surgeries, in various devices. This slide was produced by a fantastic researcher called Harold Thimbleby, who has been doing medical device research for a number of years. I think it gets the point across that these computers are everywhere, and they're used in life-critical situations that have a real-world effect. It might be drug infusion pumps; it might be implanted medical devices such as pacemakers and defibrillators; computers can also be there for a variety of other reasons. So, just to get the point across: there are quite a lot of computers here.

Harold Thimbleby's work focuses on user interface errors. This is not just about vulnerabilities; it's also about accidents. So this is a variety of different devices used to administer drugs to patients, and what I want you to focus on is: look at all of these user interfaces. Some of them signify 1,000 units with a comma, some with a full stop; some of them place the buttons in different positions, so the one is at the top, or at the bottom. Your average nurse or doctor working with all of these tools has to switch between one and another and understand all the different user interfaces. This is what liability did for us a hundred years ago: it standardized the car, so you don't have a steering column on this side or that side or in the middle, and you don't switch the brake pedal around with the gas pedal. You standardize the user interface until things become safer. We still haven't done that in medical devices, and Harold Thimbleby estimates that, at least in the UK, the number of deaths related to medical device user interface errors is equivalent to the number of deaths from car crashes. I'm going to let that sink in for a minute.

One of the points I want to make here is that one in a thousand of you in Europe will end up with an implanted medical device sometime in your lifetime. In Norway, for
example, that's getting close to about one in two hundred So as we all get older and we have implanted medical devices or even wearables This is going to be very important to us and medical implants have recently been used for forensics and led to convictions Right the medical insurance industry wants this data to keep track of your health over time and change the pricing of your insurance My concern is that you won't know why the price of your insurance changed Because they'll have gotten the data from some second company or third company and lastly you shouldn't have to choose between health and privacy So we have a special guest speaker here with us that we've been tweeting about and joking about for a while I don't know if many of you know Dr. Marie Muah, but she's here with us today And I'd like to invite her up on the stage to tell you a little story of her own So would you please give her a nice round of applause for joining us? Thanks So I have a pacemaker implanted in my body. I have a computer inside of me I got it five and a half years ago and At the time I was working at Norwegian shirt doing incident response already working on information security And I suddenly needed this device to stay alive So I'm depending on the computer inside and I have data inside of me in this computer generated by my own body and I can't get access to it. 
It's all proprietary information. This computer also sends out log information, and you can hook it up to a kind of medical Internet of Things; there are websites where the doctors can log in and see my patient data, but this is not available to me as the generator of that data. So I started a pacemaker hacking project together with Éireann, to see if I could get access to some more information about my pacemaker and my data, and whether I could actually trust this device that is keeping me alive. I've been doing some talks about it, and a little less than a year ago I was invited to give a keynote at Hardwear.io. Some of you might have been there; it's a great conference on hardware hacking here in the Netherlands. And I actually ended up in hospital in Amsterdam. This is me in the hospital, because my pacemaker failed while I was up in the air, in the airplane, on my way to give that talk. So this is a personal story about my own critical infrastructure, and how I accidentally ended up getting hold of some information because this happened. So I'm up in the airplane, minding my own business. Usually I don't feel anything; I don't feel the pacing from the pacemaker, even though it's working 100% of the time, just keeping my heart beating. But suddenly I got a strange sensation. I could feel that something was going on. I looked down at my chest.
I could see my chest muscle was twitching, so I figured out there was something wrong with the pacemaker. I notified the air crew, and when we landed at Schiphol there was an ambulance waiting for me that took me directly to the hospital. I had to spend the night there, because they didn't have a pacemaker technician in at the time. This is me the morning after, when they're rolling in the table with the pacemaker programmers. You can see four different brands of programmer. They have to use the programmer from the same manufacturer as my device, because all the wireless communication protocols are proprietary, so they don't interoperate; there are no open standards for this. You have to have the correct programmer for the correct device. But I'm looking happy in this picture. That's because I'd found out that it was something that could be fixed just by giving me a software update, a firmware update. So I didn't have to have any surgery; there was nothing physical, no hardware issue in my pacemaker. This is the face of the pacemaker technician when he had hooked me up to the programmer, was looking at the display on the programmer, and saw this error message. Next slide. So this is what had actually gone wrong.
There was a data error in the pacemaker: it had switched up the voltage, which is why I could feel the pacing. It was constantly pacing me at 70 beats per minute. I'm really happy that they had engineered in this safety feature, because it was actually keeping me alive. It was a bit uncomfortable, but not a crisis. But check this out: there was actually a memory dump and a log file created on the pacemaker programmer because of this crash. So I'm sitting there getting a firmware update, and after that I'm good to go, after we've reconfigured the device of course. And I happened to have a memory stick in my bag, as I usually do when travelling, and I asked if I could get a copy of this file. That way I actually ended up getting hold of some log data from a pacemaker, which we'd been trying to get hold of in the hacking project, so that was a win. I also reached out to the manufacturer, and I got some of that data from them too, which gave me access to a memory dump from my device. So this is what you have to do to get all of your patient information if you're wearing proprietary computers inside you: you need to hack it, or you need to get a failure event and get hold of the logs.

You're the only person I know who's willing to have a heart failure in order to social engineer a hospital. Okay, so you can move on to your conclusions; we're running out of time.

We are. So hopefully that gives you a sense of some of what we're dealing with. Obviously we can talk about safety, but we also want to talk about privacy. So we're trying to formulate these data exploitation principles. One of the things we want to do is reduce data monopolies in general. Of course that's very, very difficult in a corporate world, but there are ways we can do it, and we would like to work with some regulators, if there are any regulators in the room, on some of those issues. We also want to address the imbalance of power between user and company: you shouldn't have to spend two years hanging out at hacker camps just to be able to access your own medical data. I mean, I know you probably would have done that anyway, but the point still stands: it takes a lot of time and effort to do this. We also want to legally contest the non-consensual data sharing that we see a lot of the time. Such as Chris tweeting pictures from the stage. Yeah, sorry, I'm just joking.

So we use a variety of different techniques, down at the bottom there: DSARs, data subject access requests. Chris built a nice little router to collect data from any IoT product that we bring into the office. It's basically just Wireshark on a router, but every time we buy a new device, we run it in the office, we bootstrap it, and we see which data it's sending across. This means we can make legal requests to the companies knowing in advance what data they will have. So we're in a position to say "we want you to give us this data", and when they say "we don't have it", we can say "we know you do, because here's the Wireshark capture". A simple approach, but please join us in this kind of thing. We also work on multilateral international sharing agreements and try to challenge those, through freedom of information requests and amicus briefs, as we said previously. FOIs are a fantastic tool. Lots of people here have filed an FOI, right?
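The router workflow Chris describes can be sketched in a few lines. This is a minimal, hypothetical illustration, not PI's actual tooling: assume the capture has already been parsed into (device, destination, bytes) records, and summarise which endpoints each device talked to, so you know exactly what to name in a DSAR.

```python
# Minimal sketch (hypothetical, not PI's actual tooling) of summarising
# captured IoT traffic before filing a data subject access request.
from collections import defaultdict

def summarise_flows(records):
    """records: iterable of (device, destination_host, byte_count) tuples,
    e.g. parsed out of a pcap taken on the router.
    Returns {device: {destination_host: total_bytes}}."""
    totals = defaultdict(lambda: defaultdict(int))
    for device, dest, nbytes in records:
        totals[device][dest] += nbytes
    return {dev: dict(hosts) for dev, hosts in totals.items()}

captured = [
    ("smart-tv", "analytics.example.com", 4200),
    ("smart-tv", "ads.example.net", 900),
    ("smart-tv", "analytics.example.com", 1300),
    ("doorbell", "cloud.example.org", 15000),
]
print(summarise_flows(captured)["smart-tv"])
# → {'analytics.example.com': 5500, 'ads.example.net': 900}
```

Armed with a summary like this, a request can name the specific endpoints the device contacted, which makes "we don't hold that data" much harder to sustain.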
Really? Only a few of you? File more FOIs; use the camp bandwidth for that. All right, we're going to try and wrap it up. Yeah, I'll try and keep this short; we'll do our best. We've talked about the problem, so PI is trying to work on what might be the solution, and what we're trying to do is bring together a set of principles. They're like the entry level of what users should expect from the devices or services they use; a baseline, effectively. There are about fourteen of these, so I'm going to try and whiz through them reasonably quickly. We're looking for comments on them, and I'd prefer that you send me comments rather than raise them in the questions at the end, because I'd like to keep a note of your comments. But I'll go through these as quickly as I can. The first thing to say is that they're split into two categories: data and control, and security of information. The data and control stuff is things like: all the data that is derived from your personal data should be treated the same as your personal data, which seems fairly sensible, really. Systems should be designed to minimise data generation, processing and access. That's very important: a lot of people focus on minimising the processing, but they gather far more data than they need to, and as you start to look at these devices, you see that. Data should not be generated, collected, analysed, retained or transmitted excessively. Although "excessively" is a tricky way of putting it: excessive relates to the user's expectations, rather than to some arbitrary number. Excessive as in, were you expecting that to happen, basically?
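A minimal sketch of what that minimisation principle can mean in practice; the field names and schema here are invented for illustration:

```python
# Illustrative only: enforce a minimal schema at the edge, so fields that
# are not intrinsic to the service never leave the device.
ALLOWED_FIELDS = {"device_id", "firmware_version"}  # hypothetical schema

def minimise(record):
    """Drop everything not explicitly allowed before transmission."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

telemetry = {
    "device_id": "tv-01",
    "firmware_version": "2.4.1",
    "viewing_history": ["..."],   # not needed to check for updates
    "wifi_ssids_seen": ["..."],   # certainly not needed
}
print(minimise(telemetry))
# → {'device_id': 'tv-01', 'firmware_version': '2.4.1'}
```

The design point is the allow-list: anything not named is dropped by default, which matches the "were you expecting that to happen?" test much better than a deny-list does.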
These are general data protection principles anyway. Data must be protected from access by people who aren't the user. That's already mostly in European data protection law, but again, we're talking about places where there is no data protection law. Individuals must be able to ascertain their digital footprint: you must be able to find out how much data a company has on you. You have DSARs, but those don't necessarily tell you much. They tell you what you've told a company, but they don't necessarily tell you what's been derived or interpreted from your data, or all the other ways your data could be used. In fact, I don't think we have legal instruments to determine what can be derived from your data, what AI could derive from it, or what data fusion could be performed. We have legal instruments for the individual, but not necessarily for society in general, so we see this as a kind of consumer rights issue. Individuals must be able to delete and rectify their data; you must have that option. The GDPR changes this a bit, because previously, at least in the UK, you could ask for data to be amended but not necessarily deleted, whereas under the GDPR you can now ask for data deletion. But we think it should be a general principle anyway. And if you want one, you should be able to have a negotiable identity, and by default you should be anonymous: you shouldn't have to give data to use a service if that data isn't intrinsic to the operation of that service.

Moving on to the next ones, which relate more to the security and protection side of things. The first is that devices shouldn't be able to betray us: our devices and services should not be able to betray us. This is a variation on what we've already gone over: your device shouldn't be giving out data, first-hand information about you, without your knowledge or consent. An example of this would be Wi-Fi, which broadcasts all the previous SSIDs you've connected to. That betrays a lot about your previous connectivity: if one SSID is named after the company you work for, and your home SSID is named after where you live, then just by broadcasting, it's telling an awful lot about you. It's betraying you. Individuals should have insight into the data that's collected on them, and why, and how. We've talked about pacemakers; we've talked about cars. You need to know: what's the point? Why does someone take this from me? You buy a set-top television box and you expect to be able to download content; you don't necessarily think about the fact that it's keeping track of all the content you watch, to give you "better" content, for some value of better. Security updates should be separate and distinguishable from feature updates. This is the Microsoft problem, I think: silent upgrades.
Yes. This is where you get new features you don't necessarily want, bundled with the security fixes that you do want. They should be two separate things. We talked about liability already, but manufacturers should be responsible for the security of their devices throughout their life cycle. Yeah, absolutely. And all entities that handle the user's data should be jointly and severally liable for it. Again, this is in some data protection law, but not all data protection law, and it would be great if these became the baselines.

So if you're willing to do some activism around this, and you want to do some legal activism or some technical activism, or just be an empowered user; if you have some ideas about our data exploitation principles, or about how we can explain them to everyday people who are not hackers and don't have the same voice: please get in touch and let us know. Chris won't be here, because he's going to run away back to London as quickly as possible. Marie and I will both be around, but Marie doesn't work for Privacy International, so bring the questions about PI to me; of course you can speak to her about medical devices and so on. And thank you very much for your time. You could have gone anywhere and seen any talk, so thank you for coming to see ours.

Thank you very much, that was a fantastic talk, and now we still have about 15 minutes for an open discussion. If any one of you has a question, please go to the microphones at the center of the tent.

You don't have to answer my question, but the prime minister of the United Kingdom, Mrs. May, has a very peculiar track record. (Can you get closer to the mic, please?) Can you comment on the track record of her government, and will it be replaced soon?
I hope so. Well, as a representative of Privacy International, I don't think I can advocate any non-democratic process in any way, but I hope the future of the UK's approach to surveillance will change, and we'll continue to challenge it. As we said at the outset, our amazing legal team, who I started to work with, challenged the UK's data retention laws and its sharing of data internationally, and won that court case, so GCHQ had to acknowledge that they had gathered that data illegally for 17 years. That's the sort of work we will continue to do. Regardless of any individual politician, from whatever party, we don't like to see that kind of data retention continue, and we'll try to challenge it anywhere in the world that we can. Next question, please.

Yes, a comment and a question. I really liked the work about cars. Just to give you an idea of how bad security is with car companies: normally I'm a policy expert. I work for the OECD in Paris; I used to work for the Ministry of Economic Affairs in the Netherlands. I don't code at all; I hack law, I like to say. But I did hack Toyota, Toyota Netherlands. I hacked their website, and I was able to get access to the data of other Toyotas just by entering a number plate. I could see what people had downloaded for their car, what kind of software updates, and I could also send updates to their navigation system, if they actually used that function. It was completely unsecured.

That you can hack it is a testament to how bad it is, absolutely. I know Valasek and Miller, who did fantastic work on car hacking, also discovered the same thing. Once they got onto the network of the ISP, the mobile phone provider, they could see multiple different vehicles, identified by their vehicle identification number, the VIN. So they could see many vehicles at the same time.
So maybe Chris wants to say some more about that. Two points on this. Firstly, talking about Toyota: my dad and I both own Toyotas, and we were both living at the same address at the time. We registered our cars on the My Toyota site, and I could see all of his car's details and all the stuff about his car, and he could see all of mine. Yeah, it wasn't a great system. So I reported it. They said thank you, and they gave me five tickets to the Louwman Museum, which is a brilliant car museum here in the Netherlands. Thank you, but I don't think anything has properly changed.

I totally agree with you; I think the same. I also reported it. Good, so at least we both did our due diligence on that.

I do have a comment. I'm running around with this idea on the policy side that notifying people and making them accept end-user license agreements is just idiotic. Our privacy rules want us to register all the data that we hold on a person, to see whether it's privacy-sensitive, but data is collected everywhere. My standard example is that even connected street lights could these days be privacy-sensitive, because if you walk out of your home at night and make a trip through your neighbourhood and the lights turn on, that could be recorded, and then you could see who went out of their house at night, and at what time. So instead of forcing people to register all the data that they collect, it would be much better if we could get them to tell you who accessed your data. The same with medical data, and so on. In the Netherlands you need to give upfront access to your data to a particular person before they can access it, but that prevents doctors from accessing your medical data when you're lying half dead on the ground, which is a problem for the ambulance staff. So it would be much better to have a register of who accessed data on you, rather than trying to block it all up front. What's your idea on that?

It sounds like that's probably a wider conversation we could have together, PI and yourself. There are lots of regulatory approaches that we're talking about taking. The main thing is focusing on the empowerment of the user, but I realise it's also becoming untenable for companies to manage all this data and access in the ways we're talking about. So it's a problem that society needs to solve in general, and that's why we like to work with lawmakers and regulators and lawyers and hackers and human rights researchers. We think it's a multi-disciplinary approach that changes things over time.

Yeah, data sharing is a big issue anyway, because it's often non-negotiable. I would argue that license agreements and EULAs are functionally useless, but you have to agree to the terms if you want to use a product or service, and so you're having to share data with people you might not even want to share data with.

I'm going to have to catch you later on that, because there are other people wanting to join the conversation; find me a little bit later. No, no, I just mean we'll have a chat again later. Let's go.
I have a question for Marie. Probably your device was hit by a stray cosmic particle. Are you taking technical countermeasures now when flying, like putting a sheet of metal over your pacemaker?

It's a really rare event, but I fly a lot, so I guess if it's going to happen to someone, it would happen to me. It's possible, even most likely, that it was caused by cosmic radiation causing bit flips in the memory of the device. I don't think I want to... I can't live my life worrying about my device not working all the time. Actually, I feel better after having this incident, because the fail-safe mode kept me alive. It actually worked, even though this happened. And I don't want to go around with a Faraday cage to protect myself against it.

Your husband might object.

All right, any more questions? Well, I have one. Not everyone here is a very experienced hacker who knows how to monitor their internet traffic, and I was missing the part of the tutorial where you tell people which easy-to-use software they can install, so they can track what all their connected devices are doing.

I think for an everyday user, there aren't that many tools. As a reasonably technical hacker person, you use Wireshark: you download a tool called Wireshark, and it keeps track of all the connections that are made over the internet on a particular interface. You can record that, almost like you would record a phone call, and then you can decode it with various tools. There are workshops on Wireshark here at the camp, some of them delivered by people who really know the software, who've been writing it for years. There are also little tools like Little Flocker, which keeps track of web browser activity; it doesn't keep track of all activity, but at least the web browser. Some of those tools, I would say, get people started. And then maybe some of you in the audience will write some next-generation tools that are easier to use than Wireshark for capturing this kind of thing.

A nice segue here: PI is also working on another project. My other colleague in the tech team at PI is writing a piece of software called Thornsec, which is basically compartmentalisation for networks. It's about trying to minimise the amount of cross-contamination and cross-communication; it's not so much about the logging, but it does try to minimise what things are talking to one another. So shout out to Ed.

I also work with a team of lawyers. (Can you get closer to the mic, please?) I also work with a team of lawyers, and they say that under the current GDPR, pseudonymised data is subject to the same regulations as non-pseudonymised data. What is your reflection on that?

It depends a little bit on the data itself, but in general there are ways to de-anonymise data using mathematical techniques, and as a broad stroke, though this isn't true for everything, I think the legal boundary is not sufficient against a mathematical or hacker boundary. Let me say that differently: I think the state of the art in the mathematical anonymity community is far beyond what the law is protecting us from at the moment, and if the laws don't keep up with the offensive community, then we fall behind and it becomes somewhat meaningless legislation. It helps, but it's not enough. Necessary but insufficient.
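A small worked example of that gap, with invented data: "pseudonymising" a low-entropy identifier by hashing it can satisfy a compliance checklist, yet the original value is recoverable by simply enumerating the input space.

```python
# Illustration with invented data: hashing a short numeric ID is not
# anonymisation, because the whole input space can be enumerated.
import hashlib

def pseudonymise(ident):
    return hashlib.sha256(ident.encode()).hexdigest()

# A "pseudonymised" record as it might be shared with a third party:
leaked = pseudonymise("54321")   # e.g. a 5-digit customer number

def deanonymise(target):
    """Brute-force every possible 5-digit ID against the leaked hash."""
    for n in range(100_000):
        candidate = f"{n:05d}"
        if pseudonymise(candidate) == target:
            return candidate
    return None

print(deanonymise(leaked))  # → 54321
```

A salted, keyed construction raises the cost, but for small input spaces (phone numbers, postcodes, dates of birth), a bare hash is reversible in seconds on a laptop.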
That's my answer.

I have two questions for Marie. The first one is about the other users of the same pacemaker: do they also have access to their data, or are you the only person in the whole world who's got the data from their pacemaker?

It depends on what kind of data you're talking about. The data related to the configuration settings of the pacemaker you can get access to if you have a programmer, and typically it's at check-ups that they can set this configuration. I always ask for a printout, a hard copy, of all my configuration settings, and that came in really handy when I had this incident. I was in hospital in Amsterdam, I needed to get my device reconfigured, and I had this printout; it was actually in my office back home, so I had to call a colleague and get him to scan it and send it to me by email. That way they were able to configure and tune the device exactly as it was before this happened, so I could just walk out of there and be normal again. It was impossible to get hold of this information, this data, from my local hospital in Norway while I was in the Netherlands, so I was really happy that I had it. Also, it's possible to get hold of these programmers on eBay. I bought a programmer for five hundred dollars, and it came with a lot of data that I could get access to, which is a privacy violation of the patients whose data was in this programmer: it wasn't properly decommissioned before it was sold on eBay. So that kind of data is obtainable. But the kind of data we got from the crash file, that's the kind of data that usually only the manufacturer will have access to, or someone doing a forensic examination of a device.

And the second question I have for you is more about your creative imagination: now that you have the memory dump of your heart, do you have any plan? What are you going to do with the data?

What I'd like to do with the data: I'd like to know more about how my device works, basically. I couldn't get access to the source code or the firmware running on the device without hacking it. As a security researcher, I want a third-party opinion on how secure this pacemaker is, because I know there are a lot of pacemakers that have been shown not to be secure, that haven't implemented security protocols properly, that really failed at the implementation of crypto, for instance. I studied crypto, and I got upset when I heard that some pacemakers have authentication implemented with a 24-bit key. That's only about 16.7 million possible keys, a trivially searchable space, so it's kind of silly, really, if you know anything about crypto.

Yeah, thank you very much. Christopher, you had an announcement? Oh no, one question, sorry. All right, then please join the queue, sir.

What you are saying is that they use the privacy argument for not giving you access to your own data and your own machine, right? Well, they're using their IPR as a kind of excuse. But you want to know your own information, so why should they call it a privacy breach? Yeah. I think we need to move to a future with more transparency in how these devices actually work, so that people can trust them.

Any other questions? My takeaways are two bits of homework. If you are renting a car, please do get in contact with me.
My details are up there, or talk to Éireann afterwards. And the other one is: any comments on the principles we put forward, again, get in contact with me. They will also be up on our website, so you can have a look at them there if you want to see them in full. Yeah, thank you.

Thanks again. Thank you very much; a big applause.