Hello, I'm Sarah, and I'm a designer. I'm interested in the interaction of data networks within the public domain, and I work in the area of technopolitics, privacy and civics. I run a small design studio called IF. We make things that change how people think about privacy, security and data, and our mission is to make those things more empowering, transparent and active for people.

We do this because I think we live in pretty extraordinary times. There are just 58 years between these two photos. On the left-hand side is a computer used by Norwich City Council to handle payroll and rates. It took a number of people to negotiate getting that computer into the building. On the right, from 2016, is the Raspberry Pi Zero, which I'm sure many of you know: it costs just £5 and is smaller than a credit card. That speaks to the way that it has never been cheaper to put a chip in things. So we're increasingly moving towards an internet of things, or an internet of everything, where all the objects around us become data-conscious and connected.

But I don't think that accurately describes what we're building right now. I think it's a little more like this. There's a really great Twitter handle at the bottom of this slide. It's got a bit angry recently, but I recommend you follow it if you're on Twitter. They post images of things like this. On the left, you've got a connected watch on which you can't tell the time because there's a software update in progress. In the middle, a suggestion that we connect toilets to Facebook: services that have never been connected before. And I have visions that in connected homes it's going to be really hard to grow up; really hard to come home drunk as a teenager without your parents knowing about it. And on the right, there's a poor person who got stuck in a lift with no panic button, just a screen telling them about a software crash.
But seriously, I think we're in a troubling place right now, because we used to get products with instruction manuals. We now get products with apps, and with terms and conditions that no one reads or understands, because they're longer than War and Peace and tend to be written in gobbledygook.

As we agree to these terms and conditions, we don't just sign up to strange policies; we also sign up to strange new social contracts. A couple of years ago, the Samsung Smart TV had a clause in its privacy policy that said anything you say in front of your Smart TV may be recorded and sent to third parties as part of its voice recognition services. So suddenly you have to be really careful about what you say in front of your TV, because it might gossip more than your next-door neighbour. The objects around us might become informants; they might betray you.

We no longer have static relationships with the things around us. Our relationships are changing: there's a beginning, a middle and an end to our relationship with an object, because the rules of engagement change every time we get a new software update, the terms and conditions are changed, or we add a new app. And as more and more things become connected, that's going to be an awful lot of things that we have different rules for, or contracts with. It's going to be very hard, if not impossible, to know what you can trust and what you can't. And that's when you're the owner of a product. What happens when you go round to a friend's house and they have one of these elusive connected fridges? What is your relationship with that object? What happens then?

The way that these products talk to us has to change. If the government can write in plain English, then why can't Apple? We need human-readable explanations alongside software, so that people can understand what they're using.
This is something that my friend Richard Pope introduced me to: Gherkin syntax. I'm interested in whether this could be a nugget of how we ask things questions. It gives you human-readable text that you can ask of software, so you don't have to be a developer to know what's going on under the hood of a thing. And as we put the internet into more things that have heating elements or blades, I think this is increasingly about public safety and policy development as much as it is about privacy. We need to make the products people use clear and accountable, so we know what's going on. That's our right as consumers. It's a bit like Mr Weasley said: never trust anything that can think for itself if you can't see where it keeps its brain.

So that's what I've been exploring recently: how we can make the objects we own clearer and more accountable. We've been thinking about this through the lens of a consumer rights group: how can they update what they offer beyond a Saga-like magazine that gets sent to you each month? We've been exploring this through three different objects. They're all design probes, so they're reasonably speculative, but we're using them to show a little bit of the future and to think ahead about ways we could question the objects around us. I'd like to show you one of them today. It's an early version, but at the end I'll share a URL where you can track what we're doing with it.

This is an idea for a home hub. We're really interested in the router as an object we can explore, as more than just something that provides you with Wi-Fi. What we're thinking about is using network maps of the objects in your home. These objects don't all have to be connected (they could be analogue objects too), but they're all stored in one place, so you could log into your hub and see the network of objects in your home.
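As a sketch of the Gherkin idea above, a scenario like the following is readable by non-developers but precise enough for software to check. The device, data and behaviour here are invented for illustration, not taken from any real product:

```gherkin
# A hypothetical example of a plain-language, machine-checkable
# description of what a connected device is allowed to do.
Feature: Connected kettle data collection

  Scenario: Usage data is only shared with consent
    Given the kettle records when it is switched on
    When the kettle sends usage data to the manufacturer
    Then the owner has given explicit consent for that data to be sent
    And the data is encrypted in transit
```

The point of the format is that the same text a person reads is what the test tooling executes, so the description can't quietly drift away from what the software actually does.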
As you click through onto those objects, you can see whether those particular things have passed tests: tests for bugs, security errors and so on. You might also be able to add on significantly more stringent tests depending on who you are. If you're someone who, for instance, is a journalist, you might want a much higher-rated security package running on your devices. So we're interested in giving people the option to run tests over their own objects to see what's going on. You can see here on the right, as we go through, that you could add on different packages. It also means that other interested parties, like campaign groups or charities, could review tests against particular pieces of legislation and run them against the data to independently verify that a regulatory body is actually doing its job.

This is an idea of the different tests you could run: different tests for different devices, different professions, different kinds of events. We've been doing some research with people (we've interviewed about 50 people now about the objects in their homes), and there's something really interesting about giving people tools that they can understand and use to take control of their devices.

As we continue to develop this, I'll show you on this URL: if you want to explore where we're at and click through some ideas, go to consumeradvocacy.projectsbyif.com and you can see where we're at with this demo. We're going to be finishing up in about three weeks. So the hub is one of the objects we've been looking at: thinking about the kinds of references and pieces of information people might want to know about their objects, and delivering that in a way that makes sense, that's clear.
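To make the "test packages" idea above concrete, here is a minimal sketch in Python. Every device field, check and package name is a hypothetical illustration, not IF's actual design:

```python
# A toy model of the hub: a registry of home devices, each checked
# against a named "test package". A journalist might opt in to a
# stricter package than the basic one.

def check_default_password(device):
    """Pass only if the factory-set password has been changed."""
    return not device.get("uses_default_password", True)

def check_encrypted_traffic(device):
    """Pass only if the device encrypts the data it sends."""
    return device.get("encrypts_traffic", False)

TEST_PACKAGES = {
    "basic": [check_default_password],
    "journalist": [check_default_password, check_encrypted_traffic],
}

def run_tests(device, package="basic"):
    """Run every check in the chosen package; return check name -> pass/fail."""
    return {check.__name__: check(device) for check in TEST_PACKAGES[package]}

home_network = [
    {"name": "smart kettle", "uses_default_password": True, "encrypts_traffic": False},
    {"name": "router", "uses_default_password": False, "encrypts_traffic": True},
]

for device in home_network:
    print(device["name"], run_tests(device, package="journalist"))
```

Because packages are just named lists of checks, a campaign group or charity could publish its own package (say, checks derived from a piece of legislation) and anyone could run it over their own devices.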
So we've been looking at things like CVE numbers and DRM, and working out how we can take those very technical pieces of information and deliver them in a way that many more people will understand. Unfortunately I don't have a full prototype to show you, but as I say, do have a look at that URL in a couple of weeks and we'll have finished it by then. The other two pieces we're looking at are around shopping: when you go to a store, how you can see whether certain devices are more or less secure than others through a connected tag system. So we'll have that too. Thank you very much. It's a short presentation, but I'd welcome any questions you might have. Thanks.

Audience: Hi. One of the problems I foresee is: what would persuade a device maker to open up their platform enough to allow you to run those kinds of tests? Is it in their interest to do so? How do you solve that problem?

Sarah: Can you say that once more? I didn't get the first bit.

Audience: Why would a device manufacturer, an object manufacturer, allow you to run those kinds of tests against their device, when it's in their commercial interest to collect as much data about you as possible?

Sarah: I think there are two things there. Actually being able to audit code is really important in so many instances. The EFF at the moment are running a campaign against DRM, which is interesting; Cory Doctorow is leading that campaign, and it would mean that you can audit code. I think it's increasingly important to be able to run audits over code, because if we're collecting more and more personal data about individuals, we need to make sure that it's being stored in a safe way, and that it's not going against terms we might have agreed to.
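One small piece of the CVE translation problem mentioned earlier is turning a machine-readable severity score into plain words. A sketch: the score bands below follow the CVSS v3 qualitative ratings (Low/Medium/High/Critical), but the everyday-language phrasings are my own invention:

```python
# Sketch: turn a CVSS-style severity score (0-10) into a sentence a
# non-specialist can act on. Bands follow CVSS v3 qualitative ratings;
# the wording is illustrative, not from any real advisory service.

def describe_severity(score):
    """Map a 0-10 severity score to an everyday-language description."""
    if score >= 9.0:
        return "Critical: stop using this device until it is fixed."
    if score >= 7.0:
        return "High: update this device as soon as possible."
    if score >= 4.0:
        return "Medium: an update is available and worth installing."
    if score > 0.0:
        return "Low: minor issue, fix with your next routine update."
    return "No known problems."

print(describe_severity(9.8))  # e.g. a remotely exploitable flaw
print(describe_severity(3.1))  # e.g. a minor local issue
```

A hub could attach a sentence like this to each device in the network map, instead of showing raw CVE identifiers.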
So as objects collect more and more data, that auditing is super important for us to be able to trust the companies that provide us with products. I think this is kind of inevitable. And when you look at objects: about two weeks ago, an app called Glow was tested by Consumer Reports. Glow lets women track their menstrual cycles and monitor when they're most fertile. Consumer Reports ran tests and found that Glow was really, really insecure. It let other users change someone's password without knowing the old password, and there were huge amounts of personal data in the forum groups. There were quite a few things that were really worrying about that particular application, and it was only possible to know that because they had access to the code base. So I think this is really crucial, particularly as these kinds of applications take more and more personal, sensitive data, but also as objects might keep us alive or safe.

Audience: Hi, a question. A few weeks ago there was a publication in the Netherlands saying that Apple had patented a way to block a camera from recording if certain infrared patterns were broadcast on stage. That would also prevent consumers from, for example, recording a politician saying something, so that the footage couldn't be used later on. Are there legal means to prevent that form of censorship being applied?

Sarah: So can you...

Audience: Yeah: is there a legal way, in the UK for example, to prevent such censorship being employed in a camera device?

Sarah: Oh, I don't know, actually. I don't know the answer to that question. But a big part of this is certainly around policy change, and making it actually illegal to hide important pieces of information about the devices that we have. So that's a big part of this project too.

Audience: Yeah, just wondering.
It seems like with the Investigatory Powers Bill you're fighting a bit of an uphill battle with the government on that front. I was wondering whether perhaps more education is necessary, because the only education we've really had has come from business: things like "put antivirus on your email" and really basic advice like that. Is there any real campaign from people like the EFF to explain to the public what rights they're losing through all of these things, and the vulnerabilities their devices are exposing them to? Perhaps that's part of it.

Sarah: Well, that's a really interesting point. In the user research we've had back, a few people mentioned that when they go and buy a laptop, they're told to install Norton or other antivirus packages. But when they bought their Nest, or a mobile phone, they didn't get that kind of warning from a shop assistant at all. For them, that suggested that the products without those warnings wouldn't have those problems; it wasn't something they'd have to think about. So I think there's certainly a digital literacy piece at the point of purchase. My colleagues also went to a connected home store at John Lewis on Oxford Street and asked the shop assistants about a couple of the connected home things on sale, and the shop assistants really couldn't give them any good answers to the privacy questions they were asking. The shop assistants also said that they get asked these questions all the time. So I think there's definitely something around a better digital literacy piece. That's quite complicated and hard.
But I think there should be much clearer terms when you buy these things, so that you know whether those objects are... and even using the word "safe" is hard, because you don't want to scare people either. I think it's partly about finding the language we should use to talk about the security vulnerabilities in the objects we use.

Audience: Hi Sarah, great talk again, thanks. What agencies do you see being able to deliver the sort of things you're talking about: driving standards for safety, or for the consumer experience?

Sarah: I think consumer advocacy organisations are still super relevant, but they need updating to be brought into the internet age, I suppose. Groups like Which?, Consumer Reports and Choice are still super relevant as trusted organisations that can verify certain vulnerabilities. But there's also something to be said for crowdsourcing some of this information, because there are so many different versions of hardware and software that we're all running. Is there an opportunity to have more decentralised tests running on objects, with the results fed back? I'm quite interested in the RSPB's Big Garden Birdwatch, where so many people report back on the birds they've seen in their garden, and whether there's something similar in running tests on the devices in your home. Is there something there we can start to encourage? There are so many things now to test and to look at, so how could we have more decentralised testing too? But I still think consumer rights organisations play a really important role in this puzzle. They just need to become far more current, I think. OK, thank you.