Many people don't appreciate the importance of their privacy until it's too late. People will start to say, hang on, you're not allowed to know that stuff about me. Or if you do, that's fine, but I want it back now.

We're often not even conscious of what we're giving away. A really simple example: terms and conditions. Of course, they're written in a way that you don't read them, but you just accept those things because you want the benefit of the app. I get, in return, an amazing search service. I get an amazing music and book discovery service, a great shopping site. People can't even deal with navigating saying no to the app, and just accept that they have to give away their geolocation. They say, oh, I've got nothing to hide. You know, I didn't commit a crime. I didn't do anything wrong. Convenience often trumps trust. We all have things that we want to keep private, legitimately. We tend not to value privacy until it's too late.

Though we may easily take it for granted, the fact remains that many of the companies offering us free services derive their revenue from our data by way of targeted advertising: all those online ads tailor-made for our unique likes, dislikes and curiosities, insights accumulated over time on the basis of how we behave online. As more and more cases emerge of this sometimes deeply personal information being collected and shared without our consent, pressure is mounting for governments to step in. But when so much of what we do is conducted and tracked online, efforts to restore privacy will require greater thought and imagination than simply tightening regulation. And fundamental to finding solutions will be answering the question: to whom does this data actually belong?

Our data should belong to us. I'm a huge proponent that your data is your data. That's where I start from.
Companies conveniently, in the absence of any organized system to say otherwise, say that they own the data; they collect it because you use this app or this device or this platform. People will start thinking, well, hang on a minute. What am I worth to you? And is that worth what you're giving me in return?

The data, which is really a technical way of saying how you lead your life: where you go, who you talk to, who you email with, what you browse on the web, where you shop, what you do moment by moment by moment. That's you. And you have to own that in the first instance. What scares me is not so much the immediate use of that data; it's all the secondary uses of that data, and what happens to it, where we have no control. And just because the world is becoming more technologically sophisticated in surveilling every step of our life doesn't mean we somehow lose ownership of that very personal day-to-day aspect of who we are and what we do. And also how the companies can extract all the value from that data. Again, that's completely invisible to us. It is very, very important. At some point, people are going to start recognizing their own value, and they're going to demand transparency. I don't see anybody playing a real leadership role in fighting back, because the use of our data is so commercially lucrative.

EU lawmakers, however, are making a concerted effort to fight back, which could spell trouble for the online advertising model in Europe and perhaps elsewhere. Set to take effect in late May, the General Data Protection Regulation is a new law intended to give European citizens greater control over their data, and it imposes significant penalties on companies that fail to comply. Some experts believe similar laws will be passed in the United States, where, according to a survey conducted by HarrisX, as many as 83% of those polled believe technology companies need tougher regulations.
But government intervention is just one piece of this complex puzzle. How might we want to reconsider our current concept of consent?

To say that you've consented to give up your privacy because you've chosen to use a particular app, or because you've used your mobile phone, fails to reflect the way the modern world operates today. You can't live without your mobile phone. The problem with believing that you've got nothing to hide is that, you know, everybody has things that they want to keep private. It's really understanding what you think the trade-off is. What is it? Is it lack of control? Is it surveillance into your life? Where is the fear really coming from? You know, it could be a new relationship you're starting. It could be a health problem that you have. That starts to impact the decisions that you make as a consumer.

One of the things I think is really interesting is how design can play a role in this. How could design introduce just a little bit of friction? Before you accept the terms and conditions without reading them, it asks: are you sure? We need a new concept of consent. It can't be implied from use. It needs to be much more explicit. If you give us this, we can give you that. And suddenly you get transparency about data usage. Something I've really tried to introduce in my own life is just that moment of pause, so that I'm conscious of what I'm giving away and can ask myself whether the benefit is really worth it.

Ultimately, the solution is going to be a combination of self-policing by the companies and then government enforcement: government that sets at least basic rules. It is a bit of the fox guarding the henhouse, because governments are incredibly intrusive. It's not a great situation. Is there anything we as individuals can do to regain just a degree of privacy?
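The explicit consent model the speakers describe, where nothing is implied from use, each data purpose is granted separately, and the trade-off is stated before the user decides, can be sketched in a few lines of Python. This is only an illustration of the idea; the purpose names and messages here are hypothetical, not from the film:

```python
# A minimal sketch of explicit, per-purpose consent: each use of the
# data is an individual opt-in, the trade-off is spelled out up front
# ("if you give us this, we can give you that"), and any missing or
# ambiguous answer defaults to "no". Purpose names are illustrative.

PURPOSES = {
    "geolocation": "If you share your location, we can show nearby results.",
    "browsing_history": "If you share your history, we can personalize ads.",
}

def request_consent(purpose: str, answers: dict) -> bool:
    """Return True only if the user explicitly opted in to this purpose."""
    if purpose not in PURPOSES:
        raise KeyError(f"unknown purpose: {purpose}")
    # Consent cannot be implied from use: an absent answer means "no".
    return answers.get(purpose) is True

# A user who agreed only to geolocation:
answers = {"geolocation": True}
print(request_consent("geolocation", answers))      # explicit yes
print(request_consent("browsing_history", answers)) # never asked, so no
```

The design choice worth noting is the default: friction is added by making "no" the silent outcome, so a service gets access only to what a user has affirmatively granted, purpose by purpose.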
There are a number of best practices, such as switching to web browsers and search engines that don't collect or share user information with advertisers. But for those of us not quite sure where to begin, Mozilla's Kathleen Berger advises trying out their data detox kit, an eight-day program designed in partnership with the non-profit Tactical Tech and intended to help users reduce their online footprint.

Advertising is not per se a bad thing, and neither is tracking. The much greater concern in the current ecosystem is the lack of consent. So the data detox kit, for example, is an easy eight-day guide that helps you understand what's going on. It helps you find the privacy settings and controls. So basically, anyone who has the lingering feeling that they might have agreed to too many terms of service without actually reading them, or opened too many online accounts, here's a tool that might help you reset.

But Ken Roth thinks it's unfair to put the onus on the individual. We're never going to be as technologically savvy as the tech companies. And Kathleen Berger agrees. Data detoxing is not going to solve the whole problem. Ethical design, corporate responsibility and appropriate regulation all have a role to play.

So what happens next? Is change on the horizon? Or must we radically redefine how we understand privacy in the digital age?

I do think it's a paradox. You can have this apathy around what you give away and yet get so angry and be so concerned about sort of the macro issue of privacy. So we need officials to be paranoid for us. And if governments drop the ball, we need the media and civil society organizations to sort of step in and insist that somebody start protecting our privacy and our data, because right now it's not happening. Humanity fundamentally is a lot stronger than digital media, and will start to reassert itself over it and will contextualize itself accordingly.