Hello, this is Gerrit Leonhardt. Welcome to another edition of Meeting of the Minds. Today I have with me from Sydney, Australia, Ross Dawson: futurist, author, strategist, and a good friend. We are discovering that governments know far more about us than we ever imagined, down to the intimate details of our lives, knowing more about us than we know about ourselves.

So what are the implications of that, do you think, Gerrit?

I tend to compare the internet and digital data to nuclear power. We have tremendous power at our fingertips to find information. On the other side, if there's a big emergency, we don't really know how to end it or how to deal with it; it just burns out, basically. We have no way of really stopping the disaster. And data is very much the same way with privacy. We've become naked as a consequence of the internet, and we enjoy being naked for the most part, because people can find us, for example, on LinkedIn, and we get a benefit. We've made a Faustian deal, a deal with the devil, you could say, except that we always knew the deal existed. So I don't think we should be surprised at how much the government can find out about us; that all existed before. The bad thing is that they weren't transparent about it. They're giving us no recourse, they're giving us no control, and they're not apologizing. So it's not surprising that this exists, that people are using our data for all kinds of purposes. And this is only the tip of the iceberg. We're at 1% of the uptake of this, and we're going to go to 1,000% in the next five years, because there's just a lot more data coming: location data, privacy data, health data, media use, payments going digital. So it's not surprising.
I think it's crucial that we put things in place like recourse and transparency requirements, all the things we already have, for example, for money: we have recourse, we can complain, we can do things. But we don't have that for data. So that needs to happen urgently.

David Brin, in his book The Transparent Society, said that we have two possible futures. In every future we can imagine, individuals will be totally visible to governments and corporations. But the two choices, the two possibilities, are that individuals and citizens can look back into the institutions of government and corporations, or that we can't, and it's just one-way visibility. So what we're seeing is the extent of the information acquired. For some time I've been saying that the reality is, if we step outside our front door, we are entirely seen. Now we're finding that even if we speak on the phone, or we communicate, or we're watching anything, all of that is known about us. And the challenge, I think, is that people sometimes say that if you're not doing anything wrong, you've got nothing to worry about. That assumes a government which respects us and which is trying to do the right thing by its citizens. I don't think we can be confident that all the governments in the future, who will inherit the legacy of all this data, will be ones we can trust with that data.

It comes down to trust, and trust is based on a few things. For example, trust is based on a mutual conversation: if you're always dominating me, I'm not going to trust you, however much you might want me to, right? If you don't give me recourse and I can't complain, I'm not going to trust you. So all these things, I mean, these institutions have to earn trust and they have to keep trust. Otherwise, when we move our trust, they should go away.
The challenge with governments, though, is that we can vote political parties in and out, yet ultimately the mechanisms of government remain. So the Democratic government in the United States has inherited all the data of the Republicans, and whoever goes into government next will inherit all the data it has. So while we can trust or not trust our political parties or our governments, the machinery still doesn't go away. I think that's the challenge we have: we can vote political parties in and out, but that doesn't necessarily change anything. Not trusting our government doesn't necessarily help us.

Let's put it this way. I think the current scenario, here in 2013, is that this machine has taken over. It's a self-running, data-eating machine, and we're in it, because it's all so new: social media, mobile devices, location services. So what needs to happen is that the machine has to be brought back under control, to a paradigm that helps us rather than runs us, and what it's doing now is running us. My view is that if this setup, of our data going out there and doing all these things without any sort of recourse or control for the user, continues, then we're going to be run by this engine, essentially. All of our work, the quality of what we do, will be measured and used against us, or for us, for that matter, depending how you look at it. So what needs to happen is basically the empowerment of the user, and the companies in this system, like Google, Facebook, and many others, need to be the ones who make the effort of creating these new seat belts and airbags, essentially, for the use of that data, not outsiders. So the most prudent thing for Google, for example with Google Glass, would be to come up with a framework for how it's being used, and why and why not, rather than waiting for somebody from the outside to protest against them.
The issue is what we can do as individuals, and I think partly, as individuals and organizations, we need to ensure that we can see back into the institutions and create those mechanisms: creating transparent government, creating mechanisms to see what is going on in institutions. And I also think that collectively we need to make our voices heard, and hopefully over time that will shape how governments and institutions use our data.

There are two things I think we can do about this privacy problem. One is we stop the Faustian pact: we stop using the places where they are doing this to us. So I mean...

That includes telephones.

Hard to do with a telephone, but easy to do with search engines. Or not easy, but possible, right? The other thing is, I'm dead certain we're going to start paying for privacy. So rather than my data being used to fuel advertising, like it does on Facebook, let me do my own Facebook and pay for not having advertising. Just like I'm paying for Apple TV or Snap Films or whatever: I'm paying money for the movies because I don't want to see the commercials. I want to be left alone. So we're going to start paying for privacy. We're going to have privacy banks, digital identity protection, and we're going to start paying for this, because if we don't pay, then we are the content. There's an old saying, from just last year: if you don't pay, you are the content. And there's an alternative. I mean, that's the deal that we've made. We are the content of Google, of YouTube, of Facebook, and we don't pay. So if we reverse this, we can ask: if we pay, are we in a safer place? And we should be.

The infomediary idea dates back to 1997: independent institutions that we entrust with our data and that manage our data in ways that are beneficial to us. Yet still, a long time later, the promise of the infomediary is yet to be seen.
And I think the reality is that, unfortunately, most people are going to choose not to pay for privacy. Certainly I think there's potential to do that, but it's going to be a minority of people who choose to pay for privacy. And the question remains that there are still limits to the privacy we can gain, even if we pay for it, because there are so many different channels through which information about us gets out.

Yeah, of course, I think this is a really tough issue that's going to be a minefield for a lot of companies who want to progress in this direction, because as long as these issues aren't solved, I'm not going to give up cash, for example. If I give up cash, then you know everything I've done, everywhere I've popped in, you know, which is crazy. This is why people are refusing to get rid of the cash bridge toll, where they have to pay with actual money, because otherwise they can be tracked anywhere using the radio device that's there for the bridge toll. So as long as I'm not certain that I can be left alone, I'm not going to participate. I'm not going to have my health data in the cloud. I'm not going to pay the bridge toll with the... I mean, I would, but a lot of people won't. So I think this is something for the companies that invent this kind of stuff, like IBM and others, or Intel, with the Intel television that reads my face: they work on the tech, but they don't work on the social consequences. And this is a huge mistake, I think. If you look at companies like Google and others, they are inventing stuff a million miles a minute, but they don't invent how we're going to respond to it. My view is that they should spend an equal amount of time inventing how we're going to realize the good in it rather than the slavery of it.

So this concludes today's episode of Meeting of the Minds. Thanks very much to Ross Dawson for being part of this today.
If you want to know more about the show, you can go to meetingoftheminds.tv. We are also taking questions and input for the next show: just use the Twitter hashtag #meetingoftheminds, and we'll respond and try to work your comments into our next show. Thanks very much for joining us.