It's my honor now to introduce the Special Rapporteur on Privacy from the United Nations, Professor Joseph Cannataci. Welcome, sir. As Professor Sudin has indicated, he's just released a groundbreaking report on these issues. The report covers a wide range of thematic issues that are of relevance today, including the use of personal data by corporations, privacy and health data, privacy and personality, privacy as an essential foundation of democracy, children and privacy, and gender equality and the right to privacy. As Rachel was describing in some of the recent developments, you may know that last week there was confirmation that a company called Clearview AI has in fact scraped probably your photograph and mine from the web, and put it all into a facial recognition database. And they're selling it. They said only to law enforcement agencies, but according to this report that Alison alerted me to, it's also being sold, or touted at least, or being tested by private companies. So this affects each of us personally, not just the folks outside in this very neighborhood; we are all equally vulnerable. So we're in this together, and I think our response needs to have a collective approach, a joined-up approach. In South Africa that means our constitutional framework, but globally it means the United Nations human rights framework. And so, Professor, most welcome to address us this morning. Thank you. Thank you very much for those kind introductions. It's such a pleasure to be here. I suspect this is one of those rare occasions where I'll start off with a conclusion, because I'd like to emphasize the importance of the series which is being launched here today. But more than anything, we have a collective responsibility to get everybody out there, especially the policy makers, thank you very much, to read not just one book out of the series.
Read them all, and if possible, read all of those which are yet to be published. Because the message I need to communicate today is that it all fits together: you can't just take health data, you can't just take police and surveillance data, you need to take every single kind of data. And I'm so pleased to be here and to see people working together. But I'm short of time and I have so much to say about the subject, so I'm going to crack on, as it were. Now, when Rachel reached out to invite me, she probably wasn't totally aware of how much of an AI freak I really am, right? So, just to walk you through why I'm both worried and a bit relaxed about what Rachel had to say: yes, there's a lot of talk, but boy, have we been talking about this for a long time, right? And, you know, I'm young, but I've been younger, and I'll prove it in a minute. My first book about privacy was published 33 years ago, and I've been working in AI for 36 years. My second book was actually about liability and responsibility for expert systems, and that was published in 1988. And we haven't made much progress as a society since then, right? And if you look at some of my earlier works, these are from '89 and '90, we were talking about medical expert systems, because that's the area where I am today and where I was then. It's moved in terms of applications and the speed of some of the processes we have. But in terms of ethics and law, it hasn't moved much, right? And this is where Rachel is right and we have to play catch-up. Because when we think about AI, what are we talking about? Does it look like this, right? Does it look like that, as in the Campaign to Stop Killer Robots, or does it look like this? What is AI? What are we talking about here? Why should we be concerned about it? And should we pay attention to Stephen Hawking, right? You know, who warns that AI could end mankind as we know it?
Is Google's plan really to take over, you know, you name it, right? Where are the elements of truth in this and how do we counter this properly? Because as I was saying earlier this morning on South African national radio, the problem, especially in South Africa, is not that the constitutional vision is not there. Oh boy, is it there. It's a great vision. The only problem is, if this were rugby, South Africa would not be playing in the leagues it currently plays in. And this is where we have to move it from, right? Because we have to challenge statements like this. Rachel mentioned big data, right? But is this true, that in the world of big data, privacy invasion is the business model? And I'll be looking at a few headlines as we go along to see whether these things are true and what we can do together, globally and with South Africa, to take this forward. I'm aware that these are the kinds of headlines that South Africans like to read, right? So South Africa beat Australia in Johannesburg, yes. Now, I need South Africa to beat Australia at other things too, not just rugby, right? Let me give you a few examples. Leaked: one in ten Australians' medical records exposed. Is that really the way you want to go? Do you really want to have one of your government agencies, the Ministry of Health, publish 10% of your citizens' medical data on the internet? By that I mean 30 years' worth of medical data for each individual. Every single visit to the gynecologist, every single visit to the STD clinic, every single prescription bought from the pharmacy, right? All put on the internet, because motherhood and apple pie have no competition when it comes to the gospel of big data and especially open data. Now, as far as I know, in South Africa you also have surgeries in constituencies, constituency surgeries for politicians, right?
Now, hands up anybody who's seen a line of constituents snaking around the block asking their MP, oh, dear sir, please put my data online. Have you seen that? I haven't. But I suspect you know of a few corporations which have been lobbying for that data to be put online, and then we know that, as in most other things, what we have to do is follow the money, right? Because if we look at this case in Australia, from two years ago, this was horrifying. Is this where you want to go? That's my colleague at the University of Melbourne, who, within six weeks of the data being uploaded to the internet (and, mind you, downloaded 1,500 times as a data set), exposed it. And while I was on my flight to Cape Town, the news was that she had to quit. Because instead of telling her thank you, Australian society pushed her out. Is this, and I have to raise this in a research context, what you're going to do to Rachel and the rest of the team, because they were conscientious enough to raise these matters as part of partnerships within South African society? My job as UN Special Rapporteur is to speak truth to power. But please consider the extent to which this is done by your colleagues in the room. And the right way to handle this is not to say, oh dear, this is awkward, this company finances my speeches, or this company is a donor, right? So I invite you to join the national and international effort to cut the crap, right? We must stop beating around the bush. We must look carefully at the risks that we're trying to manage, because, after all, this is a risk management exercise. I didn't put my screen up there for you to see my ugly mug, right? But simply because, if there are prizes to be won for the most end-user-unfriendly website on the planet, the UN probably wins the game.
But if you get onto it, and you go to the right-hand side, you're going to find some of the reports that Rachel and the two professors who introduced me were talking about. There's not one report, there are many reports, right? And if I had to walk you through the table of contents of just the one on big data and open data, and I don't have time for anything else, you're going to see the problems being examined in some detail. The reports are there: that's March 2019, that's 2018, 2017. I present two reports a year, sometimes more. Generally I make a report to the UN Human Rights Council in March of every year; that's in Geneva. I've just flown down from Geneva, where I presented the report on privacy and gender. And then I present an annual report to the UN General Assembly in New York; that's normally in October of every year. And the reports that I've just presented, I'm not calling them reports anymore in that sense. They are reports, but they're actually recommendations. I don't think it's very useful just to say, oh, there's a problem. I think that one of the things my job requires me to do, with the help of people like Rachel and yourselves, is to say: and these are some of the options for solutions, right? So when it came to health data, for example, and I've just picked one example here, I put together a task force on the subject, and we invited more than 50 people from all over the world. After an online consultation we went to an in-person consultation which was hosted by the Council of Europe in Strasbourg. And after that we had 954 different proposals of changes to the draft text that we prepared, right? So this was two and a half years of work in the making, and it's there for you to use. So when your ministry for health comes along and says, what on earth am I going to do about personal data? What am I going to do about AI and health? What am I going to do about Fitbits?
Basically, to a certain extent, we've worked it all out. Because while it's great to have a great constitution like South Africa has, the devil is in the detail, and no constitution on the planet can provide all the detailed guidance that you need in any given sector, which is why it's so good to see that the series being produced by the policy network here in South Africa is taking a sectoral approach. In this it has taken the same approach that we had taken at the Council of Europe, where I chaired the Committee of Experts on Data Protection and where we asked, where do we have problems? Which is basically wherever we had data. So we have recommendations on insurance data, social security data, on medical data and other kinds of data, which is why I'm so pleased to see the information regulator in the room, because Lord knows they have a massive task, right? I mean, does anybody, and by anybody I mean not just the people in the room, have one single good reason why, in 2020, we are still talking about when South Africa will bring into force the act that it brought forward in 2013, right? I suspect you would agree with me that if South Africa played rugby that way, the score would be nil, zero. So why do we accept that when it comes to law and data protection, right? South Africa's RICA, its 2002 act on the regulation of interception of communications, brought in after 9/11, was progressive, was it? Only when you compare it to the Brits, who were, well, way back in the Paleolithic period, right? But why would you compare yourself with the Brits? I mean, you fought two wars with them, right? You keep thrashing them at rugby, and you've... I mean, so what in the hell do you look at the Brits for? Well, you should do now, huh? Right? You should do now, because they've learned a lesson. Yes, all right, they had to put me on the front page of their papers first. And what was the politest thing that I said in 2015 about the Brits?
Yes, that the oversight of surveillance was a bad joke at the citizen's expense. But in the meantime, what have they done? They've taken a hint from the South Africans, and it wasn't just the coach and the assistant coach in the rugby. Even the South Africans have put judges into the procedure. But what have they done now? They've put in a properly independent regulator. They don't have one judge, they have 15 judges. That's five full-time judges, as opposed to South Africa's part-time judge involved in RICA. MI5 is one of the most powerful intelligence agencies on the planet. Yet it has been told publicly: uh-uh, you go and fix your data, because I'm not going to give you one single interception warrant more. On 22 November 2019, I was a happy bunny, because I saw the report from my friends at IPCO saying, MI5 has resolved the problem, we can take them out of special procedures, right? And that's when you're going to see me in Joburg here, cheering on South Africa, when it beats the British and puts its standard of oversight of surveillance at least one notch above theirs. But you ain't there yet. You're way off, right? And I want to see that happening. And I'm here informally this time. But just in case, you know, if anybody's in government, tell them I'm very well disposed to engaging with them formally, too. We know how to do it, so why not do it? Right? Now, I'm not going to go on at length on the protection of health data. Why not? Because you're all perfectly intelligent folks. You can go to the website, sort through it, and find the text. The latest, 5 December 2019: there are more than 43 pages of recommendations. There are another 40 pages and more of Explanatory Memorandum. If you don't get it right, you really can't blame anybody except yourselves. And if the website is so dumb that you can't get at it, you've got to remember how original my parents were.
There's only one Joe Cannataci on the planet. Google me and we'll send you a copy, okay? Why? Because over there, as you can see, if I look through the table of contents, you can see health-related data and automated decision-making, AI, health-related algorithms and big data, right? It's not rocket science. And yet it's worse than rocket science; in certain areas it's actually much more difficult to get right than rocket science. I have about seven minutes left because I started a bit late. So I am going to zip through some things here and not go to gender, because I'd like us to spend the last seven or eight minutes reflecting on why this is important, right? Now you know what's important, right? But... And I'm going to give you a few examples here, right? So, in surveillance, right? This website screenshot comes from 10 years ago. This family picture taken at Interpol, and I'm looking at myself 10 years younger there, was taken 10 years ago. We've spent more than 10 years working on smart surveillance. It's not new. We've come out with three model laws for Europe and one very thick set of guidelines for government-led surveillance. Do we really have to translate it into Afrikaans for anybody in the South African government to read it? I think not. Because this is something which I drew 10 years ago and which I still have to show to people, right? Because the load of data that you have on the left is still being matched, shared and analyzed by law enforcement and intelligence services in the middle, and it still has a number of applications. But where are the safeguards? When I was training as a software engineer so many years ago (and, since I couldn't decide what to do, I also did law), one of the first things I learned as a lawyer was that the client only cares about one thing first, and that's remedies. The client has a problem. And then, if you want to avoid problems, you put in safeguards.
So please help me out with the safeguards and remedies in South African law. That's what we need, just two words: safeguards and remedies, right? Because, you know, Mimzi was actually one of my students from Botswana, but here MIMSI stands for massively integrated multiple sensor installation. That's where South Africa is going, right? How many sensors can you put in, right? Because it's when we link the video from CCTV to audio from microphones in the ground, and these are all real pictures, right, and connect them all up in one or more control rooms, that's when we get MIMSI, with geo-location built in, right? Then we look at what Edward Snowden revealed, after whose revelations my mandate was created at the UN, and you start seeing reports produced by intelligence services from various countries. And this is why you don't want... Incidentally, just in case you haven't noticed, the guy with the orange face on the right is not me. I don't want to take credit for that. But it's important that we do something concrete about it, right? Because contrary to what Vint Cerf said, and you probably recognize Vint as one of the fathers of the internet... Of course he's wrong, right? I mean, Vint is a very intelligent person, but even intelligent people sometimes say dumb things, right? And this is particularly dumb: privacy may actually be an anomaly. Privacy is a construct of the modern industrial age. In the past, his thinking goes, people lived in small, self-contained villages where pretty much everyone knew who was dating the baker's daughter or what the sheriff had for lunch. It is only when populations started migrating en masse to cities that anonymity emerged as a byproduct of urbanization. Now, clearly, Vint can only say that because he's spent too much time on computers and not enough time studying human beings, right? Had Vint come with me to Australia and looked at the Aboriginal way of life, had he looked at indigenous peoples in South Africa,
he would never say that, because they have been developing their own sense of privacy for the past 40,000 years or more, right? But we don't have time to go into that today. You can look at some of the stuff that I've produced on time, place and space, and some of the stuff we've published with UNESCO, because that would help flesh that out. Let me just finish by reminding you how fast we have gone over the past few years, right? Firstly, look at the date on this. That is 2015. One billion people used Facebook on Monday was a headline in 2015, right? We've more than doubled that today. We're moving... We had already, indeed, doubled that. If you look at the date of this one, this was 2017. And Facebook, obviously, wants to treble that, right? The point is not about Facebook, but that people are using it on their mobiles. And when we start looking at the dynamics of what we have going on here, right, it was still news that Snapchat passes Twitter, but then you see other media look at the video side of life, right? Snapchat closes in on Facebook as it hits 6 billion daily video views. Follow the money, ladies and gentlemen, right? Because Snapchat not only caught up with Facebook, it passed it: went to 8 billion, went to 10 billion, went to 20, and so on and so forth. And this was in months, ladies and gentlemen. Because this is where we get into the golden age of surveillance. Does anybody in the room not have a smartphone? I see no hands going up, right? So, QED, this is why we have to talk about it. Because I love music, but this kind of shows my age. Remember: every breath you take, every move you make, I'll be watching you, right? But ladies and gentlemen, if I had to pick up my guitar, that's where we would be, right? I don't sing. Because let me remind you where we've gone to, right? 1968, Solzhenitsyn says this, no mobile phones in sight.
As every man... I'm going to read this, especially for the benefit of the people at the back of the room, who, like myself, might be a bit challenged on the visual side. As every man goes through life, he fills in a number of forms for the record, each containing a number of questions. There are thus hundreds of little threads radiating from every man, millions of threads in all. If these threads were suddenly to become visible, the whole sky would look like a spider's web. And if they materialized like rubber bands, buses and trams and even people would lose the ability to move, and the wind would be unable to carry torn-up newspapers or autumn leaves along the streets of the city. If only I could write like that. But smart cities make Solzhenitsyn's concerns look minuscule, right? Because it's not only the forms that we fill in, it's the electronic tracks that we leave everywhere, and most of us are never conscious of them. How many thousands of electronic tracks did you leave between you this morning as you came here? Just think about it, right? Because the problem with us is, and this is where the French are better than the Brits, they actually have the word surveillable, which until recently was not in the Oxford dictionary, right? And that's exactly what we've become: surveillable. Because if you look at this mousepad, which says, always online, right? At home, we're online. Outside, we're online. And if I'm permitted a small Christian joke in a more Islamic quarter, right, I understand, you know, everybody's online, right? And, you know, I read this on the Internet, so it must be true, right? And I love this cartoon, right? Because we all leave a digital fingerprint. Once it's online, it's virtually impossible to scrub out, and the data on you will follow you around for the rest of your life. And that's only a tiny part of what we're trying to deal with here, right? And we have to...
At least 10 times, and I was counting, Rachel said, we have to interrogate this question, we have to ask this, right? So we have to ask ourselves: are smart cities worth the risk, right? Because that's what politicians are trying to sell us on behalf of the companies, right? And I have to remind you... And I'm going to invite you to look at this very crude diagram, right? Privacy, freedom of expression and freedom of information are what I call the three information-related fundamental human rights. But above all, they serve to support an overarching right. And that's the right of everybody in this room, and our kids and grandkids, to the free and unhindered development of personality, right? And we made history at the UN in March 2017, and I wanted to conclude by reminding you of this, because for the first time we have, in a Human Rights Council resolution, recognition of the right to privacy also as an enabling right: not only as a standalone right, but as an enabling right to the free development of personality, and in this regard, noting with concern that any violation of the right to privacy might affect other human rights. I'd like you to think about this in relation to the AI that Rachel was talking about. 1983 was when I was starting my doctoral dissertation. And Paul Sieghart had just written this, and once again I'll read it: in a society where modern information technology is developing fast, and please apply this to AI, many others may be able to find out how we act. And that in turn may reduce our freedom to act as we please, because once others discover how we act, they may think that it is in their interest, or in the interest of society, or even in our own interest, to dissuade us, discourage us, or even stop us from doing what we want to do, and seek to manipulate us to do what they want us to do. 1983, ladies and gentlemen. How far ahead of Cambridge Analytica is that?
And now you will permit me to read out something that I wrote at more or less the same time, and after 35 years I can't bring myself to disagree with myself, and that's unusual. Shorn of the cloak of privacy that protects him, an individual becomes transparent and therefore manipulable. A manipulable individual is at the mercy of those who control the information held about him, and his freedom, which is often relative at best, shrinks in direct proportion to the extent and the nature of the options and alternatives which are left open to him by those who control the information. And that, ladies and gentlemen, I suggest, is what the discussion should also be about. What information flows are we encouraging, permitting and facilitating in society, and where does AI fit in that context, right? Because the quality of life is defined by certain values, including the free development of personality that I spoke about, which is why I'll be cheering if I ever see a South African law which says that informational self-determination should be a design criterion for smart cities, right? It comes out of the Constitution, but are you going to leave it a dead letter, right? This morning I reminded listeners on your radio that you've got wonderful sections in the Constitution about diversity and sexuality, LGBTQ, etc. But in practice in South Africa, is that a reality? Or do you still have hate speech? Do you still have discrimination against people on the LGBTQI spectrum, right? And that is the transition we might have to make: from the vision to the mechanisms which will get us there. So, smart everything, except us, is part of our final message, right? Because what we need to do, and this is going to be my final slide, is go back to the social contract. Because where we are today, I suggest to you, ladies and gentlemen, is that we're seeing people rewrite the social contract, right? I'll just stop here.
There are many more slides and I'm very happy to send you the complete package if you'd like to have a copy, but I'd like us to get on with the rest of the day because it promises to be a very interesting one. Thank you very much, ladies and gentlemen.