So let me ask you a quick question: how many people here have already felt their phone vibrating in their pocket, taken it out, and found nothing on it? Yeah, yeah. This is something called phantom vibration. It's when you think your phone vibrated when in fact it didn't. Do you know another word for that? Hallucination. Now, I don't know about you, but there are very few things that usually make people hallucinate, and it's pretty freaky that we've gotten to a point where we're so conditioned by technology that we're basically imagining things that don't exist.

The reason for this is pretty simple: using technology today is extremely complicated. You have to make the effort to learn how technology works, and each time you do, you're forcing your brain to create coping mechanisms. The issue is that this is already happening with just a smartphone and a computer. So try to imagine what it will look like when there are a hundred billion connected devices. Clearly, there is no way we can keep interacting with technology the way we do today. I believe that artificial intelligence, which is growing extremely fast at the moment, is a great way to solve that problem, because if you can start communicating with machines the same way you communicate with humans, using language, you no longer need to learn how they work. It becomes so intuitive, so easy, that eventually technology completely disappears into the background. It becomes like electricity: something that's all around you and that you don't pay attention to. Nobody came on stage thinking, oh my god, there is light. Nobody cares. Neither should you care about connected devices.

Now, you've probably all heard about this object. Have you? Who has one? Okay. Well, you can burn it. Voice assistants are becoming the new big consumer category. In fact, they're growing faster than smartphones did: 4x year-on-year growth. It's insane. And the way those voice assistants work is always the same: there is some kind of action and then some kind of parameter. If you're asking for the weather in London, the action is the weather and the parameter, the location, is London. But as soon as you ask something a little more complicated, say, "find me an Italian near my Airbnb in London," you realize all of a sudden that human language is extremely ambiguous, because there isn't a single meaning for a sentence: the syntax and the grammar are not enough to tell you what's being asked. In this case, finding an Italian could mean finding a restaurant, but it could also mean finding a person who is Italian. Now, try to imagine asking your voice assistant to find an Italian and actually getting an Italian person coming to your room and saying, hi, I'm the Italian you're looking for. I mean, it could happen, but most likely this is not what you meant. The second ambiguity is that "my Airbnb in London" is not a location. It's a reference to something that only makes sense within your own context. So how do you resolve that? The answer is by giving more data to your voice assistant. Effectively, the way you disambiguate is by feeding in data from other sources so that your voice assistant can understand what you mean: your calendar, your location data, your text messages, your emails, anything that represents your life. By doing so, you can recontextualize a query.
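To make that concrete, here is a minimal sketch of the idea in Python. Every name, class, and field in it is hypothetical, invented for illustration; this is not any real assistant's API:

```python
# A minimal sketch of the "action + parameter" pattern behind voice
# assistants. All names, classes, and fields are hypothetical.
from dataclasses import dataclass

@dataclass
class Intent:
    action: str   # what the assistant should do
    slots: dict   # the parameters of that action

# "What's the weather in London?" has one obvious parse:
weather = Intent(action="get_weather", slots={"location": "London"})

# "Find me an Italian near my Airbnb in London" has at least two:
candidates = [
    Intent("find_restaurant", {"cuisine": "italian", "near": "my Airbnb in London"}),
    Intent("find_person", {"nationality": "italian", "near": "my Airbnb in London"}),
]

# "my Airbnb in London" is a reference, not an address: resolving it means
# cross-checking a personal data source such as your email.
def resolve_reference(ref: str, emails: list) -> str:
    for mail in emails:
        if "airbnb" in mail["sender"] and "London" in mail["body"]:
            return mail["booking_address"]   # hypothetical field
    return ref   # unresolved: the assistant would have to ask a follow-up

emails = [{"sender": "automated@airbnb.com",
           "body": "Your booking in London is confirmed for tonight",
           "booking_address": "22 Example Street, London"}]

print(resolve_reference("my Airbnb in London", emails))
# -> 22 Example Street, London
```

The point is that neither parse can be ruled out by grammar alone; only the personal data tips the balance.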
The assistant can see that you've got some emails from Airbnb, that some of them mention London, that the dates correspond to today, and that therefore this is the address you actually meant. By giving personal data to your voice assistant, you make it understand things it could not understand from language alone.

What's the issue with this, however? Can you tell me? Privacy. Because we're talking about giving access to your entire life to a computer that is most likely running on a server somewhere in the United States. And most people, when you talk about privacy, will tell you: well, I've got nothing to hide. How many people here actually think they have nothing to hide? Okay, great. Do you want to do an experiment? Yeah? Okay, everybody take out your smartphone. Everybody, do it. Take out your smartphone. Unlock it. I'm not going to hack into it, don't worry. And open your last conversation. Any conversation, it doesn't matter; the content is irrelevant. It could be WhatsApp, Telegram, anything you want. Are we good? Okay. Now I want you to give your phone to your neighbor. Go on, do it. No? Don't be afraid. Come on, give your phone to your neighbor. You've got nothing to hide anyway, right? Do you see that feeling of anxiety about handing your unlocked phone to your neighbor? And by the way, I never told you to look at your neighbor's conversation, but you all did. Because as humans, it's normal: you're curious, you like to snoop into other people's lives, and whenever you're given the opportunity, you will look. In this case, your neighbor was next to you, so if you started looking at their pictures, they could slap you. But every time you say yes to a company's terms and conditions, you're basically handing over your phone. It's exactly the same thing, except you're giving it to someone you know nothing about, you have no idea what they do with it, and you have no control over what happens next.

It turns out, though, that privacy has nothing to do with hiding things; I think most people here are pretty decent. Rather, the way I like to think about it is as a puzzle. In the beginning, you can't really tell what's in the picture, but every time you fit a piece into the puzzle, the image becomes clearer and clearer, until eventually you can see everything. That's exactly what happens with your digital profile. Every time you give a piece of data to a company, you make your profile more accurate, and they end up knowing more and more about you, which means they can target you better and better.

And there is one statistic I saw in a research paper that got me really worried. It showed that the more active you are on social media, the less diverse the news sources you read. Most people think, well, that's normal, because people tend to stick to one specific news outlet. I have a different explanation. I think what's really happening is that the algorithms are learning, every time you click something, read something, like something, that this is what you're likely to spend the most time on. And therefore the algorithms are optimizing for reinforcing your own bubble.
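To see how little machinery that feedback loop needs, here is a toy sketch in Python of a recommender that greedily serves whatever has the highest observed click-through rate. The user model and all numbers are invented for illustration; no real platform's code looks like this, but the optimization pressure is the same:

```python
# Toy sketch: pure engagement optimization collapses a feed onto one topic.
# Everything here (topics, click probabilities) is hypothetical.
import random

random.seed(0)  # reproducible run

topics = ["tech", "art", "business", "sports", "politics"]
shows  = {t: 2 for t in topics}   # times each topic was shown so far
clicks = {t: 1 for t in topics}   # clicks observed per topic so far

# Simulated user: clicks tech 90% of the time, everything else far less.
p_click = {"tech": 0.9, "art": 0.4, "business": 0.3,
           "sports": 0.2, "politics": 0.2}

feed = []
for _ in range(200):
    # Greedy choice: serve the topic with the best click-through rate so far.
    topic = max(topics, key=lambda t: clicks[t] / shows[t])
    shows[topic] += 1
    if random.random() < p_click[topic]:
        clicks[topic] += 1
    feed.append(topic)

print("first 10 items:", feed[:10])
print("last 10 items: ", feed[-10:])
print("overall share of 'tech':", feed.count("tech") / len(feed))
```

Within a handful of rounds, the feed collapses onto the single topic the simulated user clicks most; nothing in the code ever decides to narrow your world, it just falls out of maximizing clicks.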
The reason you see less and less diverse content on social media the more you use it is that the algorithms are zooming in on what makes you feel most comfortable. Because, let's be honest, diversity doesn't feel good at first; it's so much more comfortable to look only at things that agree with you. For me, the most obvious moment when this happens is during elections. All of a sudden, all my Facebook friends seem to be voting like I do. Where is everybody else? It's not that they disappeared, and it's not that my friends all vote like me. It's that the algorithm knows this is what I want to see and is reinforcing my bubble.

The issue, however, is that innovation comes from diversity. We know for a fact that the most successful teams in companies are also the most diverse. And it's very easy to understand. Imagine that technology is the color red: someone working in technology is red, a piece of content about technology is red. And this is you. If everything you read is about technology, you're going to end up being red yourself, because that's the only color you've ever seen. Your entire way of thinking, your entire framework, your whole world will be a red-only world: a monochrome world, because to you nothing else exists. Now imagine that instead of only reading about technology and hanging out with tech people, you have one piece of content about technology, and then you add art, which is green: a piece of content about art. Not only do you now have access to red and to green, you also have access to something that is neither red nor green but the combination of the two, which is yellow. All of a sudden, you've invented a new color by combining what you learned from technology with what you learned from art. And that is literally the definition of innovation. The more colors you have around you, the more you can play within that spectrum. If you now add business, say business is blue, you've got an entire rainbow you can draw on for your thinking. What do you think is more powerful: reading three books about artificial intelligence, which, by the way, is what I would be recommended if I bought a book online, or reading one book about AI, one about art, and one about business? Which do you think would make you learn more and be more successful in the long term?

And this is a very, very scary prospect, because if you don't care about privacy, you're going to keep reinforcing your existing bubble. You're going to be trapped in the world you already live in, unable to invent all those other colors, which is precisely what you need to be able to do. So think about this next time you buy a book: if they push a recommendation at you, try to go for what they have not recommended, and read that instead. Privacy, for me, has nothing to do with hiding things. It has to do with innovation. And innovation is something we desperately need today.

So how do you go about privacy? Everybody talks about it, and it's great to say we want it, but it turns out privacy is something we can all achieve quite easily. There is a concept called privacy by design. The idea of privacy by design is that you guarantee privacy in the very conception of your product and technology. One thing you can do is create products that enable people to opt out of targeting and profiling. The Twitter app is a great example.
When you go to the Twitter settings, you can opt out of all personalization and look at your feed in the order tweets were actually posted. You take out Twitter's interpretation of what you want to see and see the raw data as it comes. This gives you a chance to read things you would never have come across if the algorithm had zoomed in on the fact that you only want to read about artificial intelligence or cryptography or whatever.

You can also spend less time online. Some products don't give you an opt-out, but given that the more time you spend online, the better targeted you are, spending less time online leaves the algorithms a little more confused. Apple, for example, now has a feature in iOS that lets you set limits on how much time you spend in different applications. Or you could do what I do, which is to randomly click on a bunch of stuff to confuse the algorithm. One of the great things about working in AI is that you also know how to trick AI, and I can guarantee you that the best way to trick an AI is to literally click on random things. It will never be able to target you that way.

And then, finally, you can use end-to-end encryption. In particular, there's a type of technology you're going to hear a lot about in the coming years, because it's starting to work: homomorphic encryption. In a nutshell, it enables you to compute on encrypted data. Rather than sending your actual data, you send an encrypted version of it, and the cloud server doesn't have the key to decrypt it. Yet somehow it's still able to run some kind of algorithm and send back a result that is itself encrypted, so the cloud doesn't know what you sent it, doesn't know what it's giving back, and you can decrypt the result locally on your device. So think about it: if we've got all this technology, why aren't companies using it? There are only two reasons. Either they don't know how to do it, in which case you might not want to trust them with your data in the first place, or they're doing something with your data that they don't want you to know about.

And finally, regulation. I know regulation is a bad word for many people, but it's actually very good. People criticized the GDPR initially, but now everybody's realizing how badly we needed it. And guess what? Nobody shut down their business because of it. Three things in particular are interesting here. The right to be forgotten: you can tell a company, forget who I am, I want to disappear from your database. The right to data portability: a very big deal, because when you can take your data with you to another company and then ask the first company to forget about you, you have a lot more power and leverage over how companies use your data. And finally, the right to non-profiling: in some cases, when there is an impact on your life, you can ask a company to let you use its service without profiling. I think this should be generalized. Every service out there should be available without profiling. We should be able to use Facebook, Twitter, Amazon, our voice assistants, without our data being used to target us and trap us in our own bubbles.
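To make the homomorphic-encryption idea from a moment ago concrete, here is a from-scratch toy of the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a server can add your numbers without ever being able to read them. The parameters below are deliberately tiny and insecure; this is a sketch of the principle, not something to deploy:

```python
# Toy Paillier cryptosystem: additively homomorphic encryption.
# Deliberately tiny, insecure parameters -- for illustration only.
from math import gcd
import random

p, q = 1789, 1867                              # toy primes; real keys use 2048+ bit primes
n = p * q
n2 = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1), the private key
mu = pow(lam, -1, n)                           # valid because we use g = n + 1 (Python 3.8+)

def encrypt(m: int) -> int:
    """Encrypt m under the public key (n, g = n + 1)."""
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Decrypt c with the private key (lam, mu)."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# The client encrypts; the server adds ciphertexts without ever holding the key.
a, b = 12, 30
c_sum = (encrypt(a) * encrypt(b)) % n2   # multiplying ciphertexts adds the plaintexts
assert decrypt(c_sum) == a + b
print(decrypt(c_sum))                    # -> 42
```

Production systems use far larger keys, and the fully homomorphic schemes that allow arbitrary computation are lattice-based rather than Paillier-style, but the workflow is exactly the one described above: encrypt locally, compute in the cloud, decrypt locally.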
And the reason I'm talking about all this is that I was often told the future would look something like this. This is a scene from The Fifth Element. It's kind of oppressive, kind of scary: heavy policing everywhere, advertising nudging you all the time. Right? Come on, who wants to live there? And this is the world we're heading towards if we let companies keep using data the way they've been using it, because they're going to manipulate us to the point where we no longer have a choice; we'll just do whatever a few companies tell us to do. But if we start thinking about privacy and artificial intelligence as enabling a completely new way of interacting with technology, then maybe we could live in a world that looks like this. Right? And this is fundamentally what everybody wants. Everybody wants to feel peaceful, to feel safe, to feel free. To me, artificial intelligence and privacy are necessary for this. They're necessary for people to be able to think about new things, to invent new ideas, to solve problems we're not able to solve today, so that eventually we can all feel like we're at the beach all the time. Thank you.