Okay, everybody. We go on with the session. The next speaker is Jean-François Guerrard, and he will tell us more about emotional firewalls: protecting users with unbiased personal assistants. Jean-François.

Hi, sound is good? No mixing with other sessions or anything? All right, so my name is Jean-François. Most people call me John because it's kinda difficult to pronounce my name properly, and that's fine by me. I work for an organization called the IO Foundation. We essentially work on digital rights, and that is roughly human rights applied to new technologies. Most of what we'll be discussing today in this session is not that technical, okay? And it might not be the typical topic you are confronted with on a daily basis. That's gonna come with some challenges and some friction. Let's see if we can actually get past that, because the role that you play in how all of this is gonna be managed in the future is way bigger than you normally think, okay?

So first, I would like to do a small test on the crowd. Can programmers stand up for a second? Those who are active programmers. Let's not be shy. I work with the development community, so we actually do these kinds of ice breakers and whatnot. Okay. Those who are not concerned with how their data is used by other companies, sit down. Those who don't care: they have my data, they can sell me any product, I don't care. All right, I'm gonna assume those who remain do care. Of those who are standing up, who is familiar with the concept of human rights, just the concept? Those who are not, sit down. Okay, you heard about it, right? Who of you has ever heard about the UDHR, the Universal Declaration of Human Rights? Those who haven't, sit down. Okay. And which one of you has actually read the UDHR? Those who haven't, sit down. Okay, it's two more than what I expected. And, oh, sorry, three, three, all right? And that's kind of the elephant in the room. You can sit down. Yes, yes, sir. Thank you, thank you. Round of applause for that, actually. All right, thanks.

Programmers are essential in how technology infrastructure is built, all right? A lot of people are complaining about big companies using our data, collecting data, linking, profiling, whatnot, and then in the same breath they realize, hey, you know, I just realized that with this new app I'm building I can monetize my customers' data. They don't seem to be linking the two things, okay? You have a lot of influence on that. You're going to have decisions to make on how you architect your next applications so they respect your data and your customers' data. Now, I'm going to try to bring a few pieces together, and through the conversation you will see that they actually start converging on the topic of today, all right? It may not sound very logical, but there's a method to my madness.

A few warnings, though. I'm going to ask you to keep an open mind for the whole thing. Like I said, these are going to be topics that are not usually on the table in your daily lives. English is not my mother tongue, and even though I'm kind of fluent, sometimes I get stuck, so bear with me. I'm also suffering a bit of jet lag from the trip over and I'm trying to deal with it, so again, a bit of patience. All right.

What do we essentially do when we are interacting with devices? Most of the time, we are trying to obtain information, right?
How did we start obtaining information, back in the days before we had devices? We had two systems. We first started going peer to peer. We just went to the caveman next door and asked him, hey, do you know where I can find deer? I'm hungry, I need to go and hunt. And they would tell you, this is the best zone in the prairie to go and hunt. Then someone figured out, hey, if we put all the information together in one place, then people don't have to go asking around. We can actually centralize all the knowledge that we have as a village, as a community, and have it there as a reference. So came libraries. Now, the interesting part is that libraries were designed with one specific objective: impact. They were designed in a way that you, as a person, could go there and obtain as much information as possible, as fast as possible, and the information that you actually wanted. Okay?

Now, moving forward, turns out that there's this cool thing called search engines. We get into the digital era, we have access to all of that, and we essentially have libraries that are digitalized. And search engines go all around, they start crawling, they start indexing information, and they have their own algorithms to index that information. Now, the big difference is that those algorithms are not designed for impact. They are designed to make money. And that's a big shift. And it's a shift that most of the time has been, let's say, very opaque for us. It hasn't been communicated to users in a very upfront and honest manner.

How did we start interacting with websites? Well, we basically went to a website, or rather to a search engine, and we started figuring out keywords. Okay? We needed to find ways to pinpoint exactly the information that we wanted and try to refine all of that until we found the result that we thought we were looking for. Has anyone in the last years ever gone beyond, say, Google, for instance? I'm not gonna assume most of you use Google. Has anyone gone past page number five of any search result? Okay, at least someone tries. All right? So imagine how, all of a sudden, if you are not in the first five pages, essentially you don't exist. Most people don't even go to page number three, and I would bet some money here that most of us don't even go past page number one. How you are reducing the view of the world, and how you are skewing it, has a lot of dire consequences. And how the algorithm decides what you want to see, or what they think you want to see, or what they want to present to you, changes a lot how your perception of things is going to be. All right?

Now enter chatbots. Has anyone here been developing chatbots lately? Okay. Has anyone really thought about what a chatbot is? I mean, I'm a person who doesn't really like hype words, and I don't like marketing and whatnot. The way I see it, a chatbot is a guided search engine. Okay? The same way you were looking for keywords before, all of a sudden you've got a piece of software that is providing you those keywords to search. You are typing a certain term and then the chatbot asks you, were you referring to this, or were you referring to that? It is essentially guiding you towards a much more targeted result, which is cool.
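To make that "guided search engine" idea a bit more concrete, here is a minimal sketch of a chatbot loop that narrows a keyword query with one clarifying question and then returns only a couple of results. The catalogue, the categories and the function names are all invented for illustration; this is not how any particular chatbot is built.

```python
# Minimal sketch of a "guided search engine": instead of dumping pages of
# results, the bot asks a clarifying question and returns a short list.
# The catalogue and categories below are made up for illustration.

CATALOGUE = {
    ("potatoes", "grocery"): ["Grocery downstairs, 200 m", "Night market, 1.2 km"],
    ("potatoes", "recipe"): ["Classic potato stew recipe", "Quick mashed potatoes"],
}

def clarify(term: str) -> str:
    """Ask the user which meaning of the term they intended."""
    options = sorted({category for (word, category) in CATALOGUE if word == term})
    if len(options) <= 1:
        return options[0] if options else ""
    answer = input(f"Were you referring to {' or '.join(options)}? ")
    return answer.strip().lower()

def guided_search(term: str) -> list[str]:
    category = clarify(term)
    # Only a handful of "optimal" results are returned; everything else
    # effectively does not exist for the user, which is the point being made.
    return CATALOGUE.get((term, category), [])[:2]

if __name__ == "__main__":
    for hit in guided_search("potatoes"):
        print(hit)
```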
I was last year at this legal tech hackathon in KL; I live in KL. And it was interesting because they were trying to bring technology into law, and they were providing a number of tracks, and people were basically developing a number of solutions, and around 80% of the solutions presented by all the teams were chatbots. And of course the question came: why? Well, because when we interact with a chatbot, we feel as if we're interacting with someone else. It feels much more natural. It engages a number of emotions that make us feel much more related to the product. Has anyone here tried this Uncle Bus chatbot? It's funny as hell. Those who haven't, just look for it and have a talk with him. It really interacts with you in a way that looks very natural, using natural language expressions. And it's engaging. Okay? So it's way less cold (and I'm choosing my terms very much on purpose) than a typical website.

Now, for a few years we have had all of these companies, especially Amazon, Google, lately Samsung, Apple with Siri, trying to push these personal assistants into our lives. Okay? We kind of got accustomed to it. It's interesting because the way we are interacting with those personal assistants is even more potentially engaging. Because, done well, it feels like we're talking with a buddy. Now, the only difference is that that buddy is not really your friend. Okay?

So, remember what I was mentioning before about the number of pages that you get for a query, and how you don't go beyond five pages. If you ask, for instance, Siri, where can I buy potatoes for my stew tonight, do we really expect Siri to say, hold on a minute, John, I've got five pages of results for you, 50 hits each, let me read them one by one? You don't want that. You're expecting a much more fluid conversation telling you there are two options, one of them the grocery downstairs and another one somewhere else. Okay? An optimal result for you. Something that really is what you're looking for, affordable, and available for you to resolve in five, ten minutes, because nobody has free time apparently. But when you are talking about these companies who are pushing these personal assistants, which model are they following? Are they following the old library model of impact? Or are they following a business model trying to sell you shit? Pardon my French. The second one. And here comes the question. Can I be sure that the information they're providing me is the one that I really want and really need, or just the one that they want to push onto me? It's a very simple question. It's just not the one that they want us to talk about. So, again, please don't do this to me.

Let's go back a little bit to how this plays out. By the looks of what I'm saying, it would sound like I don't like data. I actually like data, okay? Data is good. We need data to be able to make informed decisions, in order to make plans, even if sometimes that data can be very sensitive. The problem is never to have a data set; the point is to make it so that we cannot attach a name and a face to that data set, so that the information cannot be used against someone. That's the basic principle of data protection. The typical example would be: say I'm a government, I would like to know how many people have HIV in my region. Okay? Because I need to know what my contingency plans are going to be, how many antivirals I'm gonna have to buy for the next season and whatnot. Now, what I don't need to know is exactly who has it. Because otherwise I can actually use that information against those people. In the case of Malaysia, for instance, if you are a civil servant and you are known to have HIV, you can be kicked out.
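Here is a minimal sketch of that data protection principle, and of the "linkability" problem that comes up next: aggregate counts are useful on their own, while records that keep quasi-identifiers can be joined against another data set and re-identified. All the names, records and fields below are invented for illustration; they are not real data.

```python
# Made-up illustration: aggregate statistics vs. linkable "anonymous" records.
from collections import Counter

# What a health ministry actually needs: counts per region, no identities.
clinic_records = [
    {"region": "Selangor", "diagnosis": "HIV"},
    {"region": "Selangor", "diagnosis": "HIV"},
    {"region": "Penang", "diagnosis": "HIV"},
]
cases_per_region = Counter(r["region"] for r in clinic_records)
print(cases_per_region)  # Counter({'Selangor': 2, 'Penang': 1})

# The danger: "anonymous" records that keep quasi-identifiers
# (birth date + postcode) can be joined against another data set.
anonymous_lab_results = [
    {"birth_date": "1985-03-12", "postcode": "46000", "diagnosis": "HIV"},
]
voter_roll = [
    {"name": "A. Example", "birth_date": "1985-03-12", "postcode": "46000"},
]

for lab in anonymous_lab_results:
    for person in voter_roll:
        if (lab["birth_date"], lab["postcode"]) == (person["birth_date"], person["postcode"]):
            # Linkability in action: the "anonymous" record now has a name.
            print(person["name"], "->", lab["diagnosis"])
```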
Now, I don't think this comes as any surprise to any of you. Any interaction that we do online is essentially creating digital breadcrumbs about who we are, okay? You browse, you buy, you date, you go watch the biggest repository of nothing. And essentially all of that is collecting a lot of information about you. Now, companies love to do what's called linkability, which is to figure out ways, as smart as possible, to connect all of those different, separate databases and start creating profiles. Profiles that say a lot about you, not only your personal features, but most importantly how you think, how you feel and what your preferences are, okay? We tend to disregard all of that, but essentially what they want to know is what your potential emotions are, which buttons activate those emotions, and how to push those buttons. And if you think that emotions are not so important, think right now about how you're feeling about this talk, whether it's engaging you or whether it's boring you; that's an emotion at play. And if I knew exactly how to talk to each one of you in the right way to engage you, bingo. I'm selling myself here, all right? In any given interaction that we have, emotions are always going to be at play. That's exactly how we are defined as people. There's no escaping that. And we have to be very aware of how that plays out with the technology that we're using, and how it is being used against us to follow a certain agenda, whether it's commercial or political. Cambridge Analytica, anybody?

Okay, and that's just the beginning. I would guess most of you knew that something like Cambridge Analytica, even though we didn't know the name, existed already. I don't think anybody who was more or less following the trend of things went, oh my God, there were some people trying to use data against us to change our political views. The problem is, Cambridge Analytica is not even the problem. They were in the spotlight because it got a lot of media attention and whatnot, but there are bigger companies who have much more information, not even only public information as these guys used, and we all know the names of those companies. They know exactly what we don't want them to know.

And here's the funny part. Let's go back a bit to the old days of marketing, which I tend to demonize. If I was a guy who wanted to sell vacuum cleaners, I first needed to run a marketing poll, have a look at some market research, try to figure out the demographic where I could sell these particular vacuum cleaners, and try to manufacture the right amount to make a profit and not lose money on the manufacturing. That was a lot of money and a lot of effort. Now we are providing them not only with the information, but actually also with the channel to reach out to us. And we are just buying them. There was this guy who gave a TED Talk, kind of a funny talk, about him pretending to follow one of those email scams, one of those Nigerian prince ones and whatnot. He was just pretending and replying, and it's really, really funny. I recommend you have a look at it. And he said something very, very interesting. We got onto the internet and we thought, oh, we have access to all of this. Fantastic.
But we didn't realize that now all of that has access to us, and we were not quantifying that before. So here we are now with personal assistants that have been pushed onto us, that talk to us directly, that we think are our buddies, and when they provide us with solutions and with answers, are we expecting that they are providing the ones that we want, or the ones that they want? And so who is actually going to be in the results? Essentially, two types of answers: one, those who have a specific agenda to be there, and two, those who can pay to be there. Because when your world is reduced to this very small set of options, and because it's convenient, we actually just stay there. Who is gonna go and double check? Unfortunately, ever since we were kids, no one at school was telling us, you know, you should double check your sources. Most of us don't go through journalism training, for instance, and try to be very accurate and double check things. That's not really what happens.

And think about how that affects the freedoms that I was mentioning before, in the UDHR, in human rights. All of a sudden, data is, you know, I'm giving away my data. Well, data belongs to you. And one of the principles in human rights is that you have the right to your possessions and to retain them. Technology is not really providing us right now with a way to make sure that, if I request that a company delete my data, the data is actually deleted. What we have, if you've ever heard about GDPR, for instance, yes, okay, what we have is much more akin to: here's a brick, please make sure that the brick follows the specifications, build your building, and if it collapses, here are the laws that I'm gonna use to sue you. But nobody ever goes and checks the building itself. Okay, GDPR is essentially a law that allows you to sue a company if, after you request deletion of your data, you later discover that they did not delete it. But the request doesn't come attached with a team of specialists flying all the way from Brussels to whichever server anywhere in the world to make sure the data is actually deleted. Never mind how many times it has been copied. Never mind how many times it has been sold to other companies. There's no such thing.

And here comes the paradox. When it comes to technology, you guys know very well, you get to decide how the software works. So if you wanted to provide a technical reassurance for that, you could. It's just that the industry is not really supporting that. I see your faces of doubt. I'm not saying it's easy or hard, that's another story, okay? But it's a matter of will. And the moment you perceive there's a gain, not necessarily an economic one, there's always a way to do it. And if the transition requires five years, then you go for five years. GDPR, in fact, was approved in 2016 and they gave two years for implementation, until 2018. So, well, again.

Now, how does that come together with what I was mentioning before? Out of this concern, what we identified in our organization was that the big thing we need is a way to protect people when it comes to personal assistants, which are, in themselves, a good idea. I mean, it's very nice. If you think about it, a personal assistant is sort of the Pokemon evolution of what we wanted 10 years ago when we started with devices. We bought this very cool computer, put it at home, and thought, I want this computer to help me, I want it to do things for me.
And to assist me, and to make me gain time, and to provide me with very cool services and whatnot. And I'm gonna install this web server, and I'm gonna be providing, you know, I'm gonna install Mastodon and whatever. Cool, all right? And it would be even easier if you could actually remove this very clunky interface, which is a screen and a keyboard. That in itself is a pretty slow interface. You don't really communicate with your devices at the speed that you would like. I mean, here comes Elon Musk wanting to connect our heads with Neuralink. Now, we see the personal assistant as the potential device that's gonna do these kinds of things. But do you really want to surrender all of your capacity for rationality to those devices?

So we believe that the only solution for that is supporting open source, unbiased, non-business-biased personal assistants, such as SUSI.AI. And actually the big companies (no one is recording this, right?) are scared shitless about these kinds of models appearing. The way this could work is that you would have a base of people using these kinds of personal assistants, open source, peer reviewed, you name it, you know all the advantages already, and they would use gossip protocols, by which you would be obtaining information in a distributed manner. And you would be the judge of how the information actually reaches you. You can imagine that's not a business model that really interests big companies. And so they are doing all they can to push their alternatives into our houses. And honestly, who wants to have Alexa listening 24/7 at home? I personally don't feel like it.

If you guys want an idea of the kind of unintended consequences this can have: I guess some of you are familiar with football and the Spanish La Liga, okay? I'm from Spain. About two or three months ago, it turned out that the official application from La Liga that people were installing on their phones, and I'm not even talking about personal assistants here, just an app that you were installing, was activating the microphone of their phones during match times to try to figure out which bars, which venues, were broadcasting the matches without paying for them. And of course, no one knew about that. It came out as a big scandal. And of course, people started uninstalling it and whatnot. But that is without you even knowing. On top of that, you go and buy a product and have it, knowingly, 24 hours a day in your living room, recording almost everything you do, with no control whatsoever over those recordings. Yeah, you can subpoena them and you can send a request for, yeah, yeah, yeah, whatever. Let's see how long it takes, and let's see how much information they actually give you, because they might not be providing you all the metadata, for instance. They can provide you with the raw data, but likely not the metadata, and specifically not what they have learned about you. When Facebook came out with this, can I say bullshit? I said it already, sorry, with this bullshit tool where you can download all the information they have about you, what you can download is the raw data. Yeah, sure, but not everything they know about you. That, they don't share. The kind of preferences they think you have, the kind of salary you may have, the people you might be related to. None of that. What really matters, that they are not telling you.
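Coming back to the gossip-protocol idea mentioned a moment ago, here is a minimal sketch of how that kind of peer-to-peer exchange could look. It is not the design of SUSI.AI or of any particular assistant; the peer names, the fanout parameter and the example "fact" are all invented for illustration.

```python
# Minimal sketch of a gossip round: each peer keeps a local store of facts
# and periodically shares it with a few random peers, so information spreads
# without any central index deciding what you get to see.
import random

class Peer:
    def __init__(self, name: str):
        self.name = name
        self.facts: set[str] = set()

    def gossip_with(self, other: "Peer") -> None:
        # After a gossip exchange, both peers know the union of their facts.
        merged = self.facts | other.facts
        self.facts, other.facts = set(merged), set(merged)

def run_gossip(peers: list[Peer], rounds: int, fanout: int = 2) -> None:
    for _ in range(rounds):
        for peer in peers:
            partners = random.sample([p for p in peers if p is not peer], fanout)
            for other in partners:
                peer.gossip_with(other)

if __name__ == "__main__":
    peers = [Peer(f"peer-{i}") for i in range(10)]
    peers[0].facts.add("grocery downstairs sells potatoes")
    run_gossip(peers, rounds=3)
    knowing = sum("grocery downstairs sells potatoes" in p.facts for p in peers)
    print(knowing, "of", len(peers), "peers know the fact")
```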
So, summing up, and almost done, yeah, almost time for questions. When it comes to personal assistants, we, our organization, don't think that the commercial ones are gonna be any good. I'm not saying that they are totally evil. Most of the time even the programmers who are involved are just not aware of the things that they're doing. They are not thinking long term, they're not thinking about the consequences. Part of the work that we do at the IO Foundation is specifically to train programmers in human rights and digital rights, so that they get to have a say and a voice in the projects that they develop. But it's very, very important that we support open source movements, by which we know exactly what the code is, or at least the code is exposed so peer review can be done. And we can try to avoid as many biases as possible. It always comes down to this question: where is the line between an informed decision and an imposed decision? And it's a very difficult answer to be had. I personally have an opinion, but I don't have an actual answer to that. It's up to you guys to think about what part of the responsibility you have, and to try to collaborate in these kinds of projects and maybe try to do some good. Thank you.

Jean-François, we have time for a question. Oh yeah. Probably the discussion will be continued tonight at the pub crawl, I assume. Ken? Questions?

So one of the things I was thinking about as you were talking: I think one of the core ideas you're expressing is that we should be cognizant of the potential for these apps and applications to manipulate us in ways we wouldn't be comfortable with. And I was thinking that all interaction with the external world could be construed as manipulation to some extent. And insofar as when I buy into an ecosystem I agree to be pushed and prodded in certain ways, why should I care about the way Google does it or the way Facebook does it? I still have free agency, and I have the responsibility to be aware of the way that they're doing these things. So, you know, why should I remove myself from that service?

So there are two elements to the answer I would give you. One, not everyone has access to the same education that you do, or to the same opportunities to actually grow into that education. And if you think about it, let's go a little bit away from coding and from tech. All education is basically a sort of indoctrination. So if you don't provide the right tools for people to be able to think, to exercise logic and to double check facts, then whatever information you have given them from birth, that's their vision of reality. They don't go beyond that, if you haven't been sparking questions, sparking interest, curiosity, okay? And, oh yeah, thanks. And the second part I wanted to tell you, I just forgot. Oh yeah, the relation of cost. As I was mentioning before, if I want to push a certain idea onto you, okay, how effective I'm going to be is very much related to how much money I have in my bank to actually push for the idea, and how much it costs me to actually implement those things. Think about the example that I gave before about selling the vacuum cleaners, okay? I don't have a precise enough picture of you, and specifically I don't have a direct channel to you. So I need to maximize my impact by trying to talk to a variety of people with a much broader language, okay?
But all of a sudden, now I have technology that allows me to talk to you, to the very specific you, because I can identify that you have enough things in common, and I can create those ads on the fly with software. I don't even need a graphic designer anymore, and I don't even need to produce the product beforehand. I can print it out in 3D for you directly. So the cost for me to push anything that I want is ridiculous. The danger of many more attempts to manipulate you and to push things that you may not really need, without looking at the final consequences, is just way bigger nowadays because of the cost. And of course you need to ask yourself what kind of world you want to live in. That's... okay, fair enough, fair enough.

So, really good talk, thank you. Personal assistants are interesting, but if you look a few years later, when we move to augmented reality, we go from searching data, asking a question, to an entire reality being pushed at us. And for the device to do a good job, it has to know who you are and push entire layers of reality. So this issue is not going away; it's going to get much more important when you actually control what people see. Now, on a slightly more positive note, there's another way; there is open source, as you mentioned. But if you think of a product like Tinder, right? In Tinder you pay for it, and they make money if they are successful in matching you with somebody you like. Because then you're going to tell all your friends and they get more customers. So the quality of their product needs to increase for them to be successful. If Tinder was a completely free product, yes, I would agree with you. But same thing with personal assistants: if you pay for it, then you're going to try to find the best possible one, and that one will be motivated to find what's best for you, not what's best for someone with an agenda. Of course, what they do with the data is a different discussion.

So first: who is the owner of the data? Companies believe that they are the owners; we don't believe that's the case. Oh sure. One, two? Yeah. We don't believe that's always the case? Actually, never. And second, yes, the conversation, or the talk, was about personal assistants, but no, that's not the endgame. The endgame is actually to train programmers. I was trying not to be too heavy on that, but the fact is that all of these services you are mentioning, including augmented reality, are going to be coded by programmers. We identify them as the next generation of human rights defenders, because they're going to be at the forefront of developing the apps that we are so concerned about. There's no escaping that.

Any more questions? I think we are a bit over time now. Probably not, okay, it's working. My question is basically about what is called regulatory arbitrage, because basically, for most of what you're proposing, in order to make any regulatory headway you need kind of a global approach, because all of these companies are basically global companies amassing data from all over the world. So there are two things to be considered. One, the software industry is one of the least regulated ever, and when I say least regulated, I'm not aware, anyone can correct me, of any technical regulation.
As in: software should behave respecting these particular rights, these particular services, under this particular code, and we, as bodies of governance, endorse this particular thing, so if you want to release software in our jurisdiction, you're going to have to be able to reassure us that these things are done in this way, using these APIs or whatever. This might sound very crazy, but think about what happens with this building. For this building to be accepted, it goes through a number of regulations, and we all assume they are there to provide us the safety of being able to walk into the building knowing it's not going to crumble onto us. For some reason, software still hasn't fallen under that logic, and I can just craft any software that I want, put it out there, collect all the data that I want, and, well, come and find me, yeah? I don't believe so. If you restrict yourself to the specifics of protecting the rights of users, and I'm not talking about curtailing businesses, but the thing is, if you have businesses whose business model is based on breaking human rights, where is actually the problem? That I'm not giving them business, or that they're breaking human rights? I mean, let's have the conversation there. And when it comes to business itself, just so you know, there is this initiative from the UN called the Guiding Principles on Business and Human Rights, which has been pushed in the past years specifically to try to provide this kind of approach for organizations. Thank you. Thank you very much. So, for us, one more.