talk about the course of the latest in technology with our panel of experts. Let me very quickly run through them in no particular order. Right next to me, Bob Livingston, Senior VP Strategic Projects at Visa. Next to him, Rob Leslie from Sedicii Innovations. Next to him, Dr. Rita Singh, who's from Carnegie Mellon, an associate research professor there. And last but not least, right at the very end, David Brady, who heads Aqueti. Right, Aqueti, thank you, which is a Chinese-American smart camera maker. So folks, we've got about 45 minutes or so to spend with you. A little bit shorter than a lot of other sessions I've done at WEF. And what we'll do is we'll kick it off and talk among ourselves, and then at some stage we'll open it up to the floor and bring you folks in. And just to point out housekeeping, this is being taped and recorded, and I think will be shot out over the web. So if you do have a communications device, probably a good idea to either turn it off or, if you really have to stay connected, to put it on silent just for the duration of our session here. And we've got these simultaneous translation jobbies if you wish or need to use them, so we encourage you to do that. So let me kick off. I want to start talking about the technology itself if I could. And a quick show of hands: right when you get to the top of the escalator, you turn right and there's this voice recognition demonstrator. How many of you have tried that? OK, not bad. All right, yeah. I noticed the last couple of days that the lines have been really, really long. So I thought, OK, this must be really popular, I've got to check this out and try it for myself, so I did. And the lady just in front of me tried it out. And she read a sample of prepared text which got recorded. And I was pretty surprised, because initially it accurately predicted her gender, OK, maybe not that hard. Her ethnicity, she was Caucasian. Her age range.
So I thought, OK, well, this is pretty good stuff. And then it went on, though, to tell the entire line of people waiting behind her that she had serious neurological issues and that she was mentally exhausted as well. At which point, I think, the rest of the line just sort of turned around and went the other way. So no, that's a funny example. But I guess it speaks to the limitations of the technology right now. Maybe I could just sort of run down the row, or whoever prefers to answer. At this stage, we're pretty much well beyond fingerprints and retina; we're getting into DNA biometrics, voice recognition as well. And Dr. Singh had talked to us about that. What is the cutting edge right now in biometric technology? Anybody? I think there's a lot of development work going on around facial recognition. Specifically the use cases, payments would be one. I'm involved in the Forum's initiative for the future of travel. So the modernization of airports, trying to get people through queues in airports, at borders, at security, so that you can move faster through those choke points. And again, trying to identify you as accurately and as quickly as possible is one of the key objectives. So I understand. And gosh, I tell you, if you fly a lot, you would probably want to see that technology work faster, or better, more efficiently. But what kind of technology is it right now? I could take a swing at that. I mean, the number one thing is some of the technology Dr. Singh is using around neural networks. Neural networks are blind to the actual modality. So the major innovation has been that it's possible to combine multiple modes, visual, voice, gait, many things together, and they can be combined in an integrated way. And usually identification is done in context. So before saying somebody was mentally ill, you might guess whether they're in a hospital or not.
But the ability to combine a lot of different markers together, in context, would make biometrics extremely accurate. How about DNA biometrics right now? Well, with DNA, you have to go to a lab. It would be extremely accurate, but it would take a day or so. Yeah. There are biometrics such as blood flow biometrics that are also fairly accurate. And so at Visa, we've got a prototype where you swipe your hand through a reader and it just measures the blood flow, and that identifies who you are. We also have payment rings that are near field communication devices that can pay at any sort of contactless point of sale. And so you can imagine a future where you've got a ring measuring your blood flow and nobody else can ever use it. It's yours. How extensively used is it, and where, right now? Well, right now it's just in prototype mode, but you asked about cutting edge, and so that's where it is. Okay, fair enough. It's in beta. I mean, your space, Bob, payments, financial services, et cetera, that's been the industry that's been driving a lot of the development and change in biometrics. But is it likely to stay that way? Other industries that could take the ball and run with it, do you think? Well, I think payments is a very logical place for biometrics to start. It's an industry that really values security and also values convenience. And biometrics actually provides both of those in a very good way that consumers understand. Here in China, for example, we've got the most advanced market in the world, where biometrics are used every single day when consumers are using their mobile payment devices, with WeChat or Alipay or the other platforms, identifying themselves at the point of sale with their thumbprint.
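The multimodal fusion described above, combining face, voice, gait and context into one decision, can be sketched as simple score-level fusion. This is a toy illustration only; real systems learn the combination jointly with neural networks, and every score and weight below is invented.

```python
# Toy score-level fusion across biometric modalities (illustrative only).
# Real systems learn these weights; all numbers here are invented.

def fuse_scores(scores, weights):
    """Combine per-modality match scores (0..1) into one confidence."""
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total

# Hypothetical match scores for one identification attempt.
scores = {"face": 0.91, "voice": 0.78, "gait": 0.66}

# Context can re-weight modalities (e.g. trust voice less in a noisy hall).
quiet = {"face": 1.0, "voice": 1.0, "gait": 0.5}
noisy = {"face": 1.0, "voice": 0.2, "gait": 0.5}

print(round(fuse_scores(scores, quiet), 3))  # → 0.808
print(round(fuse_scores(scores, noisy), 3))  # → 0.821
```

The point of the context-dependent weights is the panel's hospital example: the same raw signals yield different conclusions once the setting is taken into account.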
But to your question about where it could be used elsewhere, this is really where biometrics, I think, converges with the internet of things, and where you can have any sort of device, any sort of object with a biometric reader, with a fingerprint reader, that then becomes a point of commerce at the same time. We've, again in prototypes, we've got people who've got plastic cards with fingerprint readers on them. And so that's a traditional form factor, but you can imagine a situation where vending machines would have fingerprint readers, and then the real questions of privacy come into play. Where is the data stored? When is it centralized? When is it decentralized? When do you have actual payment and identity credentials on the actual device? Or when do you have something that's masked or tokenized so that it can't be stolen? So a key question, and I don't mean to get the backs up of our hosts here in China, but what you're talking about is a massive database in China. How secure is that? Who has access to it? Well, I think that's a question better posed to Ant Financial or to Tencent, which run Alipay and WeChat Pay. But as of right now, it has been fairly secure as a payment vehicle. Where that data exists beyond the actual process of payments, I can't say. Rita, let me bring you in. The voice recognition demonstrator I was just joking about a couple of minutes ago, you're actually behind that. Is the issue here that this is AI driven, right? So the more voice samples it has or collects, the more accurate it can be. Was that the issue with the whole, you know, neurological disorders and mental exhaustion thing? Okay, so, maybe it was right, I don't know. Okay, may I correct you on one point? It is AI driven, but it is not data driven entirely. So AI is being used to engineer the features that are then mapped to the parameters that we deduce from voice.
So much of the mapping and the accuracy of it depends on how well we are able to do the engineering to discover these micro features of voice. So the way people perceive AI now is just big data, throw big data at neural networks, and that's AI. That's not it. Neural networks are just tools, but AI is really more than that. It's what you design with those tools, right? So the profiling, the technology that is being demonstrated next door, is called profiling humans from their voice. And it's all about deducing all kinds of information about you from your voice, for which there are signatures in your voice, whether you're able to hear them or not. It's going to take a while to develop that to a point where it's completely accurate. And what you see there is a snapshot of where the research is right now. But it is really scary to think that your voice can reveal so much about you, about yourself, your persona, and also your environment. So there are issues associated with that, ethical issues. But more than that, I also have something to say about biometrics and the way we think about it, not just the voice profiling. So we're talking about facial recognition, we're talking about other kinds of biometrics, fingerprint recognition. Behind all of that are algorithms that have been deployed. And those algorithms have shortcomings, right? And as an academic, I'm aware that there are colleagues of mine and many groups of researchers around the world who are working on building adversarial systems that can attack these algorithms and cause them to give wrong results. Even if they don't have access to the innards of the system, even if they don't have access to the algorithms themselves, they can actually break into the Google Speech Recognizer, and the examples are up on the web.
This is a group in Berkeley that has done this: they can break into the Google Speech Recognizer and make it say exactly what they want it to say. So at least on this question, the existing technology that's being deployed today, how secure, or rather how hackable, is it in itself? That's an issue, right? Its accuracy is an issue, but they are two sides of a coin. Right. And we have to worry about that. And there are other issues involved, like ethical issues, which everyone is questioning and everyone is aware of at this point. So the short answer to my question: how fragile or secure are these systems to hacking these days? I mean, what's being used, is it safe? Yeah? No. No. No, no. Facial recognition systems can be hacked. If you go on to Google and search for papers that are coming up with newer and newer algorithms to hack deployed facial recognition systems, you will find very good algorithms up there. So David, that's your space. You make smart cameras, right? And she's saying, look, these digital images can be hacked. True? Oh, definitely they could be hacked. I mean, it's an information system. But when you say how safe are they, they're safer. So I live half time in China, half time in the U.S. I've gotten four new credit cards in the U.S. in the last year because my credit card was stolen. I use Alipay and WeChat all the time and I feel very secure there. So these things can be hacked, but they're harder to hack than stealing somebody's credit card number. Okay, good to know. Let's bring in Sedicii, right? A Gaelic name? I thought it was Italian. What your company does is build platforms that help people build their own digital identity. Explain how that works. So your identity is everywhere, and in most of the places it is, you have no control over the information that's there, and most of the time you don't even know if it's right or wrong.
So we're building a platform that allows you to build a profile of yourself, where your information resides and whether it is right or wrong based on what you say is your version of the truth. And we do that using a cryptographic protocol that doesn't require any exchange of the actual data itself, so your privacy is preserved at all times. So our interest in biometrics because it's such a key piece of you is making sure that that information stays where it needs to stay and doesn't end up leaking or being sent to other places. So it's safe is what you're saying but is it necessarily accurate if a person can build whatever digital identity they want for themselves? Well, this leads me on to what we're working on is a picture of the world where you have what we call levels of assurance. So where your information is stored and who is in control of it and how reliable that place actually is. So take your passport today as an example. We don't own our passports. Our passports are loaned to us by our governments and they are the custodians of our information. They are the custodian of our facial biometric that is used to grant us permission when we cross a border. And it's a government to government relationship that says we keep this secure and in return you allow our citizens to cross your border and do whatever they need to do. When we start to think about how we're using technology today and how biometric information is being placed in our phones, for example, we have a copy of our passport, for example, in our phone and ultimately we wanna get to a world where we can use our phone as our passport to allow us to do certain things. We need to make sure that the security of that device is at least as robust as the security of our physical paper passport and that the government can warrant that data is secure. Now today we're not at that point. 
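The comparison Rob describes, checking that two parties hold matching records without exchanging the underlying data, can be approximated with a salted-commitment sketch. To be clear, this is not Sedicii's actual protocol: real deployments use zero-knowledge proofs that avoid even exchanging a digest, and every name and value below is invented for illustration.

```python
import hashlib
import hmac
import secrets

# Toy commitment comparison: each side reveals only a salted digest,
# never the underlying attribute. A true zero-knowledge protocol (as
# described in the discussion) avoids even this digest exchange.

def commit(value: str, salt: bytes) -> bytes:
    """Commit to a value under a shared, per-session salt."""
    return hashlib.sha256(salt + value.encode()).digest()

def same_value(commit_a: bytes, commit_b: bytes) -> bool:
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(commit_a, commit_b)

salt = secrets.token_bytes(16)              # agreed per session
bank_record = commit("1985-04-12", salt)    # date of birth the bank holds
user_claim  = commit("1985-04-12", salt)    # date of birth the user claims

assert same_value(bank_record, user_claim)                    # match
assert not same_value(bank_record, commit("1985-04-13", salt))  # mismatch
```

The fresh per-session salt matters: without it, an eavesdropper could precompute digests of likely values (birth dates are guessable) and recover the data from the digest alone.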
The latest phones, the smartphones that we have, have technology in them, what we call secure elements, trusted execution environments, that can hold this information and allow, under secure circumstances, the interaction that needs to happen between devices in an airport and your phone, or devices in your bank and your phone, or wherever, so that that information is always maintained in a secure state. So, Rob, I gotta ask you. I mean, it's safe, okay, that's good, but who authenticates the accuracy, the veracity of the data, of the identity? Who's to say that you are you? Well, ultimately, the government is warranting to another government, in the case of a passport, that you are you, and that is generally because you've gone to a government office, you've looked an official in the eye and they've said, yep, you are the person who is in that passport, you are female, 45 years old, whatever it is, and that information now gets inscribed in that document with a whole bunch of passive security measures that stop counterfeiting of that document. We've got to essentially do the same thing for electronic devices, and we're not quite at that point yet. We're getting there, but we've got a bit of a job to do still. Can digital identity be forged? I mean, physical passports, people have been forging them since passports were invented, right? But digital identity, on your platform? No, not on our platform, definitely not on our platform, because we don't have any data on our platform. Data resides in other places, and all we do is compare what you have with what you have, or what she has with what he has. And it's cryptographically safe. And it's cryptographically safe. I was going to say, in the payment space in the rest of the world, Visa partners with Apple Pay, Samsung Pay, Android Pay, and in all those cases the data on the device is tokenized. So it actually doesn't reflect what's printed on your card, nor can it be used on any other device.
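That device-binding idea can be sketched minimally. This is an illustration of the concept only, not the actual EMVCo or Visa token service; the vault, identifiers and card numbers below are all invented.

```python
import secrets

# Minimal sketch of device-bound payment tokenization (illustrative only).
# The real token service lives with the network/issuer, not the merchant.

TOKEN_VAULT = {}   # token -> (real card number, bound device)

def provision_token(pan, device_id):
    """Issue a random token tied to one card (PAN) on one device."""
    token = secrets.token_hex(8)           # reveals nothing about the PAN
    TOKEN_VAULT[token] = (pan, device_id)
    return token

def authorize(token, device_id):
    """Resolve a token back to the PAN only for the bound device."""
    entry = TOKEN_VAULT.get(token)
    if entry is None or entry[1] != device_id:
        return None      # a stolen or replayed token is useless elsewhere
    return entry[0]

t = provision_token("4111111111111111", device_id="phone-A")
assert authorize(t, "phone-A") == "4111111111111111"
assert authorize(t, "phone-B") is None   # same token fails on another device
```

Because the token is random, a breach of the merchant or the device exposes nothing printed on the card, and the vault can simply revoke one token without reissuing the card.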
It's unique for that one device. So in this example, if you had passport information on your phone, it actually would be a sort of masked or tokenized version of your passport, and it could be replaced when you need to replace it, and it couldn't be used to spoof you in another location. Yeah, I've got to raise this question because we've been talking about it this week as well. Blockchain technology, distributed ledger technology. It's a buzzword, it's fashionable. Everybody likes to claim that, yes, we were going down that road. Does it have a place, and how important a place, in the further development of biometrics? Definitely. Blockchain technology is the most advanced authentication technology that we have, and of course you're going to use it as a part of a solution for maintaining security. It makes it much harder to spoof because you'd have to spoof everywhere. I'm going to take a completely opposite view on it, because I am concerned about privacy generally, that our personal information slowly is being eroded or sold. Well, the thought of my information, my biometric information, being enshrined in a blockchain forever terrifies me. I don't want it in a blockchain, and I'm European; in Europe, most people have heard of GDPR, the General Data Protection Regulation, and one of the rights you have under that is to request erasure of your information on demand. If I put personal information into a blockchain, it can't be erased unless the chain itself is deleted, which is a real challenge. I would agree with you. Blockchains are immutable, but they're not private. They have to be mined, in the case of cryptocurrency, for example, for bitcoins or whatever. Miners have to have access to information in order to authenticate a transaction and add it to the blockchain, right? So once someone has access to that information, it's not cryptographically hidden from anyone. It's accessible.
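The immutability property under discussion, records that can be read but never silently edited or erased, is easy to show with a minimal hash chain, the core structure of a blockchain. Mining, consensus and networking are all omitted; this is a sketch of the data structure only.

```python
import hashlib
import json

# Minimal hash chain: each block commits to its predecessor's hash,
# so editing any earlier record invalidates every later link.

def block_hash(block):
    return hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()

def append(chain, record):
    prev = block_hash(chain[-1]) if chain else "genesis"
    chain.append({"prev": prev, "record": record})

def valid(chain):
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append(chain, "alice paid bob 5")
append(chain, "bob paid carol 2")
assert valid(chain)

chain[0]["record"] = "alice paid bob 500"   # attempted edit or erasure
assert not valid(chain)                     # tampering is detectable
```

This is exactly the tension with GDPR-style erasure raised in the discussion: deleting or changing one record breaks the chain, so a record can only be removed by abandoning everything built on top of it.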
The only thing blockchain guarantees is that you cannot change anything in that chain once it has been added to it. And there are negative implications to that. Commercially or economically, I mean, blockchain is actually extremely expensive to execute. If it were used to develop biometrics further, will that hold it back? Will it hold development back? Anybody? I think you're gonna see blockchains developed in instances where they're not as sensitive per se when it comes to very personal information. I mean, biometrics is the most personal information you can get, and I think it will be held in very secure databases, generally in probably centralized locations. And they will link through APIs, under secure circumstances, maybe to a blockchain that has a pseudonym of you that links to a supply chain or a payment mechanism or whatever it is, but that pseudonym is what ultimately will be the linkage back to that biometric information. Okay, so we've talked about different technologies. We're starting to get into issues like privacy, et cetera. I wanna get to sort of a big macro question, something that we've been covering and also talking about pretty much all week long, and that's how this whole trade situation has kind of escalated and started blowing up. Tuesday morning, the President decides on $200 billion: we're gonna tariff that at 25 percent, well, 10 and then 25, starting next January. China retaliates with $60 billion, but they're going 5 and 10 percent. So tiered, and maybe not as bad as people expected. But over and above and beyond that, trade, tariffs, et cetera, there's this whole competition, rivalry between structures, between systems of government, and also competition over who's gonna dominate technologies of the future behind things like the Fourth Industrial Revolution, including AI, robotics, et cetera. David, I wanna get you in on this because we were talking about this offline a couple of seconds ago.
Does biometrics fit in there, in this competition, in this rivalry? You make smart cameras, but you are a Chinese-American company. So in a sense, it's kind of win-win for you, no? Well, tension between the United States and China is not win-win for us. We would like to see the United States and China get along. And we're talking about business, and business shouldn't be so confrontational. The development of these technologies is for the net benefit of the world. So I think that it's not competition; advances in China and advances in the US benefit both countries. But the regulation is very different in the different countries. And so as we as technologists or others imagine the ideal world, I think, like for the government in Europe to pass a law saying you have the right to be erased, even if that's not technically possible, governments are coming up with ideas that are not consistent with technological reality left and right. And the idea that the US or China or Russia can control AI or can control biometrics, that's absurd. So governments will come up with these kinds of restrictions. As technologists, we can't build these things. There's things in China that you need, there's things in the US that you need. If you wanna build advanced technologies, you need to work globally. And so you've got to hope that the whims of government play out at whatever speeds they happen, but the underlying relationships between people have to continue to grow. All right, so what you're saying basically is this whole rivalry thing is in nobody's interest, one, and the approach should be global. Are you also saying that it's kind of pointless for me to even ask who is leading in biometrics nationally? Or even to ask about the companies that are doing the work and their domicile, whether it's the US or China? Dr. Singh talked about how people can hack things at a much, much simpler level.
There's no way that one country is gonna come up with an algorithm that the other country is not gonna find out about. Basically, the information about how to do these things moves literally at the speed of light. As it's invented in Shanghai, people on the East Coast of the United States know about it 30 seconds later. Okay, since we started getting into regulation already, let's go down that road then. I know that David, yourself, and also Bob, you've got some pretty interesting and also strong views, in the sense that because of the rate and pace at which biometric technology is developing, it's kinda hard for regulation to keep pace. Is that right? It's very hard for regulation to keep pace. There are still laws in the United States, for example, that say you have to sign for an account. That actually exists in China as well, with a new credit card where you need a wet signature, and that's just not how the world is operating usually today. As in an ink signature. As an ink signature, right? There's a digital signature which works just as well. But I think that, to your question about rivalry as well as this question about regulation, there actually are common sets of principles in terms of how something like biometrics can be used that almost every government is going to agree upon at a very high level, in terms of: if a consumer is using biometrics for a commercial application, they should have control over when that happens. And the World Economic Forum, as an example, with the Fourth Industrial Revolution Council, we have this ability to start to create a forum for different governments and the private sector and academia to talk about what should those principles and standards be. And potentially that could help defuse some of this rising tension, to have a common set of understandings about when and where should biometrics and other sorts of new technologies be used. So this would be sort of like a code of conduct?
It's almost, it's almost like a code of conduct, yeah. But not legally binding. No, it's voluntary. And from Visa's perspective, the best standards are always voluntary and open so that they're used everywhere around the world. Nobody really benefits if you have a technology that you can use in China but you can't use in Brazil, right? It means that in a global environment where people travel, you're limited in terms of the applicability of it. So that ability to have something open and common is key. Okay, whatever these principles are that eventually end up being agreed on, what happens when the issue of sovereignty kicks in? What happens if China, for example, which is very much in the lead, as you suggested, at least in terms of payments with biometrics, what if they don't sign on? Well, that's a risk with any sort of international agreement, but I think we won't know that until we start to have that conversation and say what is each government trying to ensure for their population in terms of how biometrics are used, at that very highest level. As soon as you get to anything below principles, more granular than principles, there will never be agreement. How close are we to that sort of agreement, deal, or even document? Well, one example is on the Internet of Things, which is a very amorphous term that can refer to almost anything, but it is in many ways connected with biometrics. The WEF has a council that's going to be established, that's going to be working on that in the next few months, and that's coming down the pike. Will it actually have a real impact in the next six months? I would doubt it, it's too soon, but you can't actually create common language and common purpose unless you start the conversation. How much input does civil society have in this, or is it having in this? Well, in the charter of the World Economic Forum, it's very much to have all stakeholders present.
So it is government, it is commercial interests, and it's other representatives of civil society, whether it be NGOs or universities, et cetera. Is it going to address the issue of privacy? It must. I think it should. Particularly for biometrics, privacy is one of the most important issues. Right now, actually, consumers love biometrics in the payment space, right? So when Visa does a survey worldwide, 86% of people say that they want to use their thumbprint or their retina in order to ensure that their identity isn't stolen and a transaction is fast, and everybody understands that. But as soon as privacy issues start to come to the fore, and they haven't yet on this front, that trust will dissipate, and we don't want that, selfishly, as a payments company that's relying on this. On the technology itself, is it going to function something like the FAA, to police whether the technology works as advertised on the box? Because, I mean, there are any number of examples a lot of people can think of or imagine where, if the technology gets it wrong, the results could be pretty embarrassing at the very least, if not catastrophic. I think for that specific example, that's actually too granular for a multilateral organization to try to... Should be left to governments then? I think that that's something, ultimately, when you're talking about consumer protection, that falls under a traditional national government's purview. David, what do you think? I think the issue is you have to have the process to resolve issues when something goes wrong. So things will go wrong. And so it's really getting to that level of knowing how to fix it when it goes wrong. China uses face detection and fingerprint detection at the borders. Recently, I came into China with a beard. With a beard, okay. Yeah, you know, like you're not you. And there's a process to resolve, well, I could be me with a beard. And that, you know, the technology is not going to be accurate. It's going to get it wrong.
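The beard example is the classic false-reject/false-accept trade-off in biometric verification: a stricter match threshold rejects genuine users whose appearance has changed, a looser one admits impostors. A toy sketch, with invented similarity scores (real systems compare learned embedding vectors):

```python
# Toy illustration of the verification threshold trade-off.
# All similarity scores below are invented for illustration.

def verify(similarity, threshold):
    """Accept a claimed identity if the match score clears the threshold."""
    return similarity >= threshold

passport_vs_clean_shaven = 0.92
passport_vs_bearded      = 0.61   # same person, appearance changed
impostor                 = 0.55

strict, lenient = 0.90, 0.50

assert verify(passport_vs_clean_shaven, strict)
assert not verify(passport_vs_bearded, strict)   # false reject: see a human
assert verify(passport_vs_bearded, lenient)
assert verify(impostor, lenient)                 # lenient admits impostors
```

No single threshold removes both error types, which is why the fallback process David describes (a human officer resolving the mismatch) is part of the system, not a failure of it.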
So it's like, when the technology says you're mentally ill, don't immediately go to the hospital, maybe get a second opinion. Ha, ha, ha. Did the technology get it wrong with the beard and no beard? Yeah, it took them 20 minutes or something to accept that I could be the same person with a beard. And you had to convince a human being. Yeah, yeah. Oh my goodness. Ha, ha, ha. Okay, might be a good time now to try and bring some of you in. Anybody have a question? The only thing we ask is if you could raise your hand. I think they've got roving mics that'll come to you. So tell us who you are, where you're from, who you represent, and also who your question or comment is meant for. That'll be helpful. If anyone wants to kick it off, please, sir. When I came in to China, they took all 10 of my fingerprints. You know, you go through all these different use cases. Why should I trust that as a password now that it's effectively a shared secret that other people have? And who would you like to answer that for you? Anyone. Anybody? Have a crack? I agree with you. Completely. And that is one of the problems, is that you have a secret that is openly on view for everybody. My fingerprints can be 3D printed and used on scanners. And this may be one of the things that governments need to actually get together and figure out. For example, my country, Ireland, they don't have my fingerprints. They don't have my DNA. They don't have a whole bunch of stuff about me. China knows more about me. The US knows more about me than my own country. But maybe governments need to step up and say we're going to take control of the biometric information of our own citizens and we're gonna make it available, under secure processes and structures, to those organizations who need access to it, in the form of a token that can be used for a specific purpose, for a specific time, and do something along those lines.
The danger is you end up with every country taking everybody's information because they think they have to have it. That's an interesting question with a lot of angles. I mean, in one sense, your fingerprint can be stolen, but in another sense, like if I wear a ring, to me that could be a little bit too close to having an embedded chip, which couldn't be stolen, but which I wouldn't be comfortable with. The thing is that, first of all, you have multi-factor identification. Somebody just having your fingerprint, hopefully they'd have to have your phone, they might have to have other factors that would also indicate that they're you. And then the other thing is that still, stealing your fingerprint, 3D printing your fingerprint, is harder than stealing your credit card. And so it's a question of convenience versus the balance of other ways that you could be spoofed. So I've got a question. Rob, what you're talking about sounds sort of Mission Impossible, right? Create a mold, 3D print your digits, and you're home free. Rita, though, how difficult or easy is it to forge a voice signature? Is it really that much harder? It is, at this time, impossible to replicate the nuances of the human voice at the micro levels. But if somebody records your voice and then replays the recording, they've got your voice. That is a real concern. Once they have your voice... and we use our voice very freely because we are not aware of the potential of voice to identify us in many different ways. So we use Alexa, we use Siri, and I'm not trying to put those down. All I'm trying to say is that we use them freely, we get our answers, but we don't worry about what has happened, how we got the answers, what has happened to our voice. It's gone to some server. Has it been deleted from that server? Who has access to it? And what if tomorrow I want to use a voice-based authentication or verification system?
Can I ever be certain that it will never be had by someone who has gotten hold of my voice from that server? So, I'm sorry? From this YouTube video, right? Or from this room. Biometrics like fingerprints are very much within your control and you can decide whether you want to give them out or not. Consciously, you can decide whether you want to give out your blood sample or DNA, but you cannot not talk. You have to talk. And people are walking around with phones that can record your voice at any time. And I'm developing this technology for profiling humans. It's like, think of it as what X-ray is to medicine. It can look into you, into your persona. So, pretty much anyone who walks by you, hears you speak, could record you and have access to what's happening inside you. And what can they do with it? It's a worrisome thing, right? It's a scary thought, anyway. Yeah, Bob. No, go ahead. No, I have a lot to say. I mean, I... Please, please, please, please. So, yeah, a lot of people have brought this up. And as academics, there isn't much we can do about it, but one of the things I worry about is we keep talking about the use of biometrics, protection of identity and so on and so forth. But the people whose biometrics we are talking about are, for the most part, not aware of its potential, be it face, DNA, fingerprint, whatever. The general person out on the road doesn't know how potent it is and, you know, I mean, how restrictive it can be, how misused it can be by people. So, they give out these things freely. If they knew about all of this, I can imagine a group of people who would just, you know, go up in arms and stand at the airport and say, I'm not going to give out my personal data. Does there need to be fine print in this sort of technology then? I'm sorry? Does there need to be fine print in this sort of technology then? Yes, there is, I mean, that fine print has to come in the form of awareness. 
The media, and we as academics ourselves, have to build that awareness in people who then use their information. Their information is like their money, and if they don't know that they are giving out this valuable information, it's like stealing their money, right? Bob, does Visa do that? Steal people's money? No, no, no, no. Make sure that the fine print is there. Visa's a bad word. Our clients, the banks, do that. But where I was going as well, it's very similar to what David said earlier: what creates the security is multi-factor authentication, right? It's your fingerprint, yes, which governments have, but it's also your phone. And it's your geolocation signal saying you are at the store where they're saying this transaction is happening. All those things come together, and it makes it incredibly likely that that is you making that transaction at that point in time. So from a payment standpoint, it's incredibly powerful, especially compared to any alternative. Compared to a signature on the back of a credit card, a fingerprint is a much more secure way to ensure that somebody is who they say they are. But with that being said, outside of payments, security areas, et cetera, I think there are many more layers of authentication that are needed in protection of that data. But that's sort of a problem, if you will, for governments really to focus on: to assure their citizens that, if they're gathering all that information, they'll keep it safe. Okay, Rita, I forgot to ask about your voice recognition down the street, the technology. Based on the database of samples you have, have you been able to calculate how accurate it is? It's a nascent thing. At this time, this is just four years old. It is where DNA was in the 1980s. We've just scratched the surface. It's not perfect right now, but one day it'll get very accurate. I have no doubt about that. Can you put a number on it now though, how accurate it is? High 90s? High 90s. 
High 90s for most of the parameters? All right. Including your personality, including whether or not you're neurotic. You can imagine that it can be more consistent than physicians and clinicians, especially for personality and behavior, not to criticize physicians and clinicians. If you take the opinions of multiple physicians for the same individual, they will not exactly coincide. They will differ. So human beings are not very consistent in making that kind of judgment. So what you do, I mean your technology, does it have potential applications in, let's say, medical diagnostics? You think? I think machines can be very much more consistent to begin with. And so they provide a good platform for doctors to reference their opinions against. Time for questions. Sir, in the back. Hi, I'm Risalat from Bangladesh, and I'm a global shaper with the New York Hub right now. So many of you are probably familiar with Professor Yuval Noah Harari's work, he's the author of Sapiens. And I'm reading his current book right now, and in it he makes the case that the merger of infotech and biotech would essentially raise the possibility that humans can be hacked, right? So we've seen, in the last few years, how aggregated social media data, social data about us, has been weaponized for manipulating people, psychologically profiling people in election situations. And if we take the work you're doing, Rita, with voices — I was talking to founders of biotechnology and neuro-technology organizations who are aggregating those kinds of data. And as we have more biomarkers and stuff like that in the coming decades, if you use those to do the same kind of profiling, it could really lead to a crazy kind of dystopian future where we can be hacked, right? 
So I just wanted to hear your perspective about that potentiality and how we can form the kind of, whether it's global norms or other agreements, that could protect us against that kind of eventuality. Thanks. Great question. Anybody in particular would like to address that? I agree. I think we're not far away. All of the piece parts that you need are potentially there. It comes down to an ethical question at the end of the day. Is this acceptable? For me, it's absolutely not acceptable. And discussions on this level happen at forums like the World Economic Forum. And I think if we can build consensus that this is a line we're not going to cross, for example the use of drones for targeting somebody because I know what your facial template looks like, these are the kind of things that we absolutely should not allow. Who's gonna police that though? Who should police that? Well, it's global bodies, like maybe the UN, that try to build consensus. David, you're nodding. This is a really, really great question, but I think the answer to the question is in your question. Let me talk about social norms. Traditionally, if you have a rock and there's a plate glass window, you don't throw the rock through the plate glass window. But somehow hacking has emerged as this thing where people feel like if a system can be broken, somebody's gonna go break it. Yeah, that's true. But we don't throw the rock, because we're a society that works together, where social norms develop. So somehow we need to develop a social consensus that there are ways that people behave that involve protecting each other. And the core thing is that if people can be hacked, we protect each other, so that if somebody is being bullied or being harmed or their identity is being stolen, we come together as a society to protect them, rather than just say we're the kind of people where if somebody can break a plate glass window, they're gonna do it. 
So not so much laws, but etiquette is what you're saying. Yeah. Wow, okay. Probably got time for one last question from the floor. Anybody like to have a go? Sir, please? Just a simple question. I think these were quite exciting and at the same time scary talks. And I was just wondering how we can continue to lead the simple and ordinary life that we are enjoying right now. I mean, can we wear a mask, and that will be a solution, or what else? I mean, good question. Social media in many senses is a mask, right? Do you have a cell phone? Start by throwing it away. Right. I don't have one. I think life is just going to go on getting more complicated. The more technology we have, the more complicated our life becomes. So we're just talking about biometrics, but there are lots of other technologies coming up that are changing our lives. Starting from smart cities to your internet of things, everything that affects your daily life is in your room, in your bedroom, in your bathroom, everywhere. Is that going to make our life simple? Well, you'll have to change your philosophical definition of simple to get there. But I don't think our lives are going to become simpler. They may become easier to live. What we define as quality of life may go up, but simpler, no. Things will only become more complicated. Almost inevitably, yeah. Inevitably. What I'd like to do before we wrap up is maybe just go all the way down the row. Last thoughts that you want to leave this audience with, with regard to everything we've been talking about. Bob? Well, I think just from my perspective, the real value here in this conversation, and all the conversations happening here at the Forum, is trying to create those international norms. You could call it the etiquette, the first principles upon which all of this complexity can be managed and the potential downside can be mitigated. All right, Rob? I think biometrics are an incredibly powerful tool. 
When you think about financial inclusion, there are 1.7 billion people on the planet who don't have an identity. Creating an identity for them using biometrics, getting them into the financial system, giving them opportunity, is something that biometrics can enable. But as Bob said, it needs a structure around it, so that biometrics are used ethically and properly and we don't end up with a rogue hacker suddenly selling 2 billion people's biometric information on the dark web. That would be catastrophic. Rita? And I think we need to devise ways to regulate the use of biometrics more strictly and prevent misuse by societies, not just by individuals or businesses. Okay, David, last word. I think this idea of the social norms and etiquette is critical, but the idea of competition between nations feeds the idea that we take separate paths, and if you get nations acting as rogue hackers, then the system becomes very, very dangerous. And so somehow we need to find a way for nations to come together instead of pushing nations apart. How much do you fear that's exactly what could happen with biometrics? A lot, I guess, but on the other hand, I have hope that that won't happen, that we can come together instead. Excellent. All right, folks, listen, we're just about done. Thank you for your time to our panelists, and also to you for taking part. We'll see you next year, I guess. Thank you.