What if seeing is no longer the same thing as believing? What if the line between what's real and what's fake has become so blurred that we can no longer trust ourselves, or anyone, or anything? Let me play you something.

[Video montage plays] "Imagine this for a second: one man with total control of billions of people's stolen data, all their secrets, their lives, their futures." "I feel really blessed, because I genuinely love the process of manipulating people online for money." "I, too, am nobody. I am a fake." "You ever wonder why I'm so popular? Because of my big brain? Maybe. President Trump is a total and complete dipshit."

Now, you see, I would never say these things, at least not in a public address. But someone else would. Someone like Jordan Peele.

All right, so what if I told you that Obama never called Trump a complete dipshit? What if Boris Johnson never called himself a fake? Well, you wouldn't believe your eyes, your ears, or even your own judgment, right? Recent advances in artificial intelligence and video-manipulation technologies have reached a point where we can put anything into anyone's mouth; we can make people appear to do whatever we want. The second thing is that these technologies can generate content that looks so real it becomes indistinguishable from reality. And the third thing, probably the scariest, is that these technologies are becoming accessible to anyone.

So you may have heard of a technology called deepfakes, right?
It's open-source software that emerged from the dark corners of the internet around 2017, and what it allows you to do is swap one person's face with another's. The technology is really easy to use; anyone can produce these kinds of videos. So let me show you how it works. All you need is a gaming PC, something reasonably sophisticated with a few powerful GPUs, which you can get in any store; an internet connection, so you can download one of your favorite deepfake solutions; and, last, a lot of data. The data we're looking at are photographs and videos in which people's faces are visible.

So let me walk through the process of creating a deepfake. First, start with an SNL video of Alec Baldwin impersonating President Trump. Second, use the software to detect where the face is and collect as much face data as you can; in addition, collect more images of the impersonator, roughly 10,000, and then do the same thing for the actual Donald Trump. Once you're done with that, you feed the data into the software and train a deep model for both subjects; this can take a few hours, or you can run it overnight. Finally, the software replaces the identity automatically by synthesizing a face onto the original video.
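The pipeline above can be sketched in code. This is a toy illustration of the classic deepfake layout (one shared encoder, one decoder per identity, swap by cross-decoding), not a real network: the "encoder" and "decoders" here are hand-written stand-ins, whereas real implementations train them as neural networks on the roughly 10,000 face crops per person mentioned above.

```python
# Toy sketch of the deepfake autoencoder architecture: a shared encoder
# plus one decoder per identity. Swapping = encode a frame of person A,
# decode it with person B's decoder. Illustration only -- real systems
# learn these mappings from thousands of aligned face images.

from typing import Callable, List

Face = List[float]  # stand-in for a cropped, aligned face image

def shared_encoder(face: Face) -> Face:
    """Compress a face into a smaller, identity-agnostic latent code."""
    # Toy compression: average adjacent pairs of values.
    return [(a + b) / 2 for a, b in zip(face[0::2], face[1::2])]

def make_decoder(identity_bias: float) -> Callable[[Face], Face]:
    """Build a per-identity decoder; real ones are trained per person."""
    def decoder(latent: Face) -> Face:
        # Toy expansion back to full size, flavored by the identity.
        out: Face = []
        for v in latent:
            out.extend([v + identity_bias, v + identity_bias])
        return out
    return decoder

decoder_impersonator = make_decoder(identity_bias=0.0)
decoder_target = make_decoder(identity_bias=1.0)

def swap_face(frame: Face) -> Face:
    """Encode the impersonator's frame, decode it as the target identity."""
    latent = shared_encoder(frame)
    return decoder_target(latent)

frame = [0.2, 0.4, 0.6, 0.8]   # a 4-"pixel" toy face
swapped = swap_face(frame)
print(swapped)                 # same size as the input, new "identity"
```

The design point the sketch captures is why the shared encoder matters: because both identities pass through the same latent space, expressions and pose transfer across, while each decoder re-renders them in its own person's likeness.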
So let me play this video real quick. [Deepfake video plays] "I want to talk about what is really important, which is jobs. Okay? Because I am going to bring a thick stream of jobs back into this country, the biggest, strongest, steadiest stream you've ever seen. This country will be literally showered with jobs, because I am a major wizard of jobs. This is a golden opportunity for me."

The scary thing is that you don't need any computer science knowledge. And even if this is too complicated for you, all you need to do is download an app. There's a Chinese app you can download, and another one that was just released from Ukraine. All you need to do is take a selfie of yourself; the software uploads it to a cloud and automatically inserts your face into one of your favorite Hollywood clips.

So as tech companies, artists, and hobbyists are racing to get access to these new technologies, a Pandora's box is opening. So let me ask you: what do you think this type of technology is mostly used for? Well, as it turns out, 96 percent of all deepfakes are non-consensual pornography. A-list celebrities like Gal Gadot, Scarlett Johansson, and Emma Watson have all been inserted into hardcore pornography that is spreading like crazy on the internet. As of July 2019, this content had at least 134 million views. And not only celebrities are affected; any one of us could be a victim. Here's the story of Noelle Martin.
She was an 18-year-old law student when she discovered that her face had been photoshopped into hardcore pornography. When she demanded that the webmasters remove the content, the perpetrators created even more deepfakes of her and inserted them into videos.

Now, it's not hard to imagine how these technologies could be misused in the political sphere, as well as against corporations. For politicians, you don't even need to go as far as having some dictator announce the launch of a nuclear bomb. You can simply ruin their reputation: create new scandals, swing political elections, or exacerbate societal divisions. For CEOs, you can make them appear to commit fraud, or make them announce their retirement, and instantly you have the ability to cause a stock to dive, right?

Okay, how did I get involved in deepfakes? Well, this goes back a while. I'm an associate professor at the University of Southern California, and for over a decade I've been doing research on technologies for digitizing humans, and specifically on democratizing those technologies. One of my fascinations since my PhD has been: how can we develop technologies that are accessible to everyone? At the time, we could capture the performance of subjects and create virtual actors, but the process was inaccessible to ordinary people; you needed a film studio behind you to create them, and I was fascinated by the question of how to democratize that. This is an old video of myself, where we developed one of the first performance-capture solutions that let you puppeteer someone's face in real time. A couple of years later, I was invited to Weta Digital.
That's a company in New Zealand famous for The Lord of the Rings and Avatar, and I was working on a really interesting project involving Paul Walker, an actor from the Fast & Furious movies. He died in a car accident, and for Furious 7 we still needed a lot of shots of him. The solution was to reenact him digitally, so we developed new performance-capture techniques to capture his brother and recreate Paul. The main issue is that, in some ways, this looks like a high-end version of a deepfake, but you needed an entire team and a lot of hardware to make it possible. For example, if you wanted to capture a celebrity's face, whether to use it for stunts or to bring someone back to life, you would use a high-end capture system to digitize their face. The problem is that this is not consumer-accessible; it's technology that costs a million dollars, and no one will have that at home.

So we recently started to develop AI technologies that automate this process and democratize it. Here's an example. Because we have the ability to capture a lot of this data, we can train models to make these predictions. In recent research, we take just a single input image of a person and automatically generate a 3D scan, predicting it without requiring all that sophisticated hardware. Around the same time that deepfakes emerged from the internet, my startup company, Pinscreen, was developing avatars that are consumer-accessible. We wanted to build a solution where all you need is a single image of someone, but you still have the ability to create photorealistic faces. Let me show you the technology behind this.
On the upper right, you can see photographs of people from whom we have never seen anything else. On the left, we have a driver, and on the bottom right we have new expressions and new viewpoints of these people, generated from the single input image. Not only are these generations plausible, they also happen in real time. Here's a more extreme example, which basically shows that in real time we can drive a person's face from a single input image. This is kind of scary, because it makes the technology much more accessible: you don't need to collect a lot of data, and you could live-stream a broadcast while stealing someone's identity and impersonating that person.

So, not surprisingly, one day one of our investors got worried, and we got a call: "Was this you?" It was all over the news: a deepfake technology spreading on the internet, used for pornography. We very quickly had to tell them that we had nothing to do with it. But one thing was clear: there was now an ethical concern around these types of technologies that we really had to worry about, and we had to be responsible about what kinds of applications come from them. And we're not the only people looking at this. Both Google and Facebook have started efforts on detecting deepfakes. Facebook, for example, invested 10 million dollars in a Deepfake Detection Challenge. Google CEO Sundar Pichai even said that detecting deepfakes is one of the most important challenges ahead of us.
So these are new efforts that came out of this. In our research community, we also started organizing workshops on the topic, inviting experts from every field. That's where I met the media-forensics expert Hany Farid from UC Berkeley, and we teamed up to work on how to combat deepfakes. The approach we took is a cat-and-mouse game: our team was responsible for generating as many deepfakes as possible, as realistic as possible, while his team focused on detection. One interesting thing about the detection part is that we made the assumption that the content we were generating was perfect. Under that assumption, the only thing left to detect is the behavior, the expressions of the person; the only thing that is unnatural in a deepfake is the driver. So that's what we looked at. With these kinds of algorithms we could reach around 94 percent accuracy, but only for known people, people for whom we have enough footage, hours of footage. Still, we think there will be a point where you can create deepfakes that are not detectable, and this kind of detection will always be a step behind; that's one of the issues.

To give you an idea of where we are now: on the left you have a ground-truth image, these are deepfakes from six months ago, and here are deepfakes today. I can assure you that very soon there will be even more realistic ones, to the point where, to the naked eye, they're going to be nearly undetectable. Here at Davos, we're also showing a real-time deepfake capability, a new technology that I invite you to try out.
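The behavior-based detection idea above can be sketched as follows. This is a deliberately simplified illustration, not the actual system: the single feature (a per-frame mouth-opening value), the statistics, and all the numbers are made up for the sketch, whereas real detectors profile many facial action units learned from the hours of footage mentioned above.

```python
# Toy sketch of behavior-based deepfake detection: instead of hunting
# for pixel artifacts, learn how a specific, well-footaged person
# normally moves their face, then flag clips whose motion statistics
# fall outside that personal profile. The feature here (one scalar of
# "mouth opening" per frame) is an invented stand-in for illustration.

from statistics import mean, stdev
from typing import List, Tuple

def build_profile(reference_clips: List[List[float]]) -> Tuple[float, float]:
    """Learn the person's normal range of frame-to-frame motion."""
    deltas = [abs(b - a)
              for clip in reference_clips
              for a, b in zip(clip, clip[1:])]
    return mean(deltas), stdev(deltas)

def looks_fake(clip: List[float],
               profile: Tuple[float, float],
               z_threshold: float = 3.0) -> bool:
    """Flag the clip if its average motion deviates strongly from the profile."""
    mu, sigma = profile
    deltas = [abs(b - a) for a, b in zip(clip, clip[1:])]
    z = abs(mean(deltas) - mu) / sigma
    return z > z_threshold

# Genuine footage -> small, smooth per-frame movements.
real_clips = [[0.10, 0.12, 0.11, 0.13, 0.12],
              [0.20, 0.22, 0.21, 0.23, 0.22],
              [0.15, 0.14, 0.16, 0.15, 0.17]]
profile = build_profile(real_clips)

# A driver with jerky, uncharacteristic motion gives itself away.
suspect = [0.10, 0.60, 0.05, 0.70, 0.02]
print(looks_fake(suspect, profile))   # True for this toy example
```

It also shows why this approach only works "for known people": without hours of reference footage, there is no profile to deviate from, which is exactly the limitation described above.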
It's at the exhibition, if you haven't tried it yet: you can basically turn yourself into your favorite celebrity. But the key message here is that deepfakes can already run in real time.

All right. Media manipulation is not something new. Image manipulation has been used for propaganda for years, and you don't need very sophisticated tools; all you need is Photoshop. Here's an example from the Los Angeles Times: back in 2003, the photojournalist Brian Walski was confronted by the paper for fabricating a front-page image by compositing two photographs, just to make the scene appear more dramatic. Here's another example, with Para Siltan, which clearly shows that all you need to do is change the wording on her t-shirt to catalyze hate on social media, and this is something that is very, very easy to produce.

The other really important point is that the main issue isn't necessarily deepfakes; they're only part of the problem. The other big problem is social media itself. Something unique in human history is happening: anyone can create any kind of content, content spreads like crazy, anyone can share information, and there is almost zero fact-checking. So then there's the question of whether this is a direct threat to democracy. Because if there is no way to verify whether your source is real, how can you have democracy if there is no way to share facts?
Here's an example involving House Representative Nancy Pelosi, where the perpetrator simply slowed a video down to 75 percent of its original speed in order to make her appear drunk. The video was shared three and a half million times on Twitter, including a retweet by the president, and this example shows how dangerous this kind of disinformation can be.

And it's not just the disinformation itself that's dangerous; it's the very fact that anything can be manipulated. This is what we call the liar's dividend. The liar's dividend basically says the following: if there is a scandalous video of someone, that person can always blame it on manipulation. Here's a real-world case: João Doria, a gubernatorial candidate from São Paulo, Brazil. An orgy video of him leaked on the internet, and of course his lawyers, in their technical report, blamed it on a deepfake. So the real danger is that even once something turns out to be true, or fake, the damage has already been done.

So how can you protect yourself against deepfakes? The first thing, and it's what we're doing here today, is education: we have to educate people about technological capabilities, about what's possible now and what will be possible in the future. The second thing is that we have to build tools that can analyze the spread of deepfakes as well as their intentions. And the last thing is that we have to work with lawmakers on regulating how these kinds of video manipulations are deployed; we have to make sure they're not used for harmful purposes, or for any purpose that could mislead.

Then there are also people who would argue that maybe we should ban all research in this domain.
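Notice that the Pelosi clip needed no AI at all, just retiming. To make the trick concrete: playing a clip at 75 percent of its original speed stretches every frame timestamp by a factor of 1/0.75. A minimal sketch, with timestamps in seconds (the function and its names are illustrative, not any particular tool's API):

```python
# Minimal sketch of the retiming behind the Pelosi clip: playing a video
# at 75% of its original speed maps every frame timestamp t to t / 0.75.
# Unless the audio pitch is corrected afterwards, the slowdown also
# deepens the voice, which is what made the speech sound slurred.

def retime(timestamps, speed=0.75):
    """Map original frame timestamps to their slowed-down play times."""
    return [t / speed for t in timestamps]

original = [0.0, 1.0, 2.0, 3.0]   # 4 frames spanning 3 seconds
slowed = retime(original)
print(slowed)                     # last frame now plays at the 4-second mark
```

A 3-second span becomes 4 seconds, a one-third stretch, which is subtle enough to escape casual viewers while completely changing how the speaker comes across.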
Maybe we should prohibit researchers from working on it at all. I think that's a bad idea, because there are actually a lot of positive applications. Let me show you a few of them. Here's one example from Facebook Reality Labs, where telepresence is enabled using technology very closely related to deepfakes: it allows you to build a photorealistic presence of yourself, characters that don't look like avatars from video games. Here's another example: imagine what a next-generation virtual fitting room could look like if you had the ability to generate your face on top of different outfits and see how you'd look without going to a physical store. I also believe that in the near future we're going to interact with virtual beings and virtual assistants in a regular way, going beyond Siri and Alexa. One of the issues, though, is that building these avatars is very, very expensive, so having the ability to democratize this could mean that these deepfake technologies become genuinely useful. So not all applications of deepfake technology are bad. Bill Gates once said: "Technology is just a tool. In terms of getting the kids working together and motivating them, the teacher is the most important."

So, to conclude: deepfakes are, by design, a form of manipulation. But so is communication. Our ability to trust strangers is at the core of society, and yet we also have a tendency to want to manipulate other people. So the thing I find most fascinating about deepfakes is that they reveal something about humanity. Thank you so much. Thanks for listening.