Hello, humans. How are you all doing? Yeah? Woo. So this talk is ostensibly about technology, but really it's about us. It's about our potential, our vulnerabilities, our limits, and what's at stake for us in both the near term and the medium term.

There have been many things in my career. I started in dark matter physics, then I helped lead design for Firefox, and I did the whole Songza thing with mood-based playlists, which was sort of fun. But the hat I've always had on is the designer's hat, and as designers we know that human beings are hackable, that they don't have as much free will as they think they do. That's the whole conversion-funnel thing. We are, in fact, the magicians of the digital age. And so we are susceptible to a number of kinds of attacks.

Back in 2006, I was working on something I'd call a stopping cue attack. You've all played with infinite scroll, on Instagram or Facebook. At some point that was a new technology, and I have the dubious honor of being the person who invented it. What it does is remove the stopping cue. A stopping cue for humans is like drinking a glass of wine: normally you don't stop drinking until you reach the bottom, and that's when your brain wakes up and says, maybe I shouldn't have another glass, or maybe I want another one. That's when your brain can catch up with your impulses. As a designer, the rule is that every time you ask the user to do something they don't care about, you've failed. So you remove the stopping cue, and people just keep going and keep going. I was actually calculating it out yesterday: if you multiply the number of people who use these kinds of products by the number of hours they waste in these trances, it's the equivalent of wasting 200,000 human lifetimes per day. That's about a third of the people in Helsinki, their entire lifetimes, filled up with time they don't care about.

Another kind of attack is a social conformity attack. There's an interesting study on social conformity from the 1950s, the Solomon Asch study, and it asks a really simple question: do you believe your own eyes, or do you believe what the people around you are saying? They would give people three lines, A, B, and C, of different lengths, and then a fourth line: is this the same length as A, B, or C? If you do this by yourself, you get it right essentially 100% of the time. Then they had three actors give the wrong answer before you gave yours. So here's the question: did people conform, or did they literally measure with their fingers? Here's the answer. 5% of people always conformed to what other people said. 25% of people never conformed, which is good. That leaves 70% of people who conformed some or most of the time. So there are three times as many people who conform to what they think other people think as there are people who literally measure with their own fingers. And when you think about the power of Facebook or Twitter or any of the other social networks we have: they are the very first technologies that directly intermediate what information you get from your friends and the people around you.
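A quick arithmetic check of that three-to-one figure, using the shares as stated in the talk, with the "never conform" group taken as 25% so the three categories sum to 100%. This is just illustrative bookkeeping, not part of the original study's analysis.

```python
# Arithmetic behind the "three times as many" claim, with the "never conform"
# share taken as 25% so the three groups sum to 100%.
always_conform = 0.05
never_conform = 0.25
sometimes_conform = 1.0 - always_conform - never_conform   # 0.70

conformers = always_conform + sometimes_conform            # conform at least some of the time
measurers = never_conform                                  # always trust their own measurement

print(f"conform at least sometimes: {conformers:.0%}")
print(f"always measure for themselves: {measurers:.0%}")
print(f"ratio: {conformers / measurers:.1f}x")             # -> 3.0x
```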
In the 2018 US elections, it turns out that Russia, in its election manipulation, didn't actually invent as much content as it did in the 2016 election. Mostly it found existing content and amplified it, to try to increase division. Twitter just recently had to remove 70 million fake accounts, all because of this kind of social conformity attack. And consider the speed: it took telephones 75 years to reach 100 million people. Facebook took four years. Instagram, two years. Candy Crush took less than a year, just months, to reach 100 million people. Any time we now code systems at that scale, it's an inherently political act.

One of the biggest myths about technology, I think, is that it's just a neutral platform: tech isn't good, it isn't bad, it's just neutral. That's not exactly true. Tech now, and particularly our deployments of it, has a particular bent. It has particular externalities, and that's how you measure how neutral a platform really is. My favorite example of this is the car. A car is a dual-use technology. It does good things: it lets us go places we could never go before, do things we could never do before, have a kind of mobility we never had before. And at the same time, it terraforms our cities, so we live further apart and are more socially isolated. It quite literally makes us fat, and it pollutes the environment. So here's the question: would we rather have cars that are more like Priuses or more like Humvees? Or maybe we're asking the wrong question. Maybe it should be more like bicycles or public transit. There is nothing that says that technology, as it stands today, is the way it needs to be.

So what kind of non-neutrality does our technology have? Well, all of these companies, Facebook, Twitter, Google, the news, are in a race to the bottom of our brainstem to capture our attention. Because it turns out attention is sort of the final commodity. It's the thing everyone wants, and we each have only a limited amount of it. So these companies are in a zero-sum, win-lose game to grab your attention, which is why we all feel the effects of digital addiction. You know you're addicted to your phone by whether you check it before you pee in the morning or while you pee in the morning.

So step back and ask a simple question: why is it that all of our modern societies systematically abuse animals? You could think maybe it's something about the Enlightenment and our ascendancy over animals through rationality. But I think the reason is much simpler. It's simply that the cultures that survived, the ones that out-competed the others, are the ones that learned to treat animals as resources to exploit. And which companies are now out-competing all the other companies? Yeah. The ones that are learning to treat human beings as resources to exploit for their attention. In fact, the attention economy companies now make up more than 50% of the top of the US stock market.

So I want to introduce a concept, an idea: blue lighting. I think it's important, as our technology gets more and more powerful, to take a step back and look at the creature, the human, that we are, at our limitations and our capabilities. If you take light at night and shine it into your eyes, blue light in particular, it has a physiological effect on your body. It messes up your melatonin cycle, which means you sleep worse.
It turns out the melatonin cycle is connected with the cancer cleanup cycle. And note that when blue light is shining into your eyes, there's nothing that tells you something is wrong. You just get an uneasy sense that you're not sleeping as well; your quality of life goes down. That's the effect of being blue lit. Once you realize it, of course, you can create technology that starts to solve it. That's why all of our new phones now shift to a f.lux-style yellow light at night. But note that you don't just stop sleeping; you get a degradation of quality. That's the feeling of being blue lit.

Now think about our relationships and our connections. We have that same sense that something is off, but we can't quite put it into words, because the very thing that could do the diagnosis, your mind, is also the thing being affected. So I'm using blue lighting as an analogy for something much broader. There's the very specific case where blue light shines into your eyes and you can't choose whether it affects you. But every time technology does damage by exceeding human limits, that, I think, is a way technology blue lights us.

Another example of blue lighting: we're constantly showing ourselves beautified images of what we look like. And that, in the race to the bottom of the brainstem, starts to modify not just behavior but identity. 55% of plastic surgeons in the US now say they've seen a patient who wants their body modified to look like their Snapchat filter. That's blue lighting our identity, and our kids'. Then you start being able to blue light entire societies at once. A study in Germany from this last year showed a causal link: dose a German city with Facebook, and violent crimes against immigrants go up by 50%. It has an effect. That's blue lighting an entire society.

One of the things we'll always hear from these companies, Facebook in particular, is: this is not our fault, we're just giving people what they want, just the stuff they're clicking on; we don't have any agency in this. But that's not exactly true. It's intellectually dishonest, because they're optimizing for engagement, optimizing for clicks. It's not that they're showing us what we want; it's that they're showing us things we can't help but look at. It's a hacking of our neuroplasticity. When we drive by a car crash, we can't help but look at the car crash. And now there's an algorithm that says, ooh, you like looking at car crashes? Well, I'm going to make more car crashes and show you more car crashes.

So imagine all the content on the web that's served by one of these algorithms laid out on a spectrum, going from sane, calm, nuanced, rich science on one side to extreme, polarized controversy on the other. Which kind of content is better at catching your attention? Obviously the extreme stuff; it catches your amygdala and makes you look. And here's one of the interesting things: everyone always says correlation is not causation, but with algorithms that watch what you do and where you look and then reinforce it, correlation causes causation.
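To make that feedback loop concrete, here is a minimal epsilon-greedy sketch of an engagement-optimizing feed. It is a toy model under my own assumptions, not any platform's actual system; the category names and click probabilities are invented for illustration.

```python
import random

# Toy epsilon-greedy model of an engagement-optimizing feed. Not any platform's
# real system; the categories and click probabilities below are invented.
CATEGORIES = ["calm_nuanced", "extreme_polarized"]
CLICK_PROBABILITY = {"calm_nuanced": 0.05, "extreme_polarized": 0.15}

shows = {c: 0 for c in CATEGORIES}
clicks = {c: 0 for c in CATEGORIES}

def observed_ctr(category):
    return clicks[category] / shows[category] if shows[category] else 0.0

def recommend(epsilon=0.1):
    # Mostly exploit whatever has grabbed attention so far; occasionally explore.
    if random.random() < epsilon:
        return random.choice(CATEGORIES)
    return max(CATEGORIES, key=observed_ctr)

for _ in range(20_000):
    c = recommend()
    shows[c] += 1
    if random.random() < CLICK_PROBABILITY[c]:
        clicks[c] += 1

for c in CATEGORIES:
    print(f"{c}: shown {shows[c]:>6} times, observed CTR {observed_ctr(c):.1%}")
# The loop settles on the extreme category: the correlation it observes
# ("clicks follow extreme content") becomes the cause of what gets shown next.
```

Run it a few times and the extreme category ends up shown far more often, even though nobody ever asked for it; the loop only ever learned what people could not help but click on.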
So we are tilting the world toward the extreme and the polarized. We're constantly being shown the very best lives everyone else is leading, shown that our friends are all having fun without us, all the time. And that has an effect. Right before the 2016 election, it turns out that YouTube, which is used by 1.9 billion people, about the same as the number of people who follow Islam, recommended the videos of Alex Jones, the conspiracy theorist, 15 billion times, which converted to two billion views. Now, Scientology, as a cult, has only about 40,000 members. If even one out of a thousand viewers starts to believe in these kinds of things, then these video recommendations alone are printing Scientology-sized cults every single week.

It turns out YouTube loves recommending conspiracy theories. It's sort of the great polarizer of our time, the great radicalizer of our time. Look at how often the Flat Earth theory gets recommended: in Google searches, only about 20% of what comes back shows Flat Earth; on YouTube, 90% of the recommendations, all that free advertising, point people at the Flat Earth theory. Recently, Kyrie Irving apologized for promoting Flat Earth, saying he got lost in a YouTube rabbit hole. He was candid about it and said, I'm sorry for misleading a whole bunch of kids into believing in Flat Earth. But teachers said it was really hard to get their kids to come back around, because the kids now accused Kyrie Irving of having been gotten to by the round-Earthers in the conspiracy.

And we see this everywhere; it's not just YouTube. On Facebook, if you're a new mom and you join a support group (this is a real example), what turns out to be really engaging for new moms? The anti-vax movement. Because Facebook is optimizing for engagement, for attention, and these kinds of conspiracies are very sticky, it recommends them as the top suggestions. And from there, it's only one more click until the top suggestions are things like Pizzagate. So very quickly you see that no matter where you start, all of these products are gently pushing you, bit by bit, toward radicalization. And of course, another great example of blue lighting: take a country with deep social divisions and racial tension, then add virality on top of it, because that's what's good for your business model, and you end up with the kind of genocide we're seeing against the Rohingya in Myanmar.

So are they giving us what we want? No. In fact, they're moving from a kind of blue lighting into gaslighting, because they say, this is what you want, when it's really just what we can't help but click on. It's a hacking of neuroplasticity. Facebook will often say, hey, if you don't like Facebook, just delete it. And then: oh hey, you're back. It's only been a couple of months. Welcome back.
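A quick back-of-the-envelope check of that "Scientology-sized cults every week" line from earlier. The view count and the one-in-a-thousand conversion rate come from the talk; the assumption that the two billion views accumulated over roughly a year is mine, added only to make the weekly rate computable.

```python
# Back-of-the-envelope check of the "Scientology-sized cults every week" claim.
# Views and the 1-in-1,000 conversion rate are from the talk; spreading the views
# over roughly a year is my own assumption for illustration.
views = 2_000_000_000          # views driven by the recommendations
conversion_rate = 1 / 1_000    # viewers who come to believe
scientology_size = 40_000      # rough membership of Scientology
weeks = 52                     # assume the views accumulated over about a year

believers = views * conversion_rate
cults_per_week = believers / scientology_size / weeks
print(f"{believers:,.0f} new believers, about {cults_per_week:.1f} Scientology-sized cults per week")
# -> 2,000,000 new believers, about 1.0 Scientology-sized cults per week
```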
So here's the thing I think we've missed. Think of the human brain as a kind of hardware that hasn't really changed since before agriculture, 10,000 years ago. We're essentially fixed, while technology is getting more and more powerful, and everyone has been worried about the time when AI and AGI exceed human capabilities. But what we've entirely missed is that there's a point that happens much, much, much earlier, starting around now, when technology starts to exceed human limits.

I think one of the most interesting ways to think about this is that, in order to get you to click, in order to get you to stick around, these companies are building little models of you. Think of them as voodoo dolls that look like you, which they can wake up and run experiments on to figure out which content is maximally engaging. Facebook recently had to come out and say: we do not listen to your microphone. You have a conversation with a friend, and the next day the product you were talking about, one you've never talked about before, shows up in your newsfeed. It's not because they're listening; it's because their knowledge about you is so precise that they can make that prediction.

When you play Garry Kasparov in chess, you lose, because Garry Kasparov can see more moves ahead than you can. And when Garry Kasparov plays Deep Blue, he loses, because Deep Blue can see more moves ahead than he can. Our minds are now pitted against some of the smartest supercomputers in the world. When you open up YouTube, it's not just some random set of videos that autoplays. The reason you open one video and get lost in a trance for an hour is that you're playing against a supercomputer, and these supercomputers are getting better all the time. So we live in a time of accelerated human hackability.

Where are we going with this? We are just beginning to really be able to reverse-engineer the human. This is from the Gallant Lab at UC Berkeley, where they put a human being inside an fMRI machine and reconstruct what the person is seeing purely from brain data. That means quite literally being able to read your brain. Interestingly enough, when you dream, your visual cortex is activated, which means that over the next five to ten years we're going to be able to start quite literally seeing what people dream about. Which is terrifying.

There's a 2015 paper showing that computers are better at reading micro-expressions than humans are; these are the true, involuntary facial gestures. We often hear that empathy is going to be the thing that saves humanity. Instead, it's going to be the biggest backdoor into the human mind. All the new iPhones do real-time face reading. Just imagine when Netflix or Facebook gets access to the exact moment you get bored, or the exact moment you give a little semi-sadistic smile when something happens, and starts generating new media particular to you.

All of these faces here: none of them are real. They've been generated by an AI. So now you can start thinking: if I want to generate a face that's particularly persuasive to you, all I have to do is take your top Facebook friends and generate a new face that's a mix of them, and for some reason it feels so familiar that you trust it. Generate a voice from social media that's a little bit like your father's or your mother's, and once again you can't help but trust it. The political campaigns of 2020 and 2024 are going to be Google Duplex-style. You're going to get a call, but it's going to come from a call center spun up and run by an AI that has read your social media, knows your language, and uses it back against you in the voice of one of your parents, with the face of one of your friends.
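Here is a minimal sketch of that "voodoo doll" idea: predicting interest in a topic a user has never engaged with, purely from the behavior of similar users. This is a generic collaborative-filtering toy under my own assumptions, not Facebook's actual system; the users, topics, and engagement matrix are all invented for illustration.

```python
import numpy as np

# Toy "voodoo doll": predict interest in a topic a user never engaged with,
# purely from the behavior of similar users. Generic collaborative filtering,
# not any real platform's system; all data below is made up.
topics = ["hiking", "camping_gear", "tents", "sci_fi", "cooking"]
users = ["alice", "bob", "carol", "you"]

# Rows are users, columns are topics; 1 = engaged with that topic, 0 = not.
engagement = np.array([
    [1, 1, 1, 0, 0],   # alice
    [1, 1, 1, 0, 1],   # bob
    [0, 0, 0, 1, 1],   # carol
    [1, 1, 0, 0, 0],   # you: hiking and camping gear, but never "tents"
], dtype=float)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

you = engagement[-1]
others = engagement[:-1]

# Weight each other user's behavior by how similar they are to you...
weights = np.array([cosine(you, other) for other in others])
# ...and predict your interest in each topic as a similarity-weighted average.
predicted = weights @ others / weights.sum()

for topic, score in sorted(zip(topics, predicted), key=lambda t: -t[1]):
    print(f"{topic:>12}: predicted interest {score:.2f}")
# "tents" scores highly even though you never mentioned it: no microphone needed.
```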
This is where it's going. We need to be thinking not just about human-centered design anymore, but about human-protective design. Because we are stationary, while our technology is getting exponentially better. And I'm reminded that it took about ten years from when we first put telescopes into space before we turned them back on Earth and took that iconic picture of the whole planet, the one that created the Whole Earth Catalog and helped start the environmental movement. We now need that movement in technology, where we turn the telescope of our own intelligence back on ourselves and ask: how do we work? What are our capabilities? And also, what are our limitations? Let's not be blinded by the good of what we create, but take a very clear-eyed view of our psychology and our makeup, so that we make technology that doesn't blue light us and our societies, but instead enhances our potential. Because otherwise we're going to keep being pushed toward the extremes, toward a kind of accelerated, extreme human behavior, which leads to the Trumps of the world, which leads to the kinds of fascism and authoritarian rule that we're seeing. That is human-protective design. Thank you very much.