What is surveillance capitalism? It rests on the discovery that private human experience was to be the last virgin wood available for extraction, production, commodification, and sales. People, and that means us, became chattel for commerce. That's exactly what happened. And the results are shaking democracy to its core. They're transforming our daily lives. They're challenging the social contracts that we've inherited from the Enlightenment, and indeed threatening the very viability of human freedom, just as was predicted. Under siege though it may be, the only possible remedy for all of this is democracy. And that's why we're here tonight, of course.

So I think about it this way. You know the story of Alice in Wonderland, yes? Everybody knows the story of Alice in Wonderland? And you remember the White Rabbit, who had the clock and was rushing, "I'm late, I'm late, for a very important date," and he goes down the rabbit hole. Well, the way I think about it is that two decades ago, we were all Alice. We encountered the White Rabbit as he was rushing down his hole, and just like Alice, we rushed after him. We followed the White Rabbit into Wonderland.

What happened in Wonderland? In Wonderland, there were various things that we learned, and it took us two decades to learn them. First of all, we learned that we can search Google. We search Google. But now, two decades later, a fragile new awareness is dawning, and it's occurring to us that it's not so much that we search Google; it's that Google searches us. In Wonderland, we assumed that we use social media, but now we've begun to understand that social media uses us. We thought these were great free services, while these companies were thinking: these are great people, who are free, free raw material for our new operations of analysis, production, and sales. We barely questioned why our television sets or our mattresses came with privacy policies.
But now we're beginning to understand that privacy policies are actually surveillance policies. We admired the tech giants as innovative companies, innovative companies, by the way, who occasionally made some big mistakes, and those mistakes violated our privacy. The difference now is that we're beginning to understand that those mistakes actually are the innovations. Those mistakes are the innovations.

In Wonderland, we learned to believe that privacy is private. We failed to reckon with the profound distinction between a society that cherishes principles of individual sovereignty and one that lives by the social relations of the one-way mirror. Privacy is not private. Privacy is a collective action problem. Privacy is a political challenge. Privacy is about the kind of society that we live in.

Finally, our most dangerous illusion of all in Wonderland: we believed that the internet offered unprecedented access to knowledge. But in the harsh glare of surveillance capitalism, we have come to learn that proprietary knowledge now has unprecedented access to us.

Surveillance capitalists sell certainty, so they compete on their predictions. So let's reverse engineer these competitive dynamics and see what we find. Number one, everybody knows that AI needs a lot of data, right? Everybody knows that. So the first thing is that economies of scale drive them toward totalities of information: we need data at scale. Okay, that's an easy one. Competing on scale is good, but not good enough, because eventually they realize: hey, we need a lot of data, but we also need varieties of data. Now we know that we need economies of scale, but we also need economies of scope, different kinds of data. Now, even though you're not old enough to remember the dot-com bust, many of you are old enough to remember the mobility revolution, right?
So this is the idea that we give you a little computer, you put it in your pocket, and you go. Well, we'll call it a phone, what the heck. And it will go everywhere with you, and now we can get economies of scope: where you are and what you're talking about, who you're with and what transactions you're making, maybe where you're eating and what you're eating, who you're emailing or texting, what kind of browsing you're doing while you're walking in the park or walking through the city. We can get your voice. We can get all kinds of things now. Oh, and don't forget the most important thing of all that we can get with this new computer: your face. We can get all your faces.

Okay, so we've got economies of scale and economies of scope. Prediction continues to evolve, competition continues to intensify, and pretty soon there's a new realization: the most predictive data comes from intervening in your behavior. Intervening in your behavior. Intervening in the state of play in order to actually nudge, coax, tune, and herd your behavior in the direction of the outcomes that we are guaranteeing to our business customers. Herding your behavior in the direction of our revenues and ultimately our profits.

What is new here is that at no other time in history have the wealthiest private corporations had at their disposal a pervasive global architecture of ubiquitous computation, able to amass unparalleled concentrations of information about individuals, groups, and populations, sufficient to mobilize the pivot from the monitoring to the actuation of behavior, remotely and at scale. This, my friends, is unprecedented.

What is this new power? It works its will through the medium of digital instrumentation. It's not sending anybody to our homes at night to take us to the gulag or the camp. It's not threatening us with murder or terror.
It is not totalitarian power, but it is a new and unprecedented form of power, just as totalitarianism presented itself as a new and unprecedented power in the twentieth century. This new power is what I call instrumentarian power. It works its will remotely. It comes to us secretly, quietly. And if we ever know it's there, it might actually greet us with a cappuccino and a smile. Nevertheless, it represents a global means of behavioral modification, and it is the engine of growth for surveillance capitalism.

Okay, so we've now climbed a mountain. We've climbed the mountain of the division of learning, and we've peeked inside the fortress, into the AI hub, into these backstage operations. And what have we found? A frontier operation run by geniuses and funded by immense amounts of capital. Are they solving the climate crisis? Are they curing cancers? Are they figuring out how to get rid of all those plastic particles that are now detectable even in the Arctic snow? No, they're not doing any of that. Instead, all of that genius and all of that capital is dedicated to knowing everything about us and pivoting that knowledge to the remote control of people for profit. I don't like that. This is how the age of surveillance capitalism becomes an age of conquest.

So, you know, we're meant to sleepwalk through all of this. We're meant to be ignorant. This is engineered for our ignorance. Mark Zuckerberg says privacy is the future. Very confusing. So now we're living in a time when we understand that privacy is a collective action problem. And we have to look to only one source for remedies here. That source is democracy. That means law, and that means new regulatory paradigms. When we're talking with Toby, we can get into more detail on this. But I want to call your attention to at least two things that I think are immediately important.
And once we start talking about them and begin to get used to them a little in our imaginations, they won't sound as strange as they might sound when I say them right now. The key thing that confronts us here is to interrupt the incentives for the surveillance dividend. We essentially need to outlaw the surveillance dividend. Once we do that, we open up the competitive space for the thousands and hundreds of thousands and indeed millions of young people, entrepreneurs, and companies who want to produce digital products and services that will address climate, that will address our real needs, that will cure the cancers that plague us, that will do all of the things that we once expected from the digital, and they will be able to do them without having to compete on the surveillance dividend. That's what we need.

So two things I want to suggest: one is that we interrupt supply, and the other is that we interrupt demand. By interrupting supply, I mean that the illegitimate, secret, unilateral taking of human experience for translation into data should be illegal. The surveillance capitalists have fought; this fight, which you heard about in 1997, continues literally every day. They have fought for the right to take our faces whenever and wherever they want to. They take our faces on the street. They take our faces in the park. They take our faces whenever and wherever they want to. Our faces go into their facial recognition systems. Facial recognition systems are trained on data sets, data sets that, we now find out, are often sold to military operations and military divisions, including those that are imprisoning members of the Uyghur minority in northwest China in an open-air prison where the only walls are facial recognition systems. That's what I mean when I say privacy is not private.

Okay, so we interrupt supply. The next thing that we can do is interrupt demand. That means we eliminate the incentives to sell predictions of human behavior. How do we do that?
We make markets that trade in human futures illegal. Other markets are illegal. Markets that trade in human organs are illegal. Why? Because they have predictably destructive consequences for people and for democracy. Markets that trade in human slaves are illegal because they have predictably destructive consequences. Markets that trade in human babies are illegal because they have predictably destructive consequences. Markets that trade in human futures should be illegal because, first, they are the enemies of human autonomy: their competitive dynamics require economies of action for which human agency is the enemy. And second, because they inevitably produce the extreme asymmetries of knowledge, and of the power that accrues to knowledge, that create epistemic inequality and epistemic injustice.

Surveillance capitalists are rich and powerful, but they are not invulnerable. They have an Achilles heel. Do you know what that is? They fear law. They fear lawmakers who are not confused and intimidated. But ultimately, they fear you. They fear citizens who are ready to demand a digital future that we can call home. Thank you.