Ladies and gentlemen, a warm welcome here to the Urania. Since December 2017, in cooperation with the Humboldt Institute for Internet and Society, we have been exploring how digital technology is transforming our society and what a European perspective on these transformation processes might look like. We have held several lectures in this series: Manuel Castells spoke about power in the digital society; Eva Illouz spoke about capitalist subjectivity and the internet; and the most recent lecture was by Armin Nassehi, who asked: for what problem is digitalization a solution? After all these lectures, I am very glad that we will continue the series next year, and I hope you will continue with us. A few of the most important questions in this lecture series remain highly relevant, especially in these days when we commemorate the peaceful protests in the GDR in 1989 and the fall of the Wall. How does power express itself in the digital society? Are we witnessing a revival of democracy through increased transparency and participation? Or are we instead observing fragmented publics and the erosion of the private sphere? Thirty years ago, the citizens of the GDR freed themselves from omnipresent surveillance. Today, the majority of people are aware of the potential danger of constant surveillance, but they readily surrender to the ecosystems of different digital platforms and thereby become data suppliers and, to put it in an exaggerated way, products of a few digital companies.
We have to continue to analyze these developments and debate them in society in order to shape digital change together. Against this background, it's a great honour for me to welcome Shoshana Zuboff as a speaker here. She wrote an impressive book, The Age of Surveillance Capitalism, which exposes the threat to autonomy and democracy posed by the monopoly power of the internet corporations. Thank you very much for agreeing to take part in our lecture series. I'd now like to hand over to Christian Katzenbach from the Humboldt Institute, and it remains for me to wish you a fascinating evening.

Good evening. It's my pleasure to welcome you all to this special evening. I do that in the name of both the Humboldt Institute for Internet and Society and the German Communication Association. This evening is special for two reasons. It is special, of course, because of our esteemed guest, Shoshana Zuboff, who is without doubt one of the leading critical analysts of our time. In her work, she scrutinizes the transformation of capitalism that profoundly changed how we organize and live our social, economic, political and private lives. With that theme and scope, she's the perfect speaker and highlight for a lecture series making sense of the digital society. We started this series almost two years ago with a lecture by Manuel Castells and have since hosted almost a dozen exciting and prolific speakers. We have always foregrounded readings of the emerging digital society that address big-picture questions but are at the same time based on rigorous empirical research. And you can see from her book that Shoshana Zuboff has done her homework in that respect. These kinds of lectures seem to hit a nerve, given the turnout for each and every one, although today definitely constitutes a new high in that respect. Thus I'm happy and grateful to see this series continue into the next year, its third year.
So thank you to the Federal Agency for Civic Education, namely Thomas Krüger and Sascha Scheier, for this wonderful cooperation and your support. But this evening is also special because it constitutes a first in this lecture series: this lecture today is not only part of the lecture series, it also kicks off an academic conference that takes place over the next two days in Berlin. So I also have the pleasure to warmly welcome the participants and speakers of the conference Automating Communication in the Networked Society: Contexts, Consequences and Critique, who are all, or at least most of you, among the audience tonight. So you all have the chance to conduct probably even more informed conversations over your after-show drinks than usual in this series. This conference is the annual conference of the section Digital Communication in the German Communication Association, and this year it is jointly organized by the Weizenbaum Institute for the Networked Society and the Humboldt Institute. I'm profoundly happy to see this cooperation between the two Berlin-based German internet institutes. We jointly contribute to the city's remarkable, growing research community addressing the digital transformation. For the conference we chose automation as a theme in order to give the current heated debates about AI and algorithms more historical depth and context. And given the tight connection between capitalism and automation, who could be better prepared to kick off this conference than an expert in capitalism and its digital transformation? So, Shoshana, we are all more than happy and honored to have you here in Berlin today. But before you enter the stage, I need to hand over to our wonderful moderator, Tobi Müller, who will guide us through the evening. And I promise your patience will be rewarded: Tobi is a true master in properly introducing our speakers. Tobi, the floor is yours.

Don't get your hopes up too high now. Good evening, everybody. What a turnout.
As much as I can see, anyway; you were all at my back, and now I can see a little bit better. So, about a year ago, tonight's much-anticipated guest thought she would be on the road for maybe three or four weeks, as she told me on the phone. It turned out to be much more: it has been 11 months, mostly, due to the great success of her book The Age of Surveillance Capitalism. But you will find the idea of home, as opposed to exile, in her writings too, as a space that is relatively safe from violence, exploitation or powerlessness. She asked that question of home or exile 31 years ago in her first book, In the Age of the Smart Machine: The Future of Work and Power. It had to be the opening chapter in what became a lifelong quest to answer the question: can the digital future be our home? As you have heard, tonight's event is also part of the series Making Sense of the Digital Society, and of course we have talked a lot about the impact of digital technologies on society. But we have also talked about a notoriously hard-to-define term: capitalism. Late capitalism, neoliberal capitalism, capitalism. This term is like a giant rock, or several giant rocks actually: rather amorphous, dark and very hard to move, for a train of thought to pass without difficulty, for a conversation to take shape. Tonight is the night where those two, the impact of technology and capitalism, come together and take center stage. Because our guest has worked for several decades on how quite different forms of capitalism define the quality and shape of the future consequences of said technological impact. Her increasingly warning voice was also heard in this country and given space accordingly by the late Frank Schirrmacher, co-publisher of the Frankfurter Allgemeine Zeitung and head of its feuilleton, as we say here in a mix of French and German: "feature pages" and "arts section" are but weak translations to convey the discursive power Schirrmacher's paper had at the time.
Her body of work is impressive, as is her career as one of the first women to get tenure at Harvard Business School. But after 11 months of hearing her CV read out to her every other night, she asked me not to do that and to be a bit less formal. I find that very refreshing; most of you have probably checked Wikipedia anyway, or, as I hope, have read her book. Speaking of refreshing websites, we're going to try out something new tonight in terms of audience participation. We're trying to get a little bit more focus and a little bit more diversity in gender and age when we converse here after the talk, and it is a tool called Slido. May we please show the slide: on slido.com you can type in your questions, and you all have the chance to vote those questions up or down, so we can guarantee that other people are also interested in your questions. Those questions will be read out to us by somebody from the Humboldt Institute for Internet and Society. We do have, I think, two microphones here on the floor, but as you can see it will be quite hard to pass the microphone around, and we don't really like to give out the microphone. So if you don't feel comfortable with those devices, you do get the chance to ask your question in person, but it will be a minority tonight, so please try to use that tool. It's not part of the surveillance-capitalist complex, I hope; it's actually made for something quite different. And we are very many tonight, so time is limited: no co-lectures or co-speeches, please. That, of course, should apply to me too. So before our guest takes the stage, I will let others speak of her. Here's some praise about the book.
The New York Times wrote: "Light on prescriptive notions, Zuboff does propose a right to sanctuary, based on universalist, if ever more threatened, humanitarian principles like the right to asylum. But she's after something bigger: providing a scaffolding of critical thinking from which to examine the great crisis of the digital age. Through her we learned that our friends to the north were indeed correct: Facebook is the problem, along with Google, Microsoft, Amazon and others. This is the rare book that we should trust to lead us down the long, hard road of understanding." The Guardian in London counted the book among the 100 most important of the 21st century, and the London-based writer Zadie Smith wrote: "Zuboff is concerned with the largest act of capitalist colonization ever attempted. But the colonization is of our minds: our behavior, our free will, our very selves. Yet it's not an anti-tech book; it's anti unregulated capitalism, red in tooth and claw. It's really this generation's Das Kapital." Naomi Klein, last but not least, said: "The hour is late and much has been lost already, but as we learn in these indispensable pages, there is still hope for emancipation." Klein thereby points to the remedy section, which will, so to speak, play a certain role tonight, I think. But we're not going to hear an abstract of The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power; we are actually going to get a glimpse of ideas for a next book, with Cambridge, that will focus on the epistemic inequality produced by surveillance capitalism and why this is a threat to democracy. I'm extremely pleased to be with you tonight. Please welcome Shoshana Zuboff.

Oh my goodness, thank you so much for that beautiful introduction, Tobi; for your appreciation of that work I'm so grateful. And thank you, Christian; thank you, Thomas. What a wonderful night this is for me, to be here in Berlin to talk about this work. Tobi mentioned my friend Frank Schirrmacher, and I do want to dedicate my talk tonight to Frank. It was
through Frank that I was introduced to Berlin and fell in love with Berlin, and I feel like there's a piece of my heart that lives in this city and always shall. So I'm so excited. Of course I wish he could be with us tonight, but I'm so excited to finally be here and to be able to share this work with you. There are a few things that I want to do tonight. I want to talk about some of the ideas that are in the book, but also move that forward: driving forward into implications, driving forward into a more careful thinking about remedies, the word I was using with Tobi. What are some of the things that we can start contemplating as the way that we come together to move through and beyond this age of surveillance capitalism? Which, as you know if you've read the book, or at least maybe the first page and the last page: on the last page I say, "the age of surveillance capitalism, may it be a short one," and that of course is up to us. So, funnily enough, Tobi brought up the New York Times, and I'm going to start with the New York Times. This is a piece from the New York Times. You know what the Federal Trade Commission is? The Federal Trade Commission is the agency in the United States that now has most of the jurisdiction over commerce on the internet. So when we think about regulating the surveillance capitalists, we're talking about the Federal Trade Commission. So here's a New York Times reporter who's describing what he calls an unusually heated debate about privacy, individual rights and law at the Federal Trade Commission. He says, of course, industry was represented there and civil society was represented there, and the industry executives were arguing, quote, "that they were incapable of regulating themselves and that government intervention would be costly and" (remember this word, because we're going to come back to it later) "counterproductive." All right, so that's the executives. The civil libertarians were warning that the companies' data collection
and analysis capabilities posed, quote, an "unprecedented threat to individual freedom." Then there was another advocate there, someone else from a civil society organization, and this is what he said, quote: "We have to decide what human beings are in the electronic age. Are we just going to be chattel for commerce?" Finally, one of the commissioners asked the following question: "Where should we draw the line?" Now, all of this sounds familiar to you, doesn't it? Does it sound familiar? A familiar debate, familiar points of view. Here is what's so interesting about this article: it was published in 1997. So I think we know the outcome of this story, who won the argument. The executives won the argument, and they got their way. They got everything they asked for, in the United States more than in Germany and more than in Europe, but still, relatively speaking, a near absence of law constraining what they wanted to do. Surveillance capitalism is the fruit of this victory, of a battle whose battle lines were already drawn in 1997, at the dawn of the internet. So what is surveillance capitalism? It rests on the discovery that private human experience was to be the last virgin wood available for extraction, production, commodification and sales. People, that means us: we did become chattel for commerce. That's exactly what happened, and the results are shaking democracy to its core. They're transforming our daily lives, they're challenging the social contracts that we've inherited from the Enlightenment, and indeed threatening the very viability of human freedom. Yet under siege though it may be, democracy is the only possible remedy for all of this, and that's why we're here tonight, of course. So I think about it this way a little bit. You know the story of Alice in Wonderland? Yes? Everybody knows the story of Alice in Wonderland. And you remember the White Rabbit, who had the clock and was rushing, "I'm late, I'm late, for a very important date," and he goes down the rabbit hole. Well, the way I
think about it is this: two decades ago, we were all Alice, and we encountered the White Rabbit, and he was rushing down his hole, and just like Alice we rushed after him. We followed the White Rabbit into Wonderland. What happened in Wonderland? In Wonderland there were various things that we learned, and it took us two decades to learn them. Okay, first of all, we learned that we can search Google. We search Google. But now, two decades later, there is a fragile new awareness dawning, and it's occurring to us that it's not so much that we search Google; it's that Google searches us. In Wonderland, we assumed that we use social media, but now we've begun to understand that social media uses us. We thought that these are great free services, while these companies were thinking: these are great people who are free, free raw material for our new generations of analysis, production and sales. We barely questioned why our television sets or our mattresses came with privacy policies, but now we're beginning to understand that privacy policies are actually surveillance policies. We admired the tech giants as innovative companies: innovative companies, by the way, who occasionally made some big mistakes, and those mistakes violated our privacy. The difference now is that we're beginning to understand that those mistakes actually are the innovations. Those mistakes are the innovations. In Wonderland, we learned to believe that privacy is private. We failed to reckon with the profound distinction between a society that cherishes principles of individual sovereignty and one that lives by the social relations of the one-way mirror. Privacy is not private. Privacy is a collective action problem. Privacy is a political challenge. Privacy is about the kind of society that we live in. Finally, our most dangerous illusion of all: in Wonderland, we believed that the internet offered unprecedented access to proprietary knowledge, but in the harsh glare of surveillance capitalism we have come to learn that
proprietary knowledge now has unprecedented access to us. The digital century was to have been democracy's golden age. Instead, we enter the third decade of the 21st century marked by an extreme new form of social inequality that threatens to remake society as it unmakes democracy. This new inequality is not based on what we can earn but on what we can learn. It represents a focal shift from ownership of the means of production to ownership of the production of meaning. This is what I call epistemic inequality, defined as unequal access to learning, now imposed by private commercial mechanisms of information capture, production and analysis, and best exemplified by the growing abyss between what we know and what can be known about us. Unequal knowledge about us produces unequal power over us, and so the abyss widens further, marking the distance now between what we can do and what can be done to us. These growing asymmetries ensure that epistemic inequality will be a critical social contest of our time. Twentieth-century industrial society was based on the division of labour, and it followed that the struggle for economic equality would shape the politics of that time. Our digital century shifts society's coordinates from a division of labour to a division of learning, and it follows that the struggle over access to knowledge, and over the power conferred by such knowledge, will shape the politics of our time. These contests pivot on three essential questions about knowledge, authority and power, and these frame the fight for epistemic rights and epistemic justice. Three questions: Who knows? Who decides who knows? Who decides who decides? The answers to these questions will determine the fate of equality after Wonderland. All right, let's talk a little bit about surveillance capitalism, because this inequality is forged in the backstage operations of surveillance capitalism, its one-way-mirror operations engineered for our ignorance, wrapped in a fog of rhetorical misdirection, euphemism and mendacity. Invented
at Google at the turn of the digital century, surveillance capitalism begins with the secret theft of private human experience, now declared as free raw material in the form of behavioral data. These flows of behavioral data are conveyed through complex supply chains (devices, apps, third parties) into a new kind of factory: computational factories called artificial intelligence, machine intelligence, where the data are manufactured, as occurs in all factories, into products. But these now are specific kinds of computational products: behavioral predictions, predictions of what we will do soon and later. In case you think I'm exaggerating, there is a leaked Facebook document (and I draw your attention to the word "leaked"; you know, it's crazy how much we have to depend upon leaked documents and whistleblowers to understand what's going on in these backstage operations). So this leaked Facebook document came out about two years ago, and maybe some of you read about it. It's a document about Facebook's computational factory, which they call their, quote, "prediction engine." They're describing what happens in this artificial intelligence hub, and they note that their machine intelligence, their AI hub, is now capable of ingesting trillions of data points every day, and that the company is now able to produce six million predictions of human behavior each second. That's what's happening inside the factory. So these predictions are about us, but they're not for us. Where do they go? They're sold to business customers. It turns out that businesses are very interested in what we're going to do; they're very interested in our futures. So the predictions are sold to business customers in a new kind of market that trades exclusively in human futures, our futures. Just as we have markets that trade in wheat futures and pork-belly futures and oil futures, we now have markets that trade in human futures. In other words, surveillance capitalists sell certainty. That means they're competing with each other on the quality of
their predictions, and this is a form of trade that has birthed the richest and most powerful companies in history. All right, so this was invented at Google. The invention, the process, began in 2000, 2001, and we didn't really start to learn anything about it until the company went public, which was 2004, and they had to make public their initial public offering documents. So here's what we learned from those documents, and this is really crazy, so listen to this number I'm about to say. Between 2000 and 2004, the revenue line... now let me underscore something. 2000: why did they invent surveillance capitalism? You remember what 2000 was? I can't see you; now I can see you a little bit better. A lot of people in this room don't remember 2000, because you either weren't born or you were too little. 2000 was the time called the dot-com bust. Everybody in Silicon Valley was going broke, and they were all panicked. That's when Google announced a state of emergency; they declared that famous state of exception where they were going to let go of all of their previously held values and principles, and that's how they invented surveillance capitalism. The point here is that they were in a financial emergency in 2000 because they couldn't figure out a way to monetize, and their own venture capitalists were threatening to pull out. So that's the background here; let's get back to the story. Between 2000 and 2004 (and of course these are the years when they invented this new logic and started applying it, so is everybody clear on this, between 2000 and 2004?), and now we're finally going to get to the punchline: their revenues increased by 3,590%. That's a very big number, okay? So what is that? This is a startling number, and it represents something that I call the surveillance dividend. That number would not be there were it not for this new logic of surveillance capitalism as described here: the surveillance dividend. And what did that do? Literally overnight, it raised the bar for every investment, first in
Silicon Valley, in the tech sector, but eventually of course this has had effects through all economic sectors, across our economies. Now imagine you're a venture capitalist, an investor: you can invest in a company that can increase its revenues in four or five years by 3,590%, or you can invest in a company that's going to do innovation the older way, like Henry Ford, and actually invent a product that everybody wants. Which one are you going to invest in? The answer is obvious: the surveillance dividend. All right, so what do we learn here? The surveillance dividend is the center of this. Surveillance capitalism produces the surveillance dividend, which has driven this logic not only through the tech sector but through our economies. Surveillance capitalism is not the same as technology. Surveillance capitalism is not an inevitable consequence of digital technology. Surveillance capitalism is not restricted to technology companies; it now redefines businesses in every sector. So I'm going to tell you a great story about this. This is a story about chasing the surveillance dividend and what's happening inside our economies. And just to make this symmetrical, let's go back to the beginning of the 20th century and the Ford Motor Company, the birthplace of mass production as we know it. You remember the Model T? Henry Ford, the Model T: the most successful product ever sold, until the iPod. So today we have the Ford Motor Company and a new CEO, not Henry Ford, and this CEO, Jim Hackett, is facing what some of you may know is a global slump in auto sales. Auto sales are down, and they're not coming back. What is the CEO of the Ford Motor Company to do? Well, if you were Henry Ford, you might say: hey, I know, let's invent a car that compels people to buy it. How about a car that's completely affordable and doesn't burn any carbon? That's a good idea. That's not what the Ford Motor Company is up to. Mr.
Hackett says: I want to attract investment the same way that Facebook and Google do. So what I need to do is find data. Wait a minute, I've got a great idea, he says: there are a hundred million people driving Ford vehicles, so let's stream data from all those people. Then we can combine it with the data we have in the Ford credit business, where, he says, "we already know everything about you." Now we have a data set, we have data flows that are on a par with Google and Facebook; who would not want to invest in us? Chasing the surveillance dividend. No more cars, he says; now we have a "transportation operating system." Chasing the surveillance dividend. And here's what a Wall Street analyst says about it. Listen, this is a great idea, he says: Ford could make a fortune monetizing these data flows. They won't need engineers, they won't need workers, they won't need factories, and they won't need dealers. Pure profit. It's pure profit; they can make a fortune. Okay, so you've got the picture. Now we're following the money. Follow the money: that is the whole point here. An economic logic, human-made. Let's follow the money and see where it leads us. You ready? Yes? All right. To follow the money, what do we have to do? We have to look at the competitive dynamics inside this kind of marketplace. Remember what kind of marketplace it is: it trades in human futures. What are the competitive dynamics in this kind of marketplace? I know this is Berlin and you're not used to audience interaction; I'm an American, what can I say? I want to hear from you too. I have to know that you're hearing me, I have to know that you're with me. All right, so we said surveillance capitalists sell certainty, so they're competing on their predictions. Let's reverse-engineer these competitive dynamics and see what we find. Number one: everybody knows an AI needs a lot of data. Everybody knows that. So the first thing is economies of scale, which drive them toward totalities of information. We need data at scale. Okay, that's an easy one. Moving on:
scale is good, but not good enough, because eventually they realize: hey, you know what, we need a lot of data, but we also need varieties of data. (Just wanted to mention that I have some very nice bottles of water here, but they're not open, so now I have to do this in front of all of you, and you are going to see how completely hopeless I am. Oh god, I've got to do this. Well, please let me do this, excuse me one second. I'm going to get a glass; I guess we ran out of glasses. Okay, thank you. As you can tell, I kind of have a cold, so water is good.) All right. Okay, so now we know that we need economies of scale; we also need varieties, so we need economies of scope: different kinds of data. Now, even though you're not old enough to remember the dot-com bust, many of you are old enough to remember the mobility revolution, right? So this is the idea that we give you a little computer, you put it in your pocket and off you go. We'll call it a phone, and it will go everywhere with you, and now we can get economies of scope: like where you are, and what you're talking about, who you're with, and what transactions you're making, and maybe where you're eating and what you're eating, and who you're emailing or texting, or what kind of browsing you're doing while you're walking in the park or walking through the city. We can get your voice; we can get all kinds of things now. Oh, and don't forget the most important thing of all that we can get with this new computer: we can get your face. We can get all your faces. Okay, so we've got economies of scale and economies of scope. Prediction continues to evolve and continues to intensify, and pretty soon there's a new realization: the most predictive data comes from intervening. Intervening, excuse me, in your behavior: intervening in the state of play in order to actually nudge, coax, tune, herd your behavior in the direction of the outcomes that we are guaranteeing to our business customers, herding your behavior in the direction of our revenues and ultimately
our profits. Okay, so this is something new. This isn't just scale and scope, which we're familiar with; this is something new, and it tracks a process that data scientists talk about: the shift from monitoring to actuation. That shift is a point in systems management where you have so much information about a system that the information cascades over a tipping point, and with that cascade you can begin to remotely control the system. You now know so much about it that you can remotely control it. That happens in the management of machine systems. But now the idea is: how do we make this work in the management of human systems? Monitoring to actuation. Okay, so the idea now is: we've got to figure out how to do this. This has never been done before at scale, automated at scale. So this is what I call economies of action. Economies of scale, economies of scope, familiar; economies of action: how do we automate the remote control of human behavior at scale? All right, this is a whole new experimental zone; this is something that has never been done before. It's hard to learn about, because, as I said at the beginning, these are backstage operations. But it turns out some of these experiments are hiding in plain sight, so we can learn something about them. So here's one that you probably read about (now I know you're old enough to know about this): Facebook's, what they call their "massive-scale contagion" experiments. They published one in 2012 and another one in 2014. The first one was to see if they could change people's voting behavior: not necessarily who they voted for, but just to get them to go vote rather than not voting at all. The second one was to see if they could change people's emotions, make them happier or sadder. When the researchers wrote up these experiments, both 2012 and 2014, they celebrated two findings. Number one: we now know that we can manipulate cues and social comparison dynamics on Facebook pages to
change real-world behavior and emotion. We know we can do that. Number two: we now know that we can do this while bypassing user awareness. It's undetectable; they never know that we're doing it. That's what makes successful economies of action. Why? Awareness is friction. Friction is expensive. If I know about it, I might refuse; I might look for a way to hide; I might look for a way to camouflage. So awareness is friction; awareness is the enemy. These kinds of systems have to be designed to bypass awareness. Okay, great: contagion experiments. Now we move on to an even more sophisticated zone of experimentation, and this one I am certain that you know about. How many people in this room went out in the streets of Berlin and played Pokémon Go with your friends and family? Come on, audience participation. You can be honest; we're all friends here. Oh, don't be shy. I know this isn't true; I know you're not telling the truth. Well, did you know that Pokémon Go came from Google? Did you know? Is that because you read my book? So Pokémon Go was incubated in Google. Now, of course, Germany was famous for being the first country to contest Street View, right? Pokémon Go was invented by the same guy who was the boss of Street View, who was the same guy who invented Google Earth before that, back when it was called Keyhole and was invested in by the CIA before Google bought it. So this man, John Hanke, has a long history of knowing how to fill the supply chains on their way to the new factories. John Hanke had a little shop inside Google called Niantic Labs, and that's where they incubated these new augmented-reality games, including Pokémon Go. When they brought it to the market, of course, they distanced themselves from Google: Niantic Labs became an independent little company and brought it to market that way, so no one would know that this came out of Google. It turns out that when you were playing Pokémon Go, you were actually playing a little game within a bigger game. All right, so let's go back to the first
round of surveillance capitalism. What was the first really, really successful prediction product? That was the click-through rate. We think of it as just a click-through rate, but you only have to think about it for another couple of seconds to realize that the click-through rate is a computational fragment that's predicting a piece of human behavior. And of course, what were the first markets in human futures, where these click-through rates, these predictions, were sold? That first, insanely lucrative market in human futures was called online targeted advertising, and it's still insanely lucrative. However, we now see that same structure juxtaposed, translated to the real world. In Pokémon Go, Niantic Labs had established its own human futures markets. They had business customers, not online but in real life, like McDonald's and Starbucks, the real shops, the real establishments, or Joe's Pizza and Harry's Bar. They had these businesses paying them not for guaranteed click-through but for guaranteed footfall: people's real feet falling on the real floor of real places. Guaranteed footfall. And so the idea with Pokémon Go was how to use gamification, the rewards and punishments of gamification, in order to herd people through the cities to the places where their feet were guaranteed to be. So that's another phase in the work of economies of action and figuring that out. Now here's another phase; it comes a little bit later. Now we're back at Facebook, and this comes from another leaked document, this one written by Australian Facebook executives. Well, I don't know that they were Australian; let me put that another way. This was written by Facebook executives for its Australian and New Zealand customers. I've always kind of assumed the executives were Australian, but I don't know that for a fact. So this is a report that is selling its business customers on the
following idea. We have so much information now (remember, monitoring to actuation), we have so much information on 4.4 million young people, high school students, college students and young adults in Australia and New Zealand, that we can now predict their emotional state on a daily and weekly basis. We can see their emotional cycles across the seven days of the week, and within this emotional cycle we can predict things like whether they feel stressed, defeated, overwhelmed, anxious, nervous, stupid, silly, useless or a failure. And with these predictions we can alert you to the exact moment of maximum vulnerability, when, if you send a message that contains a confidence boost, you will be successful. So, for example, let's imagine that you have a sexy black leather jacket to sell. Well, we can tell you when to sell it, how to sell it, what to say in your message. Make sure you sell it on a Thursday night, because that's when they're most anxious, because the weekend is about to appear. Tell them that you can have it delivered for free to their door the next morning, throw in a little price discount, and we can guarantee you success. All right, so that's another phase: economies of action, monitoring to actuation. Finally, we're seeing the next phase unveil itself now, literally as we speak. Some of you who follow smart-city developments might know that just the other day the officials in Toronto made some decisions about Sidewalk Labs, the subsidiary of Google, slash Alphabet, that specializes in its smart-city work. You know, they used to call it the Google city, but they don't do that anymore; they call it the smart city. They're trying to get the waterfront area in Toronto rebuilt as a Google city, and this is a dynamic that's been going on for a couple of years and has become very contested, with many citizens getting involved. And just the other day some of these officials in Toronto actually made a very good decision and curtailed the
development of this plan substantially. But the key point here is what you see when you look at the documents behind this Sidewalk Labs proposal. In fact, just last week The Globe and Mail found some secret documents that really hadn't been reviewed by the public and finally made them public, and it's fascinating what you see there. If you read all of these documents with what we've just been talking about in mind, they are a clear declaration of epistemic dominance and of the intention to use that dominance for behavioral modification at scale. I'm not going to go into the details, but you can trust me on that. All right. So, you know, sometimes I hear people saying to me: you know, Shoshana, I take your point, but really, businesses, advertisers, commerce have always tried to persuade people, always tried to change people's behavior and get them to buy something that they didn't want to buy. Really, Shoshana, there's nothing new about this. And of course that's true: there is nothing new about our desire to persuade each other to do things that we might not otherwise have done, or maybe to do things that we don't even want to do. There's nothing new about human persuasion. But let's not lose our bearings, because what is new here is that at no other time in history have the wealthiest private corporations had at their disposal a pervasive global architecture of ubiquitous computation, able to amass unparalleled concentrations of information about individuals, groups and populations, and to mobilize the pivot from the monitoring to the actuation of behavior, remotely and at scale. This, my friends, is unprecedented. What is this new power? It works its will through the medium of digital instrumentation. It's not sending anybody to our homes at night to take us to the gulag or the camp. It's not threatening us with murder or terror. It is not totalitarian power, but it is a new and unprecedented form of power, just as totalitarianism presented itself as a new and
unprecedented power in the 20th century. This new power is what I call instrumentarian power. It works its will remotely; it comes to us secretly, quietly, and if we ever know it's there, it might actually greet us with a cappuccino and a smile. Nevertheless, it represents a global means of behavioral modification and is the engine of growth for surveillance capitalism. Okay, so here we've now climbed a mountain. We've climbed the mountain of the division of learning, and we've peeked inside the fortress, into the AI hub, into these backstage operations. And what have we found? A frontier operation run by geniuses, funded by immense amounts of capital. Are they solving the climate crisis? Are they curing cancers? Are they figuring out how to get rid of all those plastic particles that are now detectable even in the Arctic snow? No, they're not doing any of that. Instead, all of that genius and all of that capital is dedicated to knowing everything about us and pivoting that knowledge to the remote control of people for profit. I don't like that. This is how the age of surveillance capitalism becomes an age of conquest. So we're meant to sleepwalk through all of this. We're meant to be ignorant; this is engineered for our ignorance. Mark Zuckerberg says privacy is the future. Very confusing. They just really think that we're stupid. And because we're meant to sleepwalk through this, when something actually rises up out of the fog to send us a message, well, it really gets our attention. This is what happened with Cambridge Analytica, isn't it? Chris Wylie, here's the whistleblower. Chris Wylie says: this is what we've been doing. That really got our attention. Let's just take a minute and look at what Chris said Cambridge Analytica was doing. He said, quote: we exploited Facebook to harvest millions of people's profiles, and then we built models to exploit what we knew about them and target their inner demons. Does that sound familiar? Does it? Yes. The objective was behavioral micro-targeting:
influencing voters based not on their demographics but on their personalities. Does that sound familiar? He says: I think it's worse than bullying, because at least with bullying people know what's being done to them; with what we do, he said, people don't even know what's being done to them. He says: if you do not respect the agency of people, then anything you're doing after that point is not conducive to a democracy. Well, yeah, that's for sure. All right, so then he concludes: Cambridge Analytica was information warfare. That is an acknowledgment that information warfare originates in epistemic inequality; information warfare is impossible to prosecute without that information dominance, that information advantage. But what remains poorly understood, even today, is that Cambridge Analytica only repeated the mechanisms and methods that represent everyday life for every self-respecting surveillance capitalist. I mean, what more apt description of the treatment of those young people in Australia and New Zealand, whose social anxieties were manipulated for profit, than to say: we built models to exploit their inner demons? How apropos is that? So here is this political consultancy that got the world's attention, and still has the world's attention, when actually all it was was a parasite, a parasite in a host. And the host body was not just Facebook; the host body was surveillance capitalism itself. It's surveillance capitalism that provided the three things that the people who study information warfare say are essential for its success: the conditions, the weapons and the opportunity. It was surveillance capitalism that provided the conditions, through the ubiquitous datafication of human experience. It was surveillance capitalism that provided the weapons: the data, the methods and the mechanisms, the predictive analyses, the intimate simulations of individuals, micro-targeting, the techniques for subliminal influence and the manipulation of social comparison dynamics, the mastery of hidden real-time
experimentation, all of it pioneered in surveillance capitalism. The weapons. And finally, it was surveillance capitalism that provided the opportunity: the fact that all of these mechanisms can be applied while completely circumventing human awareness. It can all be done in secret, and that provides a massive opportunity for successful information warfare. The conclusion can only be this: what we have failed to recognize is not that Cambridge Analytica represents information warfare, and not that information warfare is strictly a function of the state, or increasingly even of non-state political actors. It turns out that surveillance capitalism, and its illegitimate conversion of knowledge into power, is best understood as the normalization and institutionalization of information warfare for profit. That is the world we are living in today. Okay, so I want to conclude with just a couple of thoughts that will allow us to turn the lights on in a minute without everybody feeling really depressed. Because, you may not be able to tell right now from my voice, which is a little warped, but I'm actually very optimistic about our ability to change this. And in fact, to be very candid with you, some of my optimism comes from your country. Some of my optimism comes from seeing how the generations in your country and in this city learned to confront and internalize the lessons of totalitarianism and completely change the fabric of your culture and your institutions and your laws. I have so much respect for that, and I think it reminds us of a larger pattern here, which is that as democratic societies we have confronted grave problems in the past and we have overcome them. We ended the Gilded Age. We overcame totalitarianism. And in fact we have used the levers of our democracy to ensure that the post-war world was a prosperous world for ordinary people, that the post-war world was the age of the middle class, and that capitalism, market
capitalism, could actually promote, and itself be strengthened by, democracy. That was part of the legacy of the post-war years. So now we're living in a time when we understand that privacy is a collective action problem, and we have to look to only one source for remedies here, and that source is democracy. That means law, and that means new regulatory paradigms. When we're talking with Tobi we can get into more details on this, but I want to call your attention to at least two things that I think are immediately important. Once we start talking about them and begin to get used to them a little bit in our imaginations, they won't sound as strange as they might sound when I say them right now. The key thing that confronts us here is to interrupt the incentives of the surveillance dividend. We essentially need to outlaw the surveillance dividend. Once we do that, we open up the competitive space for the thousands and hundreds of thousands and indeed millions of young people, entrepreneurs, companies who want to produce digital products and services that will address climate, that will address our real needs, that will cure the cancers that plague us, that will do all of the things that we once expected from the digital. They will be able to do them without having to compete on the surveillance dividend. That's what we need. So I want to suggest two things: one is that we interrupt supply, and the other is that we interrupt demand. By interrupting supply I mean that the illegitimate, secret, unilateral taking of human experience for translation into data should be illegal. The surveillance capitalists have fought this fight, the one you heard about, since 1997, and it continues literally every day. They have fought for the right to take our faces whenever and wherever they want. They take our faces on the street, they take our faces in the park, they take our faces when and wherever they want. Our faces go into their facial recognition systems, into the training datasets, datasets we
now find out are often sold to military operations, military divisions, including those military operations that are imprisoning members of the Uyghur minority in western China in an open-air prison where the only walls are facial recognition systems. That's what I mean, by the way, when I say privacy is not private. Okay, so we interrupt supply. The next thing that we can do is interrupt demand, and that means we eliminate the incentives to sell predictions of human behavior. How do we do that? We make markets that trade in human futures illegal. Other markets are illegal. Markets that trade in human organs are illegal. Why? Because they have predictably destructive consequences for people and for democracy. Markets that trade in human slaves are illegal because they have predictably destructive consequences. Markets that trade in human babies are illegal. And markets that trade in human futures should be illegal too: first, because they are the enemies of human autonomy, because their competitive dynamics require economies of action for which human agency is the enemy; and second, because they inevitably produce the extreme asymmetries of knowledge, and the power that accrues to knowledge, that create epistemic inequality and epistemic injustice. Okay, so now we are at the end. What do I want to say to you? And now the question is not whether you are old enough but whether you are young enough to know this. Greta Thunberg. Greta Thunberg says our house is on fire, succinctly framing the climate cataclysm. Our house is on fire. So I'd like to suggest that as global warming is to the planet, our house, surveillance capitalism is to society, our home. Not only is our house on fire, but our home is on fire. This fire, though, is not kindled in the implacable physics of the climate crisis; it's kindled in a human-made logic, a human-made economic logic. Anything that humans make can be unmade. All we have to do is decide. Like the Berlin Wall: you decided, and ultimately it came down. Surveillance
capitalists are rich and powerful, but they are not invulnerable. They have an Achilles heel. Do you know what that is? They fear law. They fear lawmakers who are not confused and intimidated. But ultimately they fear you. They fear citizens who are ready to demand a digital future that we can call home. Thank you. I just want to say, I did have my watch up there; it didn't do me much good, though, did it?

Thank you so much for that. That means the world to me, it really does. I'm doing a lot worse than you with the water bottle here, actually. Oh, behold. Yes, well done. Thank you, Tobi.

Thank you so much, Shoshana, for that very interesting outline of surveillance capitalism, and for the call to arms, so to speak, at the very end. I would actually like to start this conversation by referring once again to the city we're in. Even I am not old enough, which may come as a surprise to most of you, to have witnessed the fall of the Wall here in Berlin. That's because I'm not German; I was in another small European country, apparently everybody knows that. Obviously, the GDR was a state that relied heavily on analog surveillance. It was estimated that roughly 200,000 people were on the payroll of the secret police, the Stasi, in a nation that had, how many, 16 million inhabitants. So there's a conversation about how surveillance changes behavior when it comes to analog mass surveillance. It's still a difficult conversation to have in Germany, actually, but the conversation is there. We have talked about this for quite a while in terms of speech, public movement, action taken, action not taken, things like that. But I'm wondering: in 2013, after the Snowden leaks, after that story broke all over the world, in what ways do you think surveillance capitalism has already changed our behavior so far? Because I don't hear a lot about that.

Well, I mean, I think it has changed our behavior, and probably, I think, the younger we are, the younger one is, the more one's
behavior is likely to have been changed, and the less one has any possibility of even knowing that. So we can see changed behavioral patterns, but it's not necessarily within the self-consciousness of a young person that their behavior has changed. There's this kind of entrapment in the psychology of adolescence; I write about the life of the hive, the idea that we're living in a way that is so hyper-attuned to one another, and it appears that young people really have been drawn into this kind of operation so profoundly. But there's that interesting age group. I love talking to university students, probably some people in this room. I had a really moving experience. It was late 2017, and I went up to do some lecturing and teaching at King's College, in Kingston, Ontario, up in Canada. My favorite thing is being with undergraduates, college students, and we had this wonderful conversation about Erving Goffman. You remember Erving Goffman, a great social theorist of the mid-20th century: The Presentation of Self in Everyday Life. Goffman was part of that group, like Stanley Milgram, the great studies that came out of the war and the post-war environment. Goffman talked about the presentation of self, and he wrote about this idea of the backstage. Essentially he said: if you don't have a backstage, you're going to go crazy, because backstage is where you get to be yourself and replenish yourself. That's where everyone's just hanging out, no one's judging anybody, and so forth. Without a backstage, you go nuts. So this group of students was a big class, and they had been reading theorists like Goffman, and I said: well, everything we're talking about that you do on Facebook and so on, curating your persona for different audiences and so forth, isn't that just 21st-century Goffman? Isn't that just the presentation of self in the digital world? So there was a debate about that, and then this young woman began to speak. Her name was Helen, and she was feeling and
thinking in real time, and she began to speak, and she said: it's not the same, and I've just realized it's not the same, because we have no backstage. There is no place I can be that is backstage. This morning I was walking across the campus and I thought I was backstage, lost in my own thoughts, and I looked up and I saw somebody over there with their phone, taking my picture. There's no backstage. And the whole room, a big room, not quite as big as this, but one of those big classes, everybody went quiet.

Photography is a very interesting example for talking about privacy, actually, for the notion of privacy that I think is so central to a lot of the arguments you're making. The mistakes are the innovation, you said at the beginning of your talk, and that of course was, or at that time meant, an invasion of privacy. Now, if you look at that a little historically, that's what happened with press photography in the late 19th century, which led to a canonical text in the history of privacy, of course you know it: The Right to Privacy, by Samuel Warren and Louis Brandeis, in, when was that, 1890, actually, in the Harvard Law Review. The philosopher Raymond Geuss, Professor Emeritus at Cambridge, tells the story of what had actually inspired this nowadays canonical text on the right to privacy. You're telling all my good stories. The story went like this: I think it was Samuel Warren's wife, a rich society lady, and Warren himself was very upset, very upset, that press reporters actually invaded, or reported on, the parties he or his wife, or the two of them, threw together. So it was a very concrete instance that led to this canonical text. Nowadays people post the pictures of their parties, you know, of their own free will; nobody forces them to do that. In other words, privacy is a very unstable concept, as we can see now when it comes to photos. But what about today? Do you think we can frame a
universal concept of what privacy is? How would you describe that? What would privacy mean today?

Well, I was reading some of my materials today, just reminding myself of that great quote from Mark Zuckerberg. I think the quote was from 2011, when he said: we just decided that there would be no privacy, that that would be the new social norm, and we went for it. So what we've seen is this assault on, you know, there had been this sort of evolutionary process of privacy. Privacy, of course, is an idea that is only relevant to the growth and evolution of the psychological individual and the progressive individualization through history, which is an arduous evolution, and one to be celebrated, because with the concept of the individual came the concept of rights, and with the concept of rights came the concepts of democracy and equality. So privacy is part of this nest of values and sensibilities that are so essential to the way of life that is associated with the growth and the health of our democracies, the possibility of our democracies. So what's fascinating is that Facebook simply decided there would be no privacy. What happened initially? The world exploded in outrage. We didn't just accept it; people all over the world were really, really angry. It's like the Street View story we talked about before: nobody actually knows how many lawsuits there were against Google because of Street View, but they were coming from almost every country on earth. Germany started; there was outrage and protest. But these lines were crossed, and there's habituation, and there's normalization, and there's what I call psychic numbing. This goes with what we were just talking about a minute ago, what Helen said: there's no backstage. If there's no backstage, you start to go crazy in a certain kind of way. And what do you do to protect yourself from that feeling of craziness? You go numb. And so there's a lot of
psychic numbing right now, because we are all increasingly experiencing this world of no escape, and in order to protect ourselves from these feelings of going crazy, we kind of get numb and we stop thinking about it. Or we console ourselves with privacy browsers, ad blockers and things like that, with more encryption, or maybe you're into the camouflage, those special materials that you can buy to put over your face to confound the facial recognition cameras, and so forth. So I don't know that we've given up privacy so much as that we yearn for it. It's become unavailable, and that makes us feel kind of sick and crazy, and so we're protecting ourselves from that until we can figure out a way through. In which case, and this is something that you and I have talked about a little bit, Tobi: I happen to believe that if we can get rid of the surveillance dividend and really open up the competitive playing field, we're at a moment in history where any business that comes on stream giving us the things that we actually want, the way we want them, without the overhang of these externalities that come with the surveillance dividend, these anti-democratic and anti-egalitarian externalities, those companies really have the opportunity to have every person on earth as their customer. Because according to literally every survey in Europe and every survey in the United States: nobody wants this stuff, nobody likes it, nobody wants it. There's just very little choice.

You mentioned that 20 years ago we thought that the internet was going to be about empowering the consumer; now we've ended up in the era of the end of the consumer, as you put it in the book, the end of the consumer being the raw material for the profit. Other people even call it labor; it should perhaps be another concept. But what I'm interested in, for now, for the moment, before we open this up to you: at the tipping point, what happened in the evolution from managerial capitalism to a distributed, consumer-oriented capitalism that made that fight lose out, so that surveillance capitalism won by actually destroying the consumer? What kind of forces are there? Is this about continuity or discontinuity? In other words, how much capitalism is in surveillance capitalism, how much has always been there, or is this really a break, as in a discontinuity?

I would say it's both. Surveillance capitalism replicates the age-old pattern of capitalism: claiming things that live outside the market dynamic and bringing them into the market for commodification, for production and sale. But what's crazy about this era of that pattern is that it's not simply about taking nature, as in the case of industrial capitalism, to be reborn as commodities for sale and purchase; now it's about taking human nature, private human experience. And that sets into motion all these other dynamics that we've been talking about. But having said that, there are also very powerful ways in which surveillance capitalism diverges from the history of capitalism. Surveillance capitalism breaks with the history of capitalism because it no longer has to sustain organic reciprocities with its own societies, with its own populations, for two reasons. One, because it doesn't need us as its source of customers; we are not its customers, as has already been pointed out, so I won't repeat that. But also because it doesn't need us as a source of employees. It's not accurate to say we supply the labor, because if we were labor, then we would be the workforce, and that's a source of reciprocity. Don't forget democracy: the amplification of democracy in Great Britain in the late 19th century was directly linked to the dependence of the elites on the working class. They understood that if they did not expand the franchise, the people who worked for them were likely to burn down their factories. So those reciprocities, that's not just an abstraction, that's real life. Same with consumption. I mean, you
know, there's wonderful historiography really looking deeply at the American Revolution as a consumer revolution: the colonists were finally molded into a sense of shared political interest across these disparate colonies because they were all outraged at how they were being treated as consumers. And when they wanted to protest, they said: we will cease to buy your products. I'll go without a winter coat; I'll go without tea. So this is very real stuff that's intrinsic to the history of democracy. That they don't need us as consumers and they don't need us as workers is a really big deal, and that is a tremendous difference from the history of capitalism, and, I think, a key reason why democracy and capitalism have found a way to cohabit, especially from the late 19th century and especially in the middle of the 20th century. Market democracy turned out to be a very powerful and prosperous kind of structure, but it is very unlikely as long as surveillance capitalism is the dominant market form.

Let us switch to the new tool, Slido. Natalie, do you have questions? Pardon me? Yeah, there's a reduction. We're trying this out this evening; it's the first time we're trying it out. I was not there, I'm sorry. We are trying this out tonight. Okay, we're still trying it out. Want to put the slide up? All right, should we start with the questions from this tool? We have received more than 150 questions, so we kind of had to put thematic blocks on them. And one of them, this is like Berlin in the 90s, it's coming back: resist. But we're trying out this tool, please respect that. So the first block is about regulation, about lawmakers. Can we go on?
Okay, great. So, for example: should big tech companies be broken up, as Elizabeth Warren proposes? Is this a viable way forward? And on the other side: what policies should the EU put in place to limit the power of companies like Facebook and Google? So which forms of regulation can be used for that?

Okay, well, great question, so glad you asked. 150 questions, I think we're going to have to get our sleeping bags out. So, when we talk about antitrust law, this whole body of legislation against anti-competitive behavior: that was a very fertile and creative legislative period that invented antitrust law. It didn't happen just like that, as I'm sure many of you realize. There were cartels and there were monopolies, and they were a scourge on society, and there was tremendous protest and violence, and these laws and regulatory visions were developed by progressives and by legal scholars over a period of decades. It was a tremendously creative act. There are wonderful histories of regulation, and one of the lessons of these histories is that regulatory efforts fail when they are unable to frame regulatory strategies that are carefully based on an understanding of the industry they're trying to regulate. And I think the same can be said, on a larger scale now, for surveillance capitalism. So let me connect those two points. Number one, it simply can't be a question of taking that very creative legislative work that came out of the 20th century and applying it to a wholly new set of mechanisms and methods, problems and phenomena, in the 21st century. Do we have anti-competitive behavior among these companies? Yes. Do we have monopoly behavior? Yes. Will addressing those monopolies interrupt and outlaw surveillance capitalism? No, not in my view. What is more likely is that by breaking them up we run the risk of creating more surveillance capitalist companies, intensifying competition among surveillance capitalists, and therefore intensifying the drive towards the certainty and predictability that
I've just been describing to you. So what I believe is that we need to stand on the shoulders of 20th-century antitrust law, and of 20th- and even early-21st-century privacy law, and build on that with a specific understanding of surveillance capitalism's logic, its methods, its mechanisms, its imperatives, in a new creative effort that produces the insight, the legislation, the regulatory vision that will interrupt and outlaw what is unprecedented here.

Okay, thank you. I can't see you, can you raise your hand? I'm here, I'm going to stand up. You're welcome. We're coming to the question of what we can personally do to combat the threats. For example, and I must ask: which actions can we take to make sure we are not entirely influenced by such companies as Google and Facebook? Also, on the more personal level: which actions do you personally take to minimize the data you provide to tech companies?

Privacy is not private. So, there are a couple of things here. One is, my personal view is that withdrawing from such dependency on and investment in these systems will give you a better life and better mental health. Even before the rise of the digital, I was never the kind of person who was very attuned to the others, and I certainly would avoid the amplification of that kind of dynamic that we see now in the online media. So I think we can do things for ourselves by withdrawing and by not being so dependent on these systems, not being so mediated. I do believe in eye contact, and I do believe in actual recognition, and I do believe in being in the presence of trees and things like that. But what does it mean, privacy is not private? Does it mean that we have no power as individuals? No. It means that our real challenge as individuals now is political. To say that privacy is a collective action problem means that, friends, we need new forms of collective action.
So, in the 19th century and especially in the first part of the 20th century, this was about the right to have a union and the right to bargain collectively and the right to strike. It was about making sure that children didn't go to work when they were 7 or 13, but went to school or were with their families, because that was consistent with the aspirations of a democratic society. So, what are the forms of collective action that will define our challenges and our time? Collective action means that we need to discover ourselves not as anonymous users, which is their name for us, but rather as citizens of democratic societies with shared interests, not only economic, but political and social and psychological. And we have to come together in those interests and create these new forms that are ultimately going to be the vehicles that put pressure on our lawmakers, that mobilize our lawmakers into this next era of creativity and a new regulatory vision. So, we have work to do as individuals, and that work is coming together to create these new movements. And these will be movements defined by the fight for epistemic justice. Thank you. Okay, just a small announcement. I know we're running a little bit late. I thought we'd have a conversation up here for 20 minutes and then it's your turn for 20 minutes. And I think we're going to stick with that. So, we have about another 12 minutes, maybe, so this evening will end a little bit late. But, you know, the subject is so huge and our speaker is great, so I think we can all deal with that. I certainly can. So, I think there'll be one more question for the moment from Slido, and then we'll go to the microphones on both sides. Please, Natalie. Great. So, we have a top-voted question by far, which is more about the organizational part of the event. The event is co-organized by the Humboldt Institute for Internet and Society, which received millions in funding from Google. Do you think that this is a problem?
Absolutely. You know, years ago, when I was a little girl, there was a wonderful thinker, Herbert Marcuse, who wrote a book called One-Dimensional Man. People don't really read it that much these days, but I recommend it. You should read it. So, this is our version of the one-dimensional society. You know, Google's out there whitewashing itself in every way imaginable, including, if you look at the list of Google Fellows year after year, it's all kinds of people who are from civil society institutions that should be dead set against Google. And instead, they're Google Fellows, and they fund conferences and they fund civil society organizations. And this is part of the whitewashing that is intended to keep us so confused. You know, it's like I said before, Mark Zuckerberg just told us that the future is privacy. What did you just say, Mark? So, you know, they think we're stupid, and we're supposed to be confused, and we're supposed to think that they're really nice. But I don't think that. And I wish that they weren't a sponsor of this event, to be honest, perfectly honest. I didn't know this until a little while ago, a few days ago. But I have been at other events, other events that I was very proud of, including the CPDP event in Brussels, where we launched my book in January, which is also, you know, where Google is still a sponsor. And there are some folks who refused to go to that meeting because it was sponsored by Google. But, you know, it's a calculation, it's a choice. Cost-benefit, you know: better to go, be with you, share a message, or better to stand on my principles, not come here, and be silent? So that's how I calculate the cost-benefit, but I'm not happy about it. I just maybe have to say something to that. I'm not on the payroll of the HIIG, so to speak, but I work for them. And just to give a little bit of background information, the institute was founded by different universities here in Berlin, not by Google.
And I think on the board of five or six people, there's one Google guy. That's just to give you an impression. I think it's 1.5 million euros a year that the institute gets from Google. In good years, that's about half of the budget; 1.5 at the most is the Drittmittel, the other third-party funding they're able to gather. Sometimes it's less, just to give you a couple of numbers on that. There's a microphone. And this side, is there one on the other side, too? You see, that's the problem, because this is so packed and we don't have aisles and everything. It's really hard to handle the microphone. This is one of the reasons we opted for Slido, like back in the day at the Kino International with Castells. That's part of the reason. Okay, how are we going to do this? Hi. Thank you so much for your talk. I'm Daniela Stockmann. I'm a professor here at the Hertie School of Governance. I'm wondering whether you could speak to alternative business models to surveillance capitalism for tech companies to use in the future, thanks. Well, I think, for example, it could possibly be quite useful to have some of the devices that are characteristic of what people call the smart home: manage energy efficiency and communicate with distant family and make sure old folks are safe aging in their homes, and all these things. So, there were folks in the early 2000s who were developing these models of a smart home. And when they did that, they were drawing schematics of how it would work, and it was a closed loop. It had devices in the walls that were generating information, and all that information was channeled to the occupants of the home, who received it on a little wearable computer, and it was a closed loop. Then you fast-forward to 2017, and there's a very interesting legal analysis of one Nest thermostat. Nest made smart home devices; it was bought by Google, and now there's no more Nest. It's all Google Home.
One Nest thermostat, these legal scholars figured out: if you're even just a casually vigilant consumer and you install one thermostat, they recommend you review a minimum of 1,000 privacy contracts. So, I don't think we have to look that far for business models. The problem is that without the surveillance dividend, it's going to be very hard to monetize a refrigerator that's not secretly sending data to my health insurance company. Just like Ford Motor can't figure out how to make money without streaming data from the folks driving its cars. And that becomes the basis for revenues and profit. I think if we could actually see our way to eliminating that overhang, the business model is the least of our problems. I have at least a hundred messages in my inbox, out of several hundred every day, that are from people who are working on great ideas. This piece of technology, this kind of product, this kind of service, that's not surveillance capitalism. How do I do it? How do I get it funded? I don't think we have any problem with business models. The question is making space in the competitive landscape so that business models can get the funding and the capital that they need to sustain themselves. Thank you. I think we've got one more question from the floor. My name is Rafael. I was actually Danny's student, and I used to be a Google Fellow, but I swear that I'm not defending any side. I work for civil society nowadays, and we look into the impact of social media around elections in the world. So not only Europe, but in Asia, Africa, many countries. And what is interesting to see is that the decision to counter the risks of each election is basically a PR campaign. So you're going to, of course, invest a lot of money in the US, in Brazil, in India, because they are big democracies and the effects of any bad PR will come back to the company. But they don't look into other countries that are smaller or smaller markets.
So basically you have this asymmetry of a US-based company deciding which democracy is valued more than others, basically. So in that sense, my question goes in two ways. You mentioned this infrastructure generating predictions about behavior. So inside those companies you have probably 98% of the efforts towards that and 2% towards countering the negative effects of that. First of all, is regulation enough to really solve this, if you consider the global scale of these companies? And second of all, how can you really make sense of these rising global inequalities in terms of democratic standards? I wanted to ask more, but it's fine. Thank you. Okay, that last part got a little pushed together. The idea that some democracies are valued more than others. Well, what's your name? Rafael. Rafael. I love that name. That's my son's middle name. If they were choosing to look out for democracy in India and Brazil, they didn't do a very good job. So if that's what we get when they're actually trying to do something, then I don't buy it. I mean, all they had to do was intervene in WhatsApp in Brazil, and it's clear that the outcome would have been different, or at least I think there's compelling evidence to suggest that the outcome would have been very different. So I'm not sure that they're really looking out for democracy anywhere, on that part of your question. What was the other part of your question? Oh yeah, yeah. So look, this is where I'm talking about regulatory vision and creativity for a new era. We're talking about a war now between computational governance and democratic governance. The computational governance side says democracy is too slow. Democracy limits innovation. Democracy can never keep up. And the democratic side says: you bet democracy is slow. Slow is good. Slow means we're deliberating and we're creating some kind of consensus.
And the idea that democracy is bad for innovation, that's some kind of really crap piece of propaganda that was actually invented by J.P. Morgan in the late 19th century. J.P. Morgan loved to say capitalism doesn't need law. We've got the law of supply and demand, and we've got the law of survival of the fittest. We don't need any more laws. So the idea that law is somehow bad for innovation, bad for business, that's just an old piece of propaganda that's just been recycled. The idea that market actors should be completely free, that maybe had a certain amount of merit when Adam Smith was writing, because when Adam Smith was writing, the hand really was invisible. Not for long. But today the hand is not invisible. These cats know everything. So they actually know too much to qualify for freedom. So that argument is gone. So the point is, part of a democratic regulatory vision is to say: it is not okay, for example, to have political advertising that is patently false. It is not okay to have videos of elected officials that are known to be false. It is not okay to have a virus without a vaccine. These things are simply not compatible with how democracies have to run. So, can we regulate that? Yes, we can. Can we do that without it being turned into censorship? I believe we can. That's where the imagination is now, that's the front line now. But we see the alternative, and it's unacceptable. Thank you, Shoshana. What we usually do in this series, please. I'm not quite done yet. I'm not quite done yet. What we usually do at the very end in this series is ask about the European perspective. Because, you know, probably in Germany there's the NetzDG, the Network Enforcement Act. It goes colloquially by the name of the Facebook law. It basically makes it easier to force platforms to remove hate speech, for instance. It is a set of compliance rules, more or less. Then there's the right to be forgotten on a European level, things like that.
Now there is the usual European conceit, you know, that borders on feelings of moral superiority. I'm sure you know what I mean. And there are those who think, ah, forget it, we do not have enough power in this new emerging world order anyway. Now what's your take, from the US, on the European perspective of agency in that matter? Well, you know, when I first met Frank Schirrmacher and I began writing for the FAZ in 2013, I told him then that I wanted to write for him because Europe was the front line. And Europe still is the front line. So, if we are going to make progress on the things that we've been talking about this evening, I do believe that that progress will be made here first. And I don't know that we have to call it moral superiority. I simply think that Europe has had a different experience than America, and World War II is an important part of that. And what our societies have learned from the devastation of that time is different and has played out differently in our different European and American civilizations. And I do believe that in many ways Europe understands that democracy has to be defended and fought for in every generation. I was doing a book signing a few months ago in America. There's this long line and I've kind of got my head down. A beautiful young woman comes up and we sign it. What's your name? She tells me her name and she says, I'm very depressed. Your talk really made me sad. And I said, oh my, why are you depressed? What happened? Why did I make you sad? And she said, well, I've lost faith in democracy. And we talked about it a little bit. And I said, you've just given me an insight that I didn't have before. That for someone your age, especially coming of age, eight years of Obama or whatever, it's like you're looking at democracy like it's a mountain, like it's a boulder. And it's there when you're born, and it will be there when you die, just exactly the same. But that's not what democracy is.
Democracy is a creature of human imagination and will. And every generation has to keep it going. It's like the kids: before there were toys, kids would roll a hoop, and you had to run after the hoop to make sure it didn't wobble and fall over. Well, democracy is like that hoop, and you've got to run after it and make sure it doesn't wobble and fall over. It's everybody's responsibility in every generation. And I think that Europeans have learned that in a way that perhaps Americans have not had to learn it in the same way. I mean, my father learned it, but I'm not sure that Americans have had to learn it generation after generation in the same way. So, I do think that Europe has a unique vantage point and that Europe has been and will continue to be the vanguard in this work. And that's an important reason why I've been so committed to this work in Germany and in Europe. I didn't know I was fishing for this, but it's sure nice to hear. Thank you, Shoshana Zuboff, for being with us. Despite your illness, despite your cold. Thank you for being with us. Shoshana Zuboff.