Good evening, everybody. It's my pleasure to welcome you on the hottest day of the year so far. I must say I'm really moved that so many of you chose to come, and I regard this as a big recognition of our speaker tonight and, of course, of our lecture series, Making Sense of the Digital Society. We started this lecture series some 18 months ago because we saw a growing need for a broader perspective on the structural transformations that our societies are experiencing, or even actively pursuing. We see a need for broader perspectives that are based on research, but also bold enough to sketch out new, comprehensive narratives, and that reflect, if possible, a genuinely European perspective. We've mentioned this before: the European angle really matters to us because we are convinced that sense-making is a form of shaping our world, and, of course, because there is more than one way of making sense of the digital transformation. That is why we think that a European perspective, or several European perspectives, will broaden the options we have in making sense of, and shaping, the digital society. Recently, we have observed that the idea of strengthening European perspectives on digitalization is gaining prominence both within and beyond Europe. This concerns the academic field, but also the growing debate on regulation, the digital economy, data protection, and other human rights. For this reason, we are honored and very happy to have José van Dijck as our speaker tonight. Her work on the platform society has become a major reference point in academic research on digitalization, precisely because José reflects and advances a specific European perspective. With this, I hand over to our proven moderator, Tobi Möller, who will introduce her in more detail.

Thank you, Jeanette Hofmann, for this introduction. Thank you for being here.
Jeanette already mentioned this is kind of an evening of reversals because, as those of you who are familiar with this series know, the last question of the evening, before we have a drink, is always my attempt to go for the European perspective and ask our speakers: what can we do? What is European agency? Is there room for it? Have we failed already? It's been a rather bleak outlook when it comes to the European perspective so far, I think, and I have the feeling that this is going to change tonight with our guest. Very briefly, about the structure of the evening: there's going to be a short introduction now, then the lecture of our renowned guest for about 45 minutes. We'll have a conversation on stage for, let's say, a maximum of 25 minutes, just the two of us here. Then it's going to be your turn: there's going to be one microphone, I think, here in the venue for your questions and comments. There's also a Twitter wall; you see the hashtag right in front: #digitalsociety. And I think we're being filmed again. Yes, we are being filmed again, and this is going to be streamed on the respective websites of the Federal Agency for Civic Education and the Humboldt Institute for Internet and Society. Okay, to our guest. She was born in the southern Netherlands and is now a renowned new-media scholar, a professor at Utrecht University since 2017. Before that, she was chair of the Department of Media Studies at the University of Amsterdam and dean of the Faculty of Humanities. Her studies and her teaching brought her to Philadelphia, to MIT on the East Coast, and to Georgia Tech in Atlanta in the South, among other places. In 2010, she was elected a member of the Royal Netherlands Academy of Arts and Sciences, and in 2015 she became the first woman to hold the position of its president.
Her subjects range from the discourse of reproductive technologies, in Manufacturing Babies and Public Consent, as her book from 2005 had it, through The Culture of Connectivity: A Critical History of Social Media, from 2013. Her newest book, quite close to tonight's lecture, came out last fall and is called, you've heard it already, The Platform Society: Public Values in a Connective World, Oxford University Press, 2018. She co-wrote it with Thomas Poell and Martijn de Waal. A hugely informative and inspiring read, and very clearly written, if I may say so. It's very inspiring because it challenges our reflexes when we think about platforms: what is private, what is public. The book also shows that it's not always clear who pays for infrastructure, who can use it, or even what infrastructure is, what a public good is, and how private the benefits actually are in the end. But enough; we have the primary source of all this thinking with us tonight. Please welcome, from Amsterdam, José van Dijck.

Thank you very much, Tobi. That's wonderful, and really too much of an introduction. Thank you for having me on this hottest day of the year. I always wanted to be here at the house; I usually was here in the winter, but for a change I'm here on the hottest day of the year. So if you see me faint, you know I need some water, right? I will try to drink lots of water. Well, I'm here indeed not to promote my book, but actually to talk about Europe and responsible platform societies. And thank you, Jeanette, for the invitation; I'm really proud I can be part of the renowned speakers of this series. It's not going to be an advertisement, but I wanted to show you the book, particularly because I didn't write it all by myself. These are my two co-authors, and I'm very, very proud of them. By the way, we are being simultaneously interpreted into German today.
And if you can't read the book in English, there's going to be an Italian and a Chinese translation out soon, just in case you prefer another language. So, digital platforms. We all know, and I assume it to be common knowledge by now, that they have created enormous benefits and very powerful global connections. Let me say that first of all. Is there anyone in this room who has not used one of the big five platforms over, let's say, the past week? You must have been on vacation in some remote area with no connection to the internet. So anyway, we all know how much we have become dependent on these platforms. But since 2016, problems have been mounting for the tech companies, and you have probably heard of the "techlash" that these mounting problems have provoked. We have been talking a lot about disinformation and fake news. Hate speech and trolling were much in the news. We have heard much about election intervention, particularly the Facebook-Cambridge Analytica scandal, which is probably still fresh in your minds. As if that weren't enough, we've had privacy scandals, we've had security leaks, and, in a totally different arena, we've had tax evasion and the undermining of labor laws. I'm stopping here, because you would be totally depressed before I even start the lecture; there is also a lot about addiction, et cetera, et cetera. My conclusion so far would be that the long-standing values that underpin an open society, and by that I mean tolerance, democracy, fairness (I'll come back to that), are compromised in the online world, a world that is dominated by mostly American digital platforms. So my leading question today will be: how can we anchor public values in an open digital society, or in all the open digital societies in Europe?
Pretty much: how could we use data for the public good in an online world that is almost entirely dependent on a private American ecosystem of platforms? I will come to that. Over the next 40 minutes or so, I will take you through four speaking points; this is a sort of outline. I will first explain what I mean by platform ecosystems and what I mean by public values. We're all talking about public values, but what kind of values are they? Then: who are the responsible actors in this digital society, and what, in particular, are the challenges for Europe? Tobi already pointed to that. Now let me begin by explaining what platform ecosystems are, how they operate, and how we encounter them in the wild. Our global online world is a world driven by platforms, and those platforms are fueled by data flows. Platforms and data flows can be steered by companies or by states. The two platform ecosystems that dominate the online world are what I call the American platform ecosystem and the Chinese platform ecosystem. Let me start with the Chinese one. Sorry if I keep sipping water; it's so dry here that I need to drink a lot. China, as you probably know, has its own ecosystem, which is controlled mostly by the state but operated by China's own big five companies, the ones you see listed here: Baidu; Alibaba, which is like the Chinese Amazon; Tencent, which operates WeChat, you've probably heard of that; Jingdong Mall, a little less known but still a very big company; and Didi, which is sort of the Chinese Uber. Now, Alibaba and Tencent are becoming extremely powerful in this ecosystem and beyond it, in the online world at large. That's because they are branching out from their core businesses into every sector of society, and they are gradually, actually quite rapidly, becoming gatekeepers to the entire economy. They are wielding power over brick-and-mortar enterprises
through, for instance, the pay systems that they have built into that ecosystem, communication channels, but also grocery stores, pharmacies, healthcare platforms, et cetera. Now, in China the state has very strict power over those companies, and in fact their sheer size, the fact that they are so big and so few, only a handful of big ones, actually makes it easier for the authorities to control the data flows, which are in that way accessible to the state. Most of you, I assume, have picked up in the news some idea of how the Sesame Credit system works in China. Is there anyone who hasn't heard about the Chinese Sesame Credit system? A few of you haven't. Very basically, and this is a very rough explanation: if you ignore a red light three times, and that is recorded, of course, by facial recognition, that may limit your chances of entering the university of your choice. A very simple explanation, but the Sesame Credit system is becoming very powerful in China. Now, let me go to the American platform ecosystem, because that has its own big five, and those you all know; you've probably been using their platforms for some 15 years now. It is dominated by Alphabet-Google, Facebook, Amazon, Apple, and Microsoft, also known by the acronym GAFAM. That ecosystem pretty much dominates the rest of the world: Europe, Asia except for China, Africa, and both North and South America. Now, American tech companies have tried to enter the Chinese ecosystem, but in the past they have often been banned, censored, or forced to align with Chinese companies. Think of Google Search, for instance, which was banned and is now trying to get back into that market.
Facebook's social network has tried to enter the Chinese market a couple of times and was refused. On the other hand, think of Apple: I believe some 40% of Apple's App Store revenue is now generated in the Chinese market. So there are more and less successful attempts by American platforms to penetrate the Chinese market. Just recently, however, over the past few months, we have seen the Americans trying to bar Chinese companies, the other way around, from entering the American market, and that is pretty much the current trade war going on between these two superpowers. That will be increasingly hard, though, because the two systems are very difficult to actually separate. At first sight, they appear entirely different. In GAFAM, as you know, the data are owned by corporations; there is a huge potential for corporate surveillance of online activities; and that system is underpinned by libertarian capitalism. On the other hand, almost the opposite, in the BAT system (B-A-T, after Baidu, Alibaba, and Tencent) data are owned by the state. Citizens are subject to state surveillance of their online activities, but that surveillance is actually executed by corporations, and that turns it into a form of state capitalism. Now, although the two systems are ideologically entirely different, they are increasingly intertwined and increasingly hard to separate. That holds at the economic level: think of partnerships, shareholder positions, financial flows. For instance, Uber has a stake in Didi and the other way around; I think they have a 20% stake in each other, and Didi, in turn, is partly controlled by Tencent.
So, you would be surprised; once I started to look into this, only after I'd finished the book, I was totally caught off guard by the capitalist nature of the Chinese ecosystem. We hear a lot about the American system becoming some sort of ultimate surveillance system, but actually, I think it also works the other way around: the Chinese ecosystem has taken on a thoroughly capitalist nature. It also becomes harder to distinguish the two at the technical level. There is the material infrastructure: the Chinese control the metals, much of the software, and especially the hardware, the chips, involved in deploying and operating those platforms. So, in fact, it is almost impossible to disentangle the two systems, and that, I think, will be the most important thing to watch in how the trade war evolves. But I'm not going to talk about China today; that's not my issue. We're going to talk about Europe. Squeezed in between the US and China is this continent, and our European continent has pretty much no major platforms. This one is the only major European platform in the global top 50. Can anyone guess which one it is? Yes, Spotify. Actually, in that top 50 of most important platforms it is number 49, so it's not that big. More importantly, it is no longer fully European: Tencent and Spotify now hold minority shares in each other, and Spotify is listed on the New York Stock Exchange. So, in short, for online infrastructural services Europe has become largely dependent on the American platform ecosystem. And here you can read, these numbers are from last year, that the corporate headquarters of the largest players by market capitalization are very unevenly spread geographically: 47% are located in Asia, 36% in North America, and 15% in Europe.
And the most important thing is that within those 15% of platforms in Europe, Europe has very few unicorns. Estonia has Skype, for instance, but that is now owned by Microsoft. Taxify has become quite a big one, a unicorn; it is now called Bolt. And there's Adyen, a Dutch company with a payment service, which you may not know, but those are pretty big platforms. The problem is that none of these platforms holds an important infrastructural position, and I will come to that in just a second. When we talk about platform power, it's very important to distinguish its various levels; it is actually distributed at three levels, and I compare that to a tree, just to stay in keeping with the ecosystem metaphor. We have the roots, which are pretty much the internet architecture: the digital infrastructure of hardware, of ISPs (internet service providers), but also satellites, data centers, domain names. It's the big infrastructure that the whole tree relies on. This part we have not included in our research; it was just too much, because we concentrated on the middle and upper parts of the tree: the trunk and the branches. By the trunk, I mean the infrastructural intermediary platforms; I will come to that in a second. And secondly, I'll concentrate on the branches, which are pretty much the sectoral platforms; I will explain that in just a second. The most important thing about this slide is to remember that Big Five company ownership is now distributed among the roots, the internet architecture, as well as the intermediary level, as well as the sectoral branches where it is spreading its powers. Okay, so this is just a visual to keep in mind when we talk about platform infrastructure. It's a hard thing to imagine, but I'll try to make it clearer.
In the American GAFAM system, platform companies are, of course, driven by market value. In terms of market value, these Big Five together form the world's fifth-largest economy, after the US, China, Germany, and Japan. But more important than market value, I think, is societal power and influence. These Big Five increasingly act as gatekeepers to all kinds of social, economic, cultural, and personal online traffic, and that's what you also see in the branches. So our focus has been on the trunk and the branches and how the two interact: the intermediary and the sectoral platforms. Let's start with the intermediary infrastructural platforms. How do the Big Five companies actually wield those strategic platforms? We have made an inventory of what those infrastructural intermediary platforms are. We found some 70 that we would call infrastructural, though that is disputable. For instance: social networks, like the Facebook Blue app, but of course any other social network as well; web hosting; pay systems; identification services; cloud services; advertising services; search engines; operating systems; navigation and maps; messenger services; app stores; analytics services. There are about 70 of those. Now, societies across the globe, particularly also in Europe, have come to depend on this infrastructure for organizing all kinds of societal sectors. And rather than public infrastructures, we increasingly have private ones: platformization also means privatization. Now, there is a big debate about whether we should call these intermediary, infrastructural services utilities, because they have become privatized. That's a huge debate.
I'm not going into that, because it's pretty much a legal debate, and it is actually very difficult for lawmakers to define which platforms are utilities or infrastructures and which are not. It's an incredibly refined legal debate. But let me take you to the next level, the level of the branches of the tree. Besides owning and operating those intermediary-level platforms, many of the Big Five companies are now branching out into sectors. For our research, in the book, we studied two public sectors, health and education, and two private sectors, news and urban transport. Of course, there are plenty of other sectors that we haven't even touched upon: take, for instance, finance, or retail, hospitality, food (Uber Eats); there are so many branches that the Big Five are moving into. And it's not only a matter of acquisitions, mergers, and partnerships: in the back end of these systems, data flows and infrastructural apps can be tied together, so all the data gathered from the various sectors can be combined. To give you an example of how that works (in the book we actually go through each of those sectors), here is one from the educational sector. Take Alphabet-Google. You probably can't read the small print from there, and I don't blame you; I'll be happy to share the slides so you can read the fine print, and it's also in the book. As a company, Alphabet-Google controls intermediary platforms like search, advertising, maps, et cetera, and those are, of course, taken into every sector. I've put in a few arrows here to show how the data flows between infrastructures and sectors, and between sectors, actually work. For instance, you see that in the transport sector, Alphabet controls 20% of Uber's shares.
Google Maps is built into Uber, and Waze, the navigation app, has been acquired by Google. So, increasingly, they are dominating the various nodes of that sector. In health, for instance, Google Health is an important health app, and 23andMe, the DNA service, is partly owned by Google; it now controls pretty much the biggest DNA platform in the world. In news, Google has its news aggregator, of course. And in education, which I would like to focus on a little, the penetration of Google is almost mind-boggling. Google Apps for Education is now built into Chromebooks, which are bought by schools, by entire school systems. With that, they bring not only the common apps you already know from Google, but also school administrative systems, tracking systems to follow students' performance and how they respond to each other, and lots of social-networking features built into those apps. It's not just ownership and acquisition; it's the built-in ability to control and connect data flows in the back end, upstream and downstream the tree, as well as sidestream between the branches, between the various sectors. Now, just one more thing about education. What we're seeing in the educational sector is that over the past two years all of the Big Five have been penetrating this sector, which is important for tech investment. Amazon, for instance, is building platforms where the child is the customer, and that, of course, shows you how much commercial value is built into those systems. Facebook last year invested 100 million in better schools, the so-called Summit Schools; it's a Facebook project that is supposed to scale. They are starting in the public sector, where schools are, of course, way underfunded in the United States, and from there they take it to other school systems.
And I think the underreported story of the year has been Google's takeover of the classroom, which was reported last year in the New York Times but has received very little attention. Built into those Chromebooks, as I just said, is not only a set of apps, Google Apps for Education, which ties back to the infrastructural platforms; there are also analytics and tracking, combined with platforms that, within that sector, track every small movement of a child's education. And that is incredibly important once you take into account how branched out and penetrating these systems have become. Now, each of those platforms, the whole platform ecosystem in fact, is built on commercial values. They are driven by market forces: efficiency, monetization, and, of course, dominance, because it's all about market dominance. But what about public values? That's what I asked in my initial questions: what about public values and the common good? Europe, unlike America and China, has substantial public sectors and public space, which seem to be largely absent from the American ecosystem. And public values appear to sit in tension, and that's why we've had those problems over the past few years, with the commercial values that structure GAFAM's architecture, the trunk of the platforms in particular. Now, before we continue talking about public values: what are we in fact talking about? What kind of values do I consider to be important public values? First of all, there are very basic values that pertain to our online interaction and online society: values like security, transparency, accuracy, and privacy. We've heard much about those. You may expand this to values like autonomy, very basic human values. And these values, of course, are not fixed.
You can't just go to a store and pick them off the shelf. Values are often negotiated, and they are negotiated at different levels. For instance, when Google tries to implement its educational platforms in schools, what we see happening is that the value of students' privacy may sit in tension with transparency. And transparency may be a very valuable public notion, because schools could, for instance, open up their data on children's progress to the public or to research. So those values may sit in tension and need negotiation. But beyond those basic internet and consumer values, there are public values that pertain to society as a whole. The list is not exhaustive, but they include fairness, inclusiveness, responsibility, accountability, and, of course, democratic control. And these values are negotiated at every single level, starting at the transnational level, Europe; at the state level; at the local level; but also at the institutional level, all the way down to the professional codes in which, for instance, teachers or journalists have somehow anchored how they perform their societal roles. Interestingly, or perhaps sadly, connective platforms often bypass or ignore the sectors and sites where those values are negotiated, such as institutions or professional codes. They go straight to individual consumers. I was just talking about education: what we're seeing is that individual schools are being offered Google Apps for Education, and even Chromebooks at very low prices, 150 bucks, which is way underpriced. But that's, of course, because Google can earn it back in other sectors or through other services. Facebook, for instance, bypasses news organizations because it refuses to carry the label "media company," and hence it ducks regulation.
So public values have become increasingly important, I think, not just to our European platform ecosystem but to the entire world. And that brings up the bigger question: who is actually responsible for the platform society? These public values, as I said, don't just exist; they need to be negotiated at every single level. The very simple answer to this question is: we are all responsible for governing the digital society. But analytically, and this is a lesson I took from Political Economy 101, there are three types of actors: market, state, and civil society. In China, as we just saw, state actors dominate. In the US, market actors dominate the stage. In Europe, ideally, there is an emphasis on civil society and state actors in balance with market actors. So, in fact, there are a few simple rules in Europe. Data are preferably owned by citizens; that's why we put so much emphasis on privacy. European nations prefer to operate in multi-stakeholder organizations, to balance those three different societal actors. But there are three problems with implementing public values in the European platform society. First, civil society actors are systematically underrepresented in the ecosystem, and particularly in its infrastructural part. Second, there is hardly any public space in the American platform ecosystem; hardly any to be found. And third, the data generated mostly by citizens, by users or buyers, become mostly proprietary, so they cannot be used for the public good. Those are three major problems we have to deal with, and that is the kind of challenge Europe is currently facing: we need to negotiate those public values, through the responsible actors at every level, institutional, local, national, and supranational, but we don't own the platforms.
So that is the major struggle that we as a continent have been in over the past few years. Let me concentrate on the European, transnational level. There is much to say about the other levels, local, institutional, et cetera, but I will concentrate on the EU level. So what are the challenges for Europe, a continent that owns hardly any companies with vital infrastructural power while it wants to project its own ideological values, the values of public space, public sectors, et cetera? We cannot solve all the problems in Europe, but we can do a few things to improve the European condition, I think. These are four challenges I would like to mention, which we could talk about a little later in the discussion. First of all, I think we need to take a comprehensive, integrated approach to data-driven platforms. I was just reading the other day a particular report on artificial intelligence (data, of course, is now the oxygen of artificial intelligence) called Artificial Intelligence: A European Perspective, published by the EU just last year, eight or nine months ago. For 150 pages it talks about Europe as a market, and only the very last page of the report pays attention to ethics. It's almost an afterthought, and that really struck me. There is now, I heard, a high-level group on ethics, which I really applaud, but ethics should not come as an afterthought, not on the last page; it should be up front. That is one comment I would like to make on this report. In fact, we are no longer dealing with digital markets. We've had many EU reports about markets, but we need to talk about platform societies; that's actually why we preferred this title for the book. We need to look at how markets, governments, and civil society actors can create and govern this platform society in a balanced manner.
A good example, and I want to come to that, because it's not all bad out there, is the British IPPR report called The Digital Commonwealth: From Private Enclosure to Collective Benefit. I think it takes a holistic approach to how values need to be negotiated up front, how principles need to be anchored in the digital society even before you start negotiating the technology itself. There isn't much time to talk about it here, but take a look; I think it's a very good example of how you could do things differently. Particularly at a time when cities like San Francisco are going as far as banning facial recognition technology from their city limits, which is quite a thing if you think about it: you need to pause and consider how to put those public values up front. Secondly, I think we need to articulate value-centric principles at the European level. These could be many different principles, and I would totally agree if you said you can't just impose them top-down at the European level; that's not what I mean. By a few principles I mean very simple principles that nations, local authorities, and institutions can look to and say, okay, that's what we stand for, and then start negotiating those public values themselves. What kind of principles could those be? For instance, about data ownership, a very simple rule of four words: data belong to citizens. That, of course, has everything to do with privacy. On the other hand, open data belong to the public, and by open data I mean data with reciprocity: data that are opened up, but which usually, as I just showed in the educational world, are afterwards privatized by those companies. Open-data reciprocity means two-way traffic, right?
So that could be a very simple rule. Data portability: you can carry your data around to different platforms, a very simple rule. We could have that at the European level and then work it into the various other levels of implementation. Data transparency: data flows could be regulated like money flows. We are all perfectly comfortable with the fact that banks are in control of money flows and that states actually control, through accountants for instance, how those money flows are governed. We could implement a similar sort of governance for data flows, why not? We just have to be inventive. And finally, software ownership: open source when open source is possible. Not simply privatized by default; if you put up open source as a viable alternative and also support it, I think that would make a major difference. Now, over the past few years, there have been several attempts to propose such simple values for Europe. Tim Berners-Lee, I think, has designed a wonderful set of principles, which he has called a Magna Carta for the World Wide Web. You may not be able to read it here, but you may recognize that he distinguishes governments, companies, and citizens, my three kinds of actors: state, market, and civil society. Facebook and Google have signed this contract, and so have some nations, particularly in Europe. But it's a bit non-committal: they haven't really committed to implementing these principles or values, and there are, for instance, no obligations or sanctions. Third, I think we need to update and harmonize regulation. That is one area where I think Europe could make a real difference. The EU has taken a big step with the GDPR, and I really applaud that. It helped set standards on privacy and data protection.
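The data-portability principle mentioned above, that you can carry your data around to different platforms, is at bottom a format question before it is a legal one. A minimal sketch, with an invented schema name and invented fields, not any platform's actual export format:

```python
import json

def export_user_data(user_id, posts, contacts):
    """Bundle a user's data into a platform-neutral JSON document."""
    return json.dumps({
        "schema": "portable-profile/v1",  # hypothetical schema identifier
        "user_id": user_id,
        "posts": posts,
        "contacts": contacts,
    })

def import_user_data(document):
    """Read the same document back on a different platform."""
    data = json.loads(document)
    if data.get("schema") != "portable-profile/v1":
        raise ValueError("unknown export schema")
    return data

# The user carries their data from one platform to another intact.
bundle = export_user_data("alice", ["hello world"], ["bob"])
restored = import_user_data(bundle)
print(restored["posts"])  # ['hello world']
```

The point of the sketch is only that once an export schema is shared and open, the import side is trivial; the hard part is the rule that obliges platforms to offer such exports at all.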
And, of course, here is one of my role models, Margrethe Vestager. She has imposed antitrust fines as the EU's competition commissioner and sent a strong signal to platform companies: this is our limit, this is where we stop. I think these are wonderful gestures, wonderful beginnings, but it's not enough. The current legal frameworks, antitrust law, competition law, privacy law, may each be wonderful and helpful and actually sufficient within their own particular part of the legal framework. But they're compartmentalized, and they're, to some extent, outdated. So what we need to do is update and harmonize them. They need to work together, because right now it's like putting a small dent in a very big thing. And it's really something you need to look at, and I come back to my first point, as a comprehensive approach. And finally, my fourth and last point: there are too few strong, viable public or civil society platforms in the online world. There are hardly any, and they're not strong enough; they don't have any power. So what we need to do is stimulate nonprofit and public platforms. For instance, in the public service media, in hospital systems where they're badly needed, in education, my earlier example, and, for instance, in libraries. One objection to public or nonprofit platforms that you often hear is, well, they do not scale. They work at the local level, or at most at the national level, but not beyond that, which makes them perhaps useful for a small group, but not particularly useful for a larger global community. Well, actually, I've been thinking about that.
We may bring that up in the discussion, but decentralized design in particular, designing apps and platforms to be decentralized, actually makes them more manageable. And that may be quite a valuable EU principle, even if they do not scale. You can make them interoperable; you can try to think of different ways. But scaling in itself, which is of course the marketization logic of the global level, is not a value in itself, I would say. So I think we should stimulate experiments with that decentralized design and with helping nonprofit actors. So, in sum, and I need some water before I sum it up. Many of us in Europe have been complaining about the American platform companies a lot over the past few years; I was one of them. If we feel squeezed between those two ecosystems, made in China and made in the USA, it's time to rethink our own architecture, our own design, and our own governance of platforms. Indeed, as I said, we're all responsible for creating a fair, open, digital society. And by all, I mean engineers, I don't know how many there are in this room, policymakers, regulators. I also mean academics like myself, but particularly also citizens who care for the society they live in and who want to govern it democratically. I think we all need to collaborate on the design and governance of these platforms. The current techlash, as I described at the beginning of my talk, doesn't necessarily lead to a dystopian future. I refuse to believe that it necessarily brings us into some kind of dystopia. And I see very encouraging signs coming from public counter-power. In the online world, we see many local initiatives, for instance taken by cities.
I'm now involved with several of these initiatives in the city of Amsterdam and in several other cities in the Netherlands, with public broadcasting systems and a lot of public organizations who want to collaborate and provide alternatives. We need those initiatives, and we need to support civil society efforts, also by raising awareness at both the national and the supranational level. And I really believe that over the past year, actually after we had already finished the book, so I'm sorry I couldn't include it anymore, there has been more awareness and more consciousness about what nonprofit civil society actors could do at this level. So, closing on that hopeful note, there's certainly a lot of hope in that area, and the idea of platform counter-power will hopefully be the topic of my next book. I will leave you with that thought. Thank you very much, Jose, for that very inspiring talk. I'm trying to do two things now. I've prepared some questions that only partly correspond with what you've been talking about, so I'm trying to react in the moment here a little bit. You talked about European values and the lack of them in the architecture of the US platforms. You said the European value system is much more multi-stakeholder, so to speak. Now, my question would be, can you explore a little more what you actually consider European values to be, and how they could be built into those GAFAM platforms or disruptive technologies, or whether that's even something to be wanted? Right. First, I didn't say European values. I used the term public values for the kind of values that I listed. But then, of course, in the system of European democracies, we have the tendency for state, market, and civil society actors to cooperate much more. So that is very much a European way of working. So I wouldn't call that a European value, but a European preference for collaborating.
Yeah, we see them on different levels, but one example that I want to pull out is the creation of identification systems. Increasingly, we're using Facebook login as a sort of universal identifier for all kinds of platforms, Spotify, for example. All these companies are trying to become the one and only universal identifier, which means they can follow you across the internet, right? That's extremely valuable. In my mind, from a European perspective, what I would find very valuable is an identifier that works like a passport, it doesn't have to be global, but it has to be a passport sort of idea, and that is provided neither by the state nor by the market, both of which come with problems, as I just explained. It might instead be provided by civil society actors. In the Netherlands, such an identification app is currently being created by a university in collaboration with a nonprofit foundation. Having that alternative at least doesn't force you to go either with a market system or with a state system. And in terms of providing infrastructural services, I would very much applaud the emergence of civil society initiatives, nonprofit, non-state, non-market initiatives, that could provide an alternative to what's already out there, which pulls you towards a universal identifier controlled either by the state or by the market. In response to another thing you were saying, that data belong to citizens, that this is more or less the European perspective: that sort of private identifier basically means that your data belong to you, to the citizens. What does that do to the business models? I mean, it would do different things to different businesses, right?
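The passport-style identifier described above typically works by disclosing single attributes rather than a whole identity: the platform learns one fact and nothing else, so there is no login trail to follow across the internet. A minimal sketch of that idea, with an invented issuer key, and a symmetric HMAC signature standing in for the public-key credentials a real system would use:

```python
import hashlib
import hmac

ISSUER_KEY = b"demo-issuer-key"  # stand-in for a real issuer's signing key

def issue_attribute(name, value):
    """A nonprofit issuer signs one attribute (e.g. 'over_18')."""
    message = f"{name}={value}".encode()
    tag = hmac.new(ISSUER_KEY, message, hashlib.sha256).hexdigest()
    return {"attribute": name, "value": value, "signature": tag}

def verify_attribute(credential):
    """A platform checks that single attribute and learns nothing else:
    no name, no birth date, no cross-site identifier."""
    message = f"{credential['attribute']}={credential['value']}".encode()
    expected = hmac.new(ISSUER_KEY, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["signature"])

cred = issue_attribute("over_18", "true")
print(verify_attribute(cred))  # True
```

Real attribute-based credential systems use asymmetric cryptography, so a verifier can check a credential without being able to forge one; the shared HMAC key here only keeps the sketch short.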
It would be different at Amazon, but let's say for Google and Facebook, because their business model basically relies on data harvesting. So what would that do to them? Would that be possible? It's doing a lot to the business model. It basically undermines the business models of the big companies. Now, the companies have been very protective of their business models, but over the past few years they've also run into problems because of those business models. They're seeing that in order to design and maintain trust in the digital society, they also need to open up to that society, and we can no longer rely on their systems for a lot of trust issues, as we have realized over the past few years. The companies now begin to understand that in order to keep that trust and rely on people for generating their data, they have to come up with solutions to the problem of data ownership. And I think most of the companies have now begun to call for regulation in that area, which is interesting; they've only been doing that for the past six months. It has become clear that it is very expensive to come up with models that everyone will trust, and that's why multi-stakeholder arrangements would work much better in that respect. So that's why they're opening up their stakes, literally, to other stakeholders. Those are all, if I understand you correctly, and as I perceive it, pretty recent developments; a change that is just coming about. We've had times when many activists and scientists asked, why are we so apathetic to what's happening? We know what's out there, and we know what's being done. Edward Snowden didn't really change a lot, for example, and he kept asking that question.
Now it is changing, we can say that, maybe, but it's happening at a time when Europe, the whole concept of Europe, is heavily contested, and there's a large populist right movement that basically wants to attack the very same institutions that the GAFAM platforms want to circumvent anyway, right? Is it legitimate to see it like that, and what can we do against it? This is a really intriguing question, one that I can't answer in just a few minutes. These developments happened after we finished the manuscript for the book, so, you know, we always finish a manuscript at a very bad moment, but this was when we saw the populist movement making big gains, also in Europe. The interesting thing is that in the US, over the past few months, there have been calls for breaking up big tech companies, coming not only from the Trump administration but also from the Democrats. So the heat is now on these companies from very different ideological perspectives, and that, I think, makes it interesting. Because if we look at Europe, if we look at the populist movements here, they have become very dependent on, for instance, Facebook for making their points of view popular among big populations, in Italy particularly, but also in Hungary. So the question of what, ideologically, we need to invest in has actually created a lot of tension around power, not only between the United States and China, but also among European states themselves. So it has become a very ideological debate about how power is distributed in a system that, intrinsically, is not owned by European companies. I think there's, again, an ideological problem if we juxtapose the ideology of the platforms with that of the legislators at the European level.
The platforms say, and you cite this a lot in the book with your colleagues, that they want to circumvent government, that they do a better job than governments, that they're basically bottom-up, grassroots types of organizations, that this is built into their architecture, and they do not even mention the role of the state or the nation state or the government. I keep wondering how top-down governance got such a bad name in that whole discourse that you actually cannot mention it anymore. It would be totally impossible for a legislator to say, yes, what we're trying to implement here is a top-down process. Do we have to look differently at the top-down model, or are there other ways to actually organize it bottom up, like the platforms say they do? Well, that's become the million-dollar question, of course. I don't think top-down governance works any longer. Facebook is trying to govern its own blue app top down, and of course Messenger and WhatsApp and Instagram. That doesn't work. Take the moderation model that Facebook uses: it is desperately trying to keep control of its own moderation. It's now hiring 30,000 moderators, I believe, to moderate and control its content, and that, I've heard, is more than there are journalists in the United States. So it's becoming crazy, but also incredibly expensive, to take an entire system and implement it into your business model, because it doesn't make any business sense. It's simply too expensive to do that. And on top of that, you're not going to be trusted more, because now all the content moderators are steered by Facebook. That's not going to work, not in a million years. So I think they have to look in a different venue for other solutions.
It's also not going to work when the state takes over that moderating function. I don't believe in that either, because in that respect we would have a Chinese system, right? What is the solution? I just pointed to it in one of my last slides. I think we've been focusing too much on the global level, whether it's content moderation or identification apps or whatever. At the global level, you're not going to find any solutions; that's where the problems start. So I think the solutions that we need to introduce to counter those problems have to come from the local or national levels. The institutional level is so important. At the same time as the power of these platforms has increased, we have also seen the power of institutions decrease. That's something I call institutional collapse, which I think is a major and underreported problem. Institutions have been pretty much left out of this equation. That's because they're bypassed by the big platforms and undermined by the state. Look at what Trump does to the EPA, to a lot of public institutions in his country that he doesn't care for. He just doesn't give them money, like the public school systems. And that's why the institutional level is so incredibly important. That's the level at which we hold the fabric of our society together, and that's what I think we need to invest in at the national level, because most institutions, except for a few European ones, are of course governed at the national or local level. It took me a lot of guts to ask that question about rethinking the top-down process, but I knew it was going to lead to an interesting answer, so I'm pretty glad about that. You also talked about platformization as privatization in your talk just now; you do that in your book in a different way as well.
And I kept asking myself, is this really a process that started with GAFAM, with all those big platforms, with digitization, or has it been going on for much longer? Think about public-private partnerships, which even leftist cultural politics in Berlin talk about all the time. This is not just a neoliberal discourse of big corporations who want in; it's an old concept, and it has actually done a lot of good things too. But is it the same, or has it led to this? How do you see the relation between public-private partnerships and platformization, which is actually privatization, as you called it? Of course it didn't start with the first platforms. It didn't start in 2000 or 2001; Google started in 1998. It started in the 1980s, of course, with the preference for public-private partnerships, which in practice often meant privatization. Think about Big Pharma, for instance: its roots lie in the 1980s, when pharmaceutical knowledge was opened up in public-private partnerships. What that actually meant is that over the twenty years from the 1980s to the 2000s, that knowledge became privatized. You saw exactly the same thing happening in what I now call platformization. Think of how the big platforms started out. They were there for the people, for users; not as companies, but as facilitators for the public to do their own thing. There was so much promise in that idea of bringing facilities straight to the audience, straight to the users. And almost nothing has come of that promise. Once again, when you bypass institutions, when you bypass people's organizations at the local level, then you have no choice but to become privatized. Does that make it clear? I'm not sure if I'm explaining it very well. No, I think so.
A very interesting point when it comes to agency. I know your take is not purely normative, it is analytical, but it has some normative aspects, even tonight in your talk, and I'm really thankful for that. You write about a functional taxonomy of platforms that could help sort out public questions like antitrust law and competition law, but maybe also taxation, which has already happened to some extent. And in Germany, you talked about Wastashea. But again, who would be the agent of that taxonomy? Would it actually be science? Or would it be science in concert with politics? How would that process go? Well, that's why I also talk about who the responsible actors are. There's no such thing as one single unifying solution to all the problems. It's not that the European Union or the European Commission is going to find one single solution, or just one committee that's going to solve it. The need for distributed counter-power, which was basically my last argument, is what's needed to bring power back to the various levels, and I think that counter-power is different at each level I mentioned. At the supranational EU level, we definitely need to look at competition law, antitrust law, and privacy law, of course. For one thing, I showed Margrethe Vestager, who's been terrific in understanding how the platform system works. And yet, if you look at the outcomes of her verdicts, you see that antitrust law alone cannot solve the problem, because those cases are bound to argue legally within the limits of companies, consumers, and consumer markets. That's not going to solve a lot of other problems that don't fit exactly into that definition we make of markets: not just citizens' privacy, but a whole lot of other values that barely figure in those legal frameworks.
And that is why I argue for a comprehensive approach to those laws. I'm not saying these legal frameworks no longer suffice; I think they need to be updated. But particularly in terms of harmonization, they're not doing the right thing right now. It's too small, too little, so we need to think bigger than that. Some legal frameworks are actually there, and I think sometimes they're simply not being enforced, as in the case of terms of service. We've seen that with Spotify, where Spotify tried to stop the funding of the Swedish scientists who wrote a book about Spotify, Spotify Teardown, a very interesting, experimental scientific book. Spotify asked the Swedish state to stop the funding because the researchers had apparently gone against its terms of service. And the Swedish state said, well, no, we're not going to do this, because your terms of service have basically been illegal, and you have to change them. That's something I don't hear a lot about. But I wonder, is this some sort of quasi-legality that is being enforced, or not enforced, there? And why does the state not interfere more often with things that sometimes clearly do not correspond with human rights? Interesting question. I think states could do a lot more than they're actually doing right now. They look to the European level and wait until it comes up with something like the GDPR; they wait on a lot of privacy frameworks. I don't think they have to. I think nations should be a lot more creative in finding their own legal perspectives and their own national perspectives. But even more importantly, I think it's at the city level that, over the past year, we're seeing a lot of innovation in finding those principles or rules. Civil or city? I'm sorry? Cities. Cities, yeah. Municipalities, cities like Amsterdam. For instance, the city of Amsterdam, I'm currently involved with some of their projects.
They're working extremely hard to find a set of principles according to which platforms can enter at the city governance level. For instance, they've been struggling a lot with Airbnb, not one of the big five, but very important for the city of Amsterdam, and probably for Berlin as well, as far as I can tell. They've come up with various principles, like the 30-day principle: you can no longer rent out your place for more than 30 days a year. They have installed a register. They've come up with all kinds of small measures to counter the one platform, Airbnb. Now, over the past year, they've realized that they need an integrative set of principles to act upon, and that is a different perspective. It's no longer about targeting one specific platform that is messing with the city's rules; it's about looking at it from a very integrative perspective, and that, I think, makes sense. Like: if your platform wants to operate within the city limits of Amsterdam, you need to open up your data to the city for oversight, right? For instance, one of the problems is that Airbnb has not opened up its own statistics, its own analytics, to the city of Amsterdam. So now they can pretty much... Do they do that now? No, they still haven't come to that point, but they're thinking of trying integrative strategies that force platforms within their city limits to act upon those rules. But of course that takes a different perspective. You cannot just go one-on-one with each platform and decide every rule one at a time. You need an integrative perspective, and I think they are currently working on that. It would be really interesting to see the day when Spotify actually has to open up its statistics and its data. Well, it's different for Spotify, of course. But that's another point. We always think of regulation as one size fits all.
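The 30-day principle mentioned above is the kind of rule a city register could check mechanically, provided platforms opened up their booking data, which is exactly the sticking point described here. A toy sketch, with invented field names and numbers:

```python
YEARLY_CAP_NIGHTS = 30  # Amsterdam-style cap on short-term rental nights

# Hypothetical register data: each booking is (start_date, nights).
def nights_rented(bookings):
    """Total nights a listing was rented out in the year."""
    return sum(nights for _, nights in bookings)

def exceeds_cap(bookings, cap=YEARLY_CAP_NIGHTS):
    """Would the city register flag this listing?"""
    return nights_rented(bookings) > cap

bookings = [("2019-03-01", 10), ("2019-07-12", 14), ("2019-10-03", 9)]
print(nights_rented(bookings))  # 33
print(exceeds_cap(bookings))    # True: 33 nights exceeds the 30-night cap
```

The check itself is trivial arithmetic; the governance question is entirely about who holds the booking data and under what obligation they share it.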
The problem with the platform society is that there is no one-size-fits-all governance. And that's why you need to look at platforms in a differentiated manner; I have made one attempt to do that: some of them are more infrastructural than others, and content moderation is a totally different thing in social networks than in, for instance, app stores or identification systems. So you need to look at them in conjunction. You need to see what the differences are, but also how they operate as a whole, as a system. You also talked a little tonight about education, not just higher education but elementary education as well. You do that in the book too, in a very interesting chapter. Before we open this up to the audience, I wondered if we could talk a minute about the idea of the public good, which can mean a lot of things and which is, I think, a very contested term: what it actually entails, who says what's public and what's private. It's a relation that has been contested for 200 years, I think, more than that. But again, the problem you outlined there is, of course, also one of surveillance, right? If Google provides those cheap laptops to all elementary schools, with all the presets and pre-installed apps and so forth, everything can be tracked. Transparency, so to speak, always entails problems of surveillance. If you see the public good as something that is tied to transparency, you still have the surveillance problem built in. So could we frame, in this context of education, the idea of the public good also as one of secrecy, of something that's actually hidden, even in public schools? And how would you do that technologically? Well, that's precisely what I meant by a trade-off, a negotiation between privacy and surveillance. That's a constant trade-off, and within the institutional context of the school, you need to make that trade-off.
That decision and that negotiation should not take place at the company level, in Facebook and Silicon Valley. It should take place in the schools. That's where those data are generated, and that's where children have the right to be protected against certain surveillance systems, but also the right to be protected in terms of privacy. Now, there's another public value that I would like to point out in this respect, and that's autonomy, professional autonomy particularly. Many of these systems, and this is something I read in that New York Times report that I showed, in the Facebook system for instance, come with a centralization of how these apps are made: they're produced in Silicon Valley and then exported, and that takes away autonomy and professional agency from teachers and recreates it at the engineering level. So you also make it less attractive for a teacher in a school system to create technology that helps children, and by outsourcing that not just to a private company but to an engineering company, you leave that autonomy to engineers who are basically reinventing education from an engineering perspective. Now, there are two solutions: either Facebook brings non-engineers, educators, pedagogues, didactic experts, into its system, or, the other way around, you try to make teachers more aware of the engineering systems. I don't know which one is hardest to achieve, but I think at both levels we need to... The latter, probably. To collaborate, probably. That's why I also make an argument for collaboration and for understanding how technology works from a collaborative perspective. Thank you. I think it's time to open up; we've been talking for 25 minutes, as promised. Please feel free to ask your question in German; if I can't understand what you're saying with my school German, I will ask my neighbor to translate it for you.
Please feel free to speak your own language, except Chinese, because I don't know that language. I'm trying to keep track; there was a gentleman in the third row who raised his hand first, that's you, and then the woman in the back here. I hope I'm getting the order right. You, please. Okay, shall I do it in German? Yes? Okay. Rudolf Hilscher, I come from the academic world. I have two topics that you haven't talked about, and I wanted to ask you to say a little about them. The first: there are also other platforms, the B2B platforms. We're talking about Industry 4.0. I remember my father, in his old days, wanted to buy a Jaguar, because that had always been his dream. Then he got in and said to the seller, it looks like a Volkswagen. And of course that's down to these B2B platforms, because, so to speak, these parts are built all over the world. Please. But of course. So that's one topic. How do you translate B2B platforms? A B2B platform is a B2B platform. Oh, okay. Business to business, so to say. Yes, in the Industry 4.0 context. Sorry, I heard it differently. No, no, B2B. Yeah. The other question, which you didn't really focus on, is: where are the minds that are able to create such complex platforms? They of course have a lot of say in this. You said it has to grow more from below. But if the best minds are bought away, for good money, by the companies that have the money to pay accordingly, then of course we have another problem. I don't know if you have numbers on that, but it might be interesting. Oh, the last question I may have to come back to you on. Business-to-business platforms, very interesting. We haven't focused on that in the book. I know how important they are, and I know there's a lot of literature out there on business-to-business platforms. But it's simply a choice that we made, so I don't know much about it from research. I know they're incredibly important.
And I know, for one thing, that for the big platforms themselves it's an incredibly important angle. But unfortunately I haven't done much research on it, so beyond what you already know, I probably can't add much. As for the second question, I'm not sure I understood it right, with my basic German. Was that about how local and nonprofit apps, for instance, can actually make money, because they have to be funded somehow? Okay, right. Then I just understood the first part. Very important topic, the brain drain; that's one of the most important issues, and it's currently under-examined: the brain drain away from public institutions, where people are actually working on nonprofit platforms, for instance. Of course there's much more money to be made in Silicon Valley and with the big companies, who, as soon as they spot talent, take it away. Unfortunately also from the schools, which I think is a very sad thing: they're taking engineering capacity away from the schools and privatizing the brains, right? On the other hand, and I would like to add this, it may be a side issue, but another form of counter-power that I see happening in Silicon Valley is with the big companies' employees. A lot of the engineers, over the past year, have been putting their stocks to use and raising their voices. You saw the big Google walkout last year, and it meant that platform employees, company employees, are beginning to understand that they also have a particular kind of power that they want to use, even if it's against their own owners, right? In fact, you might argue it's against their own interest. But they have been the ones arguing for equality in the workforce. They have been the ones arguing about, for instance, Facebook's content moderation.
They've become an increasingly important force, not only because they work there and they are the brains that make Facebook and Google valuable as companies, but also because they have been using their stocks at the stockholder level, the owner level, to become increasingly influential. Thank you. I think there was a second one. Have you seen him, Christian? Where was he? And then there are three in the back, right? Thank you very much. To a certain extent, I represent the bad guys a little, because I represent the European Commission here. But not as a bad guy. No, not as a bad guy, I would say. I spent 20 years working on the embedding of values into European policy, including with the President of the Commission for more than 10 years. My question is really related to the notion of values and value systems in the digital policy frame. I heard that there is some difficulty in identifying values that should be considered as deserving the label "European". On that point I'm a little perplexed, because in the Constitutional Treaty of Europe, in the Lisbon Treaty, we have the Charter of Fundamental Rights, where the values are listed. They are an integral component of the way policy is designed, and it is in the name of these values, when applied to citizens, for example on the protection of autonomy, that the GDPR has been possible, because it was based on a value-system approach rather than on a market approach to the movement of persons and goods. So there is a reference there that is less publicized than it probably should be, but it's already there.
My perplexity, and this is the question I pose to you, is about the need for a kind of charter of digital values. The basis of human rights is there, and on that basis it was possible to identify fundamental values of citizens. But the basis of citizens' rights and personal liberties in the digital world is something new, and there is a lack of the cultural and legal elements that could be considered for establishing it. So would you consider, as a further step needed at the European level, the establishment of a charter of European fundamental values in the digital world? For one thing, I just want to address what I think is a misunderstanding. I consider the EU one of the best forces we've had in terms of setting the standards for these values. I think I said that at the beginning. The best steps that have been taken in this respect have come from the EU. I called Margrethe Vestager one of my biggest role models. So that's for sure. The GDPR, I think, is an amazing piece of legislation. It took six years or longer to implement, and I think it's an amazing piece of legislation. My point was not so much a critique of the EU. I know how incredibly difficult it is to reconsider or reassess frameworks, but I think what is needed is also a sort of rethinking and harmonization of the various areas in which we currently have very compartmentalized legislative frameworks. So I'm arguing more for collaboration, for collaborative frameworks, for harmonizing those frameworks, than I am set against any one of them. I think the EU level has been admirably on top of that. As for the second part of your question, I'm not sure if I understand it completely, but I have not argued for specific European values to put in there.
I think there are common public values that should also be common public values, for that matter, in the United States or in China. They are part of the law; I mean, they're legal frameworks in and of themselves. So I'm not in any sense arguing against that. I think what is needed particularly for the digital is what Tim Berners-Lee is also arguing for. He sees that the internet, some call it colonized, that the infrastructural part is increasingly owned and operated by a few major forces. And in that respect, we could reinforce these public values as the basis of all internet traffic. Is that more of an answer to what you were referring to? I'm not sure if I answered your question correctly. Let's hear it in the back. There's a lady who's been waiting for a while. Hi, hello. Thank you so much for a very interesting talk. I come from the University of Sydney, so I'm pretty far away, and I'm a visiting scholar here now. I'm interested because in mainstream debates, and it's been popularized in the academy by Zuboff's book on surveillance capitalism, we always talk about data as oil, the data economy, and how the data economy has radically transformed capitalism. But sometimes we forget that data without the right interpretation, without assembling data, without giving meaning to data, without making sense of it, would be valueless. It would have no value, right? And very often, when we consider platform capitalism, we tend to overlook, for example, the immense billion-dollar economy of data brokers. And Europe is already squeezed between China and the US in the data broker economy. So my question is this: I really believe that one of the most important battles for Europe would be not to lose in the space of artificial intelligence, so making meaning, making sense, and machine learning, for example.
So I wonder, in this quite worrisome climate for Europe, if you think there is a possibility to reshape this artificial intelligence strategy. I know we have a representative from the European Commission here, and we are not fans of the recent communication by the European Commission, because it seems to be too focused on the market, and you yourself said so half an hour ago. So do you think there is the strength to re-establish these values? I don't want to call them European; I want to call them common values, public service values. We see them in the Charter, in the Amsterdam Charter, but also in the very famous communication that established public service media in Europe, for example. They were there and they were common. So do you think Europe has the strength not to lose this battle and to shape artificial intelligence in this way? That's exactly the point I was trying to make, so thank you for summarizing it quite well. About the report that I just showed: it's not that this is a bad report; I think it's doing very good things. It's only that it puts ethics on the last page, and I think it should be up front, so you may call that a minor quibble. You rightly point out, and I don't like to talk about data as oil, the kind of metaphor that Zuboff uses. I think data, though, is the oxygen of artificial intelligence. Without data there is no artificial intelligence. But as you rightly point out, there's something else badly needed in that equation, and that is water, and that, I think, is analytics. Algorithms are at least as important as data. Without the capacity to turn data into knowledge, which is algorithmic predictive analytics, for instance, or real-time analytics, without those algorithms, there's never a tree. So we need water and oxygen to grow the tree, and both of them are needed to actually make that tree blossom in full bloom.
And those analytics: we always talk about data as a common good, but the predictive analytics that have been used by companies to turn those data flows into a privatized good are, I think, much more under the radar, and we need to pull them out from there. Let me give you one more example: the NHS controversy over the past few years is, I think, a pretty good example, where Google DeepMind came in to provide the analytics. The data were given to them by the fifteen or so public hospitals in London, and the analytics, in fact, became privatized, not even the data, but the analytics that came out of that data processing. So we need to focus not just on data, but also on the analytics that are used to process these data. And thank you for the summary, which I totally agree with. Two, I don't know who was first; no, I saw three. You're already there. Thank you very much for the highly fascinating lecture. I'm probably one of the few here in this round who actually has no idea about all of this. But you explained it so fantastically clearly that, to put it bluntly, I asked myself a question. A few decades ago, no one was able to imagine that Russia would no longer be a superpower. How do you see the role of Russia in the context of all this? Very interesting question. Russia, yeah, Russia was not on my map, the map that I showed you. I just concentrated on the two. Russia is an interesting case. Russia has been presenting itself as an undermining force rather than a superpower in the digital sense. Rather than positioning itself as another superpower in this global equation, it has been concentrating on undermining the political, ideological, and digital forces that are actually now governing that global ecosystem, or system of ecosystems. I think that's an interesting position that Russia is taking there.
And of course, most visibly in the undermining of elections: the recent American election, but also elections in other parts of the world, and the European elections, of course. Why is Russia doing this? You would have to ask a political scientist; I'm not a political scientist. Someone who is very well versed in geopolitics could probably explain that to you. But for me, it's enough to understand that Russia has taken the position of being an underminer rather than a constructive force. In that respect, we're not going to get much help from Russia. If anything, it will try to pull Europe apart, playing the European nations off against each other, and it will try to undermine China and America at the same time, I believe. On the other hand, it also bespeaks Russia's impotence in coming up with its own system, and that, I think, should make Europe different from Russia, because in Europe we should really concentrate on designing a principally different architecture. Just undermining, I think, is a very destructive force, and I refuse to believe that it's impossible to create a counter-force that is also very creative, and that can actually help generate new business models, for instance, business models in that European model that may not scale globally, but that are very productive on a national scale, or a local scale, or an institutional scale. This strays from your question about Russia, but... Please, there are still a lot of people who want to ask questions. Unfortunately I have to interrupt; there are two or three other people left, and time is running a bit short. Maybe you'll have the chance to continue the question afterwards. Go ahead. Thank you very much for your very illuminating talk. I have just one comment about the last topic that you discussed. In one of my capacities, I have three school-aged children, and I'd like to suggest an alternative answer to your question.
You asked whether the teacher should be educated in the engineering questions or whether the engineer should be educated in the pedagogical questions. And I think I would be in agreement with most parents of school-going children that the alternative would be: don't let the crap into the schoolroom. You know, when in doubt, switch them off before you come into the room. As an alternative, I would suggest empowering teachers to find creative ways to use them, but the power should be with the teacher, obviously in interaction with their children and with what they judge to be good for children. We should be educating teachers to be good pedagogues, not letting international companies bring their machines into classrooms. But anyway, I do have a question, though. My question is: what role do political parties play in your story? I was looking at your wonderful graphic with civil society, market, and government, and even though my eyes aren't as good as they were, I was squinting hard, and I didn't see political parties anywhere. The background of my question is that most of us will probably agree that we're experiencing a crisis of democracy. This crisis of democracy has a lot to do with the role of social media, but it's also, primarily, a crisis of political parties. Major political parties in Germany, like the SPD, are getting washed down the electoral toilet. The party system in Britain is crumbling in front of our eyes in the most remarkable ways. So what role do you see for political parties, as mediators or whatever, in the story you're telling? Thank you. Very good question. And I'm sorry, it wasn't in the graph. You're right, political parties are not part of that graph. There are many other actors not part of this graph, by the way, because otherwise it would be completely filled up with actors. But you're very right. As a matter of fact, we published a Dutch version of the book in 2016.
That book was picked up by Dutch politicians more than by any other actor. I was very surprised. I thought I would be invited first by schools and by the various institutions. But it was political parties that invited us as authors to come and talk to them about the political implications of what we were arguing. And as a matter of fact, just two weeks ago, I was called to a Dutch parliamentary hearing on the digital society and how to govern it, exactly the topic that you mentioned. This was organized by the Greens and the Dutch SPD equivalent. I think they're only picking up this theme as an important theme very recently. As I explained, the political level at which it's picked up most is the city level, and that I found also very surprising. So at the national level, it is political parties that are picking up on the theme, but at the local level, it is municipalities who are trying to implement new systems that are driven by platforms and actually fed by data flows. At those two levels. To my distress, the middle-ground political parties, like the German SPD or its Dutch equivalents, didn't pick it up at first. Only very recently have they turned it into a political theme. And now I think it's actually the political parties who are calling for more awareness, but also for better legislative frameworks. As I said, this has only been happening since January. So that's the difference I see from, say, two years ago, when no political parties were actually interested in this topic. We have one more question and then we have to wrap it up. I'm not sure who was first. Christian, do you have an overview? I don't. Who's the tallest? Yeah, I win. Thanks very much. I'll make it as quick as I can. In the work that I do, sometimes conversations come up about: what are we doing this for? What's the internet for?
Why are we doing this as humans? And sure, Silicon Valley will tell you what the tech is for, and we see that actually the tech is for something else. So I wanted to ask you if you think there's space in your list of things where we could add something else. It might be off topic, might be a little out of scope of the title of the talk, but the item that I would like to see added is something to do with the non-platform, the non-online. We have the right to be forgotten, et cetera; what about the right to be unconnected, or disconnected? I'm a little concerned. You mentioned identity provision at the beginning of the talk, and sometimes there are suggestions that Facebook might become some kind of passport provider. In Mexico, it's not really possible to use BlaBlaCar without a Facebook account. And in Germany, people are still quite into cash, but there are other places in Europe where participating in the economy is being completely technologized, or whatever. You can't really buy anything without having a phone, and people are like, yeah, this is great. So just to sum up again, sorry: is there scope, in talking about responsible platforms, to talk about the right to no platform? Right, very interesting question. The right to disconnect is basically what you're asking about. Well, I'm thinking about my next book, which will be, as I just told you, on platform power and public counter-power. And perhaps one of the chapters will be on the disconnected. I think it's a very interesting sort of power, where people deliberately choose not to be connected: not to have a mobile phone, not to connect to Facebook. I'm actually a Facebook refusenik myself. But there are actually people who refuse everything, who simply want to disconnect. And I think that's a very interesting power. It's a minority right now, a decreasing number of people. But I think it may be a very important signal to put up.
The problem is, though, especially among the younger generation, you can hardly do anything professionally without being part of that infrastructure, without having a Facebook ID, without having a login, without having a LinkedIn page. Many professional environments require that you are part of that digital platform ecosystem. And that makes it so hard to be a deliberate protester, a disconnected person who wants to turn that into a signal. Still, I think it's a very interesting movement, people who are deliberately disconnected, and I'm going to do more research into that, I promise. I have asked that question a couple of times, if I remember correctly, at the beginning of this year, about disconnecting. And the answers I got that sounded very reasonable to me usually were: oh, you have to be really privileged to do that. Most people just do not have that kind of privilege. Or old enough to do that. Or old enough to do that. And let me close with a question that is maybe in some sort of relation to that disconnecting, but it's about growth. Because one of the laughs you got tonight, at least from me, was when you quoted Jeff Bezos with the consumer in the classroom, and so forth, and with Facebook in the classroom. That's funny, of course, but apparently to some people it's pretty normal to view it like that. And I think most of us have become quite at ease with the notion of being consumers, or users, which is not very far away from the idea of being a consumer, not a citizen. I kind of perk up at "citizen": what's a citizen? It's a totally different concept. The concept of the consumer, of course, is to get exactly what he or she has paid for, or hoped for, or thinks that he or she ordered. The citizen is a totally different concept, and the consumer concept usually leads to growth. And internet traffic is not called traffic for nothing, right? Because it's traffic.
And because it grows and keeps growing. Do you think, apart from all that multi-level, multi-perspective legislation and collaboration you talked about, we have to think about degrowth in order to get a grip on these things on the internet, too? Yeah, absolutely. And you didn't even mention one area that I think is most important in that respect, and that is sustainability. Data centers, the last time I heard, are going to use up 25% of all our fossil fuels. And of course, they're trying to become more sustainable. But the most remarkable figure I've heard over the last few weeks is that Microsoft is going to build a data center close to Amsterdam which is going to use more energy than the entire city of Amsterdam. And for that, they will need an area larger than the province of North Holland to actually feed their own data center, which is using up more energy than the entire city of Amsterdam. That, I think, is mind-boggling. We're trying to implement notions like values, like sustainability, into our economy, and by the same means these companies are trying to do that too, by putting up sustainable energy solutions. But these take up so much space and so much energy that we're forgetting about precisely this thing: why do we need all this growth in this area? Why is there no point at which we're satisfied with the amount of data activity that we're doing each day? Do we have to go to the moon to become aware of the fact that we're eating up our own planet? I don't think we should go in that direction. But that's a sad note to end on, don't you think? Oh, I think we should go to the bar, not to the moon. I promised Tobi to be very utopian, which I can be, but actually I promised not to be dystopian. I said in the introduction that it was going to be a jolly night, right? And now we're ending on that note again. So we shouldn't end on that sad note. Actually, I am an optimist.
So I think we can stop that before it happens. Okay, we'll take that. Thank you, José van Dijck. Thank you for being here. Thanks for being with us. Thank you for being such a wonderful audience.