Hi, good afternoon. Welcome to the Future of Democracy, a show about the trends, ideas, and disruptions changing the face of our democracy. I'm your host, Sam Gill, and for those of you who have listened to this show or are tuning in for the first time, what we try to do here is give you the op-ed page of our democracy. We try to break down some of the most contentious, challenging, fractious issues that are happening and explore what they are, what they mean, whether they're real, where they're leading us. And we're sticking today with an ongoing discussion we've been having over the whole summer, really, about the role of social media in our democracy. Almost daily, during the final sprint of the election, the major social media platforms are adjusting their policies to address a seeming tsunami of election-related misinformation, demagoguery, and conspiracy. Our guest today saw earlier than most that technology, for all its promise, held the potential for great democratic peril as well. Zeynep Tufekci is a professor at the University of North Carolina, a columnist at The Atlantic, and the author of Twitter and Tear Gas: The Power and Fragility of Networked Protest. Please welcome to the show, Zeynep. Hey, how are you? Hi. Thanks for coming. Thanks for inviting me. So we're starting with your victory lap. You were very early to point out what the dystopian implications of globally networked technology could be, especially some of the things we're really seeing right now: the use of microtargeting and what that does to preference formation, and the way in which technology can accelerate extremism. So do you feel vindicated or horrified? Well, I feel both horrified and mad. I keep feeling like saying it's not all terrible. There are lots of good things. There's a lot of reasons for optimism. It's just important to pay attention to these difficulties of a transition so that we can do it in a healthier way.
So I feel almost like a slight contrarian urge to say it's not hopeless. It's not something to despair over, but something to think deeply about, about how to do it better. So, folks in technology often talk about affordances, the things that technology allows us to do. What are the elements of these tools, particularly social media, that you believe are both capable of enabling good, but are also the instruments of some of what is concerning us right now? So the way I think about it is that this changes the fabric of society, because it changes how we can connect to one another, what's visible, what's not visible, what's archived, what's permanent, what's ephemeral. So it goes to the heart of everything. It's changing how time and space are experienced. This is a profound shift, and it's way beyond "this is just a tool, we can do good with it or bad with it." What you're doing is changing how everything is organized. When you look at it that broadly, no wonder we're struggling with this. And what you're talking about in terms of my earlier work, when I was talking about some of the dangers, it's still similar topics. One of the key differences, for example, is that with social media, we have things that are at the same time mass but not public, right? In traditional broadcast, the things you have that are mass are also public. You still have gossip, you still have socialization, you still have people talking to one another, but it's not fast and mass at the same time. So you might have a rumor going around in one town, but it can't just all of a sudden sweep the whole country without being visible as some form of broadcast. So that's one big difference.
The other major change is that when you have socialization, where people are just talking to each other about whatever they think they're talking about, it's not mediated by a platform like we have right now, whose business model is to capture our attention and to sell our attention to advertisers by profiling us, right? So this is a huge difference, because the things that capture our attention are things that are novel, interesting, exciting, things that bring us together, things that bring us together against another group. Those are things that feed polarization. So we have an ecology on social media that, by business model, facilitates polarization, because that's engaging, and polarization is the flip side of bringing people together. You know, you get into a team, you go rah-rah for your team, you fight with the other people. It changes, I could go on and on, it changes ephemerality, in that lots of things we just would have said and that would have disappeared into time are now archived permanently. And it changes who can be targeted. On something like Facebook, you can upload, you know, a thousand email addresses and say, find me, using the machine learning that Facebook has developed, millions more like these people. And Facebook doesn't even have to understand what on earth you're looking for. So if you upload a thousand people who are prone to conspiracy theories, Facebook's ad engine will nicely and dutifully give you a million more who might be prone to conspiracy theories. And then you can microtarget them in a way that's mass but not public. So it changes everything about the way our fabric of information and socialization operates. So let's talk about this question of intermediation, because certainly newspapers and broadcast television were also selling attention to advertisers; they were aggregating audiences. And there are two things that you've brought up that some people point to.
So one is that, to your point about mass and public, they certainly saw themselves as serving the whole of the market, as opposed to just the conspiracy theorists. So one thing that's gone wrong is the fractious nature, the balkanization of the audience. But the other argument people make is an ethical one: that these gatekeepers, these intermediaries of the newspaper and broadcast era, had ethical standards about what would count as authoritative information. And an algorithm may or may not have ethical standards, or it may have unethical standards that are much more focused on attention and engagement. So I guess my question is: is this the right comparison? Should we be comparing intermediaries of the past to intermediaries of the present? And if we should, what are the helpful versus distracting points of comparison? So those are very good points. It's useful to some degree to compare, but there are big differences. We had conspiracy-peddling newspapers as well, right? The National Enquirer with its frequent UFO landings and whatever else you want to say. But one of the key differences is that it's still broadcast. If there's a National Enquirer right at the checkout line, it cannot flip a different front page to every person coming by depending on what it thinks that person will be susceptible to, based on surveillance of their habits, right? So there's a big difference between having one big broad thing that's static compared with something that is different for each person, screen by screen. So that's one thing. The second thing that is different is that in an advertiser-supported model where the output is static, you only have one front page, which is the 20th-century model, right? Before that you had different kinds of newspapers as well.
The way to get the broadest advertisers is, in some sense, to remain centrist, to try to get a broad audience, which had pluses and minuses, whatever you thought about it. It was a thing that couldn't divide, because it wanted mass audiences. You couldn't have two billion people, which is mass, but have them splintered into separate publics. So your public and your business model kind of worked with each other. And also it was gatekept, right? There were journalists who were gatekeepers. And while not perfect, I mean, I can give you 30 significant criticisms of traditional mass media gatekeeping, they did have, as you say, ethical standards. They were normative standards that they often fell short of; sometimes things got censored, there was misinformation, there were other failures, but they were still things they tried to go by. And importantly, a lot of those standards and that understanding of journalism, with the code of conduct and ethics you think of, were developed in the post-World War II period, when the world was horrified by the rise of fascism and the role of propaganda in the rise of fascism. So they were also a reaction to the age of propaganda, or the fear of propaganda. In both Europe and the US, scholars and journalists established this new program in which they saw themselves as public watchdogs for the broad public. Whereas, of course, the algorithm doesn't have any such standards, and it can't, because it's an algorithm that's trying to optimize for engagement.
The second thing is, once again, if something is visible and mass and trying to gather a large audience, which is the newspaper of the 20th century, it's a very different setup than one where I can have my two billion people without giving them all the same message. The New York Times cannot do that. Facebook can do that. That changes what their incentives are. So, thinking about gatekeepers: for most of the history of social media, and this is still true, the philosophical argument the platforms have advanced is that they don't want to be arbiters of speech. And part of the argument has been that it's a net benefit to democratize discourse, drawing on all those critiques of newspapers, of everything that gets limited in a world where a few people are making the decisions. And it's still an argument that, for example, Mark Zuckerberg of Facebook will make very, very vocally. This year has seen a real change in the orientation, at least the public orientation, particularly with regard to health information and election information. And as I said at the outset, you're starting to see some of these companies make policy changes that, 12 months ago, I would have said they would have been allergic to even considering. And I'd love to know, what do you make of some of the policy changes? What seems on the right track to you? We've seen some of this: we're getting rid of this kind of content, we're blocking that, we're giving a notification, Twitter is introducing a little bit of friction into the system. Versus, what feels like it's on the wrong track or missing the problem? So, one step back: there's a question different from whether they're arbiters of truth, which is that they're arbiters of attention.
And that's really important, because that's a big debate we should be having. Even when they are or aren't blocking a particular speech or group, they are arbitrating our attention by elevating certain kinds of things, by design, by algorithm, by what they promote, how the retweet button works, whether you have a like or dislike, whether you have a hate emoticon after everything. They are arbiters of attention and mood and context, which is super important. So there's a discussion way beyond speech, because framing it as speech is kind of misleading. We had similar discussions before with mass media too. If you didn't get on television, your speech wasn't necessarily restricted; what was restricted was the mass attention you could get. That's what I want to say, that that's more important. The second thing, which is what you just said, is that they're stepping into this space, and there, too, I have a much bigger concern than what they're doing, which is that they get to unilaterally decide this. I'll tell you what I honestly think they're doing. I think Facebook and Twitter have concluded that President Trump and the Republicans are going to lose this election, and they know that the other side of the aisle has been really mad at them for their lack of action in this space since 2016. I think they're just making a very calculated attempt to appease a constituency, a constituency that includes their own workforce, which has been upset with these platforms, because of what they think the election result will be. And I think this is very similar to what they did in 2016, when they decided, along with most of the rest of the population, I guess, that Hillary Clinton was going to win, and decided to leave the misinformation space alone, because they thought: you know what? We don't want to really upset the Republicans right now. We're just going to deal with it after the election, after Hillary Clinton wins.
So in both cases, whether they act or don't act, it's because they think it's in their political interest, in their interest for staving off regulation, in appeasing constituencies they care about, like their own workforce or the other side of the aisle. It bothers me equally, because the problem isn't whether they're doing something that I personally like; the problem is, who died and made Mark Zuckerberg or Jack Dorsey the king of our public sphere? I do not want them making these decisions on these calculations, which, when you follow it through, look to me like self-interest. Rather, I would like a world in which we as a society get our politics together, get our act together, and we decide where those lines are on attention, speech, hate speech, harms, misinformation, and we tell these platforms what to do, rather than this game of working the referees, which everybody's playing. Trump's tweeting about Section 230, why? He's just trying to pressure them. And when it looks like he's going to lose the election, all of a sudden they're putting labels on his tweets. That's not the way it should be. So how do you create effective public accountability for the reality that these have become public institutions in a sense, even though they're privately operated, publicly traded companies? Their argument, of course, is always: oh, well, if the users didn't like it (we're all users, by the way; we're not people anymore, we're just users), they would just walk away. Clearly, that's not the case. So what are some of the things you've thought about as more effective approaches to public accountability? So the first thing is, the longer this goes on, the harder it is to fix, because they also break our politics, right?
The way our media ecology works right now, from mass media to digital media, the whole thing breaks the politics; the very tool we're supposed to use to fix all this gets further broken the longer we wait to fix it. This would have been easier to fix in 2012, which is what you're referring to. Right after President Obama was elected for a second term, I was trying to say: you know what, this is an opening, let's try to fix this, rather than celebrating that Obama won using big tech; let's think about how we do this. So that's a problem; it's hard. But there's no other way. There's no other way to deal with our societal problems than through political processes that regulate and inform how the public sphere operates. And as you say, there are a lot of things wrapped into this. There's the antitrust argument that we're seeing already, there's competition, there are all these very important questions. But also: what is hate speech? What limits should there be on the business model? What kind of data collection are they allowed? What kind of microtargeting should be doable? What rights do we have as a public beyond clicking yes or no? There are all these questions that we have to address as political questions, through the political process. And there I have a little sympathy for the platforms, because everybody's trying to press them to do this or that with their power. And what are they supposed to do? What would the right answer even look like? They shouldn't be sitting around a table trying to figure out the right answer for drawing the line on hate speech. We should be developing those things the way we have the FDA, right? We don't say to every food company, oh, you decide what's poisonous, or you decide what's dangerous. We tell them: this many parts per million, or you don't get to sell that, right?
We tell them what food safety looks like and expect them to go by that. That wasn't always the case. So this isn't something new. We do it with cars. We say, this is how much you can emit. We don't say to every car company, why don't you have your CEO decide what's good for the planet? So that's the model we have. Plus, because of network effects and all these other things, we have a few big companies that are, you know, monopolies in a new way, right? They don't really fit the traditional definitions completely. Maybe monopsony would be a better word. But yeah, that's what we've got to deal with. Sorry. So I think the FDA and auto safety are two really good examples, though, of how hard it is to build an administrative state that can take these questions on, right? This isn't just an issue of, you know, Lindsey Graham and Amy Klobuchar putting aside the posturing and talking about hate speech. The proto-FDA happens in the late 19th and early 20th century in response to the meatpacking industry, and in response to sawdust instead of aspirin being the thing you're sold at a low, low price. And of course, the auto safety regime comes out of mid-century. I think people don't realize how many people per capita used to die in collisions in the United States in the 30s and 40s. It comes out of recognizing that if we don't have some common understanding of not only how the harm happens but how the safety features need to be designed, we won't get there. But both of those agencies rely on an army of experts who have knowledge comparable to the private-sector entities about the way these systems work. It strikes me that there is a huge asymmetry, and there's maybe a couple of staffers who understand this technology.
I think Congress has come a long way, but I don't see a lot of the administrative state having anything close to the kind of engineering knowledge that you find inside, never mind Google and Facebook, that you'd find inside that startup you've never heard of that's coming up with the next tool. So how do we ameliorate this challenge? So I completely agree that, as things stand, we are not up to the task, right? There's no question. I mean, I keep going back to that moment when a senator literally asked Facebook, how do you make money? And, again, I'm a critic of these platforms, but you could just see him stunned, thinking, are you really asking me this?, and trying very hard to be respectful to a question that absurd, because that senator is supposed to be questioning this platform and doesn't understand the most basic thing driving the business model. So we're not up to it. I'll be the first to concede that. But again, that's no excuse, because that's how everything starts. The companies, the businesses, of course they have more expertise, plus they have all the lobbyists, plus they have the technical knowledge, because that's where the money is. But we as a society, our job is to counter that by fixing our end of it. And I'm an academic, so for me, a big part of it is strengthening the research side, the independent research side. When I do research, I don't care about making money for the company; I just work in the public interest. So that's very important. You have to strengthen the watchdog ecosystem. We built tons of media watchdogs over the 20th century exactly for this reason. You have to absolutely strengthen the technical knowledge among staff in Congress, in the Senate. And foundations and other funders have to fund those fellowships.
We have to be able to compete with the best, because if Facebook and Google are offering high six-figure salaries, you have to compete at whatever level they're competing at. Of course, a lot of people will take a pay cut, as academics do, to work in the public interest rather than just building a slightly better advertising delivery machine. So I think we would recruit, but you still have to be in some reasonable range to give people a chance to come over to this side. So we have to do all of those things. Now, why wouldn't any of this be doable? It's absolutely doable, because even the numbers we're talking about aren't huge, and the stakes are so high. If you don't fix this, it's kind of like not having the FDA operate: all of a sudden, we're just telling customers, everybody, you'd better have a lab in your basement, buyer beware. That's our current information ecology: buyer beware. And that's really costly. Prevention and healthy standards and a way of doing business that isn't this much against the public interest are much cheaper, I think, than leaving it all for the administrative state to clean up after the fact. One of the questions we've gotten from the audience is what you think of, in addition to the regulatory option, the potential for independent bodies. The definition of independence is obviously an important factor here, but what do you think of non-governmental forms of oversight in this space? Absolutely. I think we need all of it. We definitely need more of the administrative state. I would also like to see the industry do explicit self-regulation through their own watchdogs or industry associations, because why not?
Lots of industries do it, but you can't just leave it to that. Still, there's value in intra-industry coordination, so that they're not afraid of doing the right thing and having their lunch eaten by a competitor, right? If they all say, we're going to do the right thing at the same time, there's value to that. There's absolutely value to NGOs and independent watchdogs that are neither government nor industry, because I'm from the Middle East: whenever the government steps into the information ecology space, I'm afraid of what the government's going to do there, right? I'm afraid of what political power is going to do. So it's not like I want the government to be the only political body that's that powerful, either. You have to have a civil service that is not just short-term political appointees, which is important. You have to have political intervention, because times change. You have to have independent watchdogs that are also trying to check not just the industry but the government, and speak as public actors in that space. You have to have citizen involvement, so that people say: you know what? This is our society, our public sphere. And hold politicians to account, so that they have platforms, they have policies. I'm not going to say which is more important, but if healthcare is a question politicians get asked, this should be a question they get asked by constituents too. And in industry, ideally you would have the workers, the very privileged workers in the space, actually unionize and try to exert some leverage, because they hold a lot of power. They just never seem to bother exercising it, to be honest, beyond complaining on their internal forums and message boards. They could try to become another constituency, a powerful one, within those companies. So there are all these things, and that's how we do everything else.
Everything else that we've managed to grapple with a little better, none of it is perfect. It's never perfect. In fact, in a lot of areas we're seeing losses of that kind of oversight; it's not just this area. But that's how we do it: throw everything at it, and it helps balance things out. Let me ask kind of a last question on the topic you just left us on, which is this question of our cultural tolerance and expectations for what this technology should do and be. Do you see that shifting? There certainly was a period where we were, at worst, naive, but at best really excited about what this kind of technology was enabling. We're in a kind of moment of condensed anxiety, and a lot of that anxiety is election-driven. Do you think there is a broader cultural shift underway about what people want out of this technology? I'm still excited. We have a pandemic, and we're able to go on with a lot of things partly because we have digital technologies. That's not a bad thing at all. It's just a major transition in the fabric of our society. So I think what has happened is a more appropriate recognition of what a major transition this is. This is not a small-scale transition. You can do your historical comparisons, pick your example, the printing press, the telegraph, whatever you want to say. It's hard to compare which is bigger or smaller, because there's no meaningful way to measure, but it's a major transition. It is not as simple as "oh, get off Facebook," because even if a few people get off Facebook, there are so many functions that are now only possible through Facebook. That's something we have to think about. I'm not, and have never been, for "let's not use these technologies."
I'm more of the view that that's not going to happen. But there's no reason we have to have all of this operate under terms dictated by a few powerful companies that are, to a large degree, run by the individuals who founded them, because that's how the stock is structured. It's a really ridiculous moment: in the case of Facebook, one person basically decides this, and that's what happens; he decides that, and that's what happens. This is not how it should operate. And Mark Zuckerberg is on the record basically saying, well, do tell us what to do; he's written an op-ed. Now, I don't think he would like us to tell him what to do on everything, but I think we should take it in the broader sense and say: yes, it is our place to tell you what to do on these important things of public interest. And we should do it, and not let this transition be dictated by a few people. But if it must be dictated by one person, some of our viewers will nominate you. So, for those of you interested, you can follow Zeynep on Twitter, and again, the most recent book is Twitter and Tear Gas: The Power and Fragility of Networked Protest. We didn't talk about a really critical topic you've been a commentator on, which is the ongoing pandemic. For those of you who have not been catching Zeynep's columns in The Atlantic, they've been absolutely defining, essential reading, so we'll be sure to send those out after the show. But Zeynep, thank you so much for joining us. Thank you. All right, folks, we're changing the schedule a bit, so keep an eye on kf.org or @thesamgill on Twitter for info on new episodes. And as a reminder, this episode will be up on the website later. You can see this episode, and any episode, on demand at kf.org/fdshow. You can subscribe to the Future of Democracy podcast on Apple, Google, Spotify, or wherever you get your podcasts. Email us at fdshow@kf.org.
Or if you have questions for me, just send me a note on Twitter. Again, that's @thesamgill. Stay for just a minute and take a two-question survey. And as always, we will end the show with the sounds of Miami singer-songwriter Nick County. You can check out his music and follow him on Spotify. Until next week, thank you for joining us, and stay safe.