Alright, welcome. It's 1:01 Eastern Time, it's 2:31 in Newfoundland, and we are ready for another episode of Vision, a show peering into the trends, ideas, and disruptions affecting the future of our democracy. We're coming to you today with a brand new HD Logitech camera. We have no relationship with Logitech; this is a product we chose. We would, though, like to have relationships with some companies that I don't think have ever advertised on a web property before: Squarespace, Stamps.com, Casper Mattress. If you're out there, we would love to be the first digital property that you work with. Give us a call.

Everyone, it's tremendous to have you here. We are continuing our deep look into one of the really central issues of debate about the future of our democracy right now, which is the infodemic, the officially World Health Organization designated challenge around the overabundance of information at a time of a global pandemic, an overabundance that is making it incredibly hard for people to know who to trust, to know what to trust, to make smart decisions. We have been exploring over the last few weeks both what the nature of the infodemic is and how it's related to the way our information system works, whether this is a more persistent and more endemic challenge that we're going to have to confront as a democracy. I think today's episode is really exciting because it's going to be a chance to explore the connection between what we're seeing now and the way some of the technical systems that power the internet, some of the social systems, the way that we use the internet, are actually designed to facilitate something like the infodemic we're experiencing, and then potentially what can be done about that.

Our guest, and I really want to get right into the conversation, is an incredible scholar who is focused on these issues. She is an assistant professor at the University of California, Los Angeles, Safiya Noble. She's the author of the best-selling book Algorithms of Oppression: How Search Engines Reinforce Racism. So join me in welcoming Safiya. Safiya, it's great to have you here.

Hi, Sam. Great to be here.

So before we get into the moment and the way in which the moment is exposing so much about how our information systems work and how we relate to each other online, I want to get into some of the ideas that you have really been pioneering. A big part of what you've documented in your work is the way that algorithms, the lines of code that enable Amazon to recommend to you what to buy, or that help Facebook figure out what's the next piece of news or alert from your friend that ought to pop up because these systems connect so many people, can actually discriminate, can have a discriminatory effect on different people. So to unpack this, could you just give us an example of this phenomenon? Where does this pop up in our world?

Well, it's interesting to think about the kind of internet that we all are on today, which is largely controlled by big multinational companies who, in many ways, I think, present their products and services as one thing, maybe as neutral democratic spaces of conversation or connection. And they have worked so hard to divorce themselves from being seen as something like a media company, right, because they would be subject to oversight in different ways.
So part of the challenge of how we got here is that many of the things we encounter on the internet are really motivated by advertising, by that kind of advertising profit imperative. The content in between ads, the things we share and the things that we contribute, is really just there to capture our attention. Even the information we find in a search engine is really about linking us up with Google's advertisers, or Bing's or Yahoo's advertisers. So this is the challenge: in some ways, how these things work is not obvious, or there's a lot of misrepresentation of what's happening. So, for example, when I first started writing about this about 10 years ago, I was studying the way that different people and communities get represented in something like a search engine. And I found that for many years, if you did searches on black girls, Latina girls, Asian girls, and these are girls from communities in the United States, children who don't have a lot of money, obviously, even all aggregated together, they were overwhelmingly represented by pornography. The porn industry had captured their identities and all the keywords associated with these children. And so the question is, how does this happen? These are the kinds of things that I study: what are the stakes for the ways that people get misrepresented in platforms that misrepresent themselves?

So how does this happen? I assume as you started to document this, you had no shortage of Google engineers who were horrified, right, or acted horrified, to find out that systems that in their minds were neutral, that were about giving you relevant results, because that's what Google is trying to do, it's trying to give you relevant results, were returning these searches. What happened? Why does this happen?

Well, I think there are three things that I identify in my book. First, people who work on search engines, but certainly people who use search engines, think that it's a democratic process. They're looking for the information or the links that get clicked on the most by people all over the world as a way to kind of inform, democratically, what should go to the top. So that's the first factor. But of course, that means that if you are in any group where you are in the minority, democracy of this sort will not work for you. So that's one of the first challenges and fallacies of the logics these systems are resting on. The second is that search engines are advertising platforms. You can go into AdWords, for example, and pay to optimize your content, and those who pay the most fare better. So this 24/7 live auction is the mechanism by which we see ads on the side, but also the logic of how certain kinds of content get married up with certain types of keywords. So industries and organizations that have the most money always win in a search engine, and they usually fare best, unless there's a public relations nightmare that makes Google engineers fix it. And then I think the third dimension of this is that there's a huge gray market of search engine optimization companies, advertising agencies, consultancies, people who are interested in tuning the things that we nerd out on, things like metadata, optimizing websites so that they can kind of game Google's algorithms and be made more visible. So these are the challenges: you have all three of these things at play.
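To make those three factors concrete, here is a minimal, hypothetical sketch in Python. The field names, weights, and numbers are invented for illustration and are not drawn from any real search engine; the sketch only shows how a ranking built on majority clicks, ad spend, and SEO tuning surfaces well-funded, popular content regardless of accuracy or context.

# Hypothetical toy ranker illustrating the three factors described above:
# majority click popularity, paid keyword bids, and SEO optimization.
# All names and weights are invented for this example.

from dataclasses import dataclass

@dataclass
class Page:
    title: str
    clicks: int       # how often the majority of users clicked this result
    ad_bid: float     # dollars committed in the keyword auction
    seo_boost: float  # effect of metadata / SEO tuning, from 0.0 to 1.0

def rank_score(page: Page) -> float:
    # Popularity, money, and optimization all push a page upward.
    # Nothing in this formula measures accuracy or historical context.
    return page.clicks * (1.0 + page.seo_boost) + 50.0 * page.ad_bid

pages = [
    Page("Well-funded commercial content", clicks=9000, ad_bid=12.0, seo_boost=0.8),
    Page("Community-written context, no ad budget", clicks=400, ad_bid=0.0, seo_boost=0.0),
]

for page in sorted(pages, key=rank_score, reverse=True):
    print(f"{rank_score(page):10.1f}  {page.title}")

Even in this toy version, the page with the budget and the clicks lands on top, which is the dynamic being described: whoever has the most money and the most majority attention usually wins.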
And what is not at play is: let's get the most accurate, correct information to the first page of search results. And of course, the first page is the most important, because most people don't go past the first page. They really believe that search engine companies are giving them the most reliable information they can find. And certainly they do when you're looking for, you know, a party dress. It's really good, it's accurate. If you're shopping, if you're doing things that advertisers actually want you to do, you're going to find pretty accurate information. But the minute you step off the curb and start looking for nuanced information, history, context, you don't get it. For example, the whole history of sexualization and commodification of women of color, which extends back to the founding of the country and the enslaving of African peoples; I mean, there's no context for understanding why women and girls of color would be sexualized in this long history. That's just completely invisible.

Does the kind of content matter? And by that I mean: I certainly think folks who are listening who are less familiar with this can imagine how something like a search engine is uniquely vulnerable to this, because the search engine is trying to behave as if context doesn't matter. But everybody whose interests are driving the search engine has enormous context around it: there's a profit-seeking context, there's someone who actually wants to commodify women, there's someone who really doesn't want that, all these competing things. Yet worse, it seems to me that we see examples of what you would call algorithmic discrimination in platforms like Airbnb and platforms like Amazon, platforms that are more transparently transactional, platforms that are not in the search engine relevance business. Do I have the right perception that this is still a vulnerability on those platforms, and is something different going on in those cases to produce discriminatory results?

No, I think the logics of profit are always undergirding these platforms, and they have a lot of vulnerabilities by which they can be manipulated too. So, for example, when the discriminatory Facebook housing ads were placed a couple of years ago, and this was a big news story, it was interesting talking to people who've worked for Facebook in the past, people for whom thinking about things like federal law and non-discrimination was not a factor in making their own review process, for example, for ads that come in and get placed on Facebook. So what's interesting is that I think many of the tech companies operate outside of the civil rights laws that we have, the human rights paradigms that we're trying to operate within. First of all, it doesn't occur to them that in their advertising platform they need to be thinking about civil or human rights laws; they're oriented toward what they're doing as an ad company. And so I think these are some of the challenges. At the same time, the public is using these platforms like they're public utilities, public information utilities, and they don't actually understand the motivations.
Well, this I think is an interesting point that sort of cuts both ways, because on the one hand, you've used the word logics a couple of times, and it strikes me that's sort of what institutions are designed to do, right? Society is complex. What I want to know is not how every single judicial case is decided; I want to know that there's a logic, one that I think is fair, to how we decide issues of guilt or responsibility in criminal court or responsibility in civil court, and that that process is being applied. So it doesn't strike me as new that we're trying to figure out how to make institutional logics work. I guess what strikes me as new is, one, those institutions are imperfect too. You need look no further than the judicial system to know that institutional logics can lead to results that you don't like, and that you need to be able to scrutinize what's going on in that institution and how it is making decisions. And it also strikes me that the point you're making here is also new, which is that you've got a set of entities that effectively are institutions but behave like companies, and there's a question about whether they're taking the steps you might need to take if you're going to have the same power the judicial system has, if you're going to have the same power that a consumer protection agency might have. Is that the right way to think about this transformation, or is there something else?

Well, I think another way to think about it is: why would we cede the public square, so to speak, or our news environment, to a company like Facebook, and confer on it the responsibilities of democratic institutional work in our society? Why would we do that? In many societies, not just in the United States, this is a big challenge. I mean, what has effectively happened is Facebook has helped put a lot of smaller newspapers out of business. We were already in a crisis; no one knows this better than the Knight Foundation, right, the kind of collapse of the newspaper industry in the 90s and early 2000s. And so here we have, by default, nothing left but these large companies to be kind of a repository of all kinds of activity, or a generator or facilitator of all kinds of activity. But unlike, let's say, other kinds of democratic institutions, universities, schools, government, there's no democratic oversight of these kinds of platforms. There's no oversight. I know there are attempts now; Facebook is attempting to have its own oversight board, but these are not democratically elected people. And for many of us, democracy is still unfolding. It's still a project unfolding that hasn't fully realized equal protection under the law, equal opportunity. So to think that organizations even less transparent than the kind of democratic institutions we work with in the United States are effectively in charge is really frightening. And when I look at things like a pandemic, for example, and Google and Apple rushing in to provide things like contact tracing applications to trace our every move, again, that's really frightening to think about: that level of information and control over something like a health crisis being held by companies and not by the public.
So let's come back to this question of both why we would cede this level of power and what kinds of accountability we would want, and talk about the pandemic for a moment. I think we are seeing new efforts by social media platforms, efforts maybe they hadn't taken before, to at least suggest that they're trying to get the right information out there, acknowledging that there's misinformation. And as you point out, we're also seeing efforts to say, hey, look, algorithms can solve problems; the best way to solve a tech-related problem is with more tech. How are you seeing the exact trends that you documented in Algorithms of Oppression playing out in relation to the pandemic?

Well, one of the things that has me really concerned right now is that, obviously, we're seeing millions of people unemployed and small businesses closing. And this is increasing our reliance upon big tech, like Amazon, for example, and Instacart and other kinds of companies, to fill in the gap. So we're seeing, for example, Amazon making its own products and selling them under its own brand, products that we formerly might have purchased around the corner at the bodega or at the small businesses in our neighborhoods that are closed. And many of those companies are not coming back, or it will be very difficult for them to come back. I think about other kinds of systems that are at play right now, for example, the whole financialization that has happened through big tech, the way that banks make banking decisions through these algorithmic processes; there isn't even a lot of human review sometimes in the distribution of financial services like loans. So here we have the SBA Paycheck Protection Program rollout, which has been, I think, an example of how automation has taken a lot of care out of deciding who would benefit from a public fund like that. We saw banks giving out SBA loans to their biggest customers. I live in Los Angeles, so of course, when I saw that the LA Lakers got an SBA loan, I could have died. Meanwhile, the barber who cuts my husband's and my son's hair can't survive. So these are the kinds of things where swift automation also strips out the reality and the context and the tacit knowledge about who needs to be supported. And to me, we're going to have a lot of long-term consequences from the way that resources have been distributed with the help of automation.

But, just to play devil's advocate, right, because whatever future we might be able to build, we're not putting the genie totally back in the bottle. We might be able to change the structures of accountability. But I'm sure there are some people out there applying a utilitarian calculus in at least a semi-sophisticated way, saying, it's terrible that small businesses are going under, but thank God, with Amazon, people can continue to get deliveries.
I sense that some would make the same argument about your earlier question, about why we would cede so much, for example, of the institutional space in which our public debate happens to a private company. Some might say, well, look, thank God, because it accelerates movements like #MeToo, because it gives people a space outside of institutions whose logics may indeed be oppressive or marginalizing or imperfect or incomplete, a space that's so much faster than having to organize in person, so much faster than having to take power away from entrenched institutions. That's certainly the argument the founders of these companies make, right, that they have created these new vistas of possibility for freedom, for convenience, for benefit. And of course there are costs, but on net it's beneficial; it's a very utilitarian argument. How do you think about that?

I think you could coordinate the distribution of goods differently, with different logics. So for example, in Los Angeles, and in any city and any town, and we could have done this even before the pandemic, quite frankly, but the pandemic is just shining a light on it, we could have coordinated the buying and selling of groceries and goods and toilet paper and everything else locally, hyper-locally. The infrastructure of what Amazon is providing, and I think they're doing more than just providing an algorithm and a database and an interface for us, you could build that locally. And the largesse, the benefit, of a platform like that that's hyper-local to California, or to Los Angeles, or just Southern California, is that we keep the businesses and our neighbors in business. Because what we can't do right now is go down to the local store. So instead, the toilet paper we're buying is from who knows where, but not a place that's down the street from me. I know that for sure. So I think these are some of the challenges here about, again, the logics of the consolidation of the power of one company or a few companies, who then can source from anywhere they want, and for whom the value of community and community support and keeping a community afloat is not, in fact, part of the logics there. That's what I mean. In fact, it's really about driving down the cost, getting those goods from the cheapest place possible in the world to maximize the profits, so that I pay my LA price with my LA dollars for a thing that you sourced somewhere else in the world for a fraction of that, and you kept the profit, the difference. So I don't know. Sure, it's convenient. But I think this is a great time to think more broadly, and obviously my job is to do the kind of thinking that goes beyond the immediate band-aid: what are the structures that are being remade, and what are the new dependencies that are also forming that will be difficult to handle?

You know, this reminds me a bit of part of the debate that was happening during the sort of fever pitch around globalization in the 90s, where you saw consolidation of multinational industry, and you saw consolidation of power around who was able to control global supply chains.
But you also saw interest and creativity, as folks anticipated that we would externalize and make distant all of the localized pain that blue-collar workers would feel, pushing it to another country. You saw some creative work, right, to build ways to have accountability within supply chains, ways to pass value along supply chains; ethical sourcing was one of the terms. And it strikes me that at the end of that period, you had a group of people, who had probably been the inveterate optimists about capitalism beforehand, saying, look, we can do this. And you had a group of people who were concerned saying, you know, the consolidation of wealth and capital marches on, and these really aren't things that are changing the system. Now we're even further down that path, and I think you're opening up to us the possibility that power consolidates even further in a global way. What are the steps? If now is a time for fundamental rethinking, what are some of the steps that we should take?

I mean, this is really where the rubber meets the road: our own ability to envision and enact something different. One of the things that I write in my book is that we have more data and technology than ever before, and we have more global social and economic inequality to go with it. So despite this promise of the liberatory possibility of data and technology to equalize, the data actually show that we have more global inequality, a record high since we've been tracking global inequality, more consolidation of wealth in the hands of the most powerful in the world. And we don't see that trend reversing except through public policy. To me, this is the place of power, because public policy, enacting different imperatives, is the way we put the brakes on these kinds of transfers of wealth. For example, in the last four years in the United States, we've seen some of the most egregious transfers of public wealth into private coffers, and that must be arrested. Of course, the SBA is just one of those examples, where the safety net that is supposed to be there for people in this democracy, the taxes that people have been paying to make a society work for them, was actually just transferred to the wealthiest companies. So we have to ask ourselves, how are we going to legislate differently, regulate differently? I mean, we have tech companies run amok, as far as I'm concerned, in the United States, on so many different levels: privacy, surveillance, tracking, the right to be forgotten, algorithmic discrimination and oppression. These things are not really taken up very seriously in the United States yet, although they are being interrogated more purposefully in Europe and the EU. And one of the things I always say to legislators is: you cannot try to regulate big tech as the only solution while you simultaneously defund every democratic public counterweight. While you defund education, defund higher education, defund public health, public media, libraries, all of the institutions that would serve as the counterweight to the influence of big tech, you undercut yourself. Those things have to go hand in hand, and we have to double down on our democratic institutions.
But I might ask the question that I assume a lot of those legislators ask, which is: OK, I hear what you're saying, but what would you do about technology companies? Are there incremental steps?

Obviously, to start, they need to be taxed. The fact that Apple gets a refund, or that any other big tech company doesn't pay its fair share of taxes the way companies in other modern democracies, in Germany and France, for example, do, is criminal. It's not right. And of course, here we are in California, where we are suffering so much under this pandemic and other types of crises, the threatened collapse of our brilliant educational systems. And this is where Silicon Valley was born, and Silicon Beach now, and what these companies have done is that their billionaire CEOs are cherry-picking the things they want to fund and see live, rather than paying into the coffers of the state of California and every other state where they do business and having that money help support the public infrastructure that we need so desperately. So I think that's a very easy first step, and it would be transformational to see these companies pay their fair share of taxes into the public coffers and see that money given back, because what they do is extract. They use our roads, our airports, the people we educate through our public schools and public universities and private universities. They extract all the talent, all of the brilliance, all the resources, and they don't put back into it. And that includes the brilliance of our brains as we make content on their platforms.

So let's ask a question about brilliance. The last question, which actually came out of the questions our audience is asking, and which I think is really interesting, is about our brains and the power of our brains. There are obviously the people who have built up these companies and are making decisions about how they ought to be governed, and we've got a great question here about how educators should be thinking about this. How should your colleagues, who are educating a generation of engineers who probably hope to start these companies, or hope to be executives in them, or work in them, be thinking about the bottom-up, if taxing is part of the top-down? What do we need to do for the next generation of folks who are going to help, in real time, figure out what the balance should be between the business imperatives of technology and the public imperatives of any institution?

Yeah, well, one of the things I call for in my book is the rise of the humanities and social sciences to massively inform engineers and technical thinkers and workers, because we don't have an appropriately contextualized long view of what these technologies are. And of course, that's what the liberal arts help us do: understand things like the rise and fall of empire. That might be really helpful for people in Silicon Valley to study and understand. We also need people from ethnic studies and gender studies, because at the level of discrimination, we need people who really understand it in a sophisticated, well-informed way, who can recognize when systems are broken, and who spend their lives thinking about the interventions. So educators have a huge role to play.
Universities have a huge role to play in not separating the data conversations from the society conversations, and in not privileging the data before the society. What we find in data and technology, quite frankly in machine learning and artificial intelligence, is that these systems are doing pattern recognition on the past, and that's what informs their algorithms. But if you don't understand the biased, racist, discriminatory, classist past, if you don't understand colonization and enslavement and all kinds of patriarchy, all of these things, and I mean this in a very specific way, if you don't understand those things, you really cannot understand where your systems are engaged in furthering them. And so we must have a more integrated and holistic, liberal arts type of education. And we need the public to fund education now more than ever, because we have engineers making these products who AP-tested out of English; they're rolling on a 12th grade English or social studies background. I tell my computer science students at UCLA who come and take my classes, and you know, my classes really help them tremendously, and they're shocked that they are just engaging these conversations as they're about to graduate. And I tell them: you have no business designing technology for society when you don't know anything about society, when you don't know anything about people in an educated way. And I don't mean from subreddits. So these are the kinds of things that I think we need the public for, and we need higher education, for sure.

As someone who got one of those antiquated degrees in philosophy, I will say Martin Heidegger and subreddits are about equally obscure, but otherwise, point taken. Well, I think there's obviously a ton to unpack in the kinds of fundamental questions that you're raising. For those of you who want to get deeper into Safiya's work, you should buy the book Algorithms of Oppression; it's really sort of a contemporary classic. It's ironically available on Amazon, although I think at a fair price in the Kindle edition. Safiya also has a new book study group on Instagram; it's at safiya.noble.phd, and we'll send that out. She's on Twitter at Safiya Noble; we'll send that out, and we'll send around a couple of her recent articles as well. We're not done with the infodemic. Next week, Renee DiResta, who I know you know, is going to be joining us. She'll be talking about whether the infodemic is the only growth import in America right now, and about the role of foreign governments in the infodemic. I'm also excited to preview for you that after the show with Renee, and we'll of course be coming back to these topics over the coming weeks and months, we're going to talk really specifically about the potential for a disrupted election and the procedures in our democracy, one of the biggest things that we may have to grapple with as a nation. We'll be telling you more next week about who you'll hear from, but it will include former election administrators, innovators, law professors, and others. So we're really excited about that. Safiya, it was incredible to have you with us. We appreciate you joining us. Everyone out there, this and all future episodes will be available on kf.org/vision starting tomorrow at noon. We always want to hear from you, on email and on the website. So, Safiya, thank you for joining us. Everyone out there, thank you for joining us.