Welcome. It's 1:01 Eastern Time. That's 6:01 in the Canary Islands. And that means it's time, of course, for Vision, a weekly show about the trends, ideas, and disruptions affecting the future of our democracy. Our focus since we started this show has been on the way the COVID-19 pandemic has been opening up huge questions about how our democracy functions. And for the past several weeks, we've been looking deeply at the problem of the infodemic. This is the official term the World Health Organization has used to describe the overabundance, the surfeit, the extreme quantity of information about COVID-19 that's available, especially on digital and social media, and the challenge that infodemic creates for people trying to find trustworthy information. And this is a real problem. Our own polling at Knight Foundation, conducted in partnership with Gallup and released this week, found that an overwhelming majority of Americans, 78%, consistent across party lines, see misinformation about COVID as a major problem. So today we're going to continue that conversation about the infodemic with Renee DiResta. She's a researcher at Stanford University. She'll talk to us about where all this misinformation about COVID-19 is coming from and what we might do about it. We have a lot to cover, and I'm incredibly excited for this conversation, so we're going to get right into it. Without further ado, please welcome to the show, Renee DiResta. Hi, how are you? Good, how are you? I am good. So, you know, I'm really excited about this conversation, because I know this is a topic you've been thinking about long before global authorities started to think of the challenge of an infodemic.
You've been following, tracking, and researching how misinformation spreads through the internet, why it's so hard to find trustworthy information, and why it's so easy to find untrustworthy information. So before we get into COVID, I'd like to ask you a bit about the nature of this information system. One of the things I know you've described in your own writing is that in some ways you can think about the encounter with information in a digital context as an information glut: we have more information than ever before. So could you tell us a little about what the information glut is and why it's a feature of our information system? Yeah, I don't think that was my term, but it's a way of describing the fact that once we had the era of the blogosphere, back in internet 1.0, anyone could become a content creator. And it was a really interesting dynamic, because it gave people the opportunity to say exactly what they wanted: this proliferation of voices, the utopian ideal of the internet facilitating more speech, more commentary, fewer gatekeepers. The interesting thing, and I had a blog back in the day too, I'm sure a lot of us were on GeoCities or whatever with, you know, like a nine-inch page. Mine was GeoCities, you know, the under-construction GIFs and all of it. It was a way for anybody to write anything. But at the same time, there was not a whole lot tying it together. So with my crappy blog, nobody read it. A few people managed to crack the distribution nut through email lists and things like that, but most of it was really not very well connected. So you had this glut of content, but search engines weren't quite there, and there was no algorithmic amplification. You really started to see that change when distribution and content creation came together on social platforms.
So then what they did was lower the barrier to creation even further. Instead of having to think up a whole blog post or decide what topic you were going to write about, you could literally write 140 characters, or Facebook had the prompt, you know, "How do you feel today?" They would periodically change that in the little window, but it was just asking you, what are you doing? And so more and more people became content creators. Instagram: snap a photo, boom, you're a content creator. So all of a sudden you had this experience where you could no longer get to the end of your feed. To make the browsing experience, the user experience, more pleasant, platforms started ranking the feed. Instead of reverse chronological, there was some value judgment where the algorithm decided what should be at the top of your feed. So there were elements of personalization, elements of curation. And then as the share button and the like button came to be incorporated on Facebook, and their variants on Twitter and YouTube, people became an active part of upvoting, if you will, content, which further impacted what other people saw in their feeds. This is partly what social media is selling you, though, right? It's selling that I'm giving you content more relevant to you, that I know what you want to read, what you're interested in. So yes, and I assume a lot of good has come from that, right? You went from a world of a lot of junk to a world of, hey, I'm really interested in dogs, I'm getting more content about dogs. So what is it about this set of tools that has also enabled the flow of untrustworthy information?
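The shift described here, from a reverse-chronological feed to a ranked one, can be sketched as a toy scorer. The field names, weights, and decay below are illustrative assumptions, not any platform's actual formula:

```python
from dataclasses import dataclass
import time

@dataclass
class Post:
    author: str
    text: str
    created_at: float  # unix timestamp
    likes: int = 0
    shares: int = 0

def reverse_chronological(feed):
    """The original ordering: newest first, no value judgment."""
    return sorted(feed, key=lambda p: p.created_at, reverse=True)

def engagement_ranked(feed, now=None, like_w=1.0, share_w=3.0, decay=1.0):
    """A ranked feed: engagement pushed to the top, discounted by age.

    like_w, share_w, and decay are made-up weights for illustration.
    """
    now = time.time() if now is None else now
    def score(p):
        age_hours = (now - p.created_at) / 3600
        # Upvote-style signals boost a post; age slowly drags it down.
        return (like_w * p.likes + share_w * p.shares) / (1 + decay * age_hours)
    return sorted(feed, key=score, reverse=True)
```

Under this kind of scoring, an older post that racked up likes and shares outranks a brand-new post with no engagement, which is exactly the reordering that replaced the chronological feed.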
Because I know you've studied a lot of how these really interesting tools for convenience are also used really effectively by people who want to build communities of affinity, but affinity around conspiracy, affinity around incorrect facts. How does that work? Yeah, well, the technology doesn't decide who uses it, right? That's the nature of technology. And so you did start to see a lot of conspiracy theorists. Anti-vaxxers were where I spent most of my early research, back in 2015, looking at those communities, because what it allowed them to do was connect with other people outside the confines of geographical space. You no longer had to know the other local conspiracy theorists in your neighborhood; you could find your tribe online, if you will. And what began to happen as a result was that the platforms, recognizing this, built features to support that behavior. So Facebook began to actively promote groups. Instead of just the newsfeed, which was the friends you had brought with you in your early social network, it was now helping you find new friends and new interests. And so you started to see recommendations pushing content to people who had evidenced maybe the slightest interest in it, or had evidenced no interest in it but were statistically similar to people who had. That was a really interesting innovation of collaborative filtering, which was a way for the platform to show you something that felt very serendipitous. You go on Amazon, Amazon does this; anything with a recommendation engine has this experience, where you go on Facebook and the content that you see is something you are very likely to be interested in, even if you've never necessarily typed that specific keyword in. So platforms became curators.
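The collaborative filtering being described can be reduced to a minimal sketch: recommend groups that statistically similar users joined, with no judgment about what the groups are. The user and group names here are hypothetical, and the similarity measure (Jaccard overlap of memberships) is one simple choice among many:

```python
def jaccard(a, b):
    """Similarity of two users, measured purely by overlapping memberships."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend_groups(user, memberships, k=5):
    """User-user collaborative filtering, stripped to its core.

    Finds the k users most similar to `user`, then suggests groups they
    belong to that `user` hasn't joined. Note the algorithm has no sense
    of content: a gardening club and a conspiracy group score identically.
    """
    mine = set(memberships[user])
    sims = {u: jaccard(mine, memberships[u]) for u in memberships if u != user}
    neighbors = [u for u in sorted(sims, key=sims.get, reverse=True)[:k]
                 if sims[u] > 0]
    scores = {}
    for u in neighbors:
        for g in memberships[u]:
            if g not in mine:
                # Weight each candidate group by how similar its members are to us.
                scores[g] = scores.get(g, 0.0) + sims[u]
    return sorted(scores, key=scores.get, reverse=True)
```

A user who joined a baby-food group gets whatever the most similar users joined next; this is how "statistically similar to people who had" turns into a recommendation for something the user never searched for.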
So instead of editorial gatekeepers, where there was some notion of authority or establishment media, or however you want to describe the old media system, you now had this sort of amoral algorithm that didn't really have a sense of what it was recommending. You could be recommended a gardening club, or you could be recommended QAnon, right? In essence, the authority was what was interesting to you, rather than something dividing fact from fiction, or information from knowledge. But is this what drives misinformation? We had Jevin West on the show a few weeks ago, and he said a surprising amount of misinformation, particularly about medical things, vaccines, COVID-19, spreads through a sort of innocuous process. It's the parent who's just trying to find an answer to a question, who runs into something and doesn't know whether it's trustworthy or not, because they don't have that authoritative institutional resource, and who then ends up becoming part of that community, or passing the information on. You've really been tracking anti-vaccination misinformation, COVID misinformation. What's driving this? Is it parents doing their best? Is it foreign governments? Is it conspiracy theorists with a malevolent impulse? What's pushing the misinformation through an ecosystem that's trading on serendipity? Yeah, so we can divide this into, let's call it, bottom up and top down. And let's use COVID as the example, since that's something everybody's paying a lot of attention to. You have the conspiratorial communities that are going to do what conspiratorial communities do.
The minute the first case made the news in the US, and not even the first case here, I'm talking about early January, before it was actually in the United States, you had the conspiratorial groups talking about how this was a bioweapon, this was an engineered virus, this was done to facilitate campaigns of mass vaccination. Those posts were put out in a very coordinated way by certain types of networked activism communities. So you do have that kind of bottom-up, peer-to-peer information sharing: I trust you, I follow you. You're sending me information because I've bought into this trust relationship with you. I am inclined to believe that you're not willfully misleading me, and so I in turn share that information on, right? Maybe to signal my identity, maybe to help my community. That's the peer-to-peer model that we see. The other thing that we see right now with COVID, though, and this is really interesting, is kind of a remarkable example of every government in the world simultaneously facing the same type of crisis. And so what you see, particularly from China, and it's happening in the US too, but I'll use China as the example: there's a real need to project a sense of competence to one's own people. So there are the outward-projected narratives, the things we call public diplomacy: I, as a nation-state, am communicating about my country, my government, to the rest of the world, the citizens of the world, the governments of the world. But the other thing is how your messaging is read and reflected back by your own people, particularly for China and Iran. What we have seen is government-initiated conspiracies, in which the government will say, "some people are saying," and then put forth a conspiracy theory. In the case of China, it's that the US military brought the virus to China during the Military World Games in Wuhan in November.
So by recasting the virus as something that was not native to China, something that did not come from the wet markets or the bat caves, you instead have this idea that some nefarious force has brought it over to you. And conspiracy theories really offer, the academic term is a populist theory of power, but what it basically means is that it affords, in this case, the government an opportunity to reposition itself as the defender of the people, because this is something that has been done to the country by outsiders. And so the Chinese Communist Party's response is, you know, defending the people from the outside threat. So again, you have the bottom-up conspiracy theory, and then you have the top-down, where they're using state media, they're using diplomats. Iran's narrative was that the virus is perfectly tailored to the genome of the Iranian people. "Some people are saying." And then Russia goes and reports that out as "the Iranians are saying." And so it becomes this daisy chain of information, in which, again, nobody comes out and says it concretely or conclusively, because there's no concrete information upon which to base it. But the insinuation alone is enough to generate conversation, particularly in an environment where people are looking to confirm their priors; they're starting to build a narrative out of the information. But first of all, how dare you accuse elected officials of trying to seem competent. That is just antithetical to the American ethos. No one would take them seriously. And that is bipartisan, by the way. No, but what's interesting as I listen to you, in both the bottom-up and top-down dynamics, is that authoritative institutions are getting squeezed from both sides. Right.
So you've got this democratization of content, so that I can speculate about cures for COVID in a way that feels equivalent to the CDC putting out some kind of hesitant, cautious, proviso-laden statement. But as I listen to you, what's interesting is that when we talk about government in the top-down context, we're not actually talking about expert agencies offering authoritative information. We're talking about individuals, with their own incentives, their own rent-seeking incentives, who have kind of taken a sip of the same brew that us regular folks have. And they kind of like it. They kind of like that they're not beholden to the fact that the CDC is not sure when a vaccine is going to come. That elected officials in Russia are not beholden to what the expert agency says about whether you should go outside or not. That party officials, the political apparatchiks in China, are not beholden to whatever a Wuhan local public health agency might say about really trying to get their arms around where this came from. So where does that leave institutions in that equation? Yeah, well, one of the real challenges is that the erosion of trust, not only in government, not only in media, but in institutions as well, is a thing that happens. The media and social media kind of democratized propaganda, if you will, and it feeds into the cycle. There's this narrative that if an institution has made a mistake once, it is forever tarnished, you can never trust it again, right? People will always point back to that institutional failure. And unfortunately, one of the things with the internet being forever is that there's always a tweet you can point to. The surgeon general said this on this day. The CDC said that on that day.
The media wrote this stupid headline about the flu on this day, right? So you always have something: you can screenshot it very quickly and throw it back out there. And so there is the opportunity to just devolve into cynicism, where there is that constant narrative that nobody knows what they're doing. The problem for institutions in this particular case, though, is that, per your point, they are constrained by a need to report something that's at least scientifically accurate, right? The CDC and the World Health Organization. We saw this pretty clearly with the dynamic around masks. There was this narrative, I've been calling it mask-gate, but I'm forgetting the exact hashtag; there was the mask-up hashtag. There were the people on Twitter who were saying that, based on the available evidence, we should all be wearing masks. And the CDC was not updating their guidance; in fact, they had said the complete opposite. And so this chatter began on the internet, Twitter in particular, saying that they were trying to keep us from buying masks, to save the masks for the doctors, and that they were lying to the public to save the doctors. But if you went back to the CDC website, the guidance on their website from 2012, from SARS, said exactly the same thing: no masks, hand washing is better, here's why. And at that point nobody really knew what the COVID transmission mechanism was. Nobody had really mapped out droplets and spray and distance and so on and so forth. That was just beginning to come into the public sphere. There were really great voices on Twitter, experts who were not affiliated with the CDC or the World Health Organization, who were trying to break that information down, trying to explain it to people so that people felt informed. The institutions were silent, right?
They weren't going to make a change in guidance until they were absolutely sure of what they should say, of what the new findings were with regard to this particular disease. And so you have this gap, this void, where people who were looking for information on masks were going to find the people who were loudest and most retweeted about masks. And for a time, that was actually the people who were advocating that we wear masks. Then you did see the CDC update that guidance. But now, interestingly, one of the narratives beginning to emerge on Facebook, and in this Plandemic video, is that masks are harmful: that we're re-inhaling carbon dioxide, that it can exacerbate other illnesses, the sort of pseudoscience theories. And again, nobody is countering that information, but that is the conversation beginning to percolate around masks on social channels, gaining adoption among the communities that are skeptical of the scientists to begin with. I mean, one of the things you talked about in your recent Atlantic piece, which we sent around on social media, that great cesspit of information and misinformation, and which we'll send around after the show, that I thought was really interesting was this: so much of this debate, I think appropriately, focuses on what the regulatory infrastructure should be, or how companies should change what they're doing. And we should talk about that a bit. But you made this really great case, drawing on some of these debates, that really effective institutions also learn how to play in the milieu they operate in. They learn how to operate there, and a big part of what we might need, if we want to preserve some idea of authoritativeness around information, is for authoritative institutions to figure out how to play on the internet.
And some of that's how you communicate, but some of that, to your point, is thinking beyond your institutional borders: who's part of that authoritative community? Businesses are starting to learn this lesson, right? As they find out that they can't go to ground when there's a controversy, you see real differences between CEOs who know how to have a dialogue on social media, a real dialogue, with a certain kind of transparency, often with their own employees, versus those who look wooden and give a statement that seems inauthentic. And then there's the Elon Musk side of the spectrum, which is something totally different, not to be replicated. But what are some of the things you'd like to see? What should expert institutions like the CDC do in a pandemic moment, or the next pandemic moment, on the next issue? What is an institution supposed to do when it's not sure what the answer is, but it needs to be in the discussion in this environment? Yeah, for a while, like the CDC, when they were tagged into this anti-vax conspiracy theory, the "CDC whistleblower," back in 2015, they chose silence, right? There was the idea that if you just ignored it, it was people on the internet, and that was separate from the real world, and in the real world people would still have a healthy respect for authority. So it was a very old lack of adaptation, a failure to understand the trust communities we were talking about before: I trust you as a source, and you're telling me something counter to them; you're in fact telling me that they're broken and incompetent. Why would I in turn trust them, when I trust you? So that dynamic, they didn't recognize; they didn't really internalize that early enough.
At the same time, this is one of those areas where you hope that they have more structured internal comms, but really, there are certain people who are very talented at conveying authenticity, at communicating on a very human level. One of the challenges with science communication, sci-comm in particular, is that there are a lot of unknowns. There are a lot of probabilities. The disease can follow this curve, or this curve, or this other curve over here. How do we convey that in probabilistic terms to people who aren't going to sit there and take a stats class before listening to the message? This was the same thing with the "vaccines cause autism" thing. You have a compelling first-person video that a mom records on YouTube versus a PDF fact sheet about the MMR. Which one are people actually going to sit there and listen to, engage with, and read? Social media really changed the format in which we've come to expect information. Memes work for a reason: when you design the feed to amplify the picture, the picture has to convey the entirety of the information, because people aren't going to click through to the article. So this is where you see that memetic style of communication evolving in response to platform design.
I think the dynamic is understanding the need to shift away from "we give press releases to journalists" toward "we communicate authentically with people," and recognizing that there's a strong emotional component, fear and anxiety, right now in general with health concerns. If you're going to have people share your content and spread it, so that you become part of the information moving through those trusted networks, the communication just has to be designed in a very different way than it currently is, and it has to be responsive in near real time. To your point, the crisis communication model that good CEOs follow is to get out there and begin to communicate authentically, immediately. When you see people do it very well, it defuses the anger, it defuses the anxiety, and it provides people with information and makes them receptive to listening. So I agree. I'm definitely an institutionalist; anyone who's been watching the show for four weeks knows that I believe in these institutions, and I believe information and knowledge aren't the same thing, information and truth aren't the same thing. I also think it's naive not to ask our institutions to reinvent themselves and how they operate. We're not going to put the genie back in the bottle on this tech, and we don't want to, right? As you point out, so much of the utopian promise of the internet is real, and this tech enables it. It's interesting: I think some of what's new is new. Figuring out ways for particularly scientifically oriented institutions to talk about uncertainty, or to say "we don't know," in ways that are authentic but build an audience, that will be hard.
I think building organic communities will be hard. But isn't some of what's new also old? I think about Mothers Against Drunk Driving, which was a big part of a public safety movement, right, and aligning them with NHTSA, our transportation safety agency, was about that authentic parent video. I also think about LEED certification. I don't know why LEED Platinum is better than LEED Gold scientifically, but I want to be LEED Platinum, right? There's a memetic quality to a certification scheme that I know has been researched and aligns with my values; I want to be part of the race to the top on sustainability, and I want to walk into the building where I work and have it reflect my environmental values. The internet, I think, compresses those emotions, it heightens those emotions and those response mechanisms, but it hasn't invented them, right? Right. So do you think we have some of what we need, and it's about applying it, or do you think there's something qualitatively different about how you have to communicate digitally, on social media? You know, I've compared some of the Plandemic video to a marketing campaign, which maybe lacks some of the nuance, but I think if you were to describe it in a snippet, it is what it is, right?
People have to run marketing campaigns for ideas now. That's what digital activism is, and networked activism in particular recognizes that there are persistent factions on the internet who will amplify content that resonates with them on a values level. And I don't use "factions" in a pejorative sense; I mean literally that there are standing communities on the internet with this sort of interest-based clustering. It can be anything from Star Wars fans who want to rant about, or talk about, the latest actress to be cast in whatever role, or whatever the theme is. And so some of it is, you know, meme-making isn't necessarily going to be these institutions' core competency. Nobody expects the CDC and the World Health Organization to build out a meme division; that would be a little bit ridiculous, I think. But per your point, recognizing that there are partnerships, that there are folks who have that capability, who have that audience, who have that kind of tangential, overlapping affinity group, that is where institutions really need to be going at this point. One of the challenges for those groups, though, and this is something we saw when I was more directly involved in vaccine activism, pro-vaccine activism, in 2015, is that there's no funding for them. And so you're asking people to put in the sort of labor of passion, to produce this content voluntarily. One of the things that's really remarkable with health is that the people who are most likely to put out content proactively, for free, are people who either feel that they've been phenomenally helped by the system or deeply harmed by the system. And that's why, with something like vaccination, the 85 percent of people for whom it's the world's most routine thing aren't going to go home and make a meme
about how their child got vaccinated and nothing happened, right? So you see much more of that asymmetry of passion, that asymmetry in content creation. This is true for cancer as well. People who lose someone to cancer and decide that chemotherapy was in fact the cause begin to go down the road of pushing cancer quackery, because they really, truly, deeply believe that it was the chemotherapy that killed their family member. And that is a hard thing, because it's a deeply emotional, authentic, passionate grassroots response, where institutional communications just don't carry that same degree of emotional clout, I guess. Right. No, and I have a four-year-old who's anti-shots, so he doesn't want to do a cute video about why, even though they've really helped him. But that raises a question that's coming up a lot in the feed right now with our viewers, which is: to what extent is the problem that the internet has surfaced people who aren't going to be convinced? Because, either for emotional or value-based reasons, they don't want to be persuaded; they want to persuade you. They want to build a community of affinity, as you've said, but they don't really care whether the CDC can find a more authentic way to convince them. Is that the battle we're fighting, or is it someone else who's caught in the middle between the established authority and the emergent conspiratorial or untrustworthy community? Well, we started to see, when we began doing this in a kind of quantitative way in California in 2015, looking at the conversation around some of the vaccine laws, who the communities are that are part of this conversation. And what you'll see is there are the sort of entrenched, you're-not-going-to-shift-them; we sometimes refer to them as true believers. These are the people where
it's almost a part of their identity at this point, and they're not going to relinquish that very easily. And so the counter-movement was much more geared towards ensuring that we were reaching people sitting on the fence: how do you reach them? That was in part because of the resources required to persuade someone so deeply entrenched in a certain type of conspiratorial thinking. You see this with deradicalization as well: you have to have a personal relationship; it's a cathartic process, a transformation. They're not going to see six memes and say, oh, well, I'll have to change my mind now. But there was a paper that just came out in Nature, literally yesterday, which I tweeted, if someone wants to go pull the URL for the paper itself. It's a really interesting journal article, because it looks at sharing relationships, sharing behaviors, between entrenched pro-vaccine, entrenched anti-vaccine, and then the undecideds, which are like the green team in the middle. And what it's doing is looking at the ways in which the green team, so to speak, is actively searching for information, and the prevalence of the content they're finding is from the anti-vaccine groups, and what they're sharing back out is from anti-vaccine, or vaccine-hesitant, or whatever degree on that spectrum it winds up being. So it's an interesting piece of research that looks at the dynamics around what I would call how factions integrate. You've got these kind of persistent standing communities; they're not defined by an interest in vaccination. Maybe one of them is natural living. Maybe one of them is, when I had my first baby in 2013, we decided to make baby food, right? My parents did it, so I was like, okay, I'll do it too. And I joined a baby food group, and boom, I mean, the number of
anti-vaccine recommendations I got, just boom. That is what I think is new. I know we've got a lot of unanswered questions about the prevalence of misinformation, but I think the CDC or the FDA in 1970 didn't need to worry that when you joined your neighborhood parents' Facebook group, you might actually stumble into this. And I mean, we have exactly the same experience. Most of the feed in our group, people who live in our four-block area, is someone saying, you know, my kid has a high fever, I'm not sure what to do. And the answers they get on that feed, there are a couple of times where we've wanted to step in and say, we're not a doctor, but we know some, and don't do that. So I think that sort of orthogonal encounter is something that's new and, to your point, is exactly what the tool is designed to do. We're running low on time, but there was a really big announcement this week from Twitter that is really on point for this discussion, which I think is an even bigger deal than the upranking of WHO and CDC content in Google search and those things. As I know you know, but for our listeners, Twitter said they're going to implement a labeling policy for when information seems to go against what established authorities say. And I have to say, this is one of the first times I've seen a company get really serious about indicating that there are authoritative sources of information, and that those authoritative sources matter for what is pushed to you. I'm curious what your thoughts are on this, whether you think it's a good idea or a bad idea, and how effectively you think it's going to work. Yeah, so some of it is: who are the authority partners, right? That's always one of the key questions. In 2019, there was the Brooklyn measles outbreak, and then there
was the Samoa measles outbreak both in 2019 in March of 2019 when Brooklyn was kind of at its peak the platforms began to put policies in place to deal with the fact that again when people searched for vaccines on Facebook or Twitter what they would find was anti-vaccine groups because that was what was most popular right so that was what the these were not these were never platforms designed to be libraries or they were designed to show you entertaining and you know entertaining content and so the policy began to be put in place that they would kind of like override that consensus of the most liked was what you know I used in the Atlantic article but that that they would that they would layer in authority on instances where your money or sorry Google calls it your money or your life where your money or your life is at stake if your search results are bad right World Health Organization and CDC were the two partner orgs that were selected to be the information that was surfaced the problem is again first the content creation is not always where it needs to be as we've seen with COVID the response time and speed and adaptation is not where it needs to be and so I think what the platforms are trying to do now is think more broadly about what does it mean to surface expertise more importantly because we all know that the correction does not go nearly as viral or reach the same people as the original content at least if you put up a label there is a sense that by by tagging that content you'll continue to have that there'll be a persistence there where people will be able to see that that material was flagged in some way it won't just kind of be off to the you know it won't just disappear Twitter was always a little bit different and just in terms of design right like Facebook would pop up the interstitial at the bottom that would let you know that something had been fact checked and found to be false or you know they would throttle it in the feed Twitter doesn't have 
that same type of design environment. So this is, I think, their way of tackling that same problem: how do you ensure that if someone has seen viral misinformation, or if it's continuing to be shared even at a lower level, that there's something there labeling it, letting people know that, for health reasons, this is a thing that they perhaps shouldn't take seriously?

I mean, what I found optimistic about it was, so much of the Silicon Valley discourse is either an implicit or explicit critique of existing institutions. You just see this all the time. You know, I thought Marc Andreessen's widely circulated post about "It's Time to Build" was sort of like, there's nothing good in the institutions that exist, we've got to totally build new ones. And so I was glad to see some affirmation that having institutions is a useful thing. I mean, do you have ideas about what other steps you would take, if you were running Facebook, if you were running Twitter, to address the dynamics that you've seen, that you've witnessed, that you research?

Yeah, I mean, one of the things, really, that I think is the hardest for them to reckon with is the velocity piece, right? So, you know, I think you're having Nate Persily on next week or something, right? And he's done some great writing on this also. But we think about velocity and virality, right? Virality is peer-to-peer sharing, facilitating the spread of information through factions and communities and trust circles; velocity refers to how fast that happens, right? And this is where we can use the Plandemic video as an example, because that just happened. There was this video, and it began to show evidence that it was going to tip into the mainstream, as opposed to just staying in the kind of anti-vaccine, QAnon, and Reopen communities that a lot of these pseudoscience conspiracies tend to remain in. This one, for a variety of reasons, began to show evidence that it was going to hit mainstream awareness. That presented a real challenge for the platforms. YouTube chose to take down the video after it had about a million views, or maybe even more than that. That's a huge problem, right? Because that plays into the censorship narrative. If a million people have already seen it and then you take it down, that's not really the most optimal time for a takedown, if takedown is the moderation approach you're going to take with that piece of content. There is another option, which is what's called "reduce," meaning you throttle it a little bit, you kind of downrank it, and in that time you have an opportunity to have your fact-checkers come in, watch it, produce the counter-content, and then you put the "inform" interstitial up, basically, noting that it's been fact-checked, and you point people to credible information. And in a way, that defuses the secondary drama of a free-speech narrative around censorship, which is what so many of these things turn into. So, you know, when people talk about this video coming down now, they're talking about free speech. The problem with the video was that it was pushing out just this array of garbage, just absolute nonsense with regard to health claims. They took it down in keeping with their decision to minimize COVID-related health misinformation, but unfortunately the conversation now is about the censorship, and about what they should have done and whether they made the right moderation call. And that's a whole other conversation, and that is a very, very difficult one, because that's how you attract people who are sympathetic to the idea that big tech is shutting everything down, big tech is acting as censor. So I really think the pandemic provides yet one more example of where shutting the door after the horse has left the barn is actually way, way worse than dealing with the velocity problem earlier on, to kind of
assess: this is showing indications that it's going viral, is this accurate medical content, and who is the credible counter-voice to put out in the fact-check on that front?

Well, this has been a fascinating conversation. And to sort of continue with the animal metaphors, you know, Renee has gone all the way down the rabbit hole on this information. So if you want to hear more about this, you can follow her on Twitter at @noUpside. You can find all of her work at her website, www.ReneeDiResta.com. There's no LiveJournal there anymore. For more on this particular topic, you can check out a recent Lawfare podcast on Apple Podcasts. Renee, it's been really great to have you on the show. Thanks for taking the time. Thanks for having me.

And before we go, I really want to tell you all about what's coming up on Vision. Next week, we're going to turn our attention to another huge storyline during the COVID-19 pandemic: how to manage an election, a presidential election, a contested presidential election, during a time of potentially significant disruption. So we're going to be looking at this from every angle over the coming weeks, from the basic mechanics of how to manage an election if we have to stay home, to how the pandemic might actually affect this critical democratic ritual. On May 21st, we're going to have Stanford law professor and campaign expert Nate Persily. On May 28th, we're going to hear from Spencer Overton and Arturo Vargas, two critical civil rights leaders. On June 4th, we'll hear from two former Federal Election Commission commissioners, Trevor Potter and Ann Ravel. And on June 11th, we'll hear from digital innovator Seth Flaxman, who is finding new digitally enabled ways to help people register to vote and get engaged. We're really looking forward to continuing to explore how COVID-19 is reshaping the future of our democracy by the day.

As a reminder, this episode will be up on the website tomorrow at noon, and you can see this episode, and any episode of the show, On Demand at kf.org/vision. We want to hear from you: email us at vision@kf.org, or visit us on Instagram at @vision.kf. Please take the survey before you go. And as always, we're going to end the show to the sweet, sweet sounds of Miami songwriter Nick County. Check out his music on Spotify. Until next week at one, have a great week.