Hello everyone, good afternoon. If you can take a seat, that would be great. We have a really packed agenda today and we're hoping to get started as soon as possible. My name is Lisa Guernsey. I'm the Deputy Director of our Education Policy Program here at New America. I also run the Learning Technologies Project. And I'm joined by David Snyder, who is the Executive Director of the First Amendment Coalition. We're partnering with them on this event today, as well as with Future Tense. Andres Martinez sends his regrets. He was not able to be here because of an illness in his family, but wished he could be up here to welcome you all along with me. As many of you might know, Future Tense has run some events in a similar vein over the past year or so. Back in January, it ran an event on free speech online. And so we see this as a nice continuation of that conversation, a chance to really heighten the dialogue, to elevate the dialogue on these issues. So I just quickly want to throw a couple of things out there about New America, and then I'm gonna turn it over to David. For those of you who have never been to New America, welcome. As you might know, we hold a lot of different types of events here and do a lot of different commentary, policy programming, et cetera. We're dedicated to renewing America. And we're trying to do that by really trying to continue the quest to realize our nation's highest ideals, while recognizing the challenges that are wrought by technology and by social change as we do that. And I really think that today's event is a perfect example of that combination and what we're trying to understand better about our world. So for today, a lot of you are watching online. We have a full house here as well. And a lot of people will be tracking this on Twitter. The hashtag is #FactOrFiction. And after that, I think we're good. We're gonna jump right into conversation. I'm gonna turn it over to David.
And thanks again to David for working with us on this. He's a former Washington Post reporter and a lawyer who really understands these issues. So thanks so much. Thank you, Lisa. And thank you all for coming out. The First Amendment Coalition is a nonpartisan, nonprofit group that advocates for free speech, free press, and the freedom of information. We're based in the San Francisco Bay Area. And we have members who are journalists, academics, ordinary community members. We are so grateful to New America for hosting this event. And to be able to harness the brain power here is really nice for us. Because I think the topic we're gonna talk about today is really of critical importance to the future of our democracy. Just to briefly set the stage: among the five rights that the First Amendment sets forth, the one that I think is most important here today is freedom of the press. The insight that I think the founders had when they included that right in the First Amendment was that it is absolutely essential to the survival of a representative democracy, of a democracy where the people govern themselves, that the people have good information about what the government is doing. That they understand what the government is doing in order to steer it in the direction that will most benefit the people. And without good information, you don't have self-government. We have more information than anyone could ever possibly have imagined 20 or 30 years ago, much less 200 years ago. And yet we have a situation where it's hard to find good information. There's a kind of informational paradox that we're all living through, where there is more information that you can access more quickly than at any point in the history of the human race. But finding good information is somehow a challenge, or separating the wheat from the chaff is somehow a challenge.
And so today I hope that we are able to not just identify the core problems in getting people access to information that's meaningful and true, but hopefully we can begin to chart solutions. Hopefully we can begin to figure out how, from the government angle, from the angle of social media enterprises, and from an education angle, we can start to chart a way out of this. The problems that we face are complex. Politicians, voters, and legal experts are looking to government, to social media platforms, and to the mainstream media to try to figure a way out of what I think everyone would agree is something of a mess. On that note, let me introduce the moderator of the first panel. His name is Carlos Maza, and he's a journalist and video producer who writes, produces, and hosts the Vox series Strikethrough. He previously worked at Media Matters for America between 2011 and 2016, where he was a research fellow and created a video series on media criticism. So please join me in welcoming Carlos. So the title of this first panel is The Threat of Fake News and Efforts to Curb It, which is a huge topic. And if we're all being honest, we probably have three minutes until we all get a push notification that terrifies us for the rest of the day. So I'm not gonna have a huge intro. I'm just gonna invite the other panelists up, and I'm gonna start with one of my favorite humans, which is Michelle Lipkin, executive director of the National Association for Media Literacy Education. Next up is Kevin Bankston, who's the director of New America's Open Technology Institute. Dan Gillmor, who is a professor of practice at the Walter Cronkite School of Journalism and Mass Communication at Arizona State. And I'm welcoming back David Snyder. We missed you so much for the few seconds you were gone. So if you all are cool with it, I'm gonna gloss over the first part of the panel, which is: what is the threat?
I think if anyone has a pulse and is in America, you are aware of the mess, as you described it, or the crisis. We all know that misinformation is being weaponized by bad actors, and that ranges from Russian hackers to Macedonian trolls, to Alex Jones, to, increasingly, Sean Hannity, and your uncle who keeps posting about chemtrails on Facebook. I have one too. And while misinformation has always been a thing in democracy, it seems to be very, very bad now. And what's weird about it is that we live in a time where we have platforms that theoretically are the utopia of the free marketplace of ideas, right? Anyone can post anything, anyone can share anything, and yet things are not going well. So either that idea of a free marketplace is wrong, or something else is happening. And the question is, what do we do about that something else that's happening? We've all talked beforehand, and I think all of us agree that having the government intervene to crack down on misinformation is scary and dangerous. That's not really a debate among us, but I was hoping we could start with this: what would your best argument be against the government regulating and trying to fight fake news by banning things? I think the guy who runs the First Amendment Coalition might want to lead with it. I can answer that question in three words: the First Amendment. I mean, the core of the First Amendment, the principle that underlies the First Amendment, is that the government shouldn't be in the business of calling balls and strikes, of determining who gets to speak and who doesn't get to speak. And so my organization, and I think many others who are proponents of free expression, get very suspicious the moment government starts to get in the business of deciding what's right or what's wrong. That said, the First Amendment feels, in some ways, obsolete these days.
The First Amendment was drafted more than 200 years ago, and the basic point of the First Amendment is to keep the government out of this business of saying who can speak and who can't speak. The primary threat that it addressed was direct government censorship of speech. And fortunately, we see less of that than I think we did at the turn of the 20th century. The legal doctrine that formed around the First Amendment emerged from that time period, early in the 1920s into the 1930s, when the government would actually prosecute people for things they said. And so this robust body of law sprang up, and it doesn't really feel like it applies anymore. So there's a lot of discussion, I think, among legal academics and others about ways in which the First Amendment can evolve to address the problem of speech that is stifled by too much speech, by aggressive or almost harassing speech. And we can talk more about that in detail if you'd like. But I think that the bottom line for me is that the First Amendment feels sometimes beside the point in the middle of this debate. It shouldn't be and it's not, but it does feel that way. Yeah, I mean, that's sort of the baseline that we're starting from, and the second question is: so what then? And it seems like the answer so far has been that the companies that own major digital platforms have to be somewhat responsible for what happens. And if you saw Zuckerberg's testimony in front of Congress, that's a really messy, big, great question, and it raises a couple of really, really big anxieties. And I'd like to work through those anxieties with you all, or have you all confirm my anxieties, and then we can all leave here crying if that's what ends up happening. But of the challenges of having tech platforms, Facebook, YouTube, Google, Twitter, deal with this major speech problem, the first that sticks out to me is the problem of scale. There's so much stuff, like you mentioned.
There are 400 hours of content uploaded to YouTube every minute. Is there any meaningful way for tech companies to regulate the sheer amount of speech they deal with on a regular basis? And if not, why not? I have to agree that I think the issues that we face right now have more to do with speed and quantity than with misinformation itself, because misinformation has always existed. So we need to focus on what is new about this particular landscape, right? And what is new is speed and quantity and how we are processing it. My concern about tech companies being responsible for the solution, and this might be controversial, is that I'm not a huge fan of tech-only solutions for tech problems, right? I think we have to come up with human solutions, and that's where education, and we'll talk about it a lot, comes in. I don't want tech companies to be judging what should be shared and what is true. I don't want that. I don't want them getting involved with my freedom of speech, but I do want them to be transparent. And when I look at tech company solutions, it has a lot to do with their relationship to the public. What is their level of transparency? What are they sharing with their users? Things like, how can we make their terms of service understandable to 11th graders? There are things tech companies should be doing, but to me it's always about the relationship with the public. So I have a number of things I'd love to add to that, but I want to jump back to the anxiety part. Yeah, Carlos. The story of our lives. Well, because I want to take a second to try and tamp it down a bit, in the sense that we're facing a real and to some extent unprecedented problem, but at the same time, we don't want to descend into a moral panic and make bad decisions. And I think it's worth questioning.
If you want to see a distorted marketplace of ideas, one rife with lying, skulduggery and defamation and anonymous propagandizing, all you need to do is look to the founders' own generation. I mean, that's the model of the marketplace of ideas. And it's a pretty rough-and-tumble place, and there's a lot of bad things happening in it, but their response wasn't, let's figure out how to better censor the printing press. Their answer was, let's promote more speech. You know, filter bubbles aren't new. Conservative radio has been an incredibly powerful, politically potent filter bubble since the 80s at least. I mean, the fact that more Americans believe in ghosts than in human-caused climate change is not the fault of the internet. These have been problems in our information ecosystem forever, and I think immediately scapegoating the internet, or thinking that this is a brand new thing, a qualitative change as opposed to simply a change in scale, is dangerous, especially when we also don't know how bad this problem is. I think there was an assumption, because it's easy to say, oh, look, the pope endorsed Trump, that's clear BS. That doesn't mean it was actually important. And a lot of the examples that have been shown in terms of Russian info ops weren't fake news. They weren't lies, they were just political opinion, often voiced very aggressively or targeted in a way to start arguments, but it wasn't misinformation or disinformation, it was just political opinion deployed in a difficult way. You know, I think there's a question of, we complain about these stories, but then again, how do we gauge that impact versus the mainstream media's treatment of the candidates in our election, where Trump got a lot more coverage just on the numbers and Hillary's coverage was disproportionately about the email server thing, in line with the talking points of her opponent? How do we factor that in?
How do we factor in, and this will be my last point here, how do we factor in the fact that a lot of the mainstream media stories were driven by Russian info ops, not around fake news, but around hacking into people's servers and stealing their email? How do we factor that in? Because I don't see us having a national freakout over the fact that we all still store decades' worth of email in these cloud servers that often aren't protected by two-factor authentication. Why weren't John Podesta and Colin Powell using two-factor? We're not freaking out about that. Instead, we're having a national freakout about why the platforms aren't censoring more. Without any data behind it, that seems worrisome to me. So, and maybe you can speak to the same. At long last, there is a bunch of research going on. So this is going to be discoverable, and in fairly quick time, I think, we're going to have a lot of excellent data to show what did happen and what might be done. But going back to the original thing, if we're talking about a marketplace of ideas, markets are about supply and demand. And almost everything in this sphere has been about supply. And God knows we need to improve the supply of journalism and information. But very little until recently has been focused on demand. And that's us, the people who are the recipients, the users of information. And Michelle is working on this. You'll hear more about it in an upcoming panel. But we really have to fix demand, and that involves, at its fundamental root, the idea of critical thinking being ingrained in everybody. And really quickly, three ways to get that done, three major ways to get that done at scale, which we have to do. One is education. That's where the government role makes sense to me, to promote education in this, including libraries.
The second is media themselves, especially journalists, making news literacy, media literacy, part of their mission, which they've never done, and now they can, and a few are starting. And then the third area is the platforms, which define scale at this point. And what I'm hoping they will do, and this is a tough ask, is to not just give users a few options here and there, but to give us tools to manage what we see, and learn, and find ourselves, and to do it individually and with communities. And if we can get all of that going, I think we can make a really big dent on the demand side, which until now, as I said, has not been sufficiently addressed. So I think we have a real chance to make a difference without either the government or the platforms becoming the editors of the internet, which would be catastrophic. So one of the tensions that I've seen so far, just from this first round of answers, is that there's a sense that there's too much, that the speed of news is too intense, and also that a solution to that might be more news or more speech, in the way that the framers first imagined it, and those seem to be in tension. The truth is we don't really have a collapse of journalism, we just have an explosion of all this other stuff that can crowd out journalism. And I'm wondering if anyone has thoughts on how to resolve that tension, because you can train journalists to talk about media literacy, and you can train viewers to think about media literacy, but if someone opens YouTube and there are 100 Alex Jones conspiracy theory videos about crisis actors, how do you deal with that supply problem, where the supply is kind of overwhelming, and even in some cases shaping the demand of consumers? You go first, and then I'll... I mean, I think there's a serious question of what additional information will actually add value rather than subtract value. I think the key word there is context. How do we create context for these things?
And I think no one has good answers yet. YouTube just had one idea for creating context, which was linking to Wikipedia for these conspiracy videos. I think there are some aspects of that that are a good idea, but relying on that nonprofit entity, that very small nonprofit entity, and also not warning them that they were about to do that, was probably not the best way to deal with it. I think similarly, Facebook is struggling with how much do we want to do fact-checking or add some indicia of credibility to stories. But one, there's the problem that many of the folks who seem most susceptible to fake news are also hostile to the idea of fact-checking, or are hostile to the fact-checkers themselves. But two, there's actually been some research, and I apologize, I don't remember off the top of my head who did it, around the idea that if you start fact-checking some things but not other things, then when people look at something and see it doesn't have any indicia of having been fact-checked as true or not, they assume that it is true. And so if you do that incompletely, you may actually exacerbate the problem. So, yeah. I think it's also, though, and I sound like a broken record to people that have heard me say this before, that we're so focused on facts, like we're so focused on true and false, and it's so much more complicated than that. I mean, we are not having enough conversations about the business model of the media industry and how all of this comes from a desire of these social media platforms. They're corporations, right? So they have bottom lines, and they are trying to advertise to us, and all we're talking about is whether this story is fake. And I feel like we're limiting, and I don't mean we on this panel, I mean the larger we, we're limiting the conversation and what we want. We want, shout-out to our librarians, we want information literacy, right? We want media literacy.
We want people to understand the entire landscape. We don't wanna just say, okay, I can trust this or I don't trust this, and leave it at that. It's a much deeper critical thinking that needs to happen. And I think that what we don't appreciate enough is that it's not only the speed of information getting to us, it's how fast the change happened, right? That's unprecedented. In the history of communication systems, how long did we have to get used to the printing press, right? But here we're talking about the last five years, in which the entire human communication system has changed. No wonder we don't know how to figure this out yet, right? It's gonna take a little time, and that's why I totally agree with you. We have to take a breath and see where we're making mistakes and learn, and see how we can address it in our education system and in our government and all of these things, but we have to take a collective breath. Of course we're screwing some of this stuff up. It's never happened before. If I could pick up on your point about the business model, which I think is a good one: these social media platforms, they're out there to make money. And that plays a role in what it is we see and in the quality of information we get, but that's not new. I mean, newspapers have been profit-driven for as long as there have been newspapers.
I think that the key difference is there is a culture which developed over many years, hundreds of years, at newspapers. There's a sense of a public trust, and there's a sense that there's a duty to the public in purveying the information that newspapers purvey. And God knows they don't always meet those obligations, and sometimes they come very far from meeting those obligations, but there is that culture in newspapers. And I wouldn't expect social media platforms to immediately have that culture, but from my perspective, there's a complete absence at social media platforms of the sense that they owe any duty to the public. And I don't want them to become the editors of the internet, as Dan aptly put it, either, but they already are to some degree. I mean, Facebook and Twitter already take down enormous amounts of material from their platforms, and so for them to say we have no role in this, we're just the platforms, I think is disingenuous. They're already editing. I don't think they're saying that at this point. Not anymore, that's true. I think they're tripping over themselves to censor more, and it's getting a little worrisome. But is there some way, and I ask this question without knowing the answer, which being a lawyer I know I shouldn't do, but is there some way for the government or somebody to encourage a cultural shift at social media platforms that would make them feel a greater responsibility to the public good than I think they do now, and would that be desirable? I think it just happened. I think it just happened. I agree, I agree. I think this is exactly what we're in right now: they're waking up to a lot. It is certainly top of mind, I think, at the platforms at this point. I always ask, when I hear calls for the platforms to do something, what do you want them to actually do, and how does that not lead inevitably down a bad path, given their unprecedented scale?
There's nothing in human history to resemble their scale, with one exception, which might be God. Well, I'm actually thinking of a relevant case in terms of information flow in an individual community. Newspapers were monopolies for a brief period, absolute monopolies, and that's why that cultural difference is interesting, but today in some countries Facebook is the internet. This is without precedent. So what do you want? When all these people say they've gotta do something, I don't know how doing something avoids the problem that we all want to avoid. There's a real be-careful-what-you-wish-for thing here, and also a core contradiction in the way that policymakers especially are talking about this at this point, where in one breath they say, your power is unprecedented and unaccountable and frightening to me. Now use it more. Yeah. That is the message that the companies are getting right now, and it's kinda confusing. Maybe it's just, use it more responsibly. But I think that what you're bringing up, though, and this was so obvious in Zuckerberg's congressional hearings, right, is that no one knows what to do, right? We maybe should stop comparing it to the newspaper industry. We maybe should stop comparing it to the way the communication industry used to work, because we've never seen anything like this before. So it's going to take new skills and it's gonna take new questions, right? To figure it out. But I also think that it has to be multi-pronged. We can't depend on the social media companies to solve the problem. And that's what I mean about technology solutions. When I see these apps, no offense to app developers, but these apps that are coming out that are gonna teach us which sites to trust and which not to, well, that's not a long-term solution, actually.
We need a long-term solution, and I think a lot of it has to do with education. We really need a revolution in education today: what are we preparing our kids for, and how are we gonna help them succeed? And yes, all of these pieces need to come together, but there's gonna be no long-term solution without addressing skills. I mean, but one of the challenges there is, as we talked about, we have this unprecedented scale problem. It may be, and this is both worrisome but also worth thinking about, that the primary pathway to educating people is going to be through these very platforms, because they're the only thing operating at a scale where we'll have the impact necessary. I mean, Google already has a monopoly on that, but they are getting involved too. They are dedicating funds, and we're personally working with them on some things. So yeah, maybe everyone has to do general media literacy, rather than it going through the actual education system. This raises sort of the question we're dancing around, which is: so newspapers have this obligation in the public trust that guides their actions even when they are profit-motivated. Tech companies don't see themselves that way at all, and if you watch the Zuckerberg hearing, he was asked, do you have responsibility as a publisher, or are you purely a tech company? And he said, we're a tech company. They don't feel they have that obligation. That's a choice that tech companies are making, and it's an editorial choice to say we're hands-off in certain cases. They're also, as you mentioned, frequently making the opposite choice and censoring things and choosing things to take down, but they're going back and forth as to what their role is. And it seems like before we figure out what should happen next, you have to answer this question of what the role of the platforms should be.
If it's true that only platforms operate at this scale, we have to answer this question of what is the responsibility of the platform. Even adding context with the Wikipedia thing is an editorial choice the platform is making that we have to be comfortable with. That is a call. That's a judgment call to say we need to fact-check some of these things, and some of these things are not gonna get fact-checked. So I think the basic question is, what should that responsibility, if any, look like? And if there isn't an answer, are we comfortable with saying Zuckerberg has no obligation or responsibility for what is published on his platform? I can't say that. I don't want them to have no responsibility. But they don't say that, we don't say that. I don't know who says that. No one says that. I think that ever since we've had user-generated content platforms, we have also had moderation of content, less or more so. But one of the biggest challenges, and this goes to, I don't want to get too wonky, but there's this thing called CDA 230, which is the original bargain that policymakers made, which was: if user-generated content platforms are responsible, potentially liable, for every piece of content, even when they're operating at a scale of hundreds of thousands, millions, billions, trillions of pieces of content, they will not emerge, they will not survive, they won't work. And so Congress carved out an exemption from liability, not criminal liability, but civil liability, in part to enable platforms to go ahead and moderate content. CDA is the Communications Decency Act. It was actually an anti-porn bill in a lot of ways, and the reason they created this protection from liability was to actually enable companies to moderate content without fear of liability. And they have done so, because Facebook doesn't want to be a cesspool. People won't come to it if it's a cesspool. So the question is, how do they draw those lines? And Congress just tore a big hole in CDA 230 with FOSTA-SESTA.
But that's another topic. So I think the bigger question is, recognizing that these people are essentially making governance decisions about what we can say and not say online, which ultimately it is gonna be them and not the government because of the First Amendment: how do they draw those lines? How do they validate those lines? How do they test those lines? What is our involvement in being able to write those rules or say when we think they're bad? And I think that is why transparency about what their policies are, clarity about why exactly things get taken down, notice to people who are affected, and appeals for people who are affected are important, because this is basically artificial, private governance of our speech online, but we don't yet have due process for it. And Facebook's terms of service and Google's terms of service override the First Amendment in the universes that they control, which are a big portion of the speech that exists, and Twitter's too. And I just don't like this notion that all these solutions tend to say, you guys do more. We need to find ways to promote more competition for hosting and creating speech. We need to, and I'll keep saying it, we need to put a lot of it back on us. And we seem to be resisting that in some ways, and I think we have to push that, not resist it. So this is the scariest part, I think, of this panel, this idea that it's back on us, because, like I said at the start, in theory we are living in a utopia of speech where we all have access to publishing and sharing things, and things are not going well. And if you watch the way that Facebook has tried to deal with this so far, like letting people flag things that are fake, that hasn't gone well either. If you look at YouTube's trending page, it's the bad stuff that people choose to look at first and consume first. And so in a lot of cases, as it is now, handing over that responsibility to people has made things worse.
And the question I wanna ask is, is the broken thing us, and these platforms are just reflecting that brokenness back at us as citizens? First of all, everything is not going badly. There are countless amounts of great stuff that we can do and say, and are doing and saying, in this new world. And let's not forget that this has enabled things that are absolutely wonderful at scale, not just a bunch of things that we don't like, which are, I think, dwarfed by the good stuff. Yeah, I agree. I have to say, I mean, sorry, I'm like a total optimist. I feel like this is an amazing time to be alive. Just use the example of the Parkland kids, and you're like, this couldn't have happened before. The voice that these kids have, and whether you agree or disagree with them, their bravery and their organizational abilities and the way that they're activating. Oh my gosh, you see amazing things happening in media companies with breaking down barriers with gender and with race, and just technology itself with the creation of content. It's an extraordinary time to be alive. And so to me, it's like, yeah, we have some problems, but if we only focus on those problems, we're missing out on an incredible landscape of opportunity, and we can't let the fear take over. We have problems, we need to address them, but. I strongly agree with this, but I also want to take that Parkland example and push it a little further, in the sense that if we're talking about do more, if we're talking about, well, censor this, take this down, the question then becomes, what is this? How do you identify it?
And it's, you know, the same tools that are enabling info ops by state actors, or people we don't like saying things we don't like, although a lot of people like the things that are being said, and a lot of those things being said are First Amendment protected. How do we start drawing lines that don't hit the people trying to take advantage of digital tools to effectively speak for good causes, or for causes we actually think are okay? And this also gets to the really dangerous fault line of progressive versus conservative in the context of this issue, because, and this is another reason why there are a lot of things the companies aren't doing, because if they do them, they will get attacked from the right, because there's a bit of a disproportion in terms of who is advocating for violence along the political axis. And they're already facing that right now, which results in really bizarre stuff, like 60 conservative thinkers just signed a letter to Facebook saying, you're being biased against conservatives, we want you to follow the rules that the founders set and only censor what violates the First Amendment. I'm freaking, Edwin Meese signed this thing, one of the, like, greatest anti-porn crusaders in this country, if people remember him from the 80s, like pretending that he thinks that these companies should only be censoring obscenity or child porn as opposed to other sexual speech. Like, please. But anyway, I digress. And then to think that we're gonna be able to draw these lines in a way that is easily administrable, doesn't carry a lot of collateral damage, and is automatable? Like, that's science fiction, at least right now. And, you know, Mark Zuckerberg can try and deflect political pressure by saying, oh, the AI is gonna solve it. In 10 years. Yeah, well, but, you know, if we as humans can't make these decisions often, how do we expect something on a chip to do it? 
At the same time, I get kind of frightened that where we are ultimately going with this is just a war of AIs versus AIs. There are AIs trying to target people with messaging, and AIs trying to figure out which ones are authentic or inauthentic. I see things like Google Duplex on Tuesday, where Google Assistant now has an AI that can call and make appointments for you. And it's basically an AI pretending to be a human, and doing it really effectively. The next step is gonna be me having an AI that answers phone calls to try and weed out the BS AI calls. You know, and it's gonna be a continuing escalation, but I don't see any other direction for it to go other than that, which is perhaps the scariest thing. So I'm, A, having a panic attack, but while that's happening, I wanna push back, not to be a pessimist, but we do have a big problem with people liking things that are bad for them. Like, Fox News is the biggest cable news network on TV. We make bad decisions as consumers. If we're putting a lot of faith... You did just jump into the fault line I just talked about, but... If we're putting faith in people to deal with that and to grapple with our own bad brains, what can be done, and what can platforms do, to encourage us to be better decision makers when we're being asked to make smarter calls about what kind of news we're consuming? Which I think is a big question. There's no instant fix, and there's a million different things we have to try, but we can, as one example, take a page from the anti-tobacco campaign that produced real results, to convince people that sharing BS is not cool. That you're a real jerk if you share stuff that's false. So there's a campaign in the works to start working on that. And by the way, sharing is the new distribution for a lot of our content, which is a whole place where supply and demand intersect. So that's just one of thousands of things we have to try. I would add, some of the solutions may be orthogonal. 
Like, I think that there's a lot of work that could be done around privacy that could help mitigate the harm of hyper-targeted messaging. I think there's a really hard question to answer about when targeting goes from effective speech to manipulation. And I think that privacy, and figuring out how we make sure that people don't have so much information about us that they can manipulate us, versus effectively speak to us, is something we need to talk about and try to come up with answers for. This goes back to the issue of the ethical framework that digital platforms operate under. And what you described, the immense amount of information that Facebook and Twitter have about each and every one of us, that's their business model. That's how they make their money. And so I think there has to be a significant shift in the ethics, in the ethical approach of these platforms, so that that information, which makes them piles and piles of money, is used responsibly. And do you legislate that? I don't think so. I think the First Amendment counsels against legislating it. And I think they're coming under great political pressure to change those ethics and to change that sort of approach to the world, as a public trust versus not. But they haven't thought about that. I'm fairly convinced they haven't put much, or enough, thought into how that information is used in ways that affect our ability to speak and our ability to hear real information. I think there's just no easy fix, and there's no fast fix, and we have to be in this for the long haul, right? And if you look at what we do right now in schools to teach kids to read, it takes years and years and years to get them to be literate. So why do we think that we're gonna be able to solve this particular problem in a week, right? We need to be in this for the long haul, and be talking about it, and having conversations, and really looking at systemic change to create the solutions. 
I think that's a good place to stop. Do we have time for Q and A? Cool. We have questions from the audience, now's a great time. Let's start with you. Are there mics? There are mics. Yeah, if you could start with your name, and just keep to one question, that'd be great. My name is Mark Nadel, and I wanna focus on the demand side, because I like to believe that we should teach people critical thinking, and it's so important, but I think a lot of people are dominated by denial. They don't wanna give up power that they have. A first child doesn't wanna share with the second child, and so they like stories or facts or whatever, fantasies, that make them feel good. They don't have to share, they don't have to give up power. How do we combat that? And in addition to denial, you have religion, where you say, take it on faith. The facts don't support it, but just believe it because of faith. So you have a community, a public, that's been taught to some extent about taking things on faith, that has denial in their human nature. How do we get them past that? Can we start with the roughly 23 percent of Americans who want help in navigating this torrential river of information that they don't know how to survive on? I'd like to start with the people who have already said they could use our help, and work on the more difficult folks that you've described as we go, but let's at least begin. These are people who basically say they're at a bit of a loss in how to deal with all this. That, we can do. Yeah. Hal Bogner. You brought up the moderating capability and role of the platforms, and privacy questions as well. And right at the end you started touching on these things, but the things I'm wondering if you've thought about are, they're moderating for the purpose of making money. 
They're using information about people that they are obtaining without people even thinking about the fact that the business they're in, just like going back to Publishers Clearing House, is building mailing lists, networks of, how can we show you more ads so we make more money. And they're also therefore moderating the content already, whether it's through AI or any other method, and changing it constantly to optimize it, to keep people on and get people to see the things that resonate, that they'll repeat the most, believe most easily, and feel good just shouting out to friends, and in this way reinforcing the spread of these memes, just to keep people on so they can sell more advertising. So that's a moderating thing. They could moderate it for other goals than that and be socially responsible. For instance, the tobacco. Is there a question, sir? I think the question is, is capitalism incompatible with democracy? Actually, maybe the question is for David, which is, the First Amendment didn't stop tobacco companies from promoting smoking long after it was known what the harms were, but there were mechanisms that worked in law to eventually say, these are causing harms, there are ways of holding people responsible. So maybe there are some here, and maybe it involves platform moderation and figuring out legal approaches. I mean, in terms of the government stepping into the media world and saying, here's how you're gonna do it, the only example that I know of, at least from recent history, is the fairness doctrine, which was in place for 20 or 30 years, which applied only to broadcasters and said, you have to give equal time to the opposing side, and you have to do this, you have to do that. And I think that's generally considered a big failure on a number of levels. The government doesn't typically do a good job of saying, here's how you should do it, media. 
First off, as a practical matter, I don't think it generally works, and I think that probably applies also in the social media realm. But there are deeper constitutional issues when the government decides to do that, which is that the government is neither permitted to silence you nor permitted to tell you what you have to say. So it's a minefield, and I have not read of, I do not know of, a good concrete suggestion about how regulation could work to moderate content on social media and also be consistent with the First Amendment. I know we could talk about this all day, but I think we have time for one more question. Yeah, in the back. Hi, I'm Joe St. George. I'm the legal redress chair for the NAACP Montgomery County branch. A little louder, please. Okay, my name is Joe St. George, and I'm the legal redress chair for the Montgomery County NAACP in Maryland. And my question is, you've been talking primarily about education as a means through which we can bring about some change in the landscape, but what is your position with regards to using the stick? You know, you have the carrot, education, and you have the stick, legislation. And in the state of Maryland, we have cyberbullying laws that criminalize the act of sending email that annoys or harasses, which I think is very broad. What is your position on that? Because I get calls weekly as a volunteer lawyer about all the hate crimes that are happening to African Americans, and the media has done a horrible job, in my opinion, in protecting classes that have already been disproportionately impacted by negative stereotypes in media. And so now we have an environment where people can use all types of racial slurs and so forth, and there is no penalty for it. And so I believe, so I wanted to get your opinion, because you can't train away hate, you can't train away bias. 
You can only change bias through positive images, but you deal with hate with the stick. So what's your perception about cyberbullying, and having stronger laws and policing for very overt and clear hate in media? Well, the law is pretty clear under the First Amendment. There are types of speech that can be criminally punished, and speech can rise to the level of harassment that is punishable under the First Amendment, but it's very narrow. So what's punishable under the First Amendment are what are described as true threats, which means a direct threat of violence to a specific person, which could ultimately actually result in harm to that person and is perceived as such. That is punishable under criminal law, but there are decades and decades of case law that make clear that just saying things that are hurtful emotionally to people is not punishable criminally under the First Amendment. There is, and I'm not gonna get the name right, but there is a federal cyber harassment law. It's a criminal statute. I don't know if it covers cyberbullying, but it has been upheld, and it's been upheld in instances where someone repeatedly threatened violence against, like, a newspaper reporter or some other individual, and they were prosecuted for that. But it's a very narrow band that is punishable, and once you start punishing words that are hateful or hurtful, you have weakened the protections of the First Amendment for anybody who wants to avail themselves of those protections. Yet the prosecutors still prosecuted. We won in court, but the school district would not take responsibility for the fact that they had no evidence against this boy except for full copies of emails from a principal. So what I'm saying is, the state law is written "annoy or harass," and around the country other states are adopting this law. Sounds like a really bad law. It does. And they're using it, and the real bullying is being directed against children of color. And so laws have a way of doing that. 
There's gonna be a stick coming into play in that situation. I do wanna say, though, just some of your, I mean, I can't speak to that, I feel like that's above my pay grade, but the issues that you bring up about bias and representation and all of those things, those are key, key elements of an effective media literacy education program. So those things need to be taught in school. I know you can't legislate hate, but you can talk about bias, and you can talk about representation, and you can really get people to open their eyes. I think we're gonna move to the next panel now, and I just wanna say a big thanks to Carlos for moderating, and to all the panelists. Thank you, thank you. So I have the pleasure to jump in for this next one. I'll be your moderator for our second panel, which I think will get to a lot of the questions raised by this last question, related to schools and what can happen in our education environment. So I'm glad that we got that started. I'm gonna quickly introduce our next panelists, and come on up as I mention you. We have Alexander Dardelli, who's the senior vice president at IREX, with us today. Thank you so much for being here, Alexander. We also have Anmei Chung, who's a senior fellow at the Mozilla Foundation. I think the next one is Patricia. Am I getting everybody's order right? Allen's next, okay. Allen Page, who's a government teacher in Oklahoma City, Oklahoma. Thanks for coming in from the heartland for us, Allen, we appreciate that. And Patricia Hunt, who is a government teacher in Arlington County, just over the river here in Virginia. So thanks so much, Patricia. 
So, it was so interesting to hear what was talked about in that last panel, because what we wanna do in this panel is try to get to solutions, what's happening on the ground in our learning spaces, whether they're informal learning spaces, which I think Alexander's gonna be talking about in a moment here, or our formal classrooms. And to be thinking about the long term, as Michelle pointed out in our first panel, that that may be where we really need to go next. Maybe we can't solve the problem of misinformation, disinformation, or propaganda; those things have been around forever. But how can we make sure that our citizenry, the people that we live and work with every day, have a sense of how the world works and are informed enough to make good decisions for us as a society in general? So just, you know, a small, small task. We're just gonna work on that in the next 30 to 40 minutes here. What I wanna do to get us started is jump in to talk a bit about a report that just came out from IREX that opens up some really interesting questions about what one can or can't do when it comes to trying to help people be more discerning about the information that's coming at them. So Alexander, tell us a little bit about this study that was released today. For those of you who are here in person, you had a chance, I think, to pick up some copies of it, but it's also available online. And I also did an article about it for Slate that appeared online last night. One of the things that jumped out at me as I was doing some of the research on the study was that it was looking at what you have done in Ukraine, not in the United States. And it was huge in scale: 15,000 people participated in this program. And you were able to get some data on whether it worked. So those points already kinda led me to be pretty intrigued. 
So tell us what you found, and we'll also get into some discussion about what it means for education generally. So thank you. It was a great discussion and very timely. One of the speakers said it's a great time to be alive. It sure is. How great it is depends on where you are. There is a noticeable decline in democracy in many parts of the world, and it affects many geographies, including Ukraine. Ukraine is not a stranger to fake news and disinformation. Indeed, it has been awash in disinformation for 20 years, with three key drivers. First, Kremlin-sponsored and funded propaganda and disinformation. Second, oligarch-controlled bad content produced in Ukraine. And third, junk generated by social media, which is not unique to Ukraine. So what do you do? In Ukraine you have high consumption of media: 75% consume audiovisual media. Very low trust in media: less than 30% believe what they see and what they consume in audiovisual media. You clearly need critical information consumption skills. So we designed an initiative that we call Learn to Discern, in which we trained 361 community leaders in media literacy, who then in turn trained 15,000 people in media consumption, in building critical information consumption skills. So this was the first iteration of our Learn to Discern in Ukraine, which concluded a year and a half ago. There is now a second iteration going on as we speak. So we wanted to see, did it work? We conducted an impact study with 200 participants in our program and a control group, controlled for age, gender, geography, and education, of 200 non-participants. And we found some fascinating results. And I think we wanna share these results; we were proud and humbled by them at the same time. One, the group of participants reports a 28% higher demonstrated ability in knowledge of the news media industry, which at the very basic level means who owns what, who's the face behind the facade of the media. 
Two, 25% of the participants in our program self-report a higher ability and likelihood to cross-check news sources, to check multiple sources of information, which is a very basic form of media literacy. And third, and I think the most important finding, is that 13% have a demonstrated higher ability to detect misinformation and disinformation, to critically analyze news. We think this is huge. 13% is a big difference. I'm gonna just give you some hypotheticals to make the point. Assume that this information relates to elections. Or assume that it relates to war, to hostilities. Or assume that it relates to markets. Well, this 13% could literally dictate the outcome of an election. It could dictate actually life or death choices in a situation of conflict, and Ukraine does face conflict and hostilities. And three, it could make or break your fortune in a market. So these are some highly promising results that we're seeing in Ukraine. We're happy to discuss more about these results, but I'm gonna be respectful of time, and so I'll let you ask other questions. So, I just have one more question for you, and then we'll start bringing some other folks in. When you say you got to 15,000 people, this was done not by sending some sort of handout or direct mail, or saying, here's what to believe and here's what not to believe. As I understand it from my reporting, this was done by tapping into people who already lived in the community and were trusted members of the community, maybe librarians, police officers, who then recruited people to come in to these small group sessions and just talk. Am I right? So they weren't lectured. You're right. One of the many unique features of this program is the fact that we leverage the power of social trust networks, which basically means, in a more sophisticated way, the political economy of information: how we generate, consume, and use information is highly relational. 
It depends on certain structures of trust or distrust, incentives or disincentives, shaped and populated by people we like or dislike. And so we went to trusted community leaders, peers who enjoyed the respect of their communities, in the workplace and in the communities. These are the initial 361 trainers, who then in turn each reached 40 people in their networks and delivered, on a face-to-face basis, the training we delivered to the initial group. And so that element of scale occurred organically on the basis of trust relationships. And that's why we were able to reach 15,000 people, who then in turn shared voluntarily with five or six other human beings, and we reached an ultimate number of 90,000 people in Ukraine, country-wide. So it's one of these cases. To me, it jumps out that the social networks that we worry about right now on social media could be harnessed for good, perhaps, in a case like this. But let's get back to that too, because I do want to talk more about the social media pieces as well. I want to jump to you now on that, and tell us a little bit about the information that the Mozilla Foundation has been putting out to try to seed some kind of baseline understanding of the digital information that's out there, what kind of tools are out there, and why it's important to have those kinds of materials out. And there was an announcement, was it three weeks ago or so, of the release of some new materials for educators, if you can share. Sure, so Mozilla is in the business of really activating communities to do much of this work. So I think one of the things, if you haven't seen it, is Mozilla started to put out their Internet Health Report. 
So our core mission is making sure that the internet is healthy, and so the Internet Health Report, if you all haven't seen it, provides research and data and information and stories about the five issues that Mozilla is focused on, which are web literacy, digital inclusion, decentralization, privacy and security. And I've forgotten the last one, but at any rate, I've been focused on working on web literacy. So the issues that I've been focusing on are web literacy. And I would describe web literacy as basically how to read, write, and participate on the web. And when I think about why that's important, and what we try to convince people of, it's like the ABCs, it's like the alphabet, right? You need to learn the alphabet in order to form words, form sentences, form ideas, know how to comprehend, know how to communicate. And if you leave out the vowels, you're not learning everything you need to learn. And so people will often say, well, can I just learn privacy and security? Can I just learn this piece of it? And I'm like, you know, you actually have to learn all of it. So to me, they're the basic fundamentals of what you need to do on the web. And the web literacy map that we created basically has 14 core skills that we think everybody should learn. And the curriculum that we developed was really open source. We built it with librarians and in- and out-of-school educators. It's a set of, like, 14 curricula and facilitator guides that are about how to evaluate, connect, search, how to collaborate with people. And the pieces that I think are probably most relevant here, we've got activities called Search Party or Web Detective or Privacy First that really, I think, start to address some of the things that the everyday person should be learning. And specifically, I think someone brought this up earlier: at the end of all this, it's just about critical thinking. 
We need to really teach people critical thinking skills, and how to discern information of all kinds, whether it be media literacy, financial literacy, civic literacy, data literacy. And it's about teaching, I think in particular, young people about, like, how are you gonna be able to use your own mind and your own brain, and to really bring the onus back onto the individual. Giving them those skills is something we've been trying to do forever. So this is not anything new, right? We just have a new tool and a new place where all the information comes from. Yeah, I mean, you say it's not anything new, and yet, because of the just vast amounts of information that are now flowing in, that students now have access to, that they have their fingers on, it does feel like we're in a new world at some level in terms of what teachers may have to manage or even what librarians are dealing with. Many of you may know that I've got teenagers, and I watch the way they gather information that comes streaming in at them, whether it's their stories on Snapchat or something that they've seen on Tumblr, and as to where they're getting any kind of instruction in how to discern what that information is, whether it's valid or not, whether it's been double-checked or not, I just throw up my hands. I'm not sure where they're going to learn how to do that. So doesn't it feel like there's something a bit different now? There's certainly more information, both good and bad. So I would just say that it's not one-sided here. There's also been good information that has transformed, as someone mentioned earlier with the Parkland kids, right? I also have teenagers, and I am pleasantly surprised by how they've been able to figure out, maybe I need to look at this a little deeper, maybe I need to look at other sources. But quite honestly, I think they're at a school that helps them with that, right? 
And so that's where I go back to education. At the school that my kids are at, they're learning critical thinking skills, and I think what I'm most concerned about are the equity and access issues, because there are so many kids who aren't getting that at all. They're being taught to the test; they're not learning the critical thinking skills that they need. And I think one of the things that we need to do as a country, and you all who are doing the education, is we need to do a better job of delivering these skills and providing educators with the tools and the professional development. I think it's more than just handing people tools; it's actually teaching them how to use them and helping them be comfortable, and really providing that information so that individuals can make those decisions. And you can do so much, like, as you have kids, you can give them all this information, and you hope that they walk away with the skills that will help them do that. So quick, yeah. Very quick comment. So I think there are three new elements. One is the velocity with which content is created. Every day we create 250,000 libraries' worth of content, Libraries of Congress' worth of content. New content, most of it is junk, but it's still content. Two, I think it's the erosion of national borders. Frankly, we're talking about some domestic issues, but as international borders erode, well, a troll or a human being somewhere in St. Petersburg can control how people think here in the US, so that's new. And three, very important, there is a bit of an erosion of the civic duty and a resignation to AI. We somehow expect that artificial intelligence will feed us the perfect information, will give us the information we need that we can then consume critically. That's not the case. We have a non-abdicable responsibility to consume information critically. 
I think shifting the onus back onto us, it is largely about us, is new, and it needs to happen now. I think that's happening now. This conversation is actually... No, but I mean, I think it's happening in a way that we're not giving people enough credit for, and I think it's because of what's been happening with Facebook, what's been happening with Equifax, with all these things that are happening, what's happened with the election. I think the silver lining to a lot of the stuff that's been happening the last couple years is that it is awakening people to the fact that we really do have to take more responsibility. I don't know if folks know Bruce Schneier or not, but one of the things he said that I've always remembered is, like, we created this, we can fix it. And so I think it's giving people that responsibility, and getting people to understand. And I think there have been a lot of good wake-up calls. So we have this awakening, and now we need to maybe do something with it. It sounds like, I mean, that's certainly part of, I think, the theme of the first panel as well. Let's turn to our teachers, who are really facing this every day. We got it. We got it. You got it. No problems? Not a problem. So first, Allen. The reason that we have Allen here today is because Allen is part of a project that's national, with many sites around the country, called Generation Citizen, and it's around how to change what civic education looks like for our high schoolers, maybe middle schoolers too, not sure on that, and you can maybe tell us more. But what I wanted to understand about the kind of work that you're doing in Oklahoma City public schools is how you are doing the civic engagement piece with your students while also recognizing all the sources of information they're having to sift through, while also just teaching them the content in the first place. I mean, you know, there are certain standards, they've got some tests to take. So tell us how you're doing that. 
That's why I get the big bucks right there. That's why I get the big bucks in Oklahoma. Oh yeah, that's right, it's an Oklahoma teacher. You know, you're paid so well. So well. And actually, honestly, we should in fact bring that in, because the resources to ensure that the adults who are teaching our next generation are supported are a key part of this. But tell us about what you're doing with the students that you have in Oklahoma. So Generation Citizen has been around about 15 years. They've involved about 30,000 students. They have a base site in San Francisco, sites in Texas and California, and then about a dozen sites in the Northeast. Its emphasis is on action, but it lays out a full program, whereas typically in the past, as I taught or tried to teach government, we focused, from my generation, on the big news agencies and newspapers. Over the past five years, obviously, that's changed as to where they're getting information. Obviously children have always gotten information from Uncle Larry, or primarily those sources. They still don't watch the news or read it. But now, instead of just Uncle Larry, it's Uncle Ivan and people from all over the world, which I think is a big shift. All of a sudden they're trying to go through that discernment, so I've had to go through an evolution of rethinking a little bit, that critical thinking. So Generation Citizen came along and said, listen, we really want to get into action civics, taking it to that level. Typically in my government class, we'd learn something about the legislature, and then they would do a little project, like email a legislator, give them a call or something. This takes it to the next level of actually being more systematic, more of an organization. And I think that's really where part of the solution comes in, as they have gathered information. We asked them, so what are the big issues you see facing your community? 
And Generation Citizen mainly focuses on the city and state level. So: what are the issues you see, or that you hear about from your parents? We'd try to survey, we'd reach out, and we'd come up with half a dozen problems. But then through the discussion we'd say, well, is that really the problem? What information do you have? It's not so much a formal debate as a Socratic seminar kind of situation, where there's that discussion: well, I heard this. Well, where'd you get that? No, I heard this. I think that's where the discernment comes in, from their own discussions with each other. It's like, oh, wait a second — all I heard was this one side. Then they have to narrow down the five or six issues after discussing them, trying to find out what the real issue is: going through a process of deciding what the issue is, reaching a consensus, narrowing it down, and continually looking for information, not just in the media but by going directly to legislators, going directly to influencers. One of our groups worked on gun violence and was able to bring in a speaker from Moms Demand Action, who was very much in favor of the regulation of weapons. But anyway. So let me ask you a quick question on that, and then I'll get to Patricia so we can talk through how it works in your classrooms. How do you deal with the issues that might come up where there's not just a difference of opinion, but a difference in the fact base that students might be working from? And it may be very much bound up in politics. I wanted to read a statistic here from a Pew Research Center study that came out last year. It found that 34% of Democrats said they considered information from national news organizations very trustworthy, but only 11% of Republicans said that.
And if you're thinking about the next generation — students today and the households they come from, whether it's a Democratic household or a Republican household — those households have a really big impact on how they see the information they're being asked to parse in your courses. So how are you working through that? I also teach psychology, which means I don't know any more about that than I do about government. But you're exactly right. While kids will say, well, I don't want to dress like dad, or I don't want to listen to mom's music, throughout their lives they really do stay pretty true to their parents, mostly, when it comes to religion and politics. So fourth and fifth graders might not know they're conservative or liberal yet, but they probably are, because the information and the people they trust have given them that perspective. We try to break through that and say, that's fine. Even if you want to continue on a conservative path, how do you at least recognize another side? If you're going to be a true Republican, be true to what that means, or change it — but go through that critical thinking. We always want to accept information that fits into our mental framework, and by the time they're four or five years old there's already a mental framework. Then we're trying to bust out of that and say, well, now we want you to be a critical thinker. And that's a big wall that even Donald Trump would be impressed with trying to get past. It's a roadblock, and having the motivation, in the typical high school mind, to break through and say, wait a second, I can be open-minded, I can keep my values but understand others, and I want mine based on some sort of ethical truth — that's a tall order. I think a lot of it doesn't come until they're in their 20s and 30s, and then they reach back and say, okay, yeah, this now makes sense to me.
So Patricia, tell us about the tools you're using in your classroom. Patricia is here as a representative who's also deeply involved in the News Literacy Project — and, oh yes, Alan is here, good. So Alan Miller, who is the founding CEO of the News Literacy Project, is also here and can answer some questions later. Many of you may have heard of Alan's work and what the News Literacy Project is doing, but one of the things I've been following over the past year or so is the incredible momentum behind the software that's come out of the project, called Checkology, and how it's being adopted in classrooms. So tell us a little bit about that and your involvement in the project, and I'm curious about your take on some of the questions I asked Alan. Yeah, so I teach 12th-grade U.S. government in Arlington, and I have the luxury of teaching a course that is high-stakes in that it's required for all students to graduate, but there's no test. So I have some freedom to design a course where we meet the county standards but I can also infuse lessons that are far more relevant to my students' lives. The challenge I have is engagement and participation. I want the students to graduate and to vote and to make the world a better place, but they come in with varying degrees of understanding of the levers of power, and even of what real news looks like. So instead of using a textbook like many teachers out there, I use the news.
So most days when students come in, they listen to or read a news article that's related to the content, but also related to their lives. I've been doing that for several years now, and I'd been champing at the bit to get involved with the News Literacy Project, so when Checkology came out I was thrilled to start participating in it. What it is is an online platform where students have an individual page — an account, if you will — and there are four modules. I think it really gets to the heart of the problem, and it starts with what I think is the most important lesson. I don't think we have a news problem; I think we have an information problem. Students aren't able to discern and label and put into buckets what they're looking at. I ask students who spend a lot of time on social media: what are you doing? Like, what are you doing on there? Are you reading news? And they're like, what? Some are. There are some really incredibly savvy kids. And I have to say, speaking of the last panel: a few weeks ago I gave my students an essay question — given the dangers to our democracy of misinformation and viral rumors, what needs to be done? — and so many of the points they came up with in their essays were expressed here by super smart people. So... We've got to get them on the panel. Yes, they'd be thrilled. So what they're doing online is, they want to be entertained. They're having fun, and they don't really care whether that video that says "these are the weirdest last meals of death row inmates" is real or not. They're not fact-checking that. But that's generally speaking. We do have a problem, and my students recognize that. So we've gone through it in a couple of different ways. Checkology also makes the case for the First Amendment — because the quick, knee-jerk reaction is "ban it." It provides a lot of really great lessons and videos with these fantastic reporters. It's interactive.
I can look at it as a teacher; I can see exactly what they're doing and when they're doing it. So it's got all of the great best practices, and it's relevant to their lives. Have you ever had a student say, how do they know this is right? — or question Checkology itself? I only ask partly because one of the things that's come up in some of these conversations about news and media literacy is: at what point have we gone so far in teaching people to be discerning and questioning that they're always questioning and never certain enough? There are some certainties in life. So how do you cope with that? Yeah, there's not a lot of pushback. I think that because they get so much news in the class, and because Checkology does such a great job teaching the skills of recognizing quality journalism, it's pretty much a no-brainer to tell apart a Facebook post that says you can contract HIV from a banana and a quality news piece. So I'm not worried about fake news with my students. I don't think they're going to fall for that. What I think they're going to fall for is really one-sided information, and that's the kind of stuff that... Getting the nuance. Yeah, the nuance — they need to really look at it from multiple perspectives. So I know there are going to be some good questions out there, and I want to open it up to the audience before we break, but one of the big questions I want to make sure we can all grapple with here — and I'm curious about each of your takes on this — is the theme for this entire event: how do we combat misinformation and disinformation? And it feels like — here's where I want to test it, so tell me if I'm wrong — it feels like part of the answer that came out of the first panel, and that we're trying to grapple with here, is that the way you combat it is by building better critical thinking skills in our kids.
But the bigger question, I think, is: are we doing enough now — and whether it's kids or adults, because as the IREX program shows, even adults at pension and retirement age can really benefit from this. Are the tools available to do this job? Are we reaching the kids and the adults we need to reach, and what has to change? Standardized testing. What about it? That's what needs to change. What about it? So kids come in and they don't know how to think; they know how to identify the right answer, but they don't know the "and" and the "because." And it's a billion-dollar industry that we spend a lot of money on, and it's to Pearson's benefit not to have a test that actually requires a human being to read it. You can run it through the Scantron and then you've got a result. So just a quick follow-up question, because I hear where you're going — as a parent of kids who are right now, literally, probably maybe an hour ago, sitting in a classroom doing a standardized test: do you think that if standardized tests went away, kids would have better critical thinking skills? If we're doing a good job teaching. Aha. Yeah. Yeah, the teaching. I mean, the tools are out there. Not only is Checkology awesome, but I'm sure the Mozilla program is awesome too. The tools are out there — Michelle probably has a whole slew of tools in terms of what you all do. But it's finding the time and the opportunity. We can't just try to do it all in school. And it's got to be both in school and out of school. I don't think it's just the schools' responsibility. There are libraries and museums and out-of-school programs. Kids only spend 20% of their time in school anyway. So we need to think about this as a whole-village effort, as opposed to just the schools. And it's a matter of life and death for our kids, too. Half of my students live in poverty. Many of my students are here undocumented.
You know, when we started the school year, there was all this news out there about what was going to happen with DACA. We had a lot of kids in tears. Kids who are overweight don't know that you're not supposed to have more than 20 grams of added sugar a day. There's so much out there that's competing. One thing I'd add has to do with resources. I think you highlighted this, maybe in an implicit way, but even in Arlington County there are divides between North Arlington and South Arlington, and there's a political economy of information: how people access information and how they make decisions on the basis of that information. I don't think that's quite captured in the so-called tools we use now for our kids. And we have to be a bit careful with tools. I think one of the previous panelists said there are both supply and demand sides to this discussion, and solutions relate to both supply and demand. But if you focus on the supply side — and this is the AI, the tools — we have to rethink a bit how specific they are, how contextualized they are, and what drivers people bring to the use of these tools. This goes to: is technology really the solution here, or do we need to rethink the problem and how we go about it? That's one. And two, I think it's also a matter of resources. If we truly believe that information is the lifeblood of democracy, well, we need to put our money where our mouth is and actually invest in this aggressively. I think that the resources, even in some of the wealthier counties in the US, are really inadequate vis-à-vis the challenge. You're right — it's not just a media literacy issue. We have an information problem. We have a challenge with a vast amount of information, and this vast universe of information requires resources to handle. Are you feeling — well, I know you're feeling the resources issue in Oklahoma. Yeah, yeah.
Tell us a little bit more about what you need and what you don't have. Part of it goes back to standardized testing. I'm really not against standardized testing; I just think it's so focused on math and science, where you can come up with an answer — you can have a hundred questions and come up with a right answer. They tend to let social studies do our own thing because it's hard to grade. It's going to be essays, it's going to be justify-your-reasoning, it's going to be those critical thinking skills. But it can be done. I teach AP psychology, and if you look at the AP test, there's a lot of application; even though a big chunk of it is multiple choice, there are application questions. The same kind of thing can be done in more of an essay or verbal discussion, to see if someone is looking beyond their own narrow frame and able to look back and say, yeah, this is true because — or, no, I see this other person's point of view. It can be done. And not that I want more standardized testing in my classroom, but I think it could be done well. As far as the classroom side of it, as I already said, they come in with a framework already, from their parents, from their friends. A lot of it goes back to that motivation. They want to be entertained, and what's entertaining is dramatic. So somebody says the Clintons are selling children in the basement of a pizza place, and people go and ask, where's that basement? We don't have a basement — but that's really entertaining, and that's what they want to share. I think that's partly it: they're getting a lot of information, and we're not helping them discuss it critically. Let me — So I would just say one thing: the way the curriculum has been designed around some of these pieces, it's asking questions. It's promoting that conversation. Our curriculum is designed specifically for that.
It's getting them to ask questions so that they can think and respond. It doesn't tell you, this is correct or this is wrong; it just prompts people to think. Let's go to a couple of questions out here. We'll go over by just a few minutes because I know there are a lot of folks here. So just make sure to say your name and where you're from, and go straight to your question, please. We have somebody here in the front. American University. One of the things I noticed from the study in Ukraine is that one of the effects that occurred frequently across the studies it discussed was that media literacy skills degrade over time, even after they're taught. So have you explored any methods of reinforcing that knowledge after the fact — like a refresher, or buy-in from the media to introduce little snippets of the teaching — to ensure that the skill is maintained over time? So it's a great question, and I don't think I have quite the data you may be looking for, but informally we've been trying to do this. Indeed, the network of these 15,000 trainers who received this Learn to Discern methodology and curriculum do, as we're told and as we can informally tell, continue spreading these skills on an informal basis. However, the initial sample of 90,000 was based on that first year of the program, which happened a year and a half ago. We're now in a second iteration that works at the secondary level of education in Ukraine, with kids, because we think this is the population most at risk and the one with which we can have the greatest return on investment long term. And so there is some collaboration going on between the two groups, with some nuances. We also work with a range of local partners, including StopFake and the Academy of Ukrainian Press, and they do do some of this on a repeat basis. Let's take one or two more. Yes, right here in the front. Thank you very much.
I'm with UNESCO; my name is George Meyers. We've been working since last year on what we call rebuilding communities after war and crisis, but also through various degrees of liberalization or democracy building. One of our barriers is the credibility of mainstream media, so I'm very much interested in what you're doing with schools. And the second one is social equality — how do you address this issue? You have a question for the panel? Much of education today happens after school, out of school, and equality is very much decided there. So how would you address this issue? In the community where I teach, it's 75% Hispanic, 75% low income; it's really one of the least diverse places I've been. So there's very much one perspective, and quite often they don't see others openly or fairly if they're different — if they're from a different part of town, a different color, whatever. And so I think you're exactly right. One of the big challenges, and one of the big tools for dealing with it, is that whenever we choose topics, whenever we do action projects, we start by valuing everyone's voice in the room — letting them know that even the person who's new to the class, or the one who seems not so socially acceptable, has something valuable to contribute. So I think the group work is often a more active way to help them realize: oh, wait a second, this person is a different color or comes from a different background, but they have something of value to say too, and they need to be heard. To me, a lot of it goes back to those questions and setting them up in a situation where they're — I hate to say debating — discussing things within some framework of fairness and openness, while still being able to hold: I'm passionate about this, but it's okay that you're passionate in another way, or that you come with a different perspective, and now I understand it's not just that you hate me because of my ethnicity, or vice versa.
Those discussions, I think, really go a long way toward breaking down those barriers, especially when it comes to inequalities. Obviously there's inequality within Oklahoma among some of our school districts — when it comes to technology, when it comes to other resources, contacts within the community with people who are involved in government or involved as influencers. That's hard to get around. And that's one of the things I really appreciated about Generation Citizen: they provide people who have those kinds of connections within our community, so we know who to reach out to to get those other perspectives — and the students see that people who are very critical thinkers, or very highly educated, are really human too and can be approached. Having those kids actually talk to a legislator — they freak out; they go, what do I say? Well, what would you say if they were just a person? Well, I don't know. You're talking all the time! Breaking that down — I think those kinds of connections really open up a lot when it comes to that inequality, seeing that they have a voice too, even though they're young and not voters yet. Just to add on to that: one of the best moments this year was when a student of mine — her name's Sahin — as a result of the program, Checkology, said: before this class I used to think that news was just for smart people. And that word was really code for a whole lot of other categories that we put ourselves in — rich, white people, people who've been... So she's like, but now I know I can participate in this. I can read the news and I can be part of the watchdog role. So it meets many needs. So I think that's a nice note to close on. Can I say one last thing? I've just got to grab this real quick. So this year we have these competitions between classes, and one of our classes wanted undocumented people to be able to get driver's licenses — for a lot of safety reasons for the rest of us, too.
There's actually legislation being introduced in Oklahoma, and several of them will continue to work with the legislature to campaign for that. And with one of my classes, which worked on school safety, several of my students have been asked to join the school board in drafting new regulations and deciding how to direct the money for school security. So an ongoing thing like that is what you look for. But it's got to start with that critical thinking — what is the issue, what is the real issue — and then drilling down into it and finding those solutions. We had more optimism than I expected on the first panel, and I think that if we think about the kids, and the stories about the kids you're telling now, and what they can bring to the fore in the next generation, we can have even more optimism. So we've just got to make sure we provide the resources and the mentors and the teachers to do it. Thank you very much, everybody. And thanks to all of you. We will actually have a little reception after, so please stay and join us for some more conversation. I'll see you out there. Thanks.