Welcome to the breakdown. My name is Umu. I'm a fellow on the Berkman Klein Center's Assembly: Disinformation Program. I'm recording today from California, which is why my background doesn't appear the way it normally does, but I'm excited nonetheless to be joined by Lisa Kaplan of the Alethea Group. Lisa founded the Alethea Group to help organizations navigate the new digital reality and protect themselves against disinformation. Thank you for joining us, Lisa. I'm so excited to have a conversation with you about this and many other pertinent issues.

Thanks so much for having me. Excited to be here.

Yes, so our conversation today centers on a really big topic in the disinformation space, and that is the shift in focus among a bunch of different stakeholder groups, including the national security community, academics, and civil society, who have historically approached disinformation from a national security, foreign policy, geopolitics perspective. So the first thing I want to ask you, to that end, is: can you give us a little bit of a taste of your background and what compelled you to found the Alethea Group?

I started the Alethea Group in 2019, and prior to that I was the digital director on a 2018 Senate campaign. One of the things about disinformation that I always like to remind people is that it's not always a foreign government, and it's not always people who are necessarily seeking geopolitical goals. The goals really do vary depending on the threat actor, and that's one of the reasons why the hard work of attribution that we do at Alethea Group is so important. It depends on who the actor is and what their motive is. Once you know that, you can infer what their goal may be, and that can help to mitigate a situation before it even starts, or to mitigate a situation that has already elevated into more of a crisis.
So I was working for Senator Angus King, and because he was running against both a Democrat and a Republican, we were looking at disinformation narratives from both sides. We weren't necessarily doing the work of attribution, because we were a campaign with limited resources, but we were trying to understand what the narratives out there were. And what we realized is that disinformation was targeting candidates and targeting issues, but at the end of the day it's really targeting voters, us as people. When you think about it in an election context, it's really straightforward: it's trying to influence people's decisions around when, where, and for whom to vote. It targets those decision points: are you going to vote, and if so, for whom are you going to cast your ballot?

However, disinformation is not an issue limited to just elections, especially for sophisticated actors. Think foreign adversaries, or think for-profit disinformation networks who have built up influence to sell to the highest bidder, or who use that influence to generate ad revenue through clicks, for example. They're not just talking about one election or one candidate; they're talking about a variety of different issues. And I say that because the number of threat actors has grown exponentially since 2016. What started as primarily Russian has expanded: according to Oxford, over 80 countries are now actively engaging in social media manipulation, and that doesn't even account for all of the individuals who have stood up their own operations. We also see, to be clear, political consultants on both sides of the aisle engaging in similar behaviors and social media manipulation tactics. So it really comes down to who the actor is and what the goal is. Now fast forward.
One of the things that we saw, unfortunately, and it's very unfortunate, is that it took essentially an insurrection attempt at the US Capitol to really catapult this conversation. There are a variety of research organizations, including ours, that had been beating this drum for several years. People were saying, oh, it's just a meme, but no, disinformation really can lead to offline harms. We've seen that with Pizzagate; we've seen that with events such as the Dodger Stadium vaccine site being shut down. We are seeing more and more offline action happening as a result of disinformation.

I think what happened at the US Capitol is a really important case study. I say that, and I know it sounds clinical, because what happened was obviously a horrible day for democracy, and I think it affected the research community in a variety of different ways; that was a very tough month for everyone. So I don't want to sound overly clinical about it, but in a lot of ways there are a lot of lessons that can be learned. For example, we can draw a straight line from narratives that were circulating as early as March, saying that the election would be rigged, that there would be violence, that people needed to start preparing for the worst, to what happened on January 6. One of the good things to come out of January 6, though, is that most people get a second chance. Not everybody: obviously there's a Capitol Police officer who died in the line of duty protecting the Capitol. But we're able to really learn from this moment and move forward, so that we're addressing this threat in a way that lets us identify these online threats and prevent them from becoming offline harms.
Thank you for that summation of not only what happened on January 6, but the through line you can draw from the initial Stop the Steal narratives and organizing to what ended up happening on the 6th, and what is projected to continue this week on the 4th. Can you talk a little bit about your practice at Alethea Group and how you help organizations navigate this new digital environment?

At Alethea Group, what we do is detect instances of disinformation, misinformation, and social media manipulation, as well as track other types of online harm, such as targeted harassment, to help individuals figure out how to navigate this new digital reality. A lot of times people don't realize how many options they actually have. What a lot of people are doing right now is analyzing something once it's already gotten onto Twitter, onto Facebook, onto YouTube, for example, and once it's on one of these mainstream platforms, they call the social media platforms and say, please take it down. That's one option, but if you've gotten to that point, in a lot of ways it may already be too late. So what we practice is early detection: we catch narratives when they start, and we're then able to track them and understand how they may be influencing individuals and seeking to change their behavior. We analyze at the network level, and that enables us to pursue more options. For example, take these for-profit disinformation networks. If they're making a profit off of an organization, there could be an opportunity to seek damages. If they're appropriating intellectual property, such as your name, your likeness, or trademarked, patented, or copyrighted material, there are legal options organizations can take to seek recourse.
There may also be an opportunity to do counter messaging, and I'm not talking about fact checking. In our experience, fact checking, while it's helpful in creating a paper of record and gives you something to point to to set the story straight, is not sufficient. And here's why: people who have chosen to believe something salacious about you, something that confirms their biases about you, aren't going to change their minds just because you're the one saying it isn't true. We're talking about real counter-narrative building, and that can be done in an ethical way by creating a greater understanding around an individual or an organization. So we help through the entire process: we detect what's happening, we assess whether or not it's having an impact on an organization's goals, and then we help to mitigate any potential impacts, the idea being that we can solve a lot of problems before they become a real issue or challenge for an organization.

Along with the expansion of the threat landscape, there has been an uptick in sustained information events, like the COVID pandemic, like Stop the Steal, and a lot of this chatter, not just disinformation, around the election. Are there other structural developments or factors that you attribute to the expansion of the threat landscape and the shift in focus from foreign disinformation to domestic?

Yeah, I think one of the things about disinformation, and there was conversation around how this was going to happen after 2016, is that it's fast, cheap, and easy to do. If you know how to run a good marketing campaign, you can figure out how to run a disinformation campaign.
And with that, you can also look historically: open source estimates say that the Russian attempt to influence the 2016 election cost about a million dollars, which is a rounding error to most large organizations or federal governments. So it wasn't a question of if other organizations were going to start engaging in these tactics; it was a question of when. I think what we've seen is that proliferation. It's the proliferation of the number of threat actors that's causing an increased number of threats in the disinformation landscape.

So, turning to the COVID-19 pandemic: race narratives have shifted over time, and they vary again based on the threat actor. One thing we saw, and again this is all open source, just Russian state media, is Russian state media pushing race narratives, targeting different communities and trying to pit them against each other. We saw RT put out content that was more focused on, you know, what's the big deal with calling it the China virus, pointing to the Spanish flu, for example. RT typically targets one audience, and then we saw In The NOW, which typically targets younger audiences on Instagram, with videos saying, if you are Asian, you will get attacked in New York City. That's not to say that people aren't being attacked; those sorts of attacks do happen, and, going back to online-to-offline action, disinformation can fuel them. And I don't want to make it sound like it's just the Russian government, because there are definitely other actors playing in this space.
I think it's just providing more opportunities from that perspective. And then, similarly, there are the financial motivations. We'll see some of these junk domains pushing false information about the pandemic, and they're doing so likely to generate a profit; they're getting advertising revenue from clicks. So I think it really does depend on who the actor is as to why they're pushing it, but it's so cheap and easy to do, and it works from a geopolitical perspective.

And if it's cheap, why not try? There's not really a high cost to the people who are executing these sorts of campaigns.

Not at all.

And I would just echo what you said by pointing out that it's been so interesting since 2016 that our own elected leaders see value in exploiting the fissures in society for their own political gain. As you mentioned, this happens both on the right and on the left. Let's talk about some of the specific mitigations. One of the most significant impediments to legislating on disinformation, and engaging on it from a policy or regulatory perspective, is that moderation is often criticized as brushing up against the First Amendment. What is your thinking around how regulation can encourage effective moderation without brushing up against those First Amendment concerns? And then, more broadly, what is your thinking around what the government's role should be in addressing specifically domestic disinformation?

Well, I do appreciate and admire a lot of our colleagues in this space who are putting out really important research that can inform eventual legislation. Where I see the opportunity for more immediate action is actually through other means, such as our judicial system and so forth, and I'll get to that in a second.
So this goes to the First Amendment. That's an argument that I just don't really buy. We accept limits to speech in real life. I can't yell fire in a movie theater; there will be consequences for me. We've accepted that there are some limits on speech. So I think we should be reframing this question to ask how we can make the online conversation more reflective of the conversations that we have offline. One of the things that's become really clear in the pandemic, when we've all been forced inside and online to a degree, is that that's not where we're at right now, but there's no reason why that's not where we can be.

What we should be talking about, too, is content moderation. I don't think that necessarily removes the threat altogether. For example, we saw a lot of accounts be de-platformed for breaking the terms of service multiple times, and I think that's a perfectly acceptable consequence. It's kind of like the no-shirt-no-shoes version of social media platforms: if you break the rules on multiple occasions, you're not going to be allowed back in. But where I think we need to be headed, and where I think we're going to see the most progress in the immediate term, is enforcing some of the laws that are already on the books. For example, the Dominion lawsuit that's ongoing right now is something I'm paying attention to; I think that's a potential avenue. We've also seen some successes when it comes to copyright infringement, because these guys aren't really paying attention to following the law when they're spinning up a disinformation campaign, so I think we can anticipate some actions taken there. The other thing that has been interesting to watch unfold is the Stop Hate for Profit activism. I think that is potentially having an impact as well.
All that said, I don't see this as a speech-only issue. I think there are very serious concerns, especially when we're thinking about the world outside of the United States and the Western liberal order, where sometimes social media is how you evade censorship. These are really tough challenges, and there are no easy solutions, and I think having these conversations and debates is hugely important, because if this were an easy fix it would be solved by now. But I am confident that we will continue to see progress, and we've seen progress since 2016, to be fair; we will continue to see solutions proposed and implemented.

You had mentioned earlier that you've observed that fact checking is not necessarily the most effective way to clamp down on some of the bad and false information we see circulating online, and certainly not enough to interrupt the cementing of the alternate realities that disinformation is really intended to foster.

Yeah, so I think fact checking definitely plays a role, but I don't think it's something that can work alone. Labeling and that sort of thing are relatively new features, so how effective they are is a little bit outside of my purview, and I look forward to reading somebody's research someday, a longitudinal study on how effective labels are on any given social media user. But I do think we need to start thinking about this challenge more holistically than we currently are. For example, it's not just what's happening on social media platforms; it's also what's happening on blogs, and it's how some of these disinformation networks are being financed: through advertising revenue and, frankly, through people who are building influence and then turning around and selling it for profit.
Those are all things that I think we can incorporate into part of the solution when we're trying to figure out how to put out the fire that's happening right now, and when we look toward long-term solutions to figure out what we can do to really change the equation and make it so that disinformation is not as successful as it is now. There are all kinds of things we can do, but I keep coming back to education. If the general person knew what we know, and likely what everybody who listens to these podcasts knows, we'd be in a lot better shape. So how do we make it so that we're not so special anymore? How do we make it so that the general population is more resilient to disinformation? A lot of that has to do with education.

When I was in high school, we used to do these exercises where we would basically learn how to read a news article, how to read a newspaper: what page the article was on for relative importance, what the paper was, who the author was, when the piece was published, how to read the first two paragraphs, how to read for bias. Why aren't we teaching that for a digitized world? I am still pretty old school and I like physical papers, but I'm the rarest subscriber you will see for my age demographic. One of the things we need to consider is how to modernize our approach to consuming media. I don't want to put it on the social media user to protect themselves from a sophisticated information operation, but again, when you think about it as a holistic solution, this can also be a piece of the puzzle that really makes a difference.

Thanks so much, Lisa. I really enjoyed our conversation. Thanks for joining me.

Thanks for having me.