Information laundering is really quite ferocious
It's when a huckster takes some lies and makes them sound precocious
By saying them in Congress or a mainstream outlet, so
Disinformation's origins are slightly less atrocious
It's how you hide a little, hide a little lie
It's how you hide a little, hide a little lie
It's how you hide a little, hide a little lie
When Rudy Giuliani shared bad intel from Ukraine
Or when TikTok influencers say COVID can't cause pain
They're laundering disinfo and we really should take note
And not support their lies with our wallet, voice, or vote
Oh, information laundering is really quite ferocious
It's when a huckster takes some lies and makes them sound precocious
By saying them in Congress or a mainstream outlet, so
Disinformation's origins seem slightly less atrocious

Hello, everyone, and welcome to Mentor Talks. My name is Asha Bay, and I'm joining you from the Office of Alumni Affairs in ECA at the U.S. Department of State. Our office opens the door to alumni networking, professional development, and grant opportunities. Where are you joining us from? Let us know in the comments.

Mentor Talks is a virtual series that features exchange program alumni sharing their stories, mentorship, and advice. This is your opportunity to connect with these leaders and mentors, like our guest today, Nina Jankowicz, a Disinformation Fellow at the Woodrow Wilson Center in Washington, D.C. Nina studies the intersection of democracy and technology in Central and Eastern Europe. She's the author of How to Lose the Information War: Russia, Fake News, and the Future of Conflict, and her writing has been published by The New York Times, The Washington Post, The Atlantic, and others. She's also a frequent television and radio commentator on disinformation, and a prolific poster on social media about disinfo and other topics.

Before we get into conversation with our guest, we want to hear from you, our online viewers. What would you like to ask Nina?
Post your questions in the comments. Now let's talk to our guest. Nina, welcome, and thank you for taking the time to join us on Mentor Talks.

Thanks for having me, Asha.

You're welcome. Before we get to questions from our audience, we'd like to know: what led you to apply for a Fulbright-Clinton Public Policy Fellowship? And how did your experience impact your career?

Well, I had been working at the National Democratic Institute for several years on Russia and Belarus programs in the Eurasia team, and a short time into my time at NDI, the Euromaidan Revolution happened in Ukraine. It began the Revolution of Dignity, the change of government that came after that, and Ukraine's increased commitment to democratization. And I really wanted to get out into the field. I wanted to really make an impact as much as I could, and the Fulbright Public Policy program really seemed like a great opportunity to do that. I kind of applied on a whim; I found out about it just a couple of days before the application deadline and had to work really hard to bring my application together. But it was truly one of the most important experiences of my life: being able to really understand the realities on the ground in Ukraine for such a long period of time, to work as an embedded advisor in the foreign ministry of Ukraine, and to understand the realities of disinformation. That experience led me not only to conceive the idea for my book, but also has just really created such important connections between me and Ukraine and my colleagues there. I think those are going to be connections that I have for the rest of my life, which of course is the point of the Fulbright program: to build mutual understanding. And at that point in Ukrainian and US relations, as we were undergoing a change in the US administration and Ukraine was in a sort of precarious political situation, I think we really learned a lot from each other.

That's great. And I just want to follow up on something: what are the realities of
disinformation?

Well, for instance, I worked with the spokesperson, who had a pretty robust presence on Twitter. She did regular briefings for the Ukrainian-language news media and was interfacing with international journalists on a regular basis. And given that Ukraine is at war and is resource-strapped compared to its adversary Russia, which of course illegally annexed Crimea and has been funding and fomenting conflict in Ukraine's Donbas region, it was really an uneven playing field. My boss, Mariana Betsa, the spokesperson, on a daily basis was just dealing with the influx of disinformation coming from Russia and attempting to set the record straight, to proactively put out a narrative that was the truth. It wasn't a story; it was the reality of what was going on on the ground, at the front line, and what the ministry was trying to achieve. And she often dealt with a lot of abuse coming from Russian trolls, and from some folks whom experts call useful idiots, who support the Russian narrative. We tried to get through that together, and to use what limited resources the ministry had, again, to advance the truth. It was a difficult, difficult job. But I also think we realized in that year working together, and certainly it was a big revelation for me, that we can't fact-check our way out of a crisis of truth and trust. There needs to be a compelling narrative put forward, but it also depends on governance. When people are vulnerable to these narratives, there's a reason: they might not be seeing the response that they need from their government; they might not be feeling like their voices are heard. Some of that was outside of the Ministry of Foreign Affairs' purview, and that really led to my belief that we need to invest not only in teaching people how to navigate today's information environment, but also in the things that our adversaries exploit, these fissures in our society that leave us more vulnerable to disinformation. It's not just about pushing back, about fact-checking, about playing what I call whack-a-troll. It's also about looking inward and repairing the fissures that our adversaries exploit.

Thank you, Nina. And yes, speaking of looking inward: you've testified several times before Congress on disinformation and its effect on democracy. What do you think are a few ways we can win the information war?

Well, one of the things that I always repeat in my testimonies before Congress is that we need to invest in our people. I've mentioned media and information literacy already. I think we have wasted a lot of time in not starting these generational programs and really making a fulsome investment in teaching not only school-age students but adults, voting-age adults as well, how to navigate today's information environment. I'm not saying that we need to tell them what is true and false; no one wants the government to be the arbiter of truth. But to give them the skills to separate fact from fiction, to evaluate a news source, to have a healthy skepticism, and to return a little bit of democracy and civility to our discourse, I think, is extraordinarily important. So I would love to see programs like that funded through libraries or universities or civil society organizations, in states, in local communities, not just coming from the top down from the federal government. That's one.

Another thing I talk about a lot is investing in public media. The United States only spends a dollar and 35 cents per person per year on our Corporation for Public Broadcasting, which funds NPR, our National Public Radio, and PBS, the Public Broadcasting Service. When you compare that to a lot of other developed democracies, it's actually quite low. But when you look at the trust in PBS and NPR across people from all different political parties, it's much higher than the corporate media, the mainstream cable news and things like that, that a lot of people are getting their news from. And if you look at our allies, like the United Kingdom, for instance, which of course has the
BBC: Britons really trust the BBC in a time of crisis. Something like 68% of Brits trust the BBC. So I would love to see a really robust investment in our public broadcasting, so that people have an authoritative source of news to go to in order to understand the nuance of situations.

And then finally, something I always tell members of Congress when I meet with them is that democracy knows no... or, sorry, disinformation knows no political party. It is a democracy issue. And so when we see elected officials really amplifying and spreading disinformation, that might benefit them in the short term, but in the long term what suffers is our democracy. So we need some top-down education, and leading by example as well. And I hope to see that in the next couple of years as we go forward in our counter-disinformation efforts here in the United States.

Yes, that would be great. So we have limited time, so let's get to some questions from our viewers. In February you made an open offer to mentor early-career women. How do you mentor people you meet online for the first time?

Yeah, this was something that I really felt passionate about. I've been so lucky to have a lot of wonderful women mentors throughout my career, and I found myself having just completed a book and several other large research projects, with a little bit more time on my hands, feeling a little bit lonely in this wintry weather here in Washington. And so, since we're all connecting via Zoom and other online platforms anyway, I decided to put an hour of my time every week out there to early-career women and kind of give them a steer, to talk them through some things that might be bothering them or worrying them. And what we do, basically, is they sign up through a form online, and they send me their resume ahead of time, so I get an idea of where they're coming from. We have a very casual conversation about their goals, the sort of things that they're seeking help with. And if I hear something that sounds like imposter syndrome, or sounds
like people don't exactly know what they want but they're still very early on in their career, I just try to assure everyone that, A, you are exactly where you're meant to be, and that if you've been accepted to a graduate program or just gotten a new job and you're not feeling like you really have the expertise to be where you are, we need to dispense with that. Especially because women tend to really undervalue their expertise when they're applying to jobs: we won't apply to a job if we don't meet all of the criteria for the job, whereas men applying to jobs only need to meet half of the criteria to put their names forward. So I remind them of that.

And then I also remind them that, as much as we all like to plan and game out our careers, if you had asked me four years ago, when I was still in Ukraine, whether I would have written a book, testified three times before Congress, and regularly be appearing in the media of all types right now, I would have told you you're crazy. Sometimes things just work out differently. And if you identify the things that you're passionate about in each job that you have, and try to follow them through your career and prioritize them, then you'll end up in a place where you are doing really fulfilling work, even if it's not necessarily the path that you thought you were going to be on. So: saying yes to opportunities, and really just making everyone understand that we can't plan everything out, and it's okay not to know. I don't really know what I'm going to be doing next month, but that's okay. As long as we're following our passions in our careers, I think having those guiding morals and values really is what has pushed me forward. And I try to make that accessible, and a reminder to everyone that I speak to.

That's really great advice. Yeah, even when I think back to my own career... yeah, I'm not where I thought I would be, but it's good. It's still good. It's all good. Okay, so now we have some more questions. Said asks: what are three indicators of disinformation that an
internet user can look to, to avoid believing and spreading disinformation?

Yeah, that's a great question. It's gotten a little bit more difficult, because we're seeing a lot fewer uses of what we call inauthentic amplifiers, like trolls and bots, and a lot more information laundering, when the source of disinformation is concealed and it's reported in the mainstream media as if it is verified fact. But if you are encountering something that makes you highly emotional, whether that's really upset or joyful, which seems to be in less supply these days, I would say that's a good indication that you're being manipulated for something. Maybe it's for profit; maybe it is for power; maybe it's to get you to click on something. But you might be being manipulated. So in that case, I suggest that we practice what I call informational distancing: when you feel your emotions getting heightened, close your laptop, put down your phone, go for a little bit of a walk. And if after a couple of minutes you still think, okay, there's something interesting there, I want to investigate this further, what you can do is investigate the source. If you're on a website that seems dubious, see if they have contact information. See if the reporter or poster has a history online: just do a quick Google search and see if they've reported anything else. If it's all similarly incendiary, then it's a good indication that, again, something weird is happening.

And then one thing that is a really great indicator, if we're looking at visual material, images: you can do what's called a reverse image search. Just right-click on the image, if you're using the Chrome browser, and it will bring up the history of that image online, so the earliest instance of it. Sometimes it brings up slightly edited versions, and you can see if the image that's being attributed to whatever you're reading about is actually what you're looking at. What we see sometimes, for instance in Ukraine, is images from the Balkan conflict in the 1990s attributed to
the Donbas today in Ukraine. Or if there's a hurricane here in the United States, sometimes there are misattributed images, or edited images that mislead the audience about the level of destruction in a particular place. So that's a really useful skill. It's not only on the Chrome browser; there are a lot of other sites that you can use, like TinEye. And if you are comfortable using Yandex, the Yandex reverse image search, which is the Russian search engine's, is actually quite good. So there are a lot of tools that you can use to really understand what images you're looking at and try to give yourself a little bit more context. But in total, those things don't take very long. It would probably take about 90 seconds just to do that quick sweep of the information that you've encountered.

Okay, that's good advice. We have more disinformation questions for you. I hope I'm pronouncing this right: Volodymyr asks, what are the most efficient approaches to tackle Russian disinformation campaigns?

Well, Volodymyr, I have sad news: I don't think any of them are particularly efficient. And if there's one thing that I've learned from Ukraine and other Central and Eastern European countries, it's that these campaigns have been going on for a long time. In my book, I go back all the way to 2007, in Estonia, as the beginning of the modern online Russian information war. These are generational campaigns; Russia has been at it for a long time. And that means that, unfortunately, we can't just snap our fingers and hope that, you know, taking accounts offline will reduce the broader impact of Russian disinformation. In fact, just this week, or last week, in the United States, our Director of National Intelligence, Avril Haines, released a report on foreign interference and influence campaigns in our 2020 election. And what really stood out in the Russia section was that no longer are we seeing, again, as I mentioned before, bots and trolls and fake posts and ads. Instead, we're seeing this information
laundering. So in this case, we saw some Russia-aligned figures sharing false narratives with key influencers in the US media, who reported on those allegations and brought them to many more ears. This means that, you know, Facebook or Twitter or any other social media platform can't just take accounts offline to stop disinformation. It's about, again, a broader understanding of information literacy. And it also means that we perhaps need to look, here in the United States, at our campaign media laws: about, you know, what is allowed to be shared by active political campaigns. Now, that doesn't cover everyone who shared disinformation in the lead-up to the 2020 election, but it would have perhaps, you know, stemmed some of the impact. And also looking at how that is advertised, what campaigns are allowed to spend money on; I think all of that is important.

And then, more broadly, looking at the regulation of social media. Here in the United States we are just starting to have a really fulsome conversation about that. I know Ukraine has been considering some measures related to the countering of Russian disinformation and domestic disinformation, including most recently a ban of pro-Russian TV channels in Ukraine and the sanctioning of their owners. I think that, you know, in Ukraine's case, obviously Ukraine is at war; there are different stipulations, different considerations to make in regard to those sorts of bans. But I hope that those aren't going to be a forever thing. I think we always need to have some sort of sunset clause on measures like that, because otherwise they might be inherited by another government that will, you know, even more broadly enact such bans in order to silence opponents. And we've seen some disinformation and fake news legislation, including in places like Singapore, where that's been exactly the case. So there's a delicate balance between preserving freedom of speech and countering disinformation, and we need to make sure that there are guardrails on all of
those actions that governments take in particular.

Yes, yeah. Speaking of social media and its role in disinformation: there was a study that just came out that said Facebook left it too late to prevent the flood of misinformation, and it meant 10 billion page views of false narratives as those narratives gained traction ahead of the US election. So, I mean, because Facebook and Twitter, these are all privately owned companies, how do we move forward on the conversation on regulating them?

Well, I think the most important first step is to depoliticize the conversation, which sounds easy to say; it's a lot more difficult in practice. And I think the key to that is transparency about what the platforms are actually doing. Right now we have to take the platforms at their word: reporting on how much they took down, whether it was effective, the measures that they're taking, the way that their algorithms affect people. Then there's another narrative that exists in the media, in certain political circles, that either there's censorship happening or there's not enough happening. And somewhere in the middle of all that is the actual truth. So I would like to see forced transparency for the platforms. I want to know how much hate speech they're taking down, how much targeted harassment, how many reports they get from users and what their response is, how different new mechanisms in the infrastructure of the platform are affecting user behavior, so that we know if these online harms that are starting to proliferate are being amplified by the social media platforms or if they're actually being tamped down. Because they do create, as we saw on January 6th, offline harm as well. As we've seen during the coronavirus pandemic, disinformation has human consequences. So I think we need to understand that a little bit better first.

But there are a lot of other measures that have been floated around, including algorithmic transparency, understanding how the algorithms that show us certain content work; maybe reining in the
level of micro-targeting that the platforms can do, so that they can target you if you're 35 and a woman living in Washington, DC, but they can't target you based on race, or at a very minute level, like down to your street; perhaps just your zip code. Those sorts of things are all on the table, as is antitrust action. President Biden just hired a fairly, I would say, aggressive antitrust lawyer to lead the FTC, who has been a big critic of big tech. So I think all of this stuff is on the table. It's not going to happen instantaneously, but we do have a big responsibility, as the United States, where these companies are headquartered, to make sure that the legislation, the regulation, that we're putting into place has our American values at heart: that it's protecting freedom of speech, protecting the voices of the voiceless. Because social media can be a really powerful democratizing tool; unfortunately, we've gone the other direction. But the regulations that we put in place are going to have repercussions and ripple effects around the world. So we have a huge responsibility as we go forward.

We do, yes. So we have some more questions from our viewers. Kriti says: often there is more than one perspective on a certain piece of data. How do you filter which version is true, and would that filtering unintentionally contribute towards censorship? I guess in this case we're talking about fact-checking on social media.

That's interesting. And absolutely, I think there is an inherent bias that comes with some of these operations, and that's why I think the social media platforms have moved more toward adding context to certain pieces of content, rather than removing them based on perceived veracity or falsity. And so we saw this a lot on Twitter during the lead-up to our 2020 election. Facebook has introduced some similar things as well, where you can click to read for more context about vaccines, or things like this. And that's where I come down. In terms of protecting freedom of expression, it is the
better choice. And it does seem to affect engagement: people are less likely to share things if they're nudged, or have some friction introduced into sharing them. So if something pops up and it says, "This is potentially false," people are less likely to share that. And we are the ones that amplify a lot of the false information; it's not just big influencers, it's normal people sharing it with their friends. So I would like to see more of that, and I'd like to see data about the efficacy of such measures. It's the very beginning of adding that context and kind of attempting to nudge people out of certain behaviors, but I think it's a potentially impactful solution without, again, trying to court any censorship accusations.

Okay. And I want to touch on what you said about normal people sharing information. I believe you had mentioned that you have some relatives that you've tried to, like, you know, let them know what the truth is versus the disinformation. And how do you do that on, like, a case-by-case basis? Like, you know, with your family, with friends, others?

Yeah, it can be really touchy. And I think everyone's first inclination, especially in the online environment that we're in, where we're not gathered around the Thanksgiving dinner table or seeing our relatives and friends as frequently, is that we send them a link to a fact-checking site like Snopes or PolitiFact, or a debunking page on one of the mainstream outlets, and we say, you know, "Hi, Joe, this isn't true." Actually, the research, psychological research and other counter-extremism research, shows that's more likely to make people dig into their positions. So what we have to do is approach this from a very human angle. And, full disclosure, I have not been fully successful with this yet, because it takes prolonged engagement, and the relatives that I've tried this with are on the other side of the country. It's been a long process, and it's difficult to do online. But if you can pick up the phone, or see the friend or
family member in person, and try to understand what has drawn them to that particular piece of disinformation. A lot of people are talking about QAnon right now; many people have been kind of enveloped by that narrative because of caring about things like child trafficking, for instance, which is at the core of the QAnon conspiracy theory. It's the way that they bring people in, in many cases. So unpacking that, and perhaps nudging them toward different resources that are more authoritative, more trustworthy, is a great thing to do. But you have to get to that why: what is their why for seeking this out? In some cases people are lonely; they're searching for community, and they find it in these narratives. So that's extraordinarily important. You have to tread really lightly, and it can be frustrating when you feel like your relative or a loved one is, you know, on the precipice, and they're going to go off a cliff soon. But we have to be patient and empathetic in these situations.

That's good advice. And I think that leads us to our next question, from Asan: does slanting and angling a story to such an extent that it changes the meaning of the whole story cause the story to become a piece of disinformation?

That is a hard question to answer. So the definition of disinformation is false or misleading information shared with malign intent. It depends what the intent of the person sharing the information is, and that's why we see the social media platforms often saying "this is misinformation," not "disinformation," because they don't want to ascribe intent to any of their users without knowing that for sure. I think in many of the cases of the hyperpartisan media that we see in many countries, I would ascribe the disinformation label to it, because it is either the intent of accruing more profit or more power, the two P's that I use very frequently, or supporting certain political movements; that's what I mean by power. And that's really dangerous, again, because the media, as we know, is
supposed to be the fourth estate in our society, meaning it is the fourth pillar, in addition to the executive, the judicial, and the legislative branches of government. So if we don't have a media that we can rely on, that we can trust to give us the unvarnished truth, even if perhaps it goes against the political inclinations of the editorial staff or the journalist in question, then we have a really, really difficult problem on our hands. So, absolutely, there are certain narratives in the media that I would call disinformation, but it comes down to that intent, and that can be a sticky thing to try to discern on your own.

Okay. Yeah, it can be. So it looks like we're just about out of time, but before we go: what would be your tips for fighting the information war? I feel like... I don't know if it's possible to do a quick roundup, but...

Yeah, I mean, I think the best thing, the most important thing to remember, is that we, individuals, are on the front line of the information war. We can bemoan the inaction of social media platforms and our governments, but it comes down to whether people are sharing the content that they see online or whether they're being more discerning. So if you want a little moniker to remember: just take care before you share. We need to do our due diligence and understand our emotional responses to the content that we are being fed by the social media platforms and by disinformers, understand that we are being manipulated, and really fight back against that knee-jerk response. That's my biggest piece of advice, along with the other things that we talked about before: just checking your sources, reverse image searching, these sorts of things. Those heuristics can really lead to a better and more balanced online experience for everyone.

Great, thank you. So: take care before you share, and check your sources. Thank you very much, Nina, for being with us today. And a big thank you to you, our online viewers, for participating. Be sure to keep your calendar open for our
next Mentor Talks, when we chat with internationally renowned artist, entrepreneur, and arts envoy Carla Canales. Carla joins us live on April 20th; don't miss out. Interested in learning more about our awesome exchange alumni? Visit our website at alumni.state.gov, and follow us on LinkedIn, Instagram, and Twitter. Thanks for watching, and see you all on April 20th for our next Mentor Talks.