Secretary General Casas-Zamora, Excellencies, Distinguished Guests, and Friends. My name is Jason LaTorre. Wow, that is bright. My name is Jason LaTorre. I'm Canada's Ambassador to Sweden. I'm really pleased that you've all joined us here today. I'm delighted to welcome you to today's event on disinformation and its impact on democratic processes. As most of you know, disinformation is Canada's thematic priority for our chairship of International IDEA this year. So why did we pick this theme? A strong democracy relies on access to diverse and reliable sources of news and information so that members of society can form opinions, hold governments and individuals to account, and participate in public debate. However, some states and non-state actors are employing disinformation as one of a number of tactics to subvert democracy and erode the rules-based international order. Hostile actors are exploiting divisive or extremist narratives to spread fear, fan far-right extremism, seed division, and amplify polarization. They are peddling conspiracy theories and casting doubt to erode trust in our rules-based institutions, ultimately to weaken democracies and embolden authoritarian regimes. Effective responses to disinformation are needed at different levels and from different players, and can include legislative and regulatory measures by government, industry commitment and action, and civil society initiatives. Canada has stepped up efforts to build societal resilience at home through research, training, and supporting local journalism. We also launched the Plan to Protect Canada's Democracy in 2019 to protect our electoral system against threats. Internationally, we are working with other democracies, civil society, academia, and industry to counter these threats. This includes leading the G7 Rapid Response Mechanism to identify and respond to foreign state-sponsored disinformation.
This mechanism is ramping up engagement with social media platforms and civil society organizations to ensure a more cohesive approach to countering disinformation online. Canada is also tackling this issue through its role as chair of the Freedom Online Coalition. We need stronger cooperation among democracies to address this challenge if we are to protect our values and our interests and rebuild the trust needed to help democracies thrive and survive. Canada wishes to place particular emphasis on the challenges created by disinformation for democratic processes, including participation and representation. We also wish to broaden the conversation at the international level around building societal resiliency to disinformation. So this is why Canada has made disinformation the focus of our chairship of International IDEA. Today's session, which is being live-streamed, brings together a host of stakeholders and offers an opportunity to hear from experts, share views, and have a dialogue. To help us with that goal, we have assembled a panel of distinguished experts, and I am pleased to introduce each of them now. First, we have Mikael Tofvesson, head of operations at the Swedish Psychological Defence Agency. He was previously behind Sweden's efforts to counter foreign interference. He also headed two task forces to protect Sweden's general elections in 2018 and the European Parliament election in 2019 against foreign malign information influence. Second, we've got Serge Blais, executive director of the Information Integrity Lab at the University of Ottawa in Canada. Serge has been with the University of Ottawa for 30 years, where he now directs the Information Integrity Lab, which brings together a cross-disciplinary international network of experts to highlight the actors behind misinformation and provide solutions. And thirdly, online with us today, we have Thin Lei Win, a journalist from Myanmar.
She is currently based in Rome and has nearly 13 years' experience working as an international correspondent for the Thomson Reuters Foundation. In 2015, Thin returned to Myanmar after many years abroad to set up, launch, and manage Myanmar Now, an award-winning bilingual news agency producing in-depth reports on the country's historic elections. You can see that we have an incredible panel based on what I just read out. But not only do we have a highly experienced and impressive panel, we also have an incredible moderator with us today. In this regard, I'm also delighted to introduce the moderator of today's session, Alberto Fernandez. He's a senior program officer at International IDEA. Alberto has carried out research, advocacy, and technical advice projects globally on how technology and democracy relate, both from the regulatory perspective as well as from the impact of technology on the state of democracy. He is also a regular contributor and commentator in diverse media outlets, primarily on topics related to technology, democracy, and online political campaigns, policies, and regulations. At the end of today's session, we are also delighted to have the Secretary General of International IDEA, Kevin Casas-Zamora, provide some closing remarks. So thank you to Alberto and to the panelists for taking the time to join us here today to share your experience and your insight, and thank you to International IDEA for co-hosting today's event with Canada. It is now my pleasure to pass the mic over to Alberto. Thank you, Mr. Ambassador. Thank you, everybody, for joining us. As the Ambassador said, we have an excellent panel here to talk about the topic of disinformation. And I want to thank Canada for organizing this, but also for choosing this particular topic as a priority for their chairship. So just to run briefly through the event: we will be having a conversation with the panelists, and then after that, open the floor for questions from the public.
And then at the end, Kevin Casas-Zamora, Secretary General of International IDEA, will close the event. So, because everybody's here to listen to the panelists, I'm not going to take more than a minute to introduce today's session. And I think it would have been difficult to place an event on disinformation on a better day than today, when all the news has gone a bit crazy about the attempt of the richest man in the world to buy probably the most important social media network on Earth when it comes to politics, Twitter. That is raising the conversation to interesting levels, but not only that. It was also just a few days ago that the European Union passed the Digital Services Act, probably the most important piece of legislation when it comes to the regulation of disinformation online. Not enough, but it's one of the most important steps towards countering the effects of disinformation. So at the European Union level, we will have a lot of rules that will force big tech companies to actually change the way they behave. And this is probably a trendsetter for different countries adopting similar regulations. So with this background, I want to start posing a few questions to our panelists. And I want to start with both of you who are here, Mikael and Serge. Could you please explain, a bit more in detail, what exactly we are referring to when we speak about disinformation? And not only what we are referring to, but also who is behind disinformation? This is not coming from nowhere; somebody is actively pushing this. Why are they doing this? Who is creating disinformation? And what exactly is disinformation? Mikael first? Sure. Oh, come on. Sure. So I would argue that disinformation is a too narrow perspective to use as a definition. Because disinformation is when you use information with a harmful intent.
You want someone to feel or perceive something so that they will do something that you want. You have an intent, and it's harmful. In Sweden, we prefer to talk about foreign malign information influence activities. And by that, I mean that you use information as a deceptive tool with a harmful intent to reach your goals, in other words, to control your environment. And you can actually do that with correct information too. For instance, by influencing the infrastructure through which we get information, you can make sure that your opponent will not be able to come out with a counter-argument, to come out with facts. If you only show one side of the situation, then you can also deceive your target audience. And therefore it can be a bit deceptive to use disinformation as the perspective. Otherwise we see, of course, that in foreign malign information influence, disinformation is one of the tools. Propaganda is another of the tools, but so is any deceptive activity. I just want to add one more thing. When we look at this, and we're a government agency, we're not the agency against disinformation or just deceptive information; it's also about consequences. Disinformation works because we are vulnerable. When we talk about disinformation, that is information that is incorrect. It shouldn't be in our interest to trust it, and it should be really easy to understand that it goes against our interests to be influenced by disinformation. But if you are vulnerable, then someone can exploit this. And this vulnerability in itself will create consequences. We usually talk about disinformation in a package where disinformation will have an effect on the safety and security of our population, the functionality of our society, or our fundamental values: freedom of speech, individual rights, the freedom of the press, and so on.
If it has those effects, then it's something we can't live with. On the other hand, and now I'll stick my head out a bit and say that disinformation is something that everyone conducts; everyone lies. And it's probably a part of what makes our world function. We can't just be 100% truthful to each other. But on the other hand, that is a kind of disinformation that doesn't have negative effects. So we perhaps need to live with that. Thank you. Happy to be here. By the way, Mr. Ambassador, it's always nice to meet our ambassadors and make sure that they work hard, these guys. Happy to be here. Thank you very much for the invitation. Very well articulated. And I would add to that as a continuation of what you said: yes, lies have existed since biblical times, right? And before that. And propaganda has always existed. Many years ago, we used to fly airplanes and drop little paper leaflets on towns and cities so that people would be influenced. We'd spread ideology like that. Propaganda has always existed. Lying has always existed. Errors, of course, have always existed. The new thing now is what I call the virality, the spread, the speed of the spread, which is almost beyond human nature to a certain extent. Now, I know that when the train was invented, the British would say, it doesn't make sense: at 30 miles an hour, our brains are going to burst open. And I feel we're saying the same thing about disinformation these days, because a lot of this disinformation is AI-based, AI-driven, not even spread by human beings, but by machines, by algorithms, by artificial intelligence. And here we are, you know, with our little cauliflower-looking brains trying to counter that. And the work we're doing at the university is specifically that: to look at the causes of disinformation, or re-examine the causes of disinformation, like we've been doing for many, many years, but also to look at ways of countering disinformation.
It's become quite clear that it's beyond human capacity to do that alone, and we have to develop AI-enabled tools. In a moment, I think we'll have a chance to talk a little about, you might have heard of it, the so-called freedom convoy in Ottawa, where 200 trucks, you know, blocked basically the whole parliamentary precinct. And it was obvious, just by seeing the signs on those trucks, that a lot of it was driven by disinformation. And, I don't want to digress too much, stop me after two hours if you get tired, but what happened was that disinformation kind of infiltrated, if you will, a very lawful protest, you know? It was legitimate dissent. People didn't agree with the COVID regulations that the government put forward. The truckers in particular were hit with that. Whether I agree with what they said or not, whether the polarized language that I heard made me comfortable or not, is not even the issue, right? They have the right to dissent. They have the right to their own opinion. And as we embark on this initiative of trying to counter disinformation, we have to remind ourselves several times a day that this is not about censorship. This is not about limiting free speech. Otherwise, we risk creating what we would call a disinformation industry, okay? And already, there is fear that if we use the word disinformation too much, you know, it might lose its impact, its value, okay? Disinformation is something, as you've described, that has ill intent. There's this ill intent, and in some cases, you know, it heads into the area of lawlessness. It's prosecutable. Otherwise, it's just an annoyance, okay? Lying is an annoyance. Spin doctors, political spin, you know, I wouldn't call that disinformation. Errors. Errors are made by newspapers, but they self-correct if they're good newspapers. Anyway, I could go on for another two hours. I'll stop there because we have another guest, of course. Thank you. Thank you, Serge.
This is very interesting, how we get the foreign influence perspective from one side and, with the Canadian truckers, the national perspective from the other. So I want to go online and ask our next speaker, sorry if I mispronounce your name, Thin, about your experience: how disinformation has affected your country. And not only that, it would also be interesting to know, from your perspective on your country, but also with your broad experience in other countries, whether you think disinformation is a symptom or a cause of something else. So the virtual floor, in this case, is yours. Thank you very much, Alberto. Can everybody hear me? Yes. Very well. Okay, great. Thank you. I have my own question, Alberto: how much time do I have? How much time do you need? No, thanks for the question, and good afternoon, everyone. I'm really sorry I can't be there in person. It's Italian bureaucracy, I'm afraid. Look, I guess on a much more somber note, in terms of talking about disinformation and Myanmar, what I can say about its impact is that disinformation, which, as the previous speakers have said, is shared and spread with ill intent, has led to the loss of many, many innocent lives. And surely that is a terrible price to pay. But let's backtrack for just a little bit. I just want to set the scene a little about the time when I was growing up in Myanmar, which was in the 80s and the 90s. At that time, we only had state-run media, which was of course full of propaganda as well as misinformation and disinformation. If we really wanted to know what was going on with the rest of the world, but also within the country, we had to resort to the Burmese services of the BBC World Service or the Voice of America or Radio Free Asia. So obviously you had to do that surreptitiously, or illegally.
And I know, again, that our previous speakers, the experts, have talked about how propaganda and misinformation and disinformation have been around for centuries, and I agree. But I think the kind of atmosphere that I grew up in in Myanmar at that time was just very ripe for rumors of every kind to spread very, very easily. Ripe for both misinformation and disinformation. And then, of course, things changed suddenly in 2014, when the telecoms sector opened up, and suddenly mobile phone SIM cards that used to cost a couple of hundred dollars became $1.50 overnight. We had millions of people in Myanmar coming online, skipping feature phones entirely and going straight to smartphones and 3G and 4G. Just as a brief statistic: around 2014, before the telecoms sector opened up, about 1.2% of people had access to the internet; within five years, more than half were already online, many with multiple phones, and, you know, going on Facebook and social media to share information. There was a sense of hopefulness and expectation at suddenly becoming connected and becoming part of the world. But what it also did, through the internet and social media, was put the propaganda and disinformation and misinformation on steroids. As to whether it's a cause or a symptom, I don't know. I just think it's a vicious cycle, unfortunately, at least in Myanmar. I'll just give you two short examples. In July 2014, so pretty soon after the telecoms sector opened up, there was an unsubstantiated rumor on Facebook that in Mandalay, which is the second largest city in the country, a tea shop owner who was Muslim had raped a Buddhist woman, a Buddhist employee. That led to a riot in that city, with a mob of about 500 people looting and destroying things. And it ended with two people losing their lives, a Buddhist and a Muslim.
The impacts were so much bigger and worse in 2017, when the Myanmar military cracked down on Rohingya Muslims and drove almost three quarters of a million of them across the border into Bangladesh, and also killed thousands and thousands of people. And guess what? Preceding that event, in the years and months before, there was violent speech, as well as hate speech, whatever you want to call it, as well as disinformation about how the Muslims and the Rohingya were taking over the country; the spread of narratives, pretty much all false, but preying on the fear that people have. And I might add that that fear is not specific to Myanmar. They're very, very smart, the people who spread the disinformation, many of whom were linked to the junta. They were playing on the fears that the international community had around the world with the war on terrorism. And they sort of juxtaposed that, tailored it to the conditions in Myanmar, and lit this fire. And what happened was that a huge, huge percentage of the people in Myanmar actually supported that crackdown, because, weirdly, despite knowing that the junta had for years engaged in propaganda, they believed the lies that they were told. I mean, I'm not trying to absolve the people who believed it of blame, but it was amazing to speak even among friends and some of my close relatives about what they heard then and what they believed. And now, seeing the same storyline being played out since the coup last year, they're realizing: oh my gosh, what they told us then, they're now doing to us. So it has been really interesting to see the way disinformation has been used in Myanmar. 'Interesting' is perhaps not the right word; it has had very tragic consequences. And I guess I just want to end this by saying, you know, there are a lot more recent examples since the coup.
I just wanted to give those two very specific ones, and to say that in Myanmar, like in many other countries, disinformation has been used to sow doubt on the democratic process and also to actively encourage violence. Thank you. Thank you very much for this. And I think it's very interesting to see both the international and the national side of things, with what Thin was referring to: these international narratives that were around and were then contextualized to the situation in Myanmar. So I would like to ask both Mikael and Serge about the consequences in your countries. You already touched upon the issue of the Canadian truckers, the famous Canadian truckers, whereas you, Mikael, talked more about foreign influence. So I would be interested in knowing what exactly the influence is in your countries. Well, one pervasive influence, I think, if I look at it, and I think you will probably relate to this in Sweden as well: a good friend of mine was Minister of the Environment in Canada, and I'm not making a political statement, but I think she was one of the best ministers of the environment we've ever had. Minister of Environment and Climate Change, that is. The wrath of disinformation, as I always call it, not just on climate change per se, made her job a lot more difficult, but it also got very personal. The things that were put out about her were unconscionable. So much so that she resigned. She left her position. Now, of course, she will say that there were many other factors, but one of the key factors was the disinformation: the toll that it had on her personal life, on her family, the violent language about her. And that happens, I notice, a lot. And the consequence of that is that we have some very able, experienced, talented people who will not go into politics anymore. Who would do that? Who would expose themselves to an environment that is so toxic and can really destroy your reputation, can really destroy your well-being?
We had that at the University of Ottawa as well. A professor dared to use a word that some of the community did not appreciate. She used it in a scientific context, in a historical context, saying that the word had been used historically to define a certain group, which she thought was an academic expression, an objective observation. The students turned on her within half an hour. They published her personal address. They published her phone number. She also resigned. So there is impact. As we've heard here, when state actors, when states, engage in disinformation, there are literally people dying as a result. We've seen that in Myanmar and elsewhere. But what I see on a personal basis is the toll that it's taking on some very, very competent people who are now just saying, I'm not going to go there. And I'm worried that it's undermining our democracy. So that's one of the ways it undermines our democracy. Yeah, and first of all, I must say that Sweden is a fairly resilient country when it comes to disinformation, and we're thankful for that. At the same time, it has had a long-term effect on our country too, as on most countries. And I would also argue that COVID-19 and the pandemic have really shaped the environment, promoting disinformation, misinformation, and other problems, where polarization and a, how should I say, not-so-nice information environment is one of the effects. And we could see, especially in the last two years with COVID-19, that the people working in communication in the public sector were exhausted, because there was so much hatred going around. And this can also be connected to something we've been seeing for several years. My agency works only with foreign state actors or foreign powers, which could also be non-state actors, not with internal ones.
But we could see in 2015 that certain state actors were trying to create polarization in our society by sowing distrust against the government, against politicians, against the media, and more or less the whole system itself. But eventually, in 2018, we could also see a trend where the foreign state actors didn't produce disinformation themselves anymore regarding these issues. Instead, they were amplifying disinformation and misinformation being produced in our own countries. The same actually went for the U.S. election; we saw that trend also in 2020. And therefore, unfortunately, this might be an effect of foreign disinformation: it has created an environment of distrust. And in that distrust we find polarization, and it starts with hate, and that's the slippery slope to threats. Hate is usually not illegal in Sweden. You can be hateful in your speech, but you cannot threaten someone. And that is actually a debate in Sweden right now. We had a government inquiry regarding this, regarding not hate speech as such, but rather hate as a way of communicating, because it has had an effect on journalists and politicians in our country too. That's a very interesting point of view. I see a lot of differences, but also a lot of commonalities, between the three really diverse cases that we have here. Okay, we have shed light upon a lot of issues, and there are still a lot of questions, but let's try to move on to the solutions part. And of course we're not going to find a solution here, but let's backtrack a little to 2016, when the world suddenly woke up to the issue of disinformation with the Brexit referendum and Trump winning the U.S. elections. Since then, we have had a big impulse of different government actions and legislation, from creating new agencies to passing new laws, some of them questionable in many aspects. So it doesn't seem that they have solved anything.
On the other hand, a lot of researchers actually argue that some of this legislation made it possible that the U.S. 2020 elections, for instance, were not massively influenced by disinformation. So there are a lot of questions here. What I want to ask you, and I would like to start with Thin, is a double question. What do you think is the government's role when the government actually wants to be a positive actor? And how do we make sure that solutions respect human rights? A lot of legislation has been passed, and the case of Southeast Asia is particularly special: nearly every country there has passed, in the last three years, legislation that restricts freedom of expression, with the excuse of fighting fake news. So how do we make sure that we respect human rights with this legislation? And if you have a positive example, that would be excellent. Positive examples, okay. The question just became harder. Absolutely. I guess, just very briefly, to respond to the part about Southeast Asia before I start on Myanmar: the countries that have passed these laws against what they call fake news were always, well, in most cases, intolerant of freedom of speech in the first place. It's just that they couldn't really regulate it in the way that they can now, by using fake news and disinformation as an excuse. So, yeah, that's my little take on Southeast Asia. I look at the neighboring countries to Myanmar that have enacted those laws, and to me it feels like they aren't exactly a response to a worry, but actually an excuse that they have now found to legitimize the way they want to control speech. In terms of talking about how governments should handle this:
Obviously, local national governments should be the first line of responsibility, right, to tackle disinformation. But what do you do when it is actually the government that is spreading the disinformation, which is the case in Myanmar and, I think, in so many countries, even in Europe and in the US? And in Myanmar, that has been the case, and it is still the case. And this comment is going to make me extremely unpopular with my fellow countrymen and women, if it hasn't already: the disinformation wasn't just limited to the military-aligned government. During the reign of the National League for Democracy, Aung San Suu Kyi's government, there was also government-supported disinformation, or at least they did not do anything to actually manage or crack down on disinformation when it was going on. The 2017 pogroms against the Rohingya Muslims happened under the NLD's watch, while all of that disinformation and propaganda and hate speech was going on. They really didn't do anything to rein it in. And then you have very little leeway, right, because you're left only with civil society, who have much less power and resources to tackle this. Having said that, there have been two campaigns that I really respect and that I thought were really interesting, which came out during those times. One was called Panzagar, which translates into English as 'flower speech', but which essentially just says: we do not show hatred to each other. And obviously, as many of you probably know, Facebook is the equivalent of the internet in Myanmar. So that was a campaign that focused quite a lot on disinformation and hate speech in Myanmar, essentially getting people to understand and share.
And, you know, there's the whole debate around the silent majority, who probably do not like to see this going on but are too scared to get into these vitriolic fights, even if it is just online, with people who might be bots or might just be agents provocateurs wanting to pick a fight. But it was a way for people, instead of getting into arguments, just to say: no, we do not support speech that has ill intent. And that was fairly successful, and I think people were really interested. Another campaign, which has a website but is very active on Facebook in Myanmar, is a page called Think Before You Trust. What it does, and it is still ongoing right now, is actually look at news, whether it's viral social media posts or news from the state-run media or even independent media, and essentially fact-check them and say: nope, this is not true; this is true. Unfortunately, it does not have the same platform or presence as the ones that are sharing disinformation. And again, it's small scale. But I thought it's still a really good way for civil society to take back some form of control. There also used to be an organization in Myanmar called MIDO, which ran the Myanmar Tech Accountability Network, and they were very, very good at essentially getting social media platforms like Facebook to take responsibility and to moderate their content. Unfortunately, since the coup, these organizations are no longer working or existing, or have gone underground; they just cannot operate anymore. So again, we have this vacuum in Myanmar at this point in time, where, when you probably need reliable information the most, you really have to go and find it, because you don't know where the information is coming from, who it's coming from, and what the intent behind it is.
I guess one final point I want to make is about governments internationally, particularly in developed nations, and particularly those that are home to the big social media platforms like Google and Facebook, but also now newer ones, Telegram and TikTok. Okay, some of them are perhaps hosted in countries where we know democratic principles might not be the most important thing, but I think governments in countries where these big social media platforms are based or operate also have a responsibility. Because one really interesting thing that I discovered when I was speaking to tech experts who've been looking at Myanmar is that a lot of the disinformation in Myanmar, quite a substantial amount, I don't want to say the majority, doesn't come from political actors, but from people in other countries who make money from viral content. They're not necessarily disinformation-for-hire; they're not being paid to do it. What they do is make money off the algorithms on Facebook and Google, through the ads, by repackaging content that is not true but is viral. And they get paid a lot of money because of the way these international tech giants' incentive structures are created, and I think the platforms really need to take responsibility for that. Because these people are not political actors; they're opportunists. There are thousands and thousands of entrepreneurs who don't care at all about the content but are more interested in the money they can make off YouTube and Facebook and other platforms. Thank you. Yeah, that's an incredibly important point, to look at the money side of things. Serge, please. Well, we're Canadians, so we're optimistic. We're utopians. We think the world will end in a good way when it ends. The ending will be good. You know, we're Canadians. There's a good movie playing right now. It's called Lost Illusions.
It's based on a book by Honoré de Balzac, and it's set in 1789, so, you know, half true, half not true, romanticized, but it's a very, very good movie, because when I saw it I said, oh my God, it's about disinformation. Seriously, up until 1789, and correct me, I'm not a professional reporter, so correct me if I say stupid things, but up until then all newspapers were rags. All newspapers were, as they call it in the movie, bullshit vectors, OK? It was all trash, it was all opinion, and people expected that from newspapers. At one point a small group of people said, what if we did something different? What if we had a differentiated product? What if we published something that is only truth, and if there's a mistake, we'll correct it? We will have editors who look at this. We will challenge the assumptions before publishing them. What if we did that? And everybody, of course, laughed them away. They said, you know, there's no market for that. There's no market for truth. You'll never make a living out of that. Well, it just so happened that it worked. And since then, a few newspapers have stuck to that ideal of saying, let's be truthful, and let's admit that we make mistakes sometimes. Mind you, there were maybe twelve reporters in Paris at the time, so it was fairly easy for them to coalesce around an idea and say, let's stick to that and let's make that our business model. So I'm hoping, in my utopian Canadian kind of way, that there might be a truth network at some point, not what Trump calls his Truth social media, I think we should use another term for it, but a space where people agree that if they are going to communicate on that platform, they will have their ideas challenged. If people notice that something is wrong, instead of putting a little like, they'll click a little "this is not true", "I don't think this is true", or "this needs validation". Think of the Wikipedia page.
As a university guy, I shouldn't even talk about Wikipedia, but it's becoming something that is fairly decently truthful. And why did Wikipedia not go the way of Twitter and TikTok and all that? Well, it's because people are encouraged to point out errors, and collectively people kind of develop truth. And that's how democracies have always worked, right? Democracies develop through a consensus around norms, around rules, around conventions, and of course it slows things down. People will say, well, you know, freedom is a much better word than control. Well, I'm not sure. I think we need to challenge that sometimes, because when newspapers decided they would control themselves, they became better newspapers. Stories might take a little bit more time to get published, but they're truthful. The problem with speedy publication is that it is a breeding ground for error, if not disinformation. Mikael? Yes, and I've been writing things down. It's really interesting to hear these things, and it has got me thinking about my own perspective. I realize I need to explain how I think, because I'm perhaps the odd man out here in the room. We work with foreign threats. And why do we do this? Our job is to identify and counter foreign malign information influence targeting Sweden, not internal influence. And that is for a reason: foreign states and foreign powers are threats to our system, while our population, and the media that are also spreading disinformation, we perceive as vulnerabilities. And why do I say that? Well, first of all, foreign state actors are not a part of our democratic system; they are trying to be a part of it and trying to influence it. Our population, on the other hand, and international free media, and anyone else who is expressing their opinion, are a part of our democratic process. And therefore we need different toolboxes for these different actors.
And the next point follows from this: if we have freedom of speech, that actually implies the right to be wrong when you're using that freedom. Which means that you have to let everyone use it. Eventually, you will figure out that the person who was wrong turned out to be right, or vice versa. So that is something we need to protect. And I'm going to read something out from the instruction for our agency. I'll skip the boring part and go straight to the most important thing: the agency shall, through its activities, safeguard an open and democratic society and the free formation of opinion. That is one of our tasks. And that in itself implies that an open, democratic society and the free formation of opinion are the basic foundation of a psychological defence. Which means that our government and our parliament have told us: if you protect our democratic system and our freedom of speech, then we will be resilient against disinformation. So, to how we tackle this, because you wanted the solution, and of course we're in Sweden, so you will get the solution. I started by saying that disinformation only works because we're vulnerable. And when we looked at these vulnerabilities, at what it is that makes disinformation have an effect on us, we realised that we are lacking some things. First of all, we lack the awareness that there is a threat. If we know that there is a threat, then we're on our toes. The second thing we lack is awareness of our own vulnerabilities: what is it that makes us get controlled by someone's ideas, expressions and so on? The third one is a lack of knowledge of our own system, our own society, how it works, so we have nothing to measure the disinformation against.
And then there is also the lack of coordination and cooperation: helping each other, warning each other when there is disinformation going around and when vulnerable groups are being subjected to it, but also helping each other to counter it. Therefore we also need a toolbox; we need to know how to act when we see disinformation. And it's not only source-critical thinking, it's also how you yourself operate in the information environment. Then there are some things that can be exploited that you also need to work on. One is of course trust in our system: a lack of trust in your own system is something that can be exploited. And also, as Tin mentioned here, fear; fear and anger are emotions that are exploitable. So if we are going to build resilience against disinformation, we need to address these issues to create resilience in our society. The government's role is to support civil society, our media and any institution working with the democratic process and freedom of speech, and to enable these actors to, how should I say, boost the already existing, well-working democratic processes that we have in many countries. The other role of the government is to protect against qualified external threats, and to look at the internal problems as vulnerabilities, not threats. Thank you. This is a very good solution. Not easy to implement, but a good one. So, I have had the luxury of posing questions to the panelists, but I want to share that luxury with everybody and open the floor for questions. There is a microphone there. And I have a question: my name is Rauno Merisaari, I work as Ambassador for Human Rights and Democracy in Helsinki, for our ministry. Thank you, Canada, for organizing this; this topic is also among our priorities.
The Freedom Online Coalition was mentioned; Canada chairs it this year and Finland did last year, and last year the Coalition issued a statement on the spreading of online disinformation and human rights. A very good statement, I must say. Two comments or questions based on that work. The first one is about the role of states. We discussed very much how far we should focus on state-sponsored disinformation. Our experience in Finland is that when we are on the receiving end of disinformation campaigns, more or less it has something to do with the so-called troll factories in St. Petersburg, which operate more or less with the consent of the Kremlin. So should we focus on combating state-sponsored disinformation, or also look at the other sources: extreme political movements, criminal organizations acting for economic gain, and so on? My second remark or question is about data. We discussed, for instance, how much we know about how much disinformation is targeted at women in particular, and we realized that there is not much of that kind of specific data. What kind of data do we need more of, to get better at combating disinformation? Thank you. Thank you. Maybe we take a couple of questions. Alistair. Hi there, Alistair Scrutton, Head of Communications at IDEA. Could I just ask each of the panelists: imagine that it's not Elon Musk but you who's in charge of Twitter for one day. What would be the one reform you'd make to Twitter to combat disinformation? Thanks. An easy one. Do we have another question? Or maybe we can give the floor to the panelists. Well, there are three questions. You can try to answer all of them or just focus on one or two. I will leave the tough question to Mikael. If I had $33 billion, which is 33 times the budget of the University of Ottawa, I think, then, again citing Wikipedia, it would come from the base. But first, to take a step back, I'm a little wary about governments controlling content.
It's almost like telling phone companies or the postal service: you will control the content of communications. And it's a slippery slope; we know that some governments will use the pretext of disinformation to clamp down on the flow of information. But if it comes from the base, and I'm not calling for revolution, if people have a chance and an easy mechanism to point out disinformation, they will, much like they did with the Wikipedia example. So the very first thing I would do is put a button on there that says: I'm not sure this is true. We're working with a very, very smart young student, an anthropologist actually, not an IT guy at all, and what he has developed is a system whereby when you go on YouTube and watch a video and you don't think it's true, you'd like to contest it, you'd like to counter that narrative, you can attach your own video to that video's URL, whether YouTube likes it or not, I mean, a URL is public, so that there's a counter-narrative. It may play in the other direction too, of course: a truthful video might be countered by a false video. But overall, with public pressure, if we can call it that, truth will typically come out ahead of lies. That would be my first move on Twitter. But the tough question, of course, Mikael will answer. Or will I? So, when it comes to data collection, first of all, one should be very careful about that method in itself. Why do we collect data? What are we going to do with it? I'll say a bit about what we do at our agency by saying what we don't do. We don't go into social media to find disinformation and then see if there's a foreign state actor behind it, because if we did that, we would risk registering our own population as they use their freedom of speech and their anonymity. Instead, we monitor state-sponsored and state-directed resources.
For instance, Glavset, the Internet Research Agency: they are fully directed by the Russian presidential office, a team in St. Petersburg. And we try to monitor what they are trying to message, to understand how they are trying to influence our population. Actually, the most important information to put out is to our own population, and that means having a strong narrative about what's really going on. One of the recommendations we have for agencies, authorities and municipalities when they're being subjected to a disinformation campaign is transparency, really communicating it out there. Be transparent about what's really going on, because what the aggressors are doing is filling the information environment with their disinformation, and out there is our population. Therefore we need to fill the information environment with correct information and make it available. And I do believe, I'm almost as positive as our Canadian friends here, that we live in a good environment where people don't want to be fooled, where they want to have the right information. So transparency is one thing. The other thing is education: through a high level of education and media literacy, you gain trust. In Sweden we're lucky, we have a fairly high level of trust, and if you have that, you are very resilient. And if they continue to attack you anyway, then, and Finland and Sweden actually share the same handbook for countering disinformation, we have a three-level response. The first level is just to be aware and not make a fuss of it, because you'll probably help the aggressor. The second is to design your communication so you reach your population and they get the correct information and the facts about the situation.
And if that doesn't work, then it's time to go against the actors, to expose them and their activities, but that's the messy stuff, and that's not the place where you want to be. Thank you. Tin. Yeah, I'll tackle the easy one. Unfortunately, my answer is not going to be very different from what Serge said. The first thing that came to my head was: OK, I would probably make it easier to report disinformation and abuse. I say this because just a couple of weeks ago I tried to report a tweet that was clearly not just propaganda or misinformation but actually disinformation, because I thought there was ill intent behind it. I can't remember exactly what it was, but it was actually wishing for somebody to be hurt. And it just didn't fit into the selections that Twitter gave me. I was like, oh, that's not spam, that's not a promotion, it's harmful but in a slightly different way. I think the way it's being done at the moment is so narrow, and I know that some of my colleagues in Myanmar, as well as outside of the country, sometimes just give up and don't report, because it's just so much more difficult to let Twitter know that a tweet is obviously wrong or is disinformation. I mean, I was born and raised in Myanmar and I still write a lot about Myanmar, but my other hat is being a specialist journalist covering food systems and climate change. And we have the same kind of problem Serge mentioned about one of the Canadian politicians: when it comes to climate and food issues on Twitter, there's so much disinformation.
I would just like it to be, one day perhaps, so much easier to identify, report and take action against intentional disinformation. Yes. Do we have more questions from the public? Yes. Yeah. I'm head of communications at the Swedish Society for Nature Conservation, the largest environmental organization in Sweden. You mentioned civil society, and you mentioned climate change and the Minister of Environment. I would say that these issues, environmental issues, climate change in particular, but also others, are often targeted; this is my experience, of course. And I would say that we could hire lots of staff working full time only on responding to various lunatic facts and allegations about what we're doing and what is happening. I'm a bit curious whether you have any advice to us as civil society, as an NGO, especially considering that we have an election coming up as part of the democratic system. Thanks. We have another question. Thank you. I'm the executive director of Civil Rights Defenders. Now that we're starting to get COVID in the rear-view mirror: what are the major learnings, from an academic point of view maybe, when it comes to countering the disinformation around COVID, which we've seen used as a pretext to curb civil and political rights around the world to quite a large extent? What have we learned from that that we could bring into the future? Thank you. We have one more question. Thanks. My name is Kimana. I work for the Constitution-Building Programme of International IDEA. My question is more related to grey zones. I assume, or I know, that there are so many gaps in information, and there's a lot of play with possible truths. To what extent does the whole field of disinformation also deal with those grey zones? Of course, transparency is positive, but transparency can also not be absolute.
Well, there are these games in terms of playing with those gaps and those grey zones. As I said, thanks. Thank you. So we have three questions. I will ask you to try to be concise in your answers so we don't overstep our time limit. Who wants to go first? OK, just turn off my microphone if I go on too long. On the COVID counter-narrative: you know, in my own family I have COVID deniers. I go up to see my dad and he says, you know, this COVID thing, it's overblown, and all that. People who disseminate disinformation are not necessarily bad people. My dad is not a bad person, but he lives in an echo chamber. And echo chambers get created, and this answers the other question, I think, where people speak among themselves, like we're doing here, and people have, let's call them heroes, if you will, people that they respect. It could be a sports figure, it could be a politician, whatever. And they share the same views on things, which after a while becomes fact for them, because there is no counter-fact. The echo chambers are so strong, so airtight, that not only will they not tolerate dissent from outside, they will not even tolerate dissent from inside. If someone tries to be a little bit moderate and says, yeah, but, or, yeah, perhaps, or, yes, there's a grey zone, that person will be ejected, excommunicated on the spot. So there's no room for moderate speech; it's either black or white. And yet we know that in general any topic is grey, not black or white. And so people disseminate things that they believe are true, because they heard them from people they respect. So as to the counter-narrative: how do we counter the narrative? We can't counter it with facts. I can't give my dad a long list of facts; he doesn't relate to that.
But if his favorite sports legend, his favorite hockey player, were to come and say, no, no, no, this is real stuff, listen to me, this is real stuff, that is likely to have an impact. So let's remember the echo chambers: the people in the echo chambers, the participants or the members, if you will, are not bad. The instigators are, but the people within those rooms are not necessarily bad. It's an interesting point. Yeah, and I'll fill in there and also answer the question regarding civil society. First of all, don't spend time countering disinformation; you will only be talking about the subject that the people spreading the disinformation want to talk about, and so you will probably just be helping them. Instead, have your own strong narrative, together with facts to back it up, because that can kill the disinformation. If you get there first, it will kill the disinformation; that is what creates resilience. Connected to that, some of the things we've learned, and I worked for two years with disinformation and COVID-19, so there are many things to be learned there. First of all, if you have a strong narrative, you have to be humble about the facts, because facts apparently evolve: you know one thing one day, and suddenly the next day it wasn't really correct. So be humble when you communicate what you know today, and prepare for tomorrow, when the knowledge has changed. And you can almost never change an extremist; you're not going to reach them. What we saw was that people who are vulnerable to conspiracy theories, if you're vulnerable to one, you are vulnerable to others. So that is a group you will probably not reach with facts; they want to live in that environment, for many sad reasons.
And I've spent hours talking to conspiracy theorists calling me, telling me what's going on and not going on and so on. It's been interesting; there are a lot of sad stories behind it, actually. But what you should do instead is focus on the people around them. Even if you're in a dialogue with a conspiracy theorist, don't try to persuade them; try to persuade the other people in that discussion, the ones who haven't decided yet what to believe and what not to believe. So think about that when you communicate: don't embarrass your opponent, if you have a discussion, and put the facts out there that will make the people on the edge lean towards your side. I'll stop there. Thank you. Yeah, I guess two very brief points, both touching on what Serge and Mikael have said. One, in terms of civil society: I don't have a lot of advice, but I would like to point you to a really interesting report that I came across at the end of last year. It's called "How to Talk About Civic Space: A Guide for Progressive Civil Society Facing Smear Campaigns", and you can find it on Citizens for Europe. It's essentially done by a group that advises civil society organizations working on things like social justice and environmental protection on how to respond to smear campaigns. I thought it was really interesting and useful, and I see a lot of what Mikael said just now, similar tips, in that report as well. They have really good, concrete examples of what steps you can take, so hopefully that will be useful.
And the second point, just to reiterate what the two speakers already said: I think that both in our personal and our professional lives, we can play a role in engaging, not necessarily with the Alex Joneses of Infowars, but with the people who listen to them, some of whom may be our friends and family members. I distinctly remember coming back from Rakhine State after visiting the Rohingya camps, with people who have been displaced, and coming back home, and one of the family members was saying, oh, but you know they burned down their own homes, so why are we supporting them, why are you feeling sorry for them? Normally I would just shut down the conversation, I would just leave, because I was at the point where I could no longer deal with all this fake news coming at me. But I decided, because I love them, and I know that they love me and trust me, to sit down and say: OK, let's talk about this. Where did you hear that they burned down their own homes? Oh, I heard it on MRTV, which is the Myanmar state television. I said, OK, how do you know it's true? It was on the news, they said, this is what happened. And I said, OK, since when did you ever trust news from MRTV? And they said, well, we never did, it's propaganda. And I said, then why are you trusting them now? And they were like, oh, you know. And I said, you know me, you trust me, you love me, I have just come back, this is what I have seen, these are the people. And you know, it took a while, and I have to be in the right frame of mind to do this, but I have tried to do my best with people I care about, people I know trust me. So both professionally and personally, if we can do it, I think that is one small way of combating disinformation. Thank you.
Thank you very much. That was a stream of interesting things. I want to thank all the panelists for their incredible contributions, but also because I haven't needed to tell them to shut up, which is a first in my life. And now I have the honor of ceding my seat to Kevin. You need a microphone, I think. So sit here, please. OK. Well, thank you so much. I mean, that was phenomenal, and I really mean it. I want to thank the panelists, and I want to thank Canada for convening this discussion with the support of International IDEA. This is, of course, one of the challenges of our time, and it has become evident in the course of this day of discussions that we are all following it very, very closely. And, you know, listening to this panel, one thing kept ringing in my head, which was the question posed by Tin, I think, as to whether disinformation is a cause or a symptom. The easy answer is to say that it is both, and I think that's true. Yet I have the nagging feeling that it's not simply the availability of technology that is creating the problem. It is not simply the fact that we have technologies that viralize the bad information we see out there. And here, and this connects to some of the issues raised by Mikael, comes the question of vulnerabilities. Why is it that we are seeing the spread of disinformation the way we are just now? Again, it's not simply a question of the availability of technologies. There are several pieces to this answer, and here I will speculate. Part of the reason why we're seeing this proliferation of disinformation is that trust is in short supply, interpersonal trust and trust in institutions, and that is something that predates the arrival of social media by a long margin.
I mean, when you look at the global opinion polls, you can see that in the West, which is the place we have the most data for, trust has been declining almost continuously since the 1950s. Why? Well, there are many reasons. One reason, I suspect, is that the media landscape had changed dramatically before social media came along, and it had changed in ways, with the 24-hour news cycle and cable television and this kind of thing, such that we all of a sudden found ourselves in an environment in which the flaws of democracy were easier to uncover, thereby undermining trust. We found ourselves in a place where leaders and institutions were cut down to size. You know, that old quip that no man is a hero to his valet applies nowadays to our leaders and to our institutions: we are all too aware of the flaws that afflict them. And one of my mentors used to use this wonderful phrase: when you don't believe in anybody is when you're most willing to believe anything. And that's exactly what's happening. Add to this toxic brew the fact that the world has become much more complex, much more volatile, much more difficult to understand, in a way that leaves us with a sense of dislocation and a sense of loss of control over our fate. And that's the moment when conspiracy theories and scapegoating become useful devices for understanding a reality that defeats us. If you're an angry person who has seen your economic prospects diminish, it's much more difficult to understand the workings of globalization, of the financial system, that end up with a merger of companies that leads to your dismissal. It's much more difficult to understand all that than to embrace the great replacement theory and blame immigrants. And of course, countering lies and conspiracies with truth is very difficult to do, for exactly the reason that you were saying.
Because lies are just much more attractive. The truth is bound by reality; for lies and conspiracies, the sky is the limit. So what I'm trying to get at is that the factors behind and underneath this discussion about disinformation are very, very structural. If we focus on regulating the technology or the messages, we will not solve the problem; something else will come along. We need to deal with the underlying issues: the lack of trust, the sense of dislocation and loss of control over our fate. And that poses, of course, very thorny, very uncomfortable questions about what we need to do about corruption, what we need to do about obscene levels of inequality, what we need to do about the breakdown of social contracts all over the place, what we need to do to change this idea, prevalent in many places, that we simply accept that there are losers in our economic and political structures and that's part of their lot. Well, no. I happen to be an optimist about globalization, right? And I do believe that, all things considered, there are more winners than losers in globalization, but there are a hell of a lot of losers. The practical implication of this is that, for all the problems they may have, societies that are better integrated, where everybody has a dignified place under the sun, where institutions and authorities are accountable, do better when it comes to disinformation, and do better in general. So let's not kid ourselves believing that there are quick fixes to disinformation or to the challenges of democracy. There are none. We should acknowledge this and act accordingly, and that is what this discussion is about: how we act together, and with a sense of urgency, before disinformation, and the issues that it preys upon, destroy the democratic project, which is one of the pinnacles of the human adventure.
We do have a collective responsibility, and at least in my case, I thank Canada and the panelists for reminding me of that responsibility. Thank you. I think with that we close. Thank you to Canada and to all the speakers. Thank you. Thank you, also online. Thanks for having me. Have a good evening, everyone. Bye-bye.