Hi. On today's episode, we explore how influential users on Twitter engage in the spread of different types of deceiving content. Our guest today, Puck Guldemond from Wageningen University and Research in the Netherlands, and her colleagues have conducted research that puts these so-called deceitful opinion leaders at the center, instead of the content they spread. How do they influence the behavior and attitudes of their followers? How can a small group of deceitful opinion leaders change the norms of conversation on social media platforms? I'm Rodrigo Silva, and today, let's talk about politics and governance.

Hi, Puck, welcome to our episode.

Hi, thank you. Thank you for having me.

Risking a question that is probably too obvious: why is this topic important?

Well, I think it's a good question, because it has implications for people using Twitter or other social media platforms. We know that people use social media not only to engage with their friends, but also to learn about the news or about politics. But unfortunately, not all the information that can be found on social media is true. There's actually quite an amount of deceiving information on certain social media platforms. We call the accounts that spread that deceiving information, and the people behind them, deceitful opinion leaders, or DOLs for short. So we know that there's a lot of misinformation and deceiving information on social media platforms, but we don't know whether that information also has an influence on the people who use those platforms. Speaking for you and me: if we follow someone, does that actually influence our individual behavior? We don't know if this has consequences. But at the same time, while we were conducting this study, certain social media accounts were being shut down or banned from, for instance, Twitter.
So yeah, it's kind of funny, because on the one hand, we don't know what impact spreading deceiving information has, but on the other hand, social media platforms are making decisions based on something they don't know. So it's very important to get to the bottom of this.

Absolutely. And when you and your colleagues started your research, what were you hoping to find? What was the research gap?

The research gap is actually twofold. First, we argue that instead of looking at the content that is being spread, we should look at the people spreading it. We use "deceiving information" as an umbrella term: it covers conspiracy theories, rumors, misinformation, and fake news, so it's really a bridging term. Instead of asking whether a given piece of information is fake news or a conspiracy theory, we think we should look at the people who spread it. Because someone behind an account that spreads deceiving information doesn't think, before sending a tweet, "Hey, should I spread fake news today, or should I spread a conspiracy theory?" So instead of labeling these types of information, we should look at which accounts spread this information. And secondly, I already mentioned the individual-level effects: it's also very important that we look not only at the accounts that spread this information, but also at the effects on other social media users.

Absolutely. Very promising expectations. So let us know now about the findings of your article.

We have two main findings, because we also have two research gaps, or two questions, that we tap into. The first finding is that these deceitful opinion leaders do indeed spread all sorts of information, though sometimes they have one or two preferred sorts of information that they spread.
For instance, when someone is really engaged in journalism, you see that they spread a lot of fake news: blogs or pieces that look like an article and have a journalistic format, but whose content is not really true. So we found that they do spread all sorts of information, and that it is justified not to focus on the content they spread, but to put these accounts at the center of attention when conducting research. The other thing we found is that following deceitful opinion leaders does indeed have individual-level effects on other social media users. We had three hypotheses: that people would use more uncivil language, would become more affectively polarized, and would talk more about politics than they did before following a deceitful opinion leader. It's important to remember that we only included people in our study who had newly started following one of the deceitful opinion leaders in our set, not all eight, because then you would get mixed effects. We looked at the 30 days before people started following a deceitful opinion leader and compared that to the 30 days after. And all our hypotheses were confirmed: people talk more about politics, they become more affectively polarized, and they use more uncivil language. So there is an individual-level effect.

Of course. So all the hypotheses were confirmed. Can you let us know now how these findings can have an impact, in terms of public policies or individual choices?

Yes, I think it's very important for social media companies to be aware of this. As we're recording this episode, a lot has changed, especially on Twitter, compared to when we conducted the study, because now Elon Musk is the boss of Twitter and he's making some interesting decisions. So social media accounts that were purged can now come back on Twitter.
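As an aside for listeners who like to see methods concretely: the within-subject design described above (comparing each new follower's behavior in the 30 days before versus the 30 days after they started following a deceitful opinion leader) can be sketched roughly as follows. This is not the authors' code; the field names and the toy "incivility" score are hypothetical illustrations.

```python
# Minimal sketch (not the authors' code) of a before/after comparison:
# for one user, average a per-tweet score over the 30 days before they
# followed a DOL and the 30 days after, then take the difference.
from datetime import date, timedelta
from statistics import mean

def pre_post_change(tweets, follow_date, window_days=30):
    """tweets: list of (date, score) pairs for one user.
    Returns (pre_mean, post_mean, change) over the windows around
    follow_date, or None if either window has no tweets."""
    pre = [s for d, s in tweets
           if follow_date - timedelta(days=window_days) <= d < follow_date]
    post = [s for d, s in tweets
            if follow_date < d <= follow_date + timedelta(days=window_days)]
    if not pre or not post:
        return None
    pre_m, post_m = mean(pre), mean(post)
    return pre_m, post_m, post_m - pre_m

# Toy example: hypothetical incivility scores per tweet around a follow date.
follow = date(2021, 3, 1)
tweets = [(date(2021, 2, 10), 0.1), (date(2021, 2, 20), 0.2),
          (date(2021, 3, 10), 0.4), (date(2021, 3, 20), 0.6)]
print(pre_post_change(tweets, follow))  # pre ~0.15, post ~0.5
```

In the actual study this kind of comparison would be run per user and per outcome (political talk, incivility, affective polarization) and then tested statistically across all new followers; the sketch only shows the windowing logic.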
There are two sides to this. On the one hand, I think it's good that people can come back on Twitter, because when you exclude people from your social media platform, they will probably just move to another platform, like Telegram or something else. And when we talk about polarization and about seeing information other than what you already believe, it's not a good thing if people are being purged from Twitter, because I worry that other social media platforms then become more of an echo chamber of what people already believe, where they only see confirmation of their existing beliefs. I think we should be very careful about that. But on the other hand, we do see that there are individual-level effects. So it's a difficult decision: you don't want to force everyone with different views off your platform, but you also need to be aware of the effects they have on other social media users.

Absolutely, and a very good note: your research was published at the end of December 2022, and we are recording this at the end of January 2023. So even in a month, there have been changes that affect the real-life impact of your research. You mentioned the research gaps before, so can you tell us a bit about the ones remaining in your research? What's next?

What's next for our research? Well, we would really like to see this study done again in different countries, because we studied the Dutch public sphere, the Dutch Twittersphere.
We used this paper as a proof of concept for the deceitful opinion leaders, to show other researchers that we need to look at these influential social media accounts that spread deceiving information, and that we shouldn't worry so much about exactly what content they spread, but more generally about the consequences these people have on social media. So we would like to see more deceitful opinion leaders in the data that we collect, and we would also like to see the study done in other countries, because this one focuses only on the Netherlands.

Of course, still a lot to find in terms of geography and profiles for the future. Can you provide some additional resources for our listeners about the topic we are discussing today?

Yes. There was an episode of the podcast Moderated Content on the 16th of January 2023, where Joshua Tucker discussed a study by himself and colleagues. It's about Russian foreign influence in the elections and changes in attitudes, polarization, and voting behavior in the United States. I think that's very interesting because it also talks about the influence of bots. So it's different from what we talk about, but I think it falls under the same worry, that accounts spreading misinformation have huge impacts, and that study shows different effects. So that might be very interesting. There's also a PhD student at KU Leuven, Darian Harff, who is studying what he calls social media influencers that spread misinformation, looking at the impact specifically on youth. I think that's also very interesting: what kind of impact does it have on younger people? And then my colleagues Lotte Schijfer and Emma van de Goods at Wageningen University and the University of Amsterdam are working on detecting misinformation on social media.
This is of course in very early stages, but if we were able to detect that, and have classifiers that put banners on information saying "Look out, this is misinformation or deceiving information," that would be very helpful, because there are a lot of studies on the effects of fact-checking information. So I think it would be very interesting if we get there.

Absolutely. I always like to end our episodes with this question: if there is anything from this conversation that you want our audience to remember, what would it be? What's the punchline?

Ooh, punchline. Oh, I have to keep it short then. I'll try. I think it's very important, if you're following people on any social media platform, to keep thinking for yourself. So the punchline would be: think for yourself. And what I mean by that is, like I said, not all information is true, and a lot of the time, information that is not true is not spread with deceitful intentions. So someone who is a friend or family member, or whom you consider a trustworthy source, might sometimes also spread information that is not actually true. So it's very important to think for yourself, check information, and keep learning. I think that's very important.

Straight to the point. This episode is available on the Let's Talk About Politics and Governance website, on Cogitatio's YouTube channel, as well as in podcast directories such as Spotify, Google Podcasts, and Apple Podcasts. Puck, thank you for being here.

Thank you. Thank you.