Yeah, and thank you for the introduction, Yunus. Today I'm going to be talking about the psychology of accepting and sharing misinformation. Topics in this presentation include what misinformation is and how it differs from related terms. Next, how do we assess what is true? I will present the basic psychological process of how we relate to information that is presented to us and how we engage with it. In addition to the basic psychological processing of information, there are individual differences in how we relate to information and how we judge what is believable; that is what makes some of us more susceptible to accepting and sharing misinformation. Lastly, I'm going to briefly talk about some of the potential consequences of believing in and acting upon misinformation.

Now, this was partly covered in the previous panel, but as you can see from this overview, there are several terms and concepts related to false information. Some of these have clear definitions and meanings, like content marketing, whilst others are hard to define, like fake news. However, two aspects of information are generally used to differentiate between most of them. First, the level of facticity: is there something true about it, or is it complete nonsense? And second, is it created and shared with an intention to deceive, profit, or otherwise cause harm? Here's a quick illustration of how one can think of different kinds of false information. On the vertical axis you can see the level of facticity, whilst on the horizontal axis is the intention to deceive. Even though several terms related to false information are used within research, misinformation seems to be the preferred one. So what is misinformation? Misinformation is information that is inadvertently false and is shared without the intent to cause harm. Moreover, it has the capacity to spread through society, particularly on social media.
The aspect of intentionality is important because people who share false information, especially on social media, don't necessarily do it to intentionally deceive or harm anybody. However, the creator of the shared information might have an intention to do so; in that case, it is referred to as disinformation. Now, most psychological research treats false information as misinformation, and that's because when we survey people's social media behavior, for instance, it's hard to determine whether they share false information with an intention to deceive. So it's better to give them the benefit of the doubt.

Now, let's move on to one of the basic psychological processes involved in assessing what is true and how we relate to information in general. When we evaluate whether information presented to us is likely to be true, we typically consider some, but rarely all, of these five criteria. First, is the claim compatible with other things I know? Is there much evidence to support it? Is it internally consistent and coherent? Does it come from a trustworthy source? And lastly, do other people agree with it? Each of these criteria is sensible and bears on the likely truth of a message. Importantly, I can assess each criterion by considering relevant knowledge, but that is a slow and effortful process. So instead, I can assess the same criteria by relying on my intuitive responses, which are faster and less taxing. And indeed, my brain, like probably some of yours, prefers fast and effortless. But when the initial intuitive response suggests that something may be wrong, we tend to turn to the more effortful analysis. In this way, the initial intuitive assessment of truth is a sort of gatekeeper for whether people will engage with the presented information with a critical eye or just nod along in agreement.
These assumptions are compatible with a long history of research in social and cognitive psychology, where the slow and effortful strategy is often referred to as analytic or System 2 processing, and the fast and effortless one as intuitive or System 1 processing. So the key is the ease with which the information can be processed. If it's processed easily, we tend to use System 1 processing; if it's hard to process, we switch over to System 2 processing.

The first criterion: is the information compatible with what I know? Does it fit with my prior knowledge? Any claim is more likely to be accepted as true when it's compatible with other things one knows than when it's at odds with other knowledge. Here, the well-known confirmation bias can also kick in: if the information confirms your prior beliefs and opinions, it is more likely to be believed. On the other hand, when something is incompatible with what one knows, we tend to stumble. We take longer to read it and have more trouble processing it. Thus, if the information is compatible, it is easier to process and we often nod along in agreement; if it's less compatible, it is more difficult to process and we tend to switch over to more analytic processing. Next, is there evidence to support the claim? If information is compatible, there's often some kind of evidence to support it, and usually that evidence easily comes to mind. People's confidence in a belief increases with the amount of supporting evidence. Next, is the information internally consistent and coherent? Coherent information is easier to process than information with internal contradictions. If it contains contradictions, we will stumble, have more difficulty processing it, and most likely switch over to analytic processing. What about the source of the information: is the source trustworthy? Information is more likely to be accepted as true when it comes from a credible and trustworthy source.
Trustworthiness can be based on the source's expertise, education, achievements and so on. However, trustworthiness can also be based on feelings of familiarity: you trust your family more than you trust strangers. Additionally, repeatedly seeing pictures of a face is sufficient to increase perceptions of honesty and sincerity, as well as agreement with what the person says. Moreover, a given claim is more likely to be judged true when the name of its source is easy to pronounce. Here's an illustration from some of my prior research which shows that face perception influences ratings of trustworthiness. I'm not going to say more about it other than that these two faces you see here receive very different trustworthiness ratings solely based on their appearance.

The last of the criteria for truth assessment, do other people agree, is about social consensus. Research shows that we are more confident in our beliefs when they are shared by others, and we are more likely to endorse a claim if many others have done so as well. Relatedly, information that is familiar because you have encountered it several times through friends and family is easier to process, remember and understand. Now, in addition to these five criteria, I would like to add one additional factor, and that is repeated exposure. Demagogues have long known that truth can be created through the frequent repetition of a lie, and as Hitler put it, propaganda must confine itself to a few points and repeat them over and over again. And indeed, research shows that the best predictor of whether people believe a rumor is the number of times they are exposed to it. I will now give you an example of how repeated exposure, combined with some of the criteria for truth assessment, comes into play when we are presented with information on social media.
To begin with, most social media posts are short, written in simple language, and easy to read, and so they satisfy many of the technical prerequisites for easy processing. Now, as an example, let's say you are suspicious of vaccines or opposed to vaccination in general, and you encounter this post. This easy-to-read message is posted by a familiar face with an easy-to-pronounce name, who is also a millionaire and TV personality, and of course a credible source. Well, this was before he became president. The content of the post is compatible with your own beliefs, because you're opposed to vaccines. The post is liked and retweeted by your friends who are also opposed to vaccines, thus confirming social consensus. Moreover, it is retweeted and shared by several of those in your online social network, ensuring repeated exposure. With each exposure, processing becomes easier, and perceptions of social consensus, coherence and compatibility increase. At the same time, accumulating likes and retweets ensures that the filtering mechanisms of the social media platform make exposure to opposing information less and less likely. And thus, the content of the tweet is considered to be highly believable. Now, this is of course a somewhat extreme example of how this psychological process and the five criteria come into play on social media, but the same processes apply to topics of a much more mundane nature. So, to briefly sum up: if the presented information is compatible with your own existing beliefs and values, has evidence to support it, is coherent and internally consistent, is presented by a trustworthy source, and has other people agreeing with it, then there's a chance that you just might nod along in agreement and not engage with the presented information analytically. However, some of us are less susceptible to misinformation and thus do not fall as easily for fake news.
Several studies have been conducted on individual differences in susceptibility to misinformation. Within the social sciences, research on misinformation typically measures susceptibility in the following ways. Some studies present an equal number of true and false headlines and ask participants to rate their accuracy or reliability, like this example right here. Others may present statements which are not in this sort of Facebook or journalistic format. They provide written claims, such as "5G networks may be making us more susceptible to the coronavirus", and participants are asked to rate accuracy from very inaccurate to very accurate. Within this kind of research, they look at individual- and group-level differences in susceptibility to misinformation by measuring some other variables of interest. I've chosen a few of those variables, which I will now present.

First up, people differ in their propensity to engage in analytical reasoning. This is related to the five criteria and the information processing I mentioned earlier, and research shows that some are more inclined to reason analytically whilst others rely to a larger degree on their intuition. Propensity to engage in analytical reasoning is often measured by the Cognitive Reflection Test, which presents people with questions like these. There is only one correct answer to each of these questions, but when you present them to people, you often end up with two groups based on their answers. Those who answer intuitively will answer "first" to question one and "ten cents" to question two, whilst those who reason analytically will answer "second" to question one and "five cents" to question two, which are the correct answers. People's performance on this kind of test correlates with their ability to discern misinformation from true information. That is, participants who reason analytically are better at recognizing misinformation as false.
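The slide items themselves are not reproduced in this transcript, but the "ten cents" versus "five cents" answers match the classic bat-and-ball item from the Cognitive Reflection Test ("A bat and a ball cost $1.10 in total; the bat costs $1.00 more than the ball; how much does the ball cost?"). Assuming that item is the one on the slide, a quick worked solution shows why five cents, not the intuitive ten cents, is correct:

```latex
% Let b be the price of the ball in dollars; the bat then costs b + 1.00.
\begin{aligned}
b + (b + 1.00) &= 1.10 \\
2b             &= 0.10 \\
b              &= 0.05 \quad \text{(five cents)}
\end{aligned}
% The intuitive answer b = 0.10 fails the check:
% bat would cost 1.10 and the total would be 1.20, not 1.10.
```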
So thinking analytically rather than relying on intuition seems to be a resilience factor against accepting misinformation as true. Research also shows that delusion-prone individuals, dogmatic individuals and religious fundamentalists are more likely to believe misinformation. People who score high on these kinds of attitudes typically agree with items such as "Do you ever feel as if there is a conspiracy against you?", "The things I believe in are so completely true I could never doubt them", or "The basic cause of evil in this world is Satan, who is still constantly and ferociously fighting against God." High scores on these types of measures are all positively correlated with belief in misinformation. However, mediation analysis suggests that these relationships may be partially or fully explained by reduced engagement in analytic reasoning and actively open-minded thinking, which may broadly discourage implausible beliefs. So because dogmatic individuals, delusion-prone individuals and religious fundamentalists engage less in analytical thinking, they are also more susceptible to believing misinformation. Again, analytical reasoning seems to be a resilience factor.

As misinformation is often heavily politicized, much research has investigated political ideology as a predictor of susceptibility to misinformation. This type of research has often been conducted among American participants, and it typically groups participants as either Democrats versus Republicans or, during the 2016 election, Clinton supporters versus Trump supporters. Typically, the results indicate that Trump supporters rated misinformation about Trump as less accurate but misinformation about Clinton as more accurate, and vice versa. However, studies have also shown that when participants are given time to deliberate, they tend to correct their responses, and this correction is independent of whether the misinformation is consistent with their political attitudes.
So again, deliberation, conscious analytical reasoning, seems to positively influence the ability to detect misinformation. A cross-cultural study on susceptibility to misinformation about the new coronavirus was recently conducted. It compared participants from five countries: the US, the UK, Ireland, Mexico and Spain. The figure you see here shows some of the included predictors of susceptibility to misinformation. Everything left of the dotted line in the middle indicates reduced susceptibility to misinformation, whereas the marks to the right of the dotted line indicate increased susceptibility. As you can see, in three out of five countries, conservatism or political right-wing attitudes predicted increased susceptibility, whereas numeracy skills, which are a kind of proxy for critical thinking, predicted reduced susceptibility in all five countries. The same was true for trust in scientists. Additionally, they found that self-perceived minority status predicted increased susceptibility to believing misinformation about the coronavirus. However, as it was self-perceived, we do not know which kinds of minorities seem inclined to believe misinformation, whether religious, political, ethnic or some other minority groups.

Although misinformation is clearly not a new phenomenon, it has become a much more serious issue with the advent of the Internet, and belief in misinformation can have serious consequences if one acts upon that belief. For instance, there's a lot of fabricated and false information circulating the Internet regarding vaccines, the most famous being the claim that the MMR vaccine causes autism. As a result, vaccine hesitancy has led to disease outbreaks and deaths from vaccine-preventable diseases, such as the measles outbreaks we have seen in recent years in the Netherlands, the UK, Ireland, the US and several other countries.
And the increase of misinformation on social media has also proven to be a real threat to the democratic process, as a well-functioning democracy relies on a well-informed populace. Disinformation campaigns are targeting specific groups of voters, and the political landscape is getting more and more polarized. Take, for instance, the Pizzagate conspiracy theory during the US presidential election in 2016. Fake news websites posted false stories claiming that a pedophilia ring involving high-ranking officials in the Democratic Party used a pizzeria, illustrated in the picture you see here, as a meeting ground for satanic rituals, human trafficking and child sex abuse. A man who believed this story entered the pizzeria with an assault rifle to investigate and ended up firing shots, but luckily nobody got hurt. And lastly, belief in misinformation about the coronavirus, such as the belief that it was created in a laboratory in China, negatively affects people's compliance with public health guidelines such as wearing masks or washing their hands. People who hold such beliefs are also more hesitant to get vaccinated against the virus. That's all. Thank you.