I'm really sorry that the video is so late this week. I could blame having to do laundry, but that would be a fabrication. I could blame being forced out of my old apartment, but that would be a fiction. I could even blame all the mental energy I've had to expend coming up with new ideas for technology in my new job, but that would be an invention. The video's late. I'm sorry.

It's hard to argue that the internet hasn't drastically changed the landscape for liars. Clickbait, echo chambers, algorithms tailored to deliver whatever content will get the most views and shares from specific groups: in many ways it has become exponentially easier to sell certain audiences whatever narrative you like. Many opportunists use this framework to simply invent stories out of whole cloth, abandoning any pretense of journalistic integrity or fact-checking and opting to publish outright lies tailored for a specific audience who will spread them without stopping to question their validity.

A number of people have used the pervasiveness of this phenomenon to encourage a sort of postmodern power-consolidation device, a confirmation-bias amplifier, where any and all facts that contradict a particular ideology or narrative are discounted as being made up, or "fake news." If someone really wants to believe something, no matter how untrue or warped their understanding of the truth might be, a simple rationalization mechanism is now available: whatever evidence is presented against their chosen belief can be called into doubt as fake, misrepresented, or unfairly biased. And to be fair, a large number of news items passed around on social media are total bunk, sometimes with an agenda.

Numerous careful investigations by several US government agencies have made it clear that organizations funded by the Russian government have been exerting influence on US politics. Now, in a decent political thriller this would mean something like blackmailing congressmen, risky hacking, espionage, double agents, at least an assassination. But in this case it was the least sexy political manipulation the world has ever seen: it appears that Russia created several thousand fake social media accounts and spammed American voters with them.

Even more annoying is the fact that the spam is, by any estimation, terrible. We're not talking about slick, professionally designed ads with seductive arguments compelling voters to support certain agendas, or plausible-sounding stories faked from trustworthy news sources. We're talking about absurd memes and stories that couldn't possibly be true, stuff that can be disproven in literally five seconds if you've got a browser open.

So what's all the fuss about? I mean, yeah, the fact that Russia dumped tens of millions of dollars into this weird internet crap engine to screw with the presidential election is a little bizarre, and it doesn't seem like a good thing, but does it really matter? So a few million American voters were exposed to some stupid memes. Surely nobody's convinced by this garbage, right?

Well, unfortunately, human psychology isn't always the most sensible thing. In fact, it seems that our minds are wired to be extraordinarily vulnerable to misinformation in numerous ways. In doing the research for this episode, I expected to find some cute factoids about bias, but man, in the context of trying to maintain a functional democracy in the face of a flood of misinformation, this stuff is freaking scary.
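As an aside, that "algorithms tailored to deliver content" bit is worth making concrete. Here's a minimal sketch in Python, entirely hypothetical and not any real platform's code, of an engagement-first feed: stories get ranked purely by how likely a given audience segment is to click and share them, and truth never enters the equation.

```python
from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    # Predicted engagement per audience segment (in a real system this
    # would come from a trained model; here it's just made-up numbers).
    predicted_engagement: dict

def rank_feed(stories, segment):
    """Order a user's feed purely by expected clicks/shares for their segment."""
    return sorted(stories,
                  key=lambda s: s.predicted_engagement.get(segment, 0.0),
                  reverse=True)

feed = rank_feed(
    [Story("Sober, accurate policy analysis", {"segment_a": 0.02}),
     Story("Outrageous fabricated scandal!!", {"segment_a": 0.31})],
    segment="segment_a",
)
print([s.headline for s in feed])  # the fabrication takes the top slot
```

Nothing in that objective function rewards being true, which is the whole problem.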
Please bear in mind that these facts are true of every human being, including you and me, not just those idiots we happen to disagree with.

First, let's look at the backfire effect. Psychologists have demonstrated that debunking, the practice of dismantling someone's false beliefs by presenting contradictory evidence, although it can be fun, not only doesn't usually work, but is often counterproductive. The mechanisms that fuel confirmation bias make our minds slippery to information that doesn't agree with our preconceptions, which leads to the bizarre effect that when someone says, "You know how you believe X? Well, it's not true, because of Y," our brains end up remembering "you know how X," which tends to cement our belief even further. That's right: when someone presents us with evidence that directly conflicts with our beliefs, we believe them harder.

Next, let's take a look at belief perseverance. Formal studies over several decades have shown that humans are really bad at discounting false information, and that they tend to maintain similar attitudes even after learning that some crucial support for their opinions is false. For example, in 2007, Michael Cobb ran an experiment in which he supplied his test subjects with some facts about a fictional politician, including that the politician in question had taken a bribe to vote a certain way on some bit of legislation. Obviously, when he asked them to rate the legislator, they rated him poorly. Later, however, he changed the story, revealing that the supposed bribe had merely been an accounting error by a third party, not relevant at all. And yet, compared to a control group who had received no misinformation, the newly enlightened test subjects, knowing full well that he hadn't done anything wrong, still ranked the politician as being a little bit shady, a little untrustworthy. It seems that even when we learn that certain assumptions are false, the opinions we built with them tend to stick around.

Third, let's look at something called the illusory truth effect, although that name could apply to any of these phenomena. There are a number of interesting things that happen in the brain when it recognizes something, when it receives some input from the world that looks familiar. It may be that it mistakes that recognition for veracity, which makes us believe things that we hear repeatedly. A very simple 1977 study by Hasher et al. demonstrated this phenomenon in a pretty convincing way. They handed out a series of true-or-false statements about anything under the sun, sports, politics, whatever, and asked their test subjects to rate how plausible each statement sounded. Then, every two weeks, they handed out new statements with a couple of repeats from previous weeks thrown in. With every new batch, the subjects' average confidence in the new statements stayed about the same, but their certainty in the repeated statements climbed higher and higher.

Okay, brief recap: we tend to believe things more when we hear them repeated. We tend to believe things even when our reasons to believe them are disproven. In fact, we sometimes end up believing them harder when they're disproven. And now, we drop that whole screwy mental apparatus into a sea of deliberately fabricated, outrage-inducing, blatantly absurd clickbait. It sounds stupid, right? It sounds like even the most childish, unsophisticated, naive idiot should be adequately insulated from this nonsense.
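Before we get to why that matters, here's a toy Python simulation of the Hasher et al. design. Every number in it (the baseline rating, the per-exposure "familiarity boost," the noise) is invented purely for illustration; the point is just to show the shape of the published result: ratings for new statements hold steady while ratings for repeated ones climb.

```python
import random

random.seed(42)

BASE_RATING = 4.0        # assumed average plausibility of an unfamiliar statement (1-7 scale)
FAMILIARITY_BOOST = 0.4  # assumed rating bump per prior exposure
NOISE = 0.5              # per-rating random noise

def rate(times_seen_before):
    """A single plausibility rating, drifting up with each prior exposure."""
    raw = (BASE_RATING
           + FAMILIARITY_BOOST * times_seen_before
           + random.uniform(-NOISE, NOISE))
    return min(max(raw, 1.0), 7.0)  # clamp to the 1-7 scale

repeats_seen = 0
for session in range(1, 4):  # one "session" every two weeks
    new_scores = [rate(0) for _ in range(50)]             # fresh statements
    rep_scores = [rate(repeats_seen) for _ in range(10)]  # repeats from earlier sessions
    repeats_seen += 1
    print(f"session {session}: "
          f"new={sum(new_scores) / len(new_scores):.2f}  "
          f"repeated={sum(rep_scores) / len(rep_scores):.2f}")
```

Run it and the repeated column drifts upward session after session while the new column sits still. Repetition alone does the work.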
But the reality is, it doesn't matter that the spam the Russian propaganda engine has been pumping into Twitter, Facebook, Reddit, and other social media sites is about as believable as infomercial acting. It doesn't matter that it's easily debunked, or that the tiniest smidgen of reflection easily discounts it. It doesn't even matter that we now know that it was, in fact, Russian political spam. Human psychology is weak against just such an attack, and the behavior of a statistically significant number of American voters was likely changed as a result.

Social media sites are apparently infested with large-scale operations designed to bombard us with garbage that, despite being demonstrably false, will nonetheless color our judgment. In a way, it's an obvious extension of the psychology we've been talking about: find a tool that keeps us watching, then fill it up with whatever you want us to believe. Manufacturing this crap is trivial, and all it takes is one or two pieces going viral to vastly influence the electorate.

There are a few possible solutions I can think of, and I don't think they're mutually exclusive.

First, as many have noted, the freewheeling, laissez-faire days of the internet, when massive content-hosting social media companies like Facebook and Twitter exercised no control whatsoever over their platforms and who used them for what, may be drawing to a close. As awesome as it was when the internet was a libertarian utopia of nerds, it's becoming abundantly clear that some form of oversight and structure must be maintained to prevent abuses of those algorithms and the power they wield. I don't want to make that sound easy or trivial. The volume of content being produced on social media platforms makes moderation a gargantuan problem, and deciding which stories should count as false or misleading is a sticky wicket, ripe for accusations of abuse. But the alternative is ceding these platforms to whoever has the biggest botnet or the most psychologically exploitative stories, and that's clearly not acceptable.

Another possible solution was voiced by psychologist Robert Cialdini, whose book Influence has become a hugely influential reference on compliance, that is, using psychological tools to compel people toward certain behaviors. After detailing numerous powerful methods for cultivating compliance in human brains, Cialdini closes his book by suggesting that the problem isn't the vulnerabilities themselves; it's the lax treatment of those who exploit them. He notes that all of these psychological mechanisms serve an important purpose: parsing large amounts of information quickly. One of the primary mechanisms acting in the misinformation phenomenon, social proof, is actually a really helpful heuristic for quickly making sense of the world. If other people believe something, it's usually true. That might sound sketchy, and it's certainly not foolproof, but if you've ever used Amazon reviews to choose which product out of thousands to purchase, or browsed Reddit's "hot" category, you've seen how useful it can be. For Cialdini, the main issue isn't that people sometimes look to the wisdom of the crowd for guidance; it's that a few jerks abuse that system by making some ideas merely look like the wisdom of the crowd. This is exactly what Russian spammers and fake-news power consolidators are exploiting by artificially boosting the trendiness of certain hashtags and stories.
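To see how cheap that kind of manufactured social proof is, here's a sketch based on the "hot" ranking function from Reddit's old open-sourced codebase (the live site has long since changed, so treat this as an illustration of the idea, not the current algorithm).

```python
from math import log10
import time

REDDIT_EPOCH = 1134028003  # reference timestamp from that old code, in Unix seconds

def hot(ups, downs, timestamp):
    """Rank a post: votes count logarithmically, recency counts linearly."""
    score = ups - downs
    order = log10(max(abs(score), 1))   # each 10x in net votes adds +1 rank
    sign = 1 if score > 0 else (-1 if score < 0 else 0)
    seconds = timestamp - REDDIT_EPOCH  # newer posts rank higher
    return round(sign * order + seconds / 45000, 7)

now = time.time()
# Going from 1 to 10 upvotes is worth exactly as much rank as going from
# 10 to 100: the earliest votes are wildly overweighted.
print(hot(10, 0, now) - hot(1, 0, now))    # 1.0
print(hot(100, 0, now) - hot(10, 0, now))  # 1.0
```

Because votes enter through a logarithm, a few dozen bot accounts voting in the first hour buy a story roughly the same visibility as hundreds of genuine readers arriving later. That asymmetry is exactly what coordinated fake accounts exploit.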
Cialdini thinks that the best approach here is to severely punish violations of that mechanism, to really make it sting if someone is caught trying to sell you something with an illusion of social proof. We normally tend to take such things in stride, just because it's a fairly common tactic, but maybe stiffer penalties would disincentivize abuse.

Finally, let's talk about the hardest solution. Last year, Jonas De keersmaecker and Arne Roets identified a variable that's negatively correlated with both vulnerability to misinformation and the inability to correct for it once it's discovered: the ability to correctly choose synonyms for certain esoteric vocabulary words. Okay, okay. The synonym test is supposedly a decent metric for cognitive ability, the skill set necessary to learn, remember, and pay attention to the relevant information needed to solve problems. It's different from intelligence, which tends to be a fairly static quantity throughout a person's life, because cognitive ability can be improved through training. It's not like people with high IQs know a decent synonym for "opsimath" by default.

It turns out that people who score high in cognitive ability are much better at correcting for misinformation, adjusting their beliefs more or less back to the baseline after discovering that they no longer have a good reason to think that way. That doesn't make them immune to misinformation by any means, but it does help them recover more quickly from it. It's almost as though learning how to learn allows them to adapt to new information quickly. Unfortunately, for this fact to be useful, we'd have to improve the cognitive ability of a statistically significant number of American citizens. Maybe then we'd have a fighting chance against some of these problems, but it's not an easy thing to do.

As much as we might like to imagine that we're not gullible, misinformation plays a bigger role in what we believe than we generally give it credit for, and in many ways, we're vulnerable to it. Do you have any thoughts on how to fight it? Please leave a comment below and let me know what you think. Thank you very much for watching. Don't forget to blah, blah, subscribe, blah, share, and don't stop clunking.