The Great Emu War was a real battle between the Royal Australian Artillery and several thousand emus in 1932. It doesn't really have anything to do with today's episode; I just heard that it's responsible to disclose any conflicts of interest.

As I've said before, if you watch THUNK and you don't watch PBS Idea Channel, I don't know what you're doing with your life. In this episode, Mike explores some ideas about journalism and its relationship to objectivity: the idea that in a perfect world, journalists would only convey the facts, sterilizing their reporting of any personal bias and leaving judgment and interpretation of those facts to the people watching. There are some practical problems with this approach, because the human experience is inherently subjective, and journalists, with very few exceptions, are human. If you and I are reporting on some event, we start by viewing it from two different physical perspectives. We're already seeing different things, and from there, we unavoidably bring two different psychological filters for what we're seeing and which parts of it are important. Even if we both exercise rigorous journalistic self-control from that point forward, editing out anything in our reports that might seem too biased, it's still entirely possible that we'd end up with two very different characterizations of a single event, not because we're trying to push some personal agenda, but just because we're human. Let's say, at least for the purposes of this episode, that perfect journalistic objectivity is an impossible dream, that it's impossible to divorce the facts from the subjective mechanisms humans must use to interpret them. You know, all of this stuff.

Most journalists will readily admit that perfect objectivity is an unattainable ideal, but there's some debate as to whether the pursuit of that ideal, especially in American media, has become a sort of empty idolatry, missing the whole point of the exercise in the first place: conveying the truth about whatever you're reporting on. Many people blame recent anti-science trends, like the anti-vax movement or climate change denial, on lazy token gestures toward unbiased reporting, namely giving equal airtime to both established, data-backed scientific claims and unsubstantiated crackpot theories. Importantly, this practice isn't objectivity; it's just a sort of lip service to it. "Reporters who are striving to be objective sometimes give two sides of a story, so we should give two sides to every story. Also, Anderson Cooper wears those black t-shirts. Can we get some of those? They must be important."

Because of these problems that arise from the pursuit of objectivity, some people in media have advocated a new focus, one which emphasizes full disclosure of the reporter's personal biases, affiliations, sources, anything and everything that might affect their opinion on the issue. The trend is succinctly summarized by Harvard researcher David Weinberger: "Transparency is the new objectivity." In a way, advocates of transparency are trying to be realists about the human condition. Journalists aren't these perfect, abstract lenses of pure factual information; they never were. Rather than asking journalists to fake objectivity by self-censoring their well-educated opinions or giving both sides of the story, maybe they should instead disclose their opinions along with the facts, giving their audience more information to make up their own minds. That's a compelling argument.
I'm on the record as being for critical thinking, but there's some science about how politics affects human brains that might be cause for concern in this model.

First, let's look at a paper published in 2016, "Neural Correlates of Maintaining One's Political Beliefs in the Face of Counterevidence." In this study, researchers presented participants with information that challenged their political and non-political beliefs while monitoring their brain activity with fMRI. In many ways, the results confirmed things we already knew: confirmation bias is a powerful force, especially in politics, and people with strong political beliefs changed their minds less about political issues than non-political ones. But the researchers also found that brain regions responsible for motivated reasoning, self-representation, and introspection were only active for challenges to political beliefs, while brain regions responsible for cognitive flexibility, like what you'd use to unlearn a bad habit, were only active for non-political ones. It seems that very different things happen in brains when political beliefs are being challenged. There's a lot of room for interpretation, but these results are at least consistent with the idea that we don't process facts about firearm legislation the same way we process facts about multivitamins.

Let's look at a different study. A 2007 paper, "Priming Us and Them," claims that framing certain ideas in a political context affects how strongly people react to them. According to the authors, if you ask me how I feel about some partisan issue, I might give you a middling "yeah, I guess" or "no, I don't really think so." But if you get me thinking about politics first, just by subtly mentioning things like the election or conservatives or Democrats, then my response to the same question will tend to be more extreme, according to my politics: "absolutely," or "no way." Of course, priming effects tend to wear off after a little while, but be honest: how many times have you checked Facebook or Twitter today? How many times have you been primed to think about which political group you belong to, or maybe don't belong to?

These papers are representative of a body of psychological and neurological research that tells a similar story, and what I want to focus on here is what that evidence means for the enterprise of journalistic integrity. In an ideal world, transparency is a vehicle for critical thinking. If a news blogger discloses her preference for a particular candidate before reporting on a presidential debate, her audience can bear her prejudice in mind while interpreting her version of events and get a better idea of the truth. Facts plus bias, minus bias, equals facts again. Great! But it would seem that's not the entire story. As soon as her political leanings and values enter my awareness as a point to be considered, my brain may start functioning differently. The facts she reports may now be routed through different hardware, hardware that's much less likely to change my mind about anything. My opinions may start to drift to greater extremes, ultimately coloring which parts of her report I find important enough to remember or share. For me, these points are very similar to the practical concerns raised about objectivity: reporters aren't perfect transmitters of unbiased truth, but audiences, even informed audiences, aren't perfect receivers of it either.
Full disclosure is a great idea in theory, but in practice it might lead to even greater bias. So, are we just screwed here? Is there any example of a medium where the facts actually get conveyed with minimal distortion, where both the author and the audience are working to amplify the signal and attenuate the bias?

This is just my opinion, but I think many would agree that scientific papers are an obvious paradigm to look to here. They're obviously not perfect, but they're definitely more sound than the vast majority of stuff in my Facebook feed, and they have some proven mechanisms in place to maintain that soundness, mechanisms which might suit any media looking for a really fair and balanced approach.

First, peer review. Scientific researchers generally have to get their papers cleared by a group of third-party reviewers, people who have some knowledge of the field but don't have anything riding on the paper's publication. Rather than editors, who might be concerned with things like writing style or page views, peer reviewers only care about one question: does the evidence objectively say what the authors claim it does? This is why the authors of scientific papers tend to be very conservative in their claims. It's not that you wouldn't want a paper titled "Compound X Cures Cancer"; it's just that if there's the slightest chance one of your peers might think you're overstating your discovery, you're probably not going to get published. Stiff competition and professional rivalries exist in science, just as in journalism, so there's a provision in the peer review process for the submitter to exclude certain reviewers: if you just know that this or that person is going to rain on your parade or steal your work, you can tell the journal not to ask for their opinion. The net result, combined with a cultural emphasis on reputation, is that the content of scientific papers tends to be objectively reliable despite the personal biases of the authors and the reviewers. A scientific author might have an overblown opinion of their research, but peer review irons out the questionable bits. If a news writer had to get their article approved by someone with differing politics, do you think they might be a little more restrained?

Second, transparency in scientific papers is handled in a very specific way. Exhaustive citations of sources and prior research are listed at the end of the paper, and any potential conflicts of interest, like a huge grant from a pharmaceutical company funding the study, are disclosed at the top. That's it. Again, a scientist's personal opinion about the full implications of their research may be candidly revealed in conferences or interviews, but it doesn't belong in the paper. The general rule is that your data should speak for itself, just like in good journalism.

Finally, while authors in almost any medium might benefit from science's peer review and transparency policies, there's a part of the whole science-paper process that's the responsibility of the readers: data before discussion. When scientists read papers in their field, it's common practice to skip all the actual English and jump straight to the figures, look at the results the authors found, form an opinion based solely on those results, and then go back and compare that opinion to what the authors came up with. It's yet another filter on top of the ones already in place to ensure impartial processing of the facts.
That would be our role in this model as the audience. Just as journalists might work hard to present their information more even-handedly, we could help them out by jumping straight to the meat of an article: direct quotes, pictures, or, best of all, numbers. Just like science readers, we could decide what we think the facts mean without the author's bias shaping our opinion, then go back and compare our evaluation to theirs. I feel like this sort of approach to the news we consume would mitigate bias, a little like playing poker with real money instead of just chips mitigates bluffing. It's all well and good to say that you agree with everything your favorite news source writes, but do you really have any skin in the game? Or are you just along for the ride?

What do you think? Is scientific writing a decent model for journalism to follow? Is transparency the new objectivity? Please leave a comment below and let me know what you think. Thank you very much for watching. Don't forget to blah, blah, subscribe, blah, share, and don't stop thunking.