It's not as if people want the truth. They don't want the truth. Fake news is one of the classic collective action problems of our era. My job is to tell the truth; I hope that people believe it. I don't think fake news is the cause of the problem for democracy. I think that the proliferation of fake news is one of the most important symptoms of a decline in shared values and a shared sense of truth. The information environment and the trust environment for different media are eroding quickly and need to be corrected. It will get much, much worse very quickly if we don't do something.

There is a very real sense of urgency to tackle the spread of misinformation, or what is more commonly known as fake news: a historical problem that now spreads with a virality that was impossible before social media. The problem may be old, but these conditions are new. In fact, the largest longitudinal study on the phenomenon of false news and how it spreads was first published as recently as March 2018. After poring over millions of tweets written over the course of 10 years, researchers from MIT concluded that falsehood diffused significantly farther, faster, deeper and more broadly than the truth. A couple of months earlier, political scientists from Dartmouth College, Princeton University and the University of Exeter released equally troubling findings. By analyzing the browsing history of thousands of Americans in the run-up to the 2016 election, they discovered that one in four citizens had seen at least one false story. Even as these patterns begin to emerge, measuring the actual scale and impact of fake news remains elusive, and we seem nowhere closer to solving the problem itself. Part of that difficulty stems from the very fact that there is little consensus on what fake news means.

I think first we have to define fake news, because we have two competing definitions that emerged over the course of 2016 as I was covering the election.
At USA Today and at the USA Today Network, I actually instituted a rule not to use the phrase fake news. It's not news. What it is is false information, or propaganda. There was a very real phenomenon of purported news that was completely fabricated, particularly hoaxes. And that's not a new phenomenon. Anybody who's seen Snopes.com knows that urban legends and rumors that flatter people's biases have existed for as long as we've had human nature, but the internet and social media have given them a new platform, a new altitude, a new ability to spread. That said, the meme "fake news," as applied to any legitimate news outlet by people who don't like what that outlet is reporting, is obviously a major challenge for news organizations.

In January 2018 the European Commission called upon a group of 39 experts, including staffers from Twitter, Facebook and Google, to provide recommendations on how governments and tech companies could best address the spread of fake news. In March this high-level group released a report calling for greater transparency on the part of social platforms while stressing the importance of self-regulation. But much like the problem itself, they are just beginning to untangle its root causes, and they caution against applying simplistic solutions. What responsibility, if any, do these technology firms have in addressing the dissemination of misinformation on their platforms?

Think about the media environment that all of our minds are plugged into: not just the social media environment, but the way that technology is steering the thoughts, feelings and beliefs of two billion people every day. Our digital and social firms like to position themselves as something more like a telephone wire, where they don't control the content. But the fact is that they are already controlling the content, because they have algorithms that control what we see. So they're already acting the part of editor.
These private-sector tech platforms do have a role as, essentially, media companies, and they are realizing that they can't just be hands-off and say, you know, we're just the venue, we have nothing to do with what people say here. So I don't think they've figured it out either, but they are wrestling with it now for the first time. People are not going to stop using social media for news, and given that, and given that that's the environment we're all living inside of, how can we make those technology companies accountable to democracy and not just to maximizing screen time?

With mounting concerns about misinformation threatening the democratic process, a number of countries are pushing for greater regulation. In January, a German law took effect that requires social media companies to remove hate speech from their platforms or else face fines of up to 50 million euros. That same month, the French president Emmanuel Macron announced his own plans to draft a law against fake news that would grant judges the authority to remove misleading content during sensitive elections, and in Italy citizens can now report what they believe to be fake news to the police.

The bigger issue here is that it's not so much about reporting it to the police, who are probably not equipped to get to the bottom of it. Putting it in the hands of the consumer is not really the answer either, because the consumer who is most likely to consume this information and pass it on is probably the one who is least likely to understand that it's false information or that it's propaganda. I think as an American and as a fan of the First Amendment, I automatically react badly to the idea that there should be regulation of speech. There's got to be a way for us to have a free market of ideas that allows everybody to have a voice. It is a very slippery slope once you start regulating information and regulating content.
It plumbs such profound ethical questions that it can't just be a piece of regulation that's hashed out by those who are immediately concerned by it. This is very much about how we want our societies to become going forward, so it has to be a conversation with a broader extent of participation; we have to work out how we involve societies in this conversation. It's so innate to who we are and who we want to become.

The part played by governments and technology firms has been central to the discussion, but what about the role of journalists writing and publishing stories in this context? When the digital advertising model rewards clicks, there's an inherent pressure to produce attention-grabbing headlines, and as the aforementioned MIT study revealed, truth can't always compete with fiction.

Yeah, the advertising-only business model has been incredibly destructive for journalism. Journalism can't compete if it's just about raw clicks. Journalism needs a new business model, without question. It's past time. Unfortunately, journalism, and print media in particular, has had a really tough time for decades, ever since the invention of radio. Journalism has never paid for itself as journalism; it's always been supported by subscriptions, advertising, wealthy benefactors.
I'm a former local journalist. I began my career working for newspapers, small newspapers in Ohio and Pennsylvania, and that's what's really been hollowed out: people's sources of the information that matters to their lives. Local media, reporters covering state capitals. There are fewer and fewer political reporters, legislative reporters, in the state governments where these laws are being made. It's not financially sustainable right now; in fact, right now the most sustainable business model for print media appears to be well-intentioned billionaires. But we know that media, like everybody else, is in the midst of this digital transformation, and there will be a new model. Somebody's going to figure it out, and hopefully soon.

I think we have to start solving this problem not at the layer of what news should do, but at the layer of how humans actually work. Because we actually work in a way where we want to confirm our beliefs, and we'll seek out places that confirm our beliefs. Knowing that about us, how would technology companies augment our experience to break open some of those patterns?

In a hyper-partisan environment, you will have people constantly searching for sources of news and information that confirm their prior political beliefs. They will often fail to find that in the most credible news media, and they will seek it out elsewhere.

If it all boils down to human nature, how do we go about correcting it, or at the least understanding and working around this vulnerability?

If we only couch this in negative terms, if we only couch this in terms of the dangers, it's not going to incentivize everybody to solve the problem. Yeah, I think it's important to see the upside of social media, because on the one hand, yeah, everybody's always yelling at each other, and that probably has more to do with people and human nature than it does with the platform; but on the other hand, it empowers people who haven't had a voice before.
It's just being cognizant of how many more people are going to come online in the next five years, and how ill-prepared, in some ways, people are for the sophistication with which they will be manipulated. Part of what we're seeing fake news do is drive people away from each other. Whether you're a libertarian activist or a member of a minority group, people can find each other and gain strength in numbers; sadly, members of the alt-right, the Nazis, can find each other too. You start on one video about, you know, NASA astronauts, and you go down a flat earth route, and you're in this total wormhole, only because these algorithms are trying to maximize how much time we spend. We have to start by understanding where minds get tripped up, and then how we would protect those minds so they can function in the way that we all want to be functioning.

Tim Berners-Lee, the inventor of the World Wide Web, cited digital disinformation as one of the biggest challenges facing the future of the internet. He remains hopeful, however, that collaboration can fix it. It's up to us; it all depends on what we do now, on the decisions we make now.