Well, good morning, ladies and gentlemen, thank you for joining us here. Welcome online to those of you watching on our Facebook channel and on weforum.org. This issue briefing is our third of the day; we have a hectic schedule today. The headline is digital wildfires in a hyper-connected world. I'm going to start this session by reading a passage from our Global Risks Report in 2012: "The global risk of massive digital misinformation sits at the centre of a constellation of technological and geopolitical risks, ranging from terrorism to cyber attacks and the failure of global governance. Hyperconnectivity could enable digital wildfires to wreak havoc in the real world. It considers the challenge presented by the misuse of an open and easily accessible system, and the greater danger of misguided attempts to prevent such outcomes." Now, I'm going to ask our panel, first of all, whether they think that prediction has come true. Our panel: Rebecca MacKinnon, who directs the Ranking Digital Rights project in the USA, and Stephen Adler, editor-in-chief of Thomson Reuters. Rebecca, let's turn to you. Digital wildfires. Are we living in an era of digital wildfires?

Well, certainly we're in an era where anybody can be a journalist, right? And so you have all kinds of people putting out information and gaining traction, and you have these memes starting, and let's not forget the Arab Spring was also a digital wildfire. So you have both positive and negative wildfires, in the sense that sometimes somebody will report facts and they will get out there in a way that couldn't have gotten out before.
Before Facebook, the Arab Spring would not have happened, because the outrage over a man immolating himself in a market, and that information spreading through social media, would not have happened, and the organizing that happened around that would not have happened; nor would the outrage over a young man being tortured in Egypt, and the information about that spreading, which certainly would not have gotten to mainstream media in the past. And so there are all kinds of ways in which people have been using digital media, and the fact that anyone can be a reporter, for good: to advocate for human rights, to advocate for social justice. So let's not forget that. And yes, there are plenty of demagogues and racists and populists of different kinds who are also manipulating the media, putting out information and trying to gain traction. So there are all these kinds of things. I think the answer is not censorship, or trying to anoint arbiters of what is true and what is false, because frankly we're going to be handing more power to dictators if we do that. We're opening ourselves up to greater abuse and censorship and surveillance if we do that. What we need to figure out is how to create a healthier information ecosystem whereby, when something is true or something is false, there are communities who can come together and place a check on it, so that there's more trust in different sources of information, so that you don't have entire groups of people who distrust journalists so much that they're willing to believe anything they would like to believe, whether or not it's true, because there are no other credible people in their community who they trust to say: no, this is not true, and here's why. So I think we need to innovate more, we need to innovate more with community journalism.
I think the platforms certainly can work with communities to figure out how you can strengthen people with counter-arguments, but censorship is not the answer.

Community journalism sounds laudable, and so does innovating, but it's hard to escape the feeling that fake news organisations, for example hate sites, are extremely well organized and work together very well. I read an article from December in which you were quoted (I don't believe it's your quote), but the sense was that these sites are encircling conventional mainstream news media sites, and it's getting out of control. But what do you do about it, exactly?

Again, I don't think it's censorship. Part of the problem is that you get, and this happens throughout history, people with a very pointed agenda and rich backers who find a way to be very focused and very strategic and to get messages out, and so you need to figure out how you counter that. Do you build a movement? Part of the problem is that people on the human rights and free speech side have less of one message. But I think we've seen throughout history the ability to counter lies and demagoguery with checks and balances, and so I think we're in a moment right now where, let's say, the bad guys seem to be winning, again depending on your point of view. But I think it's just a moment, and I think it's in part due to some specific outcomes of specific elections that a lot of people are very upset about. But again, I think that liberals in particular need to be very, very careful about reaching for essentially authoritarian solutions to a problem of authoritarianism. We've seen that road before, around certain types of socialism and communism. It didn't end well. Let's not go there.

Stephen, talking of authoritarian solutions: as a media organisation you're much more highly regulated than a tech company, for example. Does that give you a commercial disadvantage?
I don't think we're highly regulated at all, actually. We operate largely and globally in a position where, as long as we're producing accurate information, by and large we're okay, because we're operating in 200 different locations in over 100 countries around the world. Obviously there are different regimes; in some places there's more regulation than others, but we try to operate fairly consistently around the world. I'm not a believer in government regulation of media. I am a strong believer in transparency and the free flow of information. I think by and large ideas compete with each other, as Rebecca said, and that the internet and social media in particular are amazingly self-correcting. Somebody raised the point yesterday at a session that if Hillary Clinton had won the election we wouldn't be talking so much about fake news, and I think that's true. A lot of people didn't like that outcome, and they're blaming it more on fake news than probably they should. Of course fake news is a very old issue, going back to ancient Rome, back to almost any political campaign in history. People were saying bad things about their opponents, and frankly the ability to challenge that wasn't even as great as it is now. You have the example of Wikipedia, where there's this constant conversation and challenge going on. But also, as somebody who researches books as well as doing journalism, I've been struck over the last 10 or 15 years by how the amount of useful, accurate information available to the world electronically has burgeoned in a way that has almost never been seen in history.
We're back to sort of Gutenberg days; it's an extraordinary blossoming of information. And while certainly we've seen people making some stuff up, I've also found that if people in the public want to check it (and we'll get to the question of how you help the public want to check it), they can: when I go on and I see something that looks suspicious, I obviously go and look at other sources. You see what other people are saying, you do some independent research, and you can pretty quickly sort out what's true and what isn't. But I do want to emphasize that the amount of accurate information out there is greater than ever in the history of the world, and we shouldn't get too depressed about some of the side effects or the challenges that we're facing. Government regulation is not the answer. Greater public education, people learning news literacy, civics education, people understanding better how to sort between accurate and inaccurate information, will certainly be useful. I would certainly advocate that media organizations like ourselves be self-reflective about how they behave, because there's a question of how much you proliferate the stuff that clearly is not accurate, and we've seen some instances of that recently. I think news organizations that rely on a business model that essentially requires absolutely massive scale are creating a problem, and really need to think about whether that's an approach that's viable in the long run. If your job is to find clickbait and get the most possible readers, then you are feeding into it. I think it's useful to look at those business models, and it's useful for media organizations like ours to have standards.
We have trust principles written into the charter of Reuters that require us to be independent and free of bias, and I actually have to go before the board of directors every year and attest that we are following them. I think it would be useful if news organizations that want to be responsible put in some standards for themselves, and I think civics education will have some usefulness. And just a final point: if you want to be trusted, and you want to differentiate yourself from fake news providers, you probably have to provide more transparency than traditional news organizations have thought they needed to in the past. When I was at the Wall Street Journal years ago, it was kind of a mantra that you never talked about how you got the news. You were concerned about the legal consequences of revealing that publicly, and you felt that your work spoke for itself because you were a trusted organization. Now I think you have to be trustworthy, and then you have to prove it. You have to say: how did you get the story, how did you source it, how did you parse information and think about what was accurate and what wasn't. We're going to do more of that going forward, and I think a lot of news organizations are thinking about doing that as well.

That's a really interesting impact on journalism, which I'd like to come to, but what I want to go to next is what you said about being, in fact, media-literate yourself. You're able to go and check out the source and do a little bit of research about something, but other people aren't, and clickbait is very, very effective. If you're an organization driven not by a commercial interest but by an ideological interest, you're at a natural advantage over a conventional media company. So I'm wondering whether this proliferation, and I understand it could come from either political persuasion or any interest group, poses an existential threat to the media as we see it today.
I'll try that. I've been involved with conversations where people frame it as bloggers versus journalists, professionals versus amateurs, as if it needs to be some kind of zero-sum game, which is absolutely not true. I think it matters that citizens can report from their communities and can push back. I've been involved with situations where I was misquoted and I could blog the fact that I was misquoted and taken out of context, or where somebody in a community can point out that a story was written completely inaccurately. Sometimes that correction gets shared as much, if not more; again, it depends on the conversation that takes place around it. But I think the fact that media has been disintermediated is, over time, improving journalism. It's good for journalism. I think journalism still needs to evolve and, as Stephen says, become more transparent, figure out how to have more of a conversation with the communities they're covering and reporting to, how to involve those communities more in reporting on what's happening, and involve them in the fact-checking process and the discussion about what is true and what is not. I think we're still in very early days. We're in a very bumpy period right now, but we have to work through it. We can't go back. There's no turning back. Again, censorship is not the answer. There's a lot of pressure being put on the social media platforms right now to crack down on extremism and hate speech, and what you're seeing is that in their efforts to keep certain types of speech off the platforms, they're making all kinds of mistakes, so journalists and activists are actually getting kicked off of Facebook, or having journalistic reporting taken down because it mentions ISIS.
Or some people just having a debate about religion will mistakenly get removed from social media because they mention some inflammatory terms or situations, when they're not actually extremists. And so the desire to clean up the public conversation, I think, actually results in less trust, and results in people being more inclined to believe fake things, because they don't feel that they're really able to participate fully, or that they can be edgy or controversial and have a robust debate without having to worry about getting censored themselves. So we need to be very, very careful about how we go forward, and again it's really about innovating. There is, of course, a lot of hand-wringing about the business model for media, but I think we also want to think about this: if we want a media ecosystem that sustains a society that respects human rights, a society where we can have democracy and where minority rights can be respected, what kind of media ecosystem is that, and does some of it not need to be maximizing profit? Does some of it just need to be community-sustained, or paid for by taxpayers, or run at cost as a social good? We need to think about that, because everything's really hanging on it. Life and death end up hanging on it when people are making decisions about who to vote for based on what they're reading. So there's a lot of soul-searching. I think we're all collectively responsible in the end for the solutions, and, as Steve was saying, there are a lot of people who are upset because they were losers in a particular phase of politics and they're blaming everyone except themselves. I think we can all contribute to solutions if we make an effort.

Oliver, I can speak to your question about whether fake news and this whole ecosystem are strangling traditional news organizations.
I think all of us have been struck by the extraordinary response people have had, particularly post the US election, in subscribing to traditional publications, publications that are trustworthy. There's been an enormous number of additional subscriptions to Vanity Fair, to the New York Times, to many other organizations, and what I come back to is that I actually think people don't just want trusted news, they actually need it; and if they don't need it, we're in big trouble, but I think all the evidence suggests that they need it. News is a subset of information, and people need information to make decisions; ultimately you use news to make decisions. You use it as a professional, as a business person, to make decisions in your professional life. Traders and investors obviously use it. All business people, everybody at Davos, is using news and needs to know that the news they're relying on to make decisions is accurate. Individuals make all sorts of personal decisions based on the accuracy of what they might read in a newspaper or online. They're making decisions about insurance, about what kind of house they're going to buy, what kind of car they're going to buy, and ideally they're also making political decisions. What kinds of policies are likely to help them? Would a tax increase help or hurt them? Would more or less regulation of business help or hurt them in the long run? What's going to help the middle class get jobs? It is in people's self-interest to need accurate information, and you ultimately see that need in countries where there's a high level of censorship and there isn't information, and people kind of know that. In North Korea it's tremendously dangerous to do this, but people smuggle in flash drives, they smuggle in DVDs, they listen to illegal radio at night, because they want to know what's really going on outside North Korea. It's really important to them.
In countries that have firewalls, people find all sorts of technological solutions to get around the firewall because they want to get at the truth; they want to know what's really going on. I don't think there's any comparable drive out there toward proliferating fake news, but I think there is a drive towards trusted information that sits very deep in people's set of needs, and that is going to be as strong or stronger going forward.

I think anyone who loves the media would take great heart from the fact that subscriptions are going up in the aftermath of the US election, but what I suppose I'm worried about is the digital underclass: those who are not even aware, you mentioned awareness and education, digital literacy, yourself, or who are unable to afford to pay for news. They're left adrift, they've got no guidance whatsoever, and there's a lot of news out there, and if the fake news is getting better at proliferating, then that's a pretty hopeless situation. Rebecca?

I don't think that's going to be where we end up, or it doesn't have to be. It's one possible reality, but it's certainly not an inevitable reality.
I think, yes, there are some news organizations that are going to have a subscription model, and there are people who can't afford that information, but there are plenty of sources out there, public-interest, public-minded organizations, that are putting information out there that's freely accessible, and we certainly need to continue those types of institutions. This is again where we need to think: do you just leave it all up to profit-maximizing corporations and politically motivated groups to create our information ecosystem, or is there a public good that needs to be supported in other ways, to make sure that even people who are very low-income have access to information? That has certainly been a part of civic life in many countries, that there is a public-service aspect of information, so we can't forget that; we need to figure out how to bake it in going forward.

And there has been a big flow of private money into news in the last few years: ProPublica, the Marshall Project, even Jeff Bezos buying the Washington Post and putting a huge amount of money into it. I'm sure he'd like to make a profit, but it also seems to be part of a desire to build out a stronger news organization, and that goes way back; there have been people who have bought news organizations with a political purpose, but sometimes with a really philanthropic purpose. So that becomes part of the ecosystem of how news organizations are funded: you have some governments funding it, like the BBC, trying to create something that's largely independent, you've got private enterprise, and you have some very good business models out there that will continue to produce trusted news.

Do we have any questions from the floor? No, it was a nose scratch. Rebecca, we don't have very long, but I do want to find out a little bit more about the work you do on a daily basis. You advise businesses; what's their biggest concern?
What Ranking Digital Rights does is evaluate internet, telecommunications, and mobile companies, the biggest large listed companies, on their policies and practices that affect users' freedom of expression and privacy. We have a whole set of indicators where we're asking questions and evaluating how well they're doing, and we're really looking at transparency. So not only do the companies have commitments to respect their users' rights, but what are the specific policies they've got in place? Are they informing users about what data is being collected about them and with whom it's being shared? If I wanted to know what it would look like if somebody took the information that Facebook has on me and created a dossier, Facebook ought to be disclosing enough information so that I can understand that. Similarly, all companies have terms and conditions, where you're allowed to do this and not allowed to do that: how are they enforcing them? What are they taking down, and under what conditions? Is there an appeals process, so that if users feel they've been treated unfairly, say a woman named Isis who has been kicked off of Facebook by mistake, there is a mechanism for them to get reinstated? If they don't have friends in the media, the appeals mechanism right now doesn't work very well, so how do you build a better redress mechanism? Are companies being transparent about the government requests they're receiving to remove content, and about the volume and nature of the content they're removing and under what conditions, so that people can hold accountable whoever is responsible for that content being removed? We need to see more accountability around algorithms: what is shaping what people see and don't see? Who is making the decisions? Who has the power over what I'm able to say and what I'm not able to say, and who's able to see what about me? If we don't have clarity and transparency about that from companies, then people are frankly going to feel even more manipulated
and more mistrustful of the entire environment, and I think that's going to cause people to latch even more onto demagogues. But also, people aren't going to be able to advocate for what they believe in, or obtain information that might somehow go against the mainstream.

What is their biggest concern, and what is the area where you see companies most lacking, where they most need to toughen up?

Companies are very unclear right now, even those that have published a lot of policies, when it comes to exactly what they are collecting about me and what kind of profile could be created on me with the information they have. They don't communicate that clearly, and so that's one very big issue. Another really big issue is, again, transparency around private rules. Companies, at least the big US and European ones, are starting to get more transparent about the requests they're getting from governments, but in terms of how they enforce their own terms and conditions, it's a complete black box. And this is a real problem, because you're seeing people on a daily basis having accounts deactivated or content removed, and they don't know why, and they're not able to get redress. So that's kind of the flip side of this whole fake news debate: already there's no accountability about the decisions companies are making, the rules aren't clear, there's no clear redress and appeal, and it's very, very arbitrary. People are feeling tracked, feeling that they don't have control over who knows what about them. So, going forward, the connection between what I do and this debate is that we don't want to get into a situation where companies are expected even more to be arbiters of speech and it becomes even less transparent. These people are not elected; they're not public officials; they're just random people who got hired through some process. Who are they to decide what the public needs to know or not? So there's a need for accountability, but also a need to be very careful about where the responsibility
is placed, and whether you really want to put that responsibility onto companies.

You mentioned accountability and transparency. What is the single greatest intervention or development that you would like to see to protect the global commons from misinformation?

I think the trend towards fact-checking is useful. I don't know exactly how it should be done, but the notion that, when things are challenged or questioned, there's the ability to have some sort of fact-checking consortium out there is probably helpful. There's also algorithmic fact-checking, which some people are a little leery of, but we at Reuters have developed our own algorithmic tool called News Tracer. It essentially identifies events that are purported to have occurred, looking at social media, then puts them through a whole bunch of algorithmic tests to see if they're likely to be true, and gives each a star rating for how likely it is to be true. It's not your final step; it's your first step in looking at that information. It's useful to us in the newsroom to see quickly: did that earthquake occur, did that terrorist attack occur? So technology will help give people a better sense. But I would really just go back: I've heard some arguments in favor of censorship here at Davos this week, and I would just be so cautious about going down that path. It's so tempting; people say, well, you just get the government to get rid of fake news, to say if it's a rumor you can't publish it, and you see that in authoritarian regimes. It's a very terrible idea. By and large, and we say this all the time, there's a cost to free speech: sometimes people malign other people, sometimes people say things falsely. But if you don't have free speech, you don't have a free society. So I would really push hard against the argument that fake news is an excuse for censorship. That just should not happen.

Rebecca, before we close, just one last question: what is your priority for the coming year, and what will you do?
Well, my project is putting out its second Corporate Accountability Index, where we're going to be ranking 22 companies on their policies and practices, and we're going to be focusing this year on what we call mobile ecosystems. As you've probably seen in the news, another choke point for speech is app stores; there was a lot of coverage recently about Apple removing the New York Times app from the Chinese Apple app store. So we're going to be looking at that; we're going to be looking at whether companies are being transparent and accountable about how they manage content in app stores, when increasingly this is a choke point for expression and for privacy and for people being tracked. Is there enough clarity about who's responsible for making decisions about what we can and cannot do with our apps, what apps we can use, who can make apps and who can distribute them to whom? There's a real issue here for public discourse that I think people aren't looking closely enough at.

Let's see how the events of this year unfold. Thank you both very much indeed for joining us; I know you have to get on to your next engagement. Thank you very much for joining us here in the room, and for those of you watching us live online via Facebook and our website: this session is now over.