Good afternoon, everyone. Thank you for coming to my session, which I've entitled Cross-Wiki Ideological Conflicts and Wikimedia's Vision of Knowledge Equity. I want to start with a disclaimer. I am a Wikimedian who believes in truth and supports the knowledge equity pillar of the Wikimedia 2030 vision, and some of the slides I'm going to present might provoke some nationalist tensions. So I want to say that I have the best of intentions for our open knowledge movement; I have no intention of offending any viewer or stirring up hatred against any nation. The journey I want to take you through in the next 20 minutes is my personal reflection on how transnational and translingual divides in ideology and opinion are going to be a problem for knowledge equity, and I want to open a discussion on how we might be able to solve these issues. I don't claim to have all the answers; I just want to give my perspective on where we are and where I think we can go.

With that in mind, I want to start by recapping a talk I gave at Wikimania ten years ago, titled "Wikipedia is Afraid of Governments". At the time it was a tongue-in-cheek critique of Wikipedia's editorial policies: if you go to the Polish Wikipedia, the article on the Wikimania 2010 venue city is called Gdańsk, while the German version calls it Danzig. You've got names of things and perspectives on things being divided along lines that were already drawn by how people use language in different language communities. Scale that up and you get to the more contentious disputed territories. For example, there is a set of islands in East Asia that is currently disputed between the People's Republic of China, the Republic of China, and Japan, and the different editions of Wikipedia have very different perspectives and article names, which reflect the main language communities and the nationalist identities behind those languages.
So, the Chinese Wikipedia calls the article Diaoyu, which is the Chinese name. The Japanese Wikipedia says Senkaku, which is the Japanese name, and each reflects its own national perspective. Interestingly, the English Wikipedia article uses the Japanese name rather than the colonial English name, on the grounds that Japan currently has military control over these islands. So this is what we're following, and I think that is a good point to bring up where these problems might lead: Wikimedia has an over-reliance on authorities in the outside world. That has really hit home in how Wikipedia has been covering a political conflict that is very close to my heart, the Hong Kong conflict since 2019.

There is one case, covered by both the Chinese and the Cantonese editions of Wikipedia, about a first-aider who was reported to have been shot blind by riot police. The Cantonese Wikipedia, so the language spoken colloquially in Hong Kong, reports the events quite matter-of-factly based on local accounts reported in local media: riot police shot the first-aider blind, full stop. Hong Kong media reports were taken at face value, and a valuable news photograph from that incident, released by a news agency under an open copyright license, was included in the article. Whereas if you go to the Chinese Wikipedia, it accepts the Chinese state media viewpoint that Hong Kong media staged the photos and that the cause and extent of the injury of the person involved was disputed. Hence the article was framed as a controversy rather than an incident, and there was no photograph in the article because, hey, it's disputed; that's what Chinese media say, so we need to give due weight to that.
Okay, so in the last slide we saw that there might be intervention by governments. And there was a recent report that discussed how the Croatian Wikipedia had a Croatian nationalist bias, with bullying behaviour that was enabled by the editorial decision a decade ago to separate the different national varieties of the Serbo-Croatian language continuum. What this example shows is that one does not need a dictatorial government for a nationalistic ideological bias to occur on Wikipedia. You just need a dominant viewpoint that can crowd out other viewpoints, and if those viewpoints align with language divides, then you can get big problems that affect knowledge equity.

So I want to look at how some fundamental Wikipedia policies have got us where we are today and how they are causing problems. The first is the notion that Wikipedia relies on verifiability, not truth, as determined by reliable sources. What this means is that we rely on external sources to dictate what Wikipedia can and cannot say, so if someone can manipulate the external sources, they can indirectly manipulate Wikipedia. That is compounded by our consensus model of decision-making: if one has influence over who can edit Wikipedia to promote which viewpoint, then one can also influence what is stated on Wikipedia. These policies worked well ten years ago when Wikipedia was a fly on the wall, but now Wikipedia has become so big that we need to revisit them, because they are now easily manipulated by an authoritarian regime or a multinational corporation with ample resources and a strong incentive to influence what a central repository of knowledge like Wikipedia is saying. So let's look at reliable sources in detail.
Some of you may recall that two months ago, under pressure from Chinese and Hong Kong authorities, Apple Daily, the biggest pro-democracy newspaper in Hong Kong, shut down wholesale due to a government freeze on the company's assets. This really brings home the fact that reliable sources, as we understand them, are actually quite fragile when a government or a corporation gets sufficiently powerful, regardless of where you are. They can indirectly dictate what the reliable sources say, and therefore what Wikipedia can and cannot cover. This incident was reported on Wikipedia as well: the news item was shown in the English Wikipedia's "In the news" section on the front page.

The next thing I want to look at is the idea of consensus. Earlier on I talked about the 2019 incident that the Cantonese Wikipedia and the Chinese Wikipedia presented very differently, one with a Hong Kong local perspective and one with a perspective that mainly follows Chinese state media. How did this happen? Well, if you look at the discussion, you can see that the Chinese Wikipedia actually had a real consensus to accept the Chinese media point of view. I don't have any reason to doubt the sincerity of the editors who participated in that debate, as much as I disagreed with more than half of them. And at this point I want to bring up a reminder that Chinese authorities have been using what is called the Great Firewall of China to block Wikipedia for over a decade now. As soon as a government can apply selective censorship to decide who can edit Wikipedia, it can influence the consensus that Wikipedia builds. To take it to an extreme, if in a certain banana republic only flat-earthers are allowed to edit Wikipedia, then the Wikipedia edition representing that language community will have a consensus that says the earth is flat.
And in the case of Hong Kong, as reported by some media outlets and also the Wikipedia Signpost, there was an incident, related to this edit dispute, where someone threatened to report Hong Kong editors who had an editorial perspective critical of Chinese authorities to the Hong Kong national security police. So you can see how this conflict can affect consensus, even if we have all the good intentions to run Wikipedia as a consensus-building project.

And the hard question I want to ask all of us is: are we complicit in authoritarian abuse? Over the last decade and a half, as the less liberal countries around the world decided to censor Wikipedia one way or another, we have been building bridges and cultivating local communities to help them access Wikipedia from behind government censorship. But over the last few years, I think there is increasing evidence that the same dictatorial regimes are using these processes to reverse-infiltrate Wikipedia, by selecting who may or may not participate on Wikipedia in circumvention of their national censorship.

So, to understand this problem properly, we need to define the question: what do we mean by knowledge equity? Well, let's start with the original Wikipedia vision from a decade and a half ago: imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. The sum of all human knowledge: whose knowledge are we talking about? Not just the rich and powerful, surely. We want to talk about all knowledge known to all humans, including underrepresented and oppressed voices. And we want free access for every single person. That means we want to enable access both to read and to write Wikipedia, so that everyone can fully participate in the global conversation that is this worldwide knowledge-sharing project.
A few slides ago I said that for the first ten years, Wikipedia really was a fly on the wall collecting the world's information. But now Wikimedia as a whole has built a reputation as the central library of all knowledge, so we can no longer sit as a fly on the wall. We need to foster a global conversation, and we need to set some international standards on what counts as knowledge and what we ought to capture in our projects. One of the main things I think we really need to tackle now is how we can resist governments and corporations that act against knowledge equity and try to turn our editorial policies into one-way streets in which they influence us but we don't influence them. We need to have a robust response to those.

So, looking at some of the more specific recommendations I want to bring out. The first is tackling the issues that the notion of independent reliable sources has given us in this new era, in which Wikipedia is now the leader rather than the follower of knowledge. I think the Wikimedia movement has actually been doing quite well here. We have Wikipedians in residence who work in big cultural institutions to help them bring the knowledge they hold to the public and release it under open licenses, so that Wikimedia projects can propagate it to the world. We have oral history projects whereby Wikimedians, in good faith, create primary sources for underrepresented cultural communities around the world so that their voices are heard; that gives us a primary or secondary source that can be cited on Wikipedia, with Wikimedians themselves now acting as the source of the reliable source. In recent years we have been venturing into journals: there are now four WikiJournals, on science, humanities, medicine, and psychology and behaviour respectively. And I think we can do more with Wikinews, which has been going for the last decade and a bit, and move it further towards knowledge creation.
We should also continue to partner with our friends at Wikinews to see what we can do in terms of reporting on the ground. Another recommendation I have is that "verifiability, not truth" really needs to evolve, because Wikipedia, by its design and twenty years of evolution, has become the most verifiable thing on the internet. We should cease depending completely on things external to the Wikimedia world for the sources we verify against. And in my personal opinion, Wikimedia can no longer escape the quest for truth. At some point this is something we need to tackle as a movement, because, again, we have become the central library of humanity.

The next aspect I want to talk about is leading international standards. YouTube and Twitter already rely on Wikipedia officially in their editorial processes. So can we lead some international standards in fact-checking, and can we deploy more Foundation resources and staff labour to help the volunteers who are on the front line of making those editorial decisions?

The other thing I want to talk about is language classification. To get a Wikipedia, you need an ISO code for a language, and, as misattributed to Max Weinreich, a language is a dialect with an army and a navy. More recently people have repurposed that phrase into "a language is a dialect with a missionary and a dictionary". Really, I think what it now means is that a language is a dialect with an ISO 639 code and a Wikipedia. The last few years have seen users from small language communities actually coming to the Wikimedia Foundation's language committee and asking for help to get their language classified and accredited, so that they can then start a Wikipedia and have an internet locale. So the Wikimedia Foundation and the Wikimedia movement should really be doing more to lead these standards. And finally, I think that to foster a global conversation we need more standardization across Wikimedia projects.
Sometimes it's been too much: the German Wikipedia seems to complain every year. But in other places there might be too little, and we've made a good start with Wikidata and global templates. Moving forward, we also need some robust editorial policies. We have the universal code of conduct coming up, and I think we need to keep pushing that. We might need global editorial policies, for example to be robust against government censorship; maybe we should have a rule that if an authority censors Wikipedia, Wikipedia should really be censoring their state media back.

So that's all I've got for you. In summary: I think Wikipedia's policies have been great at getting us here, but there is an over-reliance on authority in our current policies, these policies entrench ethnolinguistic conflicts, and our processes are vulnerable to abuse by external authorities. To achieve knowledge equity, I think we need to support knowledge creation more as a Wikimedia movement, we need to lead international standards, and we need to foster strong cross-wiki relationships and policies and have a robust response to censorship. Thank you very much, and let's carry on.