Did you stay up late, or did you get up very early? All right, everybody, welcome. Thank you all so much for being here. I hope you've had a fantastic Wikimania so far. My name is Leanna Mixter, and I am a Senior Legal Manager at the Wikimedia Foundation. Today, I'm going to be talking with you and with our panelists about some of the anti-censorship issues facing the Foundation and the broader movement. I'm joined today by a great team of panelists: also from the Wikimedia Foundation, Rachel Judhistari, who is part of our public policy team; Başak Tosun, who is part of our Turkish community; and Emma Llansó from the Center for Democracy and Technology. They all have fantastic resumes, so please check out their full bios in the session program. Please also help out with taking Etherpad notes when you get the opportunity; we certainly want everybody to be able to participate here and learn more about some of the anti-censorship issues facing our movement. So before we dive in, I just want to set the stage for our discussion. The Wikimedia movement and our projects face censorship threats from all angles. As many of you know, most if not all of our projects have been blocked in mainland China for many years. We experienced a two-and-a-half-year block of Wikipedia in Turkey from 2017 to 2020. And even today we are experiencing threats of censorship from places like Russia. We continue to try to keep Wikipedia online as much as possible in order to fulfill our mission of sharing free knowledge with the world, but there are regulatory environments that can make that extremely difficult from time to time. So to kick us off, I'm actually going to start with Emma, asking generally about the broader global trend in internet censorship and regulation. We're seeing a global shift towards increased government intervention in the functioning of online platforms.
What new methods are governments adopting to assert control over platforms, and how is that impacting online expression rights?

Yeah, thank you so much, Leanna, and thank you to everyone for joining the session today. I'm really excited to be here speaking with you all. So unfortunately, I think there are a lot of different ways that governments around the world, from authoritarian regimes all the way up through so-called rights-respecting democracies, are trying to assert control over online speech. We see it in a lot of different forms. There are the classic ways: governments sending legal demands to companies to tell them to take down user speech, trying to demand access to user data, using network-level blocking techniques to block access to sites like Wikipedia and many others, and internet shutdowns that cut off access to the internet wholesale. But some of the other trends I've been seeing are more pernicious or subtle ways that governments go after speech online. One is a real trend towards governments adopting local presence laws, or what are sometimes called hostage provisions, where under the law of a given country there is simply a requirement that if you are directing an internet service towards that country, you must have staff on the ground. That is sometimes because the government wants to, for example, ensure that it has jurisdiction over the company, so that if it has a data protection law to enforce, or a truly legitimate law enforcement demand for data to investigate a crime of some sort, it can actually get the company to respond.
But what we've seen in a lot of countries is those local presence laws, whether actually required by law or simply because a company happens to have personnel on the ground, being used to put individual staff members' lives in danger, or at least put them at risk of being jailed; or, if there are no staff in country, family members potentially being threatened, as a way for the government to try to coerce compliance with content removal or the handing over of user data. We've seen many, many different examples of this in Russia, which passed its own local presence law, I believe last summer. One thing we saw it lead to last fall, for example, was Alexei Navalny's Smart Voting app being removed from app stores because Google and Apple staff on the ground were receiving threats of potentially being jailed. At a certain point, a company can't put its own staff in danger and has to comply with government demands. We've also seen this sort of thing play out in India with the raiding of Twitter's offices, and the use of the fact that the company has staff and equipment in the country as a pressure point. And we're also seeing the idea being baked into the European Union's Digital Services Act, in a requirement to have a legal representative somewhere within the European Union: not only so that they're there to receive legal process and make sure the company responds, but also such that the legal representative could potentially be held legally responsible for violations of the Digital Services Act. So there are a lot of different angles on this kind of issue, but it is, I think, a really concerning trend. There are a lot of other things we could touch on, including governments using companies' terms of service as a mechanism for trying to remove content.
There are also some trends around laws trying to require companies to carry certain speech, so-called "must carry" provisions. We've seen this in the US in particular: governments saying you must be sure to carry politicians' speech, and you can't take down what the politicians are saying. They say it's to ensure that there's equal time for everybody online, but it also seems like an effort to interfere with removing disinformation and hate speech. So there are a lot of different things we could dig into, but I'll stop there because I know we have a lot to cover.

Yeah, that's wonderful. Thank you so much for giving us that full rundown. There truly are a lot of threats popping up around the world. They take different forms depending on which government or which subject matter is at issue, but there really are a lot of complex things in play. Rachel, I'm going to give you a heads up here: we prepared some questions in advance, and I'm actually going to take them out of order. I'm wondering if you would be willing to tell us a little bit about some of the public policy threats we are facing in Asia, and particularly this latest proposal in Indonesia.

Sure, Leanna. Perhaps in responding to that, I can share what the Global Advocacy and Public Policy team usually does in response to the possibility of censorship. I just want to let you know that the Global Advocacy and Public Policy team within the Wikimedia Foundation is now expanding, and we currently have regional specialists working in the US, Latin America, and Asia to engage on policies relevant to censorship.
Currently, our team is focused on protecting three pillars of our work: the Wikimedia model of online collaboration and free knowledge; our people, meaning the editors and others in our movement; and Wikimedia values, which are integral to our work, by supporting human rights online and fighting disinformation. Censorship is in direct conflict with our values and threatens Wikimedia's model of open content collection and curation, so we try to oppose it whenever we can, whether it takes the form of requiring the Foundation to suppress content or of partially or fully blocking our platforms. The way we are able to do so is by doing policy monitoring across the world and working with allies to assess these regulations. We also, of course, consult with the community to craft our strategy, making sure that their safety and interests are central to our response to censorship. Our response strategy includes engaging with the relevant authorities; working with diplomats who can in turn influence the authorities in a country that is considering or actively implementing censorship; working with the UN, where we recently received accreditation, as well as with inter-regional bodies, to create progressive standard-setting for the protection of digital rights; and, when appropriate, utilizing media or public pressure against the censorship. In line with your question about developments in Asia, I really resonate with the previous speaker's environmental scan of the global trends: governments imposing laws that fail to address the diversity of internet platforms and focus only on policies that cater to big technology platforms that monetize content and user data.
Recently in Indonesia, the government imposed a ministerial regulation requiring internet platforms to register, also known as Ministerial Regulation Number 5. The ramifications of registering in Indonesia include requests to take down content the government finds offensive within four to 24 hours, providing specific personal user data, and localization of staff and data processing equipment. We also see this type of regulation happening in India, China, and Cambodia. However, in Indonesia, because the policy is so new, the government has really focused on the consequences of not registering, which can include administrative sanctions, fines, or even platform blocking, which has been imposed against PayPal, Yahoo, and other platforms, especially gaming and streaming platforms. So the regulation definitely runs against our decentralized content moderation model, as well as our commitment to protecting user privacy. In response, many human rights groups at the regional and national level have shared their criticism of the regulation and asked the government to reconsider. I can talk more about what we are doing together with the community to make sure that we are still protecting our model and values, as well as our people, and I'm happy to hash that out in further discussion. Thank you.

Thank you so much. It's definitely a troubling regulatory trend, and we're so grateful for all of the work that the public policy team does to push back against these laws when they're proposed. But unfortunately, despite all of your best efforts, sometimes these laws do go into effect with terms that do not work for Wikipedia, and sometimes they are enforced against us. One of the most obvious examples of that is the situation we faced in Turkey.
Now, Başak, in a second I'm going to come to you to talk about the community's perspective on it, but for those who may not remember or might not have been involved with the Wikimedia movement at the time, I'll give a little bit of background. Back in 2017, all language versions of Wikipedia were blocked in Turkey after we received a government demand regarding two particular articles that had political implications for Turkey. When we received this notice, we at the Wikimedia Foundation immediately evaluated the articles, and we got in touch with the Turkish community. We all agreed that the articles in question were pretty good: they were well sourced, they were consistent with Wikipedia policies, and they should stay up. Once that unfortunately escalated to a block, we promptly challenged the block in Turkish courts and appealed to Turkey's highest court, and our case stayed there for over two years without any progress. While we were waiting for the Turkish court to rule, we also took the case to the European Court of Human Rights, because we believed that the block violated the Wikimedia Foundation's right to freedom of expression, as well as our readers' and editors' rights to freedom of expression. After we made that filing, the Turkish court concluded that the block violated Turkish law and lifted the block in January of 2020. Then, finally, in March of this year, the European Court of Human Rights concluded our case, saying the matter was resolved because the block had ended. Now, this was a little bit disappointing, because we had all really hoped that they would use our case as an opportunity to issue a strong ruling against internet censorship. Sadly, they didn't take that opportunity and dismissed the case for procedural reasons. But the most important thing is that Wikipedia is back online in Turkey and has been for about two years. Now, Başak, you were there from the beginning of this whole discussion.
I know some of my colleagues were chatting with you right after we got blocked. Could you tell us a little bit about the experience of the Turkish editor community while Wikipedia was blocked? How did you stay engaged, and what were some of the challenges you faced?

Yeah, thank you very much. Well, during the block, the community, as you said, was not involved in the talks with the government authorities or in any legal processes. That was handled by the Foundation teams, and the identities of the volunteers were not shared with governmental bodies. But the active volunteers, of course, kept in touch with the Foundation teams to stay informed about all the developments related to the efforts to lift the block, and tried to provide insight and information that might be needed from Turkey. The editor community actually continued to do what they know best: they edited Wikipedia, because it was still possible to access Wikipedia by using VPNs, changing DNS settings, et cetera. In practice, this was a block on those who didn't have the knowledge or skills to bypass it, which is a really large part of internet users. So Wikipedia lost a lot of its readers, not its active contributors, but it also lost many contributions from casual passers-by, the small edits; those were lost. And many internet users, I believe, did not really feel negatively affected by this block, because of proxy sites. But this was really uncomfortable for the active community contributors, because many proxy sites emerged, and they were displaying banners or advertising. This made people question: should we continue contributing when somebody we don't know is commercializing our work? It really created uneasiness. But the part most affected during this process was definitely the user group.
We have a user group, which was established not long before the block to increase contributions from Turkey, build partnerships, et cetera. All of this work stopped, of course, because most of the work we had planned was with schools and universities, and although it's not illegal to use VPNs and the like, it's not advisable to teach and encourage students to use them to access blocked websites. So all of this was postponed. At the beginning we thought it would be for a few days, then we were postponing for a few weeks, and nobody expected it to last for years. But the community did want to stay accessible and visible. We used social media channels to reply to questions and to share what other communities in the world were doing, to show that great things were happening through Wikimedia projects. And we did try to organize events not focused on Wikipedia but on other projects; our biggest event was a Wikidata birthday celebration in Istanbul, an in-person event. We continued to hold events because we were confident that nothing the group was doing was wrong or illegal. So the group stayed active in some way, though not through Wikipedia itself. Also, our international friends from the Wikimedia movement always asked how they could help, and we asked them to please contribute on their own wikis by editing topics related to Turkey and Turkish culture, because those of us who did not have access would probably have liked to contribute on those topics. We didn't really want Turkey to be remembered as the country where Wikipedia was blocked, because there are many great things to share. The Wikimedia Foundation's "We Miss Turkey" social media campaign also reflected this idea, along with the thought that the block was not only affecting people in Turkey but people all over the world, because they lost access to the contributions of Turkish people. So after the block was lifted, of course, we were very happy.
Nothing, not even the pandemic, could stop the group from contributing, and lots of events followed. First of all, we established an association and obtained nonprofit status from the ministry, made a plan for reintroducing Wikipedia to Turkey, and got funding from the Foundation. The work is going great. In many places there is actually even more interest in learning how the Wikimedia projects work, though in some places we still hear, "Isn't Wikipedia that website that doesn't work? Why should we work with those volunteers?" But in general, it's growing very fast. Turkish Wikipedia reached 500,000 articles last month, so it's going great. Thank you.

That's so wonderful. It's really exciting to see the way the community stayed engaged while you were blocked, and then the excitement and growth you've had since the block ended. I know there are still some challenges with getting traffic back to the same level it was before, but the work that the user group has done has been really inspiring. I've been so impressed by all of the work you've done and your commitment to the movement; you've been unshakable. I think it's really a testament to how our decentralized community model can be an asset. And so, Rachel, I wanted to come to you to talk about this a little bit. We do see these proposals popping up around the world, and even when they might not be effective or make sense for Wikipedia, oftentimes there is a legitimate concern at their core: people are worried about harm to others, safety, online harassment, extremism, disinformation. A lot of times there is a legitimate concern there, but I personally think our projects are pretty resilient to many of those threats.
So I was wondering if you could tell us a little bit about how you see the Wikipedia model, and the Wikimedia model generally, providing protection against those threats.

Sure. So, Leanna, I think one of the core trends we see around the globe is, again, how policymakers treat the internet ecosystem as a monolith, with regulation that caters to how they aim to rein in Big Tech, which definitely monetizes user data as well as content. I think one way to address this is by advancing the visibility of our model: how decentralized content moderation can be beneficial for addressing disinformation, and how we protect users by enabling them to use anonymity for their own security, because in countries where an authoritarian regime or a populist government takes hold, there are chilling effects for everyone who wants to express themselves or share dissent. So I think it is very important for us to work with progressive parliamentarians and inform them that our model actually works, because it doesn't over-emphasize automation; it relies heavily on a democratic decision-making process as part of content moderation. It's also important to build closer alliances with digital rights groups, not only those working globally but at the regional and national levels as well, because we understand that one size doesn't fit all, and regional and country contexts can be very different. And it is important to inform the community about how we can communicate our model to stakeholders, especially policymakers, so that they are well informed about it.
As a final comment, in making our own decisions on how to respond to regulations that might compromise our model, our values, and our people, it is extremely important to circle back to our new human rights policy, which was announced last December, and use it as a standard to ensure that we can protect the safety of our community and defend a model that has worked for the past 20 years. I'll close with that, hoping to get insights from our speakers or questions from the participants here. Thank you.

Well, here at the Wikimedia Foundation, we are obviously going to be big cheerleaders for the Wikimedia projects. And yes, our model works great, but we aren't the only website on the internet, and perhaps not all systems are working quite as well. So, Emma, I wanted to ask you a little bit about the other folks on the internet. There are certain platforms out there, and I won't name them, that have been accused of implementing their policies in an arbitrary way and of needing more checks and balances, more oversight. Do you believe that government control is necessary to hold platforms accountable? To what extent is there a need for more regulation of online platforms, recognizing that maybe not all platforms are as well behaved as Wikipedia strives to be?

If we could get everybody to do community moderation, I think the entire world would be a lot better off. But yeah, that really is the million-dollar question. Governments and groups of citizens around the world are asking exactly this. There have been decades of a kind of hands-off approach to online services, and now we are seeing the pendulum swing very strongly in the other direction. As a free expression advocate, the extent of that swing concerns me a lot.
It is absolutely reasonable to be concerned and frustrated, whether about social media content moderation or about the way people are directed to information through search or through ranking and recommendation algorithms. People have a very genuine sense that things are out of whack in the information ecosystem, and a lot of places are looking to government regulation to try to bring things back into balance. But I would say it really, really matters what the government's approach to that regulation is. There need to be really strong safeguards in law ensuring that whatever regulators get involved in any kind of oversight of tech companies are independent from political interests, because, and I don't have to tell the Wikimedia community this, content moderation is difficult and you're never going to make everyone happy. There will always be disagreements and disputes. When you're trying to do content moderation at, say, the scale of a social media service, you are not going to have completely consistent and comprehensive enforcement of your terms of service, even if you use all the automation you can get your hands on, because you're going to have error rates that mean thousands or millions of posts that should have come down, or should have stayed up, will not get the right outcome. So if regulators have in mind that there can be complete consistency and coherence, much less that it will align with their personal view of what the right call is around speech, that's a recipe for disaster. We're not going to see good outcomes there. I also think there need to be real substantive and procedural safeguards around what the oversight that regulators can do actually looks like.
One big development we're seeing, including in the European Union's Digital Services Act, is the idea of services needing to conduct risk assessments: to look at their policies, their products, and the services they offer, and go through something like a human rights impact assessment, or another kind of evaluation of what risks their products and services actually pose to users' human rights or to the rights of other people in the community. That is a really important step for companies to take, and I think it's a very positive direction to go. But I think the concern with overly broad government regulation comes in when you start talking about what companies should do to mitigate those risks. If that looks like the regulator saying, "Hey, we keep finding disinformation on your service even though you tell us you take it down, or we keep finding hate speech, or we're worried about illegal terrorist content, so you need to start using content filters, or increase your use of content filters, or make them stricter, so that you eliminate that content from your services," that's not a reasonable outcome. That's going to have huge impacts on people's freedom of expression and privacy rights, especially if those filters conflict with the ability to provide end-to-end encryption.
So that's just one example of how interrelated so many of the issues around our rights with online services can be, and how regulation, if it's approached in the right way, can put really helpful pressure on the biggest companies to do better and to think more about their users first. But the regulations have to be designed with users' rights in mind, and with really important structural safeguards, to ensure you don't end up with a political figure who responds to the whims of their party making decisions about what kinds of content a company should or shouldn't allow on its services.

Wonderful. The next question I was going to ask you was what platforms should be doing to protect against government overreach, but you've kind of already addressed that. Recognizing that we have only a few minutes left in our session, which has flown by, I'm going to use my moderator's privilege to make an awkward transition and tell people a little bit about what's going on in Russia. I imagine a few people have probably joined out of curiosity about the current situation in Russia and Ukraine. The good news is that Wikipedia remains online in Russia, Ukraine, and basically the entire affected region, but we do continue to receive notices from Roskomnadzor, the Russian government's internet regulator. We have received a few notices from Roskomnadzor every year for the last several years; if you check out our transparency report, you'll see that it's not uncommon for us to have one or two notices, but we have noticed an increase over the last several months. This started late last year, when Roskomnadzor launched an app that allowed citizens to report online content that they believed violated Russian law. We started seeing more requests once that became an option. I suppose that's the flip side of community moderation at work.
Then, this year, we have seen some notices from the Russian government directly, including some concerning articles about the Russian invasion of Ukraine. So far we remain online. The legal team, the public policy team, and various other people across the Wikimedia Foundation are continuing to monitor the situation very carefully, and we're staying in touch with editors who can support us and finding out how we can support them. It is a very challenging situation. We don't know what happens next, and I can't predict the future, but we are going to do everything we can to keep Wikimedia online without compromising our values in order to do so. We're going to fight for it as best we can, but it is a tricky environment to work in. I just wanted to provide that information in case anybody joined this call hoping to get an update about what's happening in Russia: we're still online, hoping we'll stay that way, and going to do everything we can to make sure that continues to be the case. With that note, we only have a few minutes left, and I have a question for the whole panel. As we see these global trends moving towards more online regulation, what can Wikipedia do to sustain our model and deliver the promise of a free and open internet where every single human being can freely share in the sum of all knowledge? What do you all think we should be doing at this point to make sure that we continue to live up to our vision in this challenging environment? Anyone on the panel who wants to answer, please go ahead; the floor is yours.

From the community perspective, I will say: continue to do what we do best, editing, and try to expand this to diverse people.
Well, I wanted to chime in and say a big plus-one to what Rachel was saying earlier about the importance of being involved with local and national digital rights groups and advocacy groups, to really understand what's going on at the national level in different countries around the world. I am really lucky in my role as a public policy advocate to work with a lot of different folks from the Wikimedia Foundation in the US and the European Union, and I am always happy to see a Wikimedia person in the room, because they're going to be super thoughtful, focused on user rights, and have a really different story to tell than what most policymakers understand about how the internet works, what an online service is, and what is even possible with content moderation. I have seen firsthand how influential hearing those stories can be in a public policy discussion. So I would encourage you to build that strength of advocacy in as many parts of the world, in as many countries, as possible, and to partner with local groups who can help you navigate the political dynamics and figure out which messages are really going to hit home. You all have one of the most inspiring stories to tell about how people can use the internet, and I've seen it work wonders in conversations with policymakers. So I'd say keep up the good work.

Thank you, Leanna, for the million-dollar question. I'll try to unpack it here. I think what's most important for us to do is to advance our ability to address disinformation, because it has become one of the key challenges we are facing on our platforms. On top of that, we also need to improve our ability to communicate our model better, because for non-Wikimedians it's difficult to get a grasp of what we are doing, although it's actually very simple: we have a democratic, decentralized content moderation model, and we protect the privacy of our users.
By making our model easy to understand, especially from a policymaker's point of view, we can get more progressive regulations that protect our model regionally and globally, because it's important not only to react to policies that carry risks of censorship but also to create better regulations that protect our model, our values, and our people. In doing so, I think this is a pivotal moment for us to be visible on the global advocacy stage, to amplify our unique voice and our very different take from other big tech companies, and to say to governments that while it is human nature to be anxious when regulating something that is very new to most of them, they also have to be mindful that the internet is not a monolith. Policy and regulatory frameworks need to be created on the basis of the diversity of the internet ecosystem, and to uphold human rights, freedom of expression, free knowledge, and privacy protections. We are only able to do that by working not with one government but with many governments around the world, and by using UN platforms, which we are now better able to do because we just received accreditation from the UN ECOSOC. I hope that will help us in creating standards that can uphold and protect our model globally. I'll close with that, and, Leanna, maybe you can also add, because you have more insight than me.

You know, I don't know that I have too much to add, and even if I did, I wouldn't have time, because we are actually at time for this panel. It was so wonderful to chat with you all. I completely agree that our model can really be a good example of how the internet could be, and so I encourage us all, in our respective roles, to keep showing up and keep doing the good work. We're doing something right, so let's keep it going. Thank you all so much for being here.
A huge thank you to our panelists, and also to all of you who joined, just for listening in. I hope you have a great rest of your Wikimania, and I'm looking forward to learning from you all in the days ahead as well. Thank you all so much. Have a great rest of your day or evening, depending on where you are. Thanks, all; take care.

Thank you, Leanna.