Hello, welcome. Thank you so much for joining us today. My name is Kira Wisniewski and I have the honor and privilege to serve as Executive Director of Art and Feminism. My pronouns are she/her. I am joining you from the ancestral land of the Piscataway people, commonly known as Baltimore, Maryland. I do want to point out that when the talk comes to an end, we can continue this conversation in the after-session chat at Building 6, Floor 9, Table A. And I just want to take a moment to really thank each and every one of you for showing up in this moment. We're really excited to be here and to be in community with you today to share a little bit more about the work of Art and Feminism and what we're doing outside of editathons. We also want to name in this moment that not only are many of us experiencing screen fatigue, we're on the last day of a very exciting conference. And I just want to recognize that we're all going through some sort of trauma in various gradients. So we want to express extra gratitude for you showing up today. Please give yourself the next 75 to 90 minutes to be fully present. You're doing a lot and you deserve this time.

And I do want to note that we saw in the chat on Telegram a question about taking screen captures and sharing on social media, asking if that conflicts with any Wikimania safe space policy. I do want to say that all presenters of the session have given consent to share out in that way, so you can go ahead and get your screen captures.

We start all of our Art and Feminism events, both virtually and in person, by sharing our very own brave and friendly space agreement. The goal of this session is to create an encouraging space for collective learning. This requires intentional behavior where participants are conscious of and accountable for the effect of their statements and actions on others. We respect our experiences and the experiences of others and recognize that we cannot do this work without one another.
We agree to hold each other accountable to foster a brave and friendly space. You can see the URL for the full agreement there on the screen. I also want to uplift that Wikimania has its own friendly space agreement.

I also want to say that it would be disingenuous not to acknowledge what is currently happening internally at Art and Feminism. Honestly, it was discussed yesterday, possibly withdrawing from Wikimania in order to fully focus on the internal work necessary as part of our commitment to one of our values: to create safer and braver spaces that are caring, equitable, pro-Black, queer- and trans-affirming, intentional about accessibility, and all-around anti-oppressive. We find joy in our community, knowing it will sustain us in our work. We recognize that a lot of today's presentation aligns with these commitments, but we need our external work to also correlate with our internal work to address anti-Blackness and overall oppression. We're engaging with an anti-oppression expert and facilitator to create policies and practices and make the cultural shifts necessary as we continue on this journey of learning and unlearning how to move toward equity in not only our external work but our internal work, and to ensure that we're accountable. It's not easy work and there isn't one simple solution; it's an ongoing process, and we have work to do, frankly, to live up to our values.

Today I am elated to be presenting with these fine individuals who will be joining me during this presentation. From top to bottom, left to right, we have Mohammed Sadat Abdulai, Amber Berson, Sarah Klugej, Madhavi Gandhi, Richard Knipel, Monica Sengul-Jones, Melissa Tamani, myself, and then Zita Ursula Zage.

So before we get into the presentation, just to give a little bit of background on who we are as an organization: Art and Feminism builds a community of activists that is committed to closing information gaps related to gender, feminism, and the arts, beginning with Wikipedia.
Art and Feminism envisions an internet that reflects diverse global histories of art making, where communities who have often been written out of history feel welcome and empowered to participate in writing and righting our stories. Since 2014, over 20,000 people at more than 1,500 events around the world have participated in our editathons, resulting in the creation and improvement of more than 86,000 articles on Wikipedia and its sister projects. From coffee shops and community centers to the largest museums and universities in the world, we lead a do-it-yourself and do-it-with-others campaign across the world. We provide resources for folks to train and edit Wikipedia with a feminist and anti-racist lens in a very grassroots, community-organizing way. Started as an American project, it has grown into a global one, and we now have events every year on all six inhabited continents. Our website is available in four languages, hopefully five by the end of the year, and our organizers edited in over 20 languages last year.

And here is a sampling of some of our partners from over the years. I of course want to acknowledge, as I'm sure you're all aware, that we're certainly not out here alone doing this work. I want to shout out some of the ongoing partners and collaborators that we're lucky enough to work with: AfroCROWD, Black Lunch Table, the Wikimedia LGBT+ user group, and of course the famous Women in Red.

And now we will get into the first part of our presentation. The Call to Action Art Commission was established by Art and Feminism in 2017. Under this program, artists create a Creative Commons-licensed work that is hosted on Wikimedia Commons. In 2021, we are celebrating the Global South, limiting applicants to artists located in the regions where the curators of the call reside: African countries, Brazil, India, Bangladesh, Pakistan, and Sri Lanka.
We're interested in working with artists who align with our values, which also center our work, who see their own art as a form of activism, and who can help us visualize what Art and Feminism looks like. So in this section, we'll be sharing what we've done in the past and the current iteration in process, and touch on our plans for remixing the commissions later this year and beyond. I would at this time love to invite Amber, Sarah, Madhavi, and Zita. I also want to quickly highlight that where Madhavi is located, it is very late, so she'll be dropping off at the end of this section. Thank you.

Hi. So I wanted to talk quickly about the history of this initiative. As Kira mentioned, it was founded in 2017 by the founders of Art and Feminism. And I will draw your attention... oh, can you switch slides? Thank you. I'll draw your attention to the image of the gasoline can. This is a work by Divya Mehra, Dangerous Women, Blaze of Glory, and it was commissioned by Art and Feminism as the inaugural project. When we started these commissions, it was really an invitation process: the curator, the founder, invited people that they were already in conversation with, already within the network. We worked with Divya the first year, and then two years later we were able to expand the project to Wendy Red Star; that's the image on the top right. And we'll see the third image, the third commission, a little later on in the presentation. The funds to do it were self-generated; they weren't funded by the Foundation or from our larger grants. We felt it was important to diversify what was being used to promote our events and give people an opportunity to find material that went beyond the one logo that we had available. But we also felt like it was an exciting opportunity to challenge what was in the Commons and work with the Commons in a new way. This brings us to the present, so I'll pass it to Zita and Madhavi. Thank you, Amber.
So this year, the 2021 Call to Action Art Commission edition was announced on Art and Feminism's website and social media pages on the 15th of June 2021. The target regions for this year's edition were in the Global South, so we had three curators residing in Africa, India and Bangladesh, and the Portuguese-speaking region. After the call was announced, we had the curators distribute the announcements to the artists in their regions, and we also organized information sessions, with each curator facilitating one session on a platform which was suitable for their specific region. All too soon, the call came to an end yesterday, with a total of 85 applicants: 57 submitted in English and 28 submitted in Portuguese. Now we the curators have a really tough job from now until September 1 to select and announce the commissioned artists. Selected artists will then have from September 1 to October 31 to create the original work. I would now hand over to Madhavi to speak more about the current iteration.

Thank you so much, Zita. Hi everyone. As you already heard from Amber, Zita, and the team, this year our focus is on celebrating the Global South. And now that we're looking at the applications, we're not only looking at the body of work that the artists have created, but we're also trying to understand how they align with the Art and Feminism values and how they visualize art and feminism. The idea here is also, for us on the inside, to see the different ways in which people visualize art and feminism. It's a little sneak peek into what the Global South thinks of art and feminism. We're also organizing a Wikimedia Commons training session for all applicants on the 18th of August, which is tomorrow.
This is for all applicants, which means that, again, we are exploring and encouraging everybody to start contributing to the Commons and creating art for the Commons in a way that would represent the Global South, at least from an artist's perspective. So that's what we're looking at. We're looking at all these applications that Zita mentioned, and there are already so many different descriptions of feminism and of how people are looking at it, so it's really exciting for us. And yeah, personally I wish we didn't have to choose just one or two, but it's going to be an interesting commission to look at when it's finally out, so that new and existing editors and organizers can actually use these artworks to promote their events.

Hi, everybody. As Madhavi and Zita and Amber have just said, with the Call to Action Commission, Art and Feminism is looking to expand the uses of Commons beyond photos. Commons is understandably driven by photos, and there are already campaigns for Wikipedia pages wanting photos to illustrate their subjects. But there are also a lot of other file types on Commons, like audio, video, vector files, and GIFs like the one you're seeing here. This GIF was created by Tuesday Smiley as part of the 2019 Call to Action Commission; it's called rage/sorrow. Files held in the Commons are often artworks like this, and they can be sources for other users' artworks. Our Call to Action remix event later this year will invite users to take the artworks commissioned by Art and Feminism and remix them to make their own work, which they will bring back to the Commons to share. Participants can create a new born-digital artwork or a physical object that they can take a photo of. When I've organized volunteers for the Art and Feminism editathon at MoMA, I've had volunteers make buttons to give away. You'll see an image of that button on the right here.
I took the GIF from Tuesday Smiley and remixed it in button format, and the idea for this remix came from there. We hope that this kind of remixing can help other organizers integrate art making into their events. So our remix event will take place later this year, and we'd love to hear from you at the end of the session today about any thoughts or input you have about frameworks or prompts for that, and whether you might have other examples of using Commons in your curatorial or educational practices. Thanks.

Great. So at this moment, especially since Madhavi has to drop off soon, we'd love to take a moment to see if there are any questions or comments. Unfortunately, as you might have guessed from participating in other Wikimania sessions, we can't see the chat that's happening in Remo because we're on a different platform. But if somebody wanted to put any questions into the Etherpad, we'd be happy to look at them. I can start with a general question for everyone here: what do you think is the most exciting thing about the Call to Action commission? Any one of you can field that question.

I think I have, since the very beginning, been very fascinated that we don't outline or say that this is what the artwork needs to look like, etc. It's really wide open. So, like I said, it really gives us the opportunity at Art and Feminism to learn and see what everybody else has to say. I think that's something that has really fascinated me about this call.

To me, I think it's the target regions. This is the first time that we are focusing on the Global South, and it will be pretty interesting to see the kind of works that emerge from it. Africa has many different cultures, and the Portuguese-speaking regions and India have so much rich culture. So I believe that will inspire the kind of designs that artists come up with, and I'm looking forward to a lot of the designs that we receive.
So yeah, I'm excited.

A question for you, Sarah. I know that when we were talking about the remix part of this, we were talking about how perhaps other GLAM institutions could do something similar. Can you talk a little bit more about that concept?

Yeah. I mean, to go back, I think what's exciting to me about the Call to Action is that it treats Commons as a creative space and not just as a repository for things that are made elsewhere. Commons is an incredible creative tool and an incredible repository of images, and I would love to see more museums and teaching spaces use Commons as a way to engage with a wider community. So, you know, if you can put images from your collection online and invite people to remix them, you can use that as part of the museum's educational outreach. It gives access to that kind of platform and information to people outside the reach of one museum. What's exciting about Wikimedia projects is their incredible global reach, and to have that all in one space is, I think, great for access to collections.

So a question that we have from the audience: artists seem receptive to sharing their work with a Creative Commons license. Are you going to make extensive use of OTRS to overcome Commons copyright concerns? Amber, do you want to field that question?

Well, with this commission, part of what we're doing is training the artists to be able to upload their work themselves, so we won't have to be engaged with the OTRS system, for lots of reasons, but it's just easier that way. In the earlier commissions, it was the founder who uploaded the work, and then we did have to engage in that process. And I don't feel like it was too extensive a process. But again, part of the goal of this project is to train people to be self-sufficient in doing this work themselves.
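As an aside on the licensing workflow described here: once an artist has uploaded their own work, anyone can confirm how it is licensed through the public MediaWiki API that Commons runs. The sketch below is illustrative only; the endpoint and parameters follow the documented MediaWiki `imageinfo` query, while the filename is a placeholder, not one of the commissioned works.

```python
# Minimal sketch: build a MediaWiki API URL that asks Wikimedia Commons
# for a file's license metadata (returned under imageinfo/extmetadata,
# e.g. LicenseShortName and LicenseUrl fields in the JSON response).
from urllib.parse import urlencode

COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def license_query_url(filename: str) -> str:
    """Return an API query URL for the license info of File:<filename>."""
    params = {
        "action": "query",            # standard read query
        "titles": f"File:{filename}", # Commons file page title
        "prop": "imageinfo",          # ask for file metadata
        "iiprop": "extmetadata",      # includes license name and URL
        "format": "json",
    }
    return f"{COMMONS_API}?{urlencode(params)}"

# Placeholder filename for illustration:
print(license_query_url("Example.jpg"))
```

Fetching that URL with any HTTP client returns JSON whose `extmetadata` block states the Creative Commons license the uploader declared, which is one way a training session could show participants that their chosen license is recorded correctly.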
We do have, as part of the application process for this year's call, an explanation of the licenses and of what it means to put a work on Wikimedia Commons. It's part of the very short application form, so that people understand what it means, and there's also the opportunity for questions about that as well.

Before we move on to our next section, does anyone from this panel have any last thoughts that they'd like to share about the commission or about putting art on Wikimedia Commons?

I think this is also the exciting part: everybody, irrespective of whether they're chosen for the commission or not, gets to learn about putting their art on the Commons and about licenses. I think it's very interesting that there is an element of learning there.

Zita, do you have a closing thought about the commission? No, I don't. Amber, Sarah? I think we're just excited to see it happen, so yeah, just looking forward. I wish I had something brilliant to say, but all I can say is, yeah, it's going to be exciting and we're looking forward to it. Definitely.

Amber, do you have any closing thoughts before we move to the next section? Actually, yes, maybe one thing. I mentioned that this is something that's not funded through our grants, so I think it'd be appropriate to thank all the people who donated to the campaign this year. We really appreciate it, and it does help us continue initiatives like this.

Great. Well, thank you so much. We are excited to see what comes out of this commission, and we'll definitely let everybody know once it's up on Commons. So thank you. Now, if we could switch slides... there we go. Our next section is about Unreliable Guidelines, which is a report that Art and Feminism put out pretty recently.
Unreliable Guidelines: Reliable Sources and Marginalized Communities in French, English, and Spanish Wikipedia is the report of our inaugural research project, Reading Together: Reliability and Multilingual Global Communities, funded by WikiCred. The project used an intersectional feminist methodology (ooh, sorry, that was a hard word to say right now) to address how Wikipedia trainers involved in the Art and Feminism movement approach the reliable source guidelines in French, English, and Spanish Wikipedia. The rigorous analysis ends with recommendations for a more inclusive and diverse Wikipedia. This section will feature the research leads as they discuss their process, findings, and the questions that the research raises for the future. So I'd love to pass it over to Amber, Monica, and Mel.

Hi, everyone. First of all, I'd like to introduce the team that made this effort possible. The lead researchers were Amber Berson, who is an Art and Feminism Co-Lead Organizer; Monica Sengul-Jones, who is part of the Art and Feminism community, has organized many Art and Feminism events, and is also a PhD-holding freelance writer and researcher based in Seattle, USA; and myself. I am also one of the Art and Feminism Co-Lead Organizers; I am based in Peru and I've been editing Wikipedia since 2015. But we did this work with the support of an advisory board, which we think is key to doing this kind of work as something collective. They were Susan Bargnon, Mariana Fossatti, Camille Larive, Wala Abdel Manayem, and Jalón Pan, and we really appreciated their presence during this process. We also want to thank WikiCred, the fund that allowed us to start this research project, and all the people who participated in the town halls that were part of the methodology we used. I'd like to pass it to Amber.

So we wanted to highlight that this is a celebratory moment. It took us a long time to get this report out into the world because we wanted to do our due diligence.
Our goal was to provide insight into how definitions and interpretations of the concept of reliable sources impact coverage of marginalized communities on wiki. This categorization includes, but is not limited to, cis and transgender women, non-binary people, non-Western communities, LGBTQ+ people, and BIPOC communities. Reading Together, the overall project that created the Unreliable Guidelines report, addressed Wikipedia's information gaps by interrogating what constitutes a reliable source on English, French, and Spanish Wikipedias and how source authority is negotiated among editors involved in these communities. We'll pass it to Monica.

Hi everyone, thank you. The methodology that we had, as Kira mentioned, was intersectional. Our project had three parts: we did qualitative research of the guidelines, foregrounded around a series of community conversations about each language; we did interpretive analysis of the guidelines; and then we created the report with recommendations based on our findings. And I wanted to take a moment to describe a little bit what intersectional feminist research means and meant for us. Broadly, intersectional feminist research is committed to questions of power, including how differences are created and reproduced. And while united in struggle, feminist scholars and activists have taken diverging analytical lenses to this effort to understand and remedy oppressive structures of power. In our work, we built on feminist technoscience to acknowledge how, epistemologically, knowledge is situated and knowers have partial perspective. In talking about knowers, we're talking about those who are participating in the Wikimedia projects and creating the kind of community that we are a part of and researching. In other words, there's no such thing as purely objective knowledge, because there are no purely objective knowers; rather, there are practices of knowing.
And so we really were asking who is the knower and who is being known in these efforts to determine what a reliable source is, and how the guidelines are being created and interpreted. And we believe and argue that the circumstances and values through which legitimacy is confirmed, and the costs of this, are crucial to the effort to remediate asymmetrical structures of power.

The first part of our work, as you can see on the slide, is that we summarized what's happening in each language Wikipedia. This is about English Wikipedia here, as you see on the screen: according to the community's processes, it is an official guideline, which falls under the three core content policies. The page was created in 2005, the largest editorial changes happened between 2006 and 2009, and of the three pages about reliable sources, this is the oldest. The guidelines say that Wikipedia articles should be based on verifiable sources, and a source may be a book, an article, a piece of work, the creator of a work (which would be, say, an author), or the publisher of the work (which would be, say, a press). This definition of published is broad; I'll quote here from the English guideline: materials "that have been recorded and then broadcast, distributed, or archived by a reputable party" may also meet the "necessary criteria" to be considered a reliable source. What we noticed was that this idea of "necessary criteria" is never really specified. There is neither a definition of reliability nor of unreliability in the guideline; rather, editors are cautioned against using particular kinds of sources, such as commercial sources, predatory journals, and self-published sources. We also noted that while there are a number of cautionary notes in the guideline, there's no guidance there for how editors might evaluate sources and their reliability in relation to works published about or by marginalized communities.
In French, we had similar problems, but I would note that there was not one specific guideline; in fact, there was a series of recommendations that sometimes contradicted each other. The original information about reliable sources was separate from the translation of the English reliable sources page, but the page that's most accessed is the direct translation. However, this translation is literal and not necessarily cultural, and as a result it doesn't necessarily include the types of conversations that need to be happening in the Francophone communities. I would also say that the sources that are included, highlighted, and uplifted foreground a French that comes from France and do not necessarily take into consideration the larger context of the Francophonie. So with this French series of recommendations, what we see is too many contradictory statements and an overwhelming influence of Western French.

Okay. And regarding Spanish Wikipedia, there are many things I could say about the policy, but I will try to summarize the most important ones. The first is that, as in many other Wikipedias, this very important policy was started as a translation from the English version of Wikipedia, so many of the challenges Monica has identified for that language are also applicable to Spanish. The policy was created in 2008, and we believe editors did a great job in building these policies. But as a result of our analysis, we found that in recent years these policies haven't been updated, so they don't reflect current discussions around reliability or around the representation of marginalized communities in sources. Also, in the case of the Spanish Wikipedia policy around reliable sources, as opposed to English and French, there's a single page about reliability, and this page is very brief. So we found that there's a lot of margin for interpretation of what the policy says.
And the conflict with that is that, in practice, when users are writing articles about marginalized communities, these policies are used during the discussions against the inclusion of this kind of content. So we think that's a challenge.

We can continue with the next section, which is a conclusion of the work we did. Basically, we believe that Wikipedia's source authority is facilitated by social and technical processes that devolve decisions to a small number of self-selected editors, and we're going to talk more about that in the section about the findings. That's something Art and Feminism has worked on a lot over all these years: how can we increase the participation of other voices in these kinds of important discussions? We started this research with this guideline in particular, about reliable sources, but we think it's important to do the same kind of research about other problematic policies, like the policy around notability, for instance. So this is a first step on this path.

And I think we can go to the next slide to talk a bit about the findings of our research; we organized them into three. The first of them is lack of rigor. We want to be very careful when we say this: we want to emphasize that we're not criticizing the work that has been done just for the sake of criticizing, but we want to emphasize the challenges. None of the three policies we analyzed included discussions around what definition of reliability was being used in order to build these policies. And that's a bit contradictory, because one of the basic rules of Wikipedia is that all content must be referenced from somewhere, right? So we find a contradiction in the fact that these guidelines don't have that process. And I'd like to pass it to my colleagues to talk about the second finding. Yeah, thank you so much, Mel.
I'll echo what we were saying: our findings are definitely not a criticism of those who did the heavy lifting to work on these. But what we found is that the ways in which the community formed, and community policies and guidelines solidified, was around a concept of consensus. According to Wikipedia's own page on consensus, consensus is a "normal and invisible" process that "naturally" happens between editors. Editors participate in a conversation about an article or a work until they reach a resting point, after which silence is presumed to mean consensus. We found this really interesting, and I think it contributes to why the kind of rigor that Mel was describing is missing, and why updates or additions to the guidelines haven't been made. The way consensus is defined makes it very difficult for newcomers to participate. The idea that "if you disagree, the onus is on you to say so" has been, I think, very problematic, and silence as a signal of consent privileges more experienced editors, or editors who feel comfortable participating in dialogues on the platform. This is a critique we just wanted to lift up, and it's not necessarily specific to the guideline on reliable sources in English, French, and Spanish; it affects all of the policies and guidelines, however they're named. But it's something we definitely found worth highlighting and emphasizing. And then Amber will talk about the third finding.

So I want to highlight the essential work of trainers on Wikipedia. Our research showed that trainers in general have a crucial role in fielding questions and providing a buffer between the platform of Wikipedia and the interests of newcomers, especially those who are prepared to contribute subject-area-specific content and/or strengths in the editorial process.
And they're there to help people understand when and how to edit. Things that came up in our conversations, our community hours with them, and our town halls included that we've taught people to edit at specific times of day to avoid having their work instantly deleted. We've heard people say things like, you are going to be more successful in your editing if you do X, Y, and Z. So I want to highlight their work in helping to support newcomers, especially in marginalized communities, to get their work onto Wikipedia in ways that aren't necessarily visible. I'm going to pass it back to Monica to begin talking about the recommendations.

Okay, great. So we have several recommendations in the report and around what we see as next steps. We would like to see this work continue. This is an initial take: this is what's happening, and this is how we figured it out. To effect change, we'd like to have funding and resources in place to go on and redevelop the guidelines. This, we suggest, should be done with a task force of a broader range of stakeholders, and those stakeholders may include trainers, as Amber just mentioned, who already have their finger on the pulse of editing and editing groups; librarians, who are authorities in sourcing; and academic and community-based subject matter experts. What the task force can do is revitalize the guidelines in the following ways. The first would be, specifically for French and Spanish and other language Wikipedias, to decenter English Wikipedia and the ways in which that guideline has been transported into other spaces, and to devote attention to that. The second would be to improve the guideline by drawing on scholarship.
We mention it in the report a lot, and we kind of minimized it in our discussion today, but there are no citations in these guidelines and policies; they're not based on citations, they're based on consensus, which I discussed already. At the same time, there's been a robust scholarly discussion of the historical and cultural specificity of the concept of reliability, and this should be included, to be helpful for editors who are going in and wanting to understand how to evaluate sources. And then from that, we'd like to go in and offer guidance to editors on how to address scholarship, how to address sources where there may be content about or by marginalized communities, and also how to approach sources where there may be the presence of harmful or hateful speech, or ambiguity about fact-checking. All of these are ways in which the guidelines could be improved, and we definitely would like to see that happen. I can continue.

Okay, great. Actually, can we skip all the way to the last slide? I'll talk about that first. Go back, go back. Okay, so one big thing is we want to enable the visual editor across all of the wikis. We recognize that this is a huge technological challenge; it's not simply pressing a button. But limiting it to editing just article pages excludes people from participating in the discussion, and it is tied directly to the idea that silence equals consensus, because if you cannot participate in the conversation, you are necessarily silent. Having to learn code to participate in a discussion about a page that you created, and that is under discussion, is hugely problematic. So we need this to happen; it's crucial.

So I think that brings us to the last slide, which is the discussion point. We recognize that we have lots of recommendations and lots of ideas, and it's a huge juggling of priorities to figure out how to make this happen.
Of course, we've talked about the task force, and we were very encouraged in earlier conversations to know there's a lot of interest in this, and also a historical precedent for doing work in that way in the community. So we're hopeful that this is something that can happen. But part of what we wanted in this conversation is to hear from you and to hear what your ideas are for making this a reality. So thank you all for that. We have a lot of questions in the chat, and we still have one more section to get through, but I am going to go ahead and ask one of the questions now, and then we'll hopefully get to the rest of them when we come back at the end. And I did also see a question: is it okay to take screenshots of the session and share on Twitter, etc.? And yes, you do have the consent of all the presenters in this session to do so. The question that I'm going to uplift first says: first off, this report is wonderful; we product managers at the WMF have read it and are actively thinking about this report. In fact, we cited it in a session we hosted on Saturday. The question: the report notes that English, Spanish, and French Wikipedia do not include definitions for what a reliable source is, and is not. What, if any, impact does this lack of objective definitions have on new editors? Well, I can start, and then Monica can jump in. I think the first problem with that is that, as I mentioned during the presentation of the Spanish case, when there's no definition, it's the user who needs to interpret what's a reliable source, and that's resolved in practice. So there is this problem when a person has been marginalized historically and hasn't received much coverage in the literature in general: it's up to the user to decide what kind of sources, and how many sources, are considered reliable enough to build an article about this person.
So I think new users and experienced users need more guidance about how to work with sources in general on the wikis. I would also add that it impacts newcomers in sort of the opposite way, in that people with lots of experience often use it to promote their bias and to exclude certain types of articles and certain types of sources from the larger conversation. So a newcomer might come in and say: this is a very neutral thing that I'm putting in; it's a page about a well-known person; the sources are well known in my community. And then somebody with implicit or explicit motivated bias might accidentally take it out, but there are also people who might take it out because they don't want certain content included. So I think that it has a two-way impact on new editors. Let's actually do one more question in this section and then we'll move on to the third. So this next question, also coming from the chat, is: how do we handle the issue of these self-elevated editors? That's a great question. What we identify here is that those editors who have had more experience using this wiki, who are fluent in it, are going to feel more comfortable participating. This connects in a way to the previous question. For newcomers, who are not self-elevated editors per se, the barriers to entry are higher. Maybe they want to be self-elevated, but they're not. And you can look to Teahouse conversations to see all of the people who wish they were self-elevated in order to do certain kinds of things and aren't able to. The barrier to entry is really high. And so one of our recommendations is to lower the barrier to entry to participate in community conversations. How can you participate in a conversation around consensus if you are struggling to learn wiki code?
And so that's why we suggest enabling the visual editor: to make community conversations include all of us who are participants in this broader community and make all of our conversations count. I think more people participating will level the playing field, in a small way, with this question of, quote, self-elevated editors. And I don't know if we know exactly what that means; it's not a term that we use in our text. So I'm presuming a particular definition from this question, presuming it means people who have participated in the project for a long time. But I hope that helps clarify where our priorities lie. Great. Thank you again so much. We do have more questions in the chat that hopefully we'll get to after this next section, but just know that they're captured here on the etherpad. So thank you so much, Amber, Monica, and Mel. So for our third section, we are going to be talking about the anti-harassment working group. In November of 2019, we created the anti-harassment working group. We initiated a line of work to develop strategies and tools that help our community deal with and prevent online harassment that may occur as a result of their involvement with Art and Feminism projects. For this section, I'd love to welcome back Amber and Melissa, and also welcome Muhammad and Richard to the conversation. Hello everyone. Hi. So I guess I could start. Thanks. So basically, as you all know, the Wikipedia community is really diverse. People come from so many different backgrounds, and expectedly so, they have different views about everything. Their opinions about topics are different, and they even have different perspectives about the occurrence of events, and so on. And because of this diversity, conflicts are not so uncommon, and sometimes the wikis can become a very toxic place.
You know, we've seen instances of mild to serious forms of intimidation through emails or talk pages or even user pages, sometimes in edit summaries as well. Other forms of harassment that confront our communities include personal attacks and inappropriate behavior at in-person conferences, and wiki hounding, whereby one person is singled out, targeted, and followed around from place to place on the wikis, with the apparent aim of causing them distress. So for editors who are at the receiving end of this toxicity, we want to provide a safe space for them to be able to talk about their experiences. We listen to their stories, and where possible, we do our best to offer them some advice on next steps. Next slide. So our goals, in a nutshell, with the work that we do in the anti-harassment working group, are to come up with tools and strategies to help our communities, the people who edit with us, deal with the forms of harassment that they encounter as a result of their participation. We are also actively documenting these incidents of harassment and misbehavior as they come up, and this gives us the opportunity to learn from people's experiences and then share best practices that others can use. The Wikimedia Foundation and the Wikimedia communities have, over the years, put together a number of anti-harassment tools and resources, which is awesome. But navigating through all of these resources is challenging, very difficult on its own. So part of the work that we do is to guide editors through these resources and tools that are already in existence. And we know we are not doing this work alone; we are very grateful that we have strong allies. And so it's our goal to become a medium of communication between our communities and the other initiatives that are working to improve community safety and community health. Yeah, next slide.
So we put together a security toolkit which you may access from our website. It's available in English, but also in Spanish, French, and Portuguese. And I would hand it over to Richard. Hi, next slide. Yeah, so one of the important guidelines for creating something like the toolkit is what Art and Feminism can and can't do. Obviously, Art and Feminism is not the trust and safety team at the Wikimedia Foundation; it's not the arbitration committee of your local Wikipedia. It doesn't have these explicit levers of power. But because it doesn't, it has a better ability to give some more candid advice and guidance that perhaps some of those other bodies couldn't. It also helps give guidance about things like personal information on the Wikimedia projects. A very common thing, of course, at many editathons is we remind people: you probably shouldn't use your first name, your full name, as your username unless you really want to, or you should at least be aware of the issues involved with that. We provide some thought leadership around policies and harassment, and guide people. It's important to provide documentation and systematization of that knowledge. It's been said that documentation is a feminist principle, and I think it's very valuable to have these things on paper so they don't just exist in a self-selected group, as some of the other comments have noted. Also, to clarify conflict: conflict of interest is one kind, but conflict of course is somewhat inherent on Wikipedia, including conflicts of interest but other sorts of conflict as well, and dealing with it means understanding the context of it.
And there's the question of how to make harassment reports to the Wikimedia Foundation, so we can help guide people through that process and provide better documentation of the tools for making harassment reports. There's been a lot of discussion in all sorts of movements about how to make proper harassment reports, and make sure they're filed properly and that they get to the right people. And having some useful technical information is a tool in itself to help support better, safer behavior. And of course, all this is while acknowledging the flaws inherent in Wikipedia and Wikimedia. I think that by speaking a bit candidly about the flaws in Wikipedia and Wikimedia, we can advance the goals of Wikipedia and Wikimedia better, and put things in the appropriate context, and understand that things have to be taken with a little bit of a grain of salt and a wider understanding of the cultural problems that there are, and the potential cultural solutions, although they may not be immediate. I guess next slide. Oh, Amber. I can do that, Amber, if you prefer. Thank you, Richard. That was great. And last, we would like to talk about some other work we've been doing at the anti-harassment working group, which is advocacy for safer and healthier spaces on Wikimedia in general. So as you know, in May 2020, the Wikimedia Foundation Board of Trustees voted to ratify new trust and safety standards for Wikipedia and all Wikimedia projects. Actually, one of the difficulties we found when we were building the toolkit was that it was difficult to build similar material for Wikimedia Commons, for instance, because we realized that there weren't these kinds of specific, detailed policies around harassment on that project.
So we were really happy when all this process around the Universal Code of Conduct started, and we decided that we wanted to participate. But our community is very particular: it's formed, most of all, of people who are new to Wikipedia and who edit on a kind of ritual basis every year, every March. So they are not necessarily on Wikipedia all year round. So we were thinking: how can we get our community to participate in this process and be aware of it, while being mindful that they probably wouldn't have the time to go through all the pages and discussions, and that these pages also aren't very friendly at the moment? So we implemented a series of tools and video calls and meetings to allow our community to participate and be aware of these discussions. And I would like to go to the next slide, please. So we produced a number of documents, surveys, and social media campaigns around this topic. And we think it's been very valuable, because we have been able to understand our community's perception of safety on Wikimedia projects in general. For instance, we conducted a survey in July 2020 and we found out that people in 14 different language versions of Wikipedia had had bad experiences while editing as part of their Art+Feminism work. We documented all those experiences, along with suggestions about the Universal Code of Conduct and also about the enforcement of this Code of Conduct. And we wrote two formal public statements and shared them with the committee that is working on the Universal Code of Conduct. We are very happy to have been able to do this work and to see that many of the recommendations we've made as a community have been taken into consideration by the working group that is drafting the Universal Code of Conduct.
So we think this is an example of how we can enable the participation of new communities in Wikimedia decision-making processes, not necessarily on Wikimedia itself, but using a mix of on-Wikimedia and off-Wikimedia platforms to bring those experiences to the community as a whole. Everything is public on our website; you can find all the materials we have published there, and I think it's good to document this. And we're really happy to be part of the history of making Wikimedia a safe space for everyone. And I think that's our last slide. So thank you so much, Amber, Melissa, Muhammad, and Richard, for all your work, not only with this presentation today but all your work with the anti-harassment working group. We now have some time for questions and answers, so please go ahead and put those in the etherpad. I am going to navigate some of the ones that are on here already. So the first question that we have, I believe, is about the research report, so this is for Amber, Monica, and Melissa, and if we could get all the panelists back on here as well. I'll go ahead and stop sharing my screen. Perfect. We did have a couple of folks who had to drop; it's getting very late where some people on our panel are. But to start: how will you work with the community to do a rethink of an aspect of the basic, accepted, unquestioned structure, for example silence as consensus and the reconsideration of sources, so that this work is accepted as working with, and not for, the community? Amber or Melissa or Monica, I believe that is for you. I mean, I can start, although it's a really good question. How will you work with the community to do a rethink of an aspect of a basic, accepted, unquestioned structure, silence as consensus, and the reconsidering of sources? I think presenting what we've done and starting the conversation is a good way to go. In a previous presentation that we did, someone asked: now that you've identified all of these problems,
why don't you just go in and edit it, make the changes, and participate? And I think that's not the approach that would be useful. Also, all of us who worked on this consider ourselves members of the community; I've been editing Wikipedia since 2012. And so I think we need the slow conversation to unfold, to build support, to get buy-in from people, to be like: look, this is something that is an unquestioned kind of status quo, but we think it's problematic; are you on board, can we work on this? And on our to-do list as well is to have a Signpost article about this, to post it on Diff, the Wikimedia Foundation's blog, to really continue to make the work that we've done visible and build coalitions with other editors, other groups that are involved in Wikipedia, other user groups, to figure out how we can go forward with this. What I also want to emphasize is that we understand and define community as beyond those who participate in wiki-code, behind-the-scenes conversations. That creates tension, because I think not everyone who's in those spaces believes that this broader definition of community matters to what happens in the encyclopedia. So there is going to be tension there. But we believe that those who come to editathons, and who care about the mission and vision of Wikimedia, should matter for how things work behind the scenes. So I hope that the person who asked this question, and anyone who's here, will join us in continuing to take up what our report has shown and continue the work with us. Thank you so much for that, Monica. Melissa, Amber, did you have something to add to that? Just very quickly: one of the ideas we had during the town halls for Spanish Wikipedia was how to take this conversation to the Wikipedia pages. So we have thought about things like writing a wiki essay. We are currently working on that, and we hope that we can do this.
We have seen Wikipedia editors have their edits disrupted, or face more attacks, because of proposing new ideas. So that's why I agree with Monica that we want to be very mindful about how to do this process, because we have experienced harassment, we have seen people experience harassment because of this, and we don't want to replicate that, of course. But on Spanish Wikipedia at least, I have found many, many opinions that are interested in this work. And I think the Wikimedia Foundation could probably create new opportunities for broader discussion around the policies. Certainly, user groups like Art+Feminism are already doing this work, but the Wikimedia Foundation can take on some of this work too, and create spaces for people to discuss, not necessarily on the wikis themselves. Thank you, Melissa. I do want to highlight, because we can't participate directly in the chat: if you scroll up near the top of the etherpad, underneath the heading "presentation slides and links to materials shared," you'll see a PDF of the slide deck that you saw today. It includes all the links that you saw on the screen, and direct links to the Call to Action art commission, the report, and the security toolkit that we currently have available in four different languages, as well as a link to an after-session chat; when we close in eight minutes, some of us will be over there to continue any kind of conversation. So I do want to highlight that. So here's a question that I believe Richard can take. The question is: isn't the focus on code and technical capability hiding the general difficulty of editing Wikipedia if you can't rely on a local editor community? I guess the way I would approach that question, and I can't say I'm entirely sure of the exact meaning intended, is that I don't think that Wikipedia and Wikimedia are fundamentally a technological project.
With due respect to the many people who write those bots that save us every day, I think it's fundamentally a humanistic project. I mean, there's this technical barrier that exists to some extent, and we can have better editor interfaces and things like that, and those are positive. But that's not the real core of the issue. And I do think it's important that you can connect with your local community or your thematic community. I think the core of what Art and Feminism and related projects do is build a social infrastructure, a way to connect communities, whether it's locally, when people are able to meet in person at editathons, or thematically, to talk about a given topic. And I think that building that sense of community is stronger and more important than building the technical tools per se. And acculturation in that social community is every bit as important as acculturation in learning what these different coding things mean, which I don't really understand either, despite some beliefs to the contrary. So thanks. Yeah, I'm 100% social over technological. Great. So for our next question: Mohammed, I believe you could field this one; somebody asked for more clarification on what wiki hounding is. Sure. So basically, we have instances where a person could be singled out and sort of stalked on the wikis: following them on discussion pages, monitoring their edits, finding fault with whatever they are doing, just targeting this person and concentrating their effort on trying to make their lives miserable. Now, there are instances where admins, or even other editors, will want to track a person's edits, to take care of spam or for other administrative purposes; that is a different issue.
But in cases where you are deliberately just following this person all over the wikis and stalking them, that is tantamount to hounding them. That's what wiki hounding is. Thank you so much for that, Mohammed. I think we have time for probably one more question, and I'm just going to ask it on behalf of the panelists as we're closing out today. I want to thank you all for joining us today, both all the panelists as well as all the attendees, and for really engaging with this work: we do a lot of editathons, but we're also doing some other stuff, and we're really excited to be sharing this with you. I am wondering, and I'm sorry, I'm reading the chat at the same time, I'm just kind of wondering if anybody has any closing or summary thoughts, or questions for each other, hearing about each other's work within the Art and Feminism community. And I'll leave that for hopefully the last four minutes. I'm excited that when we can have regular editathons again, hopefully soon, and in all parts of the world, we can bring together some of these new ideas that were born during this remote period, and sort of re-energize the editathon with some of these different ideas about art and advocacy and helping people through harassment. I also just wanted to emphasize that this panel showcases the important and diverse work of Art and Feminism, which began as this bootstrapped, very in-person-focused activity. And, as Richard just emphasized, looking at what's been happening online during the coronavirus shows how sustainable the project is and the new directions it's unfolding into. So it's been really great to hear about all of the work that's happening, and I'm also looking forward to in-person events soon, but nice work in the meantime. Sarah, Amber, Melissa, Mohammed, any closing notes? That's fine if not; I have one more slide I can show.
I'll just echo Richard and Monica in saying that the social aspect of Art and Feminism, and how it brings multiple pathways of work together, is really inspiring to me, and I've just loved hearing about your work today, and updates that were even new to me. So thank you. Yeah, and hats off to the team. Kira, thank you so much, and to the co-facilitators Amber, Mel, and Mohammed: you've done really great work. It's been a joy to see you take the helm of Art and Feminism. Thank you. Thank you, Monica. Thank you, everyone, again for joining us. Up here on your screen you'll see the link to our website as well as our social media and our email. Again, that after-session chat will be taking place in building six, floor nine, table A. Hopefully some of us will be able to navigate there to continue the conversation for at least a little bit more. I do want to uplift, as I just posted, that one of our Art and Feminism organizers from the University of Nigeria is starting a session at the top of the hour in building five, talking about the Wikimedia community in Nigeria, so I definitely want to encourage you to check that out as well. And please keep in touch. I do want to reiterate that presently we're taking some actual steps to combat our own anti-blackness and identity-based oppression within our organization, which means that we'll be pausing some of our work to focus our energy on that. So we apologize in advance for any delay in correspondence. But once again, we just want to thank you, and also thank Truvon and Toya for helping us on the back end here today, and we hope that you all have a great rest of your Wikimania.