session on responsible community governance, equity, and access to mental health information on Wikimedia projects. Praveen, could you advance to the next slide? My name is Stella Ng and I am the senior manager of the Trust and Safety Policy team here at the Wikimedia Foundation. I'm currently located in San Francisco and my pronouns are she/her. My team primarily handles the work on the UCoC as well as the emergency workflow within the Foundation, and I'm very happy to be joined by quite a few panelists. I'm going to pass the mic over to Dr. Monica Horton, and then all of the other panelists can give their intros; let's just go in the order of the panelists that are there.

Hello, I'm Dr. Monica Horton. I am the policy manager for freedom of expression at the Open Rights Group here in the UK. I have some 15 years' experience researching online content policy, and I have worked with the Council of Europe on freedom of expression and human rights online. I'm currently based in London, and my focus is on the Online Safety Bill, a new law currently being put through the Westminster Parliament with the intention of tackling harmful content online.

Awesome. Passing the mic on to Dr. Hussein. Could you give a quick intro to yourself as well before we get started?

My name is Nada Hussein. I'm a medical doctor and I have been a Wikimedian for the last 12 years. On Wikipedia I usually write articles related to healthcare, and during the COVID-19 pandemic I wrote several articles related to COVID-19, particularly about COVID-19 misinformation. In 2020 I also launched the vaccine safety project, where I was involved in mapping and bridging the knowledge gaps related to vaccines and vaccine safety. Outside of Wikimedia I'm a medical doctor and radiologist working in Sweden, and I'm originally from India. Thank you. Awesome.
Tina, would you like to give a quick intro to yourself?

Sure. Hi everyone, thank you so much for joining us today. My name is Tina Boutoyou and I am a lawyer here at the Wikimedia Foundation. Most of my work focuses on harmful content. Recently I've also been working on the UK Online Safety Bill, and I also work on producing the biannual transparency report.

And, sorry to everyone who is viewing the slides, we are missing one panelist. Praveen, could you please give a quick intro to yourself as well?

Hi everyone, I'm Praveen Das. I'm a senior partnerships manager for the South Asia region and I'm based in Lucknow. Most of my partnerships focus is currently in this region, and mental health is a topic close to me, hence I'll be speaking today. Thank you so much.

All right, I just want to take a brief pause to let attendees know that you can submit questions at any time during this session, and we will try to get to as many questions as we can. I have the Etherpad open in another tab, so I will be looking at that throughout the presentation. But with that, let's advance to the next slide.

All right, let's talk legal definitions of harmful content and access to information. I'd love to hear from Dr. Horton about your experience, legally, with harmful content and what types of definitions you would say fall under that scope. You're muted, just as a quick FYI. Now I can hear you, please go ahead.

Yes, thank you very much, Stella, for that introduction. Although you might think it is easy to define harmful content, it is actually quite difficult to define it in a way that is legally watertight, and in the UK I think we are struggling with this a bit. We have a proposed new law called the Online Safety Bill, which introduces a concept of harm in relation to online content and asks online platforms and other internet services to remove, take down, or restrict that content.
So it's quite important how we define it, but when you ask what it means, the first thing you actually get is a long list, and the list may include things like eating disorders, anorexia, suicide, and online abuse. The harmful content is divided into two categories, and these are legal categories. In one context you're talking about illegal content: content that is against the law, content that reflects a criminal offence. Illegal content is defined by criminal offences in UK law, and there are some 28 criminal offences altogether that the law specifies. These 28 offences include harassment and stalking, financial fraud, terrorism, child sexual abuse, assisting suicide, and also assisting illegal immigration. So you can see quite a wide variety of things, but not a lot of definition.

The other type of harmful content is often referred to as 'legal but harmful' content. This is content which is not covered under criminal law but is nevertheless considered harmful, and where the government considers it desirable that the online platforms remove this content, restrict it, or restrict the users' accounts in some way. In this context you get eating disorders, online abuse, anorexia, and suicide content that falls short of the criminal offence. But nowhere do we actually see what this content looks like in a Facebook post or a Twitter post or an Instagram post. There are no guidelines on how the platforms should actually identify this content. The law should tell the platforms what exactly they are supposed to be taking down, but it doesn't, and similarly the users need to know what it is they can and cannot say, but in fact they don't know that. There's not a lot more definition than what I've given you here.
For example, if you look at the illegal content for assisting suicide, you get told that it is Section 2 of the Suicide Act 1961, which is about assisting or encouraging suicide, and that's it. But what does that post look like? What's supposed to be taken down? What is it that you're allowed to say? We still don't really know.

One of the big problems that comes up is linked content, because as you know, users on social media generally like to link to content. They might link to Wikipedia and say, here's something that you ought to know, in this post. But the question really is how these links are treated: for example, are they taking down content put up by the actual platform, or are they taking down content put up by a user of that platform, and therefore what does that mean? What you tend to find is that when platforms decide to take something down, they don't just take it down in one place; they take it down everywhere they can possibly find it, using AI to identify it. This raises real concerns for the sites that are being linked to a lot, so any sites that rely on links for people to find them, which might be Wikipedia, or it might be your source material in Wikipedia. How do those websites and source sites know that their content is being removed or demoted or restricted in some way, and what can we do about it? At the moment the law actually says very little about that.

Interesting. I want to follow up on a point that you just made, because you're bringing up a lot of examples in which the definition isn't very clear or is slightly amorphous. Based on your experience working on this, how do legal definitions align or conflict with the right to freedom of expression globally, given it is a bit ambiguous?
In this case it is deeply problematic. I will take the right to freedom of expression under the European Convention on Human Rights, which is the one I'm most familiar with. Under the European Convention you have a two-way right: you have a right to express yourself, but you also have a right to access information. Article 10 is the right to freedom of expression, and Article 10(2), the second paragraph, talks about how states should go about restricting content if they are going to do so. It is actually very clear: it says the law must be clear and precise as to what it is you are restricting. The law should define restrictions clearly, and courts have sometimes required them to be defined as precisely as a specific URL. So when you get this very broad stuff, like a criminal offence where we don't know what it actually looks like in terms of content, that becomes problematic, because how is the platform supposed to know what that offence is?

I have an example from another area, which is terrorism content. Our independent reviewer of terrorism legislation here in the UK said: look, you've got somebody giving training in rifle shooting online. Now, that could be somebody training someone with a view to them undertaking a terrorist activity, but it could also be something completely innocent; it could just be rifle shooting at a sports club, where it's legal and, you know, it's okay. So how is the platform supposed to distinguish between those two things? That is where we end up with the problem: human rights law wants it to be clear and precise, and this law, the Online Safety Bill, is very, very far from being clear and precise.

Interesting. I'd love to hear your thoughts on this, Tina. Based on these definitions, how does it conflict with emerging research on what is harmful to readers?

Yeah, so it's tricky, because in the
Online Safety Bill, harm includes both physical and psychological harm amounting to at least serious distress. But we know from emerging research that what qualifies as harm differs across cultures and languages as well as lived experiences, and how people in different countries and across cultures rate the severity of certain content also differs. For example, in the U.S. we know pro-anorexia content is problematic, and we've seen how Instagram misused its algorithms from the Frances Haugen revelations, but Americans perceive pro-anorexia content as less severe than other types of content. That's where it gets complicated: in other countries, I think in Southeast Asia, suicide-related content is rated higher on severity than in the U.S. So one of the issues with online safety legislation globally is that regulators with a particular bias and perspective are charged with enforcing these laws and may have blind spots with respect to the types of harms recognized and experienced by folks around the world.

Interesting. I want to drill a bit more into that. What principles does the Foundation take into consideration related to harmful content?

So, as you all know, the Foundation is based in the U.S., so U.S. law always applies, and we also take into account the laws of California and Florida, since we're headquartered in California.

Sorry, Tina, if we could slow down a bit; it looks like we've gotten a comment in chat that it's a bit fast at the moment.

Oh, absolutely, thank you so much, Natasha. So, the way we evaluate harmful content, or whether an international law applies to the Wikimedia Foundation since we are a U.S.-based organization, is by applying three factors. One, we evaluate the law at issue and see if it applies to the specific case and whether geographic jurisdiction also applies.
Next, we evaluate whether the case presents a risk to the Foundation or the movement. These risks include, but are not limited to, risks to editor safety, risks of project blocking or similar technical disruption, and also monetary risks. Then, finally and perhaps most importantly, we conduct a human rights analysis: we evaluate whether a law conflicts with existing human rights norms, in accordance with our human rights policy.

Interesting. How effective are our community moderation processes? Just a brief pause: Natasha, am I going too fast for you as well? Would you like for me to slow down?

No, it's not you, it's Tina, because what she says is very interesting to me as a Wikimedia editor, to know how the Foundation evaluates harmful content, and she starts slowly but then it goes very, very quickly. I'm not used to the American accent; I'm used to the English accent, so I have problems understanding Americans when they speak if it goes too quickly. I'm really sorry to say this to you, and probably there are other people from other countries who would not be as bold and daring as me to say it goes too fast.

Well, thank you so much for that feedback. This is why it's important that we have feedback from community members from around the world, because sometimes we're not aware of what we do automatically. My apologies.

No worries, I really appreciate that, and it actually highlights why we need to have an ongoing conversation about harmful content: sometimes we edit or do things in ways that make sense to us, without realizing that others around the world may experience the way we speak or what we say differently.
So I'm actually really glad that you brought this up, Natasha. The main point of how we evaluate harmful content: one, we evaluate the law. Everything is done on a case-by-case basis, so we look at a complaint and then we examine the law itself, often with the help of local expert outside counsel, and we see if the law applies but also if we're subject to a country's jurisdiction. Next, we look at what risks a complaint or an issue presents to the Foundation: do we have folks on the ground, do we have assets? And third, we're always committed to human rights, and that's why we conduct a human rights analysis; if we identify that a law conflicts with human rights, then we will take that into account in deciding whether we comply or not. Again, we make these evaluations on a case-by-case basis, so it's hard to predict when we will take content down, but I think the most important part is to realize that we do take human rights into account. As for online safety legislation, that is something our public policy and global advocacy team very much works on with regulators and legislators.

All right, we've gotten a question in from the chat: could you please drill down or explain a bit more about the human rights analysis?

Yes, so our human rights policy came out last year, in 2021. Article 19 of the International Covenant on Civil and Political Rights says individuals have the right to access information and to share and impart information. We also recognize the right to privacy, which is essential to the right to freedom of expression. If you see our transparency report, that's where we go into more depth; thank you so much, Rebecca, for sharing the link to our policy.
We take into account how community members' and the Foundation's human rights will be affected when evaluating whether to comply with a law. When it comes to privacy, we collect very little data, but in some countries where there's heavy surveillance, we do our best to protect the rights of our users there.

Thank you so much, Tina. Just noting to all of the attendees: I am keeping track of the questions being asked on the Etherpad. We will be going over those at the end of the session, and if we do not have the time to reach your question, we will try to answer it asynchronously. Praveen, next slide please.

All right, with this I'd like to drill down deeper into Wikimedia projects as a global resource for mental health information. Dr. Hussein, I'd love for you to get started on this. Based on your personal and professional experiences, how would you describe mental health and suicide and how they're perceived in different countries?

Yeah, from my experience I could definitely say that how suicide is perceived depends on your background, your culture, and where you come from. We have some previous research showing that, when it comes to gender, in countries where there is a strong patriarchal influence, women are more likely to attempt to end their life and to experience suicidal ideation. How you perceive suicide also depends on which culture you come from. There are cultures where you are overtly stigmatized just for having suicidal ideation, and there are also some subcultures where suicide is glorified when it is seen as being for a greater good, perhaps for spreading an ideology, or ending your own life as part of a holy war, and so forth. There are certain religions where it is absolutely forbidden to take your own life, and among people with lower socioeconomic status, we know that social and economic challenges often lead them to think about ending their own life.
So how suicide is perceived is a multifaceted and very complex issue, and depending on where you come from and your background, there is a lot of difference in the way you perceive it. When it comes to Wikimedia, you have just one article about each aspect of suicide. Say you take the article about suicide on English Wikipedia: it is probably written in a way that is suitable for a Western audience, and I would like for the article to be more diverse when it comes to addressing the challenges of people from all backgrounds and cultures.

I want to drill down on something you mentioned about stigmatization, one of the first points you brought up. Research, specifically from the WHO, the World Health Organization, has stated that addressing stigma and raising awareness is important to preventing suicide. They've also stated that improving community and online environments can help improve child and adolescent mental health. Do you have any thoughts about that?

Definitely. I think there is so much that Wikipedia, and the whole Wikimedia movement, can do in terms of raising awareness related to suicide. We are the largest encyclopedia in the world, we have the largest health-related encyclopedia content, and we are one of the most visited sources of healthcare-related information on the internet. So this gives us some responsibility when it comes to providing content to our readers, because anything that we write out there could be shared and amplified in different ways. So we have to be very careful about what we are presenting in the articles we show to our readers. I would like more expert involvement in this area, so that Wikimedia content is more reliable and up to date.
I would like for the knowledge gaps on Wikipedia to be filled, so that when people look for information, they don't fail to find it on Wikipedia and then go further into other websites that provide misinformation. I would also like for Wikipedia articles not to be overly academic. As Wikimedians we are very interested in writing everything in a very academic way, but when it comes to suicide, I think we also have to think about writing in a person-centered way, so that we take into account the emotional challenges gone through by people who have attempted to take their own life. We also want our articles to be written in a holistic way: not to focus overly on how a person tried to end their own life, or on the graphic details of how a certain person ended their life, and so forth. So when we write an article about, say, a person who ended their life, we have to be very careful to describe it in a person-centered way, so that it's possible for the readers to understand the facts, but without focusing on it in a way that prompts them to actually take their own life as well. These are some of the thoughts that I have about this.

Thank you so much for providing your thoughts. Going into a bit more detail, I want to give Praveen a moment to speak. Praveen, if you'd like to weigh in: what role do you think Wikimedia projects more broadly have in addressing stigma and raising awareness about mental health and suicide?

Okay, I think, you know, when it comes to Wikipedia, as Dr. Hussein mentioned, it is one of the most frequently visited resources for health information on the internet, and it's a global source for seeking mental health information. But then again, when it comes to suicide and self-harm, these are very complex issues caused by mixed factors.
So as part of the projects, and as part of a responsible community, I think we should be more sympathetic and also very cautious while writing anything related to suicide and suicide prevention, and build more repositories on mental health on Wikimedia projects in general, be it Wikimedia Commons or Wikipedia. Repositories of text and images are going to help build more resources for people seeking help who may be contemplating suicide.

Thank you so much. What do you think are the challenges in doing or implementing this?

I think there are a wide variety of challenges. When it comes to mental health, these are very sensitive topics, and we may need more editors from that background who understand mental health very well, or we can organize workshops to encourage participation and to learn more about mental health, so that the communities themselves can write really well. Another challenge, in general, is the lack of research in third-world countries. When it comes to developed countries, there are more initiatives around mental health, but in less developed countries there are not very many active non-profits or government actions addressing self-harm, so content specific to those countries is missing. And we need that information because the reasons for self-harm in different countries are different; we need to understand those reasons and then identify the resources needed to convey them, and for that, detailed research is required. Hence, the bigger challenge is the lack of individuals, the lack of experts writing the content, and the second one currently is the lack of resources too. Thank you so much.
I'm seeing that there are quite a few questions bubbling up in the chat, and I saw that there was one regarding guidelines. Tina, would you like to weigh in here? You are muted.

Hi, could you repeat the question? We have several.

I'm looking at the question from Natasha, from 4:09 AM my time: would there be guidelines somewhere on how to write about suicide? And just a note to everyone on the Etherpad and also on this channel: I am documenting all the notes, and we are trying to prioritize what intersects.

Yes, so we actually do have notes and guidelines on how to write about suicide, and we will share those with folks who are interested. My email is on the event page; I'll drop it here, but yeah, just send me an email and I will send the guidelines to folks who want them.

All right, thank you, Tina. Before we move to the next slide, I want to give you a moment, Praveen, to chat about the SPIF Foundation's Paint Your Blues campaign and its impact on raising awareness about mental health and suicide in India and around the world.

Sure. So Paint Your Blues was a campaign started by the Suicide Prevention India Foundation, an India-based nonprofit working to build more awareness related to mental health and to encourage people to seek help. One of the issues which we have seen is that people don't seek help when it comes to mental health; there's a big stigma around that, and especially after COVID-19, the issue of stress and anxiety is high in society. So we identified this as one of the opportunities where we could work with an organization that is actively working on the ground, in creating content and identifying the gaps in general.
So, as a result, we thought it could be a campaign where we could gather images, generating images through different artists by working with them, and we know that a single image can convey a thousand words and has the power to increase awareness, spur people to action, and change opinions. When it comes to mental health, both the imagery and the content are lacking in the wiki world in general. Images have a large influence on attitude formation and the perception of views, but most images currently circulating on the internet promote negative stereotypes, reinforce stigma, discourage mental health conversations, and hinder help-seeking. So Art for Good was a campaign that started in December 2021 with an ambitious aim: to build the world's largest repository of mental health imagery. We built these images with the help of artists; more than 50 artists across India participated and created more than 100 images. The art currently shown in this presentation is the outcome of the Art for Good campaign. These were created so that they can be freely downloaded and distributed, and the images seek to leverage the popularity of memes, cartoons, illustrations, infographics, and the enduring power of photos to support the text-based narratives on Wikipedia. The media industry in India is highly encouraged to use these artworks, rather than graphic pictures, to support mental health.

Thank you so much, Praveen. Before moving to the next slide, I would like to ask a question that bubbled up in the chat: what about establishing a free psychological helpline for all Wikimedians, like Wikimedia France does? This helped a lot during lockdown, especially for underrepresented minorities. Tina, would you like to weigh in?
Sure, so it's kind of tricky, because we have readers around the world, so when we try to help our community, we try to help our global community. Stella, maybe you can speak to the Trust and Safety resources and the peer support work that's happening around the world. I'm not personally familiar with the Wikimedia France example, but we're definitely interested to hear more, so again, I'm happy to drop my email if someone would like to chat more about this.

All right, Natasha, would you like to weigh in with a few words about Wikimedia France before I give some details on our emergency workflow?

Yes. Because there were quite a lot of problems for underrepresented communities, and a lot of LGBT persons on the francophone Wikipedia, Wikimedia France established psychological support, first as a test for four months, I think, with a psychologist who knew about LGBT questions in general. That was used quite a lot, and now it has established a 24-hour, round-the-clock service where any person contributing to the francophone community, or any member of Wikimedia France, can call, make an appointment, and get psychological support. So I'm not speaking about readers or readership there, but about people who contribute. If we want a better representation of minorities and underrepresented communities throughout the world, we have to bear in mind that these people are often the subject of microaggressions, which might not be labeled as cyber-harassment or harassment, and might fall beneath the red line, but can be very, very wearing in the long run, which is why having a service addressing these issues is really helpful, I think.
I can weigh in a bit about how we currently provide resources for people who are going through mental health crises or episodes in general. Right now, Trust and Safety maintains a 24-hour emergency protocol, in which readers and contributors of Wikipedia are encouraged to write in if they observe behavior that could be evidence of real-time harm or physical harm offline, as well as threats in general. A lot of what we receive in that workflow does have to do with suicide or self-harm. When we receive an email in which we believe a user is in real and present danger or may be self-harming, we will work to interpret what the harm is and then, based on research, send it to the appropriate law enforcement agency within their region. I'm glad to hear the community is working on this. We also know that some language versions of Wikipedia have added suicide hotlines, and that is really great to hear. With that said, let's move into the next slide, Praveen.

All right, let's talk about health misinformation on Wikimedia projects. I'd love to start off with Dr.
Horton and your experience working on health misinformation as well as disinformation, and the differences. First off, as I mentioned just now, the words misinformation and disinformation are often used interchangeably. Could you briefly explain to our audience what the difference between those two things is?

Yes, thank you very much, Stella. If we're talking about misinformation, you're talking about false information that isn't necessarily deliberate: it could be someone who has made an error, who doesn't understand that what they said is actually wrong, and has unintentionally said something wrong. If you're talking about disinformation, it's a much more deliberate thing, and it's also a little bit more sinister. Disinformation does not have to be necessarily false, but it seeks to divide and confuse; it will insert into a series of what may be rational information something that is deliberately there to confuse, which may be false or may simply be a wrong direction. Certainly that's how I would differentiate between the two; I don't know if that's what you were expecting.

No, that was great. Based on my limited experience with misinformation and disinformation, that tracks with the experience that I have. The other thing that I've heard is that it's used to deliberately elicit a strong emotion or a call to action for information that might be intentionally or unintentionally false. So it could be that you're sharing it and you don't intentionally mean for it to be fake, but it could also be that you're sharing it because you know that the information is not rooted in reality. Going deeper into that, what are some effective ways to address this problem, either misinformation or disinformation?

Okay, if I can say a little bit more, then, I'll talk more about the disinformation. We're talking about somebody who is deliberately trying to, as you say, either elicit a
reaction in the way that you've described, or somebody who is deliberately trying to slip in an untruth: to get somebody confused, to get someone to believe something that may actually be false, but they want them to believe it is true. The point being that disinformation confuses and divides people; disinformation tends to be part of what we call culture wars, so it's trying to find an enemy out there, maybe a fake enemy, but somebody that people can see as an enemy, and therefore get them to take on a viewpoint that they wouldn't otherwise take. And when people are confused and divided, they start to doubt the information that they get, so even when the truth is staring them in the face, they start to doubt it, they don't believe it, and they may act accordingly. This is certainly the experience that we've had here in the UK, and I will just briefly talk about COVID here, where we have seen this in the context of political organization. We have seen the same accounts who will tweet or post COVID disinformation and who will also be posting on political issues here in the UK; they tend to be the pro-Brexit accounts, and they have also been linked to climate change denial accounts. So what we have noted is some kind of organized disinformation here, in this case around COVID and health. This is potentially different from what you sometimes hear, for example that people have posted that drinking bleach will cure COVID, which is clearly false; that is not what I'm really talking about here. I'm talking about a deliberate attempt to subvert people's thoughts, and it looks like it could be coming from an organized direction. I just want to highlight that.

I think one of the most effective ways to address it is to challenge it, to call it out, and to repeat the truth, and that is quite difficult
because it doesn't always appear in a straightforward way; it appears randomly in people's timelines. So how do you do that? Maybe by responding to those posts when you see them, challenging them, calling out the falsehoods and the lies and the distortions in that information, and informing people with the truth. In terms of a policy response, it's very tricky, because you then get back into the situation of how the law decides how it's going to deal with this stuff, and I think the law would actually have to deal with the organized side of it, that's a personal view, rather than the content as it appears online.

Okay, I would love to hear a bit about the intersection of this from Dr Hussain. Given your training as a physician and neuroscientist, could you describe your work on COVID-19 misinformation and how the community came together to address it, given that the scientific community knew so little about it initially? I'd love to hear about that from your specific lens.
So COVID-19 was a healthcare emergency, and it was a new disease that the scientific community knew too little about. There were no textbooks, there were no past guidelines that we could use to tackle the pandemic, so the situation was kind of chaotic. People could see that their loved ones had the disease, and they didn't have enough information about it, even from the institutions. The World Health Organization, the CDC and different governments tried to put out as much information as possible, but there was so much unknown about the pandemic, so this created a kind of panic among people. People wanted to get whatever information they could about the pandemic, and the first thing regular people with internet access do is go to the internet and check what this is about, and sooner or later they would land on Wikipedia, because we are one of the largest providers of healthcare information on the internet. On Wikipedia they could see the current status of how scientific information was evolving around COVID-19, which I think was a great resource, because due to the panic happening all over the world, people were really in need of information. We had a bunch of really very good experts, plus a lot of other editors and experienced Wikimedians, coming together and writing articles and updating articles related to the COVID-19 pandemic. At points, when we looked at the view statistics of certain COVID-19 articles, it was a record number of views that these articles got, not just in English but also in several regional languages. We have articles related to COVID-19 in more than 200 languages, and these articles are still being developed by our editors. I think it really showed how powerful our communities are when it comes to coming together and building something very quickly by using content from all over
the world in multiple languages, and this is something which even large institutions cannot accomplish; it's only something that a crowdsourced enterprise like Wikipedia can do, and I'm really proud of the work that the whole community has done when it comes to COVID-19. When it comes to my own work, I mostly focused on the knowledge gaps related to COVID-19. The larger articles related to COVID-19 were taken care of, but when you talk about different socioeconomic aspects of COVID-19, such as mental health in COVID-19, there was no specific article for that, so I just went in and created that article and worked on it, giving country-related specifics on how mental health was being affected by COVID-19 in India, for example, and dividing it in terms of how it happens in different countries. I also wrote about stigma related to COVID-19. So I wrote all these satellite articles, which were not given so much attention at that moment, because it was the main COVID-19 related articles that were being edited by a very large number of people. When it came to misinformation, I have a story to share. One day I was looking through the newspaper and I saw that there was an Indian family which ate a poisonous fruit thinking that it would prevent them from getting COVID-19, and they got this information from social media. I knew that there was a lot of misinformation circulating, because at that time I was practicing as a doctor in a primary care setup and I would see a lot of patients, and all of them would have their own ideas of how COVID-19 was transmitted, like whether eating garlic would help prevent COVID-19. People had a lot of assumptions about how COVID-19 was spreading and the treatment methods used for it. So then I understood that misinformation is a real thing and we needed to do something about it, and that was how I went into writing misinformation-related articles about COVID-19. And now we
do have some really good content related to misinformation, but I think that what we have is only the tip of the iceberg, because there is only so much that we can find of the misinformation surfacing on the internet; there is much more that is being shared privately on social media. Right now we only have misinformation-related articles in a handful of languages; we really need more of those articles in the many different languages that Wikipedia has. Similarly, this was also the case with vaccination information. When governments started rolling out COVID-19 vaccines and started vaccinating people, people were just so confused as to whether to take the vaccine or not, or whether they should take the booster doses, or which commercial product they had to use, since there were different companies offering different kinds of vaccines. At that time on Wikipedia it was very important to make sure that the vaccine-related articles were updated, and also to map the existing knowledge gaps related to vaccines and vaccine safety, and to show the current scientific information on the safety and efficacy of vaccines. So this was also something that I worked on. Coming back to your question about how my background helped in this venture: yes, I'm trained as a medical doctor and I have a PhD in neuroscience, and I also had a lot of experience writing similar Wikipedia articles before, so when all this experience came together, it felt very natural for me to go ahead and create all these articles, so as to help educate people more about the pandemic as well as the vaccines at that time. On other days I usually write about whatever interests me. I have also done work related to the gender gap, improving the content of biographies of women on Wikipedia, but when an emergency came I just hopped in and did what was needed at that time, and I think this
is what most Wikimedians do. When they see that there is something happening, that there is a knowledge gap that needs to be fixed, they just come in, they coordinate with each other, they create wiki projects and they work together and build something awesome that people cannot do individually by themselves, and I'm really proud of the power of collaboration that we have in the Wikimedia movement.

All right, one last question before we move to the next topic. You touched on this quite a bit in your previous answer, about proactively creating content where it didn't exist in order to battle the misinformation and disinformation that was prevalent across the entire web during the onset of the pandemic, but I wanted to drill in a bit more there. As a Wikipedian, how would you describe misinformation on the projects with respect to health information and other important issues such as the gender gap? So taking a step back from COVID and thinking about other topics, what is your take on that?

We do have disinformation and misinformation on Wikimedia. I think when it comes to main articles related to, say, COVID-19 or stroke or Alzheimer's disease, there are a lot of people editing and a lot of people watching those articles, so it's less likely that misinformation comes in and sticks there. But when it comes to articles that are not so popular, not read by a lot of people, there can be misinformation, and it could stick around, because not many experts are reading that article very often and not many editors are updating it. The disinformation that we see on Wikipedia can be propaganda. There are people who want to show that, say, vaccines are causing autism, and they want to push their propaganda, and they are using Wikipedia, using unreliable references and trying to insert that into vaccine-related articles to show that vaccines are potentially harmful. And
there is misinformation where people give undue weight to actual facts. Some vaccines contain traces of aluminum, and people would give undue weight to that fact just to suggest that vaccines are probably not that safe, and so forth. There are also some people who are just playfully editing; they just want to edit and delete parts of articles, they want to put in their own names, they want to vandalize content, so there is this kind of misinformation happening as well. On the other hand, there are also very innocent people who really want to give information to the world, but they have false information in their arsenal. They really do believe that vaccines are lethal, and they sincerely want to spread that information to the rest of the world, and they use Wikipedia for that. Regardless of the intention, if there is misinformation on Wikipedia, we are being peer reviewed every day. There are people reading, and everybody can edit Wikipedia, and if you see that there is misinformation out there, you can actually go in and challenge that information on the talk page, or you can remove that information right away. So we do have really very good policies on Wikipedia to tackle misinformation, but when it comes to articles that don't get so many eyeballs, articles which are not read that often, there could be misinformation creeping in, and I think this is where many volunteer editors have to focus. We could watchlist some of the articles, so that if any new changes happen on vaccine-related or COVID-19-related articles, we could revert the changes made by a particular editor. We could also track the behavior of editors who have been regularly vandalizing articles. We could use technology to find out which of the edits done by a certain editor are harmful and which of the edits
are good. We could have groups in communities where we discuss misinformation policies and how we effectively tackle the misinformation problem, and we could also have repositories of reliable sources where we could cross-check and see that this is a reliable source and this is not. So there is so much work that should be done on Wikipedia in order to tackle the misinformation pandemic, but when you compare Wikipedia with the rest of the internet, say social media, we are far ahead. Since we started in 2001 we have been facing the misinformation problem, and the rest of the world identified this problem much later. We were tackling disinformation as early as 2001, when the World Trade Center collapse happened, and the rest of the world was actually following Wikipedia's example of finding reliable sources and tackling vandalism. So we have done a good job, but I think we have to keep evolving and use technology and our volunteer workforce to better tackle misinformation.

Thank you so much for your thoughts, very appreciated, and very good to hear that it looks like we've been at the forefront of tackling misinformation for quite a long time. Let's advance to the next slide. I'd love to hear about best practices for writing health information versus Wikimedia policies. Praveen, I would love to hear from you first. Could you tell us about the role internet resources like Wikimedia projects have in preventing individuals from dying by suicide? Yes, I can hear you. So yeah, I just wanted to hold on one... So sorry, it seems like we're having a bit of audio issues, Praveen; I'm unfortunately unable to hear you. I'm going to give it a brief pause. We all know the internet is quite a beast to tackle, especially in these online spaces. Praveen, we will come back to you. I'd like to pivot over to Dr Hussain to talk about techniques or lessons to address COVID-19 and gender misinformation and how that
applies to the overall picture of mental health information on the Wikimedia projects. Could you give some thoughts there? I think there is a lot that we can do on Wikipedia in terms of writing in a person-centered way, in a way that is not overly academic, in a way that resonates with the reader. I have some experience in doing this from my work related to the gender gap. The first step here is to identify the writing style that we are adopting to write suicide-related articles. When it came to the gender gap, there was a lot of research that showed how we portray men and women differently: how we usually give undue weight to women's marriage and family, whereas when it comes to men we mostly write about their career and their achievements. When it came to women, we tended to generalize more, whereas when we wrote about men, we did not assume things about them; we just gave them the benefit of the doubt. So there were many differences in the way we wrote articles, which was evident in the research that came afterwards, and in light of this research we created guidelines about how to write about women. Now, after 10 to 15 years of doing this kind of work, it just feels plain wrong to write differently about women; it has been inculcated into our culture, and we are more aware of how not to discriminate when we write articles, even though that bias came in unconsciously earlier on, so now we are consciously trying not to write differently about women. I think the same approach can be taken when it comes to writing suicide-related articles. There are very many organizations that provide guidelines about how to write about suicide in a person-centered way, and we could incorporate all these guidelines into Wikimedia when we write suicide-related content. Then when we write
biographies, it would be important to focus more on the person. Say we are writing about a person who ended their life: we should not give undue weight to that particular incident, and we should focus more on the person. We should also not sensationalize suicide-related content, but just give plain descriptions of what has happened. We should not provide overly graphic descriptions of how a person ended their life; we should just state the plain truth in very simple words. I think we should also remember that Wikimedia is not there to be a memorial site, just to show that this person died by suicide; we have to focus more on their accomplishments, on what else they have done in their life, and so forth. So there is a lot that we can do in terms of creating guidelines for writing content related to suicide, and right now we already have a lot of policies in place. We have the policy of putting only verifiable content on Wikimedia, so we can avoid taking all these articles from news channels which sensationalize suicide. If we only use verifiable content on Wikimedia, I think a lot of the problem is solved that way, because verifiable sources are usually very particular about using person-centered language. So the policies that we already have on Wikipedia are really very good; we just have to build on those policies to create guidelines for writing about suicide on Wikipedia. And I think we need more interest from the volunteer community to participate in creating these guidelines. We already have a draft and framework of these guidelines from the Wikimedia Foundation, but we need more participation from the volunteer side to build this up. When it came to the gender gap, the field I worked in before, there was a lot of volunteer participation in the end, and there are still a lot of edit-a-thons, a lot of discussions, and a lot of research happening in this area,
and I want the same to happen when it comes to suicide and mental health related content.

Thank you so much. Before we move to the next topic, Tina, would you like to weigh in on the question that was originally for Praveen? Could you tell us about the role internet resources like Wikimedia projects have in preventing individuals from dying by suicide?

Yes. So as Dr. Hussain mentioned, Wikipedia is one of the most visited health information sites online, and research shows that content moderation policies (I'm going to speak a little bit more broadly now) should be developed with local perspectives and in local terms, because people have different understandings of harm based on their lived experience. The wonderful thing about Wikipedia, and Wikimedia projects generally, is that they're available in 300 languages. We have people from around the world with different lived experiences editing, challenging different points of view, and really making sure that articles are supported by reliable sources. So I think, as research changes, having a community that's committed to accuracy and neutrality, and to sharing what they experience as harm, is critical to addressing mental health around the world. In light of COVID-19 there has been a renewed interest in mental health, there has been more research, and now is really an opportunity for the community to do what it does best: follow the sources and challenge whatever notions of harm legislators or platforms, and possibly even researchers themselves, hold. Again, as Praveen was mentioning earlier, we don't have as much research from the global south, so there is value in challenging some of those notions.
Thank you so much, Tina. Let's advance over to the next slide. All right, I'd like to open the floor to all the panelists: what do you think is the role of the projects with respect to mental health information? Let's start with you, Dr Horton. Unfortunately we're having audio... okay, I can hear you now, perfect. No, sorry, sorry. Yeah, okay. The role of the projects with regard to mental health information: you certainly do have a very important role in bringing clarity, which sounds a bit like legalistic jargon, but listening to Dr Hussain just now, the way that you use, for example, verifiable sources, and the amount of detail that you go into here, I think the way that you can challenge the disinformation and the misinformation out there in this area is by providing the truth, and by providing corroborated information that people can also find. So I think that may well be, you know, the most important role.

Thank you so much, Dr Horton. Dr Hussain, what about you, any final thoughts? I would just amplify three points that I have been making in the previous conversations. First, we need more research for mapping knowledge gaps related to mental health. We do have a lot of articles, they are scattered everywhere, and we need to specifically map what defects we have in each of these articles and try to fix them. Secondly, we need solid guidelines for suicide-related content and mental health related content as well. We do have very good policies in place, but we don't really have guidelines about how to approach mental health and suicide-related articles, so, with the participation of the volunteers, we need to provide some solid guidelines, just as we have done in our gender gap related projects. We really need the community to take over this issue and create some guidelines for editors to think about when they write about mental health
and suicide related content. Thirdly, we need more volunteer participation in this area, including expert participation. Most of our volunteers might not be experts in mental health, but they are very interested in that area and they write articles, and we also have a small group of experts who contribute to these articles. We need more expert participation so that these articles are more structured, more understandable to a general audience, and presented with diverse, enriched content. We also need volunteers in different languages to write articles related to mental health in our 300-plus languages. I have been editing the Malayalam language Wikipedia, which is my mother tongue, and there is too little content related to mental health in Malayalam, so we need editors in all the different languages to amplify the content related to mental health, as well as to provide expert advice and expert input into all these articles, so that they remain verifiable and up to date.

Thank you so much. Praveen and Tina, could you also give your thoughts? Sure. I think in this world where internet usage is growing exponentially and people are seeking health information online, there's a great role for the community to come forward and build a mental health repository, not just in English but, as Dr.
Hussain said, in all regional languages, and I think that these communities have an opportunity to increase access to knowledge on mental health, and suicide prevention is a key impact subject.

Thank you so much, Praveen. Tina, how about you? Okay, so writing about mental health responsibly is a complex task, so we're not going to solve this issue in this panel. But regulators and governments around the world are trying to, to be direct, police harmful content, and some of this content includes important health information. Unfortunately they're taking the same approach that larger tech companies have taken, which is to apply a policy, often based in the US, to communities around the world, instead of taking the perspective of communities into account. So my takeaway from this is: Wikipedians, keep doing what you're doing. Stay curious, follow the sources, challenge what governments around the world say, but also be mindful of emerging research that identifies some content as being indeed harmful, explore the tensions between censorship and best practices, and be mindful of how what we write on Wikipedia may affect those you may not have thought would be affected.

All right, before we use the last five minutes to open for questions, Praveen, I would like to give you the floor to talk about your work with Art for Good. Thank you, Stella. So as I mentioned before, Art for Good was a campaign inviting all the artists in India to come forward and create art around mental health which can help destigmatize the behavior of not seeking help, and in collaboration with the Suicide Prevention India Foundation we reached out to the artist communities in India. There are more than 50 artists who participated, from those who understand mental health from a background in psychology to someone who has seen loss in the family or who has been through mental health struggles themselves; they participated in this
campaign to build an image repository which is available on Commons for all. So that was all, and moreover, with this initiative, I think this is one of the examples we wanted to set with the communities, where we work with different partners who are engaged in outreach on mental health and the specific topic of suicide prevention and who have great expertise in these areas. Through this collaboration we are able to build our information repositories, and with the help of collaboration between different communities and organizations, I think we can build more information around mental health and suicide prevention on Wiki projects in general, which will be useful for people.

Thank you so much, Praveen. All right, five minutes remaining, and we have a couple of questions on the docket for our presenters. The first one: when a Wikimedian is accused by a government of adding illegally harmful content, how does the Foundation communicate that to the editor? If the information added is legal in the US but illegal in the editor's jurisdiction, what requirements, if any, bind the Foundation and the editor? Tina, would you like to take this one? Sure. So as we discussed with respect to the applicable law determination, it varies a lot, and we evaluate these cases on a case by case basis, but always taking into account human rights; that's something that is very important to us. But we do try to support communities as much as we can. We try to work with the volunteer communities and communicate to the extent allowed by law, taking into account the safety of the community as well as the risks to the Foundation. Thank you so much. The next questions tend to revolve around the legal aspect of this work, so I will allow Tina to take the first stab, but if any of the other panelists want to weigh in, that's absolutely okay. The next one is: the
meaning and tradeoffs in human rights law will often depend upon case law and may only be determined through litigation. Does Wikimedia interpret case law in making such judgments, clearly a time-consuming activity, and what will Wikimedia do when the case law is unclear and may only be clarifiable through court proceedings? I've attached it in the chat as well. Well, we do our analysis as best we can, but we often work with local expert counsel to evaluate each case, and if we decide to take a case, we also hire outside counsel who has that expertise. Perfect, thank you so much, Tina. Are there any other questions that people want to bubble up in the chat as I go through the etherpad? Let me find one last question here, and this one I'd like to open to any panelist who would like to take it: to my mind, the interests of mental health practitioners and those who may be treated by them are not necessarily aligned. An example is that interventions taken to reduce the immediate risk of suicide in the short term may increase long-term risks, while providing a practitioner with the sense of having tried to do something. Literature directly targeting service users may hide information and mislead users. How can the value of expert knowledge be traded off against the implicit bias and distrust that a focus on suicide prevention could create, potentially pushing readers to forums that are less regulated and have less regard for the truth? This will be our last question before we all say goodbye today. Would anyone like to weigh in? I will just say that Wikipedia has an important role in helping folks who don't have medical expertise ask the right questions and know when their practitioners might be biased. I will pass this on to Dr Hussain. This is a very difficult question to answer; I don't really have a concrete answer for it. Wikipedia is not exactly a forum; we just mirror the information that is presented in different reliable sources
and we show it on our platform, so it's not actually a forum where you can ask a question about your mental health and get an answer from an expert, and so on. I recently looked at the article about suicide methods on English Wikipedia, and I saw that there are approximately 7,500 people per day reading that article, and I'm pretty sure that some of the people reading it are having suicidal ideation, and a small minority of them would probably take their own life. We could provide points of contact on our websites, but from my perspective I think that we cannot censor the information. For people who are ideating about taking their own life, we have to try and find a way to reach them, even though Wikipedia doesn't provide that kind of support, so we have to balance...

Dr Hussain, so sorry, I just wanted to gently flag that we are at time. I would love for you to provide your written thoughts as well when we follow up with everyone on Meta. Thank you, everyone, for taking the time to attend today's session. Again, we really appreciate our panelists, and we will be looking to answer any questions that we missed asynchronously. Thank you again, everyone, for attending today's session. Tina provided her email in the chat as well, so if you want to get in touch with her, or with any of our other panelists through her, please feel free to send her an email. But thank you, everyone; we can consider this session adjourned.