Hello everyone, thank you all so much for joining, both in the office and online. Just a quick note: please hold your questions until the end. I should be able to get through the entire presentation, so we'll have time for questions at the very end. And for those of you online, we do have someone watching IRC and YouTube for questions and comments as well. Thank you so much, Joe, for that. And with that, we'll get started.

So, that's me: my name is Edward Galvez. I'm an evaluation strategist on the Learning and Evaluation team. Our team promotes the use of evaluation for learning, to improve decision-making that will better Wikimedia projects and communities. In short, we help people measure their impact and learn from their work.

So I went to Wikimedia Commons to find pictures of communities, but I ended up going down a little bit of a rabbit hole, and I ended up with this community of penguins. These ones will be taking us through the presentation today. As many of you know, Wikimedia is a global community from multiple countries, languages, ages, and all walks of life, and it is a guiding principle for the Foundation to collaborate with communities. We want to make sure that we're hearing from communities when we're planning our work, but finding ways to engage with Wikimedians can be difficult. Oftentimes we use talk pages or mailing lists to try to get feedback, but that can be difficult with a large group of people.
So one way for us to hear from communities is to use surveys. However, big surveys can be hard, as this guy doesn't look too happy right now. We're a global movement, with lots of languages and lots of projects, and it can be a lot of work for one team to survey the whole world, or all the editors that we have. So the idea was for us to band together as a community and as an organization, across multiple teams, to survey our communities, and that's how CE Insights got started.

So, really briefly, what is Community Engagement Insights? It's an annual global survey to help Foundation teams hear from the communities we serve, so teams can make informed decisions.

So why are we here today? I hope you're as happy as this one. We're going to talk about what we did in the survey, a little bit about what we learned, how we're using the results, and finally there will be time for questions at the end.

So, a little bit about what we did. Quick overview: we surveyed over 4,000 people across four Wikimedia audiences and 13 languages. And who did we survey? We surveyed four community audiences. The first is editors, or contributors: we used a stratified sample and reached them through talk pages. We surveyed affiliate organizers: for them, we made sure to reach out to all the affiliates, and we used email. For program organizers, we have a list of contacts, and we reached them using email as well. Just for context, a program organizer is someone who organizes the Wikipedia Education Program, edit-a-thons, GLAM workshops, and things like that to promote Wikimedia. The last group was volunteer developers, where we used a convenience sample. We don't really have a good way to sample volunteer developers right now, so I sent messages on multiple mailing lists and different platforms like Phabricator to reach them.

So what was in the survey?
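A quick aside on the editor sampling just mentioned: a stratified sample draws separately from each language community rather than from one global pool. Here is a minimal Python sketch of the idea; the wiki names, editor pools, and quotas below are invented for illustration and are not the survey's actual sampling frame.

```python
import random

random.seed(7)  # reproducible draw for the example

# Hypothetical editor pools per wiki; these are the strata. The real
# survey's sampling frame came from editor activity data, not lists
# like these.
strata = {
    "enwiki": [f"en_editor_{i}" for i in range(1000)],
    "dewiki": [f"de_editor_{i}" for i in range(300)],
    "arwiki": [f"ar_editor_{i}" for i in range(120)],
}

# A quota per stratum, so smaller-language wikis are still represented
# instead of being swamped by the largest project.
quotas = {"enwiki": 50, "dewiki": 30, "arwiki": 20}

# Sample each stratum independently, without replacement.
sample = {
    wiki: random.sample(editors, quotas[wiki])
    for wiki, editors in strata.items()
}

for wiki, picked in sample.items():
    print(wiki, len(picked))
```

Sampling each stratum separately is what keeps smaller-language communities represented in the results instead of letting the largest projects dominate the sample.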
We had 170 questions in all, but no one saw all 170 questions; that's a lot of questions. Each audience would see a different number of questions, and it was about 50 per person, give or take.

So what did we ask? We really focus our survey questions around the work that people are doing. First of all, we're trying to understand the context that these audiences are volunteering and doing their work in. Then, related to Wikimedia Foundation programs, teams want to know about awareness, like how many editors know about their program, or how people have participated in various communication channels, for example. So these are the types of questions that we asked, and we'll get into that a little bit more later.

And who worked on this? We had 11 teams participate this year. You can see the list of teams on the left over there, and on the right we have a list of folks who helped with this project. People helped with question design, communications, developing some small software for our tools and bots, translation, copy editing, learning workshops, and sampling. People did a lot, so thank you so much to everyone who helped with this project this year. A huge thank you.

Before we get into the data, I wanted to quickly do an overview of our response rates this year. From editors, we had almost 3,800 people respond to the survey.
That was about a 29 percent response rate for the people we reached out to, an increase of about 11 percent from last year. For program organizers, we had 153. We actually had more people respond this year, because we tried to reach out to more program organizers, but we did see a decrease in the response rate. We're not quite sure why, but we suspect some of our lists might be a little bit old. For affiliate organizers it was similar: we had more affiliates respond, but our response rate was lower, and we think that's probably because the Wikimedia Conference was happening right at the end of the survey, so people were preoccupied. The last group is volunteer developers, where we actually saw a really big increase: 37 percent more people responded. Thank you to Maria Cruz for helping with the communications for that.

So now we're going to go into the data. Let's see what we learned. First, one of the big questions we often ask ourselves is: what is the diversity of the Wikimedia community?
So let's look at what that means. Before I go on: generally, the four measures we looked at were age, geography, gender, and education. And because this is the second year we're doing the survey, we can actually start to detect changes from one year to the next.

In the next slides you're going to see some graphs with some of these terms, so I want to quickly run through what they mean. "Middle East and Africa Wikipedias" means Arabic Wikipedia plus a sample of other languages in the region. "India and Asia Wikipedias" is Japanese and Chinese. "Eastern Europe" is Russian and others. "Western Europe" is Italian, French, German, and Dutch, plus a sample of others in the region. "Spanish and Portuguese" is only Spanish and Portuguese, because those editors are divided between Europe and Latin America in terms of the region where they live. "English Wikipedia" is just English. And "all other Wikimedia projects" is a sample of everything else, with a little extra sampling of Commons and Wikidata. So the languages listed here represent the majority of the people in each of these categories. Hopefully that will help you with the next slides.

Okay, so: gender. One thing to call out before we move on: program organizers, affiliate organizers, and volunteer developers are also on this slide, so we can see how the audiences compare. Specifically for editors, across all the Wikipedias and the "other Wikimedia projects" category, we did not find a statistically significant difference in gender; the average was still 10% women. If you look across the audiences, we can see a range from Spanish and Portuguese Wikipedias, which are at 5%, all the way up to program organizers, which is pretty high; there was a pretty big increase this
year, to 35%. Then, coming back to editors, we did see differences when comparing the global north and emerging regions. We generally saw a decrease in editors coming from emerging regions. But we did have a change in our sampling strategy, which may have affected our response rates for emerging communities overall, so that's something we need to look at to see how much it affected our numbers. Basically, not too much change here.

Age: across the audiences, the median age is still between 35 and 44 years. For editors there was a slight increase in age. Because these are categories, it's hard to describe how much that change was, but basically more people selected higher categories in the question. In general, younger Wikipedians are on the Japanese, Arabic, and Chinese Wikipedias, or what I call here the "India and Asia" and "Middle East and Africa" Wikipedia languages, and older Wikipedians are on English Wikipedia and in Western Europe, the two on the bottom there.

Now a little bit about geography. Due to our sampling approach, it's a little bit difficult to weight the data for this, so these are just the people who responded, and because of how we sampled, we have a lot of people who came from Western Europe. The way we sample is that we pick the largest projects, so in general this does still represent that Western Europe is a big portion of our editor population. And as I said before, we did detect a decrease in editors from emerging countries, but this might be due to the change in our sampling strategy. What's useful to see here is how the other three audiences compare with editors.
You do see a little bit more diversity there, because Western Europe is less represented compared to the editor population.

And finally, education. This was the first time we asked about education. Basically, a lot of people are very well educated: among editors specifically, 85 percent have post-secondary education or higher. Program organizers and affiliate organizers have the highest levels of education; over 90 percent have a first university degree or higher. The Middle East and Africa and the India and Asia Wikipedias have both the youngest populations and the lowest education levels, which would make sense: if you have a younger population, fewer people will have completed college yet. So we kind of expect that.

In short: men continue to be overrepresented across the audiences, most contributors are from the global north, everyone across the audiences is well educated, and age is the most balanced, though some communities are younger and some are older. A quick summary there.

Now we're going to go into: what is the current health of Wikimedia communities? This is going to get into a lot of detail, so bear with me. First, how do we measure community health? We use three different categories of measures. The first is what we're calling collaboration and engagement: how much people feel like they're part of the movement, and whether they're going to continue participating in the future. The second is diversity and inclusion, which is really about how receptive the community is to different people and different voices, and to including diverse folks in their communities. And the last is harassment and conflict. Our theory is that conflict likely leads to harassment, and that's something that tends to happen in online communities, so that's something we're trying to look at as well.
These are not exhaustive measures of community health; they're just the ones that we're looking at for the survey this year.

Before I get into numbers, I need to explain how these questions work. Collaboration and engagement is six question sets, and each question set has three to six statements. An example statement is "I would recommend Wikimedia as a great place to contribute." A person who responds to that statement says whether they agree or disagree, on a scale of five: strongly disagree, disagree, neither, agree, and strongly agree. There are about 28 different statements in this section, and those 28 statements are grouped into the six different areas.

I'm going to go through each one. Engagement is about motivation to contribute. Foundation leadership is how contributors perceive the Foundation's role in the movement. Feedback and recognition is how contributors learn from others and feel achievement and a sense of being part of the community. Problem-solving and negotiating is how contributors are able to work through conflict or problems. Collaborative intention is how contributors support, or don't support, one another. And awareness of self and others is how contributors perceive other people's self-awareness; not your own self-awareness, but other people's, which is a funny one.
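As an aside for the researchers in the room: scoring one of these agree/disagree constructs, and checking its internal consistency with Cronbach's alpha (the "question quality" measure that comes up on the next slide), can be sketched like this. The response matrix below is invented for illustration (six respondents, four statements) and is not survey data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x statements) matrix of
    1-5 Likert scores: k/(k-1) * (1 - sum of item variances /
    variance of the per-respondent totals)."""
    k = items.shape[1]                         # number of statements
    item_vars = items.var(axis=0, ddof=1)      # variance of each statement
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented responses to one construct: 6 respondents x 4 statements,
# coded 1 = strongly disagree ... 5 = strongly agree.
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 4, 3, 3],
])

construct_mean = responses.mean()  # the construct's overall "mean" score
alpha = cronbach_alpha(responses)
print(round(construct_mean, 2), round(alpha, 2))
```

With a threshold like the 0.7 used in the survey reports, an alpha at or above that line would count the construct's wording and grouping as acceptable, and anything below it as needing work.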
Okay, so I'm going to walk you through this next slide as well. On the far left you have the construct names. On the bottom is the significance of the differences and the change; we can ignore that for now. Next to the constructs we have the mean: the score we get by averaging all the questions within that specific construct, and the constructs are listed in order from highest to lowest. Next to that is the percent change: how much it changed from last year to this year. And then question quality. Question quality is for the researchers out there: it's actually the Cronbach's alpha for the construct, and I used 0.7 as the threshold for whether it's good or whether it needs work. This is about the wording of the questions and how we group the different statements for each of these constructs.

The other thing to note is that we did have some changes to the wording, and when the wording changes, comparability from one year to the next can be a little bit difficult. The wording changed for all of them except feedback and recognition. For foundation leadership:
There was a small change in one of the statements. We're not sure if that really influenced it very much, but it did change a little bit, so we have to call that out.

Okay, so now let me go one by one and give a little summary. Engagement: there was a very slight decrease, but it still remains one of the highest measures; it came down from 4.17 to 4.08. Foundation leadership: this one had the largest decrease among the measures; it went from 3.86 down to 3.65. These questions come from the Community Engagement department, and they're looking into what this decrease means and where it comes from; we can get a sense of where it might be coming from because there are different statements within foundation leadership. Feedback and recognition: no changes were found. Problem-solving and negotiating: there was an increase of 4.1 percent. Collaborative intention: no change. And awareness of self and others: we had to modify this one from last year, but when we did, for a couple of statements there was an increase of 5 percent. However, that one still remains pretty low; it's less than 3.0,
which is really an area for improvement.

Diversity and inclusion. I'm going to run through these statements really quickly. It's a similar approach, except that only the first four are agree/disagree scales; the last two are a little bit different in terms of question type, and I'll talk about those when we get to them. To give you the gist of each: individual commitment to diversity is contributors' self-perceptions toward diversity, their reflections on diversity. Inclusive interactions measures whether communities have a space where people can communicate freely. Inclusive culture is whether individuals in the community take actions to improve diversity. Belonging is how included people feel among the people around them. Valuing diversity is contributors' reflections on whether their community values diversity through actions or policies. And presence of discrimination is how often contributors feel they are being treated unfairly based on personal characteristics.

So, the first four on top: individual commitment to diversity is doing well at 4.15; somewhere around 4.0 is what you try to go for. Inclusive interactions is a little bit lower, and belonging and inclusive culture are the two lowest.

For the other diversity and inclusion scales: presence of discrimination was a frequency scale, so how often do people experience discrimination, and across all the audiences the answer was "rarely" or "never," which is what we want to see. And then attitudes toward valuing diversity.
For this one, it's a little bit different: it's a list of statements, and people check off which ones they think are present in their community. An example statement might be something like "leaders in my projects show that they value diversity on the projects through the policies that they create," and people either check that or don't. Across all the audiences, the average was about one and a half checkmarks out of five. However, program organizers and affiliates actually had about 2.5, so they had a pretty big difference from editors, who were at 1.5.

Okay, now moving on to harassment and conflict. These are not constructs like the others; these are questions that specific teams were interested in answering. Generally, the experience of harassment has not declined since 2017, and it appears to remain steady. 22 percent reported that they felt unsafe or uncomfortable in any online or offline space in the last 12 months; in 2017 it was 32 percent. However, there was a change in the language of the question: last year we didn't ask about the specific 12-month time period. So we know that it's not worse, because it's not higher than 32 percent, but we don't know if it's better. We'll see: if we ask the question again next year,
we'll be able to see if there's a difference. We also asked a question about the frequency of bullying and harassment on several of the projects, and in most cases we didn't see a change. There were a few small changes: Wikimedia Commons increased a little bit, Meta decreased a little bit, and Wikiversity decreased a little bit, but those were very small changes. So that's harassment and conflict.

In terms of conflict: the Anti-Harassment Tools team was interested in learning about conflict, and what they learned was primarily context surrounding conflict on the projects. They learned that about 43 percent of editors have tried to help resolve a conflict on Wikipedia. When we asked them why they help resolve conflict and summarized the data, about 38 percent said they were trying to help Wikipedia as an encyclopedia, whereas 39 percent said they were trying to help Wikipedia as a community. So some focused on the product, the Wikipedia product, whereas others focused on the people. Separately, these other two questions are attitude questions: 55 percent did not know where to turn for help when they were being attacked on Wikipedia, and 46 percent said they felt they were freely able to express their thoughts without being attacked on Wikipedia. For that last bullet, we saw a difference in how men and women responded, where women were a little bit lower than men.

A little quick break here; it's a lot of information. All right, so now we're going to get into some stories about programs that support communities. We have a lot of teams that support communities in different ways, and teams submitted specific questions around their work, so I want to showcase a little bit of the other types of questions that you can find in the reports. One second; sorry, quick water break.

Okay, so the Communications
department. The gist of the Communications department is that they help share the story of Wikipedia and the Foundation with the world, and they asked some interesting questions around how Wikimedians communicate.

The first is around sharing about Wikipedia. We asked folks: in what ways, and on what channels, do you share your work on Wikipedia? Channels like social media, mailing lists, things like that. And we saw an interesting difference in how people share. Basically, editors in general were a lot less likely to share anything compared to volunteer developers, program organizers, and affiliate organizers. This includes social media, Wikimedia mailing lists, et cetera; editors just generally don't seem to share things about their work on other channels.

Then, on the flip side: how do they learn, or where do they receive information? We asked: what channels do you use to learn about Wikimedia Foundation features and services? (Is it black and white? It is black and white, at least on my screen. Okay, good to know.) This is a little bit hard to follow because it's a heat map with a lot of information on it, but basically we have all four audiences represented here, from both 2017 and 2018. Where you see the darker shades, the darker red or just darker color, those are affiliates, program organizers, and volunteer developers. So basically we see affiliates, program organizers, and developers use a wide variety of channels to learn about Foundation features and services: the blog, social media, conferences, mailing lists, project pages, et cetera. On the flip side, if we look at editors, you see it's a lot lighter in color, so they generally just don't use these channels as much compared to affiliates and program organizers. In total, about 45 percent of editors said that they don't use any of the channels to learn about Wikimedia
Foundation services or features. So that was some interesting context to learn.

Community Resources supports communities with resources like funding for projects or events, as well as other support, and they had some interesting questions around the conferences that they support: conferences like Wikimania, the Wikimedia Conference, and regional events. They wanted to learn: what is the outcome of these conferences? What are people taking away from attending them?

This is another difficult chart to read quickly, but all the conferences are on the left-hand side, and on top are five different outcomes. To summarize them from left to right: discovery, followed by starting a project, followed by resolving conflict or changing policy, followed by recognition, followed by improving or learning a skill. In general, all conferences seem to be really good at discovery, as well as starting or improving projects; you can see that because the two left columns are darker in shade. However, Wikimania seemed to be the best: it really seems to excel at helping people discover new things. And some other conferences were better at different things. We see that the Wikimedia Conference and national and local conferences are really good at helping to resolve conflict or possibly change policy. That's in the middle column there;
So that's in the middle in the middle column there You can see the the two slightly darker shades and then in terms of learning or improving skills on the far right We see that regional national and thematic events are best suited for for learning And then participants and then the least often selected was recognition, which is the second one from the right Participants less selected that least often among all the different outcomes So yeah, so this was interesting to learn Finally our team my team a learning and evaluation. So we support affiliate and program organizers and using data for learning and improving program design so We asked some questions around People's capacity to do evaluation. So how well do you take the time to plan your project and also Figure out how you're gonna measure the impact of your in your program. So we ask Affiliate and program organizers a set of questions around this and in general what we're seeing is a general increase in Evaluation capacity. So this is a little bit difficult to read But basically the bars in the green is related to a survey that we did we used to do Where we had a very focused list of People that we that we emailed the survey to but now in the last two years since we started doing communication insights It's the the number of people participating in this survey has expanded So while we do see a bit of a decrease in that light blue one there because we expanded the scope We are still seeing an increase this year Basically in general there's been an increase in evaluation capacity and evaluation capacity. So Yeah, so this is really great and then community relations so The community relations team supports communications between the foundation staff and Wikimedia stakeholders and they also learned some interesting Things around context. So we asked the questions about how do editors prefer to receive updates about software development from the Wikimedia foundation? 
Tech News is the most preferred resource among both low- and high-activity editors. Low-activity editors also seem to rely more on social networks than the blog, while high-activity editors seem to rely more on village pumps and Wikimedia news sites like the Signpost or the Kurier. So yeah, it's a lot of data.

Okay, so next: similarly, that was how people receive updates, and this is how people give feedback to developers. We see that the majority of editors don't really care to give feedback to developers; you can see that in the 59 and the 51 percent on the far left. Affiliate and program organizers seem to use Phabricator often; that's in green. You can see the three groups on the right-hand side, the DEV, PL, and AFF columns, where the green colors are a little bit higher. Volunteer developers use Phabricator as well as mediawiki.org. And we also see that program organizers, the PL column there, second from the right, actually also don't really care as much about giving feedback to developers. So, some interesting context around how our audiences like to send and receive information to and from the Foundation.

Okay, so we have all this data; it's a lot of information. How are we actually using it? We held a bunch of learning workshops with each of the teams that participated, to help them understand and use the results; thank you to Dana McCurdy for helping with those. First, it's important to note that using data takes a lot of time.
All the teams are in some kind of conversation about how they're going to use their data, whether to do more research, to improve their programs, or even to ask different questions in the survey this year. But I wanted to share a few highlights of how teams are starting to think about using the results. Anti-Harassment Tools wants to use the data to inform various projects; they're working on developing community health metrics and designing a reporting system for conflicts. Community Programs is thinking about exploring trainings, as well as communications and advocacy for education, and about increasing awareness of the Wikipedia Library. Partnerships and Global Reach is looking to reach out to affiliates who mentioned that they need help with partnerships, and to explore how to increase awareness of the team in some regions. The Legal department wants to discuss and continue to develop their communication strategy for the semi-annual transparency report. Learning and Evaluation, we want to teach organizers how to create a culture of learning, possibly using train-the-trainer programs. And finally, Trust and Safety is looking to improve awareness of the emergency@wikimedia.org email address. So teams have really started to dig into the data and to find specific things that they can use in their planning and in their work, which is pretty awesome.

All right, so what's next? As I said, teams are starting to use the results; it takes a little while for teams to incorporate them into their work, and we're going to follow up with them and see how that goes. CE Insights 2019: next year's survey is already getting started, and we're hoping the survey will be distributed in April of next year. And what you can do now: please read the reports. You can find them on Meta, and there are 11 team reports as well, so you can dig into each team based on what you're interested in. Feel free to post your reflections
or questions on the talk page. And just, you know, ask yourself the question: how is this data useful for you and your work? So that's it. Any questions?

Joe has a question. "We do have one question, let me just put this up. Yes: forgive me if you've already answered this during the presentation, but has sense of community ever been measured? Sense of community, I suppose, as in people, editors in the various audiences, believing themselves to be part of a community."

We do have a question around belonging. I am not sure if that question specifically asks whether they identify as being part of a community; I would need to take a look at the statements to see what they asked, but it could be in there.

James here: "I saw a rather big dip in foundation leadership, or the perception of such. Do you have any insight as to what caused that dip?"

The Community Engagement department is working on trying to figure out what that means. We are looking at it: because there are, I think, three statements in there, we can see which ones dipped from last year, so we need to take a look at where it's coming from. It's something that we're looking at.

Yes, that's a good point; yes, it is the second highest. Jamie, who writes the questions for the Community Engagement department, said that foundation leadership is still the second highest among all the constructs, so that's something to note. And there just might be what's called regression to the mean: as you do the survey year over year, the higher measures and the lower measures will slowly move toward some mean.

Any other questions? It was a lot of information, I know. Anything surprising? Interesting? Nothing surprising. Do you guys like the penguins? Okay. There are no more questions? Yep, there are no more questions. Thank you all so much.
Please do check out the report. I know it's a lot of information, but especially if you work with communities, it can be really helpful in telling you a little bit more about them and about how we're working with them. So yeah: read the report, add your comments. Thank you. Oh, that's it. Thank you so much.