Okay, so let's get started. I hope all of you had a chance to have a break, whether you talked with each other or decided to have a quiet one. It's my honor to be moderating the upcoming panel. For those of you who have just joined, my name is Leila Zia. I'm the Head of Research here at the Wikimedia Foundation and one of the co-organizers of Wiki Workshop. For the next 45 minutes, I'm going to be talking with the four panelists that you see on the screen, whom I'll introduce momentarily, about inclusive collaboration towards high-quality digital common knowledge. So let me introduce the speakers to you. Jerome is an assistant research professor at the French National Center for Scientific Research (CNRS). He's a research affiliate at the Center for Law and Economics at ETH Zurich and a faculty associate at the Berkman Klein Center for Internet and Society at Harvard University. We have invited Jerome here because his background is in behavioral economics, and his research primarily operates at the boundaries between psychology, economics, and computational social science. His current research interests are in how people deal with their decision-making biases, as well as the determinants and consequences of communication style and leadership in virtual teamwork. We're glad to have you here, Jerome. Our next panelist is Benjamin Mako Hill. He's an assistant professor in the University of Washington's Department of Communication, an adjunct assistant professor in the Department of Human Centered Design and Engineering and in Computer Science and Engineering, and affiliate faculty in the Center for Statistics and the Social Sciences, the eScience Institute, and many more places. Mako is also a faculty associate at Berkman Klein.
He and his lab study collective action in online communities, and they seek to understand why some attempts to build collaborative production projects like Wikipedia or Linux work well, while others do not attract volunteers. Our third panelist is Kristina Lerman. Kristina is a project leader at the Information Sciences Institute at the University of Southern California. She holds a joint appointment as a research associate professor in the Computer Science Department of the USC Viterbi School of Engineering. Her focus is on applying network- and machine-learning-based methods to problems in social computing, and we're glad to have her here, particularly for some of her recent work on cognitive bias and its impact on collaborative networks. Our last panelist is Misha Teplitskiy. He's an assistant professor at the School of Information, University of Michigan. He's an alumnus of Wiki Workshop, and we are really glad to have him back. His research is at the intersection of the science of science, the sociology of organizations, and computational social science. He studies how social and organizational factors affect scientific discovery. Previously, Misha was a postdoc at the Laboratory for Innovation Science at Harvard, and he received his PhD in sociology from the University of Chicago, where he was a member of the Knowledge Lab. Thank you so much to our panelists for showing up. And with that, I'm going to switch to the first series of questions. As a reminder for the audience, for the first 25 to 30 minutes the panel will be discussing different questions, and then we'll come back to you for the last 10 or 15 minutes. Please keep raising your questions in the chat. Isaac will keep track of them, and when it's your turn you can ask your question yourself, or I can read it out to the panel. So, Kristina, I would like to start with you.
As we all know, Wikipedia has been a platform for open, collaborative knowledge production for almost two decades now. And today, while it serves as a project with high-quality content, we also see a lot of challenges around it. The topic of this panel is primarily how we can create an inclusive collaborative environment in Wikipedia that can help the project continue to produce high-quality content and provide it as part of the digital commons. I want to start with you partly because part of your research focuses on understanding the effects of cognitive bias on the solutions identified through collective processes. You have particularly studied Stack Exchange, and I wanted to ask you to share with us some of the things you have learned through that research. Sure, thank you. I'm going to try to summarize my talk in about four minutes. Yes, I study the interactions between the cognitive biases individuals use to make decisions and platforms' algorithmic biases, and how those interactions impact wisdom-of-crowds applications. If you're not familiar with cognitive biases, these are based on simple mental shortcuts, or heuristics, that people use to make quick but less accurate decisions. These heuristics introduce a kind of statistical bias into people's actions. What types of cognitive biases are there? Psychologists have actually enumerated more than a hundred different shortcuts that people use to make decisions, but in the context of crowdsourcing applications, the one I have focused my attention on is mostly position bias. For example, if you look at studies of where people pay attention on a web page, the places people pay most attention to, at least in Western cultures, happen to be the upper left and, mostly, the upper part of the page.
So whatever information shows up at the top of a page, more people will be likely to see it. There is also social influence bias: different web platforms show what other people like, for example the number of likes, or they highlight hot or popular items to people. These cognitive biases often interact with algorithmic bias, and by algorithmic bias I mean that websites either show how popular items are, or they rank items; they hope to rank by quality, but most of the time they rank items by popularity, so that the number-one, top-ranked item is the most popular one. We have explored this interaction between cognitive and algorithmic biases specifically in the context of Stack Exchange, because Stack Exchange wants to get the community to help discover the best answers to questions. Through both statistical analysis and online experiments on Amazon Mechanical Turk, we found that instead of looking through and evaluating all the answers to a question, people tend to focus on answers that are near the top, that have more words, or that have been selected by the asker as the best answer. Moreover, the more answers are available, the more people rely on these shortcuts, these simple heuristics, by choosing the top answer or the answer that has already been accepted, rather than upvoting to select the best answers. So then, when Stack Exchange aggregates all these decisions, all these upvotes from people, suboptimal answers might be the ones that float up to the top rather than the best-quality answers. That's the short version.
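To make that feedback loop concrete, here is a minimal toy simulation of the mechanism just described. This is not the panelists' actual experiment: the attention-decay parameter, the voting rule, and all function names are illustrative assumptions. Voters mostly inspect top-ranked answers and upvote in proportion to quality, while the platform keeps re-ranking by vote count.

```python
import random

def simulate_ranking(qualities, n_voters=500, attention_decay=0.5, seed=0):
    """Toy model of position bias meeting popularity ranking.

    qualities: per-answer probability that a voter who sees it upvotes it.
    attention_decay: probability multiplier per rank; a voter sees the
    answer at rank r with probability attention_decay ** r, so voters
    rarely scroll far down. The platform re-ranks by vote count after
    every voter, which is the "algorithmic bias" part.
    """
    rng = random.Random(seed)
    votes = [0] * len(qualities)
    order = list(range(len(qualities)))  # current ranking, top first
    for _ in range(n_voters):
        for rank, idx in enumerate(order):
            if rng.random() > attention_decay ** rank:
                continue  # this voter never scrolls down this far
            if rng.random() < qualities[idx]:
                votes[idx] += 1  # upvote in proportion to quality
        order.sort(key=lambda i: -votes[i])  # popularity re-ranking
    return order, votes

# With no position bias (everyone sees everything), the best answer wins.
order_fair, _ = simulate_ranking([0.4, 0.9, 0.6], attention_decay=1.0)
# With strong position bias, the mediocre answer that starts on top
# collects early votes and can lock in a head start.
order_biased, _ = simulate_ranking([0.4, 0.9, 0.6], attention_decay=0.5)
```

Under the assumptions above, the `attention_decay=1.0` run recovers the quality ranking, while lowering it shows how early exposure, rather than quality, can drive what "floats to the top."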
Also, in more recent research, we are showing how this interaction with the way algorithms display items can produce unstable rankings, in which, the next time you run a similar experiment, different items, different answers, will rise to the top. So we are also investigating methods not only to quantify how much these interacting biases affect the quality of the solutions the crowd discovers, but also to reduce the biases and really enable the community to discover the best-quality answers. Thank you very much for that, Kristina. Misha, I want to turn to you for our second question, and I want to ask you to talk about the recent research that you and your colleagues have done on understanding the effect of bias on Wikipedia, and how Wikipedia has managed bias in a productive way towards quality content production. Can you speak to what you have learned through this research? Sure, a quick rundown of the work. This was a recent paper called "The Wisdom of Polarized Crowds." What we did there, myself and a team of sociologists and information scientists, was go after the classic question of diversity and performance, particularly cognitive diversity, which is often really hard to measure in real settings; it's easier online, of course, and Wikipedia makes it an especially good case. We were trying to understand the role of this newish, or increasingly important, dimension of diversity: people's differences in ideology. You might think that cognitive diversity is generally good for surfacing new ideas that people then combine. But of course, if we look around the world, ideology has this other component where it's increasingly tied to our general identities, to increasing affective polarization.
That is what some people call it: for example, if you ask people whether they would marry a Republican or a Democrat, increasingly people say no, depending on who they are, things like that. So we wanted to see how this newish dimension of diversity plays out in Wikipedia. We looked at the editors of about 200,000 pages and tried to understand their ideological position based on what kinds of political pages they edit, conservative ones or liberal ones. With that measure in mind, what we found, surprisingly to us, was that the more ideologically diverse the team, the better the page. And that was a pretty robust relationship that we basically couldn't make go away no matter what we tried. It also works in, I guess, a predictable way, where the relationship is especially strong when the domain is politics and weaker when it's less related to politics: for social-issues pages the relationship is weaker, and for science pages weaker still, but it still appears a little bit. So the paper tries to explain this positive relationship, which we see in very few places in the world, where ideological differences are actually good, or seem to be creating better products. What drives this kind of pattern? At a micro level, we see that people in ideologically diverse teams discuss fewer topics, in the semantic sense, but in more ways: they use more unique words to discuss a smaller set of topics. So there's a sort of concentration effect, I might say, and that's broadly the rest of the paper. In general, I think it raised more questions than answers. This positive relationship is nice to see, but why is it appearing here and not elsewhere?
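The paper's actual measure is more elaborate, but the idea just described, scoring editors by which political pages they edit and then measuring the team's spread, can be sketched as follows. All names and the exact scoring rule here are illustrative assumptions, not the paper's method.

```python
from statistics import pstdev

def editor_alignment(conservative_edits, liberal_edits):
    """Place an editor on a [-1, +1] scale from their edit counts on
    politically aligned pages: -1 = edits only liberal pages,
    +1 = edits only conservative pages."""
    total = conservative_edits + liberal_edits
    if total == 0:
        return 0.0  # no political editing history, so no signal
    return (conservative_edits - liberal_edits) / total

def team_polarization(alignments):
    """One crude diversity measure for a page's editor team: the spread
    (population standard deviation) of the members' alignment scores."""
    return pstdev(alignments) if len(alignments) > 1 else 0.0

# A homogeneous team versus an ideologically mixed one:
homogeneous = [editor_alignment(9, 1), editor_alignment(8, 2)]  # both lean one way
mixed = [editor_alignment(9, 1), editor_alignment(1, 9)]        # one from each side
```

The paper's finding would then correspond to pages edited by high-polarization teams scoring better on quality, controlling for other factors such as team size and topic.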
If you survey online spaces, I think a few things jump out, which are not things we answer in the paper, but questions that the paper raises. One is that there's a big self-selection of people into Wikipedia. The kinds of people selecting into it might have ideological differences but might be more open to teamwork and disagreement than others. So there's a question of self-selection: the brand Wikipedia has, and how people respond to that brand when they choose to join or not. Another ingredient is that Wikipedia has one page per topic; it doesn't have three pages on one topic. That forces everybody into the same sandbox, which is pretty unique and which we don't see in other places. And lastly, and I think maybe the aspect most open to study and experimentation, is the role of oversight. Wikipedia is much more bureaucratic, with many more levels of oversight, than platforms like Twitter. So one question that was pressing in my mind is: how much oversight is good? Wikipedia has a good amount, and it seems to have productive effects for this kind of sensitive collaboration. But is that amount necessary, or is it optimal? That, I think, is an interesting question. So that's the rundown of our paper. Thank you for that. And I think that gives us a nice segue to Jerome's work. Jerome, can you talk a little bit more about your research in this space, specifically as we discuss oversight, governance, and moderation on Wikipedia? Part of your research focuses on the effect of these structures on minorities. Can you tell us a little more about what you have learned? Yeah, sure. Well, first, thanks for having me.
I'm going to start, just to build intuition for the audience as to where I'm going, with a couple of quotes, and then I'll tie back to what Misha was saying and develop the argument that I want to make in these few minutes. The first quote that I have for you is from Virginia Woolf. It's an old quote, from an essay she wrote called "A Room of One's Own." I quote: "There was an enormous body of masculine opinion to the effect that nothing could be expected of women intellectually. Even if her father did not read out loud those opinions, any girl could read them for herself; and the reading, even in the nineteenth century, must have lowered her vitality, and told profoundly upon her work. There would always have been that assertion, you cannot do this, you are incapable of doing that, to protest against, to overcome." So that's a quote from a woman, and it's pretty old. Here is a much more recent one, from somebody who is a Black man, Neil deGrasse Tyson, from a speech he made when he received his doctoral degree, in astrophysics I believe, in the 90s. He says in this speech: "In the perception of society, my athletic talents are genetic; I am a likely mugger or rapist; my academic failures are expected; and my academic successes are attributed to others. To spend most of my life fighting these attitudes levies an emotional tax that is a form of intellectual emasculation." Now, what does all of this have to do with Wikipedia? There's a significant gender gap, and an ethnic minority gap, on Wikipedia. And I want to point out right away that we must question whether this matters per se. I think it matters because it creates bias; it lowers the quality of the content, simply because people who could take a different stance on a topic are not represented.
It produces stereotypes when people look up information online. So basically, my message in these few minutes is that psychology may be the most fundamental bottleneck to the integration of ethnic minorities and women, as opposed to the bureaucratization of Wikipedia, which is real, but which is only an aggravating factor in my opinion; we'll probably return to that. If we think about many of the users who are with us today, many of them are probably early enthusiasts who found it amazing that you could argue with people from all around the world for the greater good. And as Misha was saying, Wikipedia has been amazing at creating rules that focus the debate on what people say, as opposed to who says it. That's their neutral point of view, and it has been so successful. Fifteen years ago, it's hard to go back, but nobody would have predicted that the very best articles on Wikipedia would turn out to be those that are highly controversial and attract a lot of different viewpoints. Some of the featured articles on Wikipedia are the ones on Barack Obama and on Jesus. That would have been totally unpredictable fifteen years ago. Now, what I want to point out here is that this whole system of managing different viewpoints, political biases, and opinions rests on contributors being resilient when they argue their points. That is, they're ready to, or even enjoy, arguing their points for hours with each other. If you got the intuition of those quotes I read at the beginning, my point is that this is not the case for many strata of society, in particular minorities and women. And that could be a major barrier to entry.
And those are not just quotes, because over the past 20 years of research in the behavioral sciences, we have collected a very vast amount of experimental and field evidence for the relevance of those quotes in people's real lives. I won't recall all the references, but we know that women and minorities are, on average, much more risk-averse than the rest of the population. They're much less self-confident. And they dislike competition; they don't like to compete with others directly. Field studies in psychology and economics have shown that those differences explain a substantial proportion of students' orientation choices as early as high school, controlling for academic achievement. And even within a given sector, more competitive individuals are estimated to earn 10% more than others for the same qualifications, which particularly disadvantages women and minorities. So I think we should keep all of that in mind when we think about why, and how, we want to address the systematic biases that arise from the lack of participation of certain strata of society on Wikipedia. We also have a bunch of field interventions that tell us that the effects of those differences are actually very large. Citing just one before concluding, and I don't know how long I've been talking for now, I could talk more: it's pretty obvious that when you go to college for the first time, it's common to experience social setbacks and feelings of isolation. It's a challenging new environment, far away from your family. But Black students are much more likely to interpret this as evidence that they do not belong at university, that this is something related to their identity, to their race. And in one Science paper, among several, what people did was run a quick social-belonging intervention, with a control group and a treatment group, without those students knowing.
They simply led students to attribute those setbacks to general factors that are common to everyone and absolutely not related to race. And this simple intervention created very large differences in their academic achievement three years down the line. They measured academic achievement, but not only that: self-reported health, depression, the accessibility of negative racial stereotypes, self-doubt, and also subjective happiness. So my point is: how do we make sure that women and minorities feel like they belong in a space where people enjoy arguing, sometimes even for its own sake? This is a much more complex problem to solve than simplifying the contribution process at the technical level, that is, creating a visual editor or something like that. It will require a lot of creativity, which should happen at the intersection between the behavioral sciences and platform design. A few leads there: we know that teamwork is efficient at increasing women's resilience in the face of competitiveness. So we need newbies to feel embedded in the social network right from their very first experience. Maybe we could think about a menu of help buttons, so that newbies would feel welcome and supported right from the start, and feel that this is not about a competition. Part of the challenge here is that it does not seem reasonable to expect this from active users, who already have a lot on their plate. So the question to me is: how can we engineer such processes within groups of newbies that share some interest? How can we connect them while shielding them from the self-assertive or bureaucratic behavior that, however much it has achieved in other domains of Wikipedia, does not work for this kind of population? I think this is what we need to tackle. Thank you so much.
And Mako, I think this brings us naturally to you and to your lab's research. You focus on understanding the trade-offs between engagement and quality. Can you share with us what you have learned through your research? Absolutely. My research is really focused on life cycles in knowledge commons and online communities. Most of our work builds on attempts to create datasets of lots of attempts to build communities like wikis. In one case, that involved looking at a whole bunch of the online collaborative encyclopedia projects created before Wikipedia, as a way of trying to uncover those dynamics. Much of it has looked at other wikis, so big datasets of wikis from Wikia, or at free and open source software projects. Now, one thing that I suspect a number of the people watching this know, though probably not everybody, is that English Wikipedia peaked in terms of the number of active editors in early 2007, and other language Wikipedias have peaked at different points in time, although many around that time; many others are still growing. Our work has shown that this pattern, which you may have seen if you look at many language Wikipedias, of really rapid, roughly linear growth that transitions quite quickly into a period of slow decline, is happening at different scales. It's actually a very general pattern around big, successful communities.
When we look at the early stages of communities, we find, and this should come as no surprise, and I think it connects to a lot of things people have already said on the panel, that one important factor in healthy growth, in the communities that become big and successful, tends to be openness: being open to, and able to incorporate, contributions from lots of kinds of people. Communities with relatively lightweight forms of governance find it easier to get off the ground. As you suggested, there seem to be trade-offs between a lot of the things I hear people on the panel talking about: on the one hand engagement, and on the other hand maintaining quality. Some of our work has built on work that others have done to really unpack this dynamic set of trade-offs between engaging more people, in ways that I think are in line with what Jerome has mentioned, and maintaining quality, in ways that connect to what Kristina and also Misha have spoken to. Part of the story, or part of my story, I guess, is that this is not just a story of the spigot being turned off, of the stream slowing down. That's part of it, but in these communities that transition into periods of maturity and decline, there are also big increases in newcomer rejection, which lead to decreases in newcomer retention. This was shown first by Aaron Halfaker, Stuart Geiger, and Jonathan Morgan, some subset of whom I'm sure are watching this, or will be later. And we've shown that these dynamics also seem to play out across lots of large communities: communities succeed because they're open, or in part because they're open.
But they become increasingly closed, in ways that mean they do a less good job of integrating the people who show up at the door over time. And this also seems to be a general pattern. We tend to find that the people who arrived early in a community tend to consolidate power as their interests diverge from those of the rest of the participants. So this is a big question, and something that a lot of the recent work in the Community Data Science Collective, my research group, has been focusing on: understanding what's going on. Our answer seems to be that there really is a trade-off that emerges over time between the initial need to build a community, to build a stock of value, to get lots of people in, and a need that emerges later to protect that value from people who show up and want things to look a different way. And so you build a whole bunch of structure, what you might think of as governance structures, which might be ways of working or ways of talking, and which I think correspond to a lot of the things Misha and Jerome talked about, in order to maintain and manage quality. The effect of that tends to be a real trade-off in the ability to engage people, sometimes in ways that cut very systematically across different demographic groups. Early on, when the stakes are lower, you can be more flexible. So I'm really interested in the way these dynamics play out over time. Thank you. So I would like to give everyone on the panel maybe three or four minutes to ask each other questions, or to comment on what you have heard.
Then let's spend the last 10 minutes on questions from the audience; I see that a queue is already building. I think Jerome already has a question for Mako. So go ahead, Jerome. Yeah, just as a way to develop this conversation: I think everybody agrees on this trade-off that needs to be solved, and it seems to be a very challenging thing to do. Mako, you know a lot about all kinds of wikis in different development stages, and about Wikipedia in particular. I'd be interested to know what you think about the possibility of two-tier systems, whereby we maintain the bureaucracy and ensure quality on the one hand, but on the other hand, if we want to integrate more community members and pick some low-hanging fruit elsewhere, get those people to contribute, we also need them to have fun, have easy access, and be able to connect to each other. I know there have been some experiments with sandboxes, with having articles pre-validated by an experienced community member before they go in, but that kind of kills the fun, or your sense of direct impact. What do you think about this as a potential way forward? So I love the idea of a two-tiered system and a sandbox, in part because a little piece of Wikipedia history is that Wikipedia was originally created as the second-tier sandbox for a different encyclopedia project called Nupedia, which was going to be a more professional, reviewed system. They built the open sandbox, and in large part, I would argue, because of its openness, it was able to succeed to the point that people basically forgot about Nupedia and it fell apart. That's not really a way of answering your question; I think it's interesting, but in practice I think it's difficult to do.
I do think that Wikipedia and lots of other wikis exist in a broad ecosystem. Isaac Johnson, who is on the call, has shown that a huge proportion of people are consuming information from Wikipedia in places other than Wikipedia. So this broader ecosystem is already out there. I think that figuring out how to design those tiers, ensuring that you have participation in places where it's possible, while also having things move easily between them, can be really effective, and you've seen very successful examples of this in free and open source software as well. So I think there's potential, but it's hard to do. Thank you. Do other folks on the panel have comments or questions for each other? Maybe one or two more. I've been paying much attention to this interesting discussion, thinking about what platforms can actually do specifically to encourage participation by minorities, to highlight inclusiveness, or to highlight newcomers. I'd like to be optimistic, but so far I'm going to speak on a pessimistic note. We have a paper that was accepted to ICWSM this year that shows the outcome of Stack Exchange's efforts to try to encourage newcomers. They have the same problem: there has been attrition over a long period of time, and basically newcomers don't feel welcome. So one strategy they tried to implement was a newcomer badge, highlighting contributions by first-time users, hopefully making the community more welcoming to them. We actually tried to measure the causal impact of this newcomer badge on behaviors.
Long story short: not much impact; it did not help much at all. It had some short-term effect on the negative comments newcomers see, but overall it did not help stem the rejection and the outflow of newcomers from the platform. So that makes me a little bit pessimistic. That doesn't mean there is nothing platforms can do, but we need to experiment, in an A/B-testing way, with many different approaches to find out how we can actually make people feel more welcome and contribute more widely. Kristina, one question for you: did this study also take into account the gender of the contributors, or is that information not available? That information, unfortunately, is not available. Yeah. So, any other ideas people might have about what platforms can do? They do have a lot of power and control to promote certain people's contributions. It might conflict a little bit with the governance structures, but I actually think it is the duty of platforms to try to moderate some of the negative effects of social biases. And in the case of Wikipedia, I wonder, for the people on the panel or in the audience: this is relatively complex, given that a lot of the governance model, and the setting and enforcement of policy in Wikipedia, happens on the volunteer community's side, and the platform itself has relatively limited control over what can be done. If anyone on the panel or in the audience has comments on this, that would be great to hear. And then maybe that can be the last comment from the panel before we switch to audience questions. So, encouraging badges: Stack Overflow has done it in some cases successfully, adding badges to users to kind of guide the community's response to them.
I mean, the paper I talked about showed a negative outcome of badges, but there could be better, more positive ones, where you're not directly controlling what people are doing, but by surfacing or highlighting some features of community members, you're trying to encourage certain types of behavior, to steer them towards the goal. I have a lot of comments on this, but we only have a few minutes for audience questions, so I'll keep them to myself for now and follow up with you, Kristina. Isaac, if you can help us with the questions from the audience, that would be great. We've got a couple. The first one is for Jerome and comes from Maria in the chat. She asks: what exactly is the difference between direct and indirect when you say women hate competition, at least indirectly? And how can it be seen or incorporated in the inclusion in, or exclusion from, the real virtual dialogue? Yeah, so I'm not sure I actually said "indirect"; if I did, that was a mistake on my part. What I was saying was that, apart from the concept of risk aversion and the concept of self-confidence, which is how confident you feel, in the case of Wikipedia, that you can actually engage in an argument effectively and eventually win that argument or push your thesis, the taste for competition is about the structure of how you interact with people. Is it that you're in a group and there is one winner? Or is it that we're all together trying to achieve something? In theory, Wikipedia is the latter: a cooperative endeavor. But in effect, the newbie experience often very much looks like a competition: I'm going to win this over you. And there are some features embedded in the platform that help create this. It could be possible to soften the interactions that editors have right from the start using clever social engineering.
And I think that this point doesn't come across that easily, because this is really the factor that differentiates between successfully integrating newcomers, especially when it comes to women and minorities, and failing to do so.

Great. The next question comes from Saif. I'm going to let you ask this unless you'd like me to do it. Oh, thank you. Hello, can others hear me? Yep. Thank you very much, and great panel, I'm really enjoying it. So my question is: how do you feel that your cultural background might be acting as a gatekeeper for Wikipedia? For instance, you discussed that openness is very important for content production. But it might be that you're looking at this problem within cultures that value openness, and you might not yet have studied how openness plays out in cultures that are in conflict with openness, whose members maybe even freeze up when they have to work with others. So I'm curious if you could talk about how you see your own cultural backgrounds acting as gatekeepers for the results we're finding about content production in Wikipedia. Thank you very much, I'm really enjoying this panel.

Whoever would like to take the question, please. So maybe I'll start. It's an interesting question, and one kind of anecdote comes up in my own work. When I started the project looking at ideological diversity and the quality of pages, I had an expected result: namely, I expected more diversity to ruin the pages.
And that prior was based on research I had been seeing, which I think still keeps showing up, on things like fake news and media diets being especially bad in conservative, right-wing circles, all this research I was consuming pointing to conspiracy theories and suggesting that a small ideological fringe was going to ruin collaboration and things like that. So that's what I planned on seeing in our data. And we found basically the opposite. I mean, I don't want to completely oppose those findings, to claim that we disproved that research or anything. But it certainly disproved my own expectation, and it reliably showed a pattern we couldn't make go away statistically or in any other way: the more ideologically diverse teams consistently created better pages. So that's just one example: we wrote the paper with what the data showed us, and it was not what I expected, perhaps given my own cultural bias in some way. It's just one example where the data really drove the story.

Thank you. Yeah, maybe I can just add to that a little bit. I think that one great feature of Wikipedia is that it is organized around languages. And so the amount of bias that you have to manage is really defined within subcultures. So I think that cultural diversity is certainly relevant, but in a sense it's tackled at a level that is relevant to each culture. Not exactly so, because languages don't really map onto cultures necessarily, but I think that's one of the strengths of Wikipedia: it allows managing the amount and diversity of bias within a particular community, as opposed to imposing a certain view on everybody.
And so in this respect, in general, what Misha was saying (I don't know if he has studied languages other than English Wikipedia; most of the results I know of are for English Wikipedia) should translate, relatively speaking, to other languages as well.

Thanks. The next question is from Jonathan, moving through the queue. Thank you, Isaac. So I wanted to follow up on this idea of having two-tiered systems, or, more broadly, having some amount of segregation, not necessarily in all the bad senses of that term, between newcomers or peripheral participants and central participants. And I have two questions about this. One is: is the purpose of the two-tiered system to protect the newcomers from the community, or to protect the content quality from the newcomers? It's important to consider that directly, because I think you would create different mechanisms if your goal was to protect newcomers versus protecting content from newcomers. The second question is, if you're going to have some sort of two-tiered system, which of course we do have aspects of in Wikipedia (for example, English Wikipedia has a drafts namespace, although, as an interesting corollary, the community is proposing getting rid of that drafts namespace; we also worked on a project called The Wikipedia Adventure, which created a kind of safe area for newcomers to edit), how do you deal with the problem, or potential problem, of legitimacy? Somebody is coming into the community and making edits, but they're not real edits, whether because they're in their user space or because they are only doing a certain limited number of things on articles. How do you make them feel like they're legitimate, and therefore bring them closer into the community?
And then how do you also project to the core community that these are legitimate contributions from legitimate participants, and should be treated and evaluated as such? So, two questions: one, what's the purpose of the two-tiered system, to protect the newcomers from the community or to protect the content? And two, if you're going to have a two-tiered system, how do you make sure you preserve the legitimacy of the people who are participating, in multiple senses?

And if I can ask the panel to respond in maybe 30 seconds or a minute to this question, so that we can conclude; we can stick around and continue the conversation after the panel. Maybe I can say a couple of words, but Mako has not been talking, and I'm sure he can say things about this issue. There is no way I can answer those questions in 30 seconds, so I'm not going to answer them. What I'm going to say, though, and this should not come across as negative, is that I think it's mostly about protecting newcomers from the community: in the sense of protecting newcomers from the type of interactions that established contributors tend to have, the quasi-legal, very dry arguments they enter into, which are off-putting and can be stressful for a newcomer. So I think this is the purpose a two-tiered system should serve, and it should also put the fun back into it a little bit. It's not about quality, and this is something Mako can probably say something about in 30 seconds: his research suggests that having newcomers does not undermine the quality of a wiki. This is a recurring argument about whether you want to allow anonymous contribution, so I will let him develop that point.
So it's not an issue of quality to me; it's really an issue of how you interact with other contributors, and how you manage this dry, self-assertive bureaucracy to put a little bit of the fun back in. Definitely the challenge here is the issue of legitimacy, as you pointed out: people just want their stuff to go online; they don't want many gatekeepers; they don't want their stuff evaluated by ten layers of authorities before they can actually see that they make a difference. So how do you get this equilibrium right? That is really the answer I don't know, and that we should collectively figure out.

Mako, any final remarks? Yeah, very quickly. I'll answer the first question, which you can rephrase as: do we want more participants, or do we want high-quality stuff? And the answer, of course, is that we want both. The Wikimedia Foundation's mission is literally, part one, to engage lots of people in production, and part two, to create a valuable knowledge resource that lots of people can benefit from, for which quality is important. And the problem is that there's a tension between those two. I think we can design better systems that reduce the severity of the trade-offs caused by our desire to navigate that tension, but I don't think we're going to eliminate it. And yeah, the problem, of course, is always that there are many wonderful things in the world, and we want them all, and sometimes it's hard to have them all.

Thank you so much, Mako, and everyone on the panel. Thank you very much. Thanks for the time that you spent with us and for the interactions.
If you can go to the Etherpad link that I just put in the chat, there are a few questions from the audience that have not been answered yet, so if you can enter your responses there, that would be great. And with that, I thank you and conclude the panel.