Oh, we skipped a slide; we didn't get a screen change. We have a hashtag: #WikiWomenSummit. Hashtag WikiWomenSummit. Go out on all those different social media accounts you have and cheer this on. Yes. We're going to start now with our first session. And let me start by saying that while you're here, we are addressing misinformation, disinformation, missing information, and the people who are missing from giving information, but there are four people who are going to start by sharing some information that they have. And now we turn it over to our AV team to bring up the first group of speakers. They'll be speaking pre-recorded.

I'm Clifford Anderson, a member of the steering committee of the Wikimedia and Libraries User Group and also a founding member of the Women in Religion User Group. Together with my co-presenters, I'll be speaking about our experiment using large language models, or LLMs, to draft articles grounded in curated sources. But allow me to let the other members of the group speak about their experience creating these experiments.

I'm Lynn Harton, and I have been working with the Women in Religion Wikipedia project since its beginning at the 2018 Parliament of the World's Religions in Toronto, Canada. The project started as an effort to address gender bias on Wikipedia. Our focus is the improvement of content on Wikipedia about the lives of cisgender and transgender women who are notable as scholars, activists, and practitioners in the world's religious, spiritual, and wisdom traditions. We are an international Wikipedia user group that meets multiple times a month to organize, edit, and train editors around the world. Recently, we have been meeting monthly with a group of women from Kenya to help with editing on women from their region of the world. Our aims are multiple. We recruit and train editors. We compile lists of notable women in religion that we work to enter onto Wikidata.
We create articles and improve existing articles. And significantly, for this presentation, we create secondary sources about women in religion by means of the Women in Religion Biographical Series. Our focus on the creation of secondary sources is meant to address the difficulties encountered by editors who seek to write about women whose significant contributions remain undercovered in all types of media. We are a small group trying to accomplish a lot. Exploring the possibilities of AI offers us a research, editing, and writing tool that makes sense.

When all the hype about AI surfaced this past spring, the topic came up at one of our monthly meetings. After this discussion, several of us experimented with AI by submitting prompts asking for biographical summaries with citations. It was amazing to watch the text of these biographical summaries appear line by line on the screen. The summaries read well and sounded authoritative. However, because we had the published volumes in front of us, we were able to easily evaluate the accuracy of the articles and the tendency of AI to hallucinate. In our discussions of our experiences, it occurred to us that we might be able to use the volumes to develop our ability to use AI as an editing tool that would help us create initial drafts of Wikipedia articles. We could see that we needed to learn how to create prompts that reduce the chances of AI using its predictive capabilities in ways that produce hallucinations. We also needed to figure out how to provide AI with the appropriate sources that would enable it to produce a useful draft article. I am definitely not a techie type of person, so having Clifford's expertise was critical. But I am a researcher who has made good use of the internet as a research tool, which requires the ability to logically phrase and rephrase the questions I enter when I Google a topic.
One of the main learnings in this experiment has been the refinement of our ability to create productive prompts. As well, our experience as editors of biographical articles on Wikipedia allowed us to rethink the prompt as a process, or a series of questions, designed within the framework of a Wikipedia article. Over the weeks that we conducted this experiment, the accuracy and quality of the draft articles we were able to produce improved significantly, which gives me much hope for AI as an editing tool. It also made it clear that that is what AI is: another editing tool. Human research and expertise are required.

My name is Christine Meyer, and I'm a long-time Wikipedia editor and contributor and a member of WikiProject Women in Religion. My username is Figureskatingfan. I've been editing and contributing to Wikipedia since 2007. Those were the days before VisualEditor and many of the other tools that make editing easier for us. Back then, you had to know how to use wiki markup; you had to manually create references. We still have to know those things, but it's so much easier now, especially for a new user, to jump in and learn how to edit Wikipedia. The learning curve was steep, but we all did it because we had to. We all know what Jimmy Wales calls Wikipedia's prime directive: imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. This is aspirational, a high and lofty goal, but something we've made real inroads toward realizing. Of course, the gender gap and systemic bias in Wikipedia have prevented us from fully realizing this goal. We of the Women in Religion WikiProject hope to have helped mitigate the barriers to fulfilling that goal: an inclusive, diverse, free encyclopedia, both in content and in bodies, in the form of editors and contributors to the articles and biographies that are the focus of our WikiProject, Women in Religion.
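The prompt-as-a-series-of-questions approach described here could be sketched in Python roughly as follows. This is a minimal illustration, not the group's actual code; the question list, function names, and prompt wording are all hypothetical.

```python
# Hypothetical sketch: frame the draft request as ordered questions that
# mirror the sections of a Wikipedia biography, and instruct the model to
# answer only from the sources we supply (to reduce hallucination).
SECTION_QUESTIONS = [
    "Who is the subject, and why is she notable?",
    "What is known about her early life and education?",
    "What are the main stages of her career?",
    "What are her key publications or contributions?",
    "What honors or recognition has she received?",
]

def build_prompt(subject: str, sources: str) -> str:
    """Assemble a grounded prompt from the section questions and sources."""
    questions = "\n".join(f"- {q}" for q in SECTION_QUESTIONS)
    return (
        f"Using ONLY the source excerpts below, draft a neutral, "
        f"Wikipedia-style biography of {subject}.\n"
        f"Answer these questions in order, one section each:\n{questions}\n"
        f"If the sources do not answer a question, say so rather than guess.\n\n"
        f"SOURCES:\n{sources}"
    )
```

The point of the structure is that each question maps to a section of the eventual article, so a weak or missing answer immediately shows the editor where the sources are thin.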
Perhaps artificial intelligence is a tool that, like wiki markup and other tools, can help us in the goal of creating more content about and for women and other neglected topics. There's a lot to be afraid of about AI, but perhaps one of the ways it can be used for good, rather than for ill, is in the creation of more diversity on Wikipedia. Starting last spring, after discussing what everyone was talking about with AI, we in the Women in Religion WikiProject started playing around with ChatGPT, and we decided that we would do a more formal experiment to see how we could use artificial intelligence to help us create more articles, and maybe even more high-quality articles. And so we're really excited to share some of the results of that experimentation that we've done throughout the summer. We're at the very beginnings of this experiment, but it's our hope that not only will we be able to use our results, but that the editing community as a whole can use them.

Hi, I'm Rosalyn Hinton. I've been working with the Women in Religion Wikipedia user group since 2018, when it was founded at the Parliament of the World's Religions. I've also been a Wikipedian since that time. In the past few months, we've been experimenting with AI to generate Wikipedia biographies that are reliable, and to do it faster than we have in the past. My first attempt was a query about a living womanist scholar on ChatGPT. At first, ChatGPT gave us good information, but then it began hallucinating, which means it was making up stuff. So our group decided that we needed to feed the AI better information. The second attempt was in a "Wiki Writer" IPython notebook created by our colleague Clifford Anderson. This notebook supplied the AI with the scholar's research and some secondary sources. The AI generated a summary that could be used as a Wikipedia stub, but it needed a few modifications and some references added.
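The notebook's strategy of feeding the model the scholar's own research and secondary sources can be sketched as a simple retrieval step. This is a rough illustration with hypothetical names; it uses naive keyword-overlap scoring in place of the vector-database retrieval the group actually describes.

```python
# Hypothetical sketch of grounding: split source documents into passages,
# pick the passages most relevant to the request, and hand only those to
# the model, so it summarizes supplied text instead of inventing facts.
from collections import Counter

def chunk(text: str, size: int = 500) -> list[str]:
    """Split a document into passages of roughly `size` words."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def score(query: str, passage: str) -> int:
    """Count query-word occurrences in the passage (stand-in for
    the embedding similarity a vector database would compute)."""
    q = set(query.lower().split())
    return sum(c for w, c in Counter(passage.lower().split()).items() if w in q)

def top_passages(query: str, documents: list[str], k: int = 3) -> list[str]:
    passages = [p for doc in documents for p in chunk(doc)]
    return sorted(passages, key=lambda p: score(query, p), reverse=True)[:k]

def grounded_prompt(query: str, documents: list[str]) -> str:
    context = "\n---\n".join(top_passages(query, documents))
    return (f"Summarize, citing only the excerpts below.\n"
            f"EXCERPTS:\n{context}\n\nTASK: {query}")
```

The resulting prompt is what would be sent to the LLM; the editor still reviews the draft and adds references, as the speakers note.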
On the third attempt, we added a number of questions that would generate a Wikipedia article. I uploaded the scholar's CV and a number of her research articles in PDF format. The program actually generated a very competent Wikipedia article. It included her biography, her career, her education, her research interests, and her influence. However, the one thing I would have done a little differently was a more diverse reference list. In summary, using AI isn't as scary as you think. We are already Wikipedians, gathering reliable sources and research. AI helps us summarize this information and format it into a wiki article very quickly. However, you do need to have a knowledge base in your field. You do need to have some knowledge of the scholar you're working on in order to catch mistakes.

Where do we plan to go from here? We still hope to enhance our program's production of Wikipedia articles by enriching the vector database with additional reliable sources. We'd also like to expand our software's compatibility to include formats beyond PDF, such as HTML pages and Word documents. We also want to refine our prompts for improved outcomes. In the long run, our goal is for the augmented LLM to draft credible articles that our members can review, edit, and, ultimately, if they deem them worthy, publish on Wikipedia. As noted at the outset, addressing the existing imbalances on Wikipedia is a complex task. But if artificial intelligence can help us remedy existing bias more efficiently, we as a group are eager to explore its potential.

Okay, thank you for that presentation. I think the future has come: we have been presented with a large language model. Now, thank you for your presentation.
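The multi-format ingestion the speakers describe as future work, loading HTML pages and Word documents alongside PDFs, might look like the following. This is a hypothetical sketch, not the group's software; PDF and Word handling would need third-party libraries (for example pypdf and python-docx) and are stubbed out here.

```python
# Hypothetical sketch: normalize sources in several formats to plain
# text before adding them to the vector database.
from html.parser import HTMLParser
from pathlib import Path

class _TextExtractor(HTMLParser):
    """Collect the text nodes of an HTML page, discarding tags."""
    def __init__(self):
        super().__init__()
        self.parts = []
    def handle_data(self, data):
        self.parts.append(data)

def html_to_text(markup: str) -> str:
    parser = _TextExtractor()
    parser.feed(markup)
    return " ".join(p.strip() for p in parser.parts if p.strip())

def load_source(path: Path) -> str:
    """Dispatch on file extension; unsupported formats raise."""
    suffix = path.suffix.lower()
    if suffix in {".txt", ".md"}:
        return path.read_text(encoding="utf-8")
    if suffix in {".html", ".htm"}:
        return html_to_text(path.read_text(encoding="utf-8"))
    # .pdf and .docx would call pypdf / python-docx here
    raise ValueError(f"unsupported format: {suffix}")
```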