This month we've got Mathieu O'Neil, Professor of Communication in the University of Canberra's Faculty of Arts and Design, and he's here to talk about his research around Wikipedia. Some of you may have been looking forward to seeing him talk at last year's WOW event, which he unfortunately wasn't able to attend — he just let us know that he lost his wallet, well, it was nicked, basically, which is a bit of a bummer. It's been about a year since his initial paper came out, I think that's right, so I thought we'd give him a chance to jump in, give us an update, and tell us all about what he's been working on. So I'll hand over to you, and as I said, we'll stop the recording at the end and let everyone ask questions. Over to you.

Thanks very much, James. I'm really glad to talk to everybody. I presented very similar slides yesterday at the Ask a Light event, so apologies if you were there, but there are two big news items that I can share with you today, so it won't be exactly the same. I want to talk about some of the projects I've been working on around information literacy in the last few months. There have been a few new projects, publications and concepts that I've been playing around with, with different people, so it's great to have this opportunity. And I'm coming at you from Ngunnawal country in Canberra.

Just to clarify — I don't know what you're familiar with and what you're not, but to make sure we're on the same page, I thought I'd define what I'm not going to be talking about, which is media literacy. Media literacy is very much concerned with questions of representation and power: how the media represents the world. What I'm focusing on today is information literacy, which is a much more basic, less complex question — whether something is true or false.
And I'll explain why we don't — well, actually, we are going to start addressing media literacy as well, but we haven't done it so far; I'll explain why in a minute. Some of you are probably familiar with the concept of the attention economy, which is extremely prevalent nowadays. Back in the early 70s, the economist Herbert Simon said, you know, we have a wealth of information, and we can't pay attention to everything — and that was in the broadcast media era. Now, of course, with social media, there's so much more possibility for creating content, so getting people's attention is very important. That's the first guiding principle we have.

So how do we deal with what the philosopher Kim Sterelny from the ANU called, in 2006, epistemic pollution: an environment where there's a lot of stuff we're not sure about, weakly authoritative statements? How do we verify? For example, this is from a disinformation campaign that ran from 2015 to 2020 called Secondary Infektion. This fake video attributed to Greenpeace came from Russia. And then there's the fake letter from the Center to Protect — or is it the Committee to Protect Journalists? Sorry, I couldn't remember what it stands for. Completely fake, but they look authentic. So how can you tell?

There's been some analysis of what's known as truth decay — a play on words, tooth decay. It's very American-centric, but what they did was look at different historical periods where there was a lot of polarization and discontent, and they found that some of these characteristics, such as the blurring of the line between opinion and fact, existed in previous periods — I think they looked at one period in the 19th century and one in the early 20th century. But what is different about our time is this declining trust in institutions, formerly respected sources of factual information.
So that's the other consideration. The first is the overabundance of information and this not knowing what's true or not; the other is declining trust in institutions. And this declining trust is local too. The ANU has been running the Australian Election Study since 1987, and it's pretty damning: 70 percent of people believe government is only looking after itself, and only 12 percent believe it's for all the people. Same with the news, when people are asked whether they think you can trust most of the news most of the time. By the way, my centre, the News and Media Research Centre, is the Australian arm of the Reuters Institute Digital News Report, and we released this year's edition today, so you can download it from the News and Media Research Centre or from the APO. You can see there's declining trust in the news as well: 43 percent in Australia in 2016, 41 percent today — well, that was last year's figure, actually; I should have updated it today. And the lowest figure is the U.S.: only 26 percent of people in the United States agree that you can trust the news, which is pretty incredible.

So, you're familiar with this acronym, which people still use in classrooms for how to assess a claim or a website: is it Current? Is it Relevant? Is it Authoritative? Is it Accurate? What is its Purpose? CRAAP. You'd look at the page, you'd look at the design. When has it been updated? Does it have a dot-org or a dot-com, because dot-org is better than dot-com? Does it have ads, because if it's got ads it's a bit suspect — all that kind of stuff. The problem is that this kind of checklist approach results in cognitive overload: there are too many things to think about, and what happens is that people often just latch on to the most obvious thing.
So if it's a dot-org it must be good, because a dot-com is commercial. That's not necessarily the case, but you just latch on to the most obvious signal. And visual or design cues don't work anymore, because anybody can make a really good-looking website. It also wastes a lot of time, because you have to think. So the first principle is: when there's an overabundance, you should not waste your attention — fact-checking has to be fast. The second is: when there's distrust, you have to give people a reason to believe — fact-checking has to be inclusive.

One of the concepts that's come out recently is critical ignoring, which I mentioned yesterday. On the slide there's an article from the news media at the top and the peer-reviewed article at the bottom; you can't see all the names at the bottom, but you can see them at the top, which is why I've got both. Like I said yesterday, it's like a supergroup of academics. There's Ralph Hertwig, who's known for the concept of nudging; Anastasia Kozyreva — I can't remember what she's known for; Sam Wineburg, the professor from Stanford who came up with the concept of lateral reading; and Stephan Lewandowsky, a psychologist who has basically been editing the Debunking Handbook for the last 15 years, which is a very good resource if you don't know it — they've been releasing it every year. So they all came together, put their ideas together, and came up with this concept of critical ignoring. It's about knowing when not to engage, knowing what you should not look at, because there's too much information.
So we don't want, you know, critical literacy so much as critical ignoring, which is interesting. The one we use, of course, is lateral reading. "Don't feed the trolls" is pretty obvious: people want your attention, so just don't give it to them — ignore them, or report them if they're being offensive, but don't waste your time. Self-nudging is about controlling your information environment for your own benefit; I'll give you a quick definition in a minute.

So, are you familiar with the Stanford experiment? You might have seen this in a previous presentation; I think Rachel may have mentioned it a while back. OK, so first of all, "the Stanford experiment" is a kind of joke, because the original Stanford experiment was about getting people to mistreat others through role play in an institutional setting — you get people to pretend to be prisoners and guards and they'll do all sorts of nasty things to each other; that was back in the early 70s. So they're redoing it, consciously or subconsciously. What they did was give three groups of people five minutes to check out claims by two organizations, the American Academy of Pediatrics and the American College of Pediatricians. They said, look, here are two statements — about bullying, or obesity, I don't know, similar kinds of areas — which is legitimate, which is valid? The only problem, of course, is that the American Academy of Pediatrics is a legitimate organization that's been around for decades, with something like 65,000 doctors as members: a really legit organization. Whereas the other one has a few hundred people, maybe 600.
They're basically a hate group — defined as a hate group by the Southern Poverty Law Center — because they're against gay kids, gayness, all sorts of things like that: a very conservative, reactionary organization. Unfortunately, the students and the PhD historians got it wrong: they either thought the two were equally valid claimants, or even that the second one was better than the first. Whereas the professional fact-checkers found out in 30 seconds what was what, and they knew exactly which claimant was worth their attention.

How did they do it? They used lateral reading, which you probably know — it's really the same as critical ignoring. You don't read vertically, you don't go deep into a claim or a website. You look away, look to the side, open another tab, search for the claim. Is the source reliable? Is the claim credible? Great if it is; if it's not, let it go. The idea is that misinformation and disinformation mix the real and the fake so thoroughly that it could take you hours to untangle the threads — and that's hours of your time that you've given to white supremacy or anti-gay rhetoric or whatever.

So this is how they summarize all of this, for different types of information: for distracting, low-quality information, you exercise self-control and self-nudge; for false and misleading information, you use lateral reading; and for trolls and malicious actors, you minimize engagement and don't feed the trolls. Self-nudging is basically being aware of the link between your behavior and the architecture of the environment. So, for example, avoiding things like "OK, I'm going to work really hard for 20 minutes and then reward myself by looking at a video" — because if you look at one video, you'll end up looking at three and you'll spend 15 minutes. It's just being aware of how you will react to certain technological affordances, I guess.
Okay, so this is the bit I can skip — no, actually it's not. This is something I did with Rachel and another colleague from the ANU in February. We made a submission to a parliamentary committee on foreign interference, and we said: we have two ideas. One is this idea of information literacy, which is what I've been developing with Rachel. The other is information health, which is more network analysis: looking at conversations on Twitter or on Reddit and then, using network metrics and textual metrics, deciding — if we identify a partisan debate, say, and we can place people on either side of that debate because they've linked to well-known partisan figures (you could look at the Voice as an example, which is what we did in this submission) — to what extent people are engaging with their own side versus the other side, to what extent they're using uncivil language, and to what extent they're linking to authoritative sources such as Wikipedia.

And the concept of resilience is something I quite like, because it appeals to a lot of people, but it's a little bit controversial in the education space. Some people think it psychologizes social issues — that it puts pressure on children to be strong and to work everything out for themselves, when in some cases there are social factors behind why people are feeling pressure or being affected by misinformation or other things. But I don't agree that you should relinquish the use of a term to placate these people; you just have to explain why you're going to use it. So we're writing an article for a journal called Synergy, and I'm going to try to explain why I want to use this term, resilience. Okay, so the three principles that inform this are non-partisanship, speed and transparency.
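To make the "engaging with their own side versus the other side" measurement concrete, here is a minimal sketch — toy data and made-up account labels, not the submission's actual pipeline — of one standard metric for this kind of question, Krackhardt and Stern's E-I index:

```python
# Hypothetical illustration (toy data, not the real analysis):
# the E-I index measures whether interactions in a labelled network
# stay within one side of a debate or cross to the other side.

def ei_index(edges, side):
    """E-I = (external - internal) / total edges, in [-1, 1].

    -1 means every interaction stays within a side (echo chamber);
    +1 means every interaction crosses between sides.
    """
    external = internal = 0
    for a, b in edges:
        if side[a] == side[b]:
            internal += 1
        else:
            external += 1
    total = internal + external
    return (external - internal) / total if total else 0.0

# Toy reply network for a two-sided debate: who replied to whom.
side = {"a1": "yes", "a2": "yes", "b1": "no", "b2": "no"}
edges = [("a1", "a2"), ("b1", "b2"), ("a1", "b1"), ("a2", "b2")]
print(ei_index(edges, side))  # 2 internal, 2 external -> 0.0
```

In a real study the side labels would come from, for example, links to known partisan figures, and the edges from replies or retweets; the same shape of calculation then says how siloed the conversation is.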
Non-partisanship, because we're dealing with kids: for information literacy you want to be as inclusive as possible, so you don't want to turn people off. Some people I work with disagree with that — they say it doesn't matter — but if you're dealing with children, with parents, with teachers, you really have to be careful and cast as wide a net as possible. You want to reach people who are on the other side or who believe in conspiracies, so you really want to make it as non-controversial as possible, which is why we don't talk about media literacy so much. Speed I've already explained. And transparency, because transparency is the opposite of distrust and conspiracy theories. Conspiracies always allege there's some secret cabal doing something — even on Wikipedia you have conspiracy theorists who say there's a cabal controlling the encyclopedia. I mean, those people are right, of course, because there is a cabal — I'm kidding. Anyway, the point is, the way to inspire confidence is to be completely transparent and say: this is how the sausage is made, this is how the knowledge is being collectively produced. Of course the wiki does exactly that, but the same goes for news. This guy from the BBC's Global News Division said that transparency is almost more important than objectivity, and there's a quote at the bottom: news today still has to be accurate and fair, but it is as important for the readers, listeners and viewers to see how the news is produced, where the information comes from, and how it works. Then you have other instances, like open-source software, open-source intelligence like Bellingcat, and open data. Open data is not as clear-cut, because the data that's released isn't always curated properly, and the people who are then supposed to look at it — anybody — don't necessarily have the skills. So open data is a bit more complicated than, of course, Wikipedia.
So there's this idea that you have an evolution in trust: from guarantees offered by the author — the first encyclopedia was by Diderot, so you trusted Diderot — to trust in the brand, Britannica, to now, where you have trust created by transparent processes. And there are all these comparative studies, which I got from The Age article that came out last September. I was interviewed by this guy after the second Conversation article, and because he's a science writer he had all these great references, so I got them from there. Wikipedia, blah, blah, blah, it's all good — it's got the history, it's got the policies, I don't need to tell you that.

Okay, a few issues. I was interested in this idea that even though there's complete transparency, there can still be problems, sitting right there in front of everybody. First, the organized manipulation of content, and then systemic imbalances. Most of you would know about the project capture in Croatia: when the Serbo-Croatian or Yugoslav Wikipedia was split into different projects in the early 2000s, the resulting projects were too small, basically. The Croatian one was taken over by the far right — Ustaše sympathizers, I guess — who excluded dissenters, and the Wikimedia Foundation actually had to intervene a couple of years ago; I think one admin was excluded. There weren't enough people for that kind of evening-out, levelling-out of disputes — it was too one-sided. Then you've got, of course, PR firms that also sometimes try to influence content for their clients. And this is an example I think Rachel would have given last year at the WOW: there were cases where about 80 accounts were editing non-controversial topics to bump up their credibility, and then they started editing the Ukraine and Russia pages and that kind of stuff, trying to make it look a certain way.
So basically altering language to minimize objectivity, or to target pro-Western accounts, et cetera, et cetera. Then, systemic imbalances. On Wikipedia you also have a problem, as you know, with the representation of women: only 20% of biographies are of women. You all know that, but I don't know if you know this one: when women are featured, they are represented differently, and more negatively, than men. According to a 2015 study, the word "divorced" appears four times as often in women's biographies on English Wikipedia as in men's biographies, which emphasizes the prevailing societal focus on women's private lives and their existence in relation to men. That made my head spin a bit, I must say.

Okay, so what did we do with all this? We created these six lessons with ACT Affiliated Schools, and they went from the most foundational issues — what is a fact? how do we know what we know? — onwards. In lesson three we used the metaphor of finding a sandwich on the street to introduce this idea of lateral reading. Then we introduced the idea of ad hominems, of people using emotional language: why is it okay to be mad at a friend, but why do people get mad at people they don't know? Then we used the metaphor of a red car: your parents buy a red car and suddenly you start noticing red cars everywhere — the frequency illusion, basically. Just because something is ubiquitous doesn't mean it's true, and the same thing sometimes happens online: you start seeing something all the time. And the last one, the Garage Dragon, is about finding proof for statements, for claims. This was collected as a booklet which is available for free download from the APO. We ran it in different classes in four schools. I did it with Rachel Cunneen, and there's a new guy coming in from the Faculty of Education, Andrew Ross, and we're going to reapply next year. So, one first new thing relative to yesterday: there's a video which I can show you.
Street Cake. Imagine that you come across a cake, or perhaps someone offers you one. If you had never met the person offering the cake, you would be right to refuse it: you have no way of knowing if it's okay or not. If you knew and trusted the person who gave it to you, like your friends, you might accept it, if you were hungry and liked cake. If you found the cake on the street, would you eat it? You can't check if it's okay, so probably not. Think of unfamiliar information as being like a cake on the street: if you are not sure if it is okay, check. Takeaways: you are not sure about new information? Check. How do you check? Search with Google or DuckDuckGo, and look at the Wikipedia article. If the article is okay — for example, no warning banners — you have your answer.

Okay, so that was the world premiere of that. Let's skip ahead. I'd be interested to know what you thought about it. I thought the music was a bit loud, actually, and there was something else I thought wasn't right, but it's a work in progress. So, summing up: the traits needed for exercising critical literacy are often ill-suited to sound digital information literacy. To be critically literate, you need a lot of time, you need to be able to think intertextually, you have to take lots of different approaches, there's always bias, but traditional forms of authority are still important. Whereas to be digitally literate, you need quick, confident decision-making — you don't want to waste your time — you do lateral reading, facts must be objective (and I don't know if that sits differently from the other one, actually), and it's about processes, not institutions.

Okay, so now I can show you some bits of a chapter we have coming out in a book. These ideas were controversial for some people.
So we had the Conversation article that came out in 2021 that sort of started this whole thing, where we realized there was a lot of resistance from teachers and we wanted to change the conversation a bit. It had a big impact, and comments are open for 72 hours, so we engaged with people. There was one guy who made 23 comments — by far the greatest number of comments — and he was extremely negative and obtuse, I would say, and pretty offensive at times: the mob comes into play; one thing for sure, Wiki is not suitable as an educational resource of any kind; I'm very worried about the state of universities if teachers are recommending it — all that kind of stuff. But these vehement objections were a minority: only 10 people out of 52. The majority, 31, were supportive, and the others were neutral or made unrelated comments. So in fact, just because some people shout and make a lot of noise doesn't mean theirs is the dominant view. There is a perception problem: people are thinking of Wikipedia as it was in 2005 or 2010. There's a real lack of knowledge of the institutions, the processes; people don't understand them. Dariusz Jemielniak, a Polish Wikimedian who has written a book about Wikipedia, has said, you know, it's improved substantially. There were a couple of responses in the Conversation thread to this negative guy. One commenter said: I suspect your hostility is caused by a sort of knowledge elitism and fear. And of course the original commenter then says, oh, chill out, it's just an article, man — he got out of there immediately. Which is even more annoying: he makes all this noise, and then when somebody responds he's like, oh, what are you doing?
You know, you're taking this way too seriously. Anyway. So we had a workshop with people who are not the people we're working with — although one of our co-researchers did attend — and it was open to people from independent schools as well as from the Directorate. The Directorate is just for public schools: the Affiliated Schools project that we do has 14 schools in the ACT taking part, all public schools. There were two teacher librarians, both actually from private schools, and they were incredibly enthusiastic. They said it was a great opening for research, and they saw themselves as advocates. And sadly, we had to say to them: well, you're working in private schools, so we can't actually make you part of the project, sorry. Of course we shared the resources with them and all that kind of stuff, but we couldn't work with them.

Okay, so that's another finding: the teacher librarians were incredibly supportive. They organized — I hope I got the hashtag right — an event last year where we presented what we were doing, and we actually got new public school teacher librarians from the ACT who joined because of that, so that was really good. I presented a keynote address at the SLANSW Professional Learning Summit in September 2022; the teacher librarians in New South Wales have a whole framework called information fluency, so we're sort of debating information resilience versus information fluency. We also had a lot of interest from teacher librarian publications: Connections helped organize that, and Access, which is the New South Wales one, reprised and expanded the first Conversation article — they provided the lengthy, positive review of the six lessons for kids, which I've reproduced here; you won't have seen it.
And as I mentioned, Synergy, which I think is the Victorian association's journal, is going to do something on information resilience and information fluency. So teacher librarians are very positive and they get it, which is great. We also had debrief sessions, and the teachers were really surprised and completely changed their views — usually what happens when we present is people say, oh wow, I didn't know all this, and yes, it's really good. (Sorry, just give me a second — there's some noise and I'm very sensitive to it.) So they would definitely use wiki kids with year six. Teachers basically say we can change our ways, and they've identified teacher education in universities as the place — the battlefield, basically — where this cultural change about trusting Wikipedia has to happen. Rachel has started to incorporate some of these ideas into her training.

Okay, in terms of the children, the school kids: the way we verified is not the best way, because we sort of made this up as we went. What we should have done, and what we want to do in new iterations, is have activities: we ask the children to check something, or to react to something, and then we verify how they do it — before the program and then after. We did that in this case too, but we did it immediately before and after; what we want is to do it four weeks in advance and then four weeks after, to see whether these skills actually stick. Basically we did a survey, we asked questions, and the fact-checking had improved, but the trust in Wikipedia had not. That's why, with the videos, we want to reinforce the Wikipedia aspect in every one — we understood that we have to reiterate that a lot of times.

Okay, so next steps. We did a symposium in September where Ru was there, and Amanda was there.
Well, I mean, Amanda wasn't there physically, but she participated remotely. We have a report that's sort of in limbo at the moment, but I want to get it out in the second half of the year. We've had some preliminary meetings about doing micro-credentials — I don't know if that's going to happen or not; it's an area I don't really know much about. My idea is to use the report as a kind of broadcast, see if there's anything that helps us build something; but we can use it anyway to summarize where we're at, and we'll revive that work in the next couple of months.

Okay, the second piece of news I wanted to share, after the video, is that the teacher librarian grant was not approved — I got a negative answer today from the US Embassy. So that will not proceed, straight away anyway, sadly. It's funny, because yesterday at the Ask a Light event somebody from Queensland asked, can I still participate? Of course you can.

Another thing we're going to do is expand to Indonesia: we want to do this in Indonesia with high schools. And we're going to do another Affiliated Schools application, where we'll try to link what we did for the primary and secondary system to the high school system. This will include other literacies, such as Wikipedia literacy — I often say to people, we're not saying use Wikipedia blindly; we want to increase people's Wikipedia literacy so they can detect when a page or an article has got a problem and know when they can trust a page — as well as other types of literacy: privacy, et cetera. Finally, this is part of a project I've been doing for the last couple of years, the Digital Commons Policy Council, which tries to recognize the volunteer labor that produces digital commons such as open-source software and Wikipedia. We're working on the sustainability of digital infrastructure, and we're also working on digital commons and environmental sustainability. And I think that's it.