Welcome to this afternoon's keynote, our second day's keynote speaker. Before I hand you over to our co-chair, Matt, to introduce our speaker, there are just a few housekeeping announcements. Now, it might not be a physical venue, but there are still things that we all need to be aware of. First of all, recordings from this and other sessions will be available in the interactive programme and will continue to be available afterwards. Also, we want to make sure that everyone can enjoy the conference safely, so please familiarise yourself with our netiquette and code of conduct. Many people help make this conference happen, not least of all our sponsors and partners, and I do want to acknowledge their support, both for this event and as member organisations helping us support our work throughout the year. And a particularly big thank you to our headline sponsor, Canvas LMS by Instructure; I hope that if you joined them for this afternoon's networking session you had a lot of fun with them. Now, following this keynote is our special and very first online conference gala. Join us for the launch of the Future of Learning, chaired by our chair, Helen O'Sullivan, at tonight's conference gala, starting at ten past five. So don't be late, and bring your own tricks. Now, on that note, I'm going to invite our co-chair Matt Lingard onto the stage. Hi, Matt. I'm looking forward to this session. I'll hand over to you now. Thank you.

Thank you so much. Good afternoon, everybody, or whatever time it is wherever you're watching at some distant point in the future, and good afternoon to those of you where it's afternoon now. I'm delighted to be able to introduce this keynote, but for those of you who are attending the conference live, first of all I just wanted to share how much I've been enjoying the event, and I hope that's been the case for you too. One of the things that's really struck me is the feeling of buzz and community at the event; it just shows what's possible online. There's a real feeling of togetherness, and a lot of the sessions I've been to have been really celebratory in terms of the work that's gone on over the last 18 months, or the last two years even. I just wanted to say that first.

I'll move on to the keynote now. This afternoon's keynote topic is race and technology. I know it's a topic that's of great interest and, more importantly, of great importance to many people, and it builds on work and presentations we've had before; in particular, over the last year, the development within our community of an anti-racism and learning technology group, which now has, I think, over 100 members in a developing community of practice. We're really fortunate to have Mutale Nkonde here with us today, who's now on the screen. Welcome, Mutale. Mutale is an expert in race and technology, and it's my great pleasure to introduce her to you now. Currently, Mutale is the founding director of AI for the People, a non-profit communications firm that uses journalism, art and culture to advance racial justice in technology. Mutale actually started her career in journalism here in the UK, I think at the BBC, and also at CNN and ABC. She now writes and talks frequently on race and technology and has held fellowships at Harvard, Stanford and Notre Dame. Prior to doing the work she does now, Mutale worked in AI governance and was part of the team that introduced legislation in the States, including the 2019 Algorithmic Accountability Act.
Moving on to today: amongst other things, Mutale currently sits on the TikTok Content Advisory Board, and I know that's something we'll be hearing a little more about soon. I'm going to stop talking now. It's my great pleasure to hand over to Mutale, and I ask you to join me, with a thousand emojis, in welcoming her to the stage.

Thank you so much. Two things I'm going to say. I'm going to share my screen in a second; I know it looks really weird, but I still don't know how to do this after 18 months, so thank you all in this community for putting us online. And I had always wanted to come back to the UK, having been in the States, like Tracey Ullman did, the comedian, I used to watch her on the BBC, and I feel like the Black Tracey Ullman in this moment, so I'm very happy. Without further ado, I'm going to get on to the presentation. It's going to take me 30, 35, maybe 40 minutes, and then we're going to come out and hopefully have a conversation. So can everybody see me? I can see nobody. Oh, perfect. So I can't see you, but you can see me, and that's going to be good enough.

I've done many things in my career over the last 10 years, but what's really bringing me to this conversation is a paper that I wrote in 2019 with my colleagues Jessie Daniels, who is a sociologist at the City University of New York, and Darakhshan Mir, who is a computer science professor at Bucknell. What we were looking at at that time was the way that the technologies we were using, which we felt to be fair and accurate because, you know, they were made using maths and statistics we didn't know anything about so it had to be true, still ended up making decisions that took us back to the racist ideas of what many say is our past but, as somebody who has been Black all my life, is really the present. And we were really curious as to why that was.
So we connected with Howard Stevenson, who is a psychology professor at the Graduate School of Education at the University of Pennsylvania, which is very fancy, Ivy League, and he introduced us to this idea of racial literacy as a framework. I thought that was so perfect to bring to this audience, because your community is really involved in learning, specifically learning with technology, and pedagogy is something that you folks are paying a lot of attention to. As you can see, the racial literacy in tech framework has three distinct parts.

The first is cognitive: what do we think about technologies? As I mentioned at the beginning, when Darakhshan, Jessie and myself were thinking about this idea of technology and race, we had come to the conclusion, we hadn't done any interviewing yet, that the technologies we were building were actually more than maths and physics, that they were taking on meanings of race. We were looking specifically at that point at the criminal justice system here in the United States, and we would notice that the algorithms being used in sentencing would give Black defendants longer sentences, or make them less likely to get parole, than white defendants who had the exact same kind of profile. So what was it about the technology that was picking up that white people were less likely to commit or recommit crime, whereas Black people were? That was something we had seen in the wider criminal justice system, and we started to think about not just that question but why people weren't outraged. You know, as a Black person coming from the social sciences, my first degree was at Leeds and I did sociology, why was it that I could see this pattern really clearly, and Jessie, my co-author, who's also a sociologist, could see this pattern very clearly, but our friends in engineering, who are actually creating these technologies and developing these data sets, had no idea?

It came down to one thing. We surveyed around 30 people for the paper, all of them involved in the tech industry, working in Silicon Valley, everybody from VPs of engineering to product managers, as well as people on the business side. And when we were looking at our data set of engineers, they actually feared talking about race. They were saying things like, you know, it makes me feel very uncomfortable, I don't want to be seen as racist. We got many responses about voting for President Obama, and what was really interesting was that that wasn't an actual question on the survey; it was being offered as a way of avoiding getting into the conversation. I asked my white co-author to lead these conversations because, as a Black person, I didn't want to either get performative allyship or put people on the spot, but these were conversations between white people where they were saying these things, even to questions that we were not asking. That's the emotional part of the framework.

And then the third part of this framework is the action plan: what are we going to do about this? We know that technologies, in this case we were looking at the COMPAS algorithm, are really reproducing meanings of race and racism in our context. We know that part of the reason this isn't even being tackled is that there is no literacy around being able to talk about race, around being able to look at ourselves both as victims of white supremacy, because we're all swimming in the same sea, Black, white,
Asian, whomever, it's the same system, as well as perpetrators of it, right? When we don't say anything, we then become complicit in allowing it to go on. So what were we going to do? Out of that question really comes this idea of racial literacy in technology, and our call was for the development of learning systems within technology, both in the academic sense but even in industry, as product teams are being created, for a study of this. We knew that racism came in at the design stage, through which inputs we were using, which questions we were asking, what we were optimising, so we wanted that embedded within the design process, and this fear had to be let go. We are all people; we should not fear having honest conversations with each other. And we were successful, and we weren't successful. We were really, really successful at getting people to go, oh yeah, mm-hmm, I get that, that's really good, mm-hmm. We weren't very successful in design.

So one of the ways that I advance this work is through advisory work that I do with TikTok, which is a social media platform built on Musical.ly, which was a dance platform bought by a Chinese company around four years ago now. It came to the United States and has gone across the world and kind of set the world on fire with dance videos. It has a very, very young usership: the majority of people on TikTok are actually between the ages of 12 and 24, so if you are on TikTok and not in that age range, you're old. I was really interested in their algorithm because of the way Black young people were writing about it in the popular press.

This is a picture of their For You page. If you sign on to the app, these are the kinds of things that you'll see, and the For You page really helps you see people or groups that are engaged in the types of videos you like: standard social media practice. What drives this For You page is a series of algorithmic decision-making points. This is a model of how an algorithm, excuse me, I'm in Brooklyn, so we have to have some type of random car chase, okay, they went past my window, so it isn't too bad. This is a model of what algorithmic decision-making can look like. I'm not going to go through the whole model, but it might begin here, and the first prompt we're seeing is "do it". For the sake of this conversation, the "do it" is going to be: Mutale has signed on, what does Mutale like? So the algorithm will be trained to think about the sites that I go to, how long I stay there, if I comment, if I share, if I connect with the users in any way. That gives the training data needed to answer this first question: what do I do? The algorithm will then gain more knowledge; it will look for parallel pages that do and say and act in ways that are similar to the training data set, the ones you see going up and down. It's then going to get really, really resourceful: in the universe of TikTok, Mutale doesn't know about X, let's get this to X, and then I might start to interact with X, which I didn't know about, and it goes on and on and on. But at each of these decision points, what TikTok and all social algorithms are trying to do is to train and resource the algorithm to give me the content that's going to keep me on the platform. The reason that's so important is that the longer I stay on the platform, the more likely I am to be advertised to. I'm not suggesting that that's happening on TikTok, I actually don't know what their advertising model is, I'm very interested in the content, or to bring me, I guess, the next new dance craze.
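To make those decision points a little more concrete, here is a minimal, purely illustrative Python sketch of an engagement-driven ranker of the kind described above. The signal names, weights and data are invented for illustration; this is not TikTok's actual model.

```python
# Illustrative only: a toy, engagement-driven "For You" ranker.
# Signal names, weights and data are assumptions, not TikTok's real system.
from dataclasses import dataclass

@dataclass
class Video:
    creator: str
    topic: str

# Hypothetical training signal: how long this user has watched each topic before.
watch_history = {"dance": 120.0, "cooking": 15.0, "news": 2.0}

def score(video: Video, liked_creators: set, shared_creators: set) -> float:
    """Predict how well this video will keep the user on the platform."""
    predicted_watch = watch_history.get(video.topic, 1.0)   # past behaviour predicts the future
    s = 1.0 * predicted_watch
    s += 30.0 * (video.creator in liked_creators)           # liking/commenting signals
    s += 50.0 * (video.creator in shared_creators)          # sharing is weighted even higher
    return s

def for_you_page(candidates, liked, shared, k=3):
    """Rank candidates by predicted engagement and keep only the top k."""
    return sorted(candidates, key=lambda v: score(v, liked, shared), reverse=True)[:k]

candidates = [Video("addison", "dance"), Video("sydney", "dance"), Video("chef_x", "cooking")]
print(for_you_page(candidates, liked={"addison"}, shared=set()))
```

The only objective in this loop is predicted engagement; nothing in it asks who gets surfaced or who gets left behind, which is exactly where the case study that follows picks up.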
Unfortunately, this can also be a site of how algorithmic racism happens. So I'm going to tell you a story about TikTok; it's a true story, and in it I'm going to talk you through the points at which this recommendation system that I've just described became racist. To set you up: this is a young woman called Addison Rae, and she's dancing to a song called Captain Hook by Megan Thee Stallion. Trigger warning for anybody who's live or anybody who's listening: there is some modicum of graphic content, describing an intimate act between a man and a femme-identifying person. If that's something you don't want to listen to, then this is a really good time to walk out of the room and get a glass of water. It's going to last about 16 seconds and then I'll be back.

[Video plays: "Look, he's a sweetie, kiss it when he ate it..."] That's Addison doing the dance; that's what TikTok does, that's why we love the platform, and that always happens. However, that dance was made up by a TikToker called Sydney Ray, who's on my right of the screen with the amazing pink eye shadow and lashes. Sydney is a beauty influencer but also really, really, really likes Megan Thee Stallion. At the time that I put this presentation together, which was about two weeks ago, mid-August for anybody watching this later, she had 1.1 million followers. And so she's less likely to appear on the For You page, because that first page I showed you earlier in the presentation is really reserved for what they would call super influencers: a million followers is great, but these are people that might have 5 million or 10 million or 20 million. This is because the algorithm is optimised to promote the most popular pages. Remember, when I was showing you the decision-making process of an algorithm, one of the things the algorithm is going to be trained to do is to raise the pages that have the most followers, the logic being that pages with lots of followers are the most popular on the platform, and therefore going to keep people on the platform, and so we can advertise or make other strategic business decisions while we have people's attention.

But what it doesn't think about is that Sydney is Black, and not only is she Black, she is what would be considered a dark-skinned Black woman, and researchers have found that Black women are deemed less desirable online, so they get fewer social media followers. So going back to the racial literacy framework, where we talk about this cognitive piece, that race is embedded in our technical systems, this is an example of how optimisation for popularity ends up becoming a proxy for race, because it doesn't account for anti-Blackness in technology use. The study was reported in Nature here in the United States in 2018 and was based on dating profiles, but that logic still persists.

However, Addison Rae did the dance; I did not show you Sydney doing the dance. Addison Rae has 70 million followers, so she definitely, firmly sits in that super-user TikTok universe I was describing, and she's very likely to appear on the For You page. In fact, I cannot discuss anything that's discussed in meetings, and we very rarely discuss individual TikTokers, but we do discuss these hyper-influential accounts, whether they be politicians or people who are famous or even people who really know how to capture the attention of the platform, and Addison Rae would definitely fall into that category.
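One way to see how optimising for follower count can quietly become a proxy for race is to audit who the ranking actually surfaces. This is a minimal, hypothetical sketch; the creators, follower counts and group labels are invented for illustration.

```python
# Hypothetical audit: when we rank purely by follower count, who gets seen?
# Creators, follower counts and group labels are invented for illustration.
creators = [
    {"name": "addison", "followers": 70_000_000, "group": "white"},
    {"name": "casey",   "followers": 20_000_000, "group": "white"},
    {"name": "sydney",  "followers": 1_100_000,  "group": "black"},
    {"name": "dione",   "followers": 900_000,    "group": "black"},
]

# The "popularity" optimisation from the talk: rank by follower count alone.
ranked = sorted(creators, key=lambda c: c["followers"], reverse=True)

# Crude exposure model: the top slot gets far more For You views than lower slots.
exposure = {c["name"]: 1.0 / (position + 1) for position, c in enumerate(ranked)}

def mean_exposure(group):
    values = [exposure[c["name"]] for c in creators if c["group"] == group]
    return sum(values) / len(values)

for group in ("white", "black"):
    print(group, round(mean_exposure(group), 2))
# If follower counts already reflect anti-Black bias in who gets followed,
# this "neutral" ranking inherits the gap and then amplifies it with exposure.
```

Nothing in the ranking mentions race, which is the point: the disparity rides in on the follower counts themselves.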
We actually saw content moderation decisions being made around this with Donald Trump on Twitter during the 2020 US election, where the decision was made that his account was so influential, because he was the president of the United States, that they would bring it down. So it is very standard across social media companies to really look at the types of messages being sent out by these super followers. Because the algorithm is optimised to promote the most popular pages, without thinking about the social decisions, the social content decisions, by which popularity and desirability are decided online, she becomes one of the most popular people on TikTok, and so as people come onto their For You pages they are seeing her latest content, including the video that I just shared with you.

What does this mean? Unsurprisingly, Addison Rae is followed by filmmakers, television makers, record companies, and a record company sees the video I've just shown you, I'm sure it was longer than 16 seconds. She loves Megan Thee Stallion, she's always doing these dances, and she's given seven thousand dollars to come in and teach TikTok dances to the staff of a record company as part of a promotion they're doing, as well as getting a spot on the popular late-night television show with Jimmy Fallon. So Jimmy Fallon is a lot like, and now I'm blanking now that I'm speaking to you all, the guy that does Carpool Karaoke, don't know why I'm blanking on his name, love him, love him, love him, but it's a similar type of show: comes on late at night, very funny, very topical. As you can see from the sign, the man in the picture is Jimmy Fallon, the girl with the jeans is Addison Rae, and he's holding up the sign "Savage". Savage is another song by Megan Thee Stallion, and when it comes up, Addison switches to do this dance and she gets national recognition and attention. We love it.

This type of recognition and attention leads to a whole plethora of opportunities, including going to commentate ringside at, I think it was a wrestling match, and this Netflix movie, He's All That. It is a Netflix movie about a young, hip girl in America who falls in love with a young, hip boy; it's a take on a movie that was done in the 90s, but the girl is in the driving seat: instead of being picked, she's the picker. And it debuts at number one at the beginning of September in the United States. So this is how the algorithm, and the discovery of Addison Rae, is really benefiting her in her real life. When I looked up what all of these opportunities were amounting to, between the seven thousand dollars she got paid by the record company for one dance, the appearance with Jimmy Fallon, which I'm sure was for free but gave her this national platform, through to being cast in a Netflix movie, among other opportunities, she currently, September 2021, has an estimated net worth of around five million dollars. To put that into perspective, Solange, who is Beyoncé's younger sister and has a number-one, Grammy-winning album, has the same kind of estimated net worth. I would argue that being a Grammy winner and having proximity to Beyoncé is valuable, but guess what, so is being on TikTok.

So let's look at what happened to Sydney. Sydney actually created the video; remember, all Addison did was copy the dance that Sydney created. Sydney gets paid $700 by the same record company to engage in the same promotion. Sydney created the dance. However, Sydney was not elevated to the For You page. Sydney was not discovered by Jimmy Fallon and given a national platform. Sydney was not cast in a Netflix movie that
opened at number one, which means she's more likely to be cast in more movies. Sydney got $700, that's around 950 pounds, and she taught Addison Rae how to add to her $5 million fortune.

There's actually precedent for this. This is a picture of a minstrel show. They were very popular during the Jim Crow era, which was the time just after the First World War, so around 1918, and it didn't stop until the civil rights movement, which was in the 50s and 60s, the 60s really, by the time we start getting legislation in 1964 to 1968, after Dr Martin Luther King died. During this time Black people lived in an extremely segregated society. They weren't allowed to drink from the same water fountains, go to the same schools, have certain jobs, even make money for acting as Black people. But acting as Black people was a huge business; it was the minstrel business. And what minstrels did is that they looked at what Black people did, in the same way that Addison looked at what Sydney did, they copied it, and they got money. What I'm suggesting to you, in the case study I'm giving you, is that the algorithm is acting like the minstrel, because the algorithm is what is raising Sydney's copycat, I don't think that's even a word, to national attention. In the picture you can see that part of being in a minstrel show was that you would literally paint your face black to appear Black even though you were white; this is called blackface, and you may see in the literature this idea of digital blackface. This is at the same time that Black actors and actresses cannot get roles in Hollywood or anywhere else for being Black. We can think of stars like Dorothy Dandridge, or even if we look to the life of Marilyn Monroe, we'll see that she was engaged in her own anti-racist activism, looking to performers like Bessie White, a jazz singer, and refusing to go to certain clubs unless they allowed Black female performers to sit with white people.

So how does that racial literacy framework that I described earlier show up in this instance? Cognitively, TikTok's algorithm is expressing racism by using followers, by using popularity, as a proxy for race, and the way that it becomes a proxy for race is that it's not anticipating that what makes things popular online is mediated by race; by gender, men are more popular online than women; by ability status, I've given you a case study where we're looking at two able-bodied people, but disabled dancers on TikTok do not generally get those types of followers, particularly those with physical disabilities, even though I do have to shout out Death Hoddy, a TikToker who also does amazing dances to Megan Thee Stallion but certainly is not at that level of Sydney Ray. So when we say that follower count is one of the optimisation metrics on TikTok, and it happens across all platforms, we're also not being cognisant of how that becomes a proxy for race.

Emotionally, Addison and other white TikTokers who do the same, she is not the only one, faced an outcry, and their response was that it wasn't their fault that the algorithm favoured them, they were just doing a dance, this is to do with the company, they're not racist. And they're correct; I don't know whether they're racist or not. Racism, as you know, because the work is being done in your community, is the process of policies and practices that oppress people on the grounds of race and actually make their lives harder because of their race, so it's impossible to know through a screen.
But we do know that they were very uncomfortable, that uncomfortability that I described with the engineers; they didn't want to discuss it. And from the Black perspective, people like Sydney Ray, who are watching Jimmy Fallon as we all do, these are late-night shows like James Corden's, I remembered his name at the wrong time, people that we all love, national treasures, were asking why that opportunity wasn't given to them. The answer is simple: the show's team were just looking for the most popular people on TikTok, they had no preference, not knowing that that's actually a racialised question. The press was therefore full of outrage from Black creators.

The action plan, from a Black creator perspective, was to stop posting videos: you can't steal what's not there. This chap on my left is saying in the first frame, "made a dance for this song", then it goes to the frame on my right: "psych, this app would do nothing without Black people". So they stopped dancing, and then somebody comments that now nobody knows what to do, because we won't make dances for them, and then the letters are, as just described, you know, laughing my a** off, and the rest. So they stopped; there was a strike, and that strike is still ongoing as far as I know.

TikTok has a Black creators panel that was developed during the George Floyd uprisings that went across the globe, looking for ways to make sure that the art of Black people on the platform isn't stolen, co-opted and then commercialised by white creatives. And so what they started to do with the algorithm, like the algorithm I showed you before, is that when dances become really popular, when an Addison Rae or others do that dance, they actually do a split screen and show the person who originally developed the dance. There are also new protocols where they're going to tag the original creator, so that that person can be raised. This came about not just through the example I'm showing you and the strike, but because the New York Times did this huge exposé on the Renegade dance, which is very similar to the one I've shown you and was going on a similar trajectory; because the Times did a profile of the actual originator, who happened to be, I think, a 14-year-old girl in Georgia, she started to get these opportunities. So it's not perfect, but it does show that racial literacy in technology, when taken on by a company, can actually be encoded into our systems.

We can cognitively think about inputs that reduce racial harm, increase racial literacy and make the output less anti-Black. Emotionally, we can all just admit that, as human beings, we're creating these systems using machine learning, and what machine learning does is use patterns of the past to predict the future: because lots of people click on Addison Rae, that means she's the most popular, without thinking of the way the platform itself has raised that, or even that the reason she's popular is because of this practice of co-opting. And in terms of an action plan, one of the things that I would definitely recommend to this community, and communities like it, is that this becomes part of your decision-making matrix, whether it's that you're using technologies and you're really interested in their impact on minority groups along whichever line: I focus on anti-Blackness, but there are issues with accessibility, there are issues with depictions of women and feminised folks, and if you are somebody who is trans or non-binary, the algorithm commits harm too.
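Going back to the remediation just described, crediting the original choreographer once a dance crosses a popularity threshold, here is a rough sketch of what that kind of check could look like inside a promotion step. The threshold, field names and data are assumptions for illustration, not TikTok's actual protocol.

```python
# Hypothetical attribution step in a promotion pipeline: once a dance is
# trending, find the earliest upload and credit its creator before boosting.
# Threshold, fields and data are invented for illustration.
TREND_THRESHOLD = 1_000_000  # total views at which a dance counts as trending

videos = [
    {"id": 1, "creator": "sydney",  "dance": "captain_hook", "posted": "2020-07-01", "views": 400_000},
    {"id": 2, "creator": "addison", "dance": "captain_hook", "posted": "2020-07-20", "views": 35_000_000},
]

def original_creator(dance, catalogue):
    """Return the earliest uploader of this dance, by posting date."""
    uploads = [v for v in catalogue if v["dance"] == dance]
    return min(uploads, key=lambda v: v["posted"])["creator"] if uploads else None

def promote(video, catalogue):
    """Before boosting a trending video, attach credit to the originator."""
    total_views = sum(v["views"] for v in catalogue if v["dance"] == video["dance"])
    credit = original_creator(video["dance"], catalogue) if total_views >= TREND_THRESHOLD else None
    return {"boosted_video": video["id"], "credit_original_creator": credit}

print(promote(videos[1], videos))  # -> credit goes to 'sydney', not the copier
```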
Those questions about impact on minority groups could be asked. Or even as you're designing systems, you're thinking about: how can we optimise for justice, how can we optimise for fairness, how can we make sure that in that learning process we're asking questions that will make the lives of people like Sydney Ray and mine better, as opposed to just increasing the access and privilege of those like Addison Rae, who, by the way, I'm sure is a wonderful young woman but was an incredible case study for the talk.

So finally, before we get into conversation, I don't have my timer, so I don't know if I have over-talked or under-talked; my hope is that I have under-talked so that we can get into conversation. This really gets to the work I'm doing now. This is how we make AI for the people. I sit on that Content Advisory Board, I raise these questions, and I'm in the company of people who have the emotional resilience to have these conversations. I do that because, as communicators, my ultimate professional background is in journalism, so I'm a professional communicator, but as communicators we want to make these processes simple. We want to do it in a way that is entertaining and delightful and engaging, and in a way that allows us to speak about the harm of racism without falling into traps of fear, without falling into traps of guilt, but really leaning into the possibility of undoing a system that we ourselves created. And when I say we, I'm talking about society, because Black people and other non-white people certainly did not create the system of racism as we know it in the UK and in the US, even though those power differences do exist within those populations. And then, once we have figured out how we are going to create this new world, how do we share that news with everybody else? How do we recruit people of good conscience who want AI to be for all people, for the people? I'm so looking forward to your questions and your conversation, and I'm hoping that next time you are looking at a video that seems entertaining, that seems to be getting traction, one of the questions you are asking is: how was it created, and what impact did this have on the creator?

Great, that took me by surprise, Mutale, that was a very quick finish. Thank you so much for opening that up to us and illuminating it with examples from TikTok, which clearly resonated with some people from experience but I'm sure were new to many of us too. We've got questions starting to come in, so the first one was actually around the example. I'll just get it up on the screen for you now, but I'll read it out to you: in the TikTok examples you were showing, or in TikTok's approach generally, what would you suggest they might do instead of popularity as a proxy? And, I'm adding to that, is that a discussion you've been involved in on the Content Advisory Board or not?
Yeah, so with the Content Advisory Board we're really focused on making the platform safe; we're not so focused on the business model, so I'll just say that. I think from a safety perspective TikTok have done incredible work: September in the United States is Suicide Prevention Month, so one of the things they're customising, optimising, the algorithm to do is to show helpline numbers and make sure that users have access to that information. They're not asking users if they're vulnerable, they're just making that available, and I think that's a real pro-social way of using the For You page. I don't know that there's a business plan, and at the end of the day TikTok is a business, it is going to have to monetise. I'm not the best person to talk about social media, just because I only have a Twitter account, because I have to for work, but I don't have any others, just because I'm so concerned about the way that our data is used and how many proxies there are. I showed popularity, but the area where you live is often a proxy for race; your name, you know, I'm clearly not white, Mutale Nkonde can be a proxy, and for gender as well. So without the business model I would say pro-social messages; with the business model, I don't have much.

Okay, no, don't worry, thank you very much for taking the question. I've got another one on the screen for you, on my screen for you, from Anu Roy, who actually works here at the University of the Arts London, and it's on the screen now. I will keep quiet while you read it.

Well, first of all, this is probably the point where I say that AI for the People is going to be coming to the University of the Arts London, we got an invitation, so it's really nice to meet you, and, you know, I consider you, Matt, and everybody else here colleagues, even if it's for a small part. This is really interesting, but I'll talk about TikTok because that's the example I know. TikTok were actually really proactive around making sure that creators of colour were supported, which is why they did that split screen: when they realise a dance is trending, there's a certain threshold, they will then go and look for the first time that video appeared on the platform and make sure that people know. And the reason TikTok did that is that they're a dance platform and they want to be a place for fun; really young kids are on there, and they don't want to get into any type of politics, that's kind of their view, and I was really pleased with that. However, these types of situations happen across platforms. You'll see the attacking of Black women on Twitter, which is a huge, you know, a huge historic thing, and not much work has been done there, even though I am hearing whispers that Black Twitter, and the protection of Black Twitter in the US, is going to become an increasing priority. We all know that, you know, Facebook is where information goes to die, they all do, but those Facebook groups can do real damage, and often that's racialised, and that's across the globe, whether we're looking at, you know, Hindu nationalists using Facebook to do terrible things to non-Hindus in India. I personally think, and this is what we were saying with the Algorithmic Accountability Act, that platforms are going to make money, that's what they're supposed to do, their job is to increase shareholder value, and if they can hold people's attention by doing this, they're going to continue. So government has to step in.
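Mutale's point that apparently neutral fields such as where you live or what you're called can stand in for race is something you can check empirically. Here is a tiny, invented-data sketch: if a supposedly neutral feature lets you guess the protected attribute well above the baseline, it is acting as a proxy.

```python
# Invented-data sketch of a proxy check: can a "neutral" feature (postcode
# area) predict a protected attribute better than guessing the majority?
from collections import Counter, defaultdict

records = [  # (postcode_area, race) -- invented for illustration
    ("area_a", "black"), ("area_a", "black"), ("area_a", "white"),
    ("area_b", "white"), ("area_b", "white"), ("area_b", "black"),
]

def proxy_accuracy(records):
    """Accuracy of guessing race from the majority race in each area."""
    by_area = defaultdict(Counter)
    for area, race in records:
        by_area[area][race] += 1
    correct = sum(counts.most_common(1)[0][1] for counts in by_area.values())
    return correct / len(records)

baseline = Counter(race for _, race in records).most_common(1)[0][1] / len(records)
print("baseline:", round(baseline, 2), "via postcode:", round(proxy_accuracy(records), 2))
# A large gap over the baseline means the feature carries racial information
# even though race never appears as an explicit input.
```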
What we were proposing in the Algorithmic Accountability Act is that if a platform is found to harm a minority community, the algorithm would not be allowed to be marketed, because right now it's a free-for-all. I could write code, you know, Matt and I could write code today, it could do all kinds of harm, and there'd be nothing stopping us.

Thank you, thanks Mutale, and thanks for coming on to the legislation; I actually had a question in mind about that, however I've got more questions lined up in the chat for you now. The first one comes from Pete Mellor, and we'll just get it on the screen for you now.

Oh my goodness. So Pete is asking about AI tools for education, and the thing I love about this question, Pete, I believe we are kindred spirits: the word "touted" is used. So all AI systems as we're designing them now, if we are using machine learning, and what that means is that we just look at historical behaviour and then use it to predict the future or shape the future, are going to have bias. The best example, and I did speak about it a little, is the COMPAS algorithm, where judges were using it as part of their sentencing determination. The way the algorithm was trained is that it was looking at historic sentencing determinations made by judges for those crimes, with the assumption that the judges themselves weren't racist, weren't prejudiced, and what it found was a pattern of judges either giving Black people longer sentences, sentencing Black people when they didn't sentence white people, or refusing them parole, and that became part of how the algorithm learned.

Education is a hugely racialised field, we know. I grew up in the UK, and I thankfully got an absolutely amazing education, from nursery school to primary school, and then I went to high school, I actually didn't really do the middle, I went to middle school for I think one year, then I went to high school, because it went from when I was 11 through to GCSE and beyond. I got an amazing education because my parents are both doctors, so I had class privilege; I also had two doctor parents who were like, you can do physics, we're Black and we do physics, you're going to be Black and do physics. Not everybody came from that. So in my school, if you looked at the educational data, you would think that Black kids are thick, you would think that Black kids can't do this and can't do the other, and those AI tools for education are taking in all of that data: they're looking at exam results, they're looking at attendance, and they're not looking at the full life of a Black student. So I would definitely think that they have these biases, and my advice is: use the AI tool as a tool, don't use it to make determinations. If it's telling you, like we saw with the A-level algorithm, that if you go to a state school you're going to get lower exam results than if you went to Eton, and we all know Boris Johnson went to Eton, then you don't believe that, right? You have to look at the actual student, look at that student's performance, look at that student's progression, and my view would be look at their social life, look at the environment they're coming from, and make a determination, and throw out the AI tool if it's being racist, sexist, homophobic and all the other -isms.

Great, thank you. It's such a big area for many of us here at the moment, in terms of how people are interpreting data from these big systems.
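To make the "patterns of the past" point about COMPAS-style and education tools concrete, here is a tiny sketch with invented data: a model fit on historically biased decisions simply reproduces them, which is why the advice above is to keep the tool advisory and leave the determination to a human who can see the whole student.

```python
# Invented illustration: a "model" fit on historically biased decisions
# reproduces those decisions, so its score should inform a human reviewer,
# never replace one.
historic = [  # (group, harsh_outcome) -- invented, biased past decisions
    ("black", 1), ("black", 1), ("black", 0),
    ("white", 0), ("white", 0), ("white", 1),
]

def fit(history):
    """'Train' by memorising the historical rate of harsh outcomes per group."""
    groups = {g for g, _ in history}
    return {g: sum(o for grp, o in history if grp == g) /
               sum(1 for grp, _ in history if grp == g)
            for g in groups}

model = fit(historic)
print(model)  # the past bias is now the model: roughly {'black': 0.67, 'white': 0.33}

def decide(group, model, reviewer):
    """Use the score as one input among many; a human makes the determination."""
    advisory_score = model[group]
    return reviewer(group, advisory_score)

print(decide("black", model, reviewer=lambda g, s: "route to full human review"))
```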
The next question I've got, oh blimey, is also from a colleague here in London, so I'm just going to get Dave White's question up on the screen. So Dave is asking about the relationship between human-led moderation and curation and algorithmic approaches, power and privilege.

Dave, so most human-led moderation is done in what we call the Global South, specifically places like the Philippines, where you have people who are contractors to the companies, working in, you know, call-centre-type environments, and they're making these decisions on videos or posts. The issue you have around that is that they may not have local context. Even in the Captain Hook video, there's a line where she talks about eating groceries, and it's actually not food, it's a reference to a sexual act. If you're, you know, a Black teenager or a white teenager in America, you might know that; if you live in the Philippines and you're trying to figure out whether a three-year-old could listen to the lyrics, you might hear "Captain Hook", oh yeah, that's some Peter Pan, that's great, "groceries", oh okay, that could be, you know, some cheese, when what she's actually describing is very sexual in nature. So you get mis-moderation a lot where there's no context. With algorithmic moderation you miss in other ways, because algorithmic moderation has to come to a certain threshold. I was trolled really badly this summer for being a co-author on a paper, and even though, this was on Twitter, I was, you know, blessed, or privileged I should say, to be in conversation with Twitter, because the abuse didn't meet the threshold and they hadn't been trained in African-American vernacular, I didn't get any relief. So I think there's somewhere in between the two, where you can use the algorithm to show big patterns, I think they're really, really good for reading things at a scale that we would never be able to read in our lives, and you would need human moderation, but you would need moderators who actually understood the context. In global companies you might not want someone from the Philippines, or any other part of the world, moderating content from outside their context, because things could get missed.

Okay, thanks Mutale. We're going to take just one more question, from Roger, which we'll put up on the screen now, and then we'll start to think about winding things up; maybe time for one more if anybody's got one.

Okay, oh, how do we help, how do we guide our children? Oh, eight-year-olds on TikTok, that scares me. You know, eight-year-olds on TikTok, because they really are too young for critical thought. TikTok is not optimised to be a platform for anyone under 12, even though I have a 13-year-old whose frontal lobe is nowhere near developed, and so I worry about his TikTok use, to be honest. I think they're increasing parental controls on the platform, but if you can, and my youngest child is 13, so I don't have an eight-year-old, I do not know what eight-year-olds are doing, if you can, social media is not necessarily the safest place, just because so many hidden messages are embedded, and from an advertising perspective you don't want your children advertised to in a way where, you know, often advertisers will create community with people like them. I remember when I was pregnant, getting into a mommy's Facebook group, and I didn't realise
I was being advertised to the whole time, but because I had so many questions, I was in a very vulnerable position, and I ended up really being a victim of that, and I think that children are in a similar situation. So, parental controls, if you can get them, and definitely limit it, though, definitely limit it. I think, you know, on the Content Advisory Board we have so many youth experts who come in and look at content, but youth is not as young as eight, it tends to be teenagers; they do have children's experts, but that's not what the platform is optimised for.

Okay, thanks Mutale. There are lots of questions coming in, so if it's okay with you I'm going to throw a couple more at you. Yeah, I'm here. Okay, fantastic. A question from Tim coming in next, we'll just get that up on the screen. Tim asks some really great questions: would human-in-the-loop do something to mitigate this effect?

Absolutely. I think AI is an incredible tool, it's incredibly powerful, we should definitely be using it, I believe in AI for the people, I'm not saying get rid of AI, I think that would be an incredible mistake. But we can't rely on it; we can't give up decision-making to systems that don't understand social context. An example I always give is one here in the States, in Chicago, where the children's services division, they do all the at-risk children work, so they're protecting those kids, gave over decision-making power to an algorithmic system, and what ended up happening is it increased the caseloads of social workers, and they ended up so overwhelmed with cases, because they had been fed all of these new leads that amounted to nothing, that more at-risk children died. And again it was on racial grounds: more Black children were identified as being at risk, and then white at-risk children died, and the State of Illinois ended up having to get rid of the algorithm. Whereas I think if there had been a human being, they could have looked more into the history of those cases. We need discernment; decision-making includes discernment, and algorithms cannot be trained to provide that.

Okay, thank you. More questions coming in: any optimism on AI?

Yes, I think for mitigating climate change. AI systems have the ability to help us, first of all with pattern recognition, and I'm thinking of agriculture specifically. Agricultural practices account for around 10 per cent of all the greenhouse gases that we have on the globe, and one of the things I'm seeing in the ag-tech sector is the development of AI tools that can help optimise farming practices to stop things like nitrogen fertiliser use, not stop it but reduce it, and once we reduce it we are also reducing the production of nitrous oxide, which is a greenhouse gas. We would not be able to do thermal imaging, we would not be able to process that data and make it usable, without using AI systems, and given that we need a world to live in, I am so optimistic about what can be done in that area, and I just hope those scientists get funding. And it's not human subjects, right, we're dealing with soils, whereas a lot of the problems I talk about are when we're using AI in relation to human beings.

Great, that's fascinating. I'm going to take one more, which I missed earlier, from Sheila; we'll get that up on the screen, and I'm sure Sheila's not being daft, but I'll leave that one with you.

Oh my goodness, Sheila, another, I feel like, you know, sister from another mister for sure. She's asking: can algorithms be more explicit, is there a way to
make them public, like food labelling? Am I just being daft? No, you're not. So algorithms are protected by either IP, so intellectual property, or trade secrets. What that means is that when a company develops an AI system, they say no, no, no, this is our secret sauce, this is what makes us proprietary, this is what makes us money; if we release the source code to the algorithm then everybody's going to be able to do this and we won't have a business. So that means we don't even know how they work. When I talked about the report at the beginning, we had to do ethnographic and qualitative work to identify patterns, but we cannot tell you exactly how that system I described works. One of the things we looked at in the Algorithmic Accountability Act was a way of allowing businesses to have their trade secrets, why, so that they can be in business, I live in the United States, that's, you know, capitalism 101, while at the same time making sure these things are safe. Because we actually do have that: when we think about the Covid vaccine and how it had to go through trials and tests, and it couldn't be, I mean, that's a bad example in some ways because they got emergency clearance, but there is a process, because these things are being used on human beings. What I think we can do is inform citizens, and we can demand, we can say to our politicians, we can say to our regulators, that these technologies are too dangerous, they are making us racist by proxy, right? You can't be blamed for the racist decision-making of source code that you have no access to, but if you use it, you're definitely part of the problem, and we want a different future for ourselves. So that's a lot of the media work that we do; even the work that we're coming to do at University College London is really around how do we increase the civic voice around this, how do we expand the civic imagination, and let people know that the technical systems we use are as much our concern as airways and rivers and the houses that we live in.

Great, thank you so much, Mutale. There are more questions, but I'm actually going to have to draw a line under it now, unfortunately, because there's a big gala and television programme launch at five o'clock, and I'd be in a lot of trouble if we ran into that. So, on behalf of everybody here, a huge thank you for today, a fascinating talk, and it's clearly been very popular. I will show you the chat afterwards and you can browse through it, but I'm sure everybody will now join me, in the way we have to, in thanking you through emojis for your talk today, and we look forward to seeing you in London, or in the UK, at some point. Thank you. Thank you, bye everybody.