Hello, everybody. Thank you to everyone for joining us today. For those who don't know me, my name is Kendra Albert. I'm a clinical instructor here at the Cyberlaw Clinic at the Berkman Klein Center for Internet & Society at Harvard University. I'm going to do some boring logistical announcements. Then I'm going to do the fun part, which is introducing Apryl. Then the more fun part, which is Apryl talking, and then the even more fun part, which is questions from me and questions from y'all. So, the boring logistical announcements: this event is being recorded and live streamed for the folks in the room. And if you're tuning in from the internet, thank you. Thank you for joining. Once we get to the Q&A, please use the chat or Q&A function on Zoom. We won't share your name or an image of you or anything. Someone in the room will ask your questions and then Apryl will respond. A little bit about the run of show, just to preview it for you: we'll hear from Apryl about the book, I'll get the opportunity to ask a question or two, we'll take some questions from the audience, and at the end books will be available for sale. And if you ask Professor Williams very, very nicely, she might even sign it for you. So, on to the fun part, which is introducing Apryl to y'all to have her give this talk about her incredible book. Apryl Williams is a jointly appointed assistant professor at the University of Michigan in the Department of Communication and Media and the Digital Studies Institute. She is also a senior fellow in trustworthy AI at the Mozilla Foundation and a faculty associate at Harvard University's Berkman Klein Center for Internet & Society, a.k.a. here. Her research has been published in Big Data & Society, Sociology of Race and Ethnicity, and Social Media + Society, among other august publications.
And we're so excited to welcome her here to talk about her new book, Not My Type: Automating Sexual Racism in Online Dating. So join me in welcoming Professor Williams. Thank you for that lovely introduction, and thank you all for having me here today and joining me. I'm really excited to do this talk here because I started doing my interviews for this work pre-pandemic in the yellow house on Everett. So I'm really excited; this is sort of its homecoming in a lot of ways. I want to give a little bit of a heads up. I've been playing with this format for a while, hoping to strike the right tone for the kind of audience that I dream would be reading this book. Let me grab a clicker; that would be helpful. That audience is academics, technologists, policymakers, and most importantly, the everyday dating app user. And it's going to be a little spicy at times, so I want you to prepare yourself accordingly. In 2019, I was invited to lead an author-meets-critics session for a hot new book, Algorithms of Oppression. I had to work really hard to be a critic for that book. After that session, when I met Safiya Noble and we talked about the ideas for this book, I shared that I was really scared to go forward because I wasn't sure I'd have enough evidence to make my case. It's been a long journey, as you'll see here today. There is ample evidence that connects historical marginalization with contemporary tech injustice. But this journey for me personally, though driven in part by the desire to spill the tea on the dating industry's secrets, has not always been about the grief of people of color. And I want to highlight that though these systems do often fail us, and it's important to talk about those failures, it's also important to celebrate the joyful community that many find in dating culture. Not always in romantic partners. Sometimes the joy is in commiserating with your friends. At times the joy is about sharing the best or worst first date you've ever had.
And sometimes you actually find love on the apps. And that's why they're worth fixing. Because I know people love to hate the apps, but they work, speaking from both personal and scholarly experience. It's taken me about seven years to get this work out into the world. I first started thinking about what algorithms might be doing in the background of dating apps when I was using them. I was an early adopter on Tinder. I made my first account in 2013, and I would get really good matches or really bad matches depending on which city I was in. And it seemed that my attractiveness was evaluated differently depending on the culture I was swiping in. I did horribly in small-town Texas, where the dating pool was made up of lots of folks with dead animals in their profile pics. I did better in Baltimore and Washington, D.C., and my experience on the West Coast was sort of a mixed bag. 2,719 matches later, I had swiped enough to have some questions about how my race might play into my attractiveness rating. And later we'll talk about the specificity of that match number. But first, I'm going to share a little story. If you've read the book, you're familiar with it. But for those who haven't, this is the moment that really sparked my imagination about what was happening inside dating apps. I was at a conference plenary for the American Sociological Association in 2015. Some folks from eHarmony were on the panel, along with Aziz Ansari, before he was canceled. He's back again; we'll see, who knows. And Christian Rudder, the co-founder of OkCupid. Someone in the audience asked why they were getting so many unattractive people in their match deck. And Christian Rudder explained that if you think your matches are ugly, it's probably because you're ugly. Yeah. He spoke about a ranking system that scores users based on attractiveness and a few other features, such as their responsiveness and activity on the app.
And let's say that the ranking system is a one to nine, with nine being the hottest and one being the, well, not-est. If you're a six, you'll mostly see other users who are also a six. Occasionally a seven or eight, but mostly in your bracket. Now that I've gotten your attention, let's go through some of the less exciting but still important details. What is sexual racism? Often concealed as private, meaningless personal preference, I define sexual racism as personal racialized reasoning in sexual, intimate, or romantic partner choice or interest. In broader social context, sexual racism connotes a set of beliefs, ideas, practices, and behaviors at the intersection of what is considered acceptable racialized, gendered performance. This theoretical framing is important for understanding how dating apps and intimacy platforms allow sexual racism to flourish, because they rely on white heteronormative standards of attraction, desirability, and gender aesthetics to perform the sorting and matching algorithms that we are so comfortable with these days. And I want to state clearly: sexual racism existed long before dating platforms came to be. Dating platforms automate sexual racism, making it hyper-efficient and routine to swipe in racially curated sexual marketplaces, but they did not invent it. Because the apps hide the underlying racially informed sorting and ranking algorithms, people more readily believe that their racism is private, neutral, and therefore harmless. Let's talk about algorithms and AI. Algorithms are just fancy math, which most of us know. AI, or artificial intelligence, refers to machines or technologies that are able to perform complex tasks that typically require human reasoning and decision making. Everyone's talking about AI right now, and frankly, I think we're giving it too much credit. Yes, it's powerful, but the humans who code, who animate the AI, hold the real power to make the world either better and more safe, or worse and less safe.
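The bracketed ranking described above can be sketched in a few lines. To be clear, this is an illustrative guess at the logic with invented names and numbers, not OkCupid's or any company's actual code.

```python
# Illustrative sketch of the 1-9 bracketed ranking described above.
# All names, fields, and the `spread` value are assumptions for
# illustration, not any dating company's real implementation.

def candidate_pool(my_score, users, spread=1):
    """A six mostly sees fives, sixes, and sevens: return users whose
    platform-assigned attractiveness score is within `spread` of mine."""
    return [u for u in users if abs(u["score"] - my_score) <= spread]

users = [
    {"name": "A", "score": 6},
    {"name": "B", "score": 2},
    {"name": "C", "score": 7},
    {"name": "D", "score": 9},
]
deck = candidate_pool(6, users)  # A and C make the deck; B and D do not
```

The key point the sketch makes concrete: once everyone carries a hidden score, whatever bias went into scoring is silently reproduced in who you ever get to see.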
So I've included an image of a dating company's patent that shows how their algorithm might work, and I say might because I don't want to get sued. As many of us know, patents can show us the intent of a company and give us an indication of an organization's ethos, but they don't definitively tell us what a company has chosen to do. So I'll proceed with that note. They might, as shown, collect users, filter for age and location, do the fancy math, calculate whether you are hot or not, do some more fancy math, stack those folks in your match deck, and then wait for you to swipe. I include language directly from the patent filed by Match Group because I think it's important for people to see how explicit dating companies are about what they desire to do. Based on my reading, they're operating in the frame of belief that there is some objective standard for beauty and attractiveness, when social scientists know that beauty and desire are very much shaped by culture. Whereas thin is in in the US, it's way out in other countries; whereas it's all the hype to be tan in the US, people in some Asian contexts are going to the beach fully clothed to maintain paleness. There is not a universal human standard of desirability. But because Match Group thinks that there is, or operates as though there is, we are all forced to operate within that framing when we use dating apps. You might be thinking, okay, fair enough, they have to standardize to make the platform work, right? What if there was an added element that's not illustrated here? Spoiler alert: there are lots of added elements. So they're using signals from a social networking platform. And I imagine in the early days, this was when you could sign into the dating apps using your Facebook account. And they have other ways I've talked about in depth in the book. You've got to pick it up. And they are, as Christian Rudder said, likely comparing users to generate attractiveness scores. And here's the big one.
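The filter, score, and stack flow read from the patent can be approximated as a short sketch. Every function name, field name, and threshold here is an assumption made for illustration; none of it is language from the patent itself.

```python
# Hedged sketch of the described flow: filter for age and location,
# do the "fancy math" on scores, then stack the match deck. Field
# names and the sort rule are illustrative assumptions only.

def build_match_deck(me, candidates, max_distance_km=50):
    # Step 1: filter for age and location preferences.
    eligible = [
        c for c in candidates
        if me["min_age"] <= c["age"] <= me["max_age"]
        and c["distance_km"] <= max_distance_km
    ]
    # Steps 2-3: compare platform-assigned attractiveness scores and
    # stack the deck so similarly scored users surface first.
    return sorted(eligible, key=lambda c: abs(c["score"] - me["score"]))

me = {"min_age": 25, "max_age": 35, "score": 6}
candidates = [
    {"age": 30, "distance_km": 10, "score": 9},
    {"age": 28, "distance_km": 5, "score": 6},
    {"age": 40, "distance_km": 5, "score": 6},  # filtered out: outside age range
]
deck = build_match_deck(me, candidates)  # the score-6 user surfaces first
```

Notice that the whole pipeline hinges on a single numeric score per person, which is exactly where a culturally loaded standard of attractiveness gets baked in.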
They may also use facial recognition algorithms to detect ethnicity, hair color, eye color, and likely a whole bunch of other things. Let's talk about facial detection and facial recognition for a moment. The algorithmic process for facial recognition and subsequent racial classification systems involves many algorithms at various steps of a lengthy procedure of classifying images. Typically the face is segmented, facial features are detected with the little points that you're seeing on the screen, and then features are classified. Here we can see that key features are used to identify facial structure, such as the eyes, lips, nose, and eyebrows. Images are broken down into small patches, then into pixels, where the pixels are assigned a numeric value and broken down into shades. Facial recognition algorithms work really well for one kind of person. Can you guys see what it is? It's white. It's white men. The rest of us, not so much, right? Facial recognition algorithms consistently fail to properly detect and categorize the faces of women and darker-skinned individuals. In 2015, Google incorrectly tagged black people as gorillas. That was facial recognition. In Detroit, where police are encouraged to use facial recognition software in policing, the chief of police claims that the software fails to correctly identify subjects 96% of the time. Because of these failures, facial recognition software was at one time provisionally banned in policing in several major cities, including Boston and San Francisco. Why did they fail? Few facial recognition algorithms have been trained on racially and ethnically diverse data sets. These algorithms are not calibrated for darker skin and struggle with skin tone reflectance variation. Facial detection algorithms are trained on the most ideal faces, mathematically speaking, often celebrity data sets. Most human faces do not naturally have ideal symmetry without surgical intervention. And of course, culture matters.
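The "patches and pixels" step described above is easy to make concrete. This is a minimal toy example of turning an image into numeric patches, not any vendor's actual pipeline.

```python
# Toy illustration of one step described above: an image becomes small
# patches of pixels, each pixel a numeric shade value (0 = black,
# 255 = white) that downstream feature classifiers operate on.

def to_patches(image, size=2):
    """Split a 2-D grid of pixel shades into size x size patches."""
    patches = []
    for r in range(0, len(image), size):
        for c in range(0, len(image[0]), size):
            patches.append([row[c:c + size] for row in image[r:r + size]])
    return patches

image = [  # a tiny 4x4 "photo" of numeric shade values
    [0, 17, 34, 51],
    [68, 85, 102, 119],
    [136, 153, 170, 187],
    [204, 221, 238, 255],
]
patches = to_patches(image)  # four 2x2 patches
```

Once a face is just numbers like these, everything downstream, including any attractiveness or ethnicity classifier, depends entirely on what the training data taught the model those numbers mean.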
Computer scientists suggest that the nation of origin of algorithm designers and coders can influence the racial stereotypes that introduce bias into classification systems. If facial detection routinely fails to recognize and categorize women, those with darker skin tones, and people between 18 and 30, how can we trust it to properly evaluate attractiveness? How can AI consider hair color, eye color, skin tone, and ethnicity to determine our compatibility with each other? This logic, pairing like with like, might stem from what we call homophily in sociology. Homophily is the sociological principle that humans seek out sameness; they look for people who share defining characteristics such as age, gender, ethnicity, and socioeconomic status. In fact, in their early days, OkCupid made a blog post about homophily, which they've since taken down, probably because it garnered a lot of negative attention. But the internet lives forever, and this is my rendering of that data. You can Google it and find it on Reddit pretty easily. We can look at this and think, sure, there's some homophily there. But what can we say about the very strong tendency to rate Asian men and black women as the least attractive category almost across the board, right? So we're seeing here that most folks are rating folks within their own racial category as more attractive than others. And then we can see on the top that black women are being consistently rated as less attractive. And on the bottom here, Asian men and black men are consistently being rated less attractive than their counterparts. This is not homophily. It's anti-blackness and anti-Asian sentiment. And these come from deeply rooted historical and contemporary ideas about how we view Asian masculinity and black femininity. We know that femininity and masculinity are both embodied performances, yet some outward demonstrations exist outside of what we think of as normative.
Gender scholar Judith Butler further informs us that ideas about beauty and attractiveness are entangled with expectations about the performance of gender, masculinity, and femininity. Foundational to sexual racism is that whiteness is always defined as the normative expression of sexuality, healthy desire, and acceptable gender role tropes. Because race and gender are both socially constructed, they're always changing. Hence what constitutes sexual racism is also always changing. At the core of these practices are implicit, and sometimes explicitly stated, racialized beliefs about how women, men, and non-binary people perform femininity and masculinity in relation to their perceived racial identity. This performance is always evaluated by its relationship to the standard of white masculinity and femininity. So we can see Asian men are believed to have particular racial traits that would preclude them from performing the expected American version of masculinity, namely the prevailing stereotype that Asian men have small penises and tend to be less muscular, less dominant, and unassertive. On the flip side, black men are believed to deviate from these same gender role scripts in almost the opposite manner. They are perceived to be too sexually aggressive, with sexual appetites that are beyond control. In both cases, assessments of Asian and black masculinity are measured against the standard of white heteronormative masculinity in the U.S. Suitors on dating platforms subconsciously or consciously assess a potential match's ability to conform to these scripts of masculinity or to the cult of feminine virtue. These cultural beliefs that we have about acceptable ways of doing gender, as opposed to rejected forms of doing gender, exist in each of us, in our culture, and also in our technology. And as we can see, these ideas are pretty widespread in our popular culture. They're deeply embedded.
Dating companies, like most in Big Tech, are simply a mirror reflecting back to us, amplifying, and automating existing social divides. So what does this feel like as a dater? Of course, it depends on who you are. Some people want a curated experience. They want to be able to say, I only want to look for women who are Latina, or I only want white femmes. The apps work for those folks. But those on the receiving end are often not having a good time. On the contrary, across the internet, women of color report being targets of racial fetishization. To better understand how fetishization might play a role in matching behavior online, I, along with colleagues Ronald Robertson and Hanyu Chwe, conducted an audit experiment on Tinder when we were all here in Boston during the pandemic. I can talk more about that method later. In the time of the pandemic, the BLM resurgence, and the election of 2020, it was a wild time to be doing research, but we found some interesting things. So we created fictive femme personas using stock images that we purchased. And we had an independent panel rate each model's attractiveness and tell us which race or ethnic group they thought each person belonged to. What we found is that match rates were on par with previous research done in this area. But the ways that people chose to speak to women differed, and there were racialized stereotypes that informed the beauty language used. So Asian women were most often told that they were cute, by a wide margin, as you can see. And this is really interesting because the kawaii aesthetic dictates that Asian women are cute, with large eyes and slender faces, and a whole trope of submissiveness accompanies the idea of Asian femme cuteness. We also found that black women were being told that they were beautiful way more often than any of the other models that we used. And this was profoundly shocking.
This finding was oppositional to what we would expect, because prevailing research suggests that black women are the least desired dating demographic. Our findings were really perplexing, and we hypothesized in this paper that I cited here that black women are seen as beautiful or sexy, but not datable nor desirable for long-term pairing, of course due to a lot of the stereotypes that I talked through a second ago. And I include this study in the book to highlight the complex position of being an object of sexual fetishization that women of color often have to deal with. They're at once desirable for their otherness and for their exoticism, but simultaneously not able to push through the boundaries of those tropes. The story that I share next illuminates this juxtaposition in horrifying detail. So I've given you the high-level view, but I want to honor the time and lived experience of those who shared their time with me by highlighting some of the scarier encounters that they had. I'm also going to warn listeners here in the room, and especially those watching at home, that if you have littles around, I would proceed with caution, and provide a content warning for racial slurs and physical violence. Before I read this section, I also want to provide some helpful context. Brandy responded to a post that I made on Tumblr asking women to share their best and worst experiences while using the apps. Her story demonstrates that sexual racism is a very real form of racism that can have consequences that extend beyond the apps. Brandy is a light-skinned black woman and was a longtime user of Tinder. It's important to note that Brandy can pass for white, though when I look at her pictures, I read her as a person of color. I've left her narrative largely unedited, except for clarity, to preserve the reality of her horrific experience. So I'll start now. So two years ago on Tinder, I matched with a fine white man.
He was in his mid-thirties, had blue eyes, and was completely bald. Honestly, to me, he was an absolute dreamboat. So we talked for a few days, then decided we were ready to meet. I live about an hour from the city, so we decided he would drive to me. So the day comes, I'm about to go on a date with this guy that I thought was so amazing and so perfect. I'm at work and he's blowing up my phone, so I'm scared something's going on. I call him, and he's like, why are you friends with so many black guys on Facebook? I'm a bit taken aback, like, excuse me? He quickly corrected what he initially said and said that there are so many black guys commenting on my pictures and he was just jealous. So I explained to him that the people I associate myself with and have grown up with are predominantly black. He assures me that he didn't mean it in a racist way. So I'm like, okay, whatever, let me finish this shift so you can take me out to eat. Which, like, heard. I feel that, girl. Get the bag. On our first date, he was amazing. He was sweet to me, and he didn't try to cross any of my boundaries. So we plan a second date for the very next day. This is when things dramatically changed. He's talking about Snapchat and was like, yeah, let's send some pictures to my sister. She wants to see how cute we are. I open my snap and he sees that my friend Malik sent me a snap. I will never forget it to this day; it was an innocent picture of him that said good evening. So this crazy motherfucker flips out. He's like, I knew you were a nigger lover. He then snatched my phone out of my hands and started going through it. For every guy's name he saw in my phone, he proceeded to ask me, is he a nigger? Did you fuck him? At this point, I'm pretty scared. I just want my phone back so I can get out of this man's car and walk home. I try to snatch my phone back, but he swipes my hand away.
So me being scared, I grab the keys out of the ignition and try to jump out of the car, but he won't let me, and he snatches his keys back. So now I'm really super uncomfortable and scared. This man throws my brand new iPhone, which I literally got a week prior, out the window, picks it up, sees that it's not broken, and then throws it again. I see the opportunity to jump out of the car and he tries to hit me. As he was driving away, he screamed at me that I was a nigger and that I would never amount to anything. I ran to my phone to see that it was shattered beyond repair. I couldn't call out to 911 and Siri wouldn't work. So I walk home, literally shaking and crying because of what just happened. I log into Facebook and see upwards of 20-plus comments on my pictures, my wall, my statuses, calling me a nigger lover and a slut. My iPad had several messages and voicemails from him that just said nigger and nigger lover. I immediately blocked him on every social media site I had and unmatched him on Tinder. Later on that night, I get a call from an unknown number. It's him. He called me to say I betrayed him by being a nigger lover and that he would wire me the money to pay to fix my phone to right his wrongs. Absolutely not. I didn't want a dime from him and I didn't want him to know my address. None of that. So a few months pass and I decide to try a different dating site, because after that I was done with Tinder. He sees me and messages me. Can you guess what he said? He called me a nigger-loving slut and made it so I couldn't message him back. To this day I'm still shaken up by how absolutely disgusting this man was. What year do we think this happened in? Like 1985, 1990? No. Yeah, close enough. 2015, 2016, right? So typically when I share this, folks feel that this is a shocking or new experience, and that's a valid experience. For some of us, it's not new and it's not shocking. It's pretty routine.
I would say that I was horrified, but I wasn't surprised when I read this story. The perpetuated belief that explicit racialized violence simply does not happen anymore comes from the same disbelief about the impact of sexual racism. I share this story so that we are all aware that these kinds of explicit racial hate crimes are still happening. The scholar and activist Moya Bailey originated the term misogynoir, which describes black women's unique experience at the intersection of misogyny and anti-blackness. Bailey argues that digitally mediated misogynoir can result in material experiences of violence. Brandy's experience in 2016 is one of lived racial terror that is rooted in sexual racism and misogynoir. The term nigger lover was originally used to describe white abolitionists and other white sympathizers. It was also used as an insult before, during, and after integration for those who associated with and or had romantic relationships with black folks. A belief in white racial purity is also bound up with the insidiousness of the racial slur. White individuals who had sexual relationships with black people were thought to be traitors to the race and were seen as impure or unclean. This reprehensible white man asked multiple times if Brandy had slept with black men to assess whether she was morally corrupt and unclean, tainted by blackness in his view. At the end of Brandy's narrative, she shared that the absolutely disgusting man continued to find and harass her across platforms. But what are platforms doing to protect marginalized users when they encounter racism, homophobia, transphobia, and ableism? Not enough. Which is what chapter 5, Safety Thirst, is about. It also asks and answers the question: who gets to be safe while using dating apps? I bet you know the answer. It's white men. When researchers talk about safety, or the lack of it, in online dating, they're primarily referring to the safety of white cisgender women.
But even this relatively privileged group of users is not completely safe. And I argue that we should expand our definition of safety. A current public health definition of safety defines it as a state in which hazards and conditions leading to physical, psychological, and material harm are controlled in order to preserve the health and well-being of individuals and the community. To attain safety, individuals in society must cultivate equity in protecting human rights and freedoms, prevent consequences caused by harm, and respect the physical and psychological integrity of all. Marginalized daters, whether queer, trans, or people of color, and especially those at any intersection of these experiences, cannot be assured safety on dating platforms if companies are not willing to make provisions for that safety. Spaces where sexual racism flourishes are fundamentally unsafe for people of color because they are routinely exposed to conditions that lead to psychological and physical harm. The problem is that dating companies believe their products facilitate a mostly safe experience for the normative white user. Because dating platforms do not name sexual racism as a communal threat to safety, they shape the belief that it is normal. Users continue to see sexual racism as an expected, trivial part of the online dating experience. Hence the first way that dating platforms help to automate sexual racism is through meaning-making via inaction. Experiences of fetishization can produce and overlap with sexual harassment in ways that fall outside of community guidelines, and I question the degree to which these policies are intended to protect users and the extent to which they allow unsafe behavior to flourish. The second form of automating sexual racism speaks to the technological sense of the word automation. Many dating platforms use algorithmic systems to reinforce their safety efforts, which still largely fail to offer adequate protection for marginalized users.
By using automated approaches to safety without accounting for the potential for racial and gendered bias, dating platforms literally automate sexual racism. By focusing on some aspects of safety while failing to consider how their platforms facilitate sexual racism, online dating companies reinforce for users what is acceptable behavior inside and outside of dating platforms. And I bring in Ruha Benjamin here, because she talks a lot about this in her work as well. The animating force of the New Jim Code is that tech designers encode judgments into technical systems but claim that the racist results of their designs are entirely exterior to the encoding process. Racism thus becomes doubly magnified and buried under layers of digital denial. There are many tech insiders hiding behind the language of free speech, allowing racist and sexist harassment to run rampant in the digital public sphere. For this reason, we should consider how private industry choices are in fact public policy decisions. They are animated by political values influenced strongly by libertarianism, which extols individual autonomy and corporate freedom from government regulation. Let's pause for a moment to talk about the free speech value. I'm not a free speech expert, but I do know that tech companies are not the government; they are private corporations, and they don't have to abide by free speech. And we know this because in 2020, in the midst of the resurgence of the Movement for Black Lives, Tinder started censoring members with BLM and ACAB, all cops are bastards, in their profiles. Some users were able to have their accounts reinstated; others were not. They further struggled to balance appeasing users with more conservative beliefs against honoring the neoliberal individualism prized by the tech world and support for the Movement for Black Lives. This tension between safety and freedom raises a valuable question.
If individuals can be reported and removed for language that poses an affront to whiteness, could similar low-to-no-tolerance policies be adopted for language that incites racial harm and racially fetishizes or harasses users of color, which would result in more relative safety for those users? Certainly the answer is yes, if the weight of white privilege does not outweigh the company's desire to provide a safe experience for everyone. Right now, platforms put way too much onus on the user to maintain their own safety without taking accountability for system failures like the ones that Brandy experienced, like being harassed across multiple platforms, even though companies do have the ability to track data across platforms. Hence, I argue that we need this enhanced definition of safety, safety for all, with an anti-racist lens, which means making specific plans for users likely to experience sexual racism, including racial fetishization and race-based harassment. Dating companies claim to be working on better safety reporting mechanisms for users, but these plans should include methods for monitoring and reporting experiences of sexual racism as well. In my three years of consulting with the tech industry, I've met a lot of people who were truly invested in equitable and just futures, but it is often difficult for academics to translate our work into actionable buy-in in the tech industry. After talking with insiders at HER, Grindr, and OkCupid, and others who didn't want to be named, I have a good idea about what's doable, and I have some more far-off dreams that I would like, or that I think would take serious commitment to justice to enact. But let's start with what they can do with relatively little effort. We need accessible transparency, and this means explaining what happens when you upload your data to Tinder. Make it easy for people to download their own data. Right now, when you download your data from Tinder, it's a long process.
I did this myself, which is where that number came from earlier, the 2,719 matches. You have to log in on a desktop; you can't do it from your phone. You have to request your data. It takes them about 72 hours to get back to you. Once they've gotten back to you, you only have 24 hours to download your data, and if you don't, you have to start the process over again. Some of that might be for safety reasons, I would like to think; that's my most generous interpretation. But it also shouldn't take this many steps for you to get access to your own data. And I would encourage everyone to do it, because when you see the kind of data that they have on you, it might make you think twice about the trade-offs that we're making. When I did this in 2019, I was able to see my geolocation data, all of the interest groups that I had linked from my Facebook, every picture I had uploaded, and every single conversation with a match that I had. They had redacted everyone's names and what they had said, but every line that I had written was there for every single match. So some homework for you all: if you're using the apps, try to access your own data. The second is expanded safety protocols for reporting racism, sexual racism, racial fetishization, and transphobia, reporting that is backed up by humans who are trained in cultural context and cultural competency. We know that this can be done, because some orgs have already enacted these kinds of protocols, training folks in the context in which they are detecting toxicity and then following up accordingly. And the third is accountability and open dialogue: reporting to users with legitimate feedback mechanisms, meaning saying, we're thinking about doing this with your data, or we're thinking about adding this new feature, and allowing users to say, I like this, or I don't like this, or I would not like this experience to be added to my platform, right? So these are just a start.
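If you do that homework and get your export, a few lines of code can summarize what's in it. The JSON layout assumed below, a top-level "matches" list where each match carries a list of messages, is a guess for illustration only; inspect the file you actually receive, since export formats change.

```python
# Hypothetical summary of a dating-app data export. The file layout
# assumed here is illustrative only; check your real export's structure.
import json

def summarize_export(path):
    """Count matches and sent messages in an exported JSON file."""
    with open(path) as f:
        data = json.load(f)
    matches = data.get("matches", [])
    return {
        "total_matches": len(matches),
        "messages_sent": sum(len(m.get("messages", [])) for m in matches),
    }
```

Even a crude tally like this makes the scale of what the platform holds on you concrete, which is the point of the transparency recommendation above.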
There are a lot more that I'm working on and will continue to work on as I've been touring with this book and getting feedback from folks. But people always ask, so I like to put these out there just to get us started. Should we, the users, break up with dating apps? That's the other question that I often get. I would say that the answer is no, but we should think about what they reflect back to us. We should think about where our personal preferences come from and ask if they're really neutral or if they're shaped by our larger culture, a culture which at times can be toxic, but can also be beautiful. We have in the past come together to get shit done, and we can do that again. I think it's going to look a little different. This might look like supporting regulation of tech companies instead of shying away from it. And for those of us with greater privilege, this means thinking critically about the features that we're paying for in our apps when we use them. What motivates us to pay for these features? Are we trying to filter out an entire group of people because of the stereotypes that we have about them? If that's the motivation, maybe take a few steps back from that. Ask yourself where that desire is rooted if you are saying "I'm not interested" simply because of someone's race. Are those stereotypes true? Or is it something that you've heard from your parents, your family, or in the media growing up? And I'll leave you on a high note. In case you were still thinking about your future with dating apps, and I hope that you are, I have to say I do think that they're worth fixing. When I first was writing this book, I probably wouldn't have said the same thing. I was sort of in a "fuck these apps" headspace, because they were terrible. Until I met my partner, who I'm now married to, on Tinder in 2021. We got married this fall, and Kendra was actually there. So keeping it in the BKC fam, so to speak. 
But I will say that I think they're worth fixing, because they do allow us to find communities in ways that we didn't have access to before. The question is really just: how are we going to do that? And are these companies willing to dialogue with us in a way that is equitable and representative of the things that we need? Thank you very much. Slightly awkward, not that awkward, transition into our seated portion of this afternoon's events. Thank you so much for the talk. Thank you. I have a billion questions, including the ones I actually prepared and we talked about, and then I have other questions that I want to ask. But I'm going to go off script for a second at the beginning, and then I'll turn to some of the questions we talked about and some of the ones from the audience. You mentioned that, when we think about the harms that folks face in online dating, folks at the intersection of multiple axes of marginalization are especially likely to experience those harms. And I'd be curious to hear you talk a little bit about queer and trans users of color in particular in your studies, and what you saw about their experiences on the dating apps. Also because, to your point, stereotypes around attraction vary widely based on the communities that you're in, and so the kinds of stereotypes we know about from, you know, gay male spaces are very different than those that operate in straight spaces or lesbian spaces. So yeah, I'd be curious about folks at that particular intersection, what you found. Yeah, absolutely. I talked to a few folks who were all over the sort of trans spectrum. And one story in particular that jumps out at me is someone who identifies as white. 
And they told a story that was kind of similar to Brandy's, in that the person was harassing them across multiple spaces. I don't remember which app they reported to, but they did report. And most of our conversation was about how defeating they felt that process to be. They were talking to an actual person, but they felt like that person just didn't care, kind of gave them a "so what" attitude, like, "what do you want me to do about this?" And that really highlights this other factor, which is that if people make a report, or attempt to, and they feel as though nobody cares, then they're not going to use those reporting mechanisms. So companies can say, "Oh, we have all of these mechanisms available for people to use." But if people don't actually feel empowered to use them, or don't feel as though there's going to be any follow-up, then that's not actually a real incentive; it's not actually useful to have that feature. Yeah, it kind of reminds me of the phrase "the standard you walk past is the standard you accept," right? And then a point you make in the talk, and I think in the book really persuasively, is this idea of platforms not acting on things like stereotypes, and people being kind of gaslit about whether they're actually experiencing harm, because they report it and, oh, well, no action is taken, right? And how that in fact is its own very specific additional form of sexual racism that folks are experiencing, on top of the harm from other users. Right, exactly. There's also the harm from the sort of gaslighting from the apps. Yeah. All right, to transition to one of the questions that we actually talked about: I want to talk about your book title, because I know you went through a couple of different versions. And I'd be curious why you settled on Not My Type. 
What else were you kind of thinking about? Because I think it does speak to some of the core essential claims that you're making, but I also think there are so many different directions you could go with it. Sure. Some of you in the room might remember that in the early days I was calling this work Call Me Master. And it's because, while I was living here in Boston, I matched with some guy who called while I was with my friend, I think on FaceTime, but we were in the car, and I'm terrible with names. He kind of just cold-called me; I wasn't really expecting him to call. So I said, "What's your name? I'm so sorry, I wasn't expecting you to call." He said, "I'll put it this way. You can just call me master." And I was like, what is that? I was very shocked, and my friend was really quick to respond. She just snatched the phone out of my hand and was like, "Is that a race joke? What are you doing?" And he was like, "No, no, it wasn't a race joke. It was just a bad joke." And I was like, hmm, I don't know about that. I don't know what that means, dude. So when I got back home, I followed up with him and was like, "Was that a race joke? Would you be honest and just tell me?" And he was not ever going to just be honest and tell me. And we never talked again after that. So I was like, hmm, it's pretty clear that that was some kind of something that you had going on. So I wanted to call it that. Then I got this review back from one of the reviewers early on that was like, "I'm not even comfortable saying this title. I don't know if I could promote this book and talk about it in a way that I feel proud of. The work is really good, but maybe consider changing your title." So we went through a few iterations in our, like, family group chat, and I think this was the best one. 
I still like Swipe White, I've got to admit. Yeah. We also were thinking about Swipe White, but I feel like it's a tongue twister; it's harder for me to say for whatever reason. So I was like, maybe not. I mean, it is your book, so, you know, it's fair that you picked the title, I suppose. Thank you. Thank you. So yeah, I ended up going with Not My Type. And I feel like it really speaks to so many of the interviews that were done, especially with sort of white-presenting audiences, who would repeatedly say things like, "Oh, I like Black people, they're just not my type," or, of some other group, "they're just not my type." I heard it so often that I was like, hmm, this is really interesting. And we hear it so much in popular culture too, right? People often talk about what their type is. And I like to push back on that a little bit, because what even is your type? Who even knows? I think often we think that we have this sort of set box, and if we're willing to look beyond that, we might actually connect with people who aren't our type but with whom we just work. Maybe just for a little while, maybe indefinitely, who knows, but I think there's something to be said for thinking through and deconstructing what your type is. I love that. So I'm going to ask one more question and then I'll open it up to the room. You talked a bit about concrete goals for dating apps, right? You were like, okay, here's some specific stuff. But then you also talked about your bigger, longer-term, justice-oriented dreams. And I always want to hear about bigger, longer-term, justice-oriented dreams, so obviously I'm going to ask about that. Tell me, what are the far-off dreams, like a big thing that dating apps could do? Yeah. So I've pitched this to a few people, and I think it's doable. I feel like it's just a resources thing. 
But when I did my interviews, you know, I talked to a hundred folks, and I would ask people: would you rather know that the person you're talking to is racist, and see them, or would you rather just have those people removed from your swipe deck altogether? And it was split 50-50. I was really shocked, because I am definitely of the persuasion of, no, get them out of here; if they're racist, I don't want to see them. But some people were sort of like, no, I want to know that they exist, so they can't trick me, right? And so one of the things that I would like to suggest is a safety toggle, an on-and-off filter, where, if you want the standard, regular Tinder experience, for example, you just use it. But if you want a safer experience, maybe without folks with, like, homophobia in their profile, or weird fetishizing preferences, or folks who are using race filters in a way that is, I would say, not great, and I'm going to come back to that in a second, you could toggle on a safe mode where you don't have to deal with those people. Just like, great, I want some peace today. Maybe other days you don't choose peace, but maybe sometimes you need a break. I don't know. I think that would be really doable. It seems not that hard. And if you're watching, you should do it, but also don't take my idea for free, please. But I wanted to talk about these race features for just a second, because I do think that it makes sense for people to use them if they are folks of color and they're using them for their protection. Right? So often what will happen is people get really fatigued from having these experiences where they're encountering, like, the guy who's like, "Oh, I've always wanted to date a spicy Latina." And in those cases, it makes sense to me to say, "I'm going to filter out all those people, because I keep having this experience of racism." 
So I do want to honor that it's okay to use those if you are using them as that sort of protective measure. And I would say for the rest of us: if we're using a race filter, question why we're using a race filter. I mean, I think, you know, back in the olden days of online dating, it reminds me of the conversation about OkCupid, where there was a filter that's like, "I don't want to see or be seen by straight people." Yeah. Right. And that, you know, is a way in which you can actually turn people's preferences and the algorithm, to the extent that there's one algorithm, there isn't, right, but the algorithmic tools, towards the advantage and safety of users who, based on your book and so much other evidence, are having these kinds of experiences. And I think it speaks to the way in which, and I think you've made this argument so conclusively, it doesn't have to be this way algorithmically; this is not the only set of algorithmic tools you could build. That this is the set of choices that was made is, to Benjamin's point and to your point, right, a very explicit decision to prioritize some daters over others, and the comfort of daters who don't want to think about the question of whether they're being racist over the actual lived experience of daters who don't get the choice not to think about it. Yeah. Yeah. I've got more, but we have an audience and they probably have lots of questions. So yeah, I'd love to hear from folks in the room first, and then we'll go to one of the questions from online if we have time. Thank you for this, April. This is so enlightening, and I love it. I love the combination of tech and cultural anthropology. 
I have just two quick, concise questions. One is: it seems that pornography is different from this, because it gets rid of the niceties and the conventions and just goes straight for the jugular; it's like, "this is what we want, and we're not even going to make any bones about it." So the question is: how does pornography, as a coding matter but also as a sociological matter, differ from dating apps? That's the first question. The second quick question has to do with class, and the way that we fetishize class and education and all these things, and the way that some people want to date only within their class and don't care about race so much. So how does class figure into your research, and did you find anything at all? I mean, of course there are always these nuances around class, where people have ideas about how we perform masculinity and femininity, which are deeply tied to class notions, right? Like, depending on where you grew up, in the South or the West or wherever, is going to shape what you think a woman or a femme or a masculine person is supposed to look like. What kind of energy they're supposed to exude is very much tied up with class, and our ability to perform these different, I would say, cultural conceptions of gender is tied up with our literal capital as well as our cultural capital, right? So yes, absolutely, there is a class aspect to it. And I think people often would say, "I'm open to dating anybody, but my parents aren't." And it would be really hard for a partner of mine to come home to that kind of racism or that kind of class scrutiny. In those cases, people have to talk to their partner and figure out: is it worth it for you to come home like this? Do you want to be in a space that is going to be toxic or harmful for you, or not emotionally safe? And I forgot your first question. Pornography, how it's different. Thank you. 
I mean, I would say that in porn, the performers, at least sometimes, are acting with consent, whereas in dating, if you are sexualizing someone without their consent, that is the key distinction. Right? There are Black dominatrixes, financial dominatrixes, who like to do the race-based power play, and they get off on it, and that's empowering for them. And that's totally fine, because they're doing it in a consensual way, right? So if you want to do, like, slave role play with someone in a consensual way, that's not my bag; for some people it totally is, and they should get the bag. I'm here for them getting the bag. But they're doing that with consent, right? It's not that someone is being preyed upon or being targeted without their knowledge of that situation happening. Hi, thank you so much for this talk. I studied econ in undergrad, so I have, like, boring quantitative questions, but most of mine are related to what kinds of specific variables you controlled for when you ran the quantitative research you did. And then, also, I'm really curious: you've talked a lot about Tinder, and those of us who are currently using dating apps know that, at least I would believe, Tinder is a specific environment versus other types of apps that are also owned by Match Group and use their algorithms. So how would you say the different apps compare to one another? Yeah, so I'll start by saying that Match Group has 30-plus companies in their portfolio, right? So they have a near monopoly. And typically what we see with monopoly-type organizations is that once you've created the technology, or you've set the standard, the rest of the folks in that industry follow the standard that you've set. So it's likely that the algorithms are being shared; this is sort of just the way society works, that we do a lot of the sharing and mapping. So it's the same kind of tech. 
I know that Tinder has this reputation for being a hookup app, but I think that's more in cities, to be honest. In smaller places, Tinder is not only a hookup app. Like, where I am in Ann Arbor, it's not tiny, but it's not Boston; it doesn't have the same kind of reputation. It's a little bit more fluid. And I think people often use apps for things other than what their reputation is. I know people who meet on literal hookup apps, like, that's their moniker, and end up partnering with those folks, right? So I think we have to shy away from this idea that they are somehow set for a specific kind of experience. In terms of the quantitative stuff: I am just sort of the qualitative sociologist on the team. I am not great with the quant, but you should read the paper, because we do talk about the control factors that we have in the paper there. Thank you. Why don't we do one from the internet? Okay, so: do you believe that the trend of users of color migrating to ethnicity-specific dating platforms could help mitigate the concerns you raised? Additionally, do you think that this shift could potentially diminish their influence on the policies and features of larger or more general dating platforms, such as Hinge? Yeah, Hinge. Hinge is not doing great right now, by or for the people. I have talked to so many Black women who talk about being in "rose jail," where they just are not having an equitable experience. And there's actually a TikToker right now who has deleted Hinge, re-uploaded her exact same data but put that she's white, and is having a completely different experience than when she put that her race was Black. So I can totally see why people would be moving away from these mainstream apps, if that's the kind of experience they're having. At the same time, those apps have sort of always existed, and the kind of people who want to use those apps are going to use them. 
But the folks who want maybe a more curated experience, and also still want to be open to everybody else, are still going to be using the mainstream apps. So I don't think there's going to be a mass exodus. I hope that there will be a reckoning, of sort of: users want to be informed about what you're doing. It's not that we want to leave the platform. People love their dating apps; people are not going to get rid of them. We're not going to see the resurgence of just meeting your partner organically, like at the grocery store. Like, it does happen, but I don't think that's the primary way. We really just want to know what's happening. And I think my overall goal is for people to know how the algorithms are moving the data, right, and for companies to just be more transparent and more open about how users might have more autonomy over what they're being shown and also what they're presenting to other people. I think that's a great note to end on. I will avoid an organic-groceries-versus-conventional-grocery-store pun, which is probably best for everybody. Join me in thanking April for this incredible talk. You can find her on Twitter at April W, and I hear tell she might get on Bluesky, but maybe this is just me saying this on a recording in order to put pressure on her, so don't take me seriously. And her book is available for sale online, but also here physically for folks in the room. And if you ask nicely, Professor Williams might even sign it for you. So join me in thanking her for her incredible talk. Thank you. Perfect.