Lisa Martin: Hey everyone, welcome to theCUBE's coverage of Women in Data Science 2022. I'm Lisa Martin, excited to be coming to you live from Stanford University at the Arrillaga Alumni Center. I'm pleased to welcome, fresh from the keynote stage, Alex Hanna, the director of research at the DAIR Institute. Alex, it's great to have you on the program.

Alex Hanna: Yeah, lovely to be here.

Lisa Martin: Talk to me a little bit about yourself. I know your background is in sociology, and we were talking before we went live about your hobbies and your love of roller derby. Talk to me about your background and about the DAIR Institute, the Distributed AI Research Institute, and what it's actually doing.

Alex Hanna: Sure, absolutely. Happy to be here, talking to the Women in Data Science community. My background is in sociology, but also in computer science and machine learning. My dissertation work focused on developing machine learning and natural language processing tools for analyzing protest event data, generating that data and applying it to pertinent questions within social movement scholarship. After that, I was faculty at the University of Toronto and then a research scientist at Google on the ethical AI team, where I met Dr. Timnit Gebru, who's the founder of DAIR. DAIR is a nonprofit research institute oriented around independent, community-based AI work. A lot of the discussion around AI is driven by big companies, or by companies focused on solutions that are very much oriented around collecting as much data as they can, without really knowing whether it will be for community benefit. At DAIR, we want to flip that. We want to prioritize what it would mean if communities had input into data-driven technologies, what it would mean for those communities, and how we can help there.

Lisa Martin: Double-click into some of your research. Where do your passions lie?
Alex Hanna: So I'm a sociologist, and I think one of the big insights of sociology is to really highlight how society can be more just: how we can interrogate inequality, and how we can reduce the distance between people who are underserved and people who are over-served, who already have quite a lot. Finding out where that lies, especially in technology, is really what I'm passionate about. It's not just the technology, which I think can be helpful; it's really understanding what it means to reduce those gaps and make the world more just.

Lisa Martin: And that is so important. As more and more data is generated, growing exponentially, so are some of the biases and the challenges that that causes. You just gave your tech vision talk, which I had a chance to see most of, and you were talking about something very interesting: the biases in facial recognition software. Tell me a little bit about what you talked about and why that is such a challenge. And also, what are some of the steps being made in the right direction where that's concerned?

Alex Hanna: Yeah, so the work I was highlighting in the talk was not my own, but the work by Drs. Joy Buolamwini and Timnit Gebru, focusing on the disparities and the biases that exist in facial recognition as a technical system. The fact remains also that facial recognition is disproportionately deployed on marginalized populations. In the US, that means Black and brown communities; that's where facial recognition is used disproportionately. And we also see this in refugee contexts, where people fleeing a country will have facial recognition software used on them, surveilling them. These are people already in a really precarious place. Some of the movements there have been to de-bias some of the facial recognition tools. I actually don't think that goes far enough.
I'm fundamentally against facial recognition. I don't think it should be used as a technology, because it is used so pervasively in surveillance and policing. And if we're going to approach that, we really need to rethink our models of security, our models of immigration, and so on.

Lisa Martin: Right, it's such an important topic to discuss, because I think there needs to be more awareness of some of the biases, but also, to your point, of some of those vulnerable communities that are really potentially being harmed by technologies like that. There's a fine line. Or maybe it's not so fine.

Alex Hanna: I don't think it's that fine. I think it's used in an incredibly harsh way. For instance, I'm a transgender woman, and there's research being done by researchers who collected datasets of videos that people had posted on YouTube documenting their transitions. There was a researcher collecting those data and saying, well, we could have terrorists or something take hormones and cross borders. And if you talk to any trans person, you're like, well, that's not how it works, first off. Second off, it's already viewing trans people and the trans body as a mode of deception. Whereas other researchers in this space were collecting those data and saying, well, we should collect these data to help make these facial recognition systems more fair. But that's not fair if it's going to be used on a population that's already intensely surveilled and held in suspicion.

Lisa Martin: Right, that question of fairness is huge.

Alex Hanna: Absolutely.

Lisa Martin: Were you always interested in tech? You've talked about your background in sociology. Were you a STEM kid from the time you were little? Talk to me about your background and how you got to where you are now.

Alex Hanna: Yeah, I've been using computers since I was four. I was taking apart, you know, my parents' Gateway computer.
Lisa Martin: Oh, you're one of those kids. I love that.

Alex Hanna: Yeah, when I was 10, I was going to computer shows, slapping hard drives into things, seeing how much we could upgrade a computer on our own, and, you know, ruining more than one computer, to my parents' chagrin. But I've always been that way. In undergrad I triple majored in computer science, math, and sociology. Originally it was just computer science, and then, out of the other two, I found I was really interested in this intersection of tech and society. And the more I sat within the field, and went and did my graduate work in sociology and other social sciences, the more I found there was a place to interrogate that intersection of the two.

Lisa Martin: Exactly. What are some of the things that excite you now about where technology is going? What are some of the positives that you see? We talk so much about the negatives.

Alex Hanna: I think some of the things that are positive are really the community-driven initiatives that are saying, well, what can we do to remake this in a way that is going to be more positive for our community? Seeing projects that try to put community control over certain kinds of AI models, or really try to tie together different kinds of fields, I mean, that's exciting. And right now we're seeing a lot of people who are super politically and justice literate, and they know what's behind all of these data-driven technologies, and they can really try to flip the script and understand what it would mean to turn this into something that empowers us, instead of something that is becoming centralized in a few companies.

Lisa Martin: Right, we need to be empowered with that, for sure. How did you get involved with WIDS?
Alex Hanna: So Margot, one of the co-directors, and I sit on a board together, the board of the Human Rights Data Analysis Group. I've been a huge fan of HRDAG for a really long time, because HRDAG is probably one of the first projects I've seen that's really focused on using data for accountability and for justice. Their methodology has been called on to hold perpetrators of genocide to account, to hold perpetrators of state violence to account, and I always thought that was really admirable. So being on their board is sort of a dream, not that they're actually coming to me for advice. So I met Margot, and she said, come on down and let's do a thing for WIDS, and I happily obliged.

Lisa Martin: Is this your first WIDS?

Alex Hanna: This is my very first WIDS.

Lisa Martin: Oh, excellent. What's your impression so far?

Alex Hanna: I'm having a great time. I'm learning a lot, meeting a lot of great people, and I think it's great to bring folks from all levels here, not only people who are super senior. They're not the ones who are going to get the most out of it; it's going to be the high school students, the undergrads, the grad students, folks who, you know, are never too old to be mentored. So, finding your own mentors too.

Lisa Martin: It's so great to see the young faces here, and the mature faces as well. One of the things that I caught in the panel this morning was the talk about mentors versus sponsors. I actually didn't know the difference until a few years ago, at another women in tech event, and I thought it was such great advice for those panelists to be talking to the audience about the importance of mentors, but also about the difference between a mentor and a sponsor. Who are some of your mentors?

Alex Hanna: Yeah, great question. It's going to sound cheesy, but my boss, Timnit Gebru. She's been a huge mentor for me, and without her and another mentor, Meg Mitchell, you know, I wouldn't have been a research scientist.
I was the first social scientist on the research scientist ladder at Google before I left, and if it wasn't for them... they did sponsor me, but they also mentored me, you know, greatly. My PhD advisor, Pam Oliver, was a huge mentor as well. And lots of peer mentors, people who are at about the same stage as me academically and professionally, but are mentors. Folks like Anna Lauren Hoffmann, who's at the University of Washington; she's a great inspiration and collaborator and co-conspirator.

Lisa Martin: Co-conspirator, I like that. I'm sure you have quite a few mentees as well.

Alex Hanna: Yeah.

Lisa Martin: Talk to me a little bit about that. What excites you about being a mentor?

Alex Hanna: Yeah, I have a lot of mentees, either informally or formally, and I sought that out purposefully. I think one of the speakers on the panel this morning was saying, you know, if you can mentor, do it. And that's what I did. It excites me because, you know, I don't have all the answers; no one person does. You only get to those places if you have a large community. And being smart is often something that people think just comes to you, like there's a smart gene or whatever. Maybe there is; I'm not a biologist or a cognitive scientist. But what really takes cultivation is being kind, really advocating for other people, and building solidarity. That's what mentorship really means to me: building that solidarity and really trying to lift other people up. I'm only here, and where I'm at in my career, because many, many people were mentors and sponsors to me, and it's only right to pay that forward.

Lisa Martin: I love that, paying it forward. That's so true. There's nothing like a good community, right? There's so much opportunity that that groundswell generates, which is what I love. Tomorrow is International Women's Day.
And if we look at the numbers, women are 50% of the workforce, but hold less than a quarter of STEM positions. What's your advice and recommendation for those young girls who might be intimidated, or might be being told, even to this day, no, you can't do physics, you can't do computer science? What do you tell them?

Alex Hanna: Yeah, I mean, individual solutions to that are putting a band-aid on a very big wound. I think it's about finding other people and working to change it. Building structures of solidarity and care is really the only way we'll get out of that.

Lisa Martin: I agree. Well, Alex, it's been great to have you on the program. Thank you for coming and sharing what you're doing at DAIR, the intersection of sociology and technology. It was fascinating. And your roller derby, we'll have to talk more about that.

Alex Hanna: Yeah, for sure.

Lisa Martin: Excellent. Thanks for joining me.

Alex Hanna: Yeah, thank you, Lisa.

Lisa Martin: For Alex Hanna, I'm Lisa Martin. You're watching theCUBE's live coverage of the Women in Data Science Worldwide Conference 2022. Stick around, my next guest is coming right up.