It's my pleasure to introduce Kelly Wagman, who came to CMS from Brown University, where she majored in computer science and economics, and who after CMS is headed to the University of Chicago to work on a PhD in computer science. In between she worked for several years as a software engineer at Microsoft, and after coming here to MIT she has been an RA in both the Global Media Technologies and Cultures Lab and the Open Documentary Lab. Three things to add. First, Kelly is incredibly productive: she has authored or co-authored four peer-reviewed papers, and several more are in the pipeline. Second, our badly missed colleague Lisa Parks is also advising Kelly, and Kelly and I are both grateful to her for her continued engagement. And on a personal note, week in and week out our meetings have proven to be the high point of my calendar. I've learned a ton from working with Kelly. I'm grateful for her intervention, which we'll hear about in a second, and for her inspiration. She gives me confidence in the future. Kelly.

Thank you so much. What an intro. Okay, well, I guess I will hop in. Let me share my screen. If you can see it, someone just give a thumbs up. Yep. So, like I said, my talk is titled "Sex, Power, and Technology, or a Relational Engineering Ethos as Feminist Utopia."

William covered this a little bit, but I wanted to say that I am coming from a particular vantage point that has really informed my research: I identify as a white cisgender woman, I grew up in the US, and I spent several years as a software engineer at big US tech companies. There are a lot of different ways to think about the tech industry. A lot of people think about the image at the top, a hacker alone in a basement. Before I got to college I really thought it was more like the middle image, that I was going to be building these kind of magical things. And when I got there I realized it was actually more like the image on the bottom, where a bunch of guys sit around fixing bugs. It was a lot like the TV show Silicon Valley, which actually felt pretty accurate. And this is me with some of my teammates from a few years ago.

This project really came out of needing language to explain the experience I had of being one of the only women in my context, and also the mismatch between what I felt the tech industry could be and what I actually experienced while I was there. In tech, a lot of people ask what we can build, but it seems less common to ask what we should build, and this is the question that really underlies a lot of my research. The word "should" brings with it moral and political questions that are hard to answer from within an engineering framework.

So before looking at what we should build, I wanted to start by trying to understand what we are building. Silicon Valley, and the tech cultures influenced by Silicon Valley, have a particular way of thinking about the world. This has sometimes been called the Californian ideology, a term coined in 1995 by Barbrook and Cameron, and there are a number of good books on this topic; here are just a few of them. Some of the main ideas behind the Californian ideology are, first, that it has a unique history, born from a convergence of cultures in the second half of the 20th century, including hippie counterculture, libertarian ideology, government funding stemming from World War II, and the idea of cybernetics.
There was also a turn from computing as women's work (the original "computers" were mostly women) to computing as men's work, because the labor was originally deemed menial and only later came to be thought of as something difficult and complicated. Nathan Ensmenger talks about the idea of extreme masculinity in the culture, for example staying up all night to program something as a kind of show of strength. The culture is very much built around capitalism: things like venture capital funding provide incentives for startups to maximize profit and scale. There's an underlying logic of rationalism, where many products emphasize optimization, efficiency, and productivity. And there are also strands of neoliberalism that emphasize the individual, and apps oriented toward self-actualization and self-improvement.

There's been a lot of good work on why the Californian ideology is flawed; I think Roya's presentation did a good job covering some of this. But I'm not really interested in critiquing the existing state of the world; I'm more interested in trying to understand what a different culture, a different ideology around creating technology, would look like. Before diving into that, I want to talk about a few groups that are already trying to rethink some of these ways of thinking. I found that the groups doing this are doing really important work that I support, but often it's not quite what designers need when they're looking for a vision of what else they could be building. There are diversity and inclusion organizations, which tend to focus on improving representation within the existing culture but are not typically working to actually change the culture. Activists use labor tactics to protest unethical technology and poor working conditions, particularly for contract workers; however, they often don't provide a vision of what to build, it's usually "here is what we should not be building." Some art pieces are wonderful at critiquing the existing state of the world and provide a lot of valuable insight, but this insight is often hard for people in engineering to understand and needs to be translated into something more generative. Other art pieces actively build up ideas for new worlds using technology, but I think they can be dismissed by people in engineering because they are seen as one-offs, not applicable to product development. And participatory design, co-design, and other methodologies I have found really inspirational, and they have really influenced my work, but the methodology alone, despite providing a way for communities to build things together, doesn't quite outline a vision of the thing that should be built.

So I'm proposing something that I call a relational engineering ethos. A simplified way of explaining the difference between it and the Californian ideology is this: I think of someone in the Californian ideology context saying, "I'm going to change the world for the better and get rich by making a highly scalable startup that optimizes something." Under a relational engineering ethos, we instead think, "I can have lasting impact and develop lifelong friendships by considering the relationships formed throughout the entire lifecycle of my technology and prioritizing caring relations over profit and scale." I'm going to unpack that a little bit. One of the things that's important to start with is the idea of a feminist utopia.
Traditionally, utopias run into problems because they're usually static visions defined by a singular expert, which can lead to totalitarianism. Feminist utopias, in contrast, allow for multiple visions and are more process-oriented and inclusive of many voices. The other important concept here is the idea of a sociotechnical system. Often in the tech industry people think about a product as a singular thing: how do we build it and what is the user experience. For example, with the Amazon Echo, you can think about how we program it and how people are going to use it in their homes. But as Crawford and Joler show in this image, you can actually think about a technology as a web of different social and technical systems: on the left, what minerals have to be mined to create the object; then, what data processing and labeling goes into creating the machine learning algorithms; and finally, once the thing is disposed of, how it is going to break down and how that will affect the environment. So we're shifting from thinking about the object as a singular artifact to thinking about it as this web embedded in a social system.

To come back to the idea of relational engineering, I'm proposing it as a feminist utopia because it isn't a singular vision but more a way of framing the problem of building new technology, and I'm using the idea of a sociotechnical system as a framework. The ideas behind relational engineering aren't new; there are many books that talk about this style of thinking. The goal for me was to understand how to wrap up these ideas in a way that makes sense to people in the design world, people who probably aren't going to go read Haraway on their own. A few of the important concepts are: emphasizing building caring relationships between humans and non-humans, as opposed to building neutral artifacts; seeing interaction as situated within these messy sociotechnical systems, as opposed to a problem to be solved or optimized; considering technology as co-creating experience, so avoiding human domination over the technical artifact; and holding space for new technologies that are perhaps inefficient, meandering, sensual, not necessarily optimized. There's a lot of feminist literature on the concepts of care, love, and joy, which I'm not going to go into in depth right now.

Okay, so let's look at a couple of case studies that highlight this way of thinking. If we think back to that map of the sociotechnical system, you can see that there are relationships at many different scales, and we can zoom in on just one of them. Here I'm zooming in on the relationships between teammates in an engineering group. I wanted to understand what these relationships could look like in a culture that was not part of the dominant Silicon Valley culture. So I did participant observation at a self-described feminist data science lab, where I attended weekly meetings, and I also interviewed nine women and nonbinary engineers from a range of universities and jobs to try to get a sense of what this alternative culture could look like. The first thing I want to talk about is how my participants described feeling within the dominant culture.
The first is a graduate student who said: "I think a stereotypical technical university student is kind of a nerd, but it's also someone who has a lot of privilege but isn't very good at dealing with that privilege. After graduating I would make the most money working in tech, but that kind of industry culture wouldn't make me happy anymore." The second is an undergraduate student I talked with who was describing a class: "My partner was a dude with a fancy terminal with customized colors, and he's telling me to type things. I'm just like, sorry, can you say that again, I probably made a mistake, and he got really impatient and I got stressed out. It made me feel very on edge, like maybe I didn't have enough experience. But then later in class, we're working on the actual problem, and he didn't know what was going on because he didn't do the reading. It's made me think we try a lot harder to feel like we belong, and we still feel like we don't, even though we're more than qualified." The important thing here is that both students describe feeling like they didn't fit into a culture, while also making clear that they have the technical skills. So this is not a story of people not having the right skills; it's a story of people feeling like they don't fit into a particular culture.

Another thing I heard frequently in my fieldwork was this dichotomy of technical versus non-technical, where "technical" meant pertaining to math or code and "non-technical" was kind of everything else. The technical within this culture is very much masculinized and valorized. I noticed a phenomenon I started calling "technical peacocking," where my participants described people, usually men, who would show off their technical skills in a group as a display of mastery and an attempt to be at the top of this hierarchy. My participants were not interested in engaging in this kind of behavior. One said: "Guys are typically way more technically oriented. I've seen that on this project: women are like, oh, I'm not that technical, so I can work on outreach, slide templates, or graphics." She goes on to talk about someone who's really excited about the mechanics of sand, and she realizes that that is really valued in this culture, but she herself doesn't quite get why someone would want to be so excited about the mechanics of sand. So we can see a gendering of different kinds of labor, which are valued differently within the culture.

The point I want to make is that it is not my interviewees who needed to be changed or taught how to code better, but the culture that needed to be fixed. I really like this quote from Stephanie Dinkins, who says that once we get to the table, either we're warriors or we come with all these assumptions already embedded. She means that most marginalized people who survive in the tech industry are forced to become warriors in order to do so. And what I'm arguing is that people shouldn't be forced to do that; we should change the culture.

So how do you do that exactly? There is literature on how so-called feminist groups can actually end up being oppressive, so it's really important to focus not on the word "feminist" but on the practices and norms that I found within this group specifically. These are the things the group did that I thought made the culture really welcoming and inclusive. The practices are relational: the lab really focused on building strong and caring relations among members.
I'm just going to go through them quickly. First, there are micro-affirmations as opposed to micro-aggressions; one example is putting pronouns in Zoom names. Then there's collective decision making and embracing ambiguity: for example, the lab co-created a handbook that explicitly welcomed people from different identity groups, and the lab really rewarded questioning more than being right, so there was no showing off of technical knowledge. Valuing social intelligence and lived experience meant that people could bring their whole selves to the lab. The lab would take time to introduce anyone new, even if it meant reintroducing people who had already been introduced, and it proactively engaged in conversations about social justice, trauma, and current events in order to check in on members and see how to support them. The lab also really emphasized experimentation and joy as important parts of the research process. Whereas in some groups you're totally focused on productivity and producing a certain number of papers or some other quantitative output, this lab did things like producing zines as research outputs, to highlight how members could be involved in ways they were really excited about.

This stands in contrast to some of the experiences people had in the dominant engineering space. I liked this quote from a member of the lab I interviewed, who said: "People do their best work when they feel supported or trusted. Sort of like you need IT infrastructure, you need emotional infrastructure in order to be creative. I think it is very inhibiting to be in a toxic environment and be told you're supposed to be creative when you're always being told that you don't belong in various ways."

I wanted to end this case study with a story that highlights how this kind of relation building in an engineering team not only leads to more diverse and inclusive teams, but also changes the kinds of technologies that get built. I met a sophomore undergraduate at a hackathon I attended, where she was introduced to the idea of data feminism. When I interviewed her, she told me she didn't think she would take any more computer science classes because she had had such a bad experience culture-wise. But a few days after our interview she emailed me to say that she had read the Data Feminism book, that she had reflected on her time at the hackathon, and that she had a couple of ideas. She said the book helped her understand the deeply ingrained power imbalances within our data and technology systems, and she goes on to outline two different tech projects she wants to make based on her interest in raising awareness about these issues. It struck me that here is someone who went from basically giving up on working in tech to saying she wanted to build out two really wonderful projects. And it was so clear that this is not a story of a pipeline problem, where she just needed to get better at coding or be more confident. It was a story of finding an alternative to the dominant way of thinking about technology that she encountered in her classes, and of feeling validated in her ideas. So I think if we want people like her to be leading our tech efforts in the future, we need to create a tech culture that values relationship building over things like technical peacocking.

Okay, I also want to cover one other case study.
Again, think back to that map we saw earlier with all the different sociotechnical relationships. The last case study zoomed in on relationships between teammates, but we can also zoom in on the relationship between the person and the artifact itself, a human-machine relationship. For this I worked with Lisa Parks, and we have actually published it as a paper; you can read the whole paper at this link if you're interested. Our research question was how designers should think about building, essentially, robots. We actually decided that we don't like the term "robot" and use the term "social machine" instead, because the word robot brings with it a lot of baggage and assumptions about what it should be and what it should look like. Ultimately we argue that when designing a social machine you're not really designing an object, you're designing a relationship. We draw on a lot of background from feminist science and technology studies, which I'm not going to go into in depth but which is in the paper. I think this image is helpful in getting a sense of what we're talking about: it's an ad from the 1950s that basically posits these technical home systems as servants. It implies a kind of human domination over these systems, and it not only imagines them as technical systems but actually imagines them as people.

Before getting into how we think designers should approach building this relationship, we looked at how roboticists in particular already think about designing it, and we found four categories: people think about social machines as tools, as humanoid companions, as animals and creatures, and as slaves. We argue that each of these is problematic in its own way. Tools are what Lucy Suchman calls subject objects, meaning that a car can be a tool to get you places, but people often name their cars and talk to their cars. So the tool is at once a subject, because people talk with it and engage with it, and an object, because it's a thing that does something; you can't deny the fact that people treat objects as subjects some of the time. In terms of humanoid machines, Suchman outlines the problem of creating what she calls ideal organisms: people tend to reproduce normative standards and ideals when they create something that looks like something they already know. If you look at humanoid robots, many of them reproduce the beauty standards we see around the world. Animals and creatures can have some of that same problem of becoming an idealized organism, but in addition, we have a long history of dominating animals; there's a fraught human-animal relationship that we don't really want to replicate when we move to machines. And finally, we can think about slaves as essentially tools that have been made to act like people, but that the human has full domination over. Here we argue that allowing this kind of relation of domination to exist, where someone can have full control over a human-like object, is fundamentally flawed and not something we should allow.

Our proposal is that social machines should be approached as what we call agential and equitable others, meaning that we recognize that the social machine has agency, we think about it as existing in an equitable relationship with the human, and we also think of it as an other to humans and animals.
So instead of trying to map it onto one of these existing categories, we just accept that it's a new thing. And we run into two problems when you actually try to build something based on that definition, which we pose as challenges to the design community. The first is non-anthropomorphic configuration: how do you make something that doesn't replicate human ways of being and doing? This is actually hard, because, like I said, we talk to our cars, and I wouldn't really consider a car to be anthropomorphic, so it's an open question whether it's possible to create something that's fully non-anthropomorphic. But we like this example by Kelly Dobson, who made a blender that is voice activated, except instead of talking to it in human speech, she growls at it in blender language, and then the blender responds and turns on. So the human becomes like the machine instead of the machine becoming like the human. The second design challenge is mutuality. We talked about this idea of an equitable relationship between humans and machines, and it's not clear whether that's possible, because humans make machines; can you get rid of that power imbalance? It's not totally clear. This piece by Lauren Lee McCarthy really starts to investigate that: she acts as a digital assistant in someone's home for about a week, so you actually have a human on both sides, the person in the home and her acting out the role of the digital assistant, and they have to navigate this relationship and figure out what is an appropriate way to be around each other. It's hard and it's awkward and it doesn't really solve the problem, but it highlights what a hard problem it is to think about how one would design that relationship.

This next part is not in the paper with Lisa, but I experimented with building my own social machine to try out some of these ideas; I call it Envy. It's non-anthropomorphic in the sense that it is geometric, but it is somewhat anthropomorphic in the sense that it has a kind of expanding and contracting motion, which makes it look like it's breathing. I think it has some agency in the sense that, if you're not clicking on it, it will move around on its own and leave a trail, but the human can also interact with it by clicking and moving it around, so in a sense you and the machine are co-creating the image. And finally, I wanted to show that you can make something that is interesting but not goal-driven. There's no goal, no thing you're trying to solve or achieve; it's simply an interaction that you can have and explore, a kind of ambient way of being with media.

To wrap up, I want to return to those two simplified framings. The Californian ideology says: "I'm going to change the world for the better and get rich by making a highly scalable startup that optimizes something." The relational engineering ethos, in contrast, says: "I can have lasting impact and develop lifelong friendships by considering the relationships formed throughout the entire lifecycle of my technology and prioritizing caring relations over profit and scale." Hopefully that makes a little more sense now that we've gone through the two case studies.
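To give a concrete sense of what I mean by that kind of ambient, goal-free interaction, here is a minimal sketch in Python using pygame of a little shape that breathes, drifts on its own, leaves a trail, and can be dragged around. This is only an illustrative toy under my own assumptions, not Envy's actual code, and every name and parameter in it is made up for the example.

```python
# Illustrative sketch only -- not Envy's actual implementation.
# A small geometric "social machine": it pulses ("breathes"), drifts on its
# own and leaves a trail, and can be dragged by the human. There is no goal.
import math
import random
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()

pos = [320.0, 240.0]     # where the shape currently is
velocity = [0.0, 0.0]    # its autonomous drift
trail = []               # where it has been
dragging = False
t = 0.0

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.MOUSEBUTTONDOWN:
            dragging = True                 # the human takes hold of it
        elif event.type == pygame.MOUSEBUTTONUP:
            dragging = False                # the machine is on its own again
        elif event.type == pygame.MOUSEMOTION and dragging:
            pos = list(event.pos)           # human moves the shape directly

    if not dragging:
        # Gentle random wandering: the machine "decides" where to go next.
        velocity[0] = velocity[0] * 0.98 + random.uniform(-0.3, 0.3)
        velocity[1] = velocity[1] * 0.98 + random.uniform(-0.3, 0.3)
        pos[0] = max(0, min(640, pos[0] + velocity[0]))
        pos[1] = max(0, min(480, pos[1] + velocity[1]))

    trail.append((int(pos[0]), int(pos[1])))  # remember the path taken

    # "Breathing": the radius expands and contracts over time.
    t += 0.05
    radius = 20 + 6 * math.sin(t)

    screen.fill((15, 15, 25))
    for point in trail:
        pygame.draw.circle(screen, (60, 60, 90), point, 2)       # the trail
    pygame.draw.circle(screen, (200, 180, 220),
                       (int(pos[0]), int(pos[1])), int(radius))  # the shape

    pygame.display.flip()
    clock.tick(60)

pygame.quit()
```

The point of the sketch is simply that the loop has no task or win condition; the human's only verbs are to hold and to let go, and the image on screen is co-created by the shape's wandering and the human's occasional touch.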
And just to summarize, the point here is that not all tech development needs to look like Silicon Valley; there are viable alternatives that are more inclusive and that produce different kinds of technology. The relational engineering ethos is one alternative that I have proposed, and it encourages thinking about building caring relationships as the foundation of the technological process. I presented two case studies: one looked at relationships among teammates in a feminist data science lab, and the other looked at designing human-machine relationships. I want to end by saying that this is not a solution to capitalism; there are a lot of other big structures at play here. But I do think reframing how people think about these things can lead to more interesting future ways of being in the world, and I was particularly inspired by a lot of the students I spoke with who were excited about some of these ideas. Thank you. A big thank you to William and to Lisa; I certainly couldn't have done this without you, and to many more people who I don't have time to name now but who definitely made this possible.

Thanks so much, Kelly. And now I'm going to open it up to questions. Yes, Scott.

Thank you for your presentation; I feel like I learned a lot from it. I want to ask one question about when you raise the concern of thinking about a machine as a slave. I get that, from a moral perspective, we ought not to be thinking about anything as a slave; in terms of our own moral development, that's important. But is there any implication that there's some impact on the machine itself? In other words, beyond our own moral development, what is the reason to avoid thinking of a machine as a slave?

I think it's twofold. One, the most important thing is to try to get rid of these relations of domination that exist in the world, just as a principle: if you have them in some places, it's a slippery slope, and we have a history of thinking of things that we don't consider human as slaves, including other people. So I think there's an argument to be made for getting rid of that in general. And I also think you can think about it as a decentering of the human as the primary important thing in the world. You may say it doesn't really matter for the machines, but you can extend that to the environment and animals and all the other things that we have spent our history dominating, where we've said, well, it doesn't matter about the trees. So it's more of an outlook: we should think about the system as a whole as what's important, machines are now part of that, and we should ask how we integrate them into creating the kinds of power dynamics that we want to see.

T.L.?

Hey Kelly, thanks so much, this was really great; it's been fun to see your trajectory. I want to ask a playful question, and I'm going to channel a Sherry Turkle kind of question, because I think the STS inflection of your project is so interesting and I love it. One of the fears Turkle writes about is that our machines ask us to love them, and that's part of their affective power, but they can't love us back. So I wonder, if you extend your argument, would it include actually making machines that could love us back, as a full realization of the relational thing?
We engage with this in the paper a little bit, and I've actually talked a lot with Sherry. It's a hard question. The point I've come to in thinking about her work is that she really frames the machine as a human: she thinks of it as a thing we're trying to make human and trying to make love in a way that is human. And I think I disagree with that. Like we say, the machine should not be framed in a human way of thinking, and we should not try to make machines that claim to love in the way that humans love. But if we can get to the capacity to think of machines as their own thing, with their own sets of properties, then I don't think that precludes creating something where you and the machine have a relationship; I just wouldn't model it off of a human-to-human love relationship.

I love that answer, thanks. Lisa, wonderful to see you.

It's great to see you all, and I miss everybody, and it was super great to hear your presentation, Kelly. I'm really impressed by how you've extended yourself over the past two years and read so many different types of material across different fields; it's really inspiring to witness. And congrats on getting accepted to the University of Chicago too, it's really wonderful. I wanted to ask a question about the slide where you were talking about shifting the dominant culture, where you suggest multiple areas and possibilities for that. One area that I would love to hear you comment on, that wasn't on that slide, is the way that big tech companies based in Silicon Valley continue to insist that they're going to self-regulate to conform with labor laws and make their internal cultures more diverse, inclusive, and equitable. But then we see these high-profile cases, like what happened with Timnit Gebru, and the lawsuit going on right now with Chelsey Glasson, who was fired in connection with her pregnancy. I'm just wondering, because I think you're going to be looked to as an expert to comment on these more corporate politics in connection with what you're suggesting: do you have any thoughts about these kinds of circumstances and how the concept of relational engineering could be infused into these Californian, Silicon Valley type environments? Do you see hope for that, or do you think it needs to be a kind of utopia outside of these contexts, given the kinds of practices we've seen happening?

Yeah, that's a great question. I wanted to start from the utopia outside of those contexts, because it was important for me not to get fully wrapped up in fixing systems that are, in my opinion, broken from the outset, and to really think about what we would want to build if we started from scratch. But I think you're right that this is the state of the world, and a lot of these systems already exist. The thing I like about the relational engineering mindset, and I do think of it as a mindset, is that any individual engineer could apply it in a lot of different ways to their own work. I can imagine individual teams at some of these big companies trying to learn from this feminist data science lab and applying some of these principles and ways of existing at the team level.
I think individual engineers working on projects could start to bring this way of thinking into their own work. Maybe it's not totalizing, maybe it doesn't fix all of the culture, but you start fixing little things at different scales, and I really am so hopeful about the next generation of students who are getting exposed to some of these other ways of thinking before they enter the workforce. It's not going to fix capitalism, like I said, but if you get enough people planted, I think there can be actual change.

Kelly, I'd like to ask a question, and it's this: your trajectory through this program is one in which you came with a set of ideas that developed out of your experience of practice, that experience produced this work, and you're now heading to the next step of your trajectory, a PhD program. I'm curious to hear a bit more about that next step and how this particular thesis project connects to what you foresee in the coming years.

Part of going to a PhD program in computer science was motivated by the fact that I feel I can have more impact and more change from within that vantage point than from outside. I kind of think of myself as a translator, because there's so much great critical work, but I think it's really inaccessible to people with an engineering background for the most part, and I really wanted to be able to pull it in and try to make it make sense. And yeah, I definitely fit the warrior mold, where I was like, I'm going to get through this because I want to prove that I can do it. What are my chances of actually changing the whole culture? Probably slim. But if I could do something even at the department level in one school, that would make me feel pretty good.

Thanks. And we have one question from the chat, from Hannah Varner: what are your current dissemination plans for this framework, academic articles, popular science communication, et cetera?

Thanks, it's a good question. I would like to write for more of an undergraduate-level audience and try to figure out how to disseminate particularly to, I think, undergraduate women and marginalized groups in engineering, because so much of it for me was understanding that this is a big structural problem, that I wasn't the problem, that I didn't need to conform to this way of thinking, that other ways of thinking were valid. But that's hard to know sometimes when you're overwhelmed by a particular way of thinking. So, yes, I'm not sure precisely what, but something along those lines.