Hello everyone. I appreciate those who are here in person and joining online for our first speaker series event of the spring. I'm really excited to be hosting this conversation between Nadah and Anika Collier Navaroli on the experiences of tech workers from marginalized backgrounds and the forms of community solidarity between and across these tech companies. To quickly introduce Nadah, who will be introducing our guest: Nadah is a fellow at the Institute for Rebooting Social Media. She holds BS and MS degrees from UC San Diego in computer science with a focus on systems and security. Her background is in privacy and trust and safety, working most recently as a software engineer at Meta on the Messenger privacy and Instagram privacy teams. During her time at the company, she was involved in various integrity workstreams, escalating content moderation issues and bringing awareness to bias in product features and enforcement systems. We're looking forward to about 30 to 40 minutes of conversation between our two wonderful speakers today. So please put your questions in the chat if you're joining on Zoom, and hold on to your questions if you're in the room; we're looking forward to audience participation at the end. And with that, I'll hand it over to Nadah.

Thank you for that introduction, Nick. Can everyone hear me? Great. I'm pleased to introduce Anika Collier Navaroli, who is a writer, lawyer, and researcher focused on the intersections of technology, media, policy, and human rights. She's currently a senior fellow at the Tow Center for Digital Journalism at Columbia University, a Public Voices Fellow on technology in the public interest with The OpEd Project in partnership with the MacArthur Foundation, and a 2023 Unicorn Fund awardee. Previously, she held senior policy official positions within trust and safety teams at Twitter and Twitch, and for the last decade her work has spanned from inside tech companies and centers of research to advocacy organizations and classrooms. So welcome, it's incredible to have you here, Anika.

A brief bit of background: Anika and I actually met at RightsCon this past June and instantly connected as trust and safety workers at social media companies who are also deeply embedded within our own communities. We reconnected these last few months as I was having a lot of conversations with previous colleagues and friends, Palestinian, Muslim, and allies, who have been observing a lot of biases and mistakes happening on these platforms at their companies, and sometimes internal hostile environments, and were feeling unable to speak up. I could empathize and relate, because I was in that position in 2021, but I couldn't quite figure out what resources to point them to and what we could do constructively, because there was no precedent for how to respond when these situations come up, especially when things are bad. Anika was one of the first people I reached out to, and we realized that a lot of folks aren't really having these conversations in open spaces, so we're excited to start this through this first fireside chat. And so thank you for joining us.

Thank you so much for having me. If y'all can hear me, that's perfect. Thank you so much for joining, those of you who are in the room. Hello, that's my parents on Zoom, and anybody else who might be joining us on Zoom, thank you so much for joining us. As I said, this is a conversation that hasn't really been had before.
So I'm really happy to be here and to be a part of making it possible.

Yeah, and we'd love to hear voices in the room and over Zoom as well. You know, we have a lot of tech workers calling in from different places. To start us off, I wanted to talk about a broad view of what it means to be an underrepresented tech worker in the industry today. There are elections, complex geopolitical situations, humanitarian crises, civil unrest, and social movements that have shaken a lot of us to our core and impacted our personal lives. How do our interactions with these platforms as employers change as our personal and professional lives start to become intertwined in this way?

Yeah, I think that's a really great question. So I wrote some research called Black in Moderation, and I spoke to trust and safety workers who identified as Black Americans and who represented 12 different companies, right? And in this research, the word I actually used to describe the experience was "wild." I thought an editor was going to take it out and, you know, be like, please use more precise language. And I think, after reading it, that is the most descriptive way of explaining the sort of yo-yo rollercoaster that you end up on in these situations. I think there are so many of us, and you mentioned the crises, the geopolitical situations, the things that are happening in the world. So many of us end up in these shops because we care about these things, right? We have a skill set that we believe can be useful in helping protect things like human rights in a political crisis, or we come from regions that are in crisis or in some sort of geopolitical tension, we have a deep understanding, and we want to be able to help with that. I think what happens is, very quickly, the personal and the professional become the same and they collapse. And I think that's when we experience a lot of this tension that we're going to be talking about.

Sorry about that, if y'all can hear me. Yeah, I had to move from the back because I couldn't really hear. Apologies. Can you hear me better now? Yes. Okay. Yeah. No, I mean, I can attest to that. It feels wild, and you're kind of looking around and thinking, is it just me? This kind of seems problematic. And then other people start to talk about it, and you see a lot of parallels across industries and companies. In one of your previous interviews, you mentioned the importance of bringing people into the rooms where policies are being developed and involving the people who are doing the work. Can you talk to us more about that?

For sure. I think this was, you know, I'm sensitive about it, let me swap this one. Hello, hello. Is that better? Oh, is that better? Apologies, y'all, I promise I know how to use the microphone. Alrighty. Can you repeat your question?

Yeah. So we're talking about bringing people into these rooms where policies are being developed, the people who are at the forefront of this work in integrity and content moderation.

Yes, yes, yes. I think this was the conversation where we met, in Costa Rica at RightsCon last year. It was a conversation happening between regulators and, you know, social media companies, and they were trying to understand what happens in a crisis, right? And the thing that I was saying was, we don't have to start from scratch, right?
Like, we don't have to reinvent the wheel when it comes to regulations, because there are so many people who have done this for so long, who have worked through so many different crisis situations and have tried out various things, right? And I think it's really important for folks who have done trust and safety, who have worked in tech companies, to be a part of these conversations. There's so much regulation that I've read where, literally, it's been laughable, right? Because I read it and I'm like, we tried that. It didn't work. And so being able to talk to people who have already tried these things, who know what works and what doesn't work, gives us a better place to start from. It's the drumbeat that I won't let up: we have to talk to the people who have been doing the work, because I think it's not a secret, but it is the key to being able to understand and effectively regulate.

Very true, yeah. I think a lot of times these conversations happen in a vacuum, where they don't involve all the different stakeholders that play a part in thinking of the best solution. And so when we talk about how we can move into this world where folks who aren't in leadership positions get a seat at the table, what is your advice, or what are ways that we can involve them more in this process?

I think that's a really important question, right? So in my research, one of the things that was brought to light was the lack of diversity within leadership. And so there are so many folks who are involved in teams who might be in lower-level positions or might feel powerless. One thing that I will say is, being inside of a company, there is so much power already at your disposal, right? And I think being able to find moments of opportunity, windows of opportunity to jump through when they come, and being able to identify when those happen, when you can use your seniority, when you can use the power that's within your hands, is really important. The example that I'll give: I was recently asked about the work that I was the most proud of, which was the first time anybody had asked me that about my work at a company. And I realized I had never actually talked about it, right? I've talked about so many different things, and I've said so much, but I had never actually spoken about the work I was the most proud of, because it was not my actual job. The work that I did at tech companies that I think was the most impactful: I worked on amplifying the voices of women, of trans folks, of people of color, of queer folks, not just on the platform, but inside of the platforms as well, by starting speaker series and having folks come in. And so the information flow became different. And that was something that lasted beyond my time there, right? But again, it wasn't my job. I just happened to be working at the company, happened to see an opportunity, recognized that it was something that needed to be done, and was able to do it.

I think a lot of us resonate with that. As someone also underrepresented in the tech industry, and as an engineer, I often felt like the only person who looked like me in the room, and that I had this responsibility to speak up when things were going on, to be that representative of my community. So it is an added burden and load on all of us.
But then those are the moments where you feel like you can be that voice for folks who aren't in these rooms, and it becomes incredibly powerful. And this kind of segues into my next question and a concept you brought up in your piece Black in Moderation, published in the Columbia Journalism Review: this term of "compelled identity labor," the idea that, as a minority in these positions, you feel like you have to represent your community at a company. I finally found the vocabulary to describe what I had been doing for so long. In 2021, a handful of other employees at the company and I were starting to notice mistakes in content moderation and escalating them to the right policy teams, to try to make sure these biases weren't happening, and it felt like it was on the shoulders of just a few. So tell us more about this idea and other examples that you've seen.

Yeah. So I called it compelled identity labor, and the concept came about based on my own experiences, much like yours. Before I worked in trust and safety, I worked in academia, I worked in think tanks. So I was always aware of the literature and the research that was coming out, and I was reading it and realizing that nothing was reflecting the experiences that I was personally having. The very first thing I saw that I identified with was Sarah Roberts's book Behind the Screen. She talks with third-party content moderators, and one of the things one of them said was, there seems to be someone inside of the trust and safety team who just has a special heart for the Middle East. And when I read that, I literally was like, oh my God, I know exactly what's happening here. My mind immediately went to the Palestinian women I have worked with at so many different companies who, at times of intense geopolitical crisis, were being called in to make decisions that were extremely personal. Right? And so this was a concept that I also didn't have words for. When I was talking to people for this research, at the time I was literally calling it, I think, "shouldering a community." I knew I needed to put together some words to be able to explain this experience. But it is something that, since I've written about it, has really resonated with a lot of people and has given a terminology, a phrase, to the experience that so many of us have had, which is why I'm very grateful to have been able to do it.

And for those of us who are underrepresented, as a person of color and a minority, there are times when we feel as if we are tokenized: we're used when we're really needed in these situations, we're involved in diversity and inclusion and allyship trainings. But then there are times when it feels performative, when a crisis actually emerges, like what's happening in Palestine right now, and you're met with silence from leadership, when this is actually the time when those values need to be shown in the decisions and actions being taken. And so for these workers who are engaging in this compelled identity labor, can you tell us more about how this overburdens them?

For sure. I mean, it's a complete burden. I wrote this down. One of the people that I spoke with said, you don't want to always have to be that person, right? People start to look at you funny, right?
And another person described it as having to divorce themselves from themselves, right? They had to kind of put on a different mask in order to be able to do this work. And it becomes, you know, you mentioned that sometimes you're the only person in the room, and so it feels like it is your responsibility, because if it's not you raising your voice, then there's going to be an entire room that doesn't necessarily think about the people who identify the way that you do. And so that places this extraordinary burden on a handful of people to have these really tough conversations.

I'm going to share an example that I have actually never talked about before, and I'll tell you a little bit about why I've never spoken about it. One of the things that I talked a lot about in my research, and wanted to talk to people about, was the daily environment of working in trust and safety. So I was working at a tech company that shall remain nameless, and we were in a crisis situation; a user had done something, right? So we get called into the emergency meeting in which all of us across the company, all the cross-functional stakeholders (people who work in companies will know those words very well; that means everybody from different departments, by the way), were in a room talking together about what we're going to do about this user who has done something to get themselves in trouble. When I joined the room, someone was sharing their screen to show what was happening, and I see that the user we are talking about is a Black person. I also see that in the piece of content we're looking at, this person has used the N-word. And I remember sitting there thinking, very specifically, I really hope that nobody reads this out loud. Just having that specific moment. And very quickly, not only did that happen, but the head of trust and safety goes on to read this piece of content. And, you know, again, it was a Black user, so they are using the N-word ending with an A, and the head of trust and safety decides to say it ending with a hard ER. And in that moment, you know, somebody once explained this to me: you have a physical reaction to these words, even if someone is just reading a word off a piece of paper, right? So I remember reacting, but very quickly I get called on: you're the most senior policy person in the room, what should we do, Anika? And so I have to collect myself and say, okay, what do I do in this situation? How do I very quickly explain the way that Black people communicate, the sort of expression that happens, why we shouldn't penalize users for being Black and speaking the way that they do, and do that very, very quickly, while also having this super emotional reaction, right?

And I said I haven't told this story before. One of the reasons I've never talked about it is because of the burden it placed on me, right? And I remember, and I wish, storyteller that I am, always around talking about things, I wish that I could sit here and tell you all that I put some time on this head of trust and safety's calendar, and we sat down and we had a nuanced conversation,
and I explained to them how I am a Black queer woman who was raised in the South, who has not used those words but has seen those words used, or had them used against her, in very violent situations and circumstances, and the impact that would have on me, and why we shouldn't necessarily be saying that in the workplace. But the reality is, this was my boss, right? And I realized, do I want to go have a confrontational conversation with someone who might misconstrue this, who might think I'm calling them racist, and have that ruin our relationship? They are, again, my boss; my livelihood is in their hands. And so I decided, honestly, I'm not going to say anything. It was a Friday afternoon as well, and I remember thinking, I have the whole weekend to get over this. But the other question that came up for me was about the burden, right? Was it my responsibility, as the most senior Black person at that company, or in that room, to then have to go talk to a senior leader and explain to them why using racial slurs in the workplace might not be appropriate, and why having that conversation would negatively impact me? That's the burden.

Being put in that situation, having to think that you have to pick and choose your battles because it's constantly happening: is this worth escalating or bringing up and resolving? I think a lot of people of color are fearful and worried about this affecting their livelihoods and their jobs, and about being targeted more than other employees. And so a lot of times these things are suppressed, and then they're never shared, and it becomes this accumulation of things that never end up being talked about. One example that I observed these last few months, which maybe was an innocent mistake and didn't happen intentionally, was a mistake that happened on Instagram with a user's bio. Someone had written "Palestinian" and had Arabic text after it, and there was a mistake with the AI translations that translated the Arabic to say "Palestinian terrorists." This is clearly dehumanizing, and it pushes a narrative that is harming Palestinians right now. It was quickly brushed off as a hallucination and a mistake that wouldn't happen again, but it was hard to divorce your own reaction, with this being shared about your community, from not really knowing how to respond when you're an employee at the company. You understand the way the technology works and how mistakes come up, but you also want to make sure this doesn't continue to happen.

Yeah, I think that's a really important thing that you brought up. And I will say, I think the current circumstances and the current crises are bringing so much of this work to light. We now have reporting on the situation that you've talked about. The reality is, in every single crisis situation that has happened all over the world, this work has been done by a handful of humans. There's always someone in there advocating. And being able to see that now, having this conversation here first, bringing these things to light, but also seeing journalists report on what's happening, has been really, really enlightening, because it's been great to see the work be noticed, and to recognize the activism and the burden and the reality that so many people are living, on top of their regular job. That's not anyone's job to do, and yet there they are.
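A minimal sketch of one kind of guardrail the mistranslation incident above points to, assuming a pipeline where an auto-translation can be held for human review: compare the machine output against the source text and flag any translation that introduces a high-harm term with no counterpart in the source. The term list, the commented-out translate call, and the review queue here are hypothetical placeholders, not any platform's actual system.

```python
# Hypothetical guardrail: hold auto-translations that introduce high-harm
# terms which have no counterpart in the source text.

# Hypothetical mapping from a high-harm English term to source-language
# terms that could legitimately produce it in translation.
HIGH_HARM_TERMS = {
    "terrorist": {"إرهابي", "إرهابية"},  # Arabic forms of "terrorist"
}

def needs_human_review(source_text: str, translation: str) -> bool:
    """Return True if the translation adds a high-harm term absent from the source."""
    for harm_term, legitimate_sources in HIGH_HARM_TERMS.items():
        if harm_term in translation.lower():
            # The term appeared in the output; check it was actually in the input.
            if not any(src in source_text for src in legitimate_sources):
                return True  # likely hallucinated, so hold for a human reviewer
    return False

# Hypothetical usage with a placeholder machine-translation call:
# translation = mt_model.translate(bio_text, target_lang="en")
# if needs_human_review(bio_text, translation):
#     review_queue.add(bio_text, translation)  # placeholder review queue
#     translation = None  # do not show an unvetted translation
```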
Yeah, there are groups of internal activists at these companies who are doing this in addition to their jobs as data scientists or engineers. For these people doing this additional work, what institutional support and responsibilities should be provided to them? They're clearly really needed in those situations where the context isn't understood by a lot of folks in leadership, and they're forced to explain and clarify. What can be done to provide more of that support?

Yeah, I think it's really important, first, that companies hire experts in things like race, ethnicity, and national origin, right? These are exceedingly complex ideas. Just to say: I once worked on a policy that had to do with ethnicity, and I went to the United States Census, and according to the Census, in the United States there are two ethnicities, Hispanic and non-Hispanic. That's literally all there is. If I, as a policy person working at a global company, were to take that binary and apply it to the rest of the world, we are in trouble, right? And that's why we need to have these experts from all around the world who have a better concept of ethnicity than I do, being an American who really grew up in this binary system. I think hiring those experts is important because compelled identity labor is not sustainable, right? It's not a good business practice. I think the other thing that's really important is to support workers. When I talked to the folks in my research, I was asking them, what do you do, how do you take care of yourself when you're doing this work? And so many people were telling me that they were relying on their own wellness practices: people were journaling, or they were going to the gym, or they were going for a walk. But there was no institutionalized support for this work. Being able to provide therapists, or stipends for therapy, to provide very real logistical support for people who are literally being damaged while doing this work, is essential for companies to invest in.

Exactly. And one thing that we were talking about just yesterday as we were prepping: an interesting observation I've had these last few months is how, at a lot of these social media platforms, there's this governance of online speech, but it's also extending into the way leadership is enforcing internal guidelines in a more extreme way. So there's very overt internal censorship happening, where employees can't even talk about the problems that are affecting their communities because it's deemed disruptive to some policy. Employees can't even mourn their family members who were killed. They can't post in a company Q&A asking how they can stand in solidarity with their co-workers who are Palestinian during this difficult time. They can't even have an organized internal support group without screenshots being taken and leaked. And these things have happened across all the major tech companies. It feels very dehumanizing, and very opposite from the platitudes that these workplaces tell you, you know, bring your whole self, your full self, to work. They can't even mention their birthplace where they work. So what are your thoughts on how this is damaging, and how can these companies engage in these discussions during a time of crisis?

Yeah, I think one
thing I will say that I noticed working inside of tech companies is that very often, the way that a platform runs or is governed externally is the way that the company is actually managed internally too, right? There is a through line between the philosophies and the way that they are translated. On most platforms, we have the very loud, vocal group of people who tend to drive the conversation; the exact same thing happens inside of companies. There is a small minority of people who have a different opinion, who can band together and maybe create some sort of movement for change; the same thing happens inside of companies, right? So I think that parallel is really important to understand, especially when it comes to things like censorship, or self-censorship. The exact same things that are happening on these platforms, when we see things like Palestinian folks being censored, or mistranslations, or whatever the case may be, are also happening to employees. And I think it's unfortunate, but again, I'm happy that it's being brought to light, because this has been happening. You said you were doing this in 2021; I was doing this in 2019, 2020. That's four years that we ourselves have been doing this work, and there are people who've been doing this before us. So finally having it brought to light and being able to have these conversations, I think, is really important. The other thing that I will say is, again, you have more power than you think you do. There's a burden in feeling like you have to be the person to do this, and there is something that is so important and critical about this work. You've done the work, and you recognize that so often the reality is that it's life or death for people who happen to identify with you. And when those are the stakes, it's really, really hard not to get involved.

Yeah. And I know folks at these companies who, when it becomes this intense and extreme, will leave out of principle. But for the folks that do choose to stay, it's about recognizing that they're in a position of privilege, and there's a lot of advocacy they can do internally. We need people pushing for this change from all different sides.

For sure. And I'm going to add on to this, because I think this is such an important piece: we will always need people working inside of tech companies. Change will never happen, I mean, it might happen, but my theory of change, and the way that I have seen things work, takes a variety of people at a variety of different angles advocating for change. So you can have the external advocacy, but you need somebody internally who's going to be able to write the policy, who's going to be able to have the conversation, who's going to be able to make the argument when it counts. And so to that person, to all of those people who are continuing to do that work: that is the Lord's work, right? More power to you, because it is so incredibly important. And I just want to say, we see you, we honor that work, and part of why we are here is to talk about it and to honor it.

And it feels like we've been doing this work for years, 2019, 2021, 2023, and right now, as we're pushing for these changes internally, there's still a lot of
reactive measures being taken, and responses from leadership that are either made just to placate employees or are band-aid solutions that aren't really addressing the foundational issues. And, you know, there's senior leadership that will say things like, usually when employees escalate things, it's just an HR problem. So how can we really push for these changes to happen, so that we're not constantly bringing up the same things year after year?

Yeah, I think that's a really interesting question. When you're sitting inside of a company, it's really easy to feel like you're being ignored, or that what you're saying doesn't necessarily matter. And again, I want to encourage folks who are doing that work that it is incredibly important, because there have been so many moments I have seen where people internally have been pushing for a change, or pushing for a policy, or pushing for something, and it's never been prioritized. And then, say, there's a crisis, or the New York Times has a front-page story about the thing that you have been trying to work on, and all of a sudden it's a priority, there's resources, right? And if you are the person who has been beating that drum, you are able, at that moment, to pick it up and run with it. It's incredibly unfortunate that sometimes it really does take that sort of external pressure for it to happen.

Yeah.

But it also requires that internal person as well.

Yeah. And it got to a point where you decided to leave, and you became a whistleblower, which has kind of put you in the spotlight. So I wanted to ask, and understand, and share with the audience: why did you decide to become a whistleblower, and what advice do you have for tech workers who want to share their experiences externally? There's still a taboo and a stigma around leaking information. So, yeah, how has that been?

How has it been?
I'm looking at the clock; we don't have enough time to talk about how that has been. Yes, I did decide to come forward publicly as a whistleblower, and I'm not the first whistleblower from trust and safety. In the research that I did, I talked specifically about Ifeoma at Pinterest, and Charlotte at Amazon, and Joelle at TikTok, and how these folks coming forward has given us sort of the first insight that we have into how so many people are treated. And I also think, working in trust and safety, you see the worst things day in and day out. Can you imagine how bad the worst of the worst has to be for you to be willing to risk your career, risk everything, to come forward and say, this thing has happened? In my case, you know, it was democracy, and that was something that was, and continues to be, much bigger than me. And so I felt that it was important for me.

I think the thing about whistleblowing is that it kind of has a bad rep, right? And I get it. When I first became a whistleblower and people called me a whistleblower, I was like, please don't. I've gotten a little better with it these days. There are a lot of negative connotations to it. But the thing is, you don't have to do what I did. You don't have to come forward, go talk to Congress, testify to Congress under oath on C-SPAN. That's a very specific path. I think there are so many other ways to tell the truth. And I would encourage people to, well, first, actually, let me say this: I encourage nobody to tell the truth or talk to anybody without first talking to a lawyer and/or a whistleblower organization, so you understand your rights and your risks. I am a lawyer, and I really mean that: as a person who is actually a lawyer, I also had to go talk to lawyers first. So please, please, please make sure that you are protected. But again, there are other ways to do it. Think of the folks who talked to me for the research; that is a form of truth-telling. They're completely anonymous, their names will never be known, their identities will never be known, but their stories have now created an impact. There's writing, right? There's what we're doing right here. That is important too.

Exactly, yeah. And I think a lot of these avenues, people just don't know about. Whistleblowing is the one that's the most public; there's a lot more attention on it. And a lot of people worry about even going to the media about things, because of sensationalist headlines and how that can sometimes frame not quite the most accurate picture. So for folks who maybe don't really want to talk to the media, like you're saying, join us on the other side. There's a lot of work that can be done in public interest tech, work that you can do that's not within industry.

For sure. I think this is a really, really interesting moment. There are so many practitioner fellowships now; that was not a thing four years ago. There are so many new spaces to be able to contribute your ideas, your writing, your voice, to help with regulation, to further conversations. And so I would encourage folks who are trying to figure out what to do with the unique skill set that they have
developed, and this work that they have done: come join us, come join these conversations. You know, I think you and I were talking about how there have been so many people recently who have left industry and have been publishing, and I've just been so excited to see these voices coming into the space with incredibly nuanced recommendations. I think that is so important. And again, we still need folks inside of tech companies too. It's an entire ecosystem of change.

Exactly, yeah. And so with that, that concludes our preset questions. If there are any questions from the audience, either in person or on Zoom, we can start.

I can start off with a question. I guess it starts off a bit sour but hopefully ends positively; it's like a Sour Patch Kid, right? In light of the layoffs that have unfortunately affected the tech industry, especially in trust and safety, could you speak to whether, for folks you're still connected to at platforms or elsewhere, the burdens of compelled identity labor have gotten worse in light of some of those layoffs, which have really affected some of the centers of advocacy and knowledge and expertise within platforms that you've described?

For sure. So when I was doing my research, it was during the first wave of layoffs, in 2022, and it impacted my ability to do the research. There were folks I talked to who came back to me and said, the industry has completely changed, we no longer have job stability, and I can't be involved in this research. And that was the first round; there have been so many more rounds of layoffs since then that I think job security is even more fragile than it ever was. I told my story about why I didn't necessarily speak up in a moment because of job security; I'm certain that those situations have only become more strenuous. When you realize there's a round of layoffs happening every couple of months, and all that is necessary is for my boss to have one thing that we might not agree on for me to be the person on the chopping block, it makes you a little more hesitant, a little less willing to take on the risk. And I think that is so unfortunate, because the conversation that we're having is about how important these roles are. It's very often literally the digital human rights of people around the entire world who happen to identify with you. It's unfathomable how big of a task and how big of a deal that is. And I think the market uncertainty has only made it worse.

I appreciate that. And one question from online as well. This is someone who, I guess, is, not my parents? Not your parents, no. But they're interested in exploring the practitioner fellowships that you mentioned. Could you give a bit of a survey of, or just elaborate on, those opportunities?

Absolutely. So the research that I did that I keep talking about, Black in Moderation, it's in the Columbia Journalism Review; I actually did that during a practitioner fellowship at Stanford. And that fellowship, don't quote me on this, I'm being recorded, I don't know how many years it has been around, so I will not give an exact number. But it is
one that I remember when it was first announced: I remember seeing it and thinking, oh, I really want to do that. But I was working at a company at the time, and I was a little overburdened, as we've been talking about, and I didn't think I had the capacity to take it on. But that program exists, with the Digital Civil Society Lab, and I think it's a very, very wonderful one. And we're here at Berkman, right? Y'all, I think the fellowship application closed, so maybe not this round, but I think this is a very, very great place to be able to have these conversations as well. I know that y'all have your newsletter that you send out; it has so many different job opportunities and so many different fellowship opportunities on it. I think getting yourself onto the listservs that are sending those things out helps, and there are also trade organizations popping up now that haven't been around before but are incredibly helpful. They're also sending out information about fellowships, jobs, resources that could be available.

Amazing, thank you. And another question from online. This is from an undergraduate student who's taking a course on key societal values that are affected by emerging technologies. One of the values covered in the course is solidarity and inclusion, and they're wondering how best we understand that concept, especially in the context of emerging technologies, and whether there are better ways to conceptualize it than how it's commonly understood.

Solidarity and inclusion. Wow, that's a really great question. I think we talked about this a little in one of our conversations. These are big concepts, right? Like compelled identity labor, they're words, and you can put definitions to them, but what do they actually, really mean? I can give a very concrete example. There was a time at a company, which again shall remain nameless, when I was doing my compelled identity labor during a specific moment, and I remember feeling completely overwhelmed. I had had an intense argument, and, you know, by the time the arguments finally finish and we come around to "we're going to do something," you're tired, right? And then you have to open up a blank Google Doc and start writing. And I remember I had a colleague at the time, a non-Black colleague, who saw what was going on and said, I'll write it for you. And I remember in that moment thinking, oh, this is what it means. This is exactly what these ideas mean in action: I see what you have been going through in order to make this happen, and I am going to take this labor off of your plate so that you are not going to have to be the person who has to carry this. And they ended up writing the thing, I came back and reviewed it, and we had this great back and forth. It's in those moments that you understand what solidarity really looks like. It's not being silent in those moments. Sometimes it's sending the Slack message that says, I see you, and sometimes it's literally picking up the work and taking the burden from somebody else and deciding, this one doesn't hurt me as much, so I can do it.

Thanks. And we have more questions
coming in online, but we can pause at any moment. Oh, perfect.

Hey, thanks for joining us. I have kind of a long question for you.

Let's do it.

So I'll build off the first question here. There's uncertainty with jobs right now, and we're seeing a lot of consolidation happen across the industry, so a lot of these teams are becoming smaller and people are having to take on more work. Ultimately what happens is engineering teams are now being tasked with improving models, and some models are becoming very sophisticated, and they're being trained on past data, which many times has bias associated with it, right? So things that were flagged in the past, or that were overlooked, are now being used to train these models, and with increased automation, a lot of this stuff is going to go unnoticed. If you don't have a large voice or a large platform for distribution, it's ignored. And so I see a lot of these large companies in a no-win situation: you're constantly playing defense, something's always going to come up, you have fewer people working on these problems, and you have models where you don't really understand how they're working, because they're just building on top of each other every day, and the people who built them are no longer at the company; they left a long time ago. And a real issue is documentation: the people who built it didn't write it down, and now they're no longer there. So I guess my question for you is, how are you seeing the next two years? Because the growth in AI right now is just exponential.

Yeah. I think it's interesting that you say the next two years, because I'm like, can I see beyond 2024 right now? I think this monster of an election year is going to be the greatest challenge for trust and safety teams and technology companies that we have ever seen. And this question of increasing AI, you and I were talking about this, you all probably know, every funder loves AI these days. It's like, AI, we love it, everyone do something with AI. And it's like, what is the something that you want me to do with AI? I just want to write the words in the document; that's where we're at. But you were talking about AI, and I was like, oh, the bias within it. I think that's such an important issue. One of the examples that I'll give, that I used to work on, was specifically around hate speech detection models. Facebook did some research on their own algorithm and found out that it was taking down content that was derogatory towards white people more than anything else. So, heavily biased: we're going to take down hate speech, but we're also going to protect this one group that might not necessarily be the most vulnerable. I think the reality is, as you're saying, somebody built that system five, ten years ago and has moved on. There's a real need to go back and reevaluate the very basics. This was something that I really wanted to be able to do and was never able to do: how do I get into this and understand, what is the training data, is it biased, what are we doing here? I don't know that I know how we solve for that, but I think it's important, again, to go back to the foundation, to go back to the basics, and understand. A good friend of mine, lawyer and researcher Rashida Richardson, wrote a law review article called Dirty Data about how, you know, our data is based on segregation, right? And so we have this data that is consistently dirty. How do we clean it up? Maybe you'll figure it out and then you can tell me.
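A minimal sketch of the kind of audit being described here: checking whether a moderation model's mistakes fall evenly across groups of speakers. The toy classifier, the tiny labeled sample, and the group names below are hypothetical stand-ins for a real model and a large human-audited dataset.

```python
from collections import defaultdict

def toy_classifier(text: str) -> bool:
    """Hypothetical stand-in for a hate speech model: True means take the post down."""
    flagged_terms = {"slur_a", "slur_b"}  # placeholder vocabulary
    return any(term in text.lower() for term in flagged_terms)

# Hypothetical human-audited sample: (text, speaker group, actually violating?).
sample = [
    ("reclaimed use of slur_a among friends", "group_1", False),
    ("targeted harassment using slur_a", "group_2", True),
    ("benign post about the weekend", "group_1", False),
    ("benign post about the news", "group_2", False),
]

# Count false positives (benign content the model would take down) per group.
benign_total = defaultdict(int)
false_positives = defaultdict(int)
for text, group, violating in sample:
    if not violating:
        benign_total[group] += 1
        if toy_classifier(text):
            false_positives[group] += 1

for group in sorted(benign_total):
    rate = false_positives[group] / benign_total[group]
    print(f"{group}: false positive rate {rate:.0%}")

# A persistent gap between groups is the signal that the model is penalizing
# one community's ordinary speech, the kind of bias described above.
```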
Yeah, okay. But I think, just to say: one of the things, when we talk about content moderation, when we talk about trust and safety, I often think that we are tasked with the impossible. It is literally like boiling the ocean, and I think the thing that you can do is your very best, with the information that you have, on the day that you're doing it.

One question from online, and then I'll bring it over to you. A question from someone who is working at a search engine company, who believes that they and their co-workers have been falsely empowered. They're wondering, how can you sense when your time is running out at a company? What are the signs of job security fading when you've exercised protected speech and have faith in reasonable people and in your role?

Well, I will say, if you are asking if your time is coming, your time is coming. When you start thinking it, I think that's it. I've always been a proponent of an exit strategy, especially when you start speaking up. You have to recognize, I am taking on risk here, and therefore I need to figure out what options B, C, D, E, F are going to be, because this is incredibly risky. The question of when you leave, I think, is kind of a hard one, because the reality is that these are jobs; they're there to help you pay the bills, and if you leave a job on a moral conviction, your bills are not going to stop. I blew the whistle; the bills didn't stop just because I blew the whistle. They kept coming. And having to understand that, and maybe at times honestly make those conflicting moral decisions: am I going to stay at this company that is doing something I might not necessarily agree with, because I happen to live under capitalism? Maybe, right? And that's a really, really unfortunate thing to have to do, and I understand that it becomes very, very heavy, if you can even manage that cognitive dissonance of, I'm here to do good, but I'm not necessarily doing that. And again, work on your exit strategy; always have an exit strategy. I think the market uncertainty makes it even harder, because when I was working in companies, it was the heyday. It was like, oh, you worked at Twitter, you can work anywhere. You just kind of hopped around, and you would see people from the same companies you used to work at going to the next company. I think those days are kind of over. Though, speaking of AI: AI companies are now creating their entire regulatory systems based on the way that social media is regulated, and they have entire trust and safety teams; jump over there, see what's happening over there.

Thank you very much for your insights. My question is more about crisis situations and trust and safety, as we talked about with the war in Ukraine, or now the war in Gaza, where we see that, despite all the layoffs, there are lots of adverts looking for people speaking these particular languages to get involved. Obviously, that puts a huge strain on the
organizations dealing with all of the, you know, awful things that are coming in. And I was wondering if you think that, in the long term, these kinds of crisis situations also open companies up to different ways of seeing the world, maybe beyond the very US-centric ways of dealing with issues; whether that can be a longer-term learning for these companies, whether you've seen that in the past, or whether you think it's really just crisis moments, and once they are no longer the center of attention, it all goes back to normal and there's not really a way forward.

I think it has to be a learning lesson, right? If we are going to get this right, and again, they're doing an impossible job, I think it has to be a learning opportunity. One of the biggest biases baked into technology companies is the San Francisco bias, right? That's where a lot of these companies are headquartered, and in so many different ways, on so many different things, you cannot get an answer or a decision unless it goes through the top bosses in San Francisco. And I think that creates an extreme bias. The other thing I want to pick up on is what you said about language specialties and being a native speaker. There's been a lot of reporting that's come out recently; the Guardian did this exposé where they were like, oh my gosh, TikTok doesn't have native speakers, and content from people who don't speak English is being handled with Google Translate. And I was like, yeah, this has been going on for a very long time. Mudge, the Twitter whistleblower, when he came forward, said the misinformation team at Twitter was using Google Translate in order to figure out what was happening. That's a problem. And I think I've given you this example before: I was working at a company, and I speak English, it's the only language I speak, and I always refused to make a decision on content that was in a language that I didn't speak. It was like a serious conviction that I had, and that is not a conviction that everybody has. I think it's really important, again, for people who are in these positions to take those sorts of stances. The example I will give: there was a piece of content that came in once, and it was in Arabic. And I sent it to three different people. I sent it to a colleague who was a white man who happened to attend an Ivy League school, major in Arabic studies, and speak Arabic. I sent it to a woman who was born in Egypt, who also majored in Arabic studies in Egypt. And then I sent it to a Palestinian woman who was born in Gaza. So I sent it to all these different folks, and the translations that came back were incredibly different. The first person, who majored in Arabic studies in the United States, came back and said to me, well, you know, it's kind of bad, but we can leave it up. The other folks, who were native speakers, this is the language they were brought up in, they understand different dialects, they're able to see what's going on, came back and said to me, we have to take this down immediately. This is awful. This is the kind of stuff that gets people harmed. And the difference in that dichotomy was fascinating to me, because the reality was, if I wasn't the person in the position to say, first off,
I'm not making a decision on this, and I am going to wait, which is also problematic, because we're waiting, right? This thing is going to stay up until we figure it out and make it right. But that very specific scenario made me always remember and realize the importance of understanding that being American-centric, especially in content moderation, harms people. It's literally dangerous.

To build off of that too: in these times of crisis, like Anika was saying, use it as a learning lesson and figure out where the best place is to put that pressure, and build off of that momentum. In 2021, I remember taking this reactive approach of just trying to bring this Facebook group back up or get this account restored; you're constantly fighting these fires. But then 2023 comes, and we're seeing the same thing over and over again, because we're so focused on just mitigating what's happening, instead of making sure we reflect holistically and realize that maybe there aren't the right people in the places where they should be to change the policies when they need to change, or that people aren't properly trained in the languages and dialects that are needed, so that when another crisis comes, we're not constantly fighting these fires.

For sure.

Two questions from online, on a topic that you touched on earlier, Nadah. The first: companies have their own internal guidelines that are used to censor employees by deciding what subjects are forbidden. Those guidelines are not public, and NDAs prevent employees from talking about them externally. How can we keep companies accountable? And a related question: what are the top pieces of advice either of you can offer to Meta employees trying to organize against that form of internal and external censorship, and calling for leaders to be more transparent to the public and to policymakers? We're trying to create a union similar to the Alphabet Workers Union.

I'll answer, and then I'm going to kick it over to you. I empathize with these folks, whoever's writing these questions, tremendously. To say: these sorts of internal codes have grown and gotten worse; when I worked at companies, they weren't this bad. One thing that I will say: this person said they're not public, and I'm not encouraging anybody to leak any sort of information or do anything of that sort, because I would never do anything like that. But bringing things to the public: the things that you are so familiar with because you see them every day, somebody outside has literally never even fathomed that they exist. And so I think showing the public, this is what is actually being said, these are the guidelines that are actually being imposed on us, would be shocking to so many people.

Oh yeah, I was going to say, I know everyone's appetite for risk is different. Some people are willing to leak information; others, when you're in that situation and you're seeing these things at the company, and I still don't know the nuance of what is more or less risky, can talk about their own personal experiences and what they're facing. I feel like writing your own things, maybe not leaking information that could be controversial, but just sharing what you've observed and how it's affected you, at the very least gets the conversation going.

For sure.

I want to thank you for coming; this
has been really great, I've learned a lot, and I appreciate you being here. I have a question about advocacy in a broader sense. In the work that I do, I have opportunities to talk to parents and talk to teens and others about issues that don't specifically pertain to what you're getting at, but they do pertain to the idea that maybe, as citizens, we have some role in trying to educate our relatively moronic senators or representatives about what could be done. I mean, the appalling state of legislation around this is distressing. But I have to be honest, I'm not actually sure what I should say. We should as citizens be advocating for blank; what is the blank that we should say to our representatives, or to our officials, or to each other? This is what I would like to see the government do to help protect people. So I would love to hear your insights.

That's a great question, and I have spent a lot of time talking to Congress, giving them my ideas on this sort of thing. I think the first step is just regulation, right? Like, anything. We are at a place where we literally have Section 230 that says, you can do what you want, basically. I mean, that's not what it says, I'm a lawyer, I know; people are very precious about 230, as you all know. But the reality is that we need something. I'm a big advocate of a regulatory scheme that mirrors something like the National Transportation Safety Board: a sort of baseline standard for safety that says you can't go flying if you don't have all the bolts, something very, very simple. And also the other piece: okay, so there was a crash, something went horrifically wrong; how do we come in, get the black box, figure out what happened, and make sure that it doesn't happen again? We have nothing like that, no oversight like that happening for technology companies. And I think starting with something that simple, an independent regulatory body that is able to do something like that, is a giant first step.

Yes, sorry, to jump in, and thank you for your comments. So I'm a lawyer too, but from Canada, so we love regulation.

I love it. For y'all, can you send that down here, please? Just put it in the water, please.

I always have to preface my remarks with that when I argue for regulation; it's like, ah, you Canadians. Anyway, I like what you're saying about oversight bodies and managing safety that way, but what about employment law levers? Someone earlier asked about non-disclosure agreements, and there's been work on non-compete clauses and how they've had way too much overreach and a problematic effect, more of that kind of economic argument. But all these ways in which workers are silenced or punished for speaking up about things that are in the public interest seem to me like things that could be dealt with through employment law levers. I'm just curious as to your thoughts on that.

For sure. So I mentioned Ifeoma, the whistleblower from Pinterest, earlier, and Ifeoma worked on, I think it was called, the Silenced No More Act; again, I'm saying a lot of things that are being recorded, so check me on that. And that was specifically geared towards employment law, thinking about NDAs, and saying, in these specific
circumstances, if discrimination happens based on these protected categories, your NDA is no longer good. I think that question of employment law and NDAs is a really, really important one. I mean, you literally had a question that said: when I decided to show up at this company, I basically signed my life rights away and said I would never talk about this again. To me, that's incredibly unfair. It's incredibly unfair for you to have to go work at a company, give it your heart, your soul, your mind, your brain, and then never be able to speak about it again — especially when, at times, you do work that literally changes the world. I think it's important for folks to be able to speak out, and I am all about trying to figure out various levers for this to happen. I think employment law is a very, very good one. There are some great whistleblower attorneys who are also thinking about things like this, and I would love to see that happen. I've seen a lot of NDAs in my life; they are restrictive, they're hard, and they're out there.

Are you worried at all that bad regulation might be worse than no regulation? I mean, we've got people trying to take books out of libraries and schools. I don't really want to think about what Josh Hawley and Marjorie Taylor Greene might try to enact as legislation to regulate the internet.

You know what? This is going to sound crazy, but one of the best pieces of regulation that I have seen for the internet actually comes from Josh Hawley. Really — I know that sounds insane, right? You probably would never expect those words to come out of my mouth. I highly disagree with a lot of things that are in there, but I think it's a solid structure. And this is something — you can hear me being passionate about it — the reason I'm saying we need an independent regulatory body is because I think there's actually so much consensus on the fact that social media companies have way too much power and are not responsible with it. I think that is a bipartisan agreement we can all just sort of get behind. Where it's become culture wars is the how do we do that, and on what specific topics. That is the incredibly challenging part, and it goes to what you're asking: what if we over-regulate versus not regulating at all? That's a great question, but I think the fear of over-regulation can't be a reason to do nothing. So I think a lot about the DSA — the Digital Services Act in the EU, which is a piece of regulation — and one piece of it that came in was the media exemption that was part of the DSA. I wrote an op-ed about it saying this is horrible: it would let anybody self-declare as a media institution, which is a disinformation highway. And the reality of that was, as much as this one piece of the regulation might make the entire thing not work — which is a horrible thing — we still need the regulation. I would continue to argue that it's easier to pull back and say, "oh, we're doing too much," than it is to say, "we're not doing anything."

I think a question that follows up on that very nicely,
from our online audience: how can we, as average trust and safety workers who've been on the ground, share our views on how to make policies actually effective with policymakers, when they are only hearing from tech executives, lobbyists, or a few big names in the industry? Are there ways we can provide input without taking a government job or a think tank job?

Absolutely. There are so many people in Congress who are willing to have off-the-record conversations with tech workers about these very specific things. You know, part of my astonishment in talking to Congress was recognizing how little people actually knew about how social media works. There were so many assumptions of, "well, then this happens," and I was like, yeah, no, that does not happen. In an ideal world — I know somebody wrote in a book that that's what's supposed to happen — but that's not actually what happens. So one thing I would recommend, if you're into it, is writing these things down. I've been writing a lot of op-eds, so op-eds are a way to go if that's what you're interested in. But sharing your ideas and sharing the knowledge of the skill set that you have is incredibly important. And again, you don't have to necessarily go work in civil society; you don't have to work anywhere else — you can stay working inside a tech company and establish relationships. One thing to do is go google who's working on this. I'll say Senator Warren's office has been doing fantastic work in this area. Their email addresses are all public — just going to put that out there.

Before this event, we were also talking about outlets like Tech Policy Press that invite contributors from all different spaces. When I first started this fellowship, I wrote an op-ed and published it with them, and I was joking that it was the first time I'd written something other than code. But the thing is, they do want practitioners and technologists to give their perspective, because we see how things work at social media companies, and a lot of times we just didn't know that anyone else wanted to listen.

Yes — big shout-out to Justin and Tech Policy Press.

Hey, thanks for coming. I'm not a technologist — I work in civil society at an organization called Ekō, where we do corporate accountability work, and I focus on tech accountability. First, I just want to say that the power angle you all were talking about is, from an outsider, so very real — in the sense that the type of power that companies have often rivals countries themselves in terms of the breadth of wealth and force. And it gets reflected back when employees and whistleblowers come out and speak to the public. For civil society and individuals who are looking at this world, that is very much the facts on the ground. Whistleblowing related to India, to tech harms to kids, or the No Tech for Apartheid campaign at Google are great examples of that. So I'm just echoing, for everybody listening in, that that power is actually front and center for civil society that's trying to take an effective look at accountability. And then — not necessarily a question, but maybe just an angle from the world I sit in, looking into the tech world: for this upcoming year, everybody's calling it the year of democracy. There are 70-plus elections happening around the world, and trust and safety teams
have been slashed, while for years civil society has been calling for the bolstering of trust and safety teams across languages all around the world. This is one thing where I often have a little conversation in my head, thinking: do these companies know that this is happening — that all these elections are going to be taking place? The institutional decision to slash those on the front lines of protecting democracies across the world is sort of going downhill. And I know there's a Silicon Valley bias, and I think it often gets reflected in minimal to near-zero credible work in the Global South, where these elections actually have very real-world harms — we can look at India, for example. So maybe just a question to you all who have been in the Silicon Valley bubble: where and how are these broader, big conversations — that I think most people in the public, and especially those in civil society, are zoomed in on — able to be seen from their C-suite offices, or nearby?

I think that's a great question. So you're saying — and I agree with you — this is a monster election year. But that also requires you to have an understanding of, or a care about, the rest of the world, and a knowledge that maybe something else is going on in the rest of the world. Very often, as I mentioned, there's a San Francisco bias, and if you are in your San Francisco bubble, you might not even know the needs of the countries that are having an election. So you're not going to necessarily staff up to go participate in that election and protect that election — especially not in the language in which folks are speaking — because you don't necessarily know. And I think that is a huge bias. The other thing I will say, to your advocacy piece: before I started working in companies, I actually worked with Color of Change, and I built their platform accountability process — their framework and their playbook. Again, my theory of change requires everybody being in the room. So many times I worked on the opposite side: of the Stop Hate for Profit campaign, of the Stop Asian Hate campaign, of the Twitch Do Better campaign. For all of these campaigns, I was the person inside the company saying, okay, so you're getting this pressure — this is great — but these recommendations might not necessarily work the way these folks want them to, so how do I do something to help in this situation? It requires that external pressure, plus the internal work, plus the regulators, plus these conversations that we're having. All of this is required in order for us to get to a place of change.

We're already 10 minutes over, but a few more questions from online, if you two have time. One is, I guess, stepping outside of an American perspective and asking if you have any thoughts about the outsourcing of content moderation by tech companies to the Global South — the general implications — and a specific question: do you think this trend exacerbates socioeconomic disparities and perpetuates digital colonialism?
Yes — to the last question, yes. You know, third-party content moderation jobs are the folks who are on the front lines of content moderation, who sit in queues all day long with a timer that says you only have about 10 seconds to make a yes-or-no decision — bye-bye or no bye-bye. That's what you're doing all day long, looking at the worst content on the internet. It is a job that has been proven to be exceedingly damaging: people get messed up by doing these jobs, their brains no longer work the same, people lose memory. It is incredibly challenging. And to take these jobs and say we are going to export them to the global majority and have those folks doing the work — is that already happening? It has been happening for a long time. And my question is: why? Why is this a business model? Why has this become a sustainable business model? I think it is very much for the reasons that digital colonialism has been happening — colonization has been happening forever. These companies are literally taking the American idea of things like free expression and exporting it to the rest of the world. This has been happening for so long, and I think the systems of content moderation that are set up in ways that rely on the labor of people of color and marginalized communities to do the worst of the worst work only replicate the worst parts of our society.

Oh yeah, of course.

Thank you so much — this was wonderful — and also echoing what was said: thank you for all the work you've done in this space. It's amazing. I'm zooming back in a bit to the tech workers inside the companies. I taught at MIT last year, and it was mostly a group of undergrads gunning for tech jobs, for the money — and I assume most of them, or a good chunk of them, will probably end up there. So I've been thinking: how do you see a way forward in terms of academia or education preparing tech workers for this kind of solidarity work, or for going in with their eyes open to what it's like and what they need to do?

I think it's a really great question, and I've talked to a couple of folks who have been thinking about this. I would love to see classes that teach people how to do this work. One of the hardest things about doing trust and safety work is that for a long time there was no training in the field, and so much of it was learning on the job. The example I always give is the time I was told to make sure that World War Three didn't start on the platform — and I was told I had 48 hours to make it happen, and it was real life. I am not a foreign diplomat, and I remember sitting there thinking, can I please call the UN? Can I call somebody who knows what they're doing? And it was me — me and my team — sitting there doing this work. The reality is that those stakes are entirely too high to be handled by untested folks who might not necessarily know what they're doing. I think we need to train workers to do this work in an environment where the stakes are not that high. It's a classroom; no one's actually going to die here. Let's talk through why this might not necessarily work out; let's talk about what's actually happening here. I think that is such an important thing. And I will say, the trust and safety
jobs inside of social media companies — you all have been talking about how there are so many layoffs happening, and that might not be the model that continues to persist in that specific industry. But all of these AI companies are creating trust and safety teams, and those are places where I would encourage so many students to go work. And one of the things I always say to people is: if you see something, save a copy. That's my advice.

That's excellent, thanks. I'm just curious: are you aware of any initiatives at any educational institutions that you think are starting to do a good job on this? I mean, I'm hoping that I can do something in this area.

So, yes — me! I would love to do this, and if anybody else is also interested in this, let's please talk. But no, there are really great ones. I was able to go talk to Desmond Patton's class at the University of Pennsylvania — he teaches a wonderful digital advocacy course — and to Ruha Benjamin's class at Princeton as well; she teaches a wonderful tech accountability class. These classes are happening.

Yeah. I think there's also a piece of embedding those ideas into the courses people are already taking, because you'll notice the students taking those classes are the ones at the intersection of tech and ethics who actually care. But it's something everyone should know: if you're taking an engineering job at a tech company, you may be faced with some moral and ethical dilemmas, and you're going to have to think about this eventually. So I think there's a whole restructuring of these courses — having it embedded in what you're required to take, but also having opportunities to learn. I was so pleasantly surprised to find out, five years after I graduated, that my university now has a race, gender, and computing course that talks about these issues. We didn't have that back in my day; we had one class, cross-listed with the School of Global Policy and Strategy, on cybersecurity that actually talked about that human piece of how this technology affects people. Before that, we didn't really have a lot.

Yeah, we've come a long way. Literally — when I first started working, trust and safety was not an industry. I remember when I graduated, I was like, I really want to do this, and there were no jobs. I literally had to wait for the jobs to come around, and now we're at a place where the jobs exist.

And even then, you can create your own opportunities. You have this knowledge and skill set; position yourself in a way where you're contributing the way that you can.

Perhaps a good personal note to end on — a final question: now that you've taken the whistleblower step, how do you not burn out, and what are your support systems along the way?

You can see that was not a funny question, but clearly I chuckled a bit. You know, I think that's a really great question. I will say this: I've said a lot of things encouraging people to share information. I have always said we need more whistleblowers, and I have also always said I cannot in good conscience ask anybody to do what I have done. I could not ask people to walk down the road that I have walked down. It is lonely, it is isolating, it is impossible, it is really, really, really, really hard. And there isn't a lot of
support in it, either. Asking other folks to do that — I feel like I can't, because of that reality. Thankfully, I have had a great support team. I literally said I hadn't been around this many lawyers since law school; once I started whistleblowing, I was like, oh my gosh, I have so many lawyers. But having support teams, having people advising you — more than one person advising you, so that you have differences of opinion and people able to tell you what risks to take — and having family and friends, is incredibly important, to be able to literally get your mind off of it. How do you not burn out? You do. That's just the reality. You burn out, and then you ask yourself: is it worth it? Am I going to keep doing it? I've had to do that so many different times, and I'm here — I keep saying yes.

Hopefully it's a less lonely road for people who are doing it now, with people like you sharing your story and offering yourself as a resource to talk through what you need to know before you take that step.

I hope so.

All right, well, with that, I hope you all will join me in thanking Anika and Neta for this awesome conversation. We'll have a couple more speaker series events throughout the semester — we've published three already, and more are in the works. I also want to highlight an awesome Trust and Safety in the Majority World workshop that Neta is organizing in April, which will hopefully continue some of these conversations, so please apply to participate in that. All right, thank you, everybody.