Okay, so I'm going to give my sense of my experience wading through these readings, because they're big pillars holding up my personal work and informing the environmental data justice theoretical framework as a whole. Starting with Design Justice: that's the first time I started to really think about design, and in this case building environmental data infrastructure, in terms of intersectionality. When I went to the Data Justice conference in Cardiff, Sasha Costanza-Chock gave the keynote address, and that's the first time I learned about their work. They talked about going through airport security as a trans person, and the fact that when you go through those big full-body scanners, the operator clicks whether you're male or female, so people who are trans always get pulled to the side. Thinking about that in terms of intersectionality: what if you're trans and a person of color as well? You're probably always going to get pulled for extra security measures. I've noticed that myself: in the summer, when I'm darker, I get pulled, and also when I wear my hair up in a bun, because the whole system is calibrated to white, cisgender people, and hairdos that fall outside of that just throw off the system and get flagged. So in response to these issues, the whole idea of design justice is bottom-up design. And like I said before, maybe we can talk a little more about what design means. Is it engineering as well as planning? How does that work within environmental data justice? I would say: building new tools from the lens of community members.
So that takes me to the FAIR and CARE principles, which I think are a really good, concrete guideline for data scientists, technologists, and scientists working with communities and vulnerable populations. The CARE principles came out of the US Indigenous Data Sovereignty Network, and recently I wrote one of my comprehensive exam papers on how scientists can apply them to genomic research. In research there's a long history of extractive work that has either killed people or perpetuated racism, sexism, and heteropatriarchy, and gotten caught up in a cycle of research being co-opted by capitalism, done for profit instead of in people's interest. And then Pollution Is Colonialism really reoriented me toward the connection between environmental hazards and land, and how that's all tied into, and produced by, a whole history of the state being built on Native land, through the appropriation of that land to generate capital. Throughout all these readings I listed the key themes somewhere. Well, the extractive logic: systems of power taking whatever kind of capital, people's information, people's health and livelihood, in order to produce academic papers, produce money, produce whatever feeds their power. And I'm part of this too. So that's my big takeaway. Riko, I don't know if you want to go ahead and reflect on the other articles?

Yeah, I can start generally with my own reasons for being interested in this line of study.
As someone with a political background, well, I shouldn't say family background, I've spent plenty of time talking to people, and I think we've all had this experience, who believe that racism exists and will acknowledge that, but who localize it to being an interpersonal phenomenon: systems aren't persons, so systems don't encode those biases. That's really hard to walk through, especially in a short conversation where you're trying to change someone's mind. It's really hard to walk through how a whole system gets shaped and born and fortified by all these small decisions. I'm particularly interested in the racial aspect of this, and in applications of algorithms, built for simple-to-understand everyday purposes, that carry with them the biases of the data used to generate them, or of the creators of the algorithms themselves. One reason I'm interested is that it's so hard to say, hey, listen, Uncle Jim, the system's racist, and he says, look, I don't doubt there are politicians and DMV workers who are racist, but the system is not the problem. And I think this is a really clean, and by clean I mean easy to understand, story: Black Americans are roughly 12 to 15% of the American population. That means that if you're trying to source Black faces and arms and bodies to teach a car to stop when it sees them, you're only feeding it roughly 12% of the images. Even if you're doing a good job, only about 12% of the images the algorithm trains on are Black arms and legs and so on.
So that means the model is going to be significantly less skilled at spotting those things, and therefore it's going to stop just a little bit slower, and all of these things feed into a real danger that is not, and I'm going to put this in quotes, "the fault" of the researchers who just went and got as much data as they possibly could, or even of the researchers who wanted to make sure the data was representative of the population that will be impacted by these cars. If they say, this is an algorithm for cars driving in America, well, about 15% of Americans are Black, so 15% of our training images will be of Black people: that still presents a real problem. You can look at this as the summation of a lot of well-meaning decisions at different points in the chain that results in a really horrible outcome, where our Black neighbors are walking down the street not sure whether an autonomous vehicle is going to stop for them when they cross. And that is systemic. It's not tied to one person telling a bad joke, which we all so easily define as racism. It's part of a system of norms that has shaped whiteness in America as the default. Not necessarily better or worse, but default: when you think of an average American, most people, or I shouldn't say most people, most white people, I'm sure, think of a white family. The impact is that that gets encoded in our daily lives, and unless we're consciously working to reverse it, we're going to keep doing it. We're on the verge of amping up everything about our society considerably, and every little element of racism that exists in these algorithms will be in the car I drive and the Uber I call and the blank and the blank and the blank; whatever racism is tied into the system is about to get multiplied.
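The dynamic described above, a detector that sees one group in only ~12% of its training examples and then performs worse for that group, can be sketched with a toy simulation. Everything here is invented for illustration: the 2-D "appearance" feature space, the group centers, and the deliberately crude nearest-centroid detector are not any real perception system.

```python
import random

random.seed(42)

def sample(center, n, sigma=0.7):
    """Draw n noisy 2-D 'appearance' feature vectors around a center."""
    cx, cy = center
    return [(random.gauss(cx, sigma), random.gauss(cy, sigma)) for _ in range(n)]

# Invented feature-space layout: pedestrians from group A and group B
# present differently to the sensor; 'background' is everything else.
PED_A, PED_B, BACKGROUND = (0.0, 0.0), (4.0, 0.0), (4.0, 3.0)

# Training set mirrors the population skew: ~88% of pedestrian
# examples come from group A, only ~12% from group B.
train_peds = sample(PED_A, 88) + sample(PED_B, 12)
train_bg = sample(BACKGROUND, 100)

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# One 'pedestrian' prototype for everyone: because group A dominates the
# training data, the prototype sits close to group A's region.
ped_proto, bg_proto = centroid(train_peds), centroid(train_bg)

def is_pedestrian(p):
    d2 = lambda a, b: (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return d2(p, ped_proto) < d2(p, bg_proto)

def recall(points):
    """Fraction of true pedestrians the detector actually flags."""
    return sum(is_pedestrian(p) for p in points) / len(points)

acc_a = recall(sample(PED_A, 500))  # group A pedestrians
acc_b = recall(sample(PED_B, 500))  # group B pedestrians
print(f"detected as pedestrian: group A {acc_a:.0%}, group B {acc_b:.0%}")
```

No single step here is malicious; the skewed sample plus a model that averages everyone into one prototype is enough to make group B pedestrians land closer to "background" than to "pedestrian."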
So that's a real interest of mine: trying to educate myself and others on how this happens, so we can convince more people that this is a clear systemic case, because this seems like the cleanest version of it. There are a lot of people who aren't going to believe the system is racist unless they see everybody with a federal government job say something really terrible about another race, and that's just not how it actually works. So that's how I've been drawn to the subject; that's my own little philosophy. I tried to share some readings that are very lightweight: here's a tech application, and here's why it's doing a shitty job and perpetuating an encoded norm, that norm being racially or otherwise tinged, for lack of a better word. I'm trying to find the articles I found. Oh man, where did these go? I think they're in the GitHub.

Oh yeah, okay. We still need to put them on the HackMD. I put it in the chat; they were in the email.

Well, what you were saying definitely connects with design justice. I know Sasha writes about algorithmic bias. But I'm just thinking: all of our materials are built from these systems of oppression. Like this Nalgene bottle, and the straw; everything is built from this system.
For instance, coming from an environmental perspective: just making plastics, people of color are disproportionately affected by the emissions from making those plastics, and we buy into that whole system and the wealth that's generated from it. There's a really good article by Wendy Leo Moore on what whiteness is, because the ideas of white and Black come out of racism, out of that system, and the idea that whiteness is property: the wealth that generations have inherited from slavery, and also the ability to navigate life without that kind of discrimination.

I liked where you were going with that, Lourdes. One of the things I think is really important about understanding various forms of bias, including racism, is this, I don't know if I should call it a privilege or a right, to not worry about it, because things are just made with you in mind. I really like the design justice piece because it asks questions like: who's being introduced at the beginning, who's being involved in the problem-solving? I've been going down this design-thinking path a little, and Lourdes asked earlier, what is design? For me it's just problem-solving, in whatever realm that happens to be. It could be applications, it could be engineering; design can happen in all forms. So that piece asks those questions at the beginning: how are people involved, who gets to be involved, are the affected communities being involved? Which is why I also liked that you shared the CARE principles alongside the FAIR principles. The FAIR principles are really data-centric, but the CARE principles talk about the communities and what happens with the data. So thank you for sharing that, that was really cool.

But yeah, thinking about design justice makes a lot of sense in this realm: how do we make sure these communities are brought forward at the very beginning of any problem-solving process? I went to a Sunrise Movement training last summer that was held in Berkeley. I don't know how familiar each of you is with the Bay Area, but when you drive north from Berkeley there's a big, obvious petrochemical plant, a refinery. I don't remember which one it is.

Maybe I've seen it from the highway, I don't really know.

Yeah. There was a woman who came to speak at the training. The Sunrise Movement has a lot of good things about it, and it is a very explicitly inclusive movement, but I think a lot of more seasoned activists from different backgrounds see all the energy and all the hope and worry that it's naivete. So one of the things the trainers did was make a point of bringing in some outside organizers, to coalition-build, and give them space to speak. One of the most interesting was a woman who lived down the street from that refinery. She talked about what a visible symbol it is that this is happening in their community, which I believe is a mostly Black community. There are a lot of reasons why this plant emitting chemicals is just this obvious blight, this thing people want to fight. And there's a thing that happens on a regular basis: protest groups get really upset about it, decide to go make a stand, and march down the street. And she says, I live there and I'm an organizer, and not even once in however many decades have they come by and said, hey, we're thinking about doing a strike in your area, do you want to join? And so I mean
it's kind of related to the discussion I've heard around how easy it is to solve problems over in Africa versus here: you can "solve" their system because you can't solve yours at home. It's not that it's easy; it's that you don't understand the complexity.

So a friend from my program actually wrote her dissertation on the Chevron refinery, which is one of the largest refineries in the nation. She wrote about the community activist groups, and about how the refinery greenwashes everything: Chevron will donate a whole garden and all these sustainable things to your neighborhood so they can look green. That goes along with, not color blindness, but what I call whitewashing in the extractive-logic paper: covering everything up, all the facts that they're profiting off people's health. Something else I wanted to bring up is this idea of bias. Data justice people and technology people always say, oh, this algorithm is biased, et cetera, but coming from a sociology-of-science perspective, we sort of equate objectivity with what we're calling color blindness, and the word bias implies that there is some sort of objectivity. But we know objectivity doesn't really exist.

Yeah. As a society we're beginning to laugh at people, and maybe this has been around a long time, who say they don't see color. I think the people who are closest to not seeing color have to be the ones who make sure that they do see it. And this is my way of getting to a confession: I worked at a fashion-tech startup before Query, and we were training computer vision models to extract attribute data from fashion product images. So: this is a gray sweater, it's a medium gray sweater, it's a medium gray men's
sweater, and from just the picture of a model wearing it, we were trying to teach computer algorithms to do that. Now, when you get started on a project like that, you need something like 10,000 images classified on a taxonomy, and you feed that data into a neural network so the computer can learn. When I began working there and saw what we were doing, I realized almost all of the models we were using were white, because that's probably the most cost-effective way to get lots of pictures: most fashion models are white. And I was curious how the system would do when trying to identify, say, a blue bag carried by someone who's not white. We didn't get that far; the company only lasted 11 months. But in order for the algorithm to be color-blind in the good sense, it needed to see: this is a blue bag carried by a Black person, this is a blue bag carried by a brown person, this is a blue bag carried by a white person. Computers are very dumb until you make them very smart, and it can be really expensive to source all of that training data. Imagine you're about to compete for a project tagging bag colors for a giant client: you want to do it quickly and cost-efficiently, so you end up searching for images of white models carrying blue and brown and yellow bags. And like many tech startups, we weren't doing life-or-death work, but that's kind of not the point, because the foundation of the work and the approach we were taking, if copied by everybody, encodes the wrong kind of color blindness, where I can't tell you what color this bag is because a Black model is holding it. That's a big problem. And not just because it costs more to figure out the color of that bag when someone has to go in and manually select it: the computer has never seen a Black person before, this is a blue bag. To get to color blindness you have to put all of these colors at the front of the system's memory. I equate machine learning, in the application we were using it for, to a young child who fell from another planet, doesn't speak any language you speak, and all you do is show them a picture and say blue, and another picture and say green. Eventually, after thousands and thousands of images, they figure out what green means and what blue means. That's essentially what you're doing with computer vision. But without giving more of the context, like I said, green purse, white woman, tall, whatever, it's going to be blind to all of those other aspects. Just a little confessional from my life.

It's really interesting, because I come from a startup background too, and one of the things you really feel at a startup is how urgent it is to spend little and make lots. Frankly, it's why I left Silicon Valley. I wish Brendan were here, because we've talked about this a little: one of the things Brendan and I have talked about is how different it is when he works at Query versus when he works at EDGI. Within EDGI, one of the things we do is work very slowly; we give the problems a lot of time and distance and thought. It means we don't get things done at product-style speed, and for those of us who actually really enjoy making product, that's maddening. You want to be able to produce, otherwise you feel like you're not doing anything useful, which is a whole other thing.
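One concrete habit that follows from the bag-color story above: never report only an aggregate accuracy; break the evaluation out by the demographic attribute you suspect the model is blind to. A minimal sketch, where the records, field names, and numbers are all invented for illustration:

```python
from collections import defaultdict

# Hypothetical evaluation records: the model's predicted bag color, the
# ground-truth color, and a model-wearer demographic tag that the
# training pipeline otherwise never looks at.
results = [
    {"group": "white", "pred": "blue", "truth": "blue"},
    {"group": "white", "pred": "blue", "truth": "blue"},
    {"group": "white", "pred": "grey", "truth": "grey"},
    {"group": "white", "pred": "blue", "truth": "navy"},
    {"group": "black", "pred": "blue", "truth": "blue"},
    {"group": "black", "pred": "grey", "truth": "blue"},
    {"group": "black", "pred": "grey", "truth": "navy"},
    {"group": "black", "pred": "navy", "truth": "grey"},
]

def stratified_accuracy(rows):
    """Accuracy computed separately for each demographic group."""
    hits, totals = defaultdict(int), defaultdict(int)
    for r in rows:
        totals[r["group"]] += 1
        hits[r["group"]] += (r["pred"] == r["truth"])
    return {g: hits[g] / totals[g] for g in totals}

by_group = stratified_accuracy(results)
overall = sum(r["pred"] == r["truth"] for r in results) / len(results)
print(f"overall: {overall:.0%}")        # the number a demo would show
for g, acc in sorted(by_group.items()):
    print(f"  {g}: {acc:.0%}")          # the gap the aggregate hides

# Flag any group trailing the best-served group by a chosen margin.
best = max(by_group.values())
flagged = [g for g, a in by_group.items() if best - a > 0.2]
```

On this invented data the aggregate looks mediocre but survivable, while the per-group breakdown shows the model failing one group badly; the point is that the disaggregated view, not the single number, is what surfaces the problem.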
But there's huge value in doing things slowly, in trying things you don't complete because you decided they weren't the right thing. I don't want to be the person who always goes back to, oh well, capitalism, but there is something very real to that, right? This is us deciding what matters for ourselves, and therefore for the parts of society we have influence over. In my context it was: can we really afford to try to hire for diversity? Which was, like, can we afford not to? The nuances of getting caught up in capitalism.

I was just thinking that even EDGI has that problem with hiring. It's not that you want hiring to be fast, but you don't want it too slow either, because then it drains all our energy, and then we use that as an excuse, a shortcut: we just don't have time for it, so we'll hire the easiest way, and the easiest way is to stay within our networks and the people we already know.

Yeah, exactly: because it's really easy to get a lot of pictures of white people.

Completely. This whole fast-paced thing is our life in this society: we are always going, and that's what neoliberalism is about, this idea of being extra efficient, maximizing your profits, always needing more money. Sometimes you have to take a step back and think, and I think coronavirus is helping us do that: saying, wow, there are things that are deeply flawed in the systems we've created for ourselves, but we also need them, or how do we live outside of them? Now, I'm very disturbed this week. I get how difficult it must be, well, I don't personally get it, but I get
that it is difficult for people to work from home while their children are not in school and not in childcare of any kind. On the other hand, I've been talking to a lot of parents who are basically saying, I don't know how to entertain them other than putting them in front of a screen, and I'm like, did we forget that you can play music and talk to people and go outside? There are so many ways of being in the world that don't involve screens, and genuinely I think people have forgotten that, and it's really biting people right now. It feels hard to watch.

I mean, I'm lucky, I can go out there. But in the city, maybe you're just in an apartment complex and you actually can't go anywhere, so you kind of have to live in this virtual world, which is why our algorithms and bias trainings are so important, because everything we do is mediated through a screen, and especially through the internet. Who controls the internet, and how?

Yeah. And the thing I'm most afraid of is that it's either not a person, or not a good person, or not enough people. By controlling the internet, I'm going to boil that big idea down into something small, like my LinkedIn network's "people you should say hi to." Say 25 of my friends get a new job, and LinkedIn recommends five that I should congratulate. How it chooses the five to put in front of me is a decision made by, kind of, no person, because there's an algorithm behind it, based on the last five people I've messaged, or the last five people I haven't messaged. It's encoding whatever behaviors, even if they're behaviors specific to you. What am I trying to say: it's encoding the ways of working that exist today and locking them in
unless you proactively change them. Whenever I think of a Spotify song I don't like, I worry about disliking it, because then I think I'm never going to hear that artist again; the computer's going to learn too much from my disapproval. I don't know the algorithm LinkedIn uses to recommend who I should say hello to, but if I pass on two women in a row and it then stops recommending that I congratulate women on new jobs, I'm not going to remember that someone now works in an organization that could be great for my business, and so on. It needs to not just encode your previous behaviors, because a lot of our previous behaviors are based on the networks we grew up with, and a lot of us have been locked in these segregated worlds. Those worlds are not going to change unless those algorithms are cracked into from some other point.

I was thinking about doing this with Spotify, because I like too many bands that are all white dudes, and I really need to stop that. But I like the music that I like, and when I like things, Spotify goes, oh, she really likes white-boy punk bands from the '90s, so we're going to recommend more white-boy punk bands. So I've made a rule for myself: if it's all cis white dudes from the past five years, I'm not listening to it.

The best part is that your music taste is supposed to be based on music you've heard and liked before, so it might actually work: you might begin to like other sounds.

I mean, I have a really eclectic music taste. You know your daily mixes: one
will be jazz, one will be hardcore punk, and another will be, I don't know, shoegaze or something.

Yeah. What I think is really interesting is this: I would expect that Spotify, LinkedIn, Facebook, all of them, could specifically tag people and bands as white, male, female, Latino, whatever. In most forms of machine learning we do actually choose what parameters are being looked at, and we pick certain ones because they're easy to identify, even though they're in many ways the least interesting. But then we're talking about the fact that these ones are not chosen; there is no equity in how these algorithms are built. There's no algorithm, I'm assuming, on Spotify that says, oh, this person's listening to too much white music, we're going to surface some Black artists.

And this is, I think, a perfect distillation of your point, Lourdes. I would bet that most data sets at Spotify are not coded by the race of the artist, just the genre. Because '90s punk bands were mostly white, those things get correlated tightly, so when you keep listening to that, you keep getting whiter artists. In that sense the algorithm was color-blind, but because it was color-blind, race got tied to another correlated feature, and it ended up pushing white music, or I shouldn't say white music, white '90s punk. If there were somebody on the engineering team who said, I'm going to tag these artists by the predominant race of their members, or something like that, great. Or, you know, I'm worried this Black punk band in the East Bay is getting no exposure, so I'm going to weight things like, hey,
I'm going to start putting bands in front of you that are a little more local to your place. There are other categories, not race-specific, that can get other kinds of music in front of you. Does that make sense? If I'm in New York and I listen to NOFX and Pennywise, those are my bands, but if Spotify was aware that '90s punk rock was a white-dominated genre, and because I'm a New Yorker it pushed a Black punk band from Queens into my feed, and I thought, this sounds cool, and oh my god, they're from Queens: I'm much more likely to check them out because they're from Queens than because they're Black.

I know their algorithms incorporate the sound wave, and match not only by genre but by the melodies and sounds, so that will match your interests too. So that's one way to get people to listen to more music by people of color. But I feel like, at least before all this material and these readings came out, the act of tagging a band with their race, how white or not white they are, would be perceived as racist. And then there's the question of who are you, some white dude coder, to determine the race of a band? So then you would have forms, when a band puts up its information on Spotify, asking, what's your demographic, and the band would be like, why is this a question here? But I think that should be a thing.

I think that's really interesting. I would love to see recommendation engines like that. You definitely see a lot of blog posts, and especially librarian curations, of how to read for a specific story that might not be part of the existing mainstream. Like when I was in the
Portland library recently, I picked up a couple of flyers about movies and books centering Indigenous voices, or the whole #OwnVoices thing, where you're specifically reading books that are written by the people they're about. It's actually kind of interesting: in our modern society, not everybody is interested in reading for diversity, but there's a huge population that is. I think there's a market available for recommendation engines that specifically give you stuff outside the norm, outside the mainstream of what everybody's reading or is used to reading.

Yeah. I just searched Spotify for "punks of color" and there's one playlist. I mean, that's a great band name. And I think it does start with putting together playlists and book lists, curating things, and then fostering those tastes. I don't know. Freaking color-blind people.

I think this stuff is really interesting, because in theory we're talking about algorithmic bias as something encoded in algorithms and data, but we also have influence over how people think and perceive what normal is, just by creating models of it. One of the things I was doing for a while, as a pet hobby, was going through those book lists, for whatever diversity category, of things released in the last year. My local library has a "recommend this book" option, so I could just put these in to the library. I wouldn't even necessarily have to read them; they'd just be on the shelves. It felt really powerful, because they were books people didn't even know existed.

Switching gears a little: I think I also shared an article about TikTok moderators
being alerted when a TikToker who has Down syndrome has at least 6,000 people watching their video: a moderator is queued to check in and make sure people aren't making fun of that person. So I'm also fascinated by the unintended consequences of really well-intended policies and practices. You can make all the arguments in the world for why, today, that's sort of a necessary process. At the same time, imagine being someone with Down syndrome and knowing that if I reach a certain level of popularity, I'm going to have moderators in my feed immediately. That's hurtful. Or maybe not, I don't know, everybody can interpret it differently, but I would feel othered by it. We're fumbling through a lot of this: best intentions, tech as it is, the world as it is, and the output is really messy.

It is sort of patronizing, in a way, someone looking over you.

Right. For me, I love it when people intervene and tell people off. There's an organization, I don't know if they still exist, of white folks who come in and intervene in threads of racist people being racist, to take the load off the person of color in that thread who's saying, y'all are being racist. They'll step in and educate. I think that's cool, a good way for white people to engage in solidarity, but only when they're requested. And I think that goes back to the design justice thing: you ask people if they want to contribute. This goes for EDGI too: you don't want to put the burden of contributing on people; you want to give them space and facilitate their contributions, and if they want to take a leadership role or take charge, have that space open, but don't press it.

It could actually be a
a really cool feature, actually, if TikTok asked when you signed up, "hey, are you a member of an XYZ vulnerable population? Because if so, we can prioritize moderation requests that you submit."
Yeah. I feel like I should have really read through the whole case, but it read to me as less positive than you presented it. Yeah, no, I agree; that's kind of how I took it too.
I guess I'm really curious about how folks think we can get around algorithmic bias, given that machine learning is definitely happening. This Vox article talks about it. Kevin, go ahead; sorry.
I was going to say: just give me a chronological timeline again, and no ads. Don't push anything; just let me find things through serendipity. I do weird hacks when I do use social media like Facebook: you can create personal lists, and I only use it through the browser, not through the app, even on my phone. That way I'm able to see things chronologically without ads, and it's a very different experience, because I start to see stuff from people I normally don't interact with, which is nice. So I would ask: how do we turn that off? Is there a feature where we just turn off the algorithm? Do we always need an algorithm to push something out? I think the option of turning it off completely when you don't want it would be a nice feature.
Yeah, I mean, transparency sounds like the very first step. This is the tension, because the algorithm is the bread and butter, the trade secret, but if you knew exactly how it worked you could adjust your behavior accordingly, or maybe ask for changes. I basically know that my Instagram feed is some combination of things I like and things that look like the things I have liked.
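The "turn the algorithm off" idea in this exchange boils down to swapping the sort key on a feed. Here's a toy sketch of that; the `Post` fields and `engagement_score` are invented for illustration and don't reflect any platform's actual ranking:

```python
from dataclasses import dataclass
from datetime import datetime

# Toy model of a feed item; `engagement_score` stands in for a platform's
# opaque ranking signal. All names here are invented for illustration.
@dataclass
class Post:
    author: str
    created_at: datetime
    engagement_score: float

def ranked_feed(posts):
    # Default platform behavior: order by predicted engagement.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_feed(posts):
    # The "algorithm off" option: newest first, nothing boosted or buried.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

posts = [
    Post("a", datetime(2020, 3, 1), engagement_score=0.9),
    Post("b", datetime(2020, 3, 2), engagement_score=0.1),
    Post("c", datetime(2020, 3, 3), engagement_score=0.5),
]
print([p.author for p in ranked_feed(posts)])         # ['a', 'c', 'b']
print([p.author for p in chronological_feed(posts)])  # ['c', 'b', 'a']
```

The same posts, two different orderings: the point of a "turn it off" toggle is that only the sort key changes, not the content.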
Instagram's the worst in that way; sometimes I get those ads and I'm like, yes. Instagram is kind of widely recognized as doing ads pretty well compared to other platforms, it seems.
And the other thing is, okay, big advertising, big marketing, but the fact that you can put ads on Facebook for not a lot of money really helps. If EDGI wanted to put up an ad, it would only cost a couple of bucks. That's what's really cool about the technology, but then people can also spread misinformation using it, and can target, say, African-American populations with misinformation and do damage that way. It's really dangerous, weaponizing the ability to specifically target people.
That leads me closer to the dweb stuff, like federated social networks. How much does it make sense for me to be connected to somebody I don't know in Egypt right now, versus having social networks with my neighbors? There's, I forget what it's called, Scuttlebutt, where you can see folks on the same network right around you. Is there a place for smaller social networks instead of the ginormous spaces where we're all on the same thing? That's a question I wonder about: what would that be like?
Yeah, that's a sword that cuts both ways, because the white Kevins of the world, who have the same genuine interest in keeping their world a little smaller, if they grew up in white worlds, are then keeping their all-white worlds more tightly wound, unless they're proactively breaking against that.
Yeah. I mean, and there's this
other question: how much of these social networks is our real life? Why are we spending so much time on them? And because we're spending so much time on them, it makes it easier to self-isolate in these bubbles. If we had to step out and deal with our neighbors more often, how much would that change things? I don't know. It depends on where you're living too, I guess. Depressing, yeah.
I think I've just chosen... I mean, I'm already 35, and race has been a problem in my country for 500 years and a problem in the world for far longer than that. I've accepted the fact that it'll be a big part of the lives of many people in the country I live in for the rest of my time here. So instead of holding the goal of erasing all of that over another lifetime, I want to reduce the worst parts of it. To bring this back: the fact that an autonomous car can't as easily identify a Black neighbor of mine is a majorly scary problem, and it seems both really scary and really fixable, so I personally want to focus efforts where those two things intersect. Let's get the big, dangerous things reduced however we can. There may only be three Black punk bands in the United States, and for there to be six might be progress, but that's a longer project.
You have Bad Brains, which is homophobic, and people are like, oh, Bad Brains, they're a Black band. It's like, yeah, but still.
So I was reading an argument about machine learning; I'm trying to remember where it's from, I think it was an article. It was making the argument that this is the only moment we can influence whether algorithms have our same biases, because pretty soon machine learning will work faster than the human ability to feed machine learning. Although I don't know if that's a full argument, because it's just another one of those things that's huge and out of
our control.
Well, there's that question posed by one of the articles, in this same vein: should we even do this at all? Like the facial recognition article: what is it actually useful for? Just unlocking our phones and tracking everybody wherever they're moving? I mean, Instagram's face-changing filters are fun facial recognition, but yeah, I agree with you. It's like the argument I feel like I've heard, that it's a good thing we developed the H-bomb because it brought us all these other technologies that we really like.
I mean, technology isn't bad, and it isn't good; it's another tool. The only thing I'm sure about is that it's going to happen. I don't really believe in the human ability to suppress ideas in any kind of permanent way. Any other thoughts?
Some of the stuff I was reading made me think about the ECHO project, and I had questions, maybe for whatever study group we're doing; sorry I haven't been involved as much. I heard Kelsey say this: basically these permits are permissions to pollute, right? That's what all these permits are; they give these companies, whatever they happen to be, permission to pollute. And I'm curious, and this is related to design justice: how are these thresholds determined? How are those determinations made? Is anything thought about in aggregate? Okay, it's fine for one power plant to pollute that much, but what if we had ten of them next to each other? Is there any thought to the cumulative effect of all of them being allowed to pollute? I know you've been sitting with this; are those the same questions that you have?
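The cumulative-effects question raised here can be made concrete with a back-of-the-envelope check: facilities that each comply with an individual permit limit can still, in combination, exceed a health-based aggregate threshold. The function names, limits, and numbers below are invented for illustration, not real permit values:

```python
# Back-of-the-envelope illustration of the cumulative-effects question.
# All limits, thresholds, and numbers below are invented, not real permit values.

def each_within_permit(emissions, per_facility_limit):
    # The check a per-facility permit performs: is each source under its own cap?
    return all(e <= per_facility_limit for e in emissions)

def aggregate_exceeds(emissions, cumulative_threshold):
    # The question per-facility permits typically don't ask: what is the combined load?
    return sum(emissions) > cumulative_threshold

# Ten hypothetical plants, each emitting 8 tons/year under a 10 tons/year permit.
emissions = [8.0] * 10
print(each_within_permit(emissions, per_facility_limit=10.0))   # True: all individually compliant
print(aggregate_exceeds(emissions, cumulative_threshold=50.0))  # True: 80 tons combined
```

Every facility passes its own check while the neighborhood total is well past the hypothetical cumulative threshold, which is exactly the gap the question above points at.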
Yeah, and that's what I've studied: those thresholds being generally based on white male bodies, and on the idea that the dose makes the poison, which is not really reflective of reality. And there's the fact that companies prevent certain science from happening; it's what the STS scholar Scott Frickel calls undone science. There are just loads of studies about these chemicals that never get done. That's why I really like the design justice article: it calls those questions out again. The people being affected, why aren't they involved in the determination of these permits and how they work?
I think there's a new algorithms officer for New York City. I don't know exactly what their job and powers are, but I think one of the big missions of the people who cared about that position being created and shaped was that any algorithm that goes into a city process is made public, along with the underlying training data if it's machine learning, so that at least people can take a look at it. This is sort of what I mean, Kevin, about your point of getting to tune the algorithm for yourself: the first step would be at least understanding how it basically works as an instrument, and then going from there.
I'm trying to think of an interesting city example. Oh, there was a police effort in Chicago, I think it's still live, sort of like predictive policing. If you've seen Minority Report, it's kind of like that: they'd crunch all this data and say, look, you are likely to commit a crime or be the victim of a crime based on these factors, so we're knocking on your door just to check that you've got housing, a job, and so on, because these are the four things that, if you get them, make you much less likely
to be on our list in the future. So, how's your job, et cetera. And this was wildly controversial, for all the reasons that are probably obvious to us. A lot of people in communities that had their doors knocked on were like, why the fuck are cops knocking on my door proactively like this? And other people were like, oh my god, I'm so glad they're checking to make sure my housing is safe; they helped me make a complaint about my landlord. So: very brave new worlds. You can see all the obvious reasons for concern, but that's the best version of predictive policing I could imagine.
Yeah, I agree with you, Rico, that the transparency of it helps us make that determination. It's like a similar thing with my friend: she works at a school, and they wanted to use some type of data analysis to find students at risk of failing and reach out to them first, but there's all this identifying information involved, like their parents' income. So yeah, brave new worlds, like you were saying.
I remember I have a note here from when I read the design justice piece; I'm going to decipher it, and maybe it will bring something up for you. It says: the design process itself matches product best practices to ensure there's a market, but it becomes potentially extractive at the point of profit; balance versus incentive; and the challenge to design and create a useful thing. Because there is something really interesting, reading that article about design justice. My education in design comes from a school of thought called user-oriented collaborative design, and it embodies the same principles outlined in the paper to a large degree: you choose who your product is
going to serve, and then you go and meet some of them, and you find out what your product ought to be by interviewing them and then showing them rough prototypes until you have something that really solves their need. It's really cool, because you're identifying real problems; you can't not identify a good problem that way. On the other hand, you've now used the input of these people to create a product that you wish to sell to them, and everyone like them, for a profit. And I'm not anti-capitalist enough to deny that there is some level of, well, you deserve to make some money for going through this process; the design and creation of a product is real work, useful work. But exactly how much, and where it goes, is really much more a function of how much you can get for it.
Yeah, and I'm not a fan of top-down models, and I think what most of these pieces are saying is that we need to shift from top-down to bottom-up. I personally think that corporations and companies should be owned by the workers, by everyone who is in them, maybe with shares based on how long you've been with the company and how much, I don't know, student loans you have to support. That's again sort of what EDGI is, even though we have issues with how much we pay people according to skills. What if we lived in a world where all skills were valued equally? Someone doing the managing makes as much as someone doing the cleaning, because they're both working the same number of hours. I don't know. And everyone has a stake in the organization that they keep running. Wages for housework as well, yeah. Or maybe also think of economies that are outside of money.
Yes. I'm more concerned with the idea that we have to give people an incentive to do things. It's like, come on; you don't go into academia to make money.
I think, as an MBA of a
hopefully high-growth tech startup, and a white guy, and a capitalist, and a Warren supporter, I sort of think it'll be "revolution" enough to have, to your point, greater worker ownership of the firms people operate in, whether that's quotas for companies over a certain size, like 10 or 20 percent, or something like that. But the massive accumulation of wealth among so few people: capitalism was evil and terrible 60 years ago, and that system was wildly preferable to the one that exists today. Elizabeth Warren has made this point many times: a lot of people talk about racial harmony coming once Black families make closer to the median net income in America, and her point is, do you know how much closer those families' net income was to the median in the 1950s and 60s? Wealth has gone like this; the world has gotten way more stratified. And racial progress has, depending on how you measure it, advanced in a number of areas, so we have to unbuckle those two things a little bit instead of just hoping that race gets figured out once everybody's middle class again. But you know the idea that LeBron James would have to work 1,200 years at his current salary, 1,200 years at LeBron James wages, to amass enough wealth to be Jeff Bezos? And how many people would have to work 1,200 years to be LeBron James? It's way out of whack.
They need to pay my student loans, and they need to, I don't know, give us money, because nobody works hard enough to make that much money, right? Like 5,000 years of work; it's just not fair. I hate it whenever someone gives me that bootstraps thing. Really, you think that someone who works 40 hours a week should make that much more than someone who works 20? I'm angry now. I forgot
the phrase, but I think a lot of the folks who support the billionaires believe they'll be billionaires themselves soon enough, and it's like, that's not what one percent means. The estate tax doesn't impact anybody who's trying to give less than 11 million dollars to their children. That's a lot, and everybody complains about this "death tax" and not being able to bequeath whatever little wealth we all have to our children anyway.
I mean, that's another thing: inheritance. I think people have to cut out the large inheritances. It's okay to pass down a few thousand or whatever to help when you die, but more than that I just don't think is fair.
I mean, that's how racism is constructed: even if no money is passed down, you probably have an easier education and social support. Right, yeah; those things compound.
So we're coming up on four; for me it's six, or whatever it is over there. Wait, Kevin, do you want to read that quote out? I feel like it'd be a great closer to our video.
It's just: things aren't better because "the poor see themselves not as an exploited proletariat but as temporarily embarrassed millionaires." By John Steinbeck. And I'm rereading The Grapes of Wrath right now; a lot of isolation means plenty of time for reading. Reading it for the first time, yeah. That's, like Kelsey was saying, paying for that dream.
I got bingo: proletariat, workers, data justice, capitalism. Yep, "proletariat" hit right at the end there. Okay, y'all, we made it. Good seeing everybody. Wash your hands, stay away from your grandparents, take care, everybody. Bye.