Good afternoon. Thank you all so much for being here at the end of a crazy semester. My name is Joy Rohde. I'm the interim director of the Science, Technology, and Public Policy program here at the Ford School of Public Policy, and today I have the great pleasure of introducing our guest speaker. But first I need to thank a number of people who made today's event possible. First of all, the Ford School's Policy Talks program: thank you very much. Our talk today is also cosponsored by the School of Information, the Science, Technology, and Society program, and Poverty Solutions, and on behalf of STPP we thank these programs for their support. Today's talk is also supported by our STPP graduate certificate students, who run a group that is open to students across the university, regardless of whether or not they're in the Science, Technology, and Public Policy program, who are interested in science and technology policy. The group is called Inspire. They're awesome, and they are also co-sponsors of today's event.

Now to introduce our speaker. Virginia Eubanks is associate professor of political science at the University at Albany, SUNY. She's the author of Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, and I think it's one of the most important books I've read in the last few years. She's also the author of Digital Dead End: Fighting for Social Justice in the Information Age, and co-editor, with Alethia Jones, of Ain't Gonna Let Nobody Turn Me Around: Forty Years of Movement Building with Barbara Smith. Her writing about technology and social justice appears not only in academic print but in The American Prospect, The Nation, Harper's, Wired, and other outlets. For two decades Professor Eubanks has worked in community technology and economic justice movements, and I think for those of us who are scholars who also aspire to make a difference in the policy world, her work and her career are great models for us. She's a fellow at the New America Foundation.
And she's also a founding member of the Our Data Bodies project, an organization that works in communities to demonstrate how digital data collection and storage systems of different types impact things like community reentry, access to fair housing, access to public assistance, and community development programs.

Just to orient you: following Dr. Eubanks's talk there's going to be a Q&A, and there are little pencils and note cards going around. I encourage you to share your questions that way; staff will come around and pick them up. And if you want to do it via Twitter, it's hashtag #policytalks. Today we have two STPP students helping us with the Q&A, our Inspire leaders, Jackson Voss, down here, and Laura Greer, and they are assisted by my right hand, the STPP program manager, Dr. Molly Kleinman. After the Q&A we'd like you to join us in the Great Hall for a reception. There's also a book signing. Let's just get right to it. Please join me in giving a warm welcome to Professor Virginia Eubanks.

Thank you, guys. How's everyone doing? Welcome to the end of the semester. I cannot believe you're here. It snowed today, it was icy, it is the last week of the semester, and I'm just very, very impressed. I'm incredibly flattered and gratified to be here, to be part of this conversation. Great thanks to both Molly and Joy and Erin and everyone else who not only thought to invite me, and supported that invitation, but also did all the hard work to get my physical body here from upstate New York in the middle of winter.
You would think it would be early winter, but this is our third or fourth snow now. So I just really appreciate all the hard work that went into getting me here, and I want to say thank you for that.

My plan for today is to talk for maybe 40 minutes about the book. I'm going to assume that most people haven't read it and give you a little bit of the story of the book. I'm going to try to do that with a real focus on introducing you to some of the families who shared their stories with me when I was doing the reporting for this book. It's really important to acknowledge that the folks who went on the record with their real names and their real experiences for this book did so at enormous personal risk. Many of them were currently relying on public assistance for their basic material needs, things like food or shelter; folks who are unhoused; or folks who are currently part of a child welfare investigation. For them to talk on the record about their experiences is a great gift, and so I try to always make sure that we're starting from their points of view, and that I acknowledge how much of this work is made possible by their incredible generosity and courage.

So I'm going to talk a little bit about history, I'm going to talk a little bit about the three stories I tell in the book, and then I'm going to draw out some common ideas that I think are worth talking about more, the sort of ideas that are portable out of the stories that I tell. And then we're going to leave plenty of time for conversation, for questions and answers.
So that's my goal. We're going to hopefully start out with good energy. I ask, when everybody's energy is still good, that you give me up twinkle fingers, and when you start to fade, the fingers come down. And when you really need me to shut up, honestly, you can give me the downward twinkle fingers and I'll stop, eventually. Not right away, but I will take it into account. So that's the overall plan for the time we have together. Does that sound good to folks? All right, twinkle fingers if you're good. Yeah, okay. Thank you. Good. I always like to have a little bit of feedback on how we're going.

Okay. So I'm here to propose that we're building a digital poorhouse: that despite the fact that new data-driven digital technology has incredible potential to lower barriers in social assistance, to speed results, to create efficiencies and cost savings, what we're actually doing is building an invisible institution made up of decision-making algorithms, automated eligibility processes, and statistical models across social services in the United States. So I want to talk today about how the rise of this digital poorhouse responds to, and kind of recreates, a narrative of austerity: this idea that there's not enough for everyone, and we have to make really tough choices about who deserves to attain their basic human rights.

So even though we like to talk about our newest technologies as disruptors, the tools that I talk about in Automating Inequality are really more evolution than revolution, and their historical roots go really, really far back in our history, at least to the 1820s. Here's the moment where I always take a second to note that my wonderful editor at St.
Martin's, Elisabeth Dyssegaard, helped convince me that I didn't need to go back to 1600 to start the history, and that what was originally a 95-page history chapter opening the book could come down to a svelte 26-page history of 200 years of poverty policy, which was probably enough. So I write a fair amount about history in the book, but I'm just going to talk about one moment in that history today.

Around 1819 there was a huge and very crushing economic depression in the United States, and economic elites got very, very nervous, both because of the economic crisis, but also because of some really fearless organizing that poor and working people were doing to protect their families and their rights. So, as economic elites do, they responded by commissioning a series of studies, and the studies asked: what's the real cause of suffering here? Is it poverty, a lack of access to resources, or is it what they called at the time pauperism, which meant dependence on public benefits? So this is how the studies are set up. What's the problem: is it poverty, or is it dependence? What do you think the answer to the studies was? Dependence, right, exactly. This does not surprise us, because we're still doing the same studies and finding the same results now.

So the solution for them was to create a set of brick-and-mortar public institutions that basically raised the barriers to receipt of public assistance so high that no one except the absolutely most desperate people would possibly apply or ask for help.
So what they did was build what was originally intended to be a network of public poorhouses in every county in the United States, and they required, as one of the conditions of receipt of public assistance, entering into this institution. And this was really no easy choice. They were technically voluntary, though you could be sentenced to a poorhouse as well, so many of the people who were there were truly inmates in the real sense of the word, though all of them were referred to as inmates. But folks who entered voluntarily, and folks who were sentenced to the poorhouse, were required to give up established rights. And this is the 1820s, so not everyone shared these rights. But some of the rights that you lost were the right to vote, the right to hold office, the right to marry, and also the right to family integrity. So folks entering the poorhouse often lost their children, because the idea at the time was that poor children could be rehabilitated by interacting more with wealthier families, and by interaction they usually meant working for free as agricultural or domestic laborers. And finally, death rates at these institutions were often just astronomical, often as high as 30 percent annually, meaning about a third of the people who entered the poorhouse every year died. So people were literally taking their lives in their hands in entering these institutions.

Yours, by the way, was on Washtenaw Ave near Platt; it is now where County Farm Park is. This is a key phrase, by the way: a county farm means that's where the poorhouse was. If you have a County Farm Road or a County Farm Park, the same is true. It became the county infirmary after the rise of welfare in the '30s, and it didn't close until 1971. And there's actually a very strong University of Michigan connection, though it is an unfortunate one, which is that after 1880, unclaimed bodies from the poorhouse were given to the University of Michigan for dissection if their families did not claim them within 24 hours. So this is one of the great fun things I get to do in every
new town I go to: look up where your poorhouse was and what its story was. You have some pretty good records, by the way; the historical society has some good records from the poorhouse. I suggest someone go write about it right now.

Okay, so I use this metaphor of the digital poorhouse to illustrate what I think of as the deep social programming, or, for the technically minded in the crowd, the legacy programming, of today's digital tools in social services. At their heart is this decision that we made back in the 1820s that public service programs should act more as moral thermometers, separating the deserving from the undeserving, diverting the able or enforcing work, rather than as a universal floor under everyone.

I want us also to not just think about history, but to think about this political moment, about why these specific tools have become popular at this particular time. Because I think that these high-tech tools, which are intended to establish eligibility, to predict behavior, to measure effectiveness, have risen to prominence in social services now for three reasons. First, they rationalize and recreate a politics of austerity, this idea that there just aren't enough resources and we have to make hard decisions. Second, they promise to address bias, but in fact they really just hide it. And third, they create what I think of as a kind of empathy override that eases the emotional burden of making what are, I think, inhumanly difficult decisions about who among America's 43 million poor deserve support. So I'm going to use each of those points to introduce one of the stories I tell in the book, and I will introduce both the families
I spoke to and the technologies that I write about.

So the first point is this point about the digital poorhouse assuming austerity and, because it assumes austerity, recreating it. I dedicate Automating Inequality to a severely disabled young girl named Sophie Stipes. When Sophie was six, she received a letter from the state of Indiana that told her that she would be losing her Medicaid because she had, quote, "failed to cooperate in establishing eligibility" for the program. This happened just as she was gaining weight, really for the first time in a significant way in her life, because she had recently had a feeding tube implanted, a gastrointestinal feeding tube, and she was learning to walk for the first time.

So the Stipes family was caught up in an attempt to automate all the eligibility processes for the state's welfare system: TANF, for cash assistance; Medicaid, for medical insurance; and what at the time was called food stamps and is now called SNAP. In 2006, then-governor Mitch Daniels signed what would eventually be a $1.34 billion contract with a consortium of companies, including IBM and ACS, Affiliated Computer Services, to create a system that replaced the hands-on work of local county welfare caseworkers with online applications and private regional call centers. The result was a million benefits denials in the first three years of the experiment, mostly for this catch-all reason, "failure to cooperate," which meant just that someone, somewhere in the process, had made a mistake. The mistake could be the applicant's: they could have forgotten to sign page 34 of a 50-page application. It could have been the fault of a call center worker who misapplied policy and gave someone bad advice. Or it could just be some part of the computer system, like the document scanning center: they could have accidentally scanned a document upside down or dropped something behind a desk. Or someone could have
photocopied a driver's license and then faxed it to the document processing center, where it was scanned, and if you copy, then fax, then scan a driver's license, of course, you just get a black box on a white background. But the notices that people like Sophie Stipes received simply said that there was an error, not what the error was. And because it severed the relationship between applicants for public assistance and the folks who had in the past been caseworkers, who were once responsible for dockets of cases and were now responsible for lists of computerized tasks rather than families, the system virtually guaranteed that the burden of finding and fixing any problems with the application process fell squarely and solely on the shoulders of applicants, who were some of the most vulnerable people in the state.

And I just want to tell you one story about someone who lost their benefits during the attempted automation, and this is the story of Omega Young. In the fall of 2008, Omega Young of Evansville missed an appointment to recertify for Medicaid because she was in the hospital, suffering from terminal cancer. The cancer that began in her ovaries had spread to her kidneys, breast, and liver. Her chemotherapy left her weak and emaciated. Young, a round-faced, umber-skinned mother of two grown sons, struggled to meet the new system's requirements. So she called her local help center and let them know that she couldn't make this phone recertification appointment because she'd be in the hospital, but her medical benefits and her food stamps were still cut off, for failure to cooperate. Because she lost her benefits, Young was unable to afford her medications. She struggled to pay her rent.
She struggled to pay her rent She lost access to free transportation to her medical appointments and omega young died on march 1st 2009 on the next day on march 2nd She won an appeal for wrongful termination and all of her benefits were restored So that's a mega young and the indiana automated eligibility automation So the second point I want to raise is um that we believe that these new digital tools are objective and neutral But they often just hide bias Um and in this case I want to actually start with a family and then talk about the system So I want to talk to you just briefly about patrick greeb and angel shepherd So I met patrick and angel at the decaying Family support center, which is like a community hub Where families who are child welfare Who are involved in the child welfare system Attend programs access resources connect with other families provide peer support And they didn't stand out to me right away as sort of interesting people to report on Because their experience was so average It was really almost mundane characteristic of the sort of routine and dignities that are experienced by white working class people So they've struggled with low wage dangerous work poor quality public schools and predatory online education poor health community violence But through it all they've remained really creative and involved parents So patrick I talk about him in the book as I think I describe him as like a buddhist x biker He's this like rectangle of a man this like really large man with like really elaborate facial hair And the sense of incredible calm And one of their favorite parenting techniques They're raising they're helping to raise two young girls Angels daughter harriet and patrick's daughter's daughter desiré and the girls are roughly the same age And because they're so close to to each other in age they bicker a lot and so um When they're bickering too much what angel and patrick do is put them in what they refer to as the get along shirt And the 
get-along shirt is one of Patrick's enormous button-down shirts. They shove both girls into the get-along shirt, each girl puts one arm through one of the arms of the shirt and the other arm around the waist of the other girl, and then they button the shirt back up, and they're not allowed to get out of the get-along shirt until they stop fighting, even if they have to go to the bathroom. And this is the thing that Patrick says always works: as soon as someone has to pee, they stop fighting.

Despite this, Angel and Patrick have racked up a lifetime of interactions with Children, Youth and Families services, which is what CPS is called in Pennsylvania. Patrick was investigated for medical neglect in the early 2000s because he was unable to afford an antibiotic prescription after his daughter's visit to an emergency room. And then, when Harriet, Angel's daughter, was five, someone phoned in a string of reports to the child abuse and neglect hotline. This was an anonymous tipster, and they claimed that Harriet was running around the neighborhood unsupervised, that she was down the block teasing a dog, that she wasn't being properly clothed, fed, or bathed, that she wasn't getting needed medication. For each call, an investigator came out to the house, interviewed Harriet, Angel, and Patrick, looked in all the cupboards and under all the beds, and requested access to the family's medical records, and then, each time finding no evidence of maltreatment, closed the case. So each of these interactions was entered into the family's digital case file, which is held in an integrated data warehouse
in the county, which feeds what's known as the Allegheny Family Screening Tool, which is the tool that I report on, here in Allegheny County, the county in Pennsylvania where Pittsburgh is.

So Patrick and Angel are aware that each interaction they have with the wide array of services they receive from the county can potentially raise their risk score in this predictive model, and they described to me feeling like they lived in a state of low-grade, constant terror that there would be another call on the family, and that the algorithm would target their daughter or their granddaughter for investigation, and possibly for removal to foster care. Angel said to me, quote: "You feel like a prisoner. You feel trapped. It's like no matter what you do, it's not good enough for them. My daughter's now nine, and I'm still afraid that they're going to come up one day, see her out by herself, pick her up, and say you can't have her anymore."

So the Allegheny Family Screening Tool is built on top of a data warehouse that was created in 1999. As of the writing of the book, it held a billion records. That's more than 800 for every individual living in Allegheny County. But of course the warehouse doesn't actually collect data or information on every county resident equally.
In fact, the data extracts mostly come from county and state public service programs and agencies that interact a lot with poor and working-class families. So the system gets regular data extracts, for example, from adult and juvenile probation, from the jails and prisons, from county mental health services, from the county office of drug and alcohol addiction recovery, from the state Office of Income Maintenance, which is the state's version of welfare, from the public schools, and from a number of other agencies.

So the limits of this data set really shape what the model is able to predict, what it is able to see. And because it relies almost entirely on information that's only collected about poor and working-class families, the ways that it sees risk and harm are shaped by the kinds of experiences that poor and working-class people have with the state. Of course, professional middle-class families also need help with their parenting, because everybody needs help with their parenting. They probably request equal amounts of support, but often they're paying for it privately. So if you need addiction recovery support and you're professional middle class, you're likely going to get that through employer-provided insurance, and if you're getting it through private insurance, that information doesn't end up in the data warehouse. So information about that behavior could be described as missing from the data warehouse. If you need help with your child care but you can afford to pay a nanny or a babysitter out of pocket, then information about your family won't end up in the data warehouse.

So those limitations in the data set itself caused enormous concern for folks that I talked to when I was doing my reporting in Allegheny County. Parents mostly saw false positives as the problem. False positives just means seeing risk of harm where no harm actually exists. And this makes sense for parents, right? Parents said to me that
they felt like the system confuses parenting while poor with poor parenting, and they felt like it was creating a system of poverty profiling: that because it spent so much time investigating and risk-rating families in their communities, it created a feedback loop of injustice. It began with families having more data collected about them because they are interacting with county systems. Having more data meant their score was higher. Because their score was higher, they were investigated more often. Because they were investigated more often, more data was collected, and back around. So it became a kind of feedback loop, in the same way that many people have expressed concerns that predictive policing creates feedback loops.

From the point of view of intake call screeners, who are the front line of the social service system in child welfare, the folks who pick up the phone when a call comes in to the anonymous hotline, or who collect reports from mandated reporters, and who make the decision about which cases to refer for full investigation and which ones to screen out, the concern ran the other way. Frontline intake screening workers were really concerned with false negatives, for the same reason that parents were concerned with false positives. False negatives means not seeing harm where harm might actually exist.
So intake call screeners felt that since there wasn't data about professional middle-class families in the data warehouse, the kinds of behavior that might lead to abuse or neglect in those families wouldn't be recognized by this predictive algorithm, and they might miss really key information about the kinds of harm that happen, for example, in more geographically isolated places or in the suburbs, because that information would not be in the system.

The model's designers and administrators say that part of the point of this system is to root out bias in the system, and I think it's really important to be very direct that bias in child welfare is a profound issue in almost every county in the United States. The way that most people talk about it is around racial disproportionality. Allegheny County, like just about every county in the United States, has serious issues with racial disproportionality in its child welfare system. Something like 38 percent of the youth in foster care in Allegheny County are Black or biracial, while they make up only 19 percent of the youth population. So they're about twice as likely to end up in foster care as they should be based on their proportion of the population. And Allegheny County has been very serious about addressing this disproportionality, and part of that move has been to try to keep a closer eye on the patterns of decision making of these intake call screeners
I was just talking about a moment before. So this tool is intended to supplement their decision making: they make two clinical decisions, put them in the system, and then they run this tool, and the risk score shows up on a sort of thermometer, green at the bottom, red at the top, that goes from zero to 20. At the time I was reporting, if you got a score of 18 or above, the system automatically launched an investigation. Since the publication of the book, they've actually dropped that threshold, and now if you have a score of 15 or above, the system automatically launches an investigation.

So here's the thing that's really interesting about the idea that this tool can help them address bias. What the administrators told me was that they don't think this tool can necessarily solve racial disproportionality, but that it can help them identify it earlier and try to address it earlier. The issue with that is that the county's own research shows that intake call screening is not the point at which disproportionality, at which discrimination, is entering the system. Actually, the point at which the lion's share of discrimination enters the system is the point at which the community calls on families. Black and biracial families are called on three and a half times more often than white families, meaning reported either to the hotline or by mandated reporters. So that's a 350 percent difference. Once a case gets to an intake call screener, there is a bit of disproportionality that enters at that moment: call screeners screen in 69 percent of cases involving Black and biracial families and only 65 percent of cases involving white families. But that's a four percentage point difference, rather than a 350 percent difference.
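[Editor's note: the screening workflow described above, a zero-to-20 risk score with an auto-investigation threshold that dropped from 18 to 15, can be sketched roughly as follows. This is a toy illustration only; the actual Allegheny Family Screening Tool is a predictive model over county records, and the score function and field names here are stand-ins.]

```python
# Toy sketch of the intake flow described in the talk: a referral gets a risk
# score on a 0-20 "thermometer," and scores at or above a threshold trigger a
# mandatory investigation regardless of the screener's clinical judgment.

AUTO_INVESTIGATE_THRESHOLD = 15  # was 18 at the time of the book's reporting

def screening_decision(risk_score: int, screener_screens_in: bool) -> str:
    """Combine the model's score with the intake screener's judgment."""
    if not 0 <= risk_score <= 20:
        raise ValueError("risk score must be on the 0-20 thermometer")
    if risk_score >= AUTO_INVESTIGATE_THRESHOLD:
        # Above the threshold, frontline discretion is overridden.
        return "investigate (mandatory)"
    return "investigate" if screener_screens_in else "screen out"

print(screening_decision(17, screener_screens_in=False))  # investigate (mandatory)
print(screening_decision(10, screener_screens_in=False))  # screen out
```

Note that lowering the threshold from 18 to 15 widens the band of scores where the model, not the screener, makes the call, which is exactly the movement of discretion the talk goes on to discuss.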
That gap is something I think is profoundly important to pay attention to, because to me it sounds like a very sophisticated, expensive tool aimed at the point at which the problem is not entering the system. The problem is entering the system at the point of referral, which is very much about our cultural understandings of what a good, safe, and healthy family looks like, and in the United States that family looks white and heterosexual and rich. And my concern, or among my concerns, is that actually removing discretion from those frontline workers could remove a check on the massive amount of discrimination that's entering earlier in the process, and could potentially worsen inequality in the system rather than making it better.

Okay, I'm going to talk very briefly about this. One of the things that I think is really important about this system is the way that folks talk about it as removing discretion, or removing bias, from the system. A very smart political science friend of mine named Joe Soss says that discretion is like energy: it's never actually created or destroyed, it's only ever moved. So I think, among the really interesting questions to ask about this system, the question is not to frame it as: are we removing discretion, are we removing bias from the system? But: who are we taking discretion away from, who are we giving it to, and how might that affect bias in the system?
So in this case, we're removing discretion from frontline social service workers in child protective services, who are among the most diverse, working-class, and female parts of the labor force in child welfare, and we're giving it to the economists and the computer scientists and the social scientists who are building the models. And I think that actually creates new issues around bias, particularly because they tend to be much farther away from the problems that these tools are meant to help address. There's lots more to say about that, but I'm going to move on for now, and we're going to skip the proxy slide; ask me about proxies in the system if we have time in the Q&A. But I want to make sure that we have time to talk about the system that I reported on in Los Angeles.

How are we doing? Are we still up here? Middle? Down? We're starting to get middle-y. Okay, because people won't tell me, and that's all right. We're starting to get middle-y, so we're going to go fast through this and get to questions.

So my final point is that these tools, I think, at their worst can serve as a kind of empathy override, allowing us to outsource to computers some of the most difficult decisions that we face as a society. So, for example, the system that I reported on in Los Angeles, which is called the coordinated entry system, and which is not just in Los Angeles, by the way, it's very widely used across the country and around the world, responds to the county's extraordinary housing crisis. As of the 2017 point-in-time count, there are 58,000 unhoused people in Los Angeles County. I live in a small city in upstate New York; there are 10,000 more people homeless in Los Angeles County than live in my whole city, right? So this is a human rights issue of astonishing proportion. And something like a full 75 percent of unhoused people in Los Angeles County have no shelter at all, no emergency shelter.
They're living in tents and encampments, on the sidewalk, in cars.

So this system, the coordinated entry system, works by assigning each unhoused person they manage to survey a score on a spectrum of vulnerability, and to do this they use a survey with a kind of terrible acronym: it's called the VI-SPDAT, the Vulnerability Index and Service Prioritization Decision Assistance Tool. And the tool actually serves those at the top of the scale pretty well, folks who are chronically homeless and really need the kind of support that can only be provided by permanent supportive housing. It also serves folks at the bottom of the scale pretty well, folks who are crisis homeless and will be able to recover with just a limited, a time- and resource-limited, investment.

As of the writing of the book, they had managed to survey and rank 39,000 unhoused people, and they had managed to serve about 9,000 of them with some kind of resource. That's not necessarily housing; it could also be the more limited resources of rapid re-housing, so help with an eviction, or moving expenses, or some smaller amount of resource. All of those things are counted as a match in the system. But then there are the 30,000 people who have been surveyed but have never received a resource through coordinated entry, people like Gary Boatwright, who are strong enough to survive but not really able to get back on their feet by themselves. When you're classified as being not vulnerable enough to merit immediate assistance, but not stable enough to be served by the time-limited resources of rapid re-housing, this can end up leaving people feeling like they have been included in a system that has asked them to incriminate themselves in exchange for a slightly higher lottery number for housing.
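[Editor's note: the dynamic described above, everyone surveyed and ranked but only a fraction matched, amounts to triage under scarcity. A minimal sketch, with invented numbers and a single-score ranking rule; the real coordinated entry match process weighs more than one score:]

```python
# Toy sketch of triage under scarcity: coordinated entry scores everyone it
# surveys, but the supply of housing resources is far smaller than the
# surveyed population, so most people are ranked and then left waiting.

def prioritize(scores: list[int], available_units: int) -> list[int]:
    """Serve the highest-vulnerability scores first, until resources run out."""
    return sorted(scores, reverse=True)[:available_units]

queue = [17, 3, 9, 12, 6, 15, 8]      # hypothetical vulnerability scores
served = prioritize(queue, available_units=2)
print(served)                          # [17, 15]: the most vulnerable matched
print(len(queue) - len(served))        # 5 people surveyed but unserved
```

At the county's scale, the same arithmetic yields the figures in the talk: 39,000 surveyed minus roughly 9,000 matched leaves about 30,000 people ranked but unserved.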
And that's not a terrible analysis of how the system actually works. The system uses this survey, the VI-SPDAT, to ask people some very intrusive questions about how they spend their days and what their experiences are. The questions are actually pretty good at establishing actual vulnerability, so they ask things like: Are you currently having unprotected sex? Are you currently trading sex for money or drugs? Is there someone who thinks that you owe them money? Is there an open warrant on you? Have you thought about harming yourself or someone else in the recent past? Then it asks where you can be found at different times of the day, and then it asks if the person doing the survey can take a picture of you.

Folks have to sign a pretty comprehensive informed consent in order to do this survey, but it's hard to say that that informed consent is truly voluntary, because coordinated entry has really become the front door to almost all housing services in Los Angeles County. So the choice is really: give folks your data and hope that means you get matched with a better housing opportunity, or close yourself off from most avenues for housing resources in the county at all. Part of the informed consent says that there is information available about who your information is shared with, but there's a second process you have to go through to get that information. If you do request and manage to receive it, that document says that they share your information with 161 different agencies across Los Angeles County. And because of federal data regulations, and because this data is held in a homeless management information system, one of those agencies is the Los Angeles Police Department.

Now, they're not able to request any old information out of this system. They're not able to, say, run a query on the system that says, give us a list of anyone who is trading sex for drugs. That's not the kind of information they can get. But they are able to request information out of the homeless management information system based only on an oral request. There's no warranting system, there's no oversight, there's not even a paper trail. Basically, it means that a line officer can just walk into a social service office and ask for information out of the system. The worker is not required to give it to them, and I think it's really important that people know they're not required to give it to them, but they are allowed to give it to them, based on current federal data standards.

So I just want to tell you a little bit about one of the people I talked to when doing this reporting, and then we'll move on to big ideas and questions. In Los Angeles, one of the folks I talked to was a guy who goes by the nickname Uncle Gary: Gary Boatwright. When I met Uncle Gary in 2016, he was living in a gray and green tent on East Sixth Street on the edge of Skid Row. He's a straight-talking, wryly funny man with thinning white hair and Santa Claus blue eyes. He's had a dozen careers: welder, mason, paralegal, door-to-door salesman, law student, and most recently document processor for a wholesale mortgage lender, which comes with a number of incredible ironies. Since he moved to Skid Row, he has filled out the VI-SPDAT three times, and by the time I talked to him he had really lost patience with the process. He doesn't think he scored very high on this vulnerability scale. He's 64, and other than a little high blood pressure and a hearing problem,
he's mostly healthy. His substance use, to me, didn't seem abusive or debilitating. He has a mental health diagnosis, but he doesn't actually know what it is, because he only found out that he had a mental health file when he went to court in Orange County, and no one has ever shared his diagnosis with him. The problem, as he sees it, is not his comparative vulnerability. It's simple math: there's not enough housing in Los Angeles for the county's 58,000 unhoused people. "People like me, who are somewhat higher functioning," he said, "are not getting housing. It's another way of kicking the can down the road. In order to house the homeless, you have to have the available units. Show me the units. Otherwise, you're just lying."

In November of 2016, as the book was going to press, Gary was arrested and charged with breaking the window of a public bus with a plastic 99-cent-store broom, which, when he called me from Men's Central Jail in Los Angeles, he said is "physically impossible." When he got out, which was about a year ago now, he had lost everything: his tent, his paperwork, his relationships with local organizations and friends. And if he chooses to interact with the coordinated entry system again, he'll score lower on the VI-SPDAT, because it counts incarceration as being housed. So the model will see him as less vulnerable, and his priority score will slip even lower.

So I can hear the skeptics of the book in my head at all times, even now: "Scary stories sell books, Virginia. You went out and cherry-picked the worst-case scenarios you could find to tell this really frightening story." But here's the reality. Indiana was pretty bad; I will say that that is maybe a little bit of a black-hat story. I don't know the soul of Governor Mitch Daniels.
So I can't speak to his intentions, but one of my sources said, you know, if we had built a system to divert people from public assistance on purpose, it wouldn't have worked any better than this one did. But in Los Angeles and in Allegheny County, the designers, the administrators, and the caseworkers I spoke with were very bright, very well-intentioned people who cared deeply about the well-being of the people their agencies serve. The reality is that good intentions can still produce bad outcomes, and the time has really come to stop talking about intentions in the design of these tools and to start talking about the impacts they're having on people in their day-to-day lives right now.

In fact, in both these places, Los Angeles and Allegheny County, the designers have done many of the things that progressive critics of algorithmic decision-making ask them to do. They've actually hit all the marks. They've been largely, though not entirely, transparent: they've released most of the information about how their models work. They're mostly accountable, in that the tools are held in public agencies or public-private partnerships. And they've even engaged in some kinds of participatory design that help bring users of the system into the design of the tools. In other words, these are some of our best systems, not some of our worst.

So here's a challenging question that I hope the book asks. What if the problems with the coming age of AI and machine learning are not broken systems, not lack of accuracy, or even lack of fairness (and we can talk about that a bit in a minute), but rather systems that carry out the deep social programming I started my talk with: this dictate of the digital poorhouse to create systems of moral diagnosis and divert people from resources that they are entitled to and deserve? What if these systems are carrying out those imperatives too well, rather than just breaking or glitching?
The designers of all the systems I studied for the book really agreed on one thing: that data analytics, matching algorithms, automated decision-making, all these kinds of tools, are perhaps regrettable but necessary systems for doing a kind of digital triage, for deciding whose life is immediately threatened by economic inequality and who can wait. But the decision to triage at all is actually a political choice. This idea that we don't have enough resources and have to make tough decisions is just that: an idea. And in fact, I think using the language of triage hides the fact that we're making a political choice, because triage is really only appropriate when more resources are coming. If there aren't more resources coming, what we're doing is not triage. It's automated rationing, and if that's what we're doing, we should talk about it directly, because we can do better than that, and we deserve better than that.

That's why I wrote the book. I believe we deserve better. I believe our people deserve better. I believe our communities deserve better. And I think the fundamental danger of the digital poorhouse is that it demands that we think small, that we stay within these arbitrarily imposed limits, both on our resources and on our imagination about how we solve economic inequality. But this political moment that we're in right now, and justice itself, really demands that we think big, that we push back against this idea of austerity fever.

So I don't want to leave you without a couple of notes on possible solutions. I know that often what audiences want me to do is walk into a room and give them a five-point plan for building better technology, or for creating more ethical data policy, and, I'm sorry, and you're welcome, I'm not going to do either of those things.
That's because I actually think this is really big work that we have to think about on a deep and pretty profound level. I think there are three kinds of work that have to be happening at the same time.

The first is that we have to tell a different story about poverty. The story we tell in the United States is that poverty is an aberration, something that just happens to a small percentage of people who are maybe pathological to begin with. The reality is that 51 percent of us will be below the poverty line at some point between the ages of 20 and 64. The majority of us will be below the poverty line at some point in our adult lives. And nearly two-thirds of us will receive means-tested public assistance. That's not reduced-price school lunches; that's not Social Security; that's straight welfare. In our adult lives, almost two-thirds of us will receive welfare. That doesn't mean we're all equally vulnerable to poverty. That's absolutely untrue. If you're a person of color, if you are a person who cares for other people, if you were born poor, if you have mental health issues, if you have physical mobility limitations, if you're a migrant, you're much more likely to be poor, and you're much more likely to stay poor once you're there. But the reality is that poverty is a majority condition in the United States. And if it is a majority condition, then spending all of our time, resources, and smarts trying to solve the problem of moral diagnosis is an extraordinary moral failure.
It's a failure to identify and address the real problem. I believe that if we can change the kinds of stories we tell about poverty, we can shift the politics of poverty away from this diagnostic moralizing and toward universal floors. One of the things that's been really profound about talking about this book outside the United States is that the kinds of conditions I describe in the book are, in many places in the world, things people immediately recognize and talk about as human rights violations. Increasingly, in the United States, we're talking about them as systems engineering problems. And I think that should give us some deep, deep pause about the state of our national soul. Because of course we can decide, as a country, that there is a line below which no one is allowed to fall for any reason: that no one in the United States goes hungry, that no one in the United States lives in a tent on the sidewalk, that no family in the United States is split up because a parent can't afford a child's medical prescription.

As we do that cultural and political work of changing the story and the politics of poverty, technology is not going to just stop and twiddle its robot thumbs waiting for us to get it together. So in the meantime, we also have to be talking about ways to create technologies that do less harm right now. The way we often talk about design, particularly design for justice, is by talking about technologies that are designed to be objective and neutral, more objective decision-makers than human beings. But the reality is that building or designing technologies to be objective or neutral just means we're building them to support the status quo. The metaphor I often use to help people understand that is building a car in a place where the landscape is very hilly and twisty and turny, say, San Francisco or Silicon Valley. Where there are lots of hills and twists and turns, it's like building a car with no gears, setting it on top of one of those hills, and being surprised when somehow it rockets down to the bottom of the hill and bursts into flame. The reality is that we have to build these tools with equity gears installed from the beginning, every time.

This means designing these technologies with all of our values in mind. Efficiency and cost savings are important, of course they are, but they have to be balanced with other collective goals: goals like self-determination, autonomy, dignity, fairness, equity, and due process. If we're to have a more just future, we have to build it on purpose, bit by bit, byte by byte, and brick by brick. If we outsource our moral responsibility to care for each other to computers, we really have no one but ourselves to blame when they supercharge discrimination and automate austerity.

I thank you so much for your attention and for being here for this conversation. I have some thoughts about what's happened since I wrote the book that I'm happy to talk about if it comes up in Q&A, so I'm just going to leave this here for now. But thank you so much for being here, and I'm excited to hear your questions.

All right, you'll hear me. Yeah, awesome. Great. Hi, everybody. My name is Jackson. I'm with the STPP program and a master's student here at the Ford School.

And I'm Laura. I'm also with the STPP program and a master's student at the School for Environment and Sustainability. We're going to kick off the Q&A session here. We have several questions from our audience for you. I think what we want to start off with first is: can you tell us about proxies in the system?

See what happens when you put a slide in and then you don't talk about it? Yeah. And actually, it's related to this later work, so I'll start here and then move backwards. Anyone want to fess up to this being their question? Because I like to make eye contact. You don't have to. Hey. Hi.
Nice to meet you. So, one of the really interesting things that has happened since the book came out: when I was writing the book, I really thought I had two audiences. One was folks who had experienced these systems as targets, because I think we often need our experience confirmed by hearing that we're not the only person this has ever happened to. There's so much stigma in these programs that often everyone thinks they're the only person to ever have one of these experiences, and they're often really surprised to learn that the experience is actually super common. So I was really thinking about folks who are the targets of these systems when I was writing this book. And I was also thinking about the data scientists and economists and folks who are building these models, and the "data for good" folks, all those folks too. But the people I wasn't really thinking about when I wrote the book, and who I've now had a fascinating continuing conversation with, are the organizations on the ground who are serving people, meeting their basic needs, and who are seeing these tools come up through the systems. They're even often being asked to consult about these systems, and they don't always know exactly what questions to ask. So I've gotten a bunch of really interesting phone calls from organizations like the Bronx Defenders, who say: "Hey, they're going to move to predictive analytics in child welfare in New York City, and they want us to consult. What do we ask?"
And one of the things I always tell them is that there are a couple of under-the-hood, model-inspection questions you should ask any time you're looking at one of these models. One is the one I talked about already today, about the limitations of the data set: is the data collected in ways that could introduce discriminatory impacts into the system? That is, is the data only collected, or over-collected, on one group of people and not on others?

The other issue people should pay a lot of attention to is the issue of proxies. Many of these models in social services don't actually have enough data to model the actual phenomena they're interested in changing. For example, in child welfare, actual harm to children is recorded in child fatality and near-fatality reports. Luckily for the children of Allegheny County and the world, there aren't that many of these reports filed every year, just a handful, and some years there are none. That's good news for kids in Allegheny County. It's bad news for data scientists, because it doesn't provide enough data to build a rigorous model. So in Allegheny County, they had to choose proxies, which are just stand-ins, like little puppets, for the thing you actually want to measure. They originally chose two proxies to stand in for actual child harm in the model. One was a thing called child re-referral, which just means there was a call on a family, a report on a child, that was screened out, so the family wasn't investigated, and then there was a second call on the same child within two years. The second proxy was called child placement. Child placement means that there's a call on a child,
they decide to open an investigation, and the caretaker is indicated for maltreatment. (We never talk about guilt in child welfare, because the standard of evidence is so low; it's just whether or not there's evidence that indicates something has happened.) So the parent is indicated, and child welfare and the courts decide to take the child out of the house and put them in foster care, or in an institution, or with kin. That's child placement.

Now, these aren't necessarily terrible proxies, but they are very different things from actual maltreatment having occurred. The one I was most concerned with when I was writing the book was the call re-referral proxy, because it seemed to me that the designers of the tool were really out of touch with what actually happens around child protective services if they did not know that nuisance calling is a thing that happens, or vendetta calling. It's actually common, unfortunately common, that people use calls to child welfare to harass each other. Somebody has a party and their neighbor gets mad and calls child protective services. Or a couple is breaking up and they call child protective services on each other. Or there's family strife and they call child protective services on each other. So this idea that two calls on one child means harm has actually happened is really troubling, incredibly troubling, and I think it introduces a structural, systemic inequality that becomes invisible, because it becomes part of the model and seems objective and neutral. So that was a real concern I had with this system. I think you can also have similar concerns about which children get placed in foster care, particularly concerns about the system actually modeling Children, Youth and Family Services' own decision-making, right?
The courts are in there too, so it's not quite that simple, but there can be real concerns about the system modeling its own decision-making and creating another feedback loop there. Since the book was published, they've stopped using the child re-referral proxy (no causal relationship that I know of), so now they're using one proxy, and that proxy is child placement.

So proxies are really these kinds of lenses that might let you see better if they're good lenses, but if they're not good lenses, might really distort your vision in systemic ways. If you're not measuring the thing you want to change directly, you have to be really, really thoughtful. You really have to take apart those pieces of the system to know whether, and what kinds of, concerns to have about it. So that's proxies. Thanks, and I promise the next answers will be shorter. That one's a little bit in the technical weeds, which is why I skipped it, but it's fascinating.

Thank you, that's great. Our next question is: why is the U.S. conception of poverty more biased toward judgment and punishment compared to other countries?

Yeah, god, that's a fascinating question. Does somebody else want to take this one? What I will say is that at the moment we were moving toward poorhouses, much of the rest of the world was moving toward more universal programs. My admittedly hypothetical, idiosyncratic answer is that I think it's a mix of the real disdain and hatred we have historically shown toward poor and working people in this country, combined with racism, with a history of racial discrimination that means we create social service programs that are intended to block people of color from receiving help. And because white people allow that to happen, we also suffer under those same programs.
I actually think this is a great point of possible political mobilization, particularly now. If we could find a way to work across some of those experiences, that's actually one of the things I think is potentially a point of hope and optimism around these systems. I mean, we were supposed to get a poorhouse in every single county in the United States. It didn't work out that way, partially because they ended up being way more expensive than the economic elites thought they would be, so they said, no, screw it, stop. And second, because they were also institutions that physically contained huge groups of poor and working people together for long periods of time, where they sat and ate meals together and took care of each other's kids and nursed each other when they were sick, and also did horrible things to each other as well. But the reality was that they became places that were nodes of resistance in interesting ways, and that was likely a reason they called the kibosh on them as well.

One of my concerns about the digital poorhouse, as opposed to the original institution of the poorhouse, is that it can serve many of the same disciplinary and punitive purposes of a physical institution without actually gathering people together in the same space in a way that might create solidarity. The ray of hope here, I think, is that these systems scale so quickly, and are networked so deeply, that they actually touch all of our lives very quickly. So I think they might also offer an opportunity for us to see our experiences mirrored in each other, and to use that as a way to do political organizing that can unseat this deep cultural understanding of poverty as a moral failing. That's hard work; we have hard work to do around that. That's a great question; thank you, whoever that was.
Thanks. Okay, next question. This question is in regard to data that we gathered in the pre-automated system. What happened to the data that we generated in the early-to-mid-twentieth-century caseworker model? And is a return to strong casework an option, or desirable, relative to this new system?

Yeah, so there are two questions in there. I'm going to answer the first one fast and take a tiny bit more time with the second. Massive data collection on poor people is not new with digital data collection. One of the things I talk about in the book is the eugenics movement, part of whose specific goal was to gather information about the supposed social diseases of poor families, particularly poor white families, and to keep them from breeding. It was a deeply racist project of trying to cleanse the white race from within by identifying these "degenerate" white families. One of the things I say in the book is that the Eugenics Record Office in Cold Spring Harbor, New York, was probably the first big data set of the poor in the United States. So this is not new. What is new is the potential for this data to last forever. Librarians in the room, particularly, will roll their eyes, because you know that if you have a Jaz disk somewhere in your home, just because it's digital doesn't mean you can access it later. But the reality is that paper records and photographic slides take up space, and eventually they have to get put away somewhere far away. They're not as integrated.
They're not as easy to access as digital data is.

The second part of that question, which I think is also really interesting, is whether the solution is a return to strong casework. I think that gets to some of the real deep tensions at the heart of this work. There are two, I think, almost irreconcilable tensions here. One is around integration: how connecting data across different systems can both help and hurt poor and working people. One of the real barriers to benefit receipt in the United States is how hard it is to apply. If you need home heating assistance, food assistance, and Section 8 housing, you're going to spend three days in each office, filling out the same application and waiting in extraordinary lines, and this acts as a kind of diversion. So integrating that data could actually lower barriers to receipt, if we had a system that was meant to help people get access to the things they are entitled to and deserve, and if we didn't live in a culture that criminalizes poverty. But in fact we live in a culture that criminalizes poverty, so integrating those systems also creates a spectacular surveillance net that makes it very difficult to have any degrees of freedom inside the system.

That leads to the second real tension, which is around discretion. The reality is that caseworker discretion can be one of the worst things that can happen to you, and in fact it is one of the reasons that people of color were blocked from getting any real access to public services for something like 35 years, until the national welfare rights movement. But also, under a system that's not necessarily set up to help you get your entitlements, caseworker discretion is really the only thing I ever saw work, in 15 years of welfare rights organizing, to actually get people what they needed. So the struggle there is thinking about this bigger question: in the system we have right now, does justice demand an ability to bend the rules? And if it does demand an ability to bend the rules, how do we do that in an equitable way? So is the answer more casework? Yes, well, more of the right kind of casework, I think, could certainly improve the situation. But that doesn't mean a simple return to what we were doing 30 years ago, which, you know, wasn't working for everyone.

The next question is: why can't the database systems for access to public services be deprived of racial information?

Yeah, that's a great question. In Allegheny County, for example, they don't use race as a variable in their model. And this comes back to the proxy slide. The reality, though, is that in a deeply segregated society there are about a million variables that stand in as proxies for race: neighborhood, interactions with the criminal justice system, the public school you go to. Lots of these things can stand in for race in some profound ways. So one of the challenges is that removing race from these systems doesn't necessarily change the racialized nature of the data, because race is institutionalized in these systems in some pretty profound and important ways. And also, I'm not sure that pulling race out of these models is actually useful, because it may just hide the processes by which data is being racialized.
I want to be able to keep track of whether a model is producing racially discriminatory outcomes, and we can't do that if race isn't one of the variables.

Okay, thank you. Next question, regarding the renewed use of work requirements for things that were previously kind of insufficient, or not adequate, universal floors. Especially here in the state of Michigan, we have work requirements that were just implemented for both SNAP and Medicaid, and some of these are pretty steep. I think New Hampshire has a 100-hour-a-month work requirement for Medicaid eligibility. What is the role of these kinds of data collection systems in implementing these means tests, beyond just policing poor people, in actually affecting the benefits people receive from the government?

Yeah. So, one of the stories I start the book with is a story about a woman I worked closely with many years ago, when I was doing more scholarly or academic writing. Her identity was anonymized, and she goes by the pseudonym Dorothy Allen in my books. In 2000, Dorothy and I were sitting in a technology lab that we had helped build together in a residential YWCA in Troy, New York, which is housing for low-income women. And we were talking about her electronic benefits transfer card, her EBT card, which is the sort of debit-like card that you get public assistance benefits on, and which was fairly new in 2000.
So so we were just sort of shooting the breeze about it one day And I said, oh, yeah, like some people have told me that you know that they prefer ubt cards because um You know, it's it's a bit more convenient and it kind of lowers the visibility of using food stamps in the grocery store And and she said, yeah, you know, that's true and it's not in some ways Um, there's there's some things that I really like about the ebt card But she said but um my caseworker also uses them as a way to track all of my purchases And I must have had this like super shocked look on my face because she kind of pointed at me and laughed for like Three minutes like kind of cried a little bit and like patted my knee for a while like oh pumpkin. Um Um And then she like she got a little quieter and she was like, oh virginia She was like, you know, you all meaning sort of like professional middle-class people You all should pay attention to what's happening to us people on public assistance because they're coming for you next Um, and this is like 18 almost 19 years ago now Um, so I feel like this is why dorthy's always in the back of my head is because that's like a remarkable prescience um And really influenced the way that I do my work in that i'm always looking for the folks who are the targets of these systems To speak first not the only stories I tell but the most important stories I tell Um, because these are folks who are already living in the future of these technologies They are experts in how they work Um, and their experiences say a lot about what's going to happen to everyone in the future I'm not saying you should only care about this because it might influence it might impact you at some point That's messed up like there's a moral argument to care about it if it's only happening to poor folks We should still care um, but um I think dorthy's right and one of the things that's been really troubling for me this year has been if you look at the 2019 federal budget It says really 
clearly that the Trump administration plans to save $88 billion over the next ten years in middle-class entitlement programs, in disability and unemployment insurance and Social Security, by using these same tools. And it's deeply concerning to me that these tools have been tested on folks who live in what you can sort of consider low-rights environments, who are especially vulnerable, and that these tools are being ramped up to be used on pretty much everyone. This is deeply, deeply concerning to me.

So I think it's not an accident that this stuff is all happening at the same time: that we're getting this expansion of work requirements and this expansion of sanctions into non-TANF programs at the same time that we're building the technological capacity to do it so efficiently, and in the political moment that we're in right now, which is a moment really characterized by deep economic suffering, ethnic and racial nationalism, and deep, deep distrust of government.

So I actually just did a talk in Finland for a bunch of social workers there, and one of the things that was so fascinating about being there is that they just have this incredible trust in their government. They're like, "Oh no, we have one card that has everything on it: medical records, public assistance, voting, schooling." And I just kept getting more and more ashen as they kept saying this, and then I had this moment where I was like, I'm both super jealous of you all for trusting your government that much, and I feel like you're a toddler whose hand I need to smack away from a fire, because you should stop giving this data to your government. And I'm like, why did you invite me here? Why am I here? Do you just want to feel morally superior? And they were like, "Well, a little," and I was like, "Yeah, I know. Thanks for the flight to Helsinki, though. I'm going to the sauna now."
But they also said, look, we think these tools are coming everywhere. They actually just voted in a much more conservative government than they had had in a really long time, and they were really concerned about what happens when you trust the government enough to give them all your data, and then there's regime change. And I was like, oh yeah, we know a lot about this. Let's talk about this.

So I think the best example of when this becomes a real problem is something like the DACA database, the Deferred Action for Childhood Arrivals database. Under the Obama administration, the idea was to try to defer the deportations of children who were brought to this country and don't have legal status. Something like 800,000 young people and young adults gave their information to this database, and in 2016 it suddenly turns into a database that can be used directly for deportation. So one of the big questions that I'm asking, that I don't have an answer for yet but am really interested in having a conversation about, is: is there a way we can build these tools to have unhackable values? So that you can't use them against their original intents, and these larger values of justice, equity, autonomy, and self-determination are built in in a way that you can't undo them. I don't know that that's possible, but I'm interested in having that conversation. Long answer, but a good question. Thank you.

Great, and this will be the last question that we have time for. We know from the sociological literature that there are class differences in parenting. To what degree are we criminalizing working-class parenting using the standards of concerted cultivation?

Wait, say what the last thing is?

The last part is: to what degree are we criminalizing working-class parenting using the standards of concerted cultivation?
Okay, so I don't know what concerted cultivation is, so I can't answer that question as asked. Yeah.

Are we sort of making illegal that style of parenting, by the upper-middle-class standard that your child should be supervised at all times?

Yeah, so that's a great question, and not one that I'm prepared to definitively answer. But what I can say is that one of my concerns about these technological systems is that we think of them as simple administrative upgrades and not as political decision-making machines. In fact, they are political decision-making machines, and, for example, we can easily program them, in ways we don't even really understand ourselves, to uphold a particular kind of standard for parenting, or responsible work behavior, or disability, or whatever, in ways that produce these automated inequalities. And that really is the concern that I'm writing about in the book.

So I think one of the things that understanding these systems as political decision-making machines helps us do is recognize that there are lots of different kinds of expertise that need to be in the room when we're talking about these things. We tend to say, well, we need data scientists, we need economists.
We might need a social scientist, and maybe a data ethicist; that's becoming a thing now, right, so maybe one of those people. But the reality is that if you don't know, from the ground, about community values and community cultures, then you may well build into these tools the kinds of decisions and models that don't make sense for people on the ground.

And what I'll say is, Cathy Volponi, who worked for an organization in Pittsburgh that helped support parents who had been accused of maltreatment, said, you know, one of the things we do in this system is, once you get in the door, we raise the standard on your parenting so high that failure, while it doesn't become inevitable, becomes much more likely. So we raise the standard on your parenting so high, and then we can't offer you the resources to keep your parenting up there. That's really one of the major problems in the system.

I also think another of the major problems with the system is that because we've shredded the social safety net in other places, the child welfare system has basically become the resource supplier of last resort for poor families. But that means you have to make this horrible trade-off: you request support to keep your family healthy and safe, but in requesting it, you agree that the state now has the authority to remove your kids. So child welfare is not means-tested.
You can be any class, any income, and use the child welfare system, but families that have the resources to avoid it, avoid it, because of that trade-off, because that trade-off is an unfair thing to ask parents to make. So I think part of the issue is just that our child welfare system mixes these two goals of protecting families and prosecuting maltreatment. And I think there's a bigger lesson in that for these systems, because there's a way that these systems increase the policing imperative of social service systems and integrate social service systems more deeply with processes and systems of policing, which is why I think many of these systems can be seen as profoundly criminalizing. So yeah, it's a great question. Thanks.

All right, well, that's all the time we have. Thank you, everybody, for your questions, and thank you, Professor Eubanks, for coming and speaking to us at the Ford School.

You're welcome. Thank you for having me.

And we invite everybody to join us in the Great Hall for the reception, to continue our conversations and have some snacks. Thank you so much.