Virginia Eubanks

Automating Inequality

Air Date: January 16, 2018

Political scientist Virginia Eubanks discusses her new book, “Automating Inequality.”

HEFFNER: I’m Alexander Heffner, your host on The Open Mind. The inequities fueling economic division, the underlying plague of incivility in the American condition today, are not simply top-down or bottom-up. They are systemic and systematic, and they’re designed to reproduce a downward cycle of mortality, not mobility, for a once vibrant middle class. Virginia Eubanks, my guest today, will correct me if that broad overview misunderstood the central thesis of her new St. Martin’s Press volume, “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.” Professor of Political Science at the University at Albany, SUNY, Eubanks is a New America Fellow and co-principal investigator of the Our Data Bodies project. In “Automating Inequality,” she powerfully examines the injury technology is inflicting on our livelihoods in the struggle for economic rights. Welcome, Virginia.

EUBANKS: Thanks so much for having me. I’m really excited to be here.

HEFFNER: Thank you. You arrived at this project through a poignant anecdote that you reveal in the introduction of the book, and I hope that you could share that with our viewers.

EUBANKS: Yeah, so there’s really two stories that are the origin stories of this book. One of them: I’ve been a long-time welfare rights organizer and economic justice organizer, and in about 2000, I was sitting with a young mother who was receiving public assistance, and people were writing a lot at the time about these electronic benefits transfer cards, EBT cards, that people were getting their benefits loaded onto. And so we’re talking about EBT cards and, we’ll call her Dorothy, she goes by a pseudonym in the book. So Dorothy and I are talking, and I’m like, oh, so everybody’s saying that these EBT cards are really great because, you know, there’s less stigma when you go to the grocery store. You’re not carrying food stamps, you just have like an ATM card like everyone else. And she’s like, well, yeah, the EBT card’s good, it’s convenient, except my caseworker uses it to track all of my spending. And I must have looked really sort of gobsmacked, ‘cause she looked at me and she said, oh, you didn’t know that, did you? [LAUGHS] And I was like, no, I did not know that. And she very generously, I think, looked at me and said, you know, you all, meaning professional middle class people, you all should pay attention to what’s happening to us, ‘cause they’re coming for you next. And so that was really a lightbulb moment for me, where I realized that a lot of the most innovative and cutting-edge technologies in the United States are first tested on poor and working class people, and that became sort of the beginning of this whole interest of mine. The anecdote I open the book with, however, is something that happened to me personally, and I think it’s really interesting in the way that it illustrates Dorothy’s concern about how far and how fast this stuff can spread. So, about two years ago, my incredibly brilliant and wonderful partner, Jason Martin, was attacked when he was walking back from the corner store to our house about a block away. And his jaw was broken in about six places, both of his cheekbones, both of his eye sockets. And we were very lucky that he got good care immediately, didn’t lose any hearing, didn’t lose any sight, had very extensive reconstructive surgery on his face, but was largely, you know, taking into consideration what had happened to him, in pretty good spirits and pretty good health. So one day I went to the pharmacy to pick up a prescription for him and the pharmacist said, oh, you know, we don’t have that prescription anymore ‘cause you guys don’t have health insurance. And you know, the day before, I thought we had health insurance. So I’m in a panic and I call the insurance company, and the insurance company says, oh, we’re missing a start date on your coverage, so that’s why you’re not getting insurance payments. And I said, you know, funny, ‘cause my partner just went through three weeks of treatments and we were covered for all of that, so it seems surprising to me that I shouldn’t have a start date on my coverage. And they said, oh, it’s probably just a slip of the finger, like something just happened in the database where this date got erased.
But I’ve been doing this work now for about 15 years, and I said, you know, what this sounds like to me is that we’ve been red-flagged by an algorithm that tests cases for fraud. And luckily I was in a position where I could push back, where I could demand that they reinstate our pharmacy coverage, and then I fought back against a number of charges that went through and were denied because we, you know, quote, lacked coverage. But it seemed to me that a number of the things that had happened to us were things that would be rated highly in an algorithm for insurance fraud. Our coverage was brand new. The accident happened at night, and so a lot of his treatment happened late at night. He was prescribed oxycodone, which is a controlled substance. So a lot of these things are in the algorithm that looks for fraud in health insurance.
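
[Editor’s note: a hypothetical Python sketch of the kind of rule-based fraud screen Eubanks suspects was triggered here. The rules, weights, and threshold below are invented for illustration; real insurers’ models are proprietary. The point is that each ordinary consequence of an emergency, like brand-new coverage, late-night treatment, or a controlled-substance prescription, reads to such a model as a risk signal.]

# Invented example rules; not any real insurer's model.
FRAUD_RULES = [
    ("new_coverage", lambda claim: claim["coverage_age_days"] < 60, 2),
    ("late_night_service", lambda claim: claim["hour_of_service"] >= 22, 1),
    ("controlled_substance", lambda claim: claim["controlled_substance"], 2),
]

FLAG_THRESHOLD = 4  # assumed cutoff for a red flag

def fraud_score(claim):
    """Sum the weights of every rule the claim trips."""
    return sum(weight for _, rule, weight in FRAUD_RULES if rule(claim))

claim = {
    "coverage_age_days": 21,       # coverage was brand new
    "hour_of_service": 23,         # the accident meant late-night treatment
    "controlled_substance": True,  # oxycodone is a controlled substance
}

if fraud_score(claim) >= FLAG_THRESHOLD:
    print("claim red-flagged for review")  # e.g., pharmacy coverage halted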

HEFFNER: In this book, you test how susceptible we are to algorithms, in the way that you and Jason encountered, through three case studies. Three states, and I wonder if you could expound on those examples: Pennsylvania, Indiana, California.

EUBANKS: Yeah. It’s really important I think, one of the things that I say in the book is that these systems impact all of us but they don’t impact us all equally, and so one of the things that was really important to me in writing this book was to look at the folks who I think are the targets of the most invasive and least transparent systems.

HEFFNER: Who are disproportionately poor.

EUBANKS: Who, in this book, are all in some kind of social service system. So in Indiana I look at the attempt to automate all of the eligibility processes for their welfare, food stamps, and Medicaid programs. That was in 2006. In Los Angeles, I look into homeless services, and so I look at a system that has been called the Match.com of homeless services. It’s called the Coordinated Entry System, which is basically an electronic registry of the homeless and a ranking system that is supposed to match those most in need of housing resources with the most appropriate available resources. And then in Allegheny County, which is the county where Pittsburgh is located in Pennsylvania, I look at Child Protective Services, and specifically at an algorithm that is supposed to be able to predict which children might become victims of abuse or neglect in the future. But I tell each of those stories from the point of view of those who are most impacted. I talked a lot to designers of these systems, I talked a lot to front-line caseworkers, I talked to administrators. But I really wanted the stories of poor and working families across the color line to be represented in the book, and so that was really important to me.

HEFFNER: They are. They certainly are. You did an outstanding job. What I thought would be useful, to get into the weeds substantively, would be to ask you this question: what unifies those three instances, and what separates them in terms of their qualitative impact on American citizens?

EUBANKS: Wow. That is an amazing question. So… I think it’s important to understand that one of the things I talk about in the book is something I call the digital poorhouse. And I talk about these systems, taken together, as creating an invisible digital prison for poor and working people. And that’s a pretty strong claim, and I don’t think it’s just the technology that does that; it’s a collision of three forces. So there’s a cultural narrative that says that poverty is an individual failing, and that it’s an aberration. Like it’s not something that happens to a lot of people. A small percentage. A minority of potentially pathological people.

HEFFNER: Mm-hmm.

EUBANKS: There’s a political system that we live in that is mostly interested in and focused on asking the question, what did you do to deserve being poor, rather than, how can we help. And then there’s these technology systems that, because they’re not specifically and explicitly built to dismantle these long-standing structural inequalities we see in our political system, are poised to potentially intensify them. Because these systems are so fast, they scale up so fast, they’re very persistent, they last a long time, and ‘cause we don’t always really understand how they work. So in the systems that I look at, like the eligibility automation, like the homeless registry, and like the child abuse and neglect predictive algorithm, there are certain assumptions about who the targets are that are embedded in the system, and I’d say the thing that all of them have most in common is this idea that poor and working class people are poor or low-income because they’ve made bad choices.

HEFFNER: And that’s the kind of lurking psychological dilemma, or predisposition, the preconceived notion animating the decisions.

EUBANKS: So for example, I think we get a really different set of systems if rather than the first thing, the first, most important priority being say, finding fraud, which is what most of these systems are, are set up to do, or trying to identify the amount of risk a parent poses to their child, if we instead built systems that said we want to make sure that everyone gets all the resources they’re entitled to and deserve by law.

HEFFNER: Well are there rewards for good actors?

EUBANKS: Mm. So I think there are a lot of good actors in this book. I think there aren’t, there aren’t a lot of black hats and white hats. There’s,

HEFFNER: But are the, are the algorithms,

EUBANKS: Mm.

HEFFNER: Anticipating and providing for incentives that are,

EUBANKS: I see what you mean.

HEFFNER: Positive and can counter the reproductive cycle that I mentioned at the outset?

EUBANKS: Yeah, I see what you mean. So let’s get specific.

HEFFNER: Yeah.

EUBANKS: I mean, this is one of the great things about the book, it has these great, concrete, specific stories in it. So the system in Allegheny County is called the Allegheny Family Screening Tool, and it’s basically a statistical model that is used when a call is made to the child abuse and neglect hotline in Allegheny County. So a call comes in, a human call screener picks up the line, asks what’s happening, interviews the person, and then they have to make three decisions. One is whether or not it fulfills the legal definition of abuse, so that’s a risk rating. Then a safety rating, how safe they think the child is at that moment. And the third thing that’s new is this predictive model, and the predictive model is run based on the data that they have in their county system. It weights a hundred and thirty-one different variables and presents a score to these intake screeners between zero and twenty. It’s like on a thermometer, green at the bottom, red at the top. And that’s their safety screening, their, their safety, I’m sorry, their screening score. And they’re very clear that the screening score is only supposed to support the decisions already being made by human intake call screeners, but if the score is over eighteen it actually automatically launches a Child Protective investigation on that family. Now if a Child Protective investigation was neutral or benign, then this wouldn’t really be a problem, but that’s not how folks on the receiving side of these investigations feel.

So the example I want to give you is, I spoke to a wonderful family, Angel Shepherd and Patrick Grzyb, who have been interacting with the Child Protective system in Allegheny County, which is actually called CYF, for Children, Youth and Families, for many years. They’ve received support, have been investigated, and now actually work with a family support center that’s funded by the office to help support other families who are struggling with their parenting challenges. And one of the things that’s really interesting about their case is, I mean, they’re the gold star parents, right? They’re working with this system to help other families do better. They’re really engaged. They volunteer their time. And yet they’re really nervous, because they know that all of their interactions with the CYF office are going into the screening score. And then if someone calls on their family again, they face potentially being investigated, or even losing their daughter or their granddaughter to foster care, because they are scored high. And they understand it. They understand that that score is mostly based on their interactions with county services. So they have received mental health services from the county. They have received food support from the county. They have received housing support from the county. And all of those things go into this algorithm and drive up their score. So they’re really thinking hard about, well, do we continue to be engaged with this system, which is actually protecting them and helping protect their community, because they’re afraid that they’re gonna be targeted. And I think that’s not what the systems are intended to do, but they very much have this effect because they’re not built explicitly to support that kind of work, that kind of decision making by poor families. They’re not explicitly built to support the self-determination of poor families.
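
[Editor’s note: a minimal Python sketch of the screening flow described above. Per the interview, the real Allegheny Family Screening Tool weights 131 variables from county administrative data and shows intake screeners a score from zero to twenty; a score over eighteen automatically launches an investigation. The variable names and weights here are invented assumptions, not the county’s actual model.]

# Hypothetical feature weights; the real tool uses 131 variables.
WEIGHTS = {
    "prior_referrals": 2.5,                # earlier hotline calls on the family
    "county_mental_health_services": 1.8,  # service use counts toward the score
    "county_food_support": 1.2,
    "county_housing_support": 1.0,
}

AUTO_INVESTIGATE_THRESHOLD = 18  # per the interview, over 18 auto-launches

def screening_score(features):
    """Weighted sum of record variables, clamped to the 0-20 thermometer."""
    raw = sum(WEIGHTS[name] * features.get(name, 0) for name in WEIGHTS)
    return max(0.0, min(20.0, raw))

def route_call(features, screener_wants_investigation):
    """The score is framed as decision support for the human screener,
    but above the threshold it overrides them and forces an investigation."""
    if screening_score(features) > AUTO_INVESTIGATE_THRESHOLD:
        return True  # automatic Child Protective investigation
    return screener_wants_investigation

[The tension Shepherd and Grzyb describe lives in the WEIGHTS table: in this sketch, as in their account, receiving county services is itself a scored input, so engaging with help raises the family’s score.]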

HEFFNER: Ultimately you would hope that the evolution thereof is from a poor family to a working class or middle class or upwardly mobile family, right? That’s the hope, that people are able to employ these resources to further their livelihoods, and that there’s a ladder.

EUBANKS: Mm.

HEFFNER: And that there’s some opportunity for mobility. Now I basically said in the intro that’s out of the picture.

EUBANKS: Yeah.

HEFFNER: And from your anecdotes, it seems like the likelihood that these families can achieve independence, to in effect assert their self-determination, is slim.

EUBANKS: Well the, you know, I think poor working class communities always have the power to be self-determining whether or not the government is supporting that power.

HEFFNER: The government or the algorithm is designed to perpetuate them within the system as the guinea pigs that you identify.

EUBANKS: Mm. Mm. Yeah, I think that… So it’s important to understand, let me tell a very brief historical story, it’s important to understand the point of origin for these systems. So when I first got involved in this work, I started looking around the Personal Responsibility Act of 1996, which required that welfare offices automate a number of their processes, and I thought that’s kind of where that started. And when I went to the New York State Archives to look for the design documents, ‘cause I really wanted to see what kinds of decisions they were making when they were designing these systems, I looked in the nineties and it wasn’t there, and I looked in the eighties and it wasn’t there, and I kept going back and back and back. And the origin, actually, of many of these systems is in the late 1960s and early 1970s, at the point that the Welfare Rights movement is at its strongest and at its most powerful. And what the Welfare Rights movement was doing at that moment was opening up social assistance programs to a wide number of people who had been excluded and discriminated against in the past, mostly women of color, but also single mothers. And they won a number of legal victories that meant that you could no longer discriminate against people in eligibility decisions in social services, and the irony is that’s the moment that we see these systems rise. Because I believe that there was kind of a political sleight of hand that happened, where you could no longer officially discriminate against people in eligibility, particularly women of color, but you could tighten administrative standards, raise administrative bars so high that it was almost impossible to get through the gate to get public assistance. And we now see that in TANF, for example: in Indiana, they’re down to eight percent of poor and working class folks with children receiving the program, right? Where before the system was launched it was closer to forty percent. So we see some real narrowing of the gate to social assistance through these systems.

HEFFNER: That means that over 90 percent of the poor working class don’t have access to the systems in,

EUBANKS: Not to cash assistance.

HEFFNER: Indiana. Let me ask you this.

EUBANKS: Yeah, they get food stamps, Medicaid, but not cash assistance.

HEFFNER: Right. In terms of the psychological effect that we discuss, who, who are designing these systems now, as far as we can tell?

EUBANKS: Yeah.

HEFFNER: Are they corporations? Are they governments that are outsourcing this to for-profit, profit-making institutions?

EUBANKS: Yeah. It’s a combination. And so one of the things that I hoped to do in the book is in many ways I kind of profile the best cases, the, the best case scenarios, both in Los Angeles and in Pittsburgh. These are public agencies, and in L.A. a sort of public-private partnership. They’ve been very open about what they’re doing. They’ve released data. They’re very transparent. They’ve even done some participatory design in their communities.

HEFFNER: Mm-hmm.

EUBANKS: So these aren’t the worst-case scenarios. For example, there are other child protective algorithms that are completely black-boxed. We have no idea what’s in them, and they won’t release any information about them, and they’re run by a private company and nobody knows, right, because it’s protected business secrets. That’s not the case with these systems. I really do feel like the folks who are designing these systems have the very best of intentions and are working really hard with incredibly limited resources to tackle really tough problems, right? Caregivers sometimes do horrible things to their kids, and the state has an obligation and a duty to step in. The problem, though, is that we tend to assume, or the designers of these systems have tended to assume, that the only place that discrimination enters the system is in front-line caseworker decision making, right?

HEFFNER: Mm-hmm.

EUBANKS: And so if that’s true, then it makes sense to have kind of this systems engineering approach that says, well, let’s keep an eye on front-line caseworkers, get data on the kinds of decisions they make, and then we can maybe shape their decision-making towards equity. But what that overlooks is the fact that data scientists and engineers and top-level administrative brass in social services have all kinds of biases too. And those biases actually get built into these systems in ways that are much more invisible and, I think, much more dangerous, because they scale so quickly and these systems are so fast. So for example, in Pittsburgh, in the Allegheny Family Screening Tool, they have built in a proxy for child harm. This gets a little bit into the technical weeds, so tell me if I go astray, you can correct me. But they can’t directly measure child maltreatment because it actually happens pretty rarely. So they have to use a proxy that has enough data for them to run a model on it. So they looked for two different proxies to stand in for child harm. One was call re-referral, which means a call comes in to the hotline, the call gets screened out, no investigation is opened,

HEFFNER: Mm-hmm.

EUBANKS: And there’s another call on that child within two years. The other is whether or not there’s a call on that child, there is an investigation, and the investigation ends up pulling the child out of the family and putting the child in foster care. So: re-referral, and foster care. Now, the agency’s own research has shown that where most of the racial disproportionality in their CYF system comes in is actually at the level of when the community calls the hotline. Something like 70 percent of the discriminatory impact comes from there, and they’ve actually used that very thing as a proxy for child harm. So do you see what happens? If you say getting called is the same thing as harming your child, and then that factor is the thing that brings in most of the racial disproportionality in the system, that becomes a self-reinforcing loop. You’re going to indicate more cases that have more calls. There are more calls on black and Latino neighborhoods because of the cultural ideas we have about who are appropriate parents, right?

HEFFNER: Right.

EUBANKS: So it becomes a feedback loop.
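
[Editor’s note: a toy Python simulation, with invented numbers, of the feedback loop described here. If “another call within two years” stands in for harm, then neighborhoods that start with more calls get flagged more, flagged families draw more scrutiny, and scrutiny generates more calls. Actual harm never enters the loop.]

# Two neighborhoods with unequal reporting rates; the proxy target tracks
# calls, not harm, so the model's flag rate mirrors the call rate.
call_rates = {"neighborhood_a": 0.10, "neighborhood_b": 0.30}

SCRUTINY_FACTOR = 0.5  # assumed extra future calls per unit of flagging

def next_generation(rates):
    """One retraining cycle: flagged neighborhoods see more future calls."""
    return {
        hood: min(1.0, rate + SCRUTINY_FACTOR * rate * rate)
        for hood, rate in rates.items()
    }

rates = dict(call_rates)
for generation in range(5):
    rates = next_generation(rates)
    print(generation, {hood: round(rate, 3) for hood, rate in rates.items()})
# The initial 3x gap between the neighborhoods widens every cycle: the
# proxy rewards the model for predicting surveillance, and the
# prediction produces more of it.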

HEFFNER: So how can the technology omit the bias?

EUBANKS: Mm. Yeah that’s, that’s a great question.

HEFFNER: How, because that’s what you’re left considering.

EUBANKS: Yeah.

HEFFNER: With your book, the future, and how the algorithms can be directed in a way that is pro-social,

EUBANKS: Yeah. Yeah. So I give sort of three sets of solutions in the book, and I think they’re all incredibly important, but fundamentally what we need to do is get our souls right around poverty in the United States. Because as long as we believe that folks are poor because they’ve made bad choices, as long as we believe that poverty is an aberration and not the majority experience in the United States, we’re gonna produce these systems that, no matter how hard we try to be fair, reproduce politics as usual,

HEFFNER: Mm-hmm.

EUBANKS: Because these technologies aren’t good, they’re not bad. They’re intensifiers. They intensify the system we already have, because of their speed and because of their scale. So we really do have to get our souls right around poverty, and I think that some of that work is happening now. Just yesterday, I was down in Washington D.C. for the launch of the new Poor People’s Campaign, which is a great social movement push to bring poverty back to the top of the agenda. I also think we need to make lots of political changes around using social assistance to investigate the moral worth of people rather than using it as a way to more fairly share resources. But in the meantime, we have to do something about these technologies. And so I put a kind of Hippocratic Oath for big data designers in the book, and I ask them to really just consider two, I think, pretty low-bar questions. One is: if you’re designing a system, does it increase the self-determination of poor and working families? The second is: would the system be tolerated if it was targeted at anybody besides the poor? If it was targeted at the non-poor. And I’d argue that none of the systems I profile in the book clear even this, this very low bar. So we have to build these systems towards equity on purpose, and not just assume that systems engineering and better data will bring better outcomes for poor and working families.

HEFFNER: On the first of those two scores, we also have to define self-determination.

EUBANKS: Mm. Yeah.

HEFFNER: And we have to ensure that people understand the, their economic rights.

EUBANKS: Mm-hmm.

HEFFNER: And… Understand that they are paramount to non-economic determination.

EUBANKS: Mm. Yeah.

HEFFNER: Because there are a lot of ways in which we may perceive ourselves as free in the tech jungle when in fact we’re not.

EUBANKS: Mm-hmm.

HEFFNER: And you allude at the very beginning to the idea that they’re gonna come after the middle class next.

EUBANKS: Yeah, I want to speak to this idea that it’s …

HEFFNER: In a, in a minute.

EUBANKS: In one minute. I want to speak to this idea that it’s,

HEFFNER: Two minutes.

EUBANKS: Coming for you next. Right. It’s really important, I think, to understand that I have a moral commitment: I believe in the value of all human beings, and that includes poor and working people. I think poor and working communities are sites of value and strength and resilience. But if you’re a professional middle class person and you’re concerned about this getting up to you, that’s also a legitimate concern. And I think right now, for example with this tax bill, right, if the tax bill stands, I would predict that we will see this kind of high-speed diversion from social programs happen to a much wider swath of people. So we’re not just talking about cash assistance and food stamps; we’d be talking about Social Security, we’d be talking about Medicaid, we’re talking about disability. And these are folks who are really vulnerable, and may not be able to fight back against the speed and the authority of these technological systems.

HEFFNER: And the tax reform, or fraud, to which you allude does nothing more than further exacerbate the systemic and systematic inequity; it’s in effect automating it in a vicious cycle.

EUBANKS: Yeah.

HEFFNER: One from which we may not recover. Virginia, thank you for your time today.

EUBANKS: Thanks so much for having me.

HEFFNER: And thanks to you in the audience. I hope you join us again next time for a thoughtful excursion into the world of ideas. Until then, keep an open mind. Please visit The Open Mind website at Thirteen.org/OpenMind to view this program online or to access over 1,500 other interviews. And do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.