Book Review: Automating Inequality by Virginia Eubanks

Travis Weninger
7 min read · Dec 3, 2021
How high tech tools profile, police, and punish the poor

Virginia Eubanks’s Automating Inequality takes us through three cases in the United States where technology is being used by local governments to assist in making major decisions about people’s lives and well-being. The first case is in Indiana, where a system developed by IBM was put in place to automate eligibility for state welfare. The second is a coordinated entry system in Los Angeles that ranks the housing need of individuals currently living on the street. The final case is an algorithm used by Allegheny County’s child welfare office that scores families with the goal of predicting whether a child will experience abuse or neglect. A common theme throughout these examples is that they rely on vast amounts of data to score and rank the people they are aiming to serve. For the people affected by these systems, and for the people whose job it is to use them, there is often little understanding of what determines a given score. The systems perpetuate the existing inequalities, biases, and assumptions prevalent in the communities they aim to serve. Using algorithms to decide who gets housing, who is eligible for welfare, and which parents should be investigated for child abuse creates a bureaucratic nightmare for anyone who tries to challenge a decision, as the results these systems spit out tend to be taken as gospel and treated as unchangeable. The end result is dehumanizing and demoralizing to the very people these systems are supposed to help.

The first case explored in the book takes place in the state of Indiana, where a system developed by IBM and ACS was put in place to clean up “welfare waste.” The governor of Indiana, Mitch Daniels, had a goal of applying a “Yellow Pages Test” to the state: if a product or service is listed in the yellow pages, then the state shouldn’t be providing it. Daniels believed the state’s partnership with IBM and ACS would make America’s worst welfare system better for the people it served, and that the way to do this was to automate welfare eligibility. For the first attempt, a call centre was opened and caseworkers began taking calls from the public. Millions of copies of driver’s licenses and Social Security cards were indexed, but hundreds of thousands of these documents were lost. Incentives were put in place for call centre workers to speed up eligibility determinations, often by closing cases prematurely, leading to more than a million denied applications between 2006 and 2008, a 54 percent increase compared to the three years prior. Listening to public feedback, the state decided to try a new approach revolving around a self-serve model, putting more stress on applicants to find and submit all required documents themselves. If the smallest mistake was made on an application, a single error code was spit out at the user: “Failure to cooperate.” This resulted in even more people being denied, and in a statement that was almost impossible to challenge in front of a judge; applicants were told by caseworkers that the judge would see this error and simply deny them. In switching to a self-service model, the state also failed to take into account that many people did not have personal computers, nor were they savvy enough to navigate these complicated applications. Libraries in the state began to fill up with people completing welfare applications, and librarians were forced to become de facto caseworkers helping people complete these forms.
The state and IBM eventually admitted that this approach was a disaster but blamed forces outside their control. The state has since transitioned to a hybrid model run by Xerox; however, caseworkers admit that they see no difference. These systems operate from the standpoint that it is better to deny ten eligible people welfare than to approve one person who is not eligible: gatekeeping, not facilitating.

In the City of Los Angeles, a coordinated entry system was put in place to score the housing need of people experiencing homelessness. The system was described as the match.com of homeless services and was supported by organizations like the Alliance to End Homelessness and the Bill & Melinda Gates Foundation. Coordinated entry was created to address the mismatch between housing supply and housing demand in L.A. County. Before this system, the homeless had to navigate a complex web of waitlists and social service programs. Coordinated entry was based on prioritization, which differentiates between the crisis homeless and the chronic homeless, and on a housing-first philosophy, which aims to get people housed regardless of other conditions like sobriety or mental health treatment. Caseworkers walk the streets of Downtown L.A. and survey those who appear to be in need of housing; assessments are entered into the Homeless Management Information System (HMIS). A ranking algorithm then scores the individual on a scale from 0 to 17: a low score means the person surveyed is low risk with a small chance of dying, while 17 means the person is the most vulnerable. Scores from 0–3 receive no housing intervention, 4–7 receive limited or temporary rental subsidies, and 8–17 are assessed for permanent supportive housing. Eubanks talked to several people who had been granted or denied housing by this system, and there appeared to be no clear pattern as to why one person was deemed eligible over another. The people she talked to were all facing different sets of challenges, but some who appeared to be more in need of housing were denied while others in less dire situations were approved. Interacting with these systems can create a catch-22: a high score can indicate that you need permanent supportive housing (which is in short supply) but are too vulnerable to be given your own place.
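As an illustrative sketch, the score bands quoted above (0–3, 4–7, 8–17) amount to a simple threshold rule. The function below is mine, not the actual HMIS or coordinated entry implementation; it only mirrors the triage logic as the book describes it:

```python
def housing_intervention(score: int) -> str:
    """Map a vulnerability score to an intervention tier.

    Illustrative only -- this mirrors the score bands described
    in the review, not the real coordinated entry system's code.
    """
    if not 0 <= score <= 17:
        raise ValueError("score must be between 0 and 17")
    if score <= 3:
        return "no housing intervention"
    if score <= 7:
        return "limited/temporary rental subsidy"
    return "assessed for permanent supportive housing"
```

Laid out this way, the catch-22 is easy to see: the rule has no notion of supply, so a high score routes someone toward the scarcest resource while excluding them from everything else.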
People who have interacted with the system also felt as if the caseworkers trying to help them were coaching them through their intake interviews to give answers that would reduce their eligibility. Another cost of cooperating with these systems is the amount of personal information that is handed over: applicants who ask for a privacy form are told that their information will be shared with 168 different organizations, including the LAPD.

In Allegheny County, Pennsylvania, a screening tool was put in place to forecast the likelihood of child abuse and neglect; this predictive risk model assigns children a score based on several contributing factors. In the county, abuse is defined as bodily harm to a child, while neglect is defined as prolonged or repeated lack of supervision that endangers a child. The scores range from 1 to 20, with 1 being the lowest risk and 20 the highest. Before this tool was put in place, a savvy public servant with a big budget decided to create a data warehouse that included information from the county’s department of human services, state public assistance services, and more. The data warehouse lives across two servers and holds more than 1 billion records, which equates to 800 records per person in the county. The warehouse, which is managed by Deloitte, costs $15 million a year to run. The screening tool that was put in place to assess the risk of child abuse or neglect hooks directly into this data warehouse. The result is that anyone who has ever interacted with local or state government assistance automatically gains a score, simply because there is a history of the family using public services. This creates an extreme bias: the system disproportionately targets poorer people and questions their parenting abilities. Middle-class and wealthy people rely less on state assistance, so they likely don’t even exist in the data warehouse, giving them no risk score unless a complaint is made about them directly. In 2016, there were 15,139 reports of abuse or neglect in Allegheny County, and the screening tool was wrong about 3,633 of these cases. Because the data set used by the tool only includes families that access public services, it is guaranteed to produce thousands of false negatives and false positives annually.
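To put those two figures in perspective, a quick back-of-the-envelope calculation using only the numbers quoted above:

```python
# Figures quoted for Allegheny County, 2016:
total_reports = 15_139   # reports of abuse or neglect
wrong_calls = 3_633      # cases where the screening tool was wrong

error_rate = wrong_calls / total_reports
print(f"error rate: {error_rate:.1%}")  # prints "error rate: 24.0%"
```

Roughly one in four screened cases was mis-scored, and every one of those errors lands on a real family.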

Eubanks makes the point throughout the book that although these high-tech systems are being used to discriminate against vulnerable populations, the goal of criminalizing and punishing the poor is nothing new in America. Eubanks throws back to the days of the poorhouse, the institutions that people facing economic hardship were sent to live in throughout the 1800s and early 1900s. At the time, the public believed that placing people in these homes provided them with care while instilling the moral values of thrift and industry. The reality of the poorhouse was horrid living conditions, malnourishment, public shame, having your children taken from you, and a loss of basic civil rights. It can be said that these high-tech systems are simply upholding a long-standing American tradition of policing, profiling, and punishing the poor. With the help of technology, the poorhouse has evolved into the digital poorhouse. Is it a tradition as American as apple pie?

The examples from Indiana and Allegheny County have something in common that is worth mentioning: both are post-industrial areas that have been severely impacted by globalization. The book describes how, in both places, the primary industries packed up and left, leaving no opportunities behind for residents. In these examples we are dealing with governments that have very limited resources to serve their communities. These systems were put in place to make better use of those finite resources but ended up doing more harm than good. One goal in Indiana was to reduce welfare fraud; while fraud does happen, it occurs far less often than people believe. To me, these two examples are much more reflective of the neoliberal, Chicago School style of government prevalent in the West than they are an argument that technology is evil. Technology is the tool these hollowed-out governments use to decide who is eligible for public resources and who should be investigated by the state. While technology certainly plays a significant role in facilitating the examples in this book, it is not the only bad guy in these stories. Consider a historical example: IBM punch card machines were used by the S.S. to catalogue those sent to concentration camps. Yes, IBM punch cards helped facilitate the atrocities committed in those camps, but we do not blame IBM for the Holocaust. The bigger problem is the political system and economic framework that these technologies are forced to operate under.
