
Virginia Eubanks

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor

Nonfiction | Book | Adult | Published in 2018

Index of Terms

AFST (Allegheny Family Screening Tool)

Launched in August 2016, the AFST is a predictive risk model designed by New Zealand researchers and implemented through independent corporate strategists in Allegheny County, Pennsylvania. It replaced a system that relied on human decision-making with one that rigidly adheres to algorithmic rules and sidelines social work professionals. The AFST uses data graded along a risk/severity continuum to determine whether a family will be investigated for child abuse by a county social worker. Since its implementation, the AFST has proved inaccurate at best and biased at worst. Its algorithm equates symptoms of poverty with those of child abuse: “A quarter of the predictive variables in the AFST are direct measures of poverty: they track use of means-tested programs such as TANF, Supplemental Security Income, SNAP, and county medical assistance” (156). This illogical framework builds bias directly into the tool’s own dataset.

Coordinated entry system (CES)

Los Angeles’s automated welfare system collects, stores, shares, catalogs, classifies, and numerically ranks information about the unhoused. Designed to triage those in immediate need of housing, CES should be “a standardized intake process to reduce waste, redundancy, and double-dipping across agencies” (85), but in practice it often comes up short and is difficult to navigate. Moreover, CES threatens the very people it claims to help because it so deeply invades their privacy: The data collected during intake is stored in the Homeless Management Information System (HMIS) for at least seven years and is available to law enforcement for indiscriminate fishing expeditions.

Digital Poorhouse

The digital poorhouse is a metaphor that paints automated algorithms and systems used in welfare as a modern version of the historical poorhouse. Eubanks argues that the digital poorhouse’s “goal is what it has always been: to profile, police, and punish the poor” (38). Just as the physical poorhouses of the 19th and early 20th centuries demonized, isolated, and contained the poor, so too do today’s digital and dehumanizing datasets put “administrative power in the hands of a small elite” while creating “towering inequalities” (200) via welfare and social assistance programs.

Scientific Charity

Scientific charity is an outdated methodology of assistance that attempted to quell the backlash against poor relief through “more rigorous, data-driven methods” (22). Rather than providing blanket welfare for citizens as the countries of Europe did, the US drew moral distinctions separating the “deserving poor from the undeserving” (22), making public assistance contingent on monitored behavior and thus eventually criminalizing poverty itself.

Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA)

Signed into law by President Bill Clinton after he vowed to “end welfare as we know it” (37), this 1996 law decimated US welfare systems: After its passage, “almost 8.5 million people were removed from the welfare rolls between 1996 and 2006” (37).

Rational Discrimination

Eubanks uses this term to describe how automated decision-making systems and artificial intelligence entrench and proliferate bias rather than being the impartial tools they are supposed to be. Because algorithms are created from existing datasets and programmed by people who necessarily have blind spots, these systems end up perpetuating pre-existing biases.

Temporary Assistance for Needy Families (TANF)

In 1996, PRWORA replaced the children’s welfare program Aid to Families with Dependent Children (AFDC) with TANF, which carried much stiffer compliance regulations and harsher penalties for failing to meet the law’s requirements. As a result, far fewer poor children receive assistance: “In 1973, four of five poor children were receiving benefits from AFDC. Today, TANF serves fewer than one in five of them” (37).

VI-SPDAT (Vulnerability Index—Service Prioritization Decision Assistance Tool)

The first step of the coordinated entry process in Los Angeles, the VI-SPDAT assesses an unhoused person’s vulnerability according to the urgency of their circumstances on a scale from 1 to 17. It uses variables like “social security number, full name, birth date, demographic information, veteran status, immigration and residency status, and […] domestic violence history” (94). Although the survey invades the privacy of those it documents, the bureaucracy behind the VI-SPDAT is extremely secretive, refusing to share its results or methodology, thereby dehumanizing unhoused people and limiting their agency.
