Virginia Eubanks at Slate: “Algorithms don’t just power search results and news feeds, shaping our experience of Google, Facebook, Amazon, Spotify, and Tinder. Algorithms are widely—and largely invisibly—integrated into American political life, policymaking, and program administration.
Algorithms can terminate your Medicaid benefits, exclude you from air travel, purge you from voter rolls, or predict if you are likely to commit a crime in the future. They make decisions about who has access to public services, who undergoes extra scrutiny, and where we target scarce resources.
But are all algorithms created equal? Does the kind of algorithm used by government agencies have anything to do with who it is aimed at?
Bias can enter algorithmic processes through many doors. Discriminatory data collection can mean extra scrutiny for whole communities, creating a feedback cycle of “garbage in, garbage out.” For example, much of the initial data that populated CalGang, an intelligence database used to target and track suspected gang members, was collected by the notorious Community Resources Against Street Hoodlums units of the LAPD, including in the scandal-ridden Rampart division. Algorithms can also mirror and reinforce entrenched cultural assumptions. For example, as Wendy Hui Kyong Chun has written, Googling “Asian + woman” a decade ago turned up more porn sites in the first 10 hits than a search for “pornography.”
But can automated policy decisions be class-biased? Let’s look at four algorithmic systems dedicated to one purpose—identifying and decreasing fraud, waste, and abuse in federal programs—each aimed at a different economic class. We’ll investigate the algorithms in terms of their effectiveness at protecting key American political values—efficacy, transparency, fairness, and accountability—and see which ones make the grade.
Below, I’ve scored each of the four policy algorithms on a scale of 1 to 5, 1 being very low and 5 being high…
Of course this ad hoc survey is merely suggestive, not conclusive. But it indicates a reality that those of us who talk about data-driven policy rarely address: All algorithms are not created equal. Policymakers and programmers make inferences about their targets that get baked into the code of both legislation and high-tech administrative tools—that SNAP recipients are sneakier than other people and deserve less due process protection, for example….(More)