
Cordelia Fine

MARCH 6, 2018

Non-Fiction

Coded prejudice: how algorithms fuel injustice

Two new books highlight the ways in which technology is hurting minorities and the poor

Digital tools are often hailed as transparent and democratising “disrupters”. Two new books argue that this optimistic vision is mistaken and that algorithms, as currently deployed, pose a major threat to the human rights of marginalised groups.

By way of introduction to her topic, Safiya Umoja Noble, an assistant professor of information studies at the University of California, Los Angeles, recounts the experience that led her to write Algorithms of Oppression. In September 2011, looking for inspiration as to how to entertain her pre-teen stepdaughter and visiting nieces, Noble searched for “black girls” on Google. The phrase was meant to elicit information on things that might be of interest to their demographic; instead, it produced a page of results awash with pornography. As Noble drolly observes: “This best information, as listed by rank in the search results, was certainly not the best information for me or for the children I love.”

Noble acknowledges that any set of results will quickly become historical: repeating the search a year later, she found that pornography had been suppressed. But she argues that such episodes should be seen as systemic rather than as one-off “glitches”. Though Noble doesn’t charge Google with racist or sexist intent, she challenges the assumption that the output of Google Search reflects the democratic, if regrettable, inclinations of its users, and questions Google’s abdication of responsibility for the results its algorithms produce.

As Noble notes, it is easy to forget that Google is not a public information resource but a multinational advertising company. Although the mechanics of Google Search are rooted in the “citations” of internet users, its proprietary algorithms can favour the webpages of lucrative clients. Meanwhile, the multibillion-dollar search engine optimisation industry is devoted to manipulating certain webpages on to the coveted first page of results. As such, Google is more of a “supermarket of ideas” than the metaphorical marketplace regarded as so vital to democracy. Just as in an actual supermarket, the most eye-catchingly placed items are not necessarily there because they are the best quality and value, but because of the commercial interests and economic clout of retailers and producers. It’s a tolerable arrangement when it comes to baked beans, but what about social identities?

The ubiquity of Google Search reflects just how accurately and quickly its algorithms can find the information we want. But in Algorithms of Oppression we are confronted with a supermarket of information in which, turning into the aisle for “black girls”, we may be confronted with eye-level pornography. Over there, a shelf of “professors” displays row after row of white males. Black teenaged boys are to be found next to the criminal background check products. A stack of white supremacist “statistics” about “black on white crimes” obscures accurate governmental sources. As public information resources continue to be defunded, Noble asks us to think about the implications of our ever greater reliance on advertising companies “for information about people, cultures, ideas, and individuals”. Noble’s thesis is a new tune in the ever-louder chorus that, in light of the dominance of the big tech companies, is singing for “protections and attention that work in service of the public”.

The same question of who benefits also animates Automating Inequality. Drawing on three case studies in the use of automation and algorithms by public service agencies in the US, Virginia Eubanks, an associate professor of political science at the University at Albany, SUNY, exposes the digital “expansion and continuation of moralistic and punitive poverty management strategies that have been with us since the 1820s”. Automating Inequality should be required reading for every politician, public servant and software engineer. While Eubanks acknowledges improvements over earlier systems, her book is a chilling exposé of how easily hidden values and double standards can be embedded in, and exacerbated by, automation.

In Indiana, an automated benefits system that categorised even its own frequent errors as “failure to cooperate” denied a million welfare applications over three years. Wrongful denials of food stamps soared from 1.5 per cent to 12.2 per cent. Meanwhile, in the interests of efficiency and fraud reduction, automation replaced caseworkers who could exercise compassion and wisdom in helping vulnerable people navigate the complexities of poverty, illness, unemployment and bereavement. In Los Angeles, thousands of unhoused people provide intimate information to a database accessible by 168 different organisations, so that an algorithm can classify and prioritise them. It is a poor exchange for the more than 50,000 LA residents who remain unhoused, but whose personal data can be accessed by police without a warrant. The housed of Los Angeles who enjoy mortgage tax deductions do not have their personal information scrutinised or made available to law enforcement without a warrant, Eubanks points out. In contrast, such “blanket access” when it comes to the unhoused who are likewise seeking government assistance only makes sense, Eubanks argues, within a system “that equates poverty and homelessness with criminality”.

Meanwhile, in Pittsburgh, the records of state programmes and agencies feed into a statistical model that predicts children’s vulnerability to parental abuse and neglect. The vast overlap of poverty with markers of neglect results in a statistical model that “confuses parenting while poor with poor parenting”, flagging parents who access public programmes for assistance for their families as risks to their children. Eubanks’ analysis peels away the veneer of “objectivity”. When the poor reach out for public assistance, the model adds this information to its store of statistical suspicion. The middle class, on the other hand, can get help from babysitters, therapists, and private drug and alcohol rehabilitation centres without coming under statistical scrutiny. Calls to the abuse hotline triggered by racial and class prejudices or personal vendettas are indefinitely stored as input, while the model’s scanty validation data are “a record of decisions made by human case workers, investigators, and judges, bearing all the traces of their humanity”.

Despite attacking digital tools on different fronts, the two books have common themes. Both authors highlight the need for those involved in designing and using digital tools to actively attend to, and be trained in, ethical issues. While Noble recommends that technology companies employ more people educated in the humanities, a more effective approach, one suspects, would be greater diversity of leadership. It’s not a logical necessity that you need to be a black leader to care enough about the online representation of African-Americans to go to bat for their interests in a trade-off with profits, simply a strong psychological probability. Above all, both books disrupt the “fantasy”, as Eubanks puts it, that a “model or algorithm will magically upend culture, policies, and institutions built over centuries”. Noble and Eubanks pull back the curtain on digital tools of power and privilege, and ask whether we care enough to do anything about it.

Algorithms of Oppression: How Search Engines Reinforce Racism, by Safiya Umoja Noble, NYU Press RRP£21.99/$28, 256 pages

Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, by Virginia Eubanks, St Martin’s Press RRP$26.99, 272 pages

Cordelia Fine is a professor in the history and philosophy of science at the University of Melbourne.