What happens when you type “black girls …” into a Google search? Why does courtroom-sentencing software used by judges overpredict Black criminality? Would Dylann Roof have murdered innocent worshippers at Emanuel African Methodist Episcopal Church in Charleston, South Carolina, had his Google search for information after the Trayvon Martin case not suggested that Black-on-white crime was an epidemic? Do we have a right to be forgotten, or do digital traces last forever in the vaults of search engines such as Google? Worse yet, what can we do when information produced for private consumption is circulated in acts such as “revenge porn” that can damage or altogether obliterate women’s employment opportunities? How is Black hairdresser Kandis supposed to keep her business when the algorithmic practices of Yelp, in concert with the retreat from affirmative action by the major university near which she works, reduce and nearly erase her visibility and her ability to make a living?

In Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Umoja Noble brings together Black feminist studies, library and information sciences (LIS), and media studies to carve out the field of Black feminist technology studies (BFTS). Begun as a follow-up to Andre Brock’s 2011 challenge to see what happens when you type “black girls …” into a Google search, the book pursues the study of this most influential search engine. What Noble finds are tremendous continuities with previous deployments of information and media technologies and a few, mostly dystopian, ruptures that threaten our democracy and our ability to access information. Trained originally in LIS, Noble details how systems of categorization embody biases in classification. LIS citation practices served as a model for Google creators Larry Page and Sergey Brin.
Noble notes that even though multiple review processes are used to police citation practices, those practices continue to demonstrate race and gender exclusionary tendencies. Moreover, Page and Brin themselves noted the potential for abuse of the search engine process by commercial interests. The application of a practice created to facilitate the production of knowledge to the marketing of commodities is but one of many transitions from the public commons to the advertising of everything. The consequences of this latest form of neoliberal assault on our public commons are borne disproportionately by the most vulnerable members of our population. Noble focuses primarily on Black girls and women, though she acknowledges that this research applies to Latinas, Asian Americans, and indeed any racialized female category.
Noble takes many “truths” about the Internet in general, and the Google search engine in particular, and debunks them. These debunkings are important to circulate, given Noble’s ample evidence that the public by and large trusts Google as a search engine. Not enough people realize that search results do not represent the best or even the most popular information. Given that the Internet and search engines, especially Google, are part of culture and the economy, Noble urges us to remember two important things. First, our culture is rooted in a system of white supremacy. Historical analyses of employment and representation in media industries show persistent patterns of racialized tropes from advertising to pornography. Second, the contemporary formation of neoliberal economics turns everything, including matters recently thought to be public goods, such as information and education, into commodities. Pretending that Google will strengthen democracy and provide neutral information is just that—a pretense. Google is an advertising engine, not an information search engine. It is but the latest in a series of information technologies whose promise is democratic but whose actual deployment is commercial. Indeed, as with previous “new” technologies of information and media, search engines, Google among them, exist to make a profit, not to strengthen democracy. Furthermore, drawing on the great insight of Dallas Smythe, a political economist of communication, Noble contends that “[w]e are the product that Google sells to advertisers” (162). Although we are being sold products, we are the major commodity produced by the Google search engine. Every one of our clicks becomes valuable capital for Google. Moreover, the uses to which our freely provided labor is put have racial and gender implications. The abuses of big and personalized data gathering disproportionately affect those most vulnerable.
Information seeking about Black women yields little information not connected to pornography, and it is nearly impossible to find factual material about Black women as knowledge-producing agents. Black women have little power to influence how, and if, they appear in Google.
I tell my students at the beginning of every term: nothing happens without human labor. It is so easy to think that wireless actually means without wires, when nothing could be further from the truth. Every summer, after students leave for vacation, hundreds of workers, mostly white males, descend on campus, digging up floors and ceilings to update the wires necessary for our wireless academic terms. Capitalism thrives on the fetishization of labor, and Noble’s book takes us to the beginning of the “information ecosystem,” where working-class Black bodies in Ghana dug for raw materials and disposed of the highly toxic “disposable” technology with which we in the First World once surfed through search engines. Yet by the time we get to the labor force at Google, the highly paid programmers are mostly white males. At this point, human labor makes decisions informed by deep-seated cultural narratives. Despite the scientific ring of the word “algorithm,” Noble reminds us that algorithms are created by human labor. The Internet, and search engines as a component of this “information ecosystem,” may be facilitated by data turned into seemingly objective informational tools, but they are, in fact, “mathematical formulations to drive automated decisions … made by human beings” (1). Furthermore, Noble points to recent, well-documented cases in which programmers at Google, the same guys who claim objectivity, circulated highly misogynist memos that cast doubt on their gender neutrality. Highly publicized window-dressing efforts, such as the 2016 announcement that Black Girls Code would be moving into Google offices in New York, ignore the fact that there are already plenty of highly educated and trained Black coders, programmers, and marketers. This is not an issue of the “pipeline,” the term used by business and academia to suggest that the reason there are few employees of particular gender and race categories is that none are available.
Without criticizing organizations such as Black Girls Code, Noble asserts that this move takes attention away from discriminatory hiring practices at Google and in the information economy at large.
When blatant and unavoidable racist outcomes of “neutral” algorithms reach the news, Google reverts to two major excuses. First—and this is yet another continuity in media and information history—is the excuse of intent: Google did not intend any harm. As any good media scholar would, Noble replies that intent is irrelevant; we are looking at outcomes. Besides, intent shifts the level of analysis from the institution to the individual. Early in the book, Noble establishes that racial and gender representation in media and in search engines has long historical roots in a system of white supremacy. Second, in egregious cases of racism and/or sexism, Google reiterates the neutrality of algorithms. Sometimes, shortly after a particularly egregious case, salience and order are slightly recalibrated. For instance, the link between pornography and Black girls is toned down a little, for a limited time. I am reminded of a lecture Lisa Nakamura, coeditor of Race in Cyberspace, once gave to one of my classes. She said, “The entire Internet is built on a thin layer of ice over the huge industry of pornography.” We owe the pornography industry the ability to make online ordering easy and private. The price we pay for the ability to order privately is the loss of our private information to the Googles of the world. Thanks to monopoly capitalism, there are not many Googles, but that feels like an empty form of relief.
Noble ends the book with an interview with Kandis, a hairdresser in a university town. The twin forces of the university’s abandonment of its diversification goals and the shift to online advertising have meant that Kandis has had to move from word-of-mouth advertising to Yelp. However, Kandis astutely realizes that she cannot navigate Yelp’s practices without paying large advertising fees, which she cannot afford. Ending the book with this case study of an actual Black woman brings home its theoretical structure. Noble calls for public policy that protects everyone, especially the most vulnerable. This is not an easy fix; there is no app for that. Indeed, the entire book is a cautionary tale, exhorting us to contextualize “new” communication technologies in relation to a long history of racialized and gendered continuities. Use of such scientific-sounding terms as “algorithm” masks the continuation of oppressive structures. A Black feminist perspective succeeds in elucidating the racial and gender biases of so-called neutral search engines.
Angharad N. Valdivia

Angharad N. Valdivia is a research professor of communications and media studies at the University of Illinois at Urbana-Champaign. She has published widely on media studies as a field and on issues of transnational gender and popular culture, with current attention to Latinas in mainstream media, especially girls and Disney.