I took time off from reading my ALAMW haul to read Safiya U. Noble’s new title Algorithms of Oppression. I had read her Bitch article back in grad school and was fascinated. When I saw the full-blown monograph, you know I had to select it for the Libraries.
With Algorithms of Oppression, Noble wants to pressure tech companies not to build pre-existing biases into their algorithms and to actively combat any emergent biases their algorithms develop. She also encourages consumers of these technologies (given her book’s tone and style, primarily academic consumers) to engage critically with them and to demand that the companies combat biases in their products.
She primarily looks at Google (though her conclusion features an excellent interview concerning Yelp). She starts with her initial 2010 search for “black girls,” which she had run in hopes of finding topics to discuss with her stepdaughter and nieces. Google answered her search with pornographic representations of Black women. From then on, she spent hours and hours testing Google, only to find that the search engine consistently failed to provide credible information about women of color (3). Google Images supplied images of Black people when “gorillas” was searched. Google Maps had the White House labeled as “N*gga House” during the Obama Administration (7). A search for Michelle Obama brought up a related search that included the word “ape” (9).
Noble then discusses the “historical and social conditions” that led to Google’s search results (17). She notes that search results allow us to see how Google, through its algorithm, conceptualizes a search term (24); and these results often show that Google’s conceptualizations are biased toward its own market interests (28). (See page 39 for a breakdown of Google’s search results page and how much of it is advertisements.) This allows those with power and money to more strongly dictate search results.
She also discusses in depth the search results surrounding various groups. As her article in Bitch would suggest, she spends many pages on the search “black girls.” She also examines searches for “Hispanic” and “Latina” girls, “Indian girls” (which brings up commentary on both Indian men and women), “white girls,” and “Native American girls.” Different professions and careers are examined as well: for example, searches for “doctor” lead to images of white men (82).
Noble’s third chapter is a fascinating look at what happens when an individual searches for information about certain communities. For example, Dylann Roof, who murdered worshipers at Mother Emanuel African Methodist Episcopal Church in 2015, had wanted to better understand the death of Trayvon Martin and the subsequent legal proceedings. Roof’s searches led him to White Nationalist groups rather than to information showing that homicide is most often intraracial (110-115).
Next, Noble discusses legal restrictions on search engines. These are few and far between in the United States, though the European Union has managed a few protections, including the Right to Be Forgotten. She also notes the parallels between search engines and the problematic aspects of library classification systems. Finally, Noble pushes readers to demand public policy around technology products and not to rely exclusively on technology for social justice.
As for my recommendation: if you are ready for a very academic text about Google and technology, then by all means, I recommend it. It is, however, as I mentioned, very academic, so it is not a light read. Still, for those of us systems librarians hiding in the basement, it is worth reading.