
Book Review: Algorithms of Oppression: How Search Engines Reinforce Racism

Lorin K. Jackson, LibParlor Contributing Editor, discusses the implications of algorithms through our book review column.

Author Background

Dr. Safiya Umoja Noble is an Associate Professor at UCLA in the departments of Information Studies and African American Studies, and a visiting faculty member at the University of Southern California’s Annenberg School for Communication and Journalism. Algorithms of Oppression: How Search Engines Reinforce Racism is a best-selling book published by NYU Press. She has also co-edited two other volumes on technology issues: The Intersectional Internet: Race, Sex, Class, and Culture Online and Emotions, Technology & Design.

TLDR/Summary

Algorithms of Oppression: How Search Engines Reinforce Racism (2018) thoughtfully analyzes what algorithmic oppression is and why it is so insidious. While the book is not a recommended introductory read, given the layered world of intersections between discrimination and technology it covers, information professionals with prior knowledge of intersectionality, basic concepts of knowledge organization, and current events will find that Noble presents a compelling argument. She implores us to consider why the information provided on the internet requires not just further regulation but deeper thought. Noble shows us that things are not always as they seem. How we get out of the tangled web woven by algorithmic oppression and technological redlining is not immediately clear, but Noble urges us to take up the problem with fervor.

Book Review Introduction

As a young person, I remember learning quickly that certain search terms online would return results that were not actually what I was looking for or, more to the point, what I wanted to see. I wanted to guard myself against what certain search queries would return. While mine is not the generation of digital natives that trails me, I was there to watch the internet take over the ways that we live, and it has shaped how we have lived ever since.

In Algorithms of Oppression: How Search Engines Reinforce Racism (2018), Safiya Noble engages with how the internet and Google Search mirror wider sociological, hegemonic phenomena. She argues that these platforms ultimately create more opportunities to uphold hegemony. After reading, I now shudder at the thought of being a young person who tacitly accepted that a search on “black girls,” girls like myself, would inevitably return pornographic results. Noble raises interesting points and scathing critique about how this kind of behavior has become commonplace among the American populace: common knowledge, accepted, undisputed, treated as truth. An experiment in world-making, Google is representative of a class of entities, alongside Facebook and Instagram, that provide a “free” service allowing us to engage with the world, discover information, and connect with one another, while they collect intensive data about our lives, who we are, and, unfortunately, what we will buy. What makes these digital entities billion-dollar enterprises if everything they do is “free”? Why do these corporations act as if the algorithmic data and data mining about our lives are secondary when that seems to be their primary business strategy? As consumers, we accept these terms and conditions in order to engage, but at what cost? As Noble elucidates in this short but dense read, everyone is impacted by the consequences of these entities according to their hierarchical “place” in society.

Intermediate to Advanced Appeal

Algorithms does not serve as an introductory read. Its style, and the confluence of ideas it offers, are complex and layered for the unfamiliar reader. Some background in sociologies of race, media studies, and stereotypical representation would help as a precursor to this book. Similarly, the text relies on some prior knowledge to support understanding its conclusions regarding gender, intersectionality, pornography studies, and sexual deviance, for example. For information professionals, though, the book offers a myriad of critical perspectives that Noble implores us to use to inform our work. Perhaps no entity is more critically under-examined than the behemoth that is Google and its suite of products, particularly Google Search. Noble carefully breaks down the intricate ways that Google Search, and Google itself, are not neutral entities. Bringing in relevant research about Google and crediting scholars such as Alex Halavais, Christian Fuchs, Sarah T. Roberts, and Sanford Berman, whose related work builds the foundation of her argument, she clearly unpacks how these biases operate. She shows, through countless examples, that the very same social inequalities that are ubiquitous off the internet, IRL (in real life), are present within the fabric of the internet, and maybe even magnified. Interestingly, some of the examples in the early chapters suggest a godlike, untouchable faith in algorithmic results on the part of the general public. Deep machine learning, as a form of artificial intelligence, grows and develops its own reference points out of initial human coding, programming, and insight. These programs do not materialize out of thin air. Much as vocational awe lends librarianship an unearned reverence in the eyes of outsiders to the profession, a similar reverence surrounds IT and its systems, and these lofty assumptions should not go unquestioned.
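To make that last point concrete for readers who code, here is a minimal, hypothetical Python sketch; the query, URLs, and click log are invented, and this is emphatically not how any real search engine is implemented. It only illustrates the general pattern Noble describes: a system "trained" on human behavior hands that behavior back, amplified.

    # A minimal, hypothetical sketch (not Google's code) of a ranker that
    # "learns" only from past clicks. Every query, URL, and click below is
    # invented for illustration.
    from collections import defaultdict

    # Assumed toy click log: (query, clicked_url) pairs left behind by past users.
    click_log = [
        ("example query", "https://majority-interest.example"),
        ("example query", "https://majority-interest.example"),
        ("example query", "https://marginal-voice.example"),
    ]

    def train_popularity_ranker(log):
        """Score each (query, url) pair purely by how often it was clicked."""
        scores = defaultdict(int)
        for query, url in log:
            scores[(query, url)] += 1  # more past clicks, higher future rank
        return scores

    def rank(query, candidates, scores):
        """Order candidate URLs for a query by their learned click score."""
        return sorted(candidates, key=lambda url: scores[(query, url)], reverse=True)

    scores = train_popularity_ranker(click_log)
    print(rank("example query",
               ["https://marginal-voice.example", "https://majority-interest.example"],
               scores))
    # The majority-clicked page ranks first, attracts still more clicks, and so
    # ranks even higher after the next retraining: a feedback loop, not neutrality.

Nothing in this toy materializes out of thin air either: the "intelligence" is just a record of what people already did, which is precisely why it cannot be treated as neutral.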

“Politics of Recognition”

Algorithmic oppression and technological redlining are two terms Noble uses to codify the inescapable reality that those marginalized in society, such as women and people of color, are further cast aside by the perpetual technological power of platforms like Google Search:

“…These issues are at the heart of a ‘politics of recognition,’ which is an essential form of redistributive justice for marginalized groups that have been traditionally maligned, ignored, or rendered invisible by means of disinformation on the part of the dominant culture. In this work, I am claiming that you cannot have social justice and a politics of recognition without an acknowledgment of how power – often exercised simultaneously through White supremacy and sexism – can skew the delivery of credible and representative information” (84).

Similar to the decades-old fight by critical librarians and researchers regarding the danger of classification schemes that minimize, erase, frame, and subject marginalized peoples, Noble makes a compelling argument that Google Search does the same.

Because corporate interests and financial returns follow from specific searches being framed in a certain way, seemingly innocuous search terms may return horrifying results that only further the subjugation of marginalized people. This arrangement only increases profit for the companies that provide the data. The social majority is racist, so the profit motive pushes search companies to appeal to the majority; it is not in their economic interest to appeal to minority or marginalized groups: “This research by Roberts, particularly in the wake of leaked reports from Facebook workers who perform content moderation, suggests that people and policies are put in place to navigate and moderate content on the web. Egregious and racist content, content that is highly profitable, proliferates because many tech platforms are interested in attracting the interests and attention of the majority in the United States, not of racialized minorities” (56).
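Noble is describing an economic logic rather than a specific line of code, but a toy sketch can show how that logic plays out mechanically. The following is a hypothetical illustration only: the URLs, click rates, and ad prices are invented, and the formula stands in for the general idea that ranking optimized for advertising revenue follows majority attention.

    # A minimal, hypothetical sketch of revenue-driven ordering (invented numbers,
    # not any real company's formula): results are sorted by expected ad revenue,
    # i.e. estimated click-through rate times what advertisers pay per click.
    results = [
        # (url, estimated_click_rate, ad_price_per_click)
        ("https://sensational-stereotype.example", 0.30, 0.50),
        ("https://community-archive.example", 0.05, 0.10),
    ]

    def expected_revenue(result):
        _, click_rate, ad_price = result
        return click_rate * ad_price

    # Sorting purely by expected revenue surfaces whatever the paying majority
    # already clicks on, regardless of whom the result misrepresents.
    for url, _, _ in sorted(results, key=expected_revenue, reverse=True):
        print(url)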

Intersectionality and Digital World-Making


Noble’s project is noble and necessary. While her work may be most notable for library and information professionals, the truth is that everyone either uses Google, is impacted by Google, or both, so really, this book is for everyone. Though, as explained previously, its approach and jargon may not be immediately accessible to readers visiting from outside the field, there are places in the book where she works to make ideas accessible: she defines what she predicts may be unfamiliar terminology and benchmarks key facets of her argument by returning to major points for the reader throughout. She notes that there have already been significant strides in research regarding the corporate interests of Google, and adds that what is innovative about her research is its application of an intersectional lens to recurring themes of social injustice that have been present since the beginning of history. She rejects the supposition that a neutral, natural human exists as the purported unbiased vantage point from which the internet and its content are constructed:

“This teleology of the abstracted individual is challenged by the inevitability of such markers and the ways that the individual particularities they signal afford differential realities and struggles, as well as privileges and possibilities. Those who become ‘marked’ by race, gender, or sexuality as other are deviations from the universal human – they are often lauded for ‘transcending’ their markers – while others attempt to ‘not see color’ in a failing quest for colorblindness” (62).

Noble’s is a landmark text in research regarding social inequalities and the way they present themselves in our world. Without mentioning this framework explicitly, she brings us into the implications of digital world-making, culture-making, and record-making.

Conclusion

What we have to learn is what we have to gain. In this lofty project of dismantling algorithmic oppression, Noble makes the case, provides the evidence, and clearly outlines the verdicts. There is no disputing what she says, thanks to the lucidity of her work. In the era of combating fake news and alternative sources, her work gives us a foundation on which to stand and from which to launch into researched, proven truth.

Our social dynamics are replicated in the digital sphere, showing us what “value” we represent and how much power we have over our representations or misrepresentations on the web. The supposition that the algorithms that return our results are out of our hands is, frankly, untrue. Algorithms are controlled by the companies that provide the search platforms and by the programmers behind them. As Noble points out, there is overwhelming evidence to support the claim that the internet is largely part of the same projects of marginalization that exist outside the digital world. We see the same problems play out for similar reasons: “…Representations in search engines are decontextualized in one specific type of information-retrieval process, particularly for groups whose images, identities, and social histories are framed through forms of systemic domination” (149).

She herself notes that a risk and weakness of writing a book on this topic is that, much like rapidly changing technology itself, it is already becoming obsolete by the time it is published. We can never “catch up” to how quickly the world moves in this Internet Age. The next steps, and the implications of her work for the work we must do to provide a counternarrative to the weight of this discriminatory history, are alluded to in the conclusion and throughout the book but are never fully articulated.

The utility of this book lies in its definition of several of the key components we will need to consider and take up in order to dismantle the grip that xenophobic ideology has taken on the internet. I appreciate how this book further complicates and examines the so-called “digital divide” that disproportionately impacts people of color, working-class people, and women:

“By rendering people of color as nontechnical, the domain of technology ‘belongs’ to Whites and reinforces problematic conceptions of African Americans. This is only exacerbated by framing the problems as ‘pipeline’ issues instead of as an issue of racism and sexism, which extends from employment practices to product design” (66).


While we may have understood that there were issues underlying people’s access to and understanding of technology, I think Noble offers a substantive account of why some people have been caught outside of the web or, rather, tangled within it. Notable about Noble’s account is her integration of current events: she references moments connected to technological redlining and algorithmic oppression in political elections, #Gamergate, Black Girls Code, the Dylann Roof church massacre, and revenge porn, to name a few. Implicit bias is an undercurrent and a powerful force in reifying our social hierarchies. Attacking it requires thoughtful intervention and real inclusion beyond lip service. In order to help make the internet truly democratic and safe, particularly for people who are marginalized in society, like Black women and girls, we have to consider carefully what is at stake.

“What we find in search engines about people and culture is important. They oversimplify complex phenomena. They obscure any struggle over understanding, and they can mask history. Search results can reframe our thinking and deny us the ability to engage deeply with essential information and knowledge we need, knowledge that has traditionally been learned through teachers, books, history, and experience. Search results, in the context of commercial advertising companies, lay the groundwork…for implicit bias: bias that is buttressed by advertising profits. Search engines also function as a type of personal record and as records of communities, albeit unstable ones. In the context of commercial search, they signal what advertisers think we want, influenced by the kinds of information algorithms programmed to lead to popular and profitable web spaces” (116-117).


Featured image by Markus Spiske, via Pexels


Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License


The expressions of the writer do not reflect anyone’s views but their own.

