Noble’s “Conclusion” and “Epilogue” in her book *Algorithms of Oppression* highlight her research on the inequalities that exist on the internet, in which popular, public algorithms continue to ingrain oppressive and discriminatory ideas in our society. Noble stresses that because these companies are so user-friendly and accessible, people who are unaware of systemic oppression become even more vulnerable to the biases spread on platforms that the public sees as safe, neutral, objective, and reliable. Noble ties up her book with an interview regarding how platforms have a “lack of identity control” (173). Algorithms also affect individuals’ lives, mostly those of minorities, outside the computer: their identities are negatively targeted, pushed aside, or taken control of to make room for the non-marginalized.
Noble’s choice to use an interview for her conclusion was a strong writer’s move because it allows readers to feel close to the issue and reminds them that situations such as Kandis’s can happen to almost anyone. Noble’s argument in presenting this interview is that the African-American community (as well as other minority groups) must create an influence in the field of technology studies to push back against and challenge racist, sexist, and economic biases. Black feminist technology studies (BFTS) is a topic I had not heard of, yet I assumed it was already being practiced in the corporate world of technology. As Noble writes, BFTS is an “epistemological approach to researching gendered and racialized identities in digital and analogue media studies, and it offers a new lens for exploring power as mediated by intersectional identities” (171-172). Noble highlights that Black feminist technology studies are crucial when planning progress toward the growth of equity online. Having African-American feminists, youth, and women work to erase stigmas by becoming a large percentage of the contributors behind the screen of algorithms, and of other online platforms that employ negative biases, allows for the growth of inclusive content. This idea reminds me of the push for students to learn science and math during the Cold War. At that time, the United States government was pushing for its students to become experts in these fields so they could contribute to the space race; the drive to use children in this competition arose from America striving to be first in technological advancement rather than letting Russia or any other country beat them. If society pushed equity in technological advancement the same way, it would look like students of color and more girls being taught in school how to code, how to make platforms more user-friendly, and how to edit credible information on websites. Is this not just as important, if not more so?
We need to ask ourselves why we aren’t pushing the idea of getting behind the screen to fix the inequalities that algorithms project back onto us. Technology is going to be here forever, continuously growing, whether we decide to admit it or not.
Kandis’s issue with Yelp could have been at least partially solved if Yelp had actual people checking what type of content was being shown about Kandis’s business, or if its algorithms were designed to protect the “rights” of their users. Kandis’s business started to fall short when the community around her physically changed, and with it the representation and reputation of her hair salon online. Kandis used Yelp to get her business’s name out to the public; before the age of the internet, word of mouth alone had earned her the glowing reviews behind the salon’s intense popularity. Yet Kandis found that competitors who didn’t even offer what her salon did were taking over Yelp, leaving her salon in the shadows of the search pages. Her positive reviews did not carry over to the salon’s online representation, and she was effectively paying for her White competitors to be ranked ahead of her! Kandis states, “I can’t find myself and why, when I use certain keywords…they are suggesting that I don’t exist” (Noble 177). As previously mentioned, everyone under the Constitution of the United States has rights that can be exercised. Kandis’s non-existent business and online identity reminded me of the roots this may come from, in a seemingly far-fetched comparison to slavery. If people of color aren’t being represented as their individual selves, are treated as property online, and are overrun by the controlling power of White companies, what freedom do people of color get, and where are their rights? Why does the work of African-Americans become property, even at this point in history, when today’s society is supposed to be so good at recognizing the rights of the individual? Companies such as Yelp hold far greater wealth than their customers, which gives them power over certain individuals in multiple contexts and lets them dictate those individuals’ presence on the internet.
Noble’s “Conclusion” and “Epilogue” leave readers with a heavy responsibility to try to enact change, in any form, against algorithms of oppression. The last two chapters really solidified for me the point of Noble’s research and its immense effect on all of society.
Noble, Safiya. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press, 2018. Print.
- How would you feel if the company you worked for constantly put your competitors’ and/or peers’ identities before yours? What does this do to a person’s emotions? How would you try to make a name for yourself if the company you work for owns everything under your name and identity? In what ways do you feel this is a violation of property rights?
- Do you agree with Noble’s suggestions on how to “stop” these algorithms of oppression? Why do you think I put “stop” in quotation marks? Do you think it is possible to make progress with the suggestions Noble offers, such as using “alternatives to commercial information platforms”? What do you think we can teach our friends, family, students, and anyone else necessary about Noble’s closing chapters?