Technological Redlining: Who Suffers?

This reading brought to light several issues the author argues technological redlining fosters. Many of these were incredibly eye-opening, evoking a sense of fear and frustration. Safiya Umoja Noble discusses the ways in which the algorithms that control what we see, when we see it, and how we see it are made by individuals. Noble says, “… some of the very people who are developing search algorithms and architecture are willing to promote sexist and racist attitudes openly at work and beyond, while we are supposed to believe that these same employees are developing ‘neutral’ or ‘objective’ decision-making tools. Human beings are developing the digital platforms we use…” (2). These mathematical formulas bear a human impression, which ultimately leaves the power of bias and discrimination in the hands of the creator. This is a terrifying thought for two reasons: the first is the underlying, and sometimes prominent, racism on the World Wide Web; the second is the sense of powerlessness I felt while reading this. The inventors of these algorithms are not only technologically advanced but also hold power and control over what content is distributed. They ultimately have the final say over what material is shown on search engines, on social media, or in advertisements, leaving everyday people with the challenge of sifting through and spotting these inequalities. How do we stop this? If the internet is supposed to be a free and open space for all, how can these issues still exist?

Reading these pages in particular made me think of the documentary I brought up in class last week about Cambridge Analytica. For those who are unfamiliar, Cambridge Analytica was a consulting firm that processed users’ data to influence or sway feelings on certain topics. During the 2016 presidential election in the United States, it was discovered that Cambridge Analytica had taken personal data from millions of Facebook users, without their consent, in order to target certain groups of people with political advertising. If the firm saw that a user supported the Republican Party, or had liked a page or article having to do with Donald Trump, Cambridge Analytica would then tailor the content that user saw to increase the person’s support for the Republican Party. The same thing would occur for someone who was a Democrat; their content would be altered for the “greater good” of a political campaign. Taking this information from users without consent is damaging not only to their safety but to their mindset as well. If they don’t need to look any further than their timeline for information that supports their political beliefs, then they won’t (whether the article or advertisement is true or false). This hinders people’s ability to think freely and gives political candidates an unfair advantage over others running for office. Facebook and its founder, Mark Zuckerberg, are still being investigated and continue to testify before Congress.

In a day and age when information is only a click away, one would hope that the content we view would not be influenced as heavily as it is. This book serves as an eye-opener to anyone who reads it: we are never “safe” online, and our search history and information are constantly being monitored and evaluated.

-Adrienne

Discussion Questions:

  1. What efforts can teachers / professors take to better educate their students about the issue of technological redlining? How can we educate each other to spot and call out racist ideals on the internet?
  2. Even though studies show that search engines like Google are racist, why do you think people still use them so frequently? Do you think Google’s popularity will ever decrease?
  3. Can you think of any other search engines or websites that portray groups of people in a certain way? Or sites that contain algorithms that can be racist?

Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (New York University Press, 2018).

8 Replies to “Technological Redlining: Who Suffers?”

  1. This was a great response to the article; I like that you brought in your own opinion! It also helps explain what the book is saying in the first place. To answer your first question about teachers and how to educate students on these topics, I think Dr. Savonick did exactly that last class when she highlighted our Terms and Usage agreement for SUNY Cortland. I’m pretty sure everyone in the class was shocked by the findings. Using examples from our own lives, and showing how these issues directly affect us, is a good way to educate students. I also think that if you’re a smart reader on the internet, you can tell when things are being tailored to the algorithms you play into. It’s no surprise that we always joke about “our F.B.I. agent” who monitors our phones and laptops. We all realize this is happening, but just putting a sticker over your laptop camera (which I 100% have) does not change it; we need to do more. By sharing and discussing this information with others, we may be able to stop these issues at the root! We can find or make a new search engine that does not act on these prejudices! I think your example of the 2016 election was a good way to start this discussion. We all have something to say about that election; were these tactics the cause of the turnout? Let’s answer the important questions as a society.

  2. Nice post! I was super interested in class when you brought up the documentary, and I like that you brought it up again along with your opinion. It’s an obvious and scary thing that our advertisements and suggestions are based on our browsing history. Something I have noticed since our discussions in class is that a lot of my YouTube advertisements are polls asking my age, my gender, and what my interests are, so they can cater ads to who I am (as if they don’t already know). As for your questions, I think teachers should take the time to have a detailed lesson on it, like we do in our class, and educate students on these racist ideals by having them come up with examples of where they are seen. I think Google is used so often because not many people are educated on how it reflects racism, and it has just been the number one source for everyone for years. If people become educated on this, there may be a chance of its popularity decreasing.

  3. Hi Adrienne!
    I enjoyed your post. I like how you brought in outside information from the documentary about Cambridge Analytica. As I was reading Algorithms of Oppression, I was surprised by the different examples and pictures Safiya Umoja Noble incorporated into her book. I feel like if someone were to suggest to someone else that discriminatory and racist ideas are embedded in multiple search engines, they wouldn’t believe it. The fact that Noble shares certain images, for example the results of typing in “black girls” or “n***** house,” provides readers with solid evidence that the internet truly does have sexism and racism deeply instilled in its code. Noble then brings us to the idea that many people see these faults on the internet as glitches when they aren’t glitches at all.

    In response to your discussion questions, for question two I believe that Google is so popular because it has been around for years and continues to be successful. It’s also fast and convenient, and in this generation we love things being fast and finding information almost instantly. We also have small computers at our fingertips constantly, and whenever I have a question or want to look something up, I instantly go to Google because it’s there and accessible.
    Other websites I can think of that could portray groups of people in a certain way are sites such as Twitter, Facebook, and Instagram. I feel like many of these social media apps filter people’s content and create a separation between people’s interests and ideas, even in regards to one’s political stance or one’s views toward women, people of color, and the LGBTQ community. These social media apps are spaces for people to project and constantly share their ideas, and a lot of the information being shared is negative.

  4. Hi Adrienne,

    I liked how you brought in an outside source, the Cambridge Analytica documentary, and took a great stance. Right at the beginning of Chapter One, what stood out to me was, “[T]he Google Search autosuggestions featured a range of sexist ideas such as the following… Women should not: have rights, vote, work, box…” (15). As stated in previous readings, the internet was made predominantly by and for cisgender, heterosexual, white males, and some of these individuals enforce the racism used in the search engine. Many people, like myself, do not tolerate the ignorance embedded in algorithms that kindle racism.

    To combine questions one and three, I believe using social media as a platform to advocate for education and change would be an empowering move. There are currently various social justice groups all over social media advocating for minority groups and setting up IRL rallies to seek justice for current social injustices. Unfortunately, the same can be said about groups of people who are racist, xenophobic, homophobic, etc.; they also organize rallies or meetings to enforce their hateful beliefs.

  5. Really insightful thoughts! Everything you discussed here was really important and relevant to discussion and to the text. I would agree that what we face today with these algorithms is a huge problem. What seems to be the biggest problem is the lack of blame these companies are willing to take in terms of their racism and discrimination.
    Multiple times throughout the reading, Noble talks about Google and how the company has essentially admitted its algorithms are racist but claimed there is nothing it can do to stop it, and that the search results that come up for certain phrases “do not reflect the views of the company.”
    I believe we should be holding these companies accountable. If their algorithms or search engines are racist, they should be working tirelessly to fix them, not just pushing out every disclaimer that they can. There is no logical reason that people in marginalized groups should have to experience such discrimination when they are simply trying to look something up. We should be calling these things out when we see them, whether through online protests, social media, or in-person protests. We should not let this become the norm and add it to the list of discriminations we can expect on a day-to-day basis.

  6. Hi Adrienne,
    I really enjoyed reading your blog post. I thought talking about the Facebook issue was a good idea; it gave us an outside example, which was helpful. It is crazy to think, though, that something like that actually happened: targeting specific groups to persuade people’s thinking. However, when you think about it, it happens all of the time. Take our news channels, for example; they may not be taking our personal information all of the time, but what they show reflects their beliefs. There never seems to be common ground between both perspectives.
    As for Google, I feel like its popularity will never decrease (not anytime soon, at least). Even if more people understood the discrimination that lies within Google, they would still continue to use it. I do not know about you, but when I need a definition, articles, addresses, phone numbers, etc., I always turn to Google first. And why? Because that is what I, along with everyone else, have been “trained” to do. Maybe that is where educators can come in and help. They could use it less in classrooms or teach students ways to notice and look for good, reliable, well-rounded sources.

  7. Adrienne,
    This was a really insightful post that helped me understand the reading in a new light. Similar to you, this quote stuck out to me: “… some of the very people who are developing search algorithms and architecture are willing to promote sexist and racist attitudes openly at work and beyond, while we are supposed to believe that these same employees are developing ‘neutral’ or ‘objective’ decision-making tools. Human beings are developing the digital platforms we use…” (2). Often, the people in these jobs are older white males, which furthers this issue. Women are often deterred from fields such as programming and IT, which is very upsetting because they could do such great things. I also began to think about why we always turn to Google for answers and how the answers and links are ranked; I wonder how that algorithm works. As a future educator, I hope to teach my students ways to look for the most accurate, unbiased information possible by showing them multiple sources and different ways to uncover answers. Sadly, I do not think Google will ever lose its popularity, but I hope to teach my students how to think critically and find useful and appropriate information.

  8. Hi Adrienne!

    Great blog post! I absolutely love and completely agree with your concluding statement: “we are never ‘safe’ online, and our search history and information are constantly being monitored and evaluated.” The government is always watching us through the cyber world. In the book, Noble states, “we need to imagine new possibilities in the area of information access and knowledge generation, particularly as headlines about ‘racist algorithms’ continue to surface in the media with limited discussion and analysis beyond the superficial” (9). In order to help this issue in any way, we have to communicate what the issue is and do more research than what is simply in front of us. To answer your first question regarding education on this issue, I definitely think it is a topic that needs to be addressed more, not only with college-level students but with younger students as well. It should also be discussed at jobs. Almost everyone uses the internet, from my five-year-old sister who uses it for YouTube on her iPad to my 75-year-old grandpa who reads articles on his phone. The point is that if more teachers, professors, bosses, parents, and other influential people in our lives discuss this issue in depth, then slowly we can begin to fix it.
