
The Three Pillars of Critical Information Literacy - Prof. Don Simmons Jr.: Algorithmic Bias

Algorithmic Bias

ACRL Framework: Research as Inquiry, Searching as Strategic Exploration

Student Learning Objectives

  • Students will be able to define algorithms and algorithmic bias
  • Students will examine examples of algorithmic bias through Google search

Terms

Algorithms – a set of instructions that solve a problem or perform a task using the data and information they are given.

Algorithmic bias – a systematic error in a computer system that creates discriminatory outcomes. 

Bias – prejudice against, or favoritism toward, one thing, person, or group of people compared to another.

Learning Materials

ADL. (2019). What is Algorithmic Bias? ADL. https://www.adl.org/resources/lesson-plan/what-algorithmic-bias

Ahmed, Z., Vidgen, B., & Hale, S. A. (2022). Tackling racial bias in automated online hate detection: Towards fair and accurate detection of hateful users with geometric deep learning. EPJ Data Science, 11(8). https://epjdatascience.springeropen.com/articles/10.1140/epjds/s13688-022-00319-9#Abs1

Coldewey, D. (2019). Racial bias observed in hate speech detection algorithm from Google. TechCrunch. https://techcrunch.com/2019/08/14/racial-bias-observed-in-hate-speech-detection-algorithm-from-google/

Heilweil, R. (2020). Why algorithms can be racist and sexist. Vox. https://www.vox.com/recode/2020/2/18/21121286/algorithms-bias-discrimination-facial-recognition-transparency

Lee, N., Resnick, P., & Barton, G. (2019). Algorithmic bias detection and mitigation: Best practices and policies to reduce consumer harms. Brookings. https://www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/

Noble, S. (2018). Algorithms of Oppression: Safiya Umoja Noble. YouTube. https://www.youtube.com/watch?v=6KLTpoTpkXo 

Noble, S. (2016). Safiya Noble | Challenging the Algorithms of Oppression. YouTube. https://www.youtube.com/watch?v=iRVZozEEWlE

Instructional Procedure

1. Prior to presenting Google research strategies, introduce the function of algorithms and explain why they matter.

Google is one of the most popular search engines worldwide. Like every search engine, Google relies on an algorithm to identify the articles and resources that best match your chosen keywords. An algorithm is a set of instructions that solves a problem or performs a task using the data and information it is given.
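To make that idea concrete, you can optionally show the short Python sketch below. It is purely illustrative, not how Google actually works: a toy "search algorithm" that ranks pages by how many of the searcher's keywords appear in each page. The page titles and text are invented for the example.

# Purely illustrative: rank pages by how many of the searcher's keywords
# appear in each page's text. Real search engines are far more complex,
# but the core idea is the same: a fixed set of instructions applied to
# whatever data the system is given.
def rank_results(query, pages):
    """Return page titles ordered by keyword overlap with the query."""
    keywords = set(query.lower().split())
    scored = []
    for title, text in pages.items():
        matches = len(keywords & set(text.lower().split()))
        scored.append((matches, title))
    # Highest-scoring pages come first, like page one of a results list.
    return [title for matches, title in sorted(scored, reverse=True)]

pages = {
    "Library database guide": "peer reviewed scholarly articles and research databases",
    "Shopping blog": "best deals and shopping tips with product articles",
}
print(rank_results("scholarly research articles", pages))
# ['Library database guide', 'Shopping blog']

Students do not need to read code to follow the lesson; the sketch simply shows that "the algorithm" is nothing more than rules someone chose and wrote down.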

2. Ask students to share an example of an algorithm

Algorithms are used on other technological platforms as well. Can anyone share an example of an algorithm we experience every day?

3. If students don’t share examples, provide the following 

As users, we experience the work of algorithms all the time: the ads and shared articles that pop up in our Facebook feeds, follower suggestions on Twitter and Instagram, suggested playlists on Apple Music and Spotify, and streaming recommendations on Netflix. The purpose of these algorithms is to learn people's interests and surface content that matches them. However, like any human creation, algorithmic systems can be biased in how they are designed, developed, and used. Unfortunately, those biases can interfere with data collection and research, because they noticeably skew the results of certain searches.

4. Ask students what comes to mind when they think of the term bias. 

Bias is prejudice against, or favoritism toward, one thing, person, or group compared to another. Algorithmic bias, in this case, is a systematic error in a computer system that creates discriminatory outcomes, such as privileging one group over another based on aspects of perceived identity. These errors stem from flaws in the algorithm's design, including how the data is coded and labeled and a lack of sensitivity in how certain information is identified, both of which contribute to biased results.
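To illustrate how a design choice can produce a systematic skew rather than a one-off mistake, you can optionally walk through the hypothetical sketch below. The click counts are invented for illustration; it shows a ranking rule that orders results by historical clicks, so whatever skew already exists in that history is reproduced every time the rule runs. This pairs naturally with the McQuate (2022) article on CEO image-search results listed under step 6.

# Hypothetical example: a ranking rule that orders results by historical
# click counts. If past users clicked one kind of result far more often
# (for any reason, including earlier bias), the rule keeps pushing that
# result to the top -- a systematic, repeatable skew, not a one-off error.
past_clicks = {  # invented numbers, for illustration only
    "CEO portrait (man)": 9500,
    "CEO portrait (woman)": 1200,
}

def rank_by_clicks(results):
    """Order results by how often they were clicked in the past."""
    return sorted(results, key=lambda r: past_clicks[r], reverse=True)

print(rank_by_clicks(list(past_clicks)))
# ['CEO portrait (man)', 'CEO portrait (woman)'] -- the skewed history,
# not current relevance, decides what appears first.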

5. Emphasize the function of Google – a search engine funded primarily by advertisements

More than 85% of Google's income is generated from ads. Because Google is, at its core, an advertising platform designed to promote content, products, and services, the first page of your search results is often dominated by advertising-related and commercially optimized content rather than the articles or online resources best suited to your research question. This inevitably leads to a biased Google search.
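If you want to demonstrate the mechanism rather than simply assert it, the hypothetical sketch below shows how giving paid placements a ranking boost can push sponsored content above a more relevant source. Google's actual ranking and ad weighting are not public; every score and weight here is invented for illustration.

# Hypothetical example: a re-ranking step that adds a score boost for paid
# placements. The relevance scores and the boost are invented; the point is
# only that commercial incentives can outweigh relevance in what appears first.
results = [
    {"title": "Peer-reviewed study", "relevance": 0.90, "is_ad": False},
    {"title": "Sponsored product page", "relevance": 0.55, "is_ad": True},
]

AD_BOOST = 0.5  # invented weight given to paid placement

def final_score(result):
    return result["relevance"] + (AD_BOOST if result["is_ad"] else 0.0)

for result in sorted(results, key=final_score, reverse=True):
    print(result["title"])
# Sponsored product page
# Peer-reviewed study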

6. Share examples of Google bias. Below are articles highlighting various flaws in Google's algorithm (e.g., discrimination against women of color, image-search gender bias, popular articles favored over empirical sources).

Cadwalladr, C. (2016). How to bump Holocaust deniers off Google's top spot? Pay Google. The Guardian. https://www.theguardian.com/technology/2016/dec/17/holocaust-deniers-google-search-top-spot

Grant, N. (2021). Google Quietly Tweaks Image Searches for Racially Diverse Results. Bloomberg. https://www.bloomberg.com/news/articles/2021-10-19/google-quietly-tweaks-image-search-for-racially-diverse-results

Manjoo, F. (2018). Search Bias, Blind Spots And Google. New York Times, B1(L). https://link.gale.com/apps/doc/A552363885/OVIC?u=valh61524&sid=bookmark-OVIC&xid=9af76a11

McQuate, S. (2022). Google's 'CEO' image search gender bias hasn't really been fixed. University of Washington. https://www.washington.edu/news/2022/02/16/googles-ceo-image-search-gender-bias-hasnt-really-been-fixed/

Young, R. & Hagan, A. (2021). Search Engines Like Google Are Powered By Racist, Misogynist Algorithms, Says MacArthur Fellow. WBUR. https://www.wbur.org/hereandnow/2021/09/30/safiya-noble-internet-research

Lapowsky, I. (2018). Google Autocomplete Still Makes Vile Suggestions. Wired. https://www.wired.com/story/google-autocomplete-vile-suggestions/

Cohn, J. (2019). Google’s algorithms discriminate against women and people of colour. The Conversation. https://theconversation.com/googles-algorithms-discriminate-against-women-and-people-of-colour-112516

Noble, S. (2018). Google Has a Striking History of Bias Against Black Girls. Time. https://time.com/5209144/google-search-engine-algorithm-bias-racism/

Algorithmic Bias on Social Media

Along with covering Google search bias, you can share brief examples of algorithmic bias on social media (e.g., Twitter's hate-speech detection bias against Black users, or social media platforms censoring social justice activists).

Ahmed, Z., Vidgen, B., & Hale, S. A. (2022). Tackling racial bias in automated online hate detection: Towards fair and accurate detection of hateful users with geometric deep learning. EPJ Data Science, 11(8). https://epjdatascience.springeropen.com/articles/10.1140/epjds/s13688-022-00319-9

Ghaffary, S. (2019). The algorithms that detect hate speech online are biased against black people. Vox. https://www.vox.com/recode/2019/8/15/20806384/social-media-hate-speech-bias-black-african-american-facebook-twitter

Resources

The Most Likely Machine

A winner of the 2021 Innovation by Design Awards, The Most Likely Machine is a browser-based program designed to help students understand the complexities of algorithmic bias. Collaborate with faculty and assign the activity to students following your information literacy session; ask students to post their reflections on the discussion board or share them in class. To learn more about The Most Likely Machine, read the article below:

Rawlins, A. (2021). Algorithms are biased. This tool shows kids just how dangerous they can be. Fast Company. https://www.fastcompany.com/90667009/the-most-likely-machine-innovation-by-design-2021
