Addressing Negative Biases in Search Engine Algorithms

Dr. Safiya Noble delivered Purdue Libraries’ inaugural Critical Data Studies Distinguished Lecture Oct. 3 at Purdue University. The event was part of Purdue’s Sesquicentennial Ideas Festival and was related to the festival theme, “Giant Leaps in Artificial Intelligence, Algorithms, and Automation: Balancing Humanity and Technology.” (Photo by Rebecca Wilcox, Purdue Marketing and Media)

Safiya Noble first encountered racism in search nine years ago. In her 2018 book, Algorithms of Oppression: How Search Engines Reinforce Racism, and in a piece she wrote for Time this past spring, she begins by sharing her story of being “stunned” at the Google search results returned for the phrase “black girls” in 2009.

Dr. Noble, an assistant professor at the University of Southern California’s Annenberg School for Communication and Journalism, also opened the inaugural Purdue Libraries Critical Data Studies Distinguished Lecture (Oct. 3, Fowler Hall) with the story of this thought-provoking experiment. Noble, co-founder of the Information Ethics & Equity Institute and a partner in Stratelligence, recounted what happened nine years ago when she Googled the phrase “black girls,” and she shared her shock upon seeing the returned search results. Her presentation slides included a screenshot of the Google search results (you can see the results in the Time article). In that piece, she lists the results, explaining, “[t]hese are the details of what a search for ‘black girls’ would yield for many years, despite that the words ‘porn,’ ‘pornography,’ or ‘sex’ were not included in the search box.”

This year, the Purdue Libraries Critical Data Studies Distinguished Lecture is part of Purdue’s Sesquicentennial Ideas Festival theme, “Giant Leaps in Artificial Intelligence, Algorithms, and Automation: Balancing Humanity and Technology.” Noble, whose book “Algorithms of Oppression” is described as a “revealing look at how negative biases against women of color are embedded in search engine results and algorithms,” has also written for Wired and delivered the closing plenary lecture this summer at the 2018 AUPresses Annual Meeting in San Francisco. In Inside Higher Ed earlier this year, Colleen Flaherty notes that Noble’s book has generated buzz among information science, machine learning, and technology scholars, as well as among sociologists.

In her Oct. 3 lecture at Purdue, “Intellectual Freedom and Racial Inequality as Addressed in ‘Algorithms of Oppression,’” she compellingly demonstrated why people are talking about her work and why organizations are seeking her out to share her research with those who work in education, technology, and publishing.

More importantly, though, she explained why it is imperative for people to understand that Google and other search companies are for-profit, commercial entities responsible for these algorithms, and she argued that we need to build search platforms and repositories that belong to the public. Her last slide captured that point, along with other action items we can all pursue or support to move forward:

  • Make scholarly research visible to the public more quickly and broadly;
  • Resist colorblind/racist/sexist practices;
  • Re-learn, re-train, re-imagine new possibilities in our field(s); and
  • Never give up.

Learn more about Dr. Noble and her work at https://safiyaunoble.com/.

Sponsors of Dr. Noble’s lecture include the Purdue Libraries Seminar Committee, American Studies, the Diversity Resource Office, the Division of Diversity and Inclusion, the Purdue Policy Research Institute, the 150th AI Committee, the Department of Anthropology, the Honors College, the Center for Science of Information (an NSF Science and Technology Center), the Critical Data Studies cohort of The Data Mine Learning Community, the Andrew W. Mellon Foundation, and Purdue Fort Wayne.

About Critical Data Studies at Purdue

Critical Data Studies, or CDS, is an emerging interdisciplinary field that considers and addresses the ethical, legal, socio-cultural, epistemological, and political aspects of data science, big data, algorithms, and digital infrastructure. In addition to the CDS Lecture Series, faculty and staff in the Libraries, the Honors College, and the Department of Anthropology are collaborating in the Critical Data Studies Cohort of the Data Mine Learning Community, one of Purdue’s student living and learning communities. For more information about the lecture series or about critical data studies at Purdue, contact Kendall Roark, assistant professor, Purdue Libraries, at roark6@purdue.edu.