{"id":211,"date":"2018-10-10T18:19:14","date_gmt":"2018-10-10T22:19:14","guid":{"rendered":"http:\/\/blogs.lib.purdue.edu\/volume\/?p=211"},"modified":"2018-10-11T12:26:48","modified_gmt":"2018-10-11T16:26:48","slug":"cds-lecture-ideasfest-recap1018","status":"publish","type":"post","link":"https:\/\/blogs.lib.purdue.edu\/volume\/2018\/10\/10\/cds-lecture-ideasfest-recap1018\/","title":{"rendered":"Addressing Negative Biases in Search Engine Algorithms"},"content":{"rendered":"<div id=\"attachment_213\" style=\"width: 224px\" class=\"wp-caption alignright\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-213\" class=\"size-medium wp-image-213\" src=\"http:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/Noble-CDS-Lecture-214x300.jpg\" alt=\"\" width=\"214\" height=\"300\" srcset=\"https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/Noble-CDS-Lecture-214x300.jpg 214w, https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/Noble-CDS-Lecture-768x1075.jpg 768w, https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/Noble-CDS-Lecture-732x1024.jpg 732w, https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/Noble-CDS-Lecture-549x768.jpg 549w, https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/Noble-CDS-Lecture-772x1080.jpg 772w, https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/Noble-CDS-Lecture-1286x1800.jpg 1286w, https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/Noble-CDS-Lecture-624x873.jpg 624w, https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/Noble-CDS-Lecture.jpg 1380w\" sizes=\"auto, (max-width: 214px) 100vw, 214px\" \/><p id=\"caption-attachment-213\" class=\"wp-caption-text\">Dr. Safiya Noble delivered Purdue Libraries&#8217; inaugural Critical Data Studies Distinguished Lecture Oct. 3 at Purdue University. 
The event was part of Purdue&#8217;s Sesquicentennial Ideas Festival and was related to the festival theme, &#8220;Giant Leaps in Artificial Intelligence, Algorithms, and Automation: Balancing Humanity and Technology.&#8221; (Photo by Rebecca Wilcox, Purdue Marketing and Media)<\/p><\/div>\n<p>Safiya Noble first encountered racism in search nine years ago. In her 2018 book, <a href=\"https:\/\/nyupress.org\/books\/9781479837243\/\"><em>Algorithms of Oppression: How Search Engines Reinforce Racism<\/em><\/a>, and in a piece she wrote for <a href=\"http:\/\/time.com\/5209144\/google-search-engine-algorithm-bias-racism\/\"><em>Time<\/em><\/a> this past spring, she begins by sharing her story of being &#8220;stunned&#8221; by the Google search results returned for the phrase &#8220;black girls&#8221; in 2009.<\/p>\n<p>Dr. Noble, an assistant professor at the University of Southern California Annenberg School of Communication, also began the inaugural Purdue Libraries Critical Data Studies Distinguished Lecture (Oct. 3, Fowler Hall) with her story about this thought-provoking experiment. Noble, who is the co-founder of the Information Ethics &amp; Equity Institute and a partner in Stratelligence, started out by explaining what happened nine years ago when she &#8220;Googled&#8221; the phrase &#8220;black girls,&#8221; and described her shock at the results. In her presentation slides, she incorporated a screenshot of the Google search results (you can see the results in the <a href=\"http:\/\/time.com\/5209144\/google-search-engine-algorithm-bias-racism\/\"><em>Time<\/em> article<\/a>).
In that piece, she lists the results, explaining, &#8220;[t]hese are the details of what a search for &#8216;black girls&#8217; would yield for many years, despite that the words &#8216;porn,&#8217; &#8216;pornography,&#8217; or &#8216;sex&#8217; were not included in the search box.&#8221;<\/p>\n<p><a href=\"https:\/\/takegiantleaps.com\/\"><img loading=\"lazy\" decoding=\"async\" class=\"alignleft wp-image-261 size-medium\" src=\"http:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/150GL_DATES_V_BG_RGB.2-300x154.png\" alt=\"150 Years of Giant Leaps\" width=\"300\" height=\"154\" srcset=\"https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/150GL_DATES_V_BG_RGB.2-300x154.png 300w, https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/150GL_DATES_V_BG_RGB.2-624x321.png 624w, https:\/\/blogs.lib.purdue.edu\/volume\/files\/2018\/10\/150GL_DATES_V_BG_RGB.2.png 635w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a>This year, the Purdue Libraries&#8217; Critical Data Studies Distinguished Lecture is part of Purdue&#8217;s <a href=\"https:\/\/takegiantleaps.com\/\">Sesquicentennial Ideas Festival<\/a> theme, &#8220;Giant Leaps in Artificial Intelligence, Algorithms, and Automation: Balancing Humanity and Technology.&#8221; Noble\u2014whose book &#8220;Algorithms of Oppression&#8221; is <a href=\"https:\/\/nyupress.org\/books\/9781479837243\/\">described as a &#8220;revealing look at how negative biases against women of color are embedded in search engine results and algorithms&#8221;<\/a>\u2014has also written for <em><a href=\"https:\/\/www.wired.com\/story\/social-inequality-will-not-be-solved-by-an-app\/\">Wired<\/a><\/em> and delivered the closing plenary session lecture this summer at the 2018 AUPresses Annual Meeting in San Francisco. In <em>Inside Higher Ed<\/em> earlier this year, <a href=\"https:\/\/www.insidehighered.com\/news\/2018\/02\/06\/scholar-sets-twitter-furor-critiquing-book-he-hasnt-read\">Colleen
Flaherty notes<\/a> Noble&#8217;s book has generated buzz among information science, machine learning, and technology scholars, as well as among sociologists.<\/p>\n<p>In her Oct. 3 lecture at Purdue, &#8220;Intellectual Freedom and Racial Inequality as Addressed in &#8216;Algorithms of Oppression,&#8217;&#8221; she compellingly demonstrated why people are talking about her work and why organizations are seeking her out to share her research with those who work in education, technology, and publishing.<\/p>\n<p>More importantly, though, she explained why it is imperative for people to understand that Google and other search companies are for-profit, commercial entities responsible for these algorithms, and she argued that we need to build search and other platforms and repositories that belong to the public. Her last slide captured that point, as well as three other action items that we can all take and\/or support to move forward, including:<\/p>\n<ul>\n<li>Make scholarly research visible to the public, faster, and broadly;<\/li>\n<li>Resist colorblind\/racist\/sexist practices;<\/li>\n<li>Re-learn, re-train, re-imagine new possibilities in our field(s); and<\/li>\n<li>Never give up.<\/li>\n<\/ul>\n<p>Learn more about Dr. Noble and her work at <a href=\"https:\/\/safiyaunoble.com\/\">https:\/\/safiyaunoble.com\/<\/a>.<\/p>\n<p>Sponsors of Dr. Noble&#8217;s lecture include the Purdue Libraries Seminar Committee, American Studies, the Diversity Resource Office, the Division of Diversity and Inclusion, Purdue Policy Research Institute, the 150th AI Committee, the Department of Anthropology, the Honors College, the Center for Science of Information\u2014NSF Science and Technology Center, the Critical Data Studies cohort of The Data Mine Learning Community, the Andrew W.
Mellon Foundation, and Purdue Fort Wayne.<\/p>\n<h1><strong>About Critical Data Studies at Purdue<\/strong><\/h1>\n<p>Critical Data Studies, or CDS, is an emerging interdisciplinary field that considers and addresses the ethical, legal, socio-cultural, epistemological, and political aspects of data science, big data, algorithms, and digital infrastructure. In addition to the CDS Lecture Series, faculty and staff in the Libraries, the Honors College, and the Department of Anthropology are collaborating in the Critical Data Studies Cohort of the Data Mine Learning Community, one of Purdue&#8217;s student living and learning communities. For more information about the lecture series or about critical data studies at Purdue, contact Kendall Roark, assistant professor, Purdue Libraries, at <a href=\"mailto:roark6@purdue.edu\">roark6@purdue.edu<\/a>.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Safiya Noble first encountered racism in search nine years ago. In her 2018 book, Algorithms of Oppression: How Search Engines Reinforce Racism, and in a piece she wrote for Time this past spring, she begins by sharing her story of being &#8220;stunned&#8221; by the Google search results returned for the phrase &#8220;black girls&#8221; in 2009.
[&hellip;]<\/p>\n","protected":false},"author":136,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[11598,11599],"tags":[],"class_list":["post-211","post","type-post","status-publish","format-standard","hentry","category-150-years-of-giant-leaps","category-critical-data-studies"],"jetpack_sharing_enabled":true,"jetpack_featured_media_url":"","_links":{"self":[{"href":"https:\/\/blogs.lib.purdue.edu\/volume\/wp-json\/wp\/v2\/posts\/211","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blogs.lib.purdue.edu\/volume\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.lib.purdue.edu\/volume\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.lib.purdue.edu\/volume\/wp-json\/wp\/v2\/users\/136"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.lib.purdue.edu\/volume\/wp-json\/wp\/v2\/comments?post=211"}],"version-history":[{"count":10,"href":"https:\/\/blogs.lib.purdue.edu\/volume\/wp-json\/wp\/v2\/posts\/211\/revisions"}],"predecessor-version":[{"id":260,"href":"https:\/\/blogs.lib.purdue.edu\/volume\/wp-json\/wp\/v2\/posts\/211\/revisions\/260"}],"wp:attachment":[{"href":"https:\/\/blogs.lib.purdue.edu\/volume\/wp-json\/wp\/v2\/media?parent=211"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.lib.purdue.edu\/volume\/wp-json\/wp\/v2\/categories?post=211"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.lib.purdue.edu\/volume\/wp-json\/wp\/v2\/tags?post=211"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}