Lancaster University scholars Paul Baker and Amanda Potts claim that Google Instant produces both racially and sexually stereotypical results based on tests that were published in the Dec. 17, 2013, edition of the journal Critical Discourse Studies.
The purpose of Google Instant is to save time typing and searching the vast volume of information available on the internet. Google claims the auto-complete algorithms used in Google Instant save an average of 2.5 seconds per search.
The researchers tested Google Instant with open-ended question prefixes specific to gay people, black people, and men. The questions took the form “Why do gay people…?”, “Why do black people…?”, and “Why do men…?”.
The automated choices from Google Instant included “Why do gay men lisp”, “Why do black people like fried chicken”, and similarly derogatory inferences about men, Muslims, Jews, Christians, Asians, and lesbians.
The researchers propose that the automated responses produced by Google Instant are not a product of the programming itself but of the huge volume of derogatory, openly prejudiced, and hate-filled content about these groups that exists on the internet.
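To see how frequency-driven suggestions can simply echo what people type, here is a minimal toy sketch (not Google's actual algorithm, whose details are proprietary): completions for a prefix are ranked purely by how often they appear in a hypothetical query log, so the most common phrasing, however prejudiced, surfaces first.

```python
from collections import Counter

# Hypothetical query log; in reality this would be billions of real searches.
query_log = [
    "why do cats purr",
    "why do cats purr",
    "why do cats sleep so much",
    "why do dogs bark",
    "why do cats purr",
]

def suggest(prefix, log, n=3):
    """Return the n most frequent logged queries starting with prefix."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(n)]

# The top suggestion is whatever users typed most often -- the algorithm
# has no notion of whether that phrasing is fair or stereotypical.
print(suggest("why do cats", query_log))
# → ['why do cats purr', 'why do cats sleep so much']
```

Under this kind of popularity ranking, the system faithfully mirrors its users: if a derogatory question is asked often enough, it becomes a suggestion.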
Google Instant is prejudiced because people are prejudiced.
The researchers pose the question “What will Google do?” about this problem, when in reality the question should be “What will we as people do about it?”