Dylann Roof murdered nine people in a church basement in Charleston in 2015.
He confessed to the massacre shortly after he was arrested. He didn't testify at trial, and no witnesses were called on his behalf before he was convicted of federal hate crimes.
The most emphatic statements on Roof's behalf came from defense attorney David Bruck. For weeks, the prosecution had presented evidence that Roof is a white supremacist whose violent racism drove him to kill black people. Bruck asked the jury to consider how the 22-year-old came to believe the things he did.
" 'There is hatred all right, and certainly racism, but it goes a lot further than that,' [Bruck] said.
" 'Every bit of motivation came from things he saw on the internet. That's it. ... 'He is simply regurgitating, in whole paragraphs, slogans and facts — bits and pieces of facts that he downloaded from the internet directly into his brain.' "
Bruck was referring to Roof's assertion in his confession and in a manifesto that a Google search shaped his beliefs.
So when Roof asked Google for information about race, what did the search engine show him?
The Internet and Trayvon Martin
"The event that truly awakened me was the Trayvon Martin case," Roof wrote in the racist manifesto he published online, a cached version of which was saved to Internet archive sites.
Roof was 17 years old at the time, the same age Trayvon Martin was when neighborhood watch volunteer George Zimmerman shot and killed the unarmed black teenager in 2012.
"I kept hearing and seeing [Martin's] name," Roof wrote, "and eventually I decided to look him up." Roof wrote that he "read the Wikipedia article" about the shooting and came to the conclusion that Zimmerman was not at fault.
"But," he continued, "more importantly this prompted me to type in the words 'black on White crime' into Google, and I have never been the same since that day."
In a videotaped interview with FBI agents after he was arrested in June 2015, Roof told a similar story. He said that after hearing about Martin's death he had "decided to look his name up. Type him into Google, you know what I'm saying?"
Roof told investigators he had read the Wikipedia article for Martin, and then, "for some reason after I read that, I," he paused before continuing, "I typed in — for some reason it made me type in the words black on white crime."
"And that was it," Roof said. "Ever since then ..." he trailed off and didn't finish the sentence.
It is impossible to know what Roof saw when he typed those words into Google. The search engine does not make the details of its search algorithm public, and even if the exact date and location of Roof's initial search were known (court documents suggest only that it was around 2013), there is no public archive of past search results.
"Even the Wayback Machine, which is maintained by the Internet Archive, does not preserve search rankings," explains Robert Epstein, a psychologist who studies how people interact with search engines and who has published multiple studies about Google's algorithm.
"You can find old versions of Web pages," he says, but "all this past stuff — search suggestions, search results — you cannot get to."
Epstein says the closest approximation of what Roof saw on Google begins with the search engine's "autocomplete" feature.
If Roof's statements are taken at face value, he went to Google.com and typed his search term one letter at a time.
But he didn't necessarily have to type all the letters in the search term, "b-l-a-c-k o-n w-h-i-t-e c-r-i-m-e." NPR googled that phrase in December 2016 and January 2017 and found that the letters "b-l-a-c-k o-n" elicited this top autocomplete suggestion: "black on white crime."
Users need only press Enter to complete the search.
Adding one letter to make it "b-l-a-c-k o-n w," the top autocomplete suggestion remained the same, and the second was "black on white violence." The third and fourth were "black on white crime statistics" and "black on white racism."
The top autocomplete results for "w-h-i-t-e o-n w" were "white on white crime," "white on white," "white on white acid" and "white on white kitchen."
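Spot checks like NPR's can be approximated programmatically. The sketch below uses Google's unofficial, undocumented suggest endpoint (suggestqueries.google.com), which returns the same kind of completions the search box displays. This endpoint is not a supported API, and suggestions vary with time, location and language, so output will not match NPR's 2016-2017 results.

```python
# Minimal sketch: fetch autocomplete suggestions for partial queries from
# Google's unofficial suggest endpoint. Behavior is not guaranteed by
# Google and changes over time. Assumes the third-party requests library.
import requests

def autocomplete(prefix: str) -> list[str]:
    """Return suggestion strings for a partial query."""
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": prefix},
        timeout=10,
    )
    resp.raise_for_status()
    # Response shape: [query, [suggestion, suggestion, ...]]
    return resp.json()[1]

for prefix in ("black on", "black on w", "white on w"):
    print(prefix, "->", autocomplete(prefix)[:4])
```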
A spokesperson for Google told NPR in an email that "autocomplete predictions are produced based on a number of factors including the popularity and freshness of search terms."
"We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we don't always get it right," the spokesperson continued and pointed to a June 2016 blog post by the search engine's product management director, Tamar Yehoshua, saying Google had changed its algorithm to "avoid completing a search for a person's name with terms that are offensive or disparaging."
Search results as "propaganda" tools
When Roof hit Enter for the search term "black on white crime," the search engine returned a list of websites. "The first website I came to was the Council of Conservative Citizens," Roof wrote.
The Council of Conservative Citizens is a white supremacist organization, according to the Anti-Defamation League, which tracks hate groups. In the aftermath of the Trayvon Martin shooting, the ADL reported that multiple hate groups used inaccurate Internet posts about crimes against white people as a "propaganda tool" for white supremacy.
"On May 11, 2012, the white supremacist Council of Conservative Citizens (CofCC) posted an article on its Web site that claimed that a New Jersey newspaper had 'censored' the race of the alleged assailants in what it called 'savage mob attacks' on five white concertgoers in New Jersey. The CofCC dismissed both the newspaper and police accounts portraying the incident as an 'isolated event.' According to the CofCC, 'almost as alarming as the epidemic violent crime being perpetrated against white people is the blatant media censorship and black-out of the racial element of the incidents.'"
It's impossible to know whether the Council of Conservative Citizens page that Roof referenced appeared at the top of the Google search results. Results for a given search term and the order in which they're presented change over time.
Google searches for "black on white crime" conducted by NPR in December 2016 and January 2017 found the top results included multiple white supremacist websites, but didn't include anything from the Council of Conservative Citizens.
The top five results included the website New Nation News, which the Anti-Defamation League says "promotes the belief that 'voluntary racial segregation in all prisons is a constitutional right,' " and American Renaissance, a magazine put out by what the Southern Poverty Law Center calls "a self-styled think tank that promotes pseudo-scientific studies and research that purport to show the inferiority of blacks to whites."
The first page of results also included an article published on the website The Root titled, "Open Letter to White People Who Are Obsessed With Black-on-Black Crime," and a link to a forum about "Black on White Crime" on the neo-Nazi site The Daily Stormer.
"The top two positions in the search results matter the most," says Epstein, who has studied click rates for search results. "The top two draw 50 percent of clicks, and the numbers go down from there, so what's at the top is extremely, extremely powerful."
In NPR's test search, New Nation News was the second option, following a post from the conservative website dailywire.com.
"People equate the position of search results with how true they are," Epstein explains. "What's higher is better. What's higher is truer."
Search engines, hate speech and responsibility
In December, an extensive article in The Guardian publicized another instance of potentially inflammatory language in Google's search suggestions.
The newspaper pointed out that the search "a-r-e j-e-w-s" suggested, among others, the autocomplete phrase "are jews evil." (The same final word was suggested for the search "a-r-e w-o-m-e-n." The letters "a-r-e m-u-s-l-i-m-s" suggested the phrase "are muslims bad.")
Google said it "took action within hours" and changed its autocomplete results. The search engine company "did not comment on its decision to alter some but not all those raised in the article," the Guardian reported.
NPR did the same search a few weeks later and got the suggested phrases "are jews white," "are jews a race" and "are jews christians." There were no suggested phrases for "a-r-e w-o-m-e-n," and "a-r-e m-u-s-l-i-m-s" suggested only "are muslims a race."
Some experts who study search engines and their implications for democratic society have suggested there is a disconnect between the stated mission of a free and open Internet and the reality of search algorithms, which come with all the messy biases of anything designed by humans.
Internet law expert Frank Pasquale is among those who have advocated for search result algorithms in the U.S. to be regulated by the government.
"Though [dominant search engines] advocate net neutrality, they have been much less quick to recognize the threat to openness and fair play their own practices may pose," Pasquale wrote in a 2008 paper.
U.S. courts have repeatedly dismissed challenges to search engines' editorial control over their search results, including a case in which the court upheld Google's right to present what the plaintiff argued were "biased search results that favor its own paid advertisers and Google-owned companies."
Courts have repeatedly cited the First Amendment, treating search engine companies as conduits for free speech on the Internet.
Asked what, if anything, Google sees as its responsibility concerning potential hate speech in search terms and results, the spokesperson told NPR:
"The views expressed by hate sites are not in any way endorsed by Google, but search is a reflection of the content and information that is available on the Internet. We do not remove content from our search results, except in very limited cases such as illegal content, malware and violations of our webmaster guidelines, including spam and deception."
The company did not comment on the potential role of its Internet search in the specific case of Dylann Roof.