The radicalization of Dylann Roof began online and ended in a massacre.
Before killing nine African Americans at Emanuel African Methodist Episcopal Church in Charleston, South Carolina, Roof learned to hate in a classroom of his own making. A classroom entirely online, its climate angry, its claims inaccurate at best, inflammatory at worst.
His final thesis: a manifesto casting white nationalism as a martyr’s cause. Roof believed the white race was in danger and complained there was “no one doing anything but talking on the internet.” In his mind, the logical next step was to take the culture war offline and turn hateful words into one of modern history’s most extreme acts of hate.
But the initial step that led Roof to murder nine people was not extreme: He took a curiosity and turned it into a Google search.
Roof’s curiosity was first piqued by the trial of Trayvon Martin’s killer, George Zimmerman. He searched for the case he kept hearing about on the news and, after reading a Wikipedia article, determined that Zimmerman was “in the right” to see Martin as a threat.
Roof then typed “black on White crime” into the search engine, hit enter and fell into a wormhole. Top results sent him to the website for the Council of Conservative Citizens, which offered page after page featuring what Roof referred to as “brutal black on White murders.”
Google presented Roof with well-packaged propaganda—misinformation published by a group with a respectable-sounding name and a history of racist messaging, a group that once referred to black people as a “retrograde species of humanity.”
In his manifesto, Roof claimed he had “never been the same since that day.”
From that point on, he immersed himself in white supremacist websites, as both reader and participant, honing a philosophy far removed from his upbringing—one that would inform his manifesto and fuel a mass murder.
How did this happen? A dangerous coupling of digital media trends held Roof’s hand as he walked the path of radicalization. The first guide: Google’s search algorithm, which fails to find the middle ground between protecting free speech and protecting its users from misinformation.
Google remains vulnerable to sites that peddle propaganda, especially those with heavy web traffic that use tricks of the search-engine optimization (SEO) trade. Google has claimed its autocomplete function filters out offensive terms—and asserted its updated search algorithm “will help surface more high quality, credible content” and bump down sites with hate speech and disinformation.
But Google’s algorithm is deaf to “dog whistles,” or the coded language of white nationalist messaging and pseudoscientific racism. As a result, it fails to stem the flow of disinformation to the top of its search results. Since Google changed its algorithm and autocomplete function, NPR found searches for “black on white crime” continued to call up “multiple white supremacist websites.”
This gateway to hate, provided inadvertently by Google, becomes more of a threat when coupled with a second phenomenon that can lead young people like Dylann Roof toward radicalization: the filter bubble that refuses to burst.
In short, search engines like Google and social media sites like Facebook collect a user’s personal information and web history. With that data, online agency becomes a façade; behind the curtain, a formula controls what users see and steers them toward content that confirms their likes, dislikes and biases. Practically, this helps advertisers. Consequently, it replaces the open internet of diverse perspectives with an open door to polarization and radicalization.
“Each link, every time you click on something, it’s basically telling the algorithm that this is what you believe in,” says James Hawdon, director of Virginia Tech’s Center for Peace Studies and Violence Prevention.
“The next thing you know … you just get led into this rabbit hole of increasingly extreme ideas.”
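How such a feedback loop narrows what a person sees can be shown with a toy example. The sketch below, written in Python purely for illustration (the class name, topics and weighting are invented and do not reflect any platform’s actual ranking code), treats every click as a vote that boosts similar content in the next ranking; after a handful of identical clicks, the “feed” surfaces little else.

```python
# Toy illustration of a personalization feedback loop.
# Hypothetical code for illustration only, not any real platform's system.

class ToyFeed:
    """A tiny 'feed' that ranks topics by how often a user has clicked them."""

    def __init__(self, topics):
        # Every topic starts with the same weight: a neutral, diverse feed.
        self.weights = {topic: 1.0 for topic in topics}

    def recommend(self, k=3):
        # Surface the k heaviest topics first.
        return sorted(self.weights, key=self.weights.get, reverse=True)[:k]

    def click(self, topic):
        # Each click is read as a signal of interest and boosts that topic,
        # so the next ranking skews further toward what was just clicked.
        self.weights[topic] *= 1.5


feed = ToyFeed(["local news", "sports", "fringe politics", "cooking", "science"])
print(feed.recommend())   # a mixed feed before any clicks

for _ in range(10):       # a user who keeps clicking the same kind of content
    feed.click("fringe politics")

print(feed.recommend())   # the clicked topic now tops every ranking: a bubble
```

Real recommendation systems are vastly more complex than this sketch, but the dynamic Hawdon describes is the same: signals of engagement feed back into what gets shown next, and what gets shown next invites more of the same engagement.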
When Roof said in his manifesto that he “researched deeper,” we now know he ended up on sites like Stormfront and The Daily Stormer, known peddlers of false stories and white supremacist ideologies.
Beyond filter bubbles threatening the exchange of diverse discourse, that rabbit hole threatens to inspire acts of violence in those who dive too deep into divisive ideologies—and declare enemies.
Dylann Roof isn’t alone. On both edges of the political spectrum, online radicalization has led to acts of violence, while the normalization of hateful, misinformed rhetoric has led to unsafe and uncivil climates in schools and communities.
Long before Sean Christopher Urbanski killed Richard Collins III in May, he belonged to a Facebook group called Alt-Reich Nation that spread one-sided bigotry, misinformation and memes beneath the guise of satire. Before James Harris Jackson traveled to New York intending to kill several black men, he curated a YouTube subscription list full of alt-right figures who echoed white-nationalist myths.
And before James Hodgkinson decided to target U.S. House Majority Whip Steve Scalise (R-Louisiana) and others at a baseball practice in Washington, DC, the content he shared on his Facebook page reflected an increasingly extreme disdain for Republicans.
Ultimately, all of these individuals are, and must be held, personally responsible for their own actions. However, without the tools and knowledge to discern fact from fiction and civil discourse from hateful rhetoric, many people, particularly the young, remain vulnerable to algorithms, filter bubbles and other unseen forces pushing them toward a dangerously divisive digital existence.
In Roof’s rambling manifesto, a chilling sentence attempts to justify the violence to come.
“Someone has to have the bravery to take it to the real world,” he wrote. “I guess that has to be me.”
For more on the miseducation of Dylann Roof, read Rachel Kaadzi Ghansah’s comprehensive profile of Roof and his transformation from teenager to terrorist.