
Hate crimes are soaring; researchers are building tools to curb the spread of extremism

They arrived in yellow rental trucks, unfurled their flags, and readied shields and smoke bombs. The hour was late, and the symbolism was unsettling: As the clock inched toward midnight on July 3, about 200 members of the white nationalist group Patriot Front marched through downtown Philadelphia, past Independence Hall and other historic landmarks, while chanting, “Take America back!”

If the demonstration was meant to be a show of strength for the organization, it ended meekly. After scuffling with a handful of counterprotesters, the Patriot Front members retreated into their Penske trucks and then were stopped by Philadelphia police on Delaware Avenue, where some marchers sat dejectedly, their heads bowed.

But the episode served a dual purpose. Social media has proven to be fertile ground for white supremacist and conspiracy-theory movements trying to attract new members. Patriot Front turned footage of its parade through the city into a hype video; on its website, its members likened themselves to Revolutionary War heroes, and insisted, “Americans must dictate America.”

A month before the Philadelphia demonstration, more than 300 researchers and scholars had volunteered to be part of a new effort to curb the spread of extremism: the Collaboratory Against Hate, a center created by the University of Pittsburgh and Carnegie Mellon University.

The project is well-timed. More than 8,000 hate crimes were reported in the United States in 2020, the highest total in more than a decade, according to the FBI. In Philadelphia, 63 people were reported as victims of hate crimes, a 320% increase from 2019, when there were 15 victims. Statewide, the number of victims doubled in 2020 to 110, but that’s likely a low estimate; of Pennsylvania’s 1,504 law enforcement agencies, just 734 supplied data to the FBI.

“We’re pushing back at something I hear a lot, which is ‘Well, people are always going to hate some people. Hate is human nature, and we’re stuck with it, and there’s always going to be hate groups,'” said Kathleen Blee, the Collaboratory’s co-director.

“But it’s not like everybody is racist, and we’re doomed to this. … There is a deeper, more entrenched, more destructive side of extremism and political conspiracy that is not part of human nature. It is deliberately constructed, like anti-Semitism is deliberately constructed in these groups. We just need to apply our tools to get past that. And make that impossible.”

‘Into the rabbit hole’

Collaboratory researchers are first digging into three areas — digital-content moderation, extremism in the military, and youth and extremism. The last, Blee said, is a newer trend she finds particularly worrisome.

White supremacists are using online video-game communities and streaming platforms to approach and recruit teenagers and middle schoolers. Some entreaties start with jokes, or with white supremacist phrases introduced to pique a child’s interest; others share links to material that lures young gamers deeper.

“That’s one of the most disturbing, and one of the least known, phenomena,” said Blee, who has studied white supremacy since the 1980s. “We’re just starting to get a handle on it. For some children, that’s probably an approach that pulls them into the rabbit hole.”

White supremacist narratives rely on a combination of elements to hook recruits: the offer of a collective identity, revelations about sinister conspiracies, and the promise of righting perceived grievances.

“Getting into that mind-set, and into that world, virtually or in real life, can be such a fundamentally self-altering experience that it really can be quite difficult to firmly pull yourself out,” Blee said.

Her concerns about hate groups recruiting minors were borne out by a 2021 survey conducted by the Anti-Defamation League, which found that 10% of gamers aged 13 to 17 had encountered white supremacist ideology while participating in online multiplayer games, and 60% had experienced some form of harassment.

“They hear people talk about the superiority of the white race, and the desire of a white homeland,” said Daniel Kelley, the associate director of the ADL’s Center for Technology and Society.

Kelley theorizes that large numbers of white supremacists aren’t flocking to online games to recruit new members; instead, savvy ones have just learned to exploit a realm that is inherently vulnerable.

In the wake of the Jan. 6 attack on the U.S. Capitol, Facebook and Twitter banned the accounts of former President Donald Trump, along with thousands of accounts affiliated with far-right extremists, white nationalists, and the conspiracy movement QAnon. And Facebook faced congressional scrutiny when a whistle-blower accused the company of allowing misinformation to spread.

But Kelley said there’s little oversight of online gaming ecosystems, or “gaming adjacent” platforms like Twitch, Steam, and Discord, with some companies flatly opposed to content moderation.

Instead, they’ve become environments where hate speech “can be normalized in a dangerous way.”

The majority of parents who responded to the ADL’s survey said they don’t set security controls on games their kids use, and most teens don’t tell their parents about uncomfortable exchanges they’ve had, which can include stalking, intimidation, and threats of harm.

Kelley argues it’s a mistake to draw a distinction between online and offline extremism.

“There’s a tendency to call offline life ‘real life,'” he said. “But every time there’s an interaction in a digital space, there’s a real person behind that screen.”

And even one person who espouses hatred and violent fantasies online can cause unimaginable harm.

A growing threat

During her decades of research into white supremacist movements, Blee detected a pattern. “It can appear that it’s strengthening and weakening, but often what it’s really doing is crossing the line of public visibility,” she said. “It sinks below the line, and rises above the line.”

In the 1980s, many hate groups found themselves marginalized. They grew bolder in the 1990s — national preparedness expos became a big-tent gathering spot for militia groups and conspiracy theorists, and in 1995, an antigovernment terrorist who described white nationalists as his “brother in arms” blew up a federal building in Oklahoma City, killing 168 people, including 19 children.

Extremist movements multiplied at a dizzying pace in recent years, and the COVID-19 pandemic and the Jan. 6 insurrection have only energized those who want to see the U.S. government overthrown. More than 700 participants in the Capitol attack have faced federal prosecution.

And in November, a federal jury ordered more than a dozen white nationalist leaders to pay $26 million in damages over the violence that erupted at a 2017 “Unite the Right” rally in Charlottesville, Va., which left one counterprotester dead. (Patriot Front, which marched in Philadelphia and distributed propaganda on local college campuses, was formed by some members of a neo-Nazi group that took part in the Virginia rally.)

“On the extremist side, we still see a momentum,” Blee said, “not a demobilization.”

In November, the Department of Homeland Security warned that “racially or ethnically motivated violent extremists” pose a continuing threat to the nation, and might use pandemic-related health restrictions as a reason to target government officials.

Months earlier, ABC News reported that FBI agents in San Antonio concluded in a confidential intelligence assessment that white supremacists were seeking to infiltrate law enforcement and the military to “prepare for and initiate a collapse of society” and harm racial and ethnic minorities.

The Collaboratory Against Hate hopes to shape the data and other insights its researchers assemble into tools that can blunt the growth of extremism.
