
Social Media Site Gab Is Surging, Even As Critics Blame It For Capitol Violence

Gab was founded in 2016 as an almost anti-Twitter. The platform embraces far-right and other extremist provocateurs, including Milo Yiannopoulos and Alex Jones, who have been banned from Facebook and Twitter over incendiary posts. (Rafael Henrique / SIPA Images/Reuters)

In the days before the insurrection attempt on the Capitol, alternative social media site Gab was lighting up about it.

Some of the discussion on the site, which is popular among Trump diehards, veered into a level of specificity that alarmed outside observers.

"There were directions provided on Gab for which streets to take to avoid the police," said Jonathan Greenblatt, chief executive of the Anti-Defamation League. "And which tools to use to help pry open the doors."

The plans to storm the Capitol were unfolding online in plain sight, on niche social media sites as well as on Facebook and Twitter, long before the attack on Jan. 6. Critics say social media companies, to varying degrees, permitted talk of the violence to persist without cracking down enough.

As federal investigators launch criminal cases against some of the perpetrators of the violence, a growing chorus of advocates and lawmakers says tech companies bear some responsibility, too.

"We need to ascertain right here, right now, whether this specific platform was knowingly facilitating an attack on our nation's capital, literally a terror act against the seat of our government," said Greenblatt, who noted that other platforms should also be investigated over the insurrection attempt but he says special attention needs to be trained on Gab.

Gab CEO: "Suck it up and deal with it"

Founded in 2016 as an almost anti-Twitter, the platform embraces far-right and other extremist provocateurs, like Milo Yiannopoulos and Alex Jones, who have been banned from Facebook and Twitter over incendiary posts.

The site features memes mocking Democrats and liberal causes, conspiratorial musings and messages stoking the baseless idea that the election was stolen from Trump.

In 2018, Gab attracted scrutiny after the suspect who walked into a Pittsburgh synagogue and slaughtered 11 people had earlier posted anti-Semitic messages on the site.

Gab removed the shooter's account and cooperated with federal authorities. Still, the incident cost the site its GoDaddy hosting, but Gab sprang back. Its domain is now registered with the company Epik, known for working with far-right sites. Gab tells NPR it does its own web hosting.

On the day of the attack on the Capitol, many were stirring the pot on the site, including Gab CEO Andrew Torba, who wrote: "In a system with rigged elections there are no longer any viable political solutions."

In an interview with NPR, Torba said nobody is going to make him take down messages on Gab. He feels that such actions censor the free speech of conservatives.

"We work with law enforcement to remove illegal activity from our site. So if we have politically incorrect opinions, the ADL is just going to have to suck it up and deal with it," said Torba, who fled Silicon Valley because he says it was too liberal. He now lives in Northeastern Pennsylvania.

The anything-goes approach appeals to legions of people online. Since the riot at the Capitol, Gab's registered users have more than doubled to around 3.4 million. Traffic to the website jumped 800%, a spike that forced Torba to order emergency servers to support the new flood of activity.

"This is growing rapidly now by millions," Torba said.

Can tech firms be held legally responsible?

Federal law enforcement officials have not said investigators are examining Gab, as the Anti-Defamation League is demanding.

Researchers say sites like Gab, in addition to the now-defunct site Parler and messaging service Telegram, did substantially help organizers spread the word about the march on the U.S. Capitol that eventually escalated into the insurrection attempt.

But so did the big platforms.

Many who showed up heard about it first on Facebook and Twitter. It was promoted with the hashtags #StopTheSteal and #FightForTrump.

That has many wondering: Can any online platform, large or small, be held legally responsible for the attack?

Democratic Rep. Alexandria Ocasio-Cortez, who said she fled for her life during the violence, is pushing for it.

"Mark Zuckerberg and Facebook bear partial responsibility for Wednesday's events. Period," Ocasio-Cortez said.

Yet the problem with that argument may be the law itself, according to legal experts.

Under Section 230 of the Communications Decency Act, tech companies are shielded from civil lawsuits over what content a platform decides to leave up or remove.

Can Facebook and Twitter, or Parler and Gab, face federal prosecution for the violence at the Capitol?

"It's hard to imagine," said Orin Kerr, a former federal prosecutor who is now a Berkeley professor focused on cyber crime.

Kerr said to prove that social media companies were, for instance, aiding and abetting in the violence, a prosecutor would need more than proof that a platform knew it might happen. Instead, such an action would require evidence that social media companies had a clear intention to assist in the violence.

Finding the probable cause necessary to convince a grand jury to indict a tech company over the siege would require "something like they created Facebook in order to lead an insurrection, or they kept Facebook running for that reason," Kerr said. "Prosecutors would need to show Facebook, or whatever platform, was trying to bring about a specific crime."

Ryan Calo, a law professor at the University of Washington, said that if a company creates something dangerous knowing it can cause harm, that can be cause for liability. And social media companies, he said, were repeatedly warned that online disinformation could result in offline violence.

"I'm very disappointed. I don't believe that they're shocked," Calo said of the platforms. "And I think they have culpability."

Yet Section 230 gets in the way of social media companies being held legally responsible for amplifying hate, extremism and disinformation.

Like Kerr, Calo said bringing a criminal case against the tech companies would also likely encounter insurmountable barriers.

"You'd have to show that the platform was probably at a minimum reckless but likely intended to support the riots," Calo said. "That would be a tough case to make out."

He said Congress should pass new laws that would hold social media companies legally responsible for failing to take reasonable steps to get rid of dangerous content on their platforms.

"The platforms all had specific reason to know that there could be violence as a result of activity on their platforms," he said. "What happened at the Capitol was foreseeable."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Bobby Allyn is a business reporter at NPR based in San Francisco. He covers technology and how Silicon Valley's largest companies are transforming how we live and reshaping society.