Top Facebook Official: Our Aim Is To Make Lying On The Platform 'More Difficult'

Tech companies are under enormous pressure from government officials to prevent their platforms from being used by foreign actors and others to disrupt the 2020 election, as occurred in 2016. (Jeff Chiu / AP)

Google, Facebook, Twitter and other major tech companies met with U.S. government officials on Wednesday to discuss their plans to counter disinformation on social media in the run-up to the November election.

In a joint statement, the companies said that this was the latest in a series of meetings "on what we're seeing on our respective platforms and what we expect to see in the coming months." The statement said today's meetings focused on the upcoming political conventions and "scenario planning relating to election results."

Tech companies are under enormous pressure from government officials to prevent their platforms from being used by foreign actors and others to disrupt the 2020 election, as occurred in 2016.

In a statement earlier this month, the director of the National Counterintelligence and Security Center, William Evanina, said that Russia, China, Iran and other "foreign actors" are attempting to "sway U.S. voters' preferences and perspectives, shift U.S. policies, increase discord in the United States, and undermine the American people's confidence in our democratic process."

In an interview with NPR's Morning Edition, Facebook's head of cybersecurity policy, Nathaniel Gleicher, said the company is working harder than ever to combat such efforts, saying the goal is to make sure voters receive accurate information.

"I think I actually want to make the act of trying to tell a lie, or misleading people, more difficult," Gleicher said.

In March, Facebook announced that it had removed 49 Facebook accounts, 69 pages and 85 Instagram accounts that were engaging in "foreign interference." More recently, the social network has removed about two dozen other accounts, some linked to Russia and Iran.

"We've shared information with independent researchers and then we've publicized context about it so that users can see what's happening on the platform," Gleicher said.

Facebook says it has a team of fact-checkers scouring the platform for misleading or false content, but under the company's rules, political advertising is exempt from that scrutiny.

Gleicher defended this hands-off approach to political ads, which has allowed President Trump to spread falsehoods to millions of users about his presumptive Democratic opponent, Joe Biden.

"So we know right now in this election, there are massive debates about all of the ways that people vote and people engage. We want to make sure that people hear what elected officials are saying and what they think about voting," Gleicher told NPR.

"But frankly, I think that information is an important factor in how some people will choose to vote in the fall. And so we want to make sure that information is out there and people can see it, warts and all," he said.

Facebook has, however, on occasion overridden its policy. In June, for example, it removed Trump campaign advertisements containing an upside-down red triangle symbol, which had been used to identify political prisoners in German concentration camps during the Nazi regime.

Both Twitter and Facebook have also removed posts shared by Trump that contained false and misleading information related to the coronavirus pandemic.

Social media companies haven't always been successful at these efforts. An investigation by The Guardian found that groups promoting the conspiracy movement QAnon were rapidly growing worldwide on Facebook. The newspaper identified 170 QAnon groups, pages and accounts across Facebook and Instagram with more than 4.5 million aggregate followers.

Gleicher said that Facebook has been enforcing its guidelines against QAnon consistently. When asked about The Guardian's findings, he acknowledged that staying on top of conspiracy campaigns has been challenging.

"This is a place where I think we have work to do," Gleicher said. "I think that the boundaries around what constitutes a QAnon site or not are pretty blurry and that becomes a challenge in all of us. It's why we're looking at this and we're exploring some additional steps that we can take."

Some have raised the possibility of Trump rejecting the results of the November election should he lose. In July, Trump declined in an interview on Fox News to say whether he would accept the outcome.

What would Facebook do if Trump falsely said on the platform that he was the winner of the presidential election?

Gleicher dodged the question, declining to say directly whether Facebook would take action against such a post. Instead, he said that the final result is widely expected not to be settled on election night, and that Facebook officials are gaming out how the company might handle any chaos that emerges during a prolonged vote count.

"We're not going to know the results immediately. There's going to be a period of uncertainty as counting is still happening. That's something that we've been particularly focused on," Gleicher said.

"In the wake of this period where votes are coming in and we don't know the results, we can expect candidates to be making claims about who's won," Gleicher said. "We could expect claims about whether the results were fair or not. These are things that we've seen before around elections but I think are going to be particularly critical this time."

"How do you accurately report on the claims that are being made," he continued, "but also provide the context to make sure that people understand and can weigh and judge these things?"

Gleicher also touted the company's goal of helping register 4 million voters through "voting information centers" on Facebook and Instagram, which provide up-to-date information on how to register, how to obtain absentee or mail-in ballots, and where to vote.

"The reason for this is voting is fundamentally about voice, and that's critical to our efforts and where we are as a company," he said. "It's the best way to hold our leaders accountable and to address important issues."

He added: "As someone who works on security, ensuring that voters have accurate information about an election is critical to protecting that election. Disinformation flourishes in uncertainty, and we've seen people take advantage of that uncertainty to drive influence operations and other types of deceptive campaigns."

Editor's note: Facebook is among NPR's sponsors.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Ashish Valentine joined NPR as its second-ever Reflect America fellow and is now a production assistant at All Things Considered. As well as producing the daily show and sometimes reporting stories himself, his job is to help the network's coverage better represent the perspectives of marginalized communities.