Facebook got through midterms mostly unscathed

Although it appears that Facebook managed misinformation more effectively than in 2016, we won’t know for sure until researchers, academics and cyber security experts can gauge the full impact.

That could take months — recall that Facebook didn’t start talking about Russian attempts to influence the 2016 election via its platform until September 2017.

Meanwhile, the company faces a constantly shifting set of challenges.

The company this week released a report detailing Facebook’s impact on the spread of hate speech in Myanmar. The report recommended that Facebook step up enforcement of its content policies and increase its transparency by providing the public with more data.

In the U.S., a series of Vice investigations found potential vulnerabilities in Facebook’s handling of political ads.

In one instance, Vice received Facebook’s approval to run political ads listed as “paid for by” any of the 100 U.S. senators, although none of the ads were actually published. Another article by Vice and ProPublica found political ads on Facebook paid for by a group not registered with the Federal Election Commission. It is unclear who is behind the group, and Facebook has not removed the ads, saying it had requested additional information from the advertiser and determined the ads were not in violation of Facebook’s standards.

Beyond the main Facebook app, the spread of misinformation appears to be on the rise across the company’s other services, Facebook’s former Chief Security Officer Alex Stamos told CNBC.

Notably, when Facebook blocked accounts linked to Russia’s Internet Research Agency, it took down 85 Instagram accounts compared to 30 Facebook accounts.

“This last cycle also demonstrated that Instagram is likely to be the largest target of Russian groups going forward,” Stamos said. “Facebook needs to really invest in porting some of the back-end detection systems that catch fake accounts to this very different platform.”

Reports over the past year indicate that small groups on Messenger and WhatsApp are becoming a hotbed of misinformation. In India, rumors spread on WhatsApp reportedly resulted in a group of men being lynched. The app was also reportedly used to spread misinformation ahead of the Brazilian election last month.

“The greater privacy guarantees of these platforms increase the technical challenges with stopping the spread of misinformation,” Stamos said.

Additionally, Reuters this week reported that Russian agents are changing their tactics to spread divisive content across social media while staying ahead of Facebook’s efforts. This includes moving away from fake news and focusing on amplifying content produced by Americans on the far right and far left.

The next tests for the company will come soon enough. On the horizon are the Indian general election next year, and the 2020 presidential election primaries.

“I anticipate that it will be about the end of next year when we feel like we’re as dialed in as we would generally all like us to be,” Zuckerberg said last week.

“And even at that point, we’re not going to be perfect because more than 2 billion people are communicating on the service. There are going to be things that our systems miss, no matter how well-tuned we are.”
