Traumatized by the 2016 election, Facebook has stepped up measures to prove that it has become a responsible player in elections, and is no longer a vehicle for massive disinformation conducive to voter manipulation.
The US presidential election that brought Donald Trump to power had indeed been marked by various forms of interference via social networks. The British firm Cambridge Analytica had harvested the personal data of tens of millions of Facebook users for propaganda purposes.
Organizations close to the Kremlin had also carried out massive influence operations in favor of Donald Trump.
Here is a non-exhaustive review of the tools and rules in place less than a week before the American election.
Facebook estimates that it has helped 4.4 million people register to vote in the United States this year.
The “voting information center”, launched in mid-August, bombards users of the platform with encouragement to register.
This one-stop shop, modeled on the one created for the coronavirus, provides, among other things, the voting procedures in each state, a tool for recruiting poll workers, and real-time information.
But encouraging participation also means combating the various voter-suppression tactics that Republicans are often accused of using against Black and Hispanic minorities, who generally lean Democratic.
The social network has therefore committed, for example, to removing posts claiming that going to vote would expose people to coronavirus infection.
Facebook recently added a rule, apparently tailor-made against a Republican campaign video in which Donald Trump Jr. asks anyone “in good health” to join “Trump’s army for election security.” Military terminology and intimidation are now banned in this context.
The fight against misinformation is Facebook’s most emblematic battle, as Donald Trump’s term has been marked by controversies over “fake news”.
Facebook has invested heavily in its third-party verification program, in which some sixty media outlets around the world, including AFP, participate.
If a piece of news is deemed false or misleading by one of these outlets, users are less likely to see it appear in their news feed. The network also suggests that they read the fact-checking article.
The social network giant has accumulated experience since creating an election unit in 2018 for the US midterm elections, a model since deployed in Europe and elsewhere in the world.
Emphasis has been placed on transparency, so that users always know who is speaking to them.
Pages belonging to state-controlled media (financially and editorially) have been flagged as such since this summer, and ads from those organizations are banned if targeted at Americans.
Week after week, Facebook dismantles manipulation operations orchestrated from abroad.
It also uncovers networks attempting to spread false information, conspiracy theories, hate speech, and doctored videos.
The actors involved have perfected their techniques to be less easily spotted, but they have also become less effective. Collaboration with the authorities often makes it possible to stop them before they have built a substantial audience.
Facebook has also strengthened the protections of candidate accounts.
The group now fears above all “hack-and-leak” tactics, in which state-linked entities feed hacked information to the media and use social networks to spread it. That is what happened with emails from Hillary Clinton in 2016.
Facebook must also protect itself from the candidates themselves, in a tense national context marked by the pandemic and a wave of demonstrations against racism that regularly give way to violence.
The platform is preparing for worst-case scenarios in which its services could be used to prematurely proclaim results or to call for contesting them.
No new political advertising may be broadcast in the week preceding the election.
And all advertising on social or political topics will be banned on its platforms in the United States on the evening of November 3, to reduce the risk of “confusion or abuse” for as long as it takes.
Facebook is concerned that the results may take longer than usual to be known, due to the greater use of mail-in voting during the pandemic, and that they will be hotly contested in the meantime, raising the risk of violence in the streets.
“If a candidate or party prematurely announces their victory before a result is called by major media outlets, we will add specific information indicating that the vote count is still in progress and the winner has not yet been determined,” Facebook recently stated.
If the situation ever gets out of hand and descends into chaos, Facebook has planned emergency measures. The platform could “reduce the circulation of content” in the event of violence, said Nick Clegg, its head of global affairs.