With as many as 50 elections around the world this year, Facebook's expanded content moderation and fact-checking capabilities are under a constant and ever-evolving test, the Washington Post details.
In a country like Mexico – as in many of Facebook's high-growth markets – corruption and distrust of the media have created an environment in which peer-to-peer information shared through Facebook or WhatsApp is seen as credible. On Sunday, the country goes to the polls for a tightly fought election in which a relatively young multi-party landscape and unprecedented connectivity are being put to the test.
“The hardest part is where to draw the line between a legitimate political campaign and domestic information operations,” Guy Rosen, a top security executive at Facebook, told the Post. “It’s a balance we need to figure out how to strike.”
The difficulty stems from the fact that, unlike foreign agents, who can be barred from meddling in democracies, domestic actors operate under the protection of free speech laws, with few rules governing their activities.
Even for a company of Facebook's size, however, the scale of the task can prove overwhelming. Much of the work continues to fall to a team of third-party fact-checkers, who disprove one story at a time, with each debunking taking multiple days.
“This is the scale of [Facebook’s] challenge,” said Nathaniel Persily, a Stanford Law School professor and an expert on social media and politics. “It is almost impossible to wrap your mind around.”
In March, Facebook, along with Google and Al Jazeera’s local brand, funded Mexico’s first independent fact-checking organization, Verificado 2018, whose 12 employees have so far debunked over 300 Facebook posts.
Though Facebook is developing new automated features to help it tackle misinformation, many of these will not be operational in time for Mexico's election. The situation also illuminates the breadth of Facebook's challenge in other countries, where, as one Mexican magazine editor put it, the biggest challenge comes from within: "it is our broken system". What's more, executives told the Post, relying on users to flag false stories faces further challenges, as many people flag stories as false simply because they disagree with them.
The company had long marketed its services to politicians and governments without much fuss, but following the 2016 US election the situation changed: employees dealing with elections suddenly found themselves operating against nefarious actors rather than acting as marketing advisers.
Sourced from The Washington Post; additional content by WARC staff