Facebook is preparing for the United States presidential election on November 3 amid fears of unrest. The company will deploy internal systems it has previously used in at-risk countries.
As reported by The Verge on Tuesday (November 3, 2020), the plan includes slowing the spread of posts as they go viral, changing the news feed algorithm, and revising the rules on which types of harmful content should be removed.
Such a strategy has previously been used by Facebook to quell ethnic and political unrest in at-risk countries such as Sri Lanka and Myanmar.
The system will only be activated in the event of election-related violence or other serious circumstances. Some Facebook employees, however, worry that slowing the spread of viral content could suppress legitimate political discussion.
Facebook's content moderation during heated situations has drawn criticism several times, notably when it failed to stem hate speech against Rohingya Muslims in Myanmar, which contributed to a humanitarian crisis in 2017.
In a blog post published last September, Facebook CEO Mark Zuckerberg said the 2020 U.S. presidential election would not proceed as usual. He feared an increased risk of unrest in the US given the country's divided state and the possibility that election results could take weeks to announce.
In addition to the content moderation system above, Facebook has implemented several policies to help the 2020 U.S. presidential election go smoothly. A week before Election Day, it stopped accepting new political ads.
The social media platform also provides an information center for voters at the top of its Facebook and Instagram feeds, and plans to provide official election results in cooperation with Reuters.
Facebook will also flag posts that claim victory prematurely and delete posts containing misinformation about COVID-19 and the election. It will also ban all political advertising in the US after the November 3 election for an indefinite period.