September 8, 2020
Facebook Doubles Down on Challenging Misinformation
Social media titan Facebook is continuing to position itself as a tech brand fighting back hard against misinformation. After a fiasco during the last presidential election brought the company years of negative publicity and bruised consumer relations, Facebook is promising to do better as Americans prepare to vote in the next presidential election this November.
In a recent statement, Facebook said the platform would be "restricting" new political ads in the week prior to Election Day, as well as removing posts that "convey misinformation about COVID-19 and voting."
Activists who have been pressing Facebook to take a more proactive role in policing the content and advertising on its platform have expressed guarded optimism about the announcement. While some appreciate the move, critics of the decision are many. Some say the fact that Facebook has to make this announcement at all means the company knows its platform is full of misinformation that could affect the outcome of the vote, and may have already.
CEO Mark Zuckerberg said Facebook was not going to be ignoring those concerns. “This election is not going to be business as usual. We all have a responsibility to protect our democracy… That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest…”
And even some who support the action don't believe it is attainable or will be properly realized. They claim the platform is either unable or unwilling to enforce the policies already in place, much less any new ones. The issue, according to critics, is that Facebook is simply too big, with too many on-ramps and entry points for bad actors, as well as far too many people who support exactly the sort of content Facebook wants to restrict.
And it's this last factor that creates serious tension between what Facebook wants to do and what it may actually be able to accomplish. The simple fact is that millions, if not tens of millions, of users believe the content Facebook wants to restrict is an accurate representation of the world. These people accuse the social network of "censorship" and "bias," saying the company is just trying to control what people think based on a socio-political agenda.
While, in some cases, these messages come directly from the kinds of bad actors Facebook wants to restrict, plenty of everyday users sympathize with these ideas and don't want a massive company "deciding" for them. In the end, that makes every aspect of this situation a PR challenge for the company. Doing nothing isn't really an option, and everything the company tries, much less what it accomplishes, will be heavily scrutinized and criticized.