Facebook's ongoing content-moderation debate has been something of a mess, with many facets, broad implications, and social reverberations that have spread far beyond the website. Without seeking users' approval, Facebook has worked hard to absolve itself of and distance itself from the controversies over post-truth, fake news, and other issues arising on the network. In the long run, however, shunning criticism, issuing denials, and imploring spectators have not proved workable strategies.
The new “downvote” button is the company’s latest attempt to alleviate the crisis. Still in beta, the feature is currently being tested with around 5 percent of the platform’s English-speaking US Android users. It is an extra button placed next to the “Like” and “Reply” buttons on certain content. Tapping it reveals three options for flagging content as “Offensive,” “Misleading,” or “Off-Topic.” The idea is that Facebook can track these reports and flag posts, and even users, accordingly.
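The report-tracking mechanism described above could be sketched as a simple aggregation of user flags per post. The three flag categories come from the article; everything else here, including the review threshold, data model, and function names, is an illustrative assumption rather than Facebook's actual implementation.

```python
from collections import defaultdict

# Flag categories named in the article; threshold and structure are assumed.
FLAG_CATEGORIES = {"Offensive", "Misleading", "Off-Topic"}
REVIEW_THRESHOLD = 3  # hypothetical report count before a post is queued for review

# post_id -> flag category -> number of reports
reports = defaultdict(lambda: defaultdict(int))

def downvote(post_id, category):
    """Record one user's flag; return True if the post now warrants review."""
    if category not in FLAG_CATEGORIES:
        raise ValueError(f"unknown flag category: {category}")
    reports[post_id][category] += 1
    return sum(reports[post_id].values()) >= REVIEW_THRESHOLD

downvote("post-1", "Misleading")
downvote("post-1", "Offensive")
flagged = downvote("post-1", "Misleading")  # third report crosses the threshold
```

In a real system the threshold would presumably be dynamic and weighted by reporter reputation, but the basic crowdsourcing idea is the same: individual flags accumulate until content is surfaced for attention.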
It is something of an update to the present reporting system, but that matters less than a few other aspects of the feature. Facebook is once again making it amply clear that this is not a “dislike” button. The company has also already stated its position on a Reddit-style rating system at various junctures, arguing that what users actually needed was the ability to react differently to information, hence the still relatively new reaction emoji.
According to Facebook, systems are already in place to handle annoying ads and clickbait, as well as vote-baiting, react-baiting, and share-baiting content, on their own. The “downvote” button appears aimed at crowdsourcing complaints in order to identify false, improperly placed, or out-of-context content.
In the “post-truth” era, giving users the liberty to vote on how accurate, relevant, or offensive a statement is remains a fragile solution for an imperfect world: it eliminates none of the bias involved in the process. Perhaps as a mitigation, Facebook seems to be trying it out only on comments and on public Page posts. Posts in groups and on individual profiles appear, for now, to be left out of the new system. This at the very least suggests that the main goal is to combat disinformation.