Facebook has published updated terms of service that provide more detail on how content is removed, how ads are targeted, and what intellectual property rights users retain. According to Facebook, the new terms do not change how the platform functions in any way; they are meant to give users a clearer picture of how the network operates. The new terms take effect on July 31.
Much of the improved language came out of the company's work with European regulators, a company representative confirmed to our sources: "Several of the updates are the result of our work with the European Consumer Protection Cooperation Network [a division of the European Commission]". Facebook's work with the European regulators was announced two months ago, when Facebook publicly agreed to amend its terms and conditions before the end of June.
The representative added: "Other [updates] are based on input from ongoing conversations with regulators, policymakers and consumer protection experts around the world".
As before the revision, the terms of service spell out the many reasons content may violate Facebook's standards. In the new terms, however, "removing" content has been replaced with "removing or restricting access to" content in a few places, presumably in line with Facebook's plans to restrict access rather than delete outright. A lengthy section of the terms explains how Facebook's new review process works:
If we remove content that you have shared in violation of our Community Standards, we’ll let you know and explain any options you have to request another review, unless you seriously or repeatedly violate these Terms or if doing so may expose us or others to legal liability; harm our community of users; compromise or interfere with the integrity or operation of any of our services, systems or products; where we are restricted due to technical limitations; or where we are prohibited from doing so for legal reasons.
This spells out the circumstances under which the reasons behind a moderation decision may not be revealed: in a criminal case, for example, Facebook could be under a legal gag order, and full transparency could create security problems if content was removed for exploiting technical vulnerabilities in Facebook's systems.