The board, which makes “independent decisions regarding content on Facebook and Instagram”, overturned four out of five decisions by the social media company to remove content from its platforms.
Cases where Facebook’s decision was overturned include posts on breast cancer symptoms, criticism of “the lack of a health strategy in France” and an alleged quote from Nazi propaganda minister Joseph Goebbels.
The Board upheld one decision, concerning a post that used a slur to describe Azerbaijanis and was removed under Facebook’s Community Standard on Hate Speech.
In a conference call with journalists, the Oversight Board confirmed that newly selected cases will be announced tomorrow, along with a call for public comments on Facebook’s decision to suspend former US President Donald Trump’s Facebook and Instagram pages.
The decision to refer the case to the Board was announced by Facebook last week, with the Board revealing that a statement has “not yet” been received from Trump or his team.
Here are the eight policy recommendations made across the five cases:
- Create a new Community Standard on health misinformation, consolidating and clarifying the existing rules in one place. This should define key terms such as “misinformation.”
- Adopt less intrusive means of enforcing its health misinformation policies where the content does not reach Facebook’s threshold of imminent physical harm.
- Increase transparency around how it moderates health misinformation, including publishing a transparency report on how the Community Standards have been enforced during the COVID-19 pandemic. This recommendation draws upon the public comments the Board received.
- Ensure that users are always notified of the reasons for any enforcement of the Community Standards against them, including the specific rule Facebook is enforcing.
- Explain and provide examples of the application of key terms from the Dangerous Individuals and Organizations policy, including the meanings of “praise,” “support” and “representation.” The Community Standard should also better advise users on how to make their intent clear when discussing dangerous individuals or organizations.
- Provide a public list of the organizations and individuals designated as ‘dangerous’ under the Dangerous Individuals and Organizations Community Standard or, at the very least, a list of examples.
- Inform users when automated enforcement is used to moderate their content, ensure that users can appeal automated decisions to a human being in certain cases, and improve automated detection of images with text-overlay so that posts raising awareness of breast cancer symptoms are not wrongly flagged for review. Facebook should also improve its transparency reporting on its use of automated enforcement.
- Revise Instagram’s Community Guidelines to specify that female nipples can be shown to raise breast cancer awareness and clarify that where there are inconsistencies between Instagram’s Community Guidelines and Facebook’s Community Standards, the latter take precedence.
Facebook will have seven days to restore removed content where the Board has directed it to do so, and 30 days to respond publicly to the Board’s policy recommendations.
The full rulings are available to view on the Oversight Board’s official website.
Photo: Oversight Board.