
Meta Oversight Board criticizes removal of Israel-Hamas conflict-related videos


Meta’s Oversight Board highlighted a significant issue with the company’s content moderation policies, particularly regarding two videos related to the Israel-Hamas conflict. These videos, one showing the aftermath of an airstrike in Gaza and another depicting an Israeli woman taken hostage, were initially removed by Meta’s automated moderation tools, a move that the Board later criticized.

*Trigger Warning*

The first video, posted on Instagram, captured the dire situation near Al Shifa hospital in Gaza, showing injured or possibly deceased children. The second, shared on Facebook, presented a harrowing scene of an Israeli woman pleading for her life during the October 7th attack. Both pieces of content were initially taken down but were later restored with a warning screen after the Oversight Board’s intervention.

This incident marks the first instance where the Oversight Board, an independent body reviewing Meta’s content decisions, utilized its expedited review process. This new approach, designed to address urgent matters swiftly, reflects the growing challenges social media platforms face in balancing freedom of expression with the need for content moderation.

In this case, the Board found that Meta’s automated tools were overly aggressive, leading to the unjust removal of content that held significant public interest. The lack of human intervention in these moderation processes was particularly concerning, as it increased the risk of incorrectly censoring important content.

While Meta responded by reinstating the videos, the Board disapproved of the company’s decision to exclude them from being recommended to other users. This action, according to the Board, detracted from the content’s potential to raise awareness and inform the public. In light of this, the Board urged Meta to be more agile in adapting to evolving situations, especially those of high public interest and relevance.

“While Meta explained that it retains all content that violates its Community Standards for a period of one year, the Board urges that content specifically related to potential war crimes, crimes against humanity, and grave violations of human rights be identified and preserved in a more enduring and accessible way for purposes of longer-term accountability,” the Oversight Board said in a blog post.
