OPT/Israel: Meta's Oversight Board to examine Israel-Palestine conflict content
"IsraelโHamas Conflict Sparks Meta Oversight Boardโs First Emergency Case", 7 December 2023
...Meta's Oversight Board announced it would take on two expedited cases, the first ever, both dealing with the ongoing conflict between Israel and Hamas. The cases will look at two posts that were initially removed from, and then reinstated on, Instagram and Facebook for violating Meta's policies against sharing graphic imagery and depicting dangerous organizations and individuals, respectively. One of the posts showed the aftermath of the attack on Al-Shifa Hospital by the Israel Defense Forces, and the other was a video of an Israeli hostage being taken by Hamas on October 7.
"The current Israel-Hamas conflict marks a major event where Meta could have applied some of the board's more recent recommendations for crisis response, and we are evaluating how the company is following through on its commitments," Thomas Hughes, director of the Oversight Board Administration, told WIRED. "We see this as an opportunity to scrutinize how Meta handles urgent situations."
Earlier this year, the board announced it would take on "expedited cases" in what it called "urgent situations."
The company has been criticized for its handling of content related to the conflict.
Meta, like many social media platforms, uses a combination of automated tools and a stable of human content moderators, many of them outsourced, to decide whether a piece of content violates the platform's rules. ...Sabhanaz Rashid Diya, a former member of Meta's policy team and the founding director of the Tech Global Institute, a tech policy think tank, told WIRED that an automated system often won't be able to tell the difference between posts discussing or even condemning Hamas, as opposed to ones expressing support.
Marwa Fatafta, MENA policy and advocacy director at the nonprofit Access Now, a digital rights advocacy group, says that she has seen little change in Metaโs systems from 2021, and believes the companyโs content moderation policies still lack transparency for users.
"It's not clear why some of these exceptions are made for some conflicts and not others," says Fatafta. "We're seeing videos and photos, sometimes just from bystanders or journalists, being removed and it's not clear why. We're really advocating for more context-specific content moderation."
For some of these, like the post from Al-Shifa Hospital, the company will assess whether the post is "newsworthy" and reinstate images or videos when users appeal a takedown decision. This happens "pretty much in every single crisis," according to Diya. In the case of the hostage video, the user posted it with a caption encouraging people to watch it to gain a "deeper understanding" of what happened on October 7, violating Meta's longstanding policy against showing terrorist attacks and its new policy against showing identifiable images of hostages. (Meta temporarily updated its policies after October 7 to take down videos in which hostages were identifiable.)
In a company blog, Meta said the "Oversight Board's guidance in these cases, along with feedback from other experts, will help us to continue to evolve our policies and response to the ongoing Israel-Hamas War."
But the bigger issue, Diya says, is that the company continues treating each conflict like a one-off situation that requires a tailored response. "There's a general reluctance within platforms to preempt or prepare for crises, especially if it's outside the US, even when there is a prolonged history of conflict or violence in that region," she says. "But we have seen enough crises in the past decade to get some sense of some patterns and what kind of tools should be in place."
The expedited decisions from the Oversight Board, expected within 30 days, may finally push the company to do just that.