

Article

7 December 2023

Author:
Vittoria Elliott, Wired

OPT/Israel: Meta's Oversight Board to examine Israel-Palestine conflict content

"Israel–Hamas Conflict Sparks Meta Oversight Board’s First Emergency Case", 7 December 2023

...Meta’s Oversight Board announced it would take on its first-ever expedited cases, both dealing with the ongoing conflict between Israel and Hamas. The cases will look at two posts that were initially removed from, and then reinstated on, Instagram and Facebook for violating Meta’s policies against sharing graphic imagery and depicting dangerous organizations and individuals, respectively. One of the posts showed the aftermath of the attack on Al-Shifa Hospital by the Israel Defense Forces, and the other was a video of an Israeli hostage being taken by Hamas on October 7.

“The current Israel–Hamas conflict marks a major event where Meta could have applied some of the board’s more recent recommendations for crisis response, and we are evaluating how the company is following through on its commitments,” Thomas Hughes, director of the Oversight Board Administration, told WIRED. “We see this as an opportunity to scrutinize how Meta handles urgent situations.”

Earlier this year, the board announced it would take on “expedited cases” in what it called “urgent situations.”

The company has been criticized for how it has handled content around the conflict.

Meta, like many social media platforms, uses a combination of automated tools and a stable of human content moderators—many of them outsourced—to decide whether a piece of content violates the platform’s rules. ...Sabhanaz Rashid Diya, a former member of Meta’s policy team and the founding director of the Tech Global Institute, a tech policy think tank, told WIRED that an automated system often won’t be able to tell the difference between posts discussing or even condemning Hamas, as opposed to ones expressing support.

Marwa Fatafta, MENA policy and advocacy director at the nonprofit Access Now, a digital rights advocacy group, says that she has seen little change in Meta’s systems from 2021, and believes the company’s content moderation policies still lack transparency for users.

“It’s not clear why some of these exceptions are made for some conflicts and not others,” says Fatafta. “We’re seeing videos and photos, sometimes just from bystanders or journalists, being removed and it’s not clear why. We’re really advocating for more context-specific content moderation.”

For some of these posts, like the one showing the aftermath at Al-Shifa Hospital, the company will assess whether the post is “newsworthy” and reinstate images or videos when users appeal a takedown decision. This happens “pretty much in every single crisis,” according to Diya. In the case of the hostage video, the user posted it with a caption encouraging people to watch it to gain a “deeper understanding” of what happened on October 7, violating Meta’s longstanding policy against showing footage of terrorist attacks as well as its newer rule against showing identifiable images of hostages. (Meta temporarily updated its policies after October 7 to take down videos in which hostages were identifiable.)

In a company blog, Meta said the “Oversight Board’s guidance in these cases, along with feedback from other experts, will help us to continue to evolve our policies and response to the ongoing Israel–Hamas War.”

But the bigger issue, Diya says, is that the company continues treating each conflict like a one-off situation that requires a tailored response. “There’s a general reluctance within platforms to preempt or prepare for crises, especially if it’s outside the US, even when there is a prolonged history of conflict or violence in that region,” she says. “But we have seen enough crises in the past decade to get some sense of some patterns and what kind of tools should be in place.”

The expedited decisions from the Oversight Board, expected within 30 days, may finally push the company to do just that.
