Meta says it is prioritizing livestreaming investigation during Israel-Hamas war
Following a content moderation warning from EU regulators earlier this week, Meta has published an overview of how it is responding to the risks posed by the Israel-Hamas war on its social media platforms.

Its blog post covers “ongoing efforts”, reiterating existing policies and user tools. But the company confirmed it has made some changes in response to rapidly escalating events in Israel and Gaza.

This includes a temporary expansion of its Violence and Incitement policy in order to prioritize the safety of Israelis kidnapped by Hamas.

As part of the change, Meta says it will remove content that “clearly identifies hostages when we’re made aware of it, even if it’s being done to condemn or raise awareness of their situation”. It adds: “We are allowing content with blurred images of the victims but, in line with standards established by the Geneva Conventions, we will prioritize the safety and privacy of kidnapping victims if we are unsure or unable to make a clear assessment.”

Meta also says it is prioritizing scrutiny of livestreaming on Facebook and Instagram — including watching for any attempts by Hamas to use the tools to broadcast footage of captured Israelis or other hostages.

But in one particularly disturbing report in Israeli media this week, which was widely circulated on social media, a girl described how she and her family learned of her grandmother’s death when Hamas militants uploaded a video of her dead body to Facebook — apparently using the dead woman’s own mobile phone to post the graphic content to her Facebook page.

“We recognize that the immediacy of Live brings unique challenges, so we have restricted access to Live for people who have previously violated certain policies. We’re prioritizing livestream reports related to this crisis, above and beyond our existing prioritization of Live videos,” Meta said, highlighting a step taken in the wake of the 2019 Christchurch attacks in New Zealand, when a lone shooter livestreamed his killing spree targeting two mosques on Facebook.

“We are also aware of Hamas’s threats to broadcast hostage footage and we’re taking these threats extremely seriously. Our teams are monitoring this closely, and would swiftly remove any such content (and the accounts behind it), banking the content in our systems to prevent copies from being re-shared,” it added.

Other steps Meta says it has taken in response to the situation in Israel and Gaza include: reducing the likelihood that its systems will actively recommend potentially violating or borderline content; reducing the visibility of potentially offensive comments; and applying hashtag blocking to make certain conflict-related terms non-searchable on its platforms. Its blog post does not specify which hashtags Meta is blocking in relation to the Israel-Hamas war.

Meta’s blog post also says it has set up a special operations center staffed with experts, including fluent Arabic and Hebrew speakers, to enhance its ability to respond quickly to content reports.

It also says it is taking feedback from local partners (such as NGOs) on emerging risks — and claims to be “moving swiftly to address them”.

“In the three days following October 7, we removed or marked as disturbing more than 795,000 pieces of content for violating these policies in Hebrew and Arabic,” it wrote. “Compared to the two months prior, in the three days following October 7, we removed seven times as many pieces of content on a daily basis for violating our Dangerous Organizations and Individuals policy in Hebrew and Arabic alone.”

Given the heightened attention and scrutiny on the situation, Meta concedes that non-violating content may be removed “in error”.

“To mitigate this, for some violations we are temporarily removing content without strikes, meaning these content removals won’t cause accounts to be disabled,” it says. “We also continue to provide tools for people to appeal our decisions if they think we made a mistake.”

Compliance with the bloc’s Digital Services Act (DSA) began for Meta in August, as the owner of so-called very large online platforms (VLOPs).

The Commission designated 19 VLOPs in April — including Meta-owned Facebook and Instagram.

The designation obliges VLOPs to respond diligently to reports of illegal content, as well as to clearly communicate their terms and conditions to users and enforce those terms appropriately. But the scope is far broader: these large platforms must also identify systemic risks, such as disinformation, and take steps to mitigate them.

The regulation also includes a “crisis response” mechanism that the Commission can invoke on VLOPs in situations where the use of their platforms could contribute to serious threats, such as war.

Penalties for failing to comply with the pan-EU regulation can reach up to 6% of global annual turnover — which, in Meta’s case, could run to several billion dollars.

The social media giant isn’t alone in getting a warning from the bloc over content concerns linked to the Israel-Hamas war: Elon Musk’s X has also been issued a warning — and hit with a formal request for information about its compliance approach.

TikTok has also received a warning from the EU about conflict-related DSA content risks.
