Google, Meta, Discord and others team up to fight online child exploitation

A new program called Lantern aims to fight online child sexual exploitation and abuse (OCSEA) through cross-platform signal sharing between online companies like Meta and Discord. The Tech Coalition, a group of tech businesses working to fight online child sexual exploitation, wrote in today’s announcement that the program is an attempt to keep predators from avoiding detection by moving potential victims to other platforms.

Lantern serves as a central database where companies can contribute data and check their own platforms against it. When companies see signals on their platforms, such as known OCSEA policy-violating email addresses or usernames, child sexual abuse material (CSAM) hashes, or CSAM keywords, they can flag them in their own systems. The announcement said that although signals do not strictly prove abuse, they help companies investigate and possibly take action, such as closing an account or reporting the activity to authorities.

A graphic showing how Lantern works.
Image: Tech Coalition
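
To make the signal-sharing model concrete, here is a minimal, hypothetical sketch in Python of how a platform might check an upload against a shared set of signals (content hashes, account identifiers, and keywords) of the kind the announcement describes. The names and data are invented for illustration; this is not the actual Lantern system or its API.

```python
import hashlib

def content_hash(data: bytes) -> str:
    """Hash content so it can be compared against shared signal hashes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical signals a participating company might contribute to the shared
# database: content hashes, flagged account identifiers, and keywords.
shared_signals = {
    "hashes": {content_hash(b"known violating content (stand-in bytes)")},
    "identifiers": {"flagged-user@example.com"},
    "keywords": {"example-flagged-term"},
}

def check_upload(data: bytes, uploader: str, text: str) -> list[str]:
    """Return which shared signals an upload matches, for human review."""
    matches = []
    if content_hash(data) in shared_signals["hashes"]:
        matches.append("known content hash")
    if uploader in shared_signals["identifiers"]:
        matches.append("flagged account identifier")
    if any(keyword in text.lower() for keyword in shared_signals["keywords"]):
        matches.append("flagged keyword")
    return matches

# A match is only an investigative signal, not proof of abuse; per the
# announcement, companies still investigate before closing an account or
# reporting the activity to authorities.
print(check_upload(b"known violating content (stand-in bytes)",
                   "someone@example.com", "hello"))
```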

Meta wrote in a blog post announcing its participation in the program that, during Lantern’s pilot phase, it used information shared by Mega, one of the partners, to remove and report “over 10,000 violating Facebook profiles, Pages, and Instagram accounts” to the National Center for Missing and Exploited Children.

The announcement of the collaboration also quotes John Redgrave, Discord’s head of trust and safety, who says, “Discord has also acted on data points shared with us through the program, which has assisted in a number of internal investigations.”

Companies participating in Lantern so far include Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch. Coalition members have been developing Lantern for the past two years, and the group says that in addition to creating the technical solution, it also had to put the program through “eligibility checks” and ensure it met legal and regulatory requirements and was ethically sound.

One of the major challenges of such programs is ensuring that they are effective without introducing new problems. In a 2021 incident, a father was investigated by police after Google flagged photos of his child’s groin infection as CSAM. Several groups warned that similar problems could arise with Apple’s now-canceled automatic iCloud Photo Library CSAM-scanning feature.

The coalition will oversee Lantern and says it is responsible for creating clear guidelines and rules for data sharing. As part of the program, companies must complete mandatory training and regular check-ins, and the group will regularly review its policies and practices.
