Australia fines X for failing to provide information about child abuse material


eSafety, Australia’s online safety regulator, on Monday imposed a $386,000 fine on X (formerly Twitter) after the platform failed to answer “key questions” about the action it was taking against child exploitation material.

The watchdog in February issued legal notices to Google, TikTok, Twitch, Discord and X (then known as Twitter) under the country’s Online Safety Act. The notices asked these companies to answer questions about their handling of child sexual abuse material (CSAM).

While the monetary value of the fine may not be significant, the reputational damage comes at a difficult time for a platform already struggling to retain advertisers.

In a press release, eSafety said that X left some parts of the responses “completely blank” and others were incomplete or inaccurate. The Elon Musk-owned company was also criticized for not responding to the regulator’s questions in a timely manner.

Crucially, the platform did not provide information about CSAM detection technology for live streams and said that it does not use any technology to detect grooming.

The report also found that Google provided generic responses, which eSafety said were not sufficient. However, the regulator issued a formal warning to Google instead of a fine, indicating that Google’s shortcomings were less serious.

eSafety Commissioner Julie Inman Grant criticized Twitter/X for failing to live up to its promises to tackle CSAM.

“Twitter/X has publicly stated that combating child sexual exploitation is a No. 1 priority for the company, but this can’t just be empty talk, we need to match words with concrete action,” she said in a statement.

She added that both scenarios are worrying to the regulator and suggest the company is not living up to its responsibilities or the expectations of the Australian community.

Last month, X removed an option for users to report political misinformation. An Australian digital research group called Reset.Australia wrote an open letter to X expressing concern that the move “could leave infringing content subject to an unfair review process and not be labeled or removed in compliance with your policies.”

After Musk took over, X/Twitter let go a group of employees working on trust and safety issues. Last December, the company also disbanded the Trust and Safety Council, an advisory group that consulted with the platform on issues such as effectively removing CSAM. As part of cost-cutting, the social media company closed its physical office in Australia earlier this year.

Earlier this month, India also sent notices to X, YouTube and Telegram to remove CSAM from their platforms. Last week, the EU sent X a formal request under the Digital Services Act (DSA) to provide details about the steps it is taking to tackle disinformation around the Israel-Hamas war.
