In another blow for X’s new approach to content moderation under the direction of Elon Musk, Australia’s eSafety Commission has issued the company a $US386,000 fine for failing to meet its obligations on content detection and reporting, specifically in relation to child sexual abuse material (CSAM).
It’s an issue that Musk has repeatedly highlighted as a key priority, while also touting the improvements that X has made on this front under his management.
Under Australia’s Online Safety Act, enacted in 2021, the eSafety Commissioner can require online services to detail how they’re meeting their obligations under the Act, which covers various forms of unlawful activity.
As a result, in its second report since the Act came into effect, Australia’s eSafety Commission has found that both Google and X (formerly Twitter) are failing to meet its requirements, though for many of the violations, X simply failed to provide an answer, rather than being shown to fall short in practice.
As per the report:
“Google has been issued a formal warning, notifying it of its failure to comply due to the company providing a number of generic responses to specific questions and providing aggregated information when asked questions about specific services. Twitter/X’s non-compliance was found to be more serious, with the company failing to provide any response to some questions, leaving some sections entirely blank. In other instances, Twitter/X provided a response that was otherwise incomplete and/or inaccurate.”
In terms of specifics, the eSafety Commission says that in the three months after Twitter/X’s change in ownership late last year, its “proactive detection of child sexual exploitation material fell from 90% to 75%”. The Commission did further note, however, that X claims that its proactive detection rate has improved in 2023.
As noted, most of X’s penalties relate to its failure to provide adequate information on its processes, which the Commission suspects may be a consequence of X’s cost-cutting efforts.
“Twitter/X did not respond to a number of key questions including the time it takes the platform to respond to reports of child sexual exploitation; the measures it has in place to detect child sexual exploitation in livestreams; and the tools and technologies it uses to detect child sexual exploitation material. The company also failed to adequately answer questions relating to the number of safety and public policy staff still employed at Twitter/X following the October 2022 acquisition and subsequent job cuts.”
That failure to respond is the basis for the fine, so it’s not necessarily an indication that X, overall, is failing in these key areas, but rather that it’s falling short of its reporting requirements.
X’s detection measures could actually be improving, but it still needs to adhere to its obligations and provide updates as requested.
So what does that mean for X? Well, it’s hard to say.
On the surface, it looks bad, with X being fined over its failure to address CSAM. But that’s not really what the fine is for, so it may not have a major impact on X’s reputation. Still, it will likely have some effect, and with many advertisers already hesitant to invest in Elon Musk’s X project due to concerns around its revised moderation processes, it’s another negative headline for the company.
But it doesn’t actually tell us a lot about how X is performing on this critical front.
Though as the Commission’s report notes:
“If Twitter/X and Google can’t come up with answers to key questions about how they are tackling child sexual exploitation, they either don’t want to answer for how it might be perceived publicly or they need better systems to scrutinize their own operations. Both scenarios are concerning to us and suggest they are not living up to their responsibilities and the expectations of the Australian community.”
For clarity, Google’s main violation, aside from its generic responses to some of the queries, is its failure to utilize CSAM detection tools in Gmail, Chat, and Messages.
So it’s less a damning report as such, and more a matter of administrative failings on X’s part. But it could, as the Commission notes, point to deeper flaws in X’s systems that the company is not so keen to highlight.
It’s worth noting, too, that various third-party reports have found that CSAM remains prevalent in the app, more so than X has suggested.
X has 28 days to appeal or pay the fine.