Musk’s mob declines to answer questions, breaking law dunundah.
Combating child sexual abuse isn’t just a really good idea—it’s the law (in Australia). Specifically, tech companies need to tell the federal government what they’re doing about it. If they can’t be bothered, there are penalties.
In Twitter’s case, AU$610,500—which is not nothing. In today’s SB Blogwatch, we count the cost of silence.
Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: HONK (redux).
Straya Strikes Back
What’s the craic? Kate Conger reports—“Australia Fines X”:
“Targeted for grooming”
… for failing to provide information about its efforts to combat child exploitation. … The social media service had told officials that its automated detection of abusive material declined after Elon Musk bought the company. The amount of the fine is … about $384,000.
…
[It] did not comply with a national law that requires platforms to disclose what they are doing to fight child exploitation on their services, Australian officials said. … “Companies can make empty statements like ‘Child exploitation is our top priority,’ so what we’re saying is, ‘Show us,’” Julie Inman Grant, Australia’s commissioner in charge of online safety, said.
…
In response to whether children might be targeted for grooming on X, the company told the regulator, “Children are not our target customer.” [But] Linda Yaccarino, X’s chief executive, recently said at a conference that Generation Z was the company’s fastest-growing demographic, with 200 million teenagers and young adults in their 20s visiting the platform each month.
D’oh! Matthew Broersma has more foot-in-mouth action—“Australia Fines X Over Child Protection Failures”:
“Not doing anything”
Elon Musk said last November that “removing child exploitation is priority #1.” But the eSafety Commission criticised the company’s “empty talk.”
…
The firm failed to respond to questions including the time it takes the platform to respond to reports … the measures it has in place … and the tools and technologies it uses to detect child sexual exploitation material, the commission said: … The company failed to “provide any response to some questions, leaving some sections entirely blank.”
…
Inman Grant [said] the company had not even been able to say how many trust and safety staff it has now. … “We understand that it’s hard and it’s probably very confronting and exposing for these companies to actually say, … ‘We have said this is our top priority, but really, we’re not doing anything.’”
Ouch. What else did Inman Grant say? Here she is, via Georgie Hewson—“eSafety commission fines Elon Musk’s X”:
“Hunting grounds for predators”
Frankly, X did not provide us with the answers to very basic questions we’d ask them like, ‘How many trust and safety people do you have left?’ Now that’s critical to really understanding not only the scope of the problem but also the scale. … This was about the worst kind of harm—child sexual exploitation as well as extortion—and we need to make sure that companies have trust and safety teams [that are] using people, processes and technologies to tackle this kind of content.
…
We expect car manufacturers to embed seatbelts [and] we have food standards—so the technology companies should not be any different. … These companies have a fundamental responsibility to make sure that the platforms that hundreds of millions of people are using around the world are safe. … Otherwise they’re creating hunting grounds for predators and they’re allowing child sexual exploitation material to be not only hosted but also shared.
Mic drop. MallocVoidstar doesn’t sound impressed:
A social network used by hundreds of millions of people, certainly including people under 18 (who it allows to sign up), should probably have a plan for fighting grooming [and CSAM].
What’s the scale of the problem? u/Plato112358 has a go:
There used to be 1600 people whose job it was to keep Twitter clean. Musk fired all but 25 of them to save money.
Follow the money. Anil Dash did:
Musk is not going to provide information on child abuse content on X/Twitter because he has already publicly committed to ensuring that a paid verified user who disseminates child sexual abuse content will be paid directly by the company. This is also why he tries to distract by amplifying anti-trans hate that tries to blame others for the child abuse he and his cronies directly fund.
Who is this Julie person, anyway? Pardon our exodust [You’re fired—Ed.]:
The Australian eSafety Commissioner, Julie Inman Grant, is a former Twitter employee. I’m sure her condemnation of the platform is completely unbiased and she’s only thinking of the children.
Perhaps she should rethink the penalty? So says u/dblan9:
Money is nothing to Elon. A real punishment would be making him sleep in an Australian house with spiders the size of pumpkins.
Meanwhile, mmerlin wants to see more of this:
If more countries followed suit with a proportional fine, then X might take notice and remove child sexual abuse content from Xitter, make it somewhat less abhorrently Xitty, and just return to its original angry place where people can fling Xit at each other.
And Finally:
You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi, @richij or [email protected]. Ask your doctor before reading. Your mileage may vary. Past performance is no guarantee of future results. Do not stare into laser with remaining eye. E&OE. 30.
Image sauce: DonkeyHotey (cc:by-sa; leveled and cropped)