
Elon Musk’s Twitter is working on removing child sexual abuse material at scale with “no mercy” for abusers

Elon Musk’s Twitter is working on removing child sexual abuse material (CSAM) at scale with “no mercy for those who are involved in these illegal activities.” Andrea Stroppa shared a thread on Twitter with updates on how the platform has moved from being lenient toward the child abuse problem to tackling it head-on.

Stroppa spearheaded the research team at Ghost Data that found over 500 accounts openly sharing the illegal material over a 20-day period in September. You can view the full report here. In his thread, Stroppa noted that over the past few weeks he had worked as an independent researcher alongside Twitter’s Trust and Safety team, led by Ella Irwin. “Twitter achieved some relevant results I want to share with you,” Stroppa tweeted.

Stroppa noted that Twitter updated its mechanism to detect content related to CSAM and that it is faster, more efficient, and more aggressive. “No mercy for those who are involved in these illegal activities.”

Over the past few days, Twitter’s daily suspension rate has almost doubled, which suggests the platform is conducting a fine-grained analysis of content. “It doesn’t matter when illicit content has been published. Twitter will find it and act accordingly.”

Stroppa pointed out that within the past 24 hours, Twitter had stepped up its efforts and taken down 44,000 suspicious accounts, including more than 1,300 profiles that tried to evade detection by using codewords and text embedded in images to communicate.

He added that Twitter is aware of strategies, keywords, external URLs, and communication methods used by these accounts. “To increase its ability to protect children’s safety, Twitter involved independent and expert third parties.”

Stroppa added that Twitter is focusing its efforts on networks of Spanish-speaking and Portuguese-speaking users that share CSAM. “Twitter continues to have teams in place dedicated to investigating and taking action on these types of violations daily. Teams are more determined than ever and composed of passionate experts. Furthermore, Twitter simplified the process of users reporting illicit content.”

In a statement to Teslarati, Stroppa said, “If these good things are happening, it’s because Elon really cares about children’s safety. With Elon, we share the idea of the light of consciousness. This light goes through millions of people and improves a bit of the world.”

Eliza Bleu, who has been pushing Twitter to protect children since before Elon Musk purchased the platform, previously emphasized that the content needed to be removed “at scale.” In August, The Verge found that Twitter was unable to detect CSAM at scale.

“Twitter cannot accurately detect child sexual exploitation and non-consensual nudity at scale,” concluded the company’s internal Red Team, which had been assembled “to pressure-test the decision to allow adult creators to monetize on the platform by specifically focusing on what it would look like for Twitter to do this safely and responsibly.”

In her own thread, Eliza Bleu said that she never thought she would be able to tweet this, but “Twitter is currently working on detecting, removing, and reporting child sexual abuse material at scale.”

She added that the issue will take time to clean up, but the rapid changes are “just beautiful to see.”

On Saturday, Bleu told Teslarati, “While the corporate media was fear-mongering and spreading baseless conspiracy theories about Musk’s inability to tackle child sexual exploitation on Twitter with an alleged ‘skeleton crew,’ the platform was actually busy making amazing progress towards protecting sexually exploited children.”

“I’m extremely grateful to see the progress and the changes made under Elon Musk. He has accomplished in a month what the platform could not seem to do over the past decade about the issue of child sexual abuse material. The only time the platform previously made this much progress is when they implemented PhotoDNA.”

The technology Bleu is referring to was created when Microsoft partnered with Dartmouth College in 2009. PhotoDNA aids organizations in finding and removing known images of child exploitation. Bleu also called out advertisers that left the platform citing Elon Musk as the reason, yet had stayed silent about Twitter’s slowness and, at times, refusal to remove CSAM.

Your feedback is welcome. If you have any comments or concerns or see a typo, you can email me at johnna@teslarati.com. You can also reach me on Twitter at @JohnnaCrider1.

Teslarati is now on TikTok. Follow us for interactive news & more. You can also follow Teslarati on LinkedIn, Twitter, Instagram, and Facebook.
