Friday, February 23, 2024

X Shares New Data on Efforts to Combat CSAM in the App


In a recent update, X, Elon Musk’s social media platform, presented an extensive report detailing its efforts to address child sexual abuse material (CSAM) on the platform. Amid ongoing concerns about X’s content moderation strategies and its handling of offensive content, the platform has chosen to emphasize its work against CSAM, a topic Elon Musk has prioritized. This analysis delves into the key aspects of X’s report, examining the presented data, assessing the effectiveness of its measures, and addressing broader concerns about content moderation.

Focusing on the Numbers:

  1. Account Suspensions:
    • X asserts a significant increase in account suspensions related to CSAM, with over 11 million accounts permanently suspended from January to November 2023, a stark contrast to the 2.3 million suspensions across all of 2022. However, questions arise about the accuracy of such actions and the potential for wrongful suspensions.
  2. CSAM Incident Reporting:
    • X reports a substantial rise in CSAM incident reporting, sending 430,000 reports to the NCMEC CyberTipline in the first half of 2023. Comparatively, Twitter sent 98,000 reports throughout 2022. The shift to fully automated NCMEC reporting raises concerns about the quality and reliability of these reports without manual review.

Automated Reporting and Discoverability Measures:

  1. Fully Automated NCMEC Reporting:
    • X highlights its transition to fully automated reporting to the NCMEC, streamlining the process but prompting questions about the accuracy and effectiveness of reporting. The surge in reporting numbers necessitates further validation to ensure its impact on CSAM reduction.
  2. Discoverability Reduction:
    • X claims to have implemented proactive measures to reduce the discoverability of posts containing CSAM patterns, resulting in a reported 99% reduction in successful searches for known CSAM patterns since December 2022. Critics argue that the effectiveness of blacklisting specific tags may be limited as offenders adapt to alternative tags.

Challenges and Third-Party Assessments:

  1. Reinstatement of Banned Accounts:
    • X’s decision to reinstate the account of a prominent right-wing influencer, previously banned for sharing CSAM content, raises questions about the consistency and effectiveness of its content moderation practices.


While X’s report emphasizes increased enforcement actions and reporting on CSAM, concerns linger regarding the overall effectiveness of its content moderation strategies. The platform’s commitment to combating CSAM is evident in the presented data, yet challenges persist in reconciling reported numbers with external assessments. As X seeks to rebuild trust and address advertiser concerns, ongoing scrutiny and transparency will be essential in ensuring a safer and more responsible platform.
