A new investigation reveals that child sexual abuse material (CSAM) is being openly marketed and sold on mainstream social media platforms. Sometimes, AI systems inadvertently help the predators.
The Centre for Analytics and Behavioural Change (CABC) released a disturbing report, Mapping Child Predation Online, which exposes how child sexual abuse material is easily found, distributed, and even traded for money across popular social media and messaging platforms in South Africa.
The report is a follow-up to a previous study, The State of Child Predation in South Africa, which first warned of the growing scale of online child abuse. The CABC found that the sexualisation of minors is rife on the social media platform X (formerly Twitter), although counter-narratives also existed in the conversation.
The most troubling issue is the spread of CSAM content through a network of accounts on X. Many posts included hashtags or keywords like #ama2k or ama2000, which indicate that the subjects were born after 2000.
In this latest investigation, researchers tracked accounts and conversations, uncovering networks that openly share, distribute, and sell content that harms young lives. In some cases, this material is bundled into “menus” with price tags, and access to “premium” groups containing severe abuse is offered for cash.
“What we uncovered is not hidden in the dark web – it’s happening in plain sight,” said the report. “Child pornographic material is being advertised, exchanged, and sold for cash in spaces that anyone, including children, can access.”
Using detailed social media analysis, the CABC tracked accounts and conversations across platforms including X, Telegram, Facebook, and WhatsApp. The analysis found about 500 active accounts on X using hashtags and keywords linked to CSAM and “fantasy CSAM.” Many of these accounts referred to one another, suggesting they operate as a coordinated network. Alarmingly, X’s own AI tool, Grok, recommended accounts linked to explicit CSAM when users searched for similar profiles, unintentionally promoting harmful content.
On Telegram, researchers discovered entire groups acting as digital marketplaces for child abuse. Users shared “samples” of CSAM to attract buyers, while premium access was sold for cash.
According to the report, many of these groups have a short lifespan, particularly those dedicated to CSAM. But as soon as one group is shut down, another is created to replace it, often with stricter access controls and less public visibility. Several groups were taken down during the investigation, only to resurface quickly with tighter security settings to avoid detection.
WhatsApp numbers were also circulated for direct material exchange.
“Suspending a few groups is like putting a plaster over a gunshot wound,” said the report. “These networks regenerate overnight. Platforms need to take full accountability for the proliferation of material that harms children.”
The normalisation of this crime goes even further. Researchers found accounts advertising “Black Friday specials” on abusive content and seeking “collaborations”, treating the exploitation of children as a casual business venture.
“Platforms are not just failing to stop this – in some cases, their own AI systems are helping predators find each other,” said Kyle Janse, a researcher at the CABC.
The CABC urged social media platforms and stakeholders to adopt stronger preventive measures, shifting from reactive moderation to proactive prevention and implementing systemic safeguards to dismantle these networks. It also called on civil society to demand greater transparency through independent third-party audits and detailed reporting on how CSAM is identified, moderated, and handled, especially in indigenous languages. The report further emphasised the need for collaboration between social media companies and civil society to strengthen moderation, reporting systems, and accountability.
The CABC recommended a blanket ban on all explicit content on social media platforms, arguing that such content can proliferate in ways that harm children and create avenues for their exploitation for monetary or other illegal purposes.
“Every day this material circulates, children are being harmed, retraumatised, and commodified,” Janse said. “Without immediate systemic change, South Africa’s children will remain for sale to the highest bidder online.”