General Statistics on Child Sexual Abuse Material (CSAM) and Predators on Social Media
Prevalence of CSAM on Social Media Platforms:

- In 2023, the National Center for Missing & Exploited Children (NCMEC) received over 36.2 million reports of suspected child sexual exploitation, a 12% increase from the previous year, containing over 100 million files (images, videos, and other content). Approximately 85% of these reports originated from Meta-owned platforms, primarily Facebook and Instagram.
- Meta flagged nearly 72 million pieces of content for child nudity and sexual exploitation in 2023: Facebook reported 56.5 million pieces (a 44% decrease from 2022) and Instagram 15.4 million (a 6% decrease from 2022). Of these, 17.8 million (Facebook) and 11.4 million (Instagram) were referred to NCMEC.
- TikTok reported a 25% drop in content removals for CSAM from 2022 to 2023 (125.6 million pieces flagged in 2023 compared to 168.5 million in 2022), though changes in reporting methods make direct comparisons challenging.

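The year-over-year deltas quoted throughout this report are plain percentage changes. As a sanity check, the short Python sketch below reproduces two of them from the raw counts given in this document; the pct_change helper is illustrative, not taken from any cited source.

```python
def pct_change(old: float, new: float) -> float:
    """Year-over-year percentage change; negative values are decreases."""
    return (new - old) / old * 100.0

# TikTok removals (millions), per the figures above:
print(f"{pct_change(168.5, 125.6):+.1f}%")  # -25.5%, the ~25% drop cited
# UK online grooming offenses, 2017/18 vs 2022/23 (cited in the next section):
print(f"{pct_change(3492, 6350):+.1f}%")    # +81.8%, the ~82% increase cited
```
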
Child Predators and Online Grooming:

- An estimated 82% of child sex crimes begin with contact on social media platforms.
- In 2022/23, UK police recorded 6,350 online grooming offenses against children, an 82% increase from 3,492 in 2017/18. Of these, 73% involved Snapchat or Meta-owned platforms (Snapchat: 44%, Instagram: 12%, Facebook: 9%, WhatsApp: 5%).
- Approximately 89% of sexual solicitations of children occur in chatrooms or through instant messaging on social media platforms.
- An estimated 500,000 online predators are active each day, with children aged 12–15 particularly vulnerable to grooming and manipulation; over 50% of online sexual exploitation victims fall in this age group.
- In a 2023 survey, 15.6% of individuals reported experiencing online child sexual abuse before age 18, including 5.4% who reported grooming by adults on social media or other platforms. Vulnerability is highest between ages 13 and 17.

Specific Platform Risks:

- Snapchat: Frequently cited for high-risk activity, including nonconsensual sharing of images and grooming. A 2023 study found Snapchat was used in 44% of online grooming cases reported to UK police. Its disappearing-content feature is also associated with sextortion and the distribution of self-produced CSAM.
- Instagram: In 2023, researchers found that nearly 2 million minor accounts were recommended to adults targeting children. Instagram was used in 12% of grooming cases, and concerns persist about inadequate protections against hate speech and explicit content.
- Facebook: Despite a "zero tolerance" policy, Facebook reported 56.5 million pieces of CSAM-related content in 2023. It remains a common platform for predators to trade CSAM and contact minors, with 38% of recent cyber grooming cases involving Meta platforms (Facebook, Instagram, WhatsApp).
- Other Platforms: Discord, YouTube, and gaming services (e.g., Fortnite, Roblox) are increasingly used by predators because of unmoderated or lightly moderated chat features. Fortnite's chat functionality, for example, has been flagged as a risk for connecting predators with young users.

Nature of Online Exploitation:

- About 1 in 5 youth experience unwanted exposure to sexually explicit material online, and 1 in 9 face sexual solicitation. Social media platforms are a primary vector for this exposure, often through comments, direct messages, or algorithmic recommendations.
- Sextortion is a growing problem, with 3.5% of youth reporting experiences in which perpetrators threatened to share explicit images. It most often occurs on platforms such as Snapchat and Instagram.
- Self-produced CSAM, in which minors create and share explicit images (often under coercion), has a reported prevalence of 7.2%. It is frequently facilitated through social media platforms and private messaging apps.
- In 2023, NCMEC received 4,700 reports of CSAM or exploitative content involving generative AI, 70% of which originated from traditional social media platforms rather than AI-specific platforms.

Demographics and Impact:

- Girls are disproportionately affected, making up 83% of known online grooming victims where gender is reported.
- Perpetrators are predominantly male (96–98%). Although most victims know their abuser in some capacity (93% of cases), the relationship is typically confined to the internet: an estimated 98% of online predators have never met their victims in person.
- The distribution of CSAM causes lasting harm: 67% of survivors report that the ongoing circulation of their images inflicts a distinct injury beyond that of the original abuse.
- In 2021, there was a threefold increase in CSAM involving 7–10-year-olds who were groomed or tricked into performing sexual acts online, indicating that younger children are increasingly targeted.

Law Enforcement and Reporting:

- In 2023, only 245 electronic service providers (ESPs) submitted CyberTipline reports to NCMEC, with five companies (likely including Meta, Google, and others) accounting for over 91% of reports.
- The Australian Centre to Counter Child Exploitation (ACCCE) recorded 58,503 reports of online child sexual exploitation in 2023–24, an average of 160 per day.
- Operation iGuardian (2013) and Operation Laminar (2010–2011) identified predators using social media platforms such as Facebook and Socialgo to trade CSAM, resulting in 255 and 55 arrests, respectively, across multiple countries.
- U.S. law requires platforms to report CSAM they discover but not to proactively search for it; many rely on user reports or automated detection, which can miss significant amounts of harmful content.

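The automated detection mentioned above typically means hashing uploaded files and matching them against databases of known material, such as NCMEC's hash lists or perceptual systems like Microsoft's PhotoDNA. The Python sketch below shows only the general shape of the simplest variant, exact cryptographic hash matching; the hash set here is hypothetical, and production systems use perceptual hashes so that resized or re-encoded copies still match.

```python
import hashlib

# Hypothetical hash list, as might be supplied by a clearinghouse.
# Real deployments use perceptual hashes (e.g., PhotoDNA, PDQ), not
# cryptographic ones, so near-duplicate copies still match.
KNOWN_HASHES: set[str] = set()  # populated from a vetted hash list

def matches_known_content(file_bytes: bytes) -> bool:
    """Exact-match check: flags a file only if it is byte-identical to a
    known item. A single changed byte yields a different digest, which is
    one reason exact matching alone misses so much recirculated content."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES
```

The limitation is visible in the code: exact matching catches only verbatim recirculation, so novel or edited material requires perceptual hashing, classifiers, or human review, none of which operate on end-to-end encrypted content.
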
Relevance to SafeNet Guardians' Vision

- High Volume of CSAM: The sheer scale of CSAM (over 100 million files reported in 2023) highlights the need for robust monitoring and reporting systems across platforms.
- Platform Accountability: Major platforms like Meta and Snapchat are central to both CSAM distribution and grooming, yet their moderation efforts are inconsistent or insufficient, suggesting a need for stronger regulations and proactive safety measures.
- Grooming and Sextortion Risks: The prevalence of grooming (especially on Snapchat and Instagram) and emerging threats such as sextortion and AI-generated CSAM call for education campaigns and technological interventions to protect vulnerable youth.
- Younger Victims: The increasing targeting of younger children (7–10 years old) underscores the urgency of early intervention and parental awareness programs.
- Underreporting and Detection Gaps: Limited reporting by some platforms and inadequate legislation in parts of the world (e.g., Iraq and Bangladesh) indicate a need for international cooperation and advocacy for stricter laws.

Specific Data on CSAM and Child Predators
Platform-Specific Breakdowns (2023–2025)
The following data highlights the prevalence of CSAM and online grooming across major social media platforms, based on reports from 2023 and 2024, with some insights into 2025 trends where available:
Meta Platforms (Facebook, Instagram, WhatsApp):
Facebook: In 2023, Facebook reported 56.5 million pieces of content flagged for child nudity and sexual exploitation, with 17.8 million referred to the National Center for Missing & Exploited Children (NCMEC). This was a 44% decrease from 2022, attributed partly to improved detection and reporting efficiencies. In 2024, NCMEC's overall CyberTipline volume fell 43% (from 36.2 million to 20.5 million reports), driven partly by Meta's completion of end-to-end encryption (E2EE) on Facebook Messenger by summer 2024 and partly by a new feature that bundles related incidents into one report; adjusted for bundling, the 2024 reports still covered 29.2 million distinct incidents (see the sketch at the end of this Meta entry). Approximately 85% of NCMEC's 2023 CSAM reports originated from Meta platforms, primarily Facebook.
Instagram: In 2023, Instagram flagged 15.4 million pieces of content for CSAM, with 11.4 million referred to NCMEC (a 6% decrease from 2022). A 2023 Stanford University study noted Instagram’s algorithm aggressively recommended minor accounts to adults targeting children, with nearly 2 million minor accounts suggested to potential predators. Instagram was involved in 12% of UK grooming cases in 2022/23.
WhatsApp: Implicated in 5% of UK grooming cases in 2022/23, WhatsApp’s E2EE makes it a preferred platform for predators to share CSAM and groom children, as detection is challenging.
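Because 2024 CyberTipline reports can bundle multiple related incidents, raw report counts are not directly comparable to earlier years. Below is a minimal arithmetic sketch of the adjustment, using only the figures quoted in this section and assuming, as NCMEC's framing implies, that pre-2024 reports mapped roughly one-to-one to incidents.

```python
reports_2023 = 36_200_000
reports_2024 = 20_500_000
incidents_2024 = 29_200_000  # bundling-adjusted count, per NCMEC

# Raw report counts overstate the decline once bundling is accounted for.
print(f"Raw report drop:      {(reports_2024 - reports_2023) / reports_2023:+.0%}")   # -43%
print(f"Incident-level drop:  {(incidents_2024 - reports_2023) / reports_2023:+.0%}") # -19%
print(f"Incidents per report: {incidents_2024 / reports_2024:.2f}")                   # ~1.42
```
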
Snapchat:
Snapchat was the platform most frequently cited in UK grooming cases, involved in 44% of the 6,350 online grooming offenses recorded in 2022/23. Its disappearing-content feature facilitates sextortion and self-generated CSAM, with 4,312 mentions in UK reports. Snapchat also reported a significant volume of CSAM in 2023, though its published figures are less detailed than Meta's.
TikTok:
TikTok flagged 125.6 million pieces of content for CSAM in 2023, a 25% decrease from 168.5 million in 2022, attributed to changes in reporting methodologies. It was involved in 4% of UK grooming cases (427 mentions). TikTok’s short-form video format is increasingly exploited for sharing CSAM and grooming via comments or direct messages.
X (formerly Twitter):
In 2021, X reported 86,666 CSAM incidents to NCMEC, significantly lower than Meta platforms. In 2023, it was cited in 3% of UK grooming cases (294 mentions). In 2024, X was among platforms reporting 20% fewer incidents to NCMEC compared to 2023, possibly due to reduced proactive detection.
YouTube:
YouTube was involved in 3% of UK grooming cases (330 mentions) in 2022/23. Its unmoderated comment sections and live-streaming features pose risks for grooming and CSAM distribution.
Other Platforms:
Discord: Noted for unmoderated chat features, Discord reported 20% fewer CSAM incidents in 2024 compared to 2023, reflecting inconsistent reporting. It’s a growing concern for grooming in gaming communities.
Kik: Cited in 5% of UK grooming cases (494 mentions), Kik’s anonymity features make it a hotspot for predators.
Gaming Platforms (e.g., Fortnite, Roblox): The WeProtect Global Threat Assessment 2023 noted that grooming in social gaming environments can escalate in as little as 19 seconds, with an average of 45 minutes for high-risk situations. These platforms are increasingly exploited due to unmoderated chat features.
Regional Statistics (2023–2025)
United States:
In 2023, NCMEC received 36.2 million CyberTipline reports, with over 100 million files, and in 2024, 20.5 million reports (29.2 million incidents when adjusted for bundling). Over 1.1 million reports in 2023 were referred to U.S. law enforcement, with California, Texas, and Florida reporting the highest volumes. Financial sextortion primarily affects boys aged 14–17, with 90% of victims in NCMEC reports being male.
The Department of Homeland Security’s Know2Protect campaign (launched 2024) reported a 360% increase in CSAM reports over the past decade, with 63 million files in 2024.
United Kingdom:
In 2022/23, UK police recorded 6,350 online grooming offenses, an 82% increase from 3,492 in 2017/18. The Internet Watch Foundation (IWF) identified 291,273 webpages containing CSAM in 2024, a 6% increase from 2023, with 91% being self-generated imagery. Victims are primarily aged 13–14, with a rising trend among 7–10-year-olds.
Australia:
The Australian Centre to Counter Child Exploitation (ACCCE) reported 58,503 cases of online child sexual exploitation in 2023–24, averaging 160 daily reports. Self-generated CSAM is a growing concern, often coerced through grooming on social media.
Eastern and Southern Africa/Southeast Asia:
The 2023 WeProtect Global Threat Assessment found that up to 20% of children in 13 countries in these regions experienced online sexual exploitation in the past year. In the Philippines, 1 in 100 children were victims of trafficking for online sexual exploitation in 2022.
Latin America and Sub-Saharan Africa:
Many children in these regions access the internet via personal mobile devices, increasing exposure to predators. Limited connectivity to INTERPOL’s International Child Sexual Exploitation database in low- and middle-income countries hampers enforcement.
Trends Over Time (2023–2025)
Increase in CSAM Reports: From 2019 to 2023, NCMEC reports more than doubled, from 16.8 million to 36.2 million, with over 100 million files reported in 2023. In 2024, reports dropped to 20.5 million (29.2 million incidents), partly due to Meta's bundling feature and E2EE adoption, but online enticement reports surged by 192% (546,000 in 2024 vs. 186,819 in 2023); the sketch after this list reproduces these growth figures.
Rise in Online Enticement: Reports of online enticement, including sextortion, rose more than 300% from 44,155 in 2021 to 186,819 in 2023, and then to roughly 546,000 in 2024, a surge amplified by the REPORT Act's new mandate to report enticement and trafficking.
Generative AI and CSAM: AI-generated CSAM (AIG-CSAM) reports surged by 1,325% from 4,700 in 2023 to 67,000 in 2024, with 70% originating from traditional social media platforms. This reflects the misuse of AI tools to create exploitative content.
Self-Generated CSAM: The IWF reported a 6% increase in CSAM webpages from 2023 to 2024, with 91% classified as self-generated, often coerced through grooming or sextortion. This trend is particularly pronounced among 7–10-year-olds.
Financial Sextortion: This form of exploitation, targeting mostly boys aged 14–17, has risen, with 90% of NCMEC-reported victims being male. Social media platforms like Snapchat and Instagram are common vectors.
Global Enforcement Challenges: Over 90% of 2023 CyberTipline reports involved content uploaded outside the U.S., highlighting the need for international cooperation. The adoption of E2EE by platforms like Meta reduces detection capabilities, potentially lowering reported incidents in 2024.
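
As a closing check, the growth rates in this trends list follow directly from the raw counts quoted above. A brief sketch (all figures copied from this section; the growth helper is illustrative):

```python
def growth(old: int, new: int) -> str:
    """Percentage change from old to new, formatted with a sign."""
    return f"{(new - old) / old:+.0%}"

# Online enticement reports (NCMEC CyberTipline):
print(growth(44_155, 186_819))         # +323%  -> the "more than 300%" rise, 2021-2023
print(growth(186_819, 546_000))        # +192%  -> the 2023-2024 surge
# AI-generated CSAM reports:
print(growth(4_700, 67_000))           # +1326% -> the ~1,325% surge, 2023-2024
# Total CyberTipline reports:
print(growth(16_800_000, 36_200_000))  # +115%  -> reports more than doubled, 2019-2023
```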