Giving Compass' Take:
- Glen Pounder and Rasty Turek explain how social media platforms have become a pipeline for the rapid spread of child sexual abuse material.
- What can we learn about effective ways to halt the distribution of child sexual abuse material? What resources can you deploy to address this problem?
- Read about how one nonprofit is combating online child sexual abuse and exploitation.
Many of the sites and platforms that have done so much to democratize free expression around the world have also unfortunately spurred a rise in harmful and illegal content online, including child sexual abuse material (CSAM).
The internet did not create CSAM, but it has provided offenders with increased opportunity to access, possess, and trade child sexual abuse images and videos, often anonymously, and at scale. The National Center for Missing & Exploited Children (NCMEC) has seen a 15,000% increase in abuse files reported in the last 15 years. At the same time, a report from Facebook to NCMEC in fall 2020 found that only six videos accounted for more than half of reported CSAM content across Facebook and Instagram—indicating that vast networks of individuals are relentlessly sharing pre-existing CSAM.
While these criminal transactions were once confined to the darkest reaches of the web, social media platforms have unwittingly become an efficient distribution pipeline. As a result, platforms and law enforcement agencies have struggled to contain the seemingly endless streams of CSAM. Google, Dropbox, Microsoft, Snapchat, TikTok, Twitter, and Verizon Media reported over 900,000 instances of CSAM on their platforms, while Facebook reported that it removed nearly 5.4 million pieces of content related to child sexual abuse in the fourth quarter of 2020.
Facebook noted that more than 90% of the reported CSAM content on its platforms was the “same as or visibly similar to previously reported content,” which is the crux of the problem. Once a piece of CSAM content is uploaded, it spreads like wildfire, with each subsequent incident requiring its own report and its own individual action by authorities and platforms.
Read the full article about social media and child sexual abuse material by Glen Pounder and Rasty Turek at Fast Company.