Denver, Colo., Jun 5, 2023 / 4:00 pm
The social media website Twitter has apparently failed to block images of child sexual abuse, with researchers detecting several dozen known images of illegal pornographic material on the platform from March through May.
Though Twitter appeared to correct the problem, it has imposed new fees on the application programming interface (API) that researchers use to monitor the platform’s ability to block child pornography, the Wall Street Journal reported.
The Wall Street Journal’s report was based on research by the Stanford Internet Observatory, which studied child protection issues across multiple social media platforms. The researchers used a computer program to analyze a data set of about 100,000 tweets from March 12 to May 20 and found more than 40 images on Twitter flagged as CSAM (child sexual abuse material) in databases that companies use to screen content.
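The screening the researchers tested works by matching image fingerprints against databases of known material. A minimal sketch of the idea, assuming exact cryptographic hashes and a hypothetical `KNOWN_BAD_HASHES` set for simplicity (production systems such as Microsoft's PhotoDNA use perceptual hashes, which survive resizing and re-encoding, rather than plain SHA-256):

```python
import hashlib

# Hypothetical stand-in for an industry hash database of known illegal images.
# The entry below is simply sha256(b"foo"), used as placeholder data.
KNOWN_BAD_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def is_flagged(image_bytes: bytes, known_hashes: set) -> bool:
    """Return True if the image's SHA-256 digest appears in the known-bad set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes
```

A platform applying this check at upload time would block any file whose digest matches the database; the Stanford study suggests this kind of matching was not functioning on Twitter during the period examined.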
“This is one of the most basic things you can do to prevent CSAM online, and it did not seem to be working,” David Thiel, chief technologist at the Stanford Internet Observatory and report co-author, told the Wall Street Journal.