Despite Elon Musk's claims that child abuse content is being vigorously removed from Twitter, a report has slammed the micro-blogging platform, saying that child sexual abuse material (CSAM) persists on the site.
The New York Times found images of 10 child abuse victims appearing in 150 instances "across multiple accounts" on Twitter.
“Child sexual abuse imagery spreads on Twitter even after the company is notified: One video drew 120,000 views,” the report mentioned.
Meanwhile, the Canadian Centre for Child Protection uncovered 260 of the "most explicit videos" in its database on Twitter, which together garnered more than 174,000 likes and 63,000 retweets.
According to the report, Twitter actually promotes some of the images through its recommendation algorithm.
The platform reportedly took down some of the disturbing content after the Canadian Centre for Child Protection notified the micro-blogging platform.
“The volume of CSAM we’re able to find with a minimal amount of effort is quite significant,” said Lloyd Richardson, the Canadian center’s technology director.
“It shouldn’t be the job of external people to find this sort of content sitting on their system,” Richardson was quoted as saying.
In November last year, Musk cut 15 percent of Twitter's trust and safety (content moderation) staff, saying the reduction wouldn't impact moderation.
Earlier this month, Twitter said it is "proactively and severely limiting the reach" of CSAM and that the platform will work to "remove the content and suspend the bad actor(s) involved."
Musk had earlier said that removing child abuse content was his top priority.