August 16, 2022


A Forbes report raises questions about how TikTok's moderation team handles child sexual abuse material (CSAM), alleging that it granted broad, insecure access to illegal photos and videos.

Employees of a third-party moderation firm called Teleperformance, which works with TikTok among other companies, allege that it asked them to review a disturbing spreadsheet called DRR, or Daily Required Reading, on TikTok's moderation standards. The spreadsheet allegedly contained content that violated TikTok's guidelines, including "hundreds of images" of children who were nude or being abused. Employees say hundreds of people at TikTok and Teleperformance could access the content from both inside and outside the office, opening the door to a broader leak.

Teleperformance denied to Forbes that it had shown employees sexually exploitative content, and TikTok said its training materials have "strict access controls and do not include visual examples of CSAM," although it did not confirm that all third-party vendors meet that standard.

The employees tell a different story, and as Forbes explains, it is a legally fraught one. Content moderators are routinely forced to deal with CSAM posted on many social media platforms. But child abuse imagery is illegal in the United States and must be handled carefully: companies are supposed to report the content to the National Center for Missing and Exploited Children (NCMEC), then preserve it for 90 days while minimizing the number of people who see it.

The allegations here go well beyond that. They indicate that Teleperformance showed employees graphic images and videos as examples of what to tag on TikTok, while playing fast and loose with access to that content. One employee says she contacted the FBI to ask whether the practice constituted criminal distribution of CSAM, although it is unclear whether an investigation was opened.

The full Forbes report is worth a read. It describes a situation in which moderators were unable to keep up with TikTok's explosive growth and were asked to watch crimes against children for reasons they felt did not add up. Even by the complicated standards of debates about child safety online, the situation is bizarre, and, if accurate, horrifying.



