
A Forbes report raises questions about how TikTok's moderation team handles child sexual abuse material (CSAM), claiming it granted broad, insecure access to illegal photos and videos.

Employees of Teleperformance, a third-party moderation company that works with companies including TikTok, say they were asked to review a disturbing spreadsheet known as the DRR, or Daily Required Reading, covering TikTok's moderation standards. The spreadsheet allegedly contained content that violated TikTok's guidelines, including "hundreds of images" of naked or abused children. Employees said hundreds of people at TikTok and Teleperformance could access the content from both inside and outside the office, opening the door to wider leaks.

Teleperformance denied to Forbes that it showed employees sexually exploitative content, and TikTok said its training materials have "strict access controls and do not include visual examples of CSAM," though it did not confirm that all third-party vendors met that standard.

Employees tell a different story, and as Forbes lays out, it is legally murky territory. Content moderators routinely have to deal with CSAM posted on many social media platforms, but child abuse imagery is illegal in the United States and must be handled with care. Companies are supposed to report the content to the National Center for Missing and Exploited Children (NCMEC), then preserve it for 90 days while minimizing the number of people who see it.

The allegations here go far beyond that limit. They suggest Teleperformance showed employees graphic photos and videos as examples of what to tag on TikTok, while playing fast and loose with access to that content. One employee said she contacted the FBI to ask whether the practice constituted criminal distribution of CSAM, though it is unclear whether the bureau opened an investigation.

The full Forbes report is well worth reading. It outlines how moderators were unable to keep up with TikTok's explosive growth and were told to view images of crimes against children for reasons they found unconscionable. Even by the standards of the complicated debate over child safety online, it is a strange situation, and a horrifying one if accurate.
