TikTok moderator: the worst job in IT? Former employees sue the company

According to testimony from some of the approximately 10,000 moderators hired by parent company ByteDance to review potentially offensive content posted on TikTok, the job involves watching an endless stream of horrors for up to 12 hours a day, including scenes of extreme violence such as murder, rape, beheading and cannibalism.

ByteDance, the company behind TikTok, is being sued by former moderators who allege psychological trauma resulting from repeated exposure to scenes depicting “explicit acts of extreme violence”. The former moderators point in particular to the lack of psychological counseling and to pressure from the employer to work “continuously”, in shifts of up to 12 hours, watching an almost uninterrupted stream of emotionally harrowing footage.

The success of TikTok, which last year attracted more visitors than even Google, has a dark side that only the moderation teams really get to see. More than a few of the videos uploaded to TikTok contain scenes so extreme that the employees who must review them, whether in response to user reports or to flags raised by the platform’s automated algorithms, end up with severe psychological trauma, comparable to that suffered by crime victims and soldiers who have spent long periods in conflict zones.


According to the allegations filed by Ashley Velez and Reece Young, TikTok moderators encounter scenes of child sexual abuse, beatings of children, shootings and other situations in which it is virtually impossible to remain emotionally indifferent. Moderators are nonetheless required to watch such scenes as part of a continuous stream of clips submitted for review, with only a few seconds to determine whether each piece of material violates the platform’s posting rules before moving immediately to the next clip.

As a result of this work, “the plaintiff has trouble sleeping and, when she does sleep, she has horrific nightmares”, the lawsuit states.


Mimicking the strategy already adopted by other social media companies (Facebook and YouTube) that have faced similar problems among their moderation teams, TikTok has published a set of recommendations to help moderators cope with potentially traumatic work. The suggestions include limiting moderators’ shifts to four hours and offering free psychological support from qualified counselors. But these remain mere recommendations, which, the plaintiffs allege, neither ByteDance nor the partner firms contracted to moderate content on TikTok actually follow.
