A TikTok moderator has sued the company over mental trauma caused by graphic videos. Moderator Candie Frazier says that she had to watch videos of violence, school shootings, fatal falls, and even cannibalism as part of her job, and that as a result she now suffers from psychological trauma.
According to Engadget (via Bloomberg), TikTok has 10,000 content moderators, and the lawsuit states that these people are exposed to large volumes of disturbing content, including child sexual abuse material and animal mutilation.
“Plaintiff has trouble sleeping, and when she does sleep, she has horrific nightmares,” according to the lawsuit. Frazier is now seeking compensation for psychological injuries and a court order to force TikTok and ByteDance to provide the necessary support to moderators.
TikTok under fire for working conditions of content moderators
Thursday’s complaint, filed in federal court in Los Angeles, reveals notable details about the working conditions of a TikTok moderator. According to the complaint, moderators work 12-hour shifts with only one hour for lunch and two 15-minute breaks.
“Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video and simultaneously view three to ten videos at the same time,” Frazier’s lawyers said in the complaint.
In response, TikTok said it is working “to promote a caring working environment for our employees and contractors.” “Our safety team partners with third party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally,” a company spokesperson said in a statement.
TikTok, along with Facebook and YouTube, has helped develop guidelines for content moderators dealing with child abuse cases. These guidelines include providing psychological support and limiting shifts to four hours. However, the lawsuit alleges that TikTok has done nothing to implement them.
A TikTok moderator has to watch large amounts of disturbing content every day, which can even cause post-traumatic stress disorder (PTSD). Moderators review this content so that users don’t see it in their feeds.
Of course, there have been similar lawsuits against other social media networks. In 2018, a content moderator filed a comparable lawsuit against Facebook, claiming the company did not provide moderators with the support needed to protect them from harmful content. Another case against Facebook was filed just a few months ago in Europe.