In a nutshell: A TikTok moderator has sued the platform and parent company ByteDance for refusing to support workers who encounter graphic content, including child pornography, beheadings and school shootings.
Bloomberg reports that in her proposed class-action lawsuit, content moderator Candy Fraser says she was required to view videos of cannibalism, beheadings, suicide, and a fatal fall from a building.
The complaint says that while TikTok was one of several social media companies to adopt guidelines, such as offering psychological support and limiting shifts to four hours, meant to help moderators cope with exposure to child pornography, the company failed to implement them.
The lawsuit also states that TikTok moderators work 12-hour shifts with a one-hour lunch break and two 15-minute breaks, and that they must review hundreds of videos every day. Fraser's lawyers say employees are allowed no more than 25 seconds per video and often watch three to ten videos simultaneously.
Las Vegas resident Fraser says she has developed PTSD as a result of the graphic videos. According to the complaint, she also has trouble sleeping and suffers from horrific nightmares.
Fraser is seeking compensation for psychological trauma and a court order requiring the company to create a medical fund for the moderators.
Moderators suing companies for allegedly causing them PTSD is nothing new. In 2018, a content moderator employed by Facebook contractor Pro Unlimited sued the social network after "constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace" caused her to develop PTSD. There was also the case of a YouTube moderator who sued the Google-owned company in 2020 after watching thousands of disturbing videos and developing symptoms of post-traumatic stress disorder and depression.