
Friday, 24 December 2021

TikTok moderator sues over mental trauma caused by graphic videos

A TikTok moderator has sued the social media platform and its parent ByteDance over trauma caused by graphic videos, Bloomberg has reported. In a proposed class-action lawsuit, moderator Candie Frazier said that she has screened videos showing violence, school shootings, fatal falls and even cannibalism. "Plaintiff has trouble sleeping and when she does sleep, she has horrific nightmares," the lawsuit states.

Compounding the problem, TikTok allegedly requires moderators to work 12-hour shifts with only a one-hour lunch and two 15-minute breaks. "Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time," according to the complaint. 


Along with other social media companies including Facebook and YouTube, TikTok developed guidelines to help moderators cope with child abuse and other traumatic images. Among the suggestions is that companies limit moderator shifts to four hours and provide psychological support. However, TikTok allegedly failed to implement those guidelines, according to the lawsuit.

Content moderators bear the brunt of the graphic and traumatic images that appear on social media, ensuring that users don't have to see them. One company that provides content moderators for large tech firms even acknowledged in a consent form that the job can cause post-traumatic stress disorder (PTSD). However, social media companies have been criticized by their moderators and others for not paying enough given the psychological hazards, and for not providing adequate mental health support. A similar lawsuit was filed against Facebook in 2018.

Frazier is hoping to represent other TikTok screeners in a class-action suit, and is asking for compensation for psychological injuries and a court order establishing a medical fund for moderators.



Source: Engadget, a web magazine with obsessive daily coverage of everything new in gadgets and consumer electronics: https://ift.tt/3yVjbBa
