In short: Facebook may soon be facing a class-action lawsuit led by former content moderator Selena Scola, following the plaintiff's claim that the company doesn't provide any form of mitigative support for those holding the position. The problem reportedly stems from the fact that content moderators are exposed to a wide array of traumatizing videos, images, and sometimes live streams on the site. Those hires often occur through staffing agencies such as co-defendant Pro Unlimited Inc., according to the suit, resulting in a high turnover of contracted workers who don't necessarily have access to care through the agency or Facebook. The plaintiff wants to see Facebook put systems in place to ensure that those who do or might eventually suffer PTSD receive testing for the disorder and treatment where applicable. Scola was formally diagnosed with PTSD after working as a Facebook content moderator for nine months beginning in June of 2017.
Background: The law firm representing Scola, Burns Charest LLP, claims that the content in question often includes thousands of instances of "child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder." The firm also claims that over a million reports of "potentially objectionable" material require moderation every day, while Facebook reported that it serves as many as 2.23 billion active users every month as of Q2 2018. So it's not at all surprising that some of the content shared would go well beyond the level of 'offensiveness' that most people would experience within their lifetime. However, the issue at hand appears to be that this may be a problem that Facebook is either unaware of or simply ignoring. The results of that, the suit claims, range from psychological trauma to symptoms and diagnoses of PTSD - presumably not dissimilar to those experienced by veterans returning from war zones.
Impact: For the time being, Facebook does not appear to have responded to the allegations, and a class-action suit has not been formally filed. Instead, the case is in its very early stages, with the representing firm currently seeking to reclassify the suit as a class action. In the meantime, Facebook is already under scrutiny for its role in the spread of misinformation and over concerns about how it handles its users' private data. Other social media companies, meanwhile, do provide preventative care and other mitigations for trauma experienced by moderators. So this case isn't likely to cast the social media giant in a positive light, regardless of which direction it ultimately goes.