
User-generated content (UGC) spans the full spectrum, from the positive to the negative and the outright harmful. Content moderators are employed to keep the harmful material at bay, and they do this work for a living.

Their job is deceptively powerful: they are the ones who decide what should or should not be removed from a platform, effectively serving as the front line against harmful content online. But you will agree that the job of a content moderator is also difficult and taxing.

“Their days are spent debating whether a video depicting something like a beheading should be given a place on some of the world’s most popular websites. It’s complicated to decide if a piece of content — even if it’s grotesque — is newsworthy enough to stay,” according to Mashable.

Doing this day in, day out can take a real toll on content moderators. And two Microsoft employees say their company, one of the largest in the world, failed to provide them with proper support as their mental health deteriorated and they began showing symptoms of post-traumatic stress disorder (PTSD).


Lawsuit

Two former Microsoft employees have sued the tech company for damages in King County, Washington, saying the disturbing content they had to view for their jobs (child sex abuse photos, murder videos, and other extreme material) caused them to develop PTSD. Mashable reports that “the amount is to be decided during a trial, according to the complaint filed in district court.”

According to the BBC, Henry Soto and Greg Blauert worked for Microsoft’s online safety team, which was responsible for checking images surfaced by software or reported as offensive, and forwarding illegal ones to the US National Center for Missing and Exploited Children.


The Adverse Effect

Blauert had a mental breakdown in 2013, according to their complaint, while Soto said he suffered panic attacks, depression and hallucinations, and had trouble spending time with children, including his son. The men said Microsoft minimized their complaints, suggesting they take walks or play video games to take their minds off the work, and told them that if they wanted to transfer, they would have to apply for new jobs.


Microsoft’s response


Quartz reports that Microsoft has rejected Soto and Blauert’s claims. The company said it acknowledges the difficulty of the job, and that the employees had access to mental-health professionals and an individual wellness plan to “ensure those who handle this material have the resources and support they need.”

This is the full statement Microsoft provided Mashable:

We disagree with the plaintiffs’ claims. Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work. 

Microsoft applies industry-leading technology to help detect and classify illegal imagery of child abuse and exploitation that are shared by users on Microsoft Services. Once verified by a specially trained employee, the company removes the imagery, reports it to the National Center for Missing & Exploited Children, and bans the users who shared the imagery from our services. 

This work is difficult but critically important to a safer and more trusted internet. The health and safety of our employees who do this difficult work are a top priority. Microsoft works with the input of our employees, mental health professionals, and the latest research on robust wellness and resilience programs to ensure those who handle this material have the resources and support they need, including an individual wellness plan. We view it as a process, always learning and applying the newest research about what we can do to help support our employees even more.
