Facebook will pay $52M to US content moderators for trauma on the job. What about its international contractors?

A 3D-printed Facebook logo is seen placed on a keyboard in this illustration taken March 25, 2020.

In a landmark decision that could have implications for content moderators around the world, Facebook has agreed to pay $52 million to compensate some US-based workers for the trauma they endured on the job.

According to the agreement announced on Tuesday, Facebook will make payments to more than 10,000 current and former content moderators to settle a class-action lawsuit they brought against the company alleging that they developed post-traumatic stress disorder (PTSD) and other conditions as a result of the work. The settlement was first reported by the technology publication The Verge.

Facebook relies on tens of thousands of content moderators around the world to review often graphic and disturbing posts and determine whether they should be removed from the platform. Many of these workers are contractors, employed by third-party firms that work with Facebook.

This is the first time a social media company will pay workers who say their mental health suffered as a result of exposure to disturbing content, according to lawyers who represented the content moderators in the lawsuit. The new settlement covers only workers based in the US, but the unprecedented move could have an impact on content moderators in other parts of the world.

“What we’ve seen is that after kind of years of holding out their workforce of content moderators at arm’s length, Facebook is at least trying to take some responsibility for the mental health of some of its content moderators,” said Cori Crider, director of Foxglove, a London-based nonprofit that’s assisting with a separate lawsuit launched by content moderators in Europe.

“And I would put it in front of a court pretty much anywhere and say, look, if they take responsibility for [content moderators] in the [United States], they should take responsibility for them in India. They should take responsibility for them in South America. They should take responsibility for them in Europe.

“The last thing you want to see is a kind of half-measure payment to some workers in the United States, but people in even more precarious situations in North Africa, and India and all other places where content moderation is done get nothing and see no improvement in their conditions,” Crider said.
