Over 200 Facebook content moderators have said that the social media giant is forcing them to return to the office during the ongoing COVID-19 pandemic after the company's attempt to rely on automated systems "failed".
The workers wrote an open letter to Facebook CEO Mark Zuckerberg, Chief Operating Officer Sheryl Sandberg, and the heads of Accenture and CPL, two companies to which Facebook subcontracts content moderation, TechCrunch reported.
"Without our work, Facebook is unusable," the letter read. "Your algorithms cannot spot satire. They cannot sift journalism from disinformation. They cannot respond quickly enough to self-harm or child abuse. We can."
In August, Facebook told its employees that they could continue to work from home until July 2021.
The letter accused the social media giant of putting content moderators' lives at risk by asking them to return to offices.
In the letter, the workers asked the company and its outsourcing partners to improve safety and working conditions. Their demands included hazard pay for moderators who must return to the office, along with better health care and mental health support.
They also said the company should "maximise" the amount of work that can be performed from home.
Most content moderators at Facebook are not its employees but contractors who work for third parties, including Accenture and CPL.
Facebook spokesman Drew Pusateri said in a statement that "Facebook has exceeded health guidance on keeping facilities safe for any in-office work."
"We appreciate the valuable work content reviewers do and we prioritize their health and safety," he said. Moderators have access to health care and "confidential wellbeing resources" from their first day on the job, he added.
Accenture said in a statement: "We are gradually inviting our people to return to offices, but only where there is a critical need to do so and only when we are comfortable that we have put the right safety measures in place, following local ordinances. These include vastly reduced building occupancy, extensive social distancing and masks, daily office cleaning, individual transportation and other measures."
Facebook had earlier said that it would increase its use of automated systems to flag violating content.
The content moderators also described the shortcomings of those automated systems. "Important speech got swept into the maw of the Facebook filter — and risky content, like self-harm, stayed up," they wrote. "Facebook's algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there."
"Facebook needs us," the letter read. "It is time that you acknowledged this and valued our work."