Fewer pieces of content depicting suicide, child nudity and exploitation were removed while staff were forced to work from home
Facebook struggled to remove content that promoted suicide or exploited children because of the Covid pandemic, the company has admitted, after global lockdowns forced it to rely more heavily on automatic moderation than it ever had before.
In March, Facebook sent many of its content reviewers home, and began focusing on AI-driven moderation. In its first quarterly report on its moderation practices since the crisis took hold, the company has revealed the successes and failures of that approach.