As if Facebook's year is not going badly enough, the company's moderation guidelines are being taken to task. The New York Times cites leaked moderation guidelines that show how confusing Facebook's policies for handling hate speech and propaganda are.
That should not really be a surprise to most people, considering how the social network has been exploited to influence political movements. Certainly, Facebook's shaky rules have grabbed headlines before. Moderation tests have previously found the network failing to pick up multiple instances of hate speech.
In the report, the New York Times found that Facebook cannot keep pace with rapidly changing political environments.
For example, countries like Bosnia and Sri Lanka have volatile political situations, while an extremist group in Myanmar was able to continue using Facebook for months.
Further findings show the company misinterpreted laws regarding speech restrictions in several countries, among them India. The New York Times concludes that Facebook is more interested in protecting its reputation than in fixing these problems.
The New York Times reports that some of the problems are caused by an overworked moderation team:
“Facebook outsources moderation to companies that hire the thousands of workers who enforce the rules. In some of these offices, moderators say they are expected to review many posts within eight to 10 seconds. The work can be so demanding that many moderators only last a few months.
The moderators say they have little incentive to contact Facebook when they run across flaws in the process. For its part, Facebook largely allows the companies that hire the moderators to police themselves.”
Facebook has already been rocked by numerous scandals this year. Earlier this month, the company was caught handing user account access to 150 major companies.
The New York Times reports that major companies across the tech and financial sectors were granted access to hundreds of millions of user accounts. Most of Facebook's major partners received privileged access, including Amazon, Netflix, Spotify, and Microsoft.