Facebook is facing heavy criticism over the way it handles violent and disturbing content.
As the company scrambles to manage videos of suicide and murder posted on its platform, a report from The Guardian offers new insight into the uncomfortable role the social media giant now plays as a content moderator.
The report is based on leaked documents that purportedly lay out the internal standards and guidelines Facebook uses to review posts containing violence, hate speech, nudity, self-harm, and terrorism. It highlights the company’s struggle to police harmful content without being accused of trampling on freedom of expression.
The Guardian says the documents were “supplied to Facebook moderators within the last year.” Facebook declined to confirm The Guardian’s reporting, but it did not dispute it.
“We work hard to make Facebook as safe as possible while enabling free speech,” Monika Bickert, the company’s head of global policy management, said in a statement to CNNMoney. “This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”
Facebook CEO Mark Zuckerberg recently announced that the company was hiring 3,000 more people to help “review the millions of reports we get every week.”