A week after news broke that several recordings of suicides posted on Facebook had remained on the site for a considerable length of time, the company has announced a new measure: adding 3,000 more people to its operations team to screen for hateful and violent videos and other posts so it can act more quickly.
Mark Zuckerberg, the CEO of Facebook, said these hires would be in addition to the 4,500 people already working on this task. What is not clear is whether they will be full-time employees or contractors, and how the screeners themselves will be vetted.
“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg wrote in a post earlier today. “We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”
The move to include more human curation is a step in the right direction. To date, the company has focused more on building algorithms and systems that let people report concerns about friends, or even about themselves. In March, it launched a new set of suicide prevention tools. A month later, it rolled out measures to combat revenge porn.
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer,” Zuckerberg wrote.
As the world’s largest social network, Facebook has held a contested position in the debate over the role social media plays in how information spreads around the world today. It is high time Facebook kept taking strict measures to combat hate online, which, of course, has a significant impact on the public.