CEO Mark Zuckerberg announces plans to hire 3,000 additional staff to review content.
The increase comes on top of the 4,500 employees Facebook already employs to review broadcast or posted content that may violate the company’s community standards. In the post, written on his Facebook profile page earlier this week, Zuckerberg also said the company will develop new tools to manage the millions of content reports it receives from users every month.
“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg said. “We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”
Against the rules, but up to users to report…
Videos and posts that feature hate speech, crime, racism, or self-harm, or that glorify violence, are against Facebook’s rules. The company has recently started using artificial-intelligence software to identify child abuse content and previously banned images. In most cases, however, questionable content is currently only reviewed and removed if users report it.
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help,” Zuckerberg also wrote.
Just another drop in a virtual reality ocean?
The social media giant has come in for substantial criticism in recent weeks for allegedly responding too slowly to homicides broadcast live on its service. The media has generally received Zuckerberg’s announcement as a positive move, if only a first step in the right direction toward the company regulating and policing itself more effectively.
Facebook currently has around 1.86 billion monthly active users. The real question is whether 7,500 people reviewing content is anywhere near enough, or whether the hiring is merely a token gesture made to protect Facebook’s bottom line.
Time will tell.