Facebook is recruiting thousands of moderators to review flagged content

If Facebook has one big, overriding problem, it’s objectionable content. Every day, the social network’s billions of users report videos and pictures that violate the website’s terms of service, and it is up to the site’s moderation team to review those complaints. Lately, that team has struggled to keep up.

On Wednesday, Facebook said it would recruit as many as 3,000 moderators to help parse the network’s content for hate speech, child exploitation, animal abuse, teenage suicide, and self-harm. They will join the existing 4,500-member review team.

Facebook’s moderation problem is an open secret. In 2016, a BBC investigation reported that sexual predators were using private Facebook groups to trade images of exploited children. Despite promises by Facebook’s head of public policy to “[remove] content that shouldn’t be there,” a follow-up investigation found that Facebook failed to remove the vast majority of the images, taking down only 18 of the 100 that the BBC reported using Facebook’s own tools.

In response, the chairman of the U.K. House of Commons’ media committee, Damian Collins, told the BBC he had “grave doubts” about the effectiveness of Facebook’s moderation. “I think it raises the question of how users can make effective complaints to Facebook about content that is disturbing, shouldn’t be on the site, and have confidence that it will be acted upon,” he said.


Collins’ comments came on the heels of more serious lapses. Earlier in 2017, three men live-streamed the gang-rape of a woman in the city of Uppsala, Sweden, 50 miles north of Stockholm. Last month, a man in Thailand killed himself and his child and broadcast it. And two days prior to Facebook’s annual F8 developer conference, a Cleveland man filmed himself shooting and killing a 74-year-old man.

“We still have a lot of work to do, and we will keep doing all that we can to prevent tragedies like this from happening,” CEO Mark Zuckerberg said during F8’s keynote address. “We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards, and easier for them to contact law enforcement if someone needs help.”

Facebook is also improving its automated moderation tools. It is developing new algorithms that will automatically identify and take down objectionable content, and tools that will make it easier for users to report problems and contact law enforcement.

But Zuckerberg said that these measures won’t be an instant fix. “Artificial intelligence can help provide a better approach,” Zuckerberg said in an open letter. “[But it will take] many years to fully develop.”

He praised the company’s human moderators, who ensure flagged Facebook content abides by the network’s Community Standards.

“Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself,” Zuckerberg said.

Kyle Wiggers
Former Digital Trends Contributor