Facebook to Add 3,000 People to Its Team Reviewing Harmful Posts


Following a rash of murders and suicides recently livestreamed via Facebook Live, Mark Zuckerberg announced that Facebook plans to add 3,000 more employees to the sizable team already screening for such posts.

In a post on his own Facebook page this morning, Zuckerberg noted that the new hires will join an existing team of 4,500. He explained:

If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner— whether that’s responding quickly when someone needs help or taking a post down.

With a user base that has grown to nearly two billion, the two-thirds increase in human moderation (as opposed to relying on algorithms alone) may be a necessary step for the social media giant. Previously, after Donald Trump won the presidential election with considerable help from completely false news stories shared on Facebook, the company took partial responsibility by introducing a new "disputed" tag that lets users flag fake news.

For other perspectives on how to fight fake news, you can read about the Wikipedia founder's plans here, or watch Samantha Bee's take from back in December here. Hopefully, we won't need to post many more updates on Facebook's far larger problem of livestreamed harm to its users and others.
