Facebook to Hire 3,000 People to Monitor Crime and Suicide Videos

Zuckerberg vows to give Facebook users the "safe community" they need.

Facebook CEO Mark Zuckerberg announced plans Wednesday to bring 3,000 new employees aboard to assist in the social network's continued efforts to monitor and promptly remove violent content. The question of how best to handle footage of violent acts livestreamed or otherwise shared on the platform has become a heated topic of discussion following incidents such as last month's filmed murder of a 74-year-old Cleveland man, whose suspected killer later took his own life.

"Over the last few weeks, we've seen people hurting themselves and others on Facebook -- either live or in video posted later," Zuckerberg said in a Facebook post Wednesday morning. "It's heartbreaking, and I've been reflecting on how we can do better for our community." In order to continue building a safer community for its users, Zuckerberg said Facebook needs to "respond quickly" and make videos easier to report. To make this happen, Facebook is bringing on thousands of new employees.

Facebook will add 3,000 people to its global "community operations team" over the next year. That team, Zuckerberg added, was already 4,500 members strong as of Wednesday. Community operations team members are tasked with reviewing the millions of reports the site receives each week, including non-video reports related to hate speech and the exploitation of children.

Citing a recent instance in which a reported Facebook Live stream stopped someone who was considering suicide, Zuckerberg said the site is also working to make the reporting of violent videos and similar content a faster process. "No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need," he said Wednesday.

In a previous statement shared during the nationwide search for Cleveland murder suspect Steve Stephens, Facebook's VP of Global Operations said his team was reviewing its content reporting methods. "We disabled the suspect's account within 23 minutes of receiving the first report about the murder video, and two hours after receiving a report of any kind," Justin Osofsky said. "But we know we need to do better." In his post, Osofsky also mentioned the role of artificial intelligence in keeping Facebook safe, saying it helps "prevent the videos from being reshared in their entirety."
