Anyone with an ear to the news, whether online, on TV, or even in print, will have heard about some of the recent atrocities broadcast on Facebook. The one that most shook the US involved an Ohio man who streamed himself murdering someone on Facebook Live.
This is obviously a big concern for Facebook, and CEO Mark Zuckerberg addressed the issue in a recent post. He says Facebook users deserve a safer community, one protected not just from those who mean to do violence to others, but also from those who mean to harm themselves.
That’s why the company wants to add another 3,000 moderators to its team, atop the 4,500 it already has. These moderators’ job is to act swiftly on reports of offensive videos, whether that means taking a post down or going as far as contacting law enforcement. Facebook says it will also make the tools for reporting and reviewing this content easier to use.
Alongside murders, assaults, and the like, Facebook says suicide notes and actual attempts are also a problem. For the company, taking the video down isn’t enough: it also wants to get help to people who feel driven to harm themselves, and it has already helped stop at least one suicide. Unfortunately, its success rate isn’t 100%.
Either way, Facebook knows this is a serious problem, and one that can only worsen as video becomes a bigger part of its platform, especially live broadcasts, which can compel people to do and show things they otherwise wouldn’t.