Social networking giant Facebook has said it took down tens of millions of pieces of violating content, including nudity and “sexual activity”, from its platform during the first quarter of 2018.
In a blog post on Tuesday, the company said it pulled down “21 million pieces of adult nudity and sexual activity in Q1 2018, 96 percent of which was found and flagged by our technology before it was reported”.
It added that “we estimate that out of every 10,000 pieces of content viewed on Facebook, seven to nine views were of content that violated our adult nudity and pornography standards”.
Facebook also said it took down or “applied warning labels to about 3.5 million pieces of violent content in Q1 2018, 86 percent of which was identified by our technology before it was reported to Facebook”.
The social media site also said that for hate speech, “our technology still doesn’t work that well and so it needs to be checked by our review teams”.
“Three weeks ago, for the first time, we published the internal guidelines we use to enforce those standards,” it said.
“And today we’re releasing numbers in a Community Standards Enforcement Report so that you can judge our performance for yourself.”
Despite removing millions of pieces of harmful content, Facebook said it still has “a lot of work to do to prevent abuse” on the social networking site.
Facebook Publishes Enforcement Numbers for the First Time https://t.co/QT3dAvvqSn
— Facebook Newsroom (@fbnewsroom) May 15, 2018