The world’s biggest social network released a report showing that over 3.2 billion fake accounts were taken down, double the figure from the same period last year, along with millions of posts depicting child abuse and suicide, among other concerning content.
Facebook removed 3.2 billion fake accounts between April and September this year, along with millions of posts depicting child abuse and suicide, according to its latest content moderation report released on Wednesday.
Reuters reported that these figures double last year's 1.55 billion account removals over the same period. The company also disclosed for the first time how many posts it removed from its popular photo-sharing app Instagram, which has been identified as a growing area of concern about fake news.
Proactive detection of violating content was lower across all categories on Instagram than on Facebook's flagship app, where the company first implemented many of its detection tools. For example, content affiliated with terrorist organizations was removed proactively 98.5% of the time on Facebook but only 92.2% of the time on Instagram.
Furthermore, the company removed more than 11.6 million pieces of content depicting child nudity and sexual exploitation of children from Facebook and 754,000 pieces from Instagram during the third quarter.
Law enforcement is concerned that Facebook’s plans to provide greater privacy to users by encrypting the company’s messaging services will hamper efforts to fight child abuse. FBI Director Christopher Wray said the changes would turn the platform into a “dream come true for predators and child pornographers.”
For the first time, the company also included data on actions it took against self-harm content. The report said it had removed about 2.5 million posts in the third quarter that depicted or encouraged suicide or self-injury. Facebook also removed about 4.4 million pieces of content involving drug sales during the quarter, it said in a blog post.