On the 13th anniversary of its first-ever video upload, YouTube has released its inaugural data and insights report on how it handles inappropriate content.
In the midst of significantly bolstering its content moderation team late last year, the Google-owned platform says it removed nearly 8.3 million videos from October to December 2017. According to YouTube's analytics, the removed videos mostly consisted of spam or attempts to upload adult content.
During this three-month period, the platform also flagged 6.7 million videos using computers rather than human moderators, and of those videos, 76% were removed before they even received a single view. (YouTube introduced machine-learned flagging in June 2017, but it still requires human review to assess whether something actually violates its policies.)

On the human side of things, 1.1 million videos were flagged by individual trusted flaggers, 400,000 by regular YouTube users and 63,900 by government agencies. The top 10 countries with the most flags for suspected violations were India, the US, Brazil, Russia, Germany, the UK, Mexico, Turkey, Indonesia and Saudi Arabia. YouTube says it has staffed close to 10,000 moderation positions and has hired full-time specialists with expertise in violent extremism, counter-terrorism and human rights.

Sexual content was the leading reason videos were flagged (30%), followed by spam (26%), hateful or abusive messaging (15.6%), violent or repulsive content (13.5%) and harmful or dangerous acts (7.6%).
In a push for greater transparency, the company will release a quarterly report on how it is enforcing its Community Guidelines. By the end of the year, YouTube plans to refine its reporting systems and add more data, including information on comments, speed of removal and the reasons behind policy removals.
While YouTube did not specify how much of the flagged or removed content was aimed at children, the platform has been vocal about its plans to crack down on inappropriate videos targeting kids. At the end of 2017, Malik Ducard, YouTube's global head of family and learning, addressed kids and family content creators exclusively on Kidscreen. In his guest column, Ducard said more than 150,000 videos had been removed that week for violating the company's policies.