YouTube deleted more than 58 million videos and 224 million comments that violated its policies in the third quarter, Google's parent company Alphabet said on Thursday, as the tech giant demonstrates progress in fighting inappropriate content on its video platform.

The activities of extremists on social media platforms, along with hateful content that critics say incites violence, have become a major concern among government officials and interest groups in the United States, Asia, and Europe. These groups have pressured social media firms such as Facebook and YouTube to identify and remove such content as quickly as possible.

The European Union has proposed heavy fines on social media platforms and other online firms that fail to remove extremist material within an hour of a government order to do so. In India, social media platforms have accepted a similar obligation, agreeing to act on requests from authorities to remove offensive content within 36 hours, an official at India's Ministry of Home Affairs said on Thursday, speaking on condition of anonymity.

YouTube videos

YouTube started issuing quarterly reports about its enforcement efforts this year. According to YouTube, most of the content the company removed in past quarters was spam. It uses automated detection tools to quickly identify spam, nudity, and extremist content. Around 90 percent of the 10,400 videos it took down for violent extremism in September, and of the 279,600 videos taken down for child safety issues, received fewer than 10 views, YouTube said.

However, the platform still faces a serious challenge with content promoting hateful rhetoric and dangerous behavior. Because automated detection technologies for these policies are relatively new and less effective, YouTube depends on users to report problematic videos or comments. Hence, such content may be viewed, sometimes widely, before it is removed.

In hopes of reviewing user reports faster, the platform this year hired thousands of moderators, expanding to more than 10,000 in total, though it has described pre-screening every video as unfeasible.

Data on the accounts removed in the third quarter shows the number of YouTube accounts the company disabled either for committing what Google describes as egregious violations, such as uploading child pornography, or for violating three policies within 90 days.

YouTube took down around 1.6 million channels, which together had uploaded the 50.2 million violating videos. About 4.5 percent of the removed channels violated child safety rules, 13 percent involved nudity, and nearly 80 percent were related to spam uploads.

YouTube said billions of comments are posted by users every quarter, but it declined to comment on the number of accounts that have uploaded videos, disclosing only that a small fraction of its users were removed.

As in the previous quarter, YouTube removed about 7.8 million videos for policy violations.