YouTube to take further steps against hate speech, extremism
The video site said it will start to place videos in a “limited state” when they don’t violate YouTube policies but do contain “controversial religious or supremacist content.”
“The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetised, and won’t have key features including comments, suggested videos, and likes,” said YouTube in a company blog post.
“We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter. These new approaches entail significant new internal tools and processes, and will take time to fully implement.”
The move follows YouTube’s recent implementation of new guidelines that have seen it take a harder line on hateful, demeaning and inappropriate content.
YouTube said it is already seeing some positive progress, with its machine learning systems working faster and more effectively than before.
“Over 75% of the videos we’ve removed for violent extremism over the past month were taken down before receiving a single human flag,” said YouTube.
“The accuracy of our systems has improved dramatically due to our machine learning technology. While these tools aren’t perfect, and aren’t right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed.”
Some 400 hours of content are uploaded to YouTube every minute. The site said that finding and taking action against violent extremist content “poses a significant challenge”, but said its initial use of machine learning has more than doubled both the number of videos removed for violent extremism and the speed at which they were taken down over the past month.
YouTube attracted controversy earlier this year, and some advertisers pulled their budgets from the site, after it was revealed that, in some instances, ads had appeared alongside extremist or otherwise inappropriate content.