YouTube To Demonetize & De-Emphasize "Questionable" Videos


YouTube has announced that it will change the way it treats videos that don't directly violate site policies but are offensive, extreme, or otherwise "questionable." YouTube will sanction such videos by demonetizing them, making them harder to find, and stripping them of key features. Videos of this nature could include extremist or supremacist sentiment, speech that could be construed as hateful, or violent content that is depicted offensively but neither breaks the rules nor falls under the umbrella of extremism. Once flagged by users and reviewed by YouTube's staff, such videos will be placed behind an interstitial, have comments disabled, be excluded from recommendations, and lose features such as likes and recommended videos during viewing.

Essentially, such videos will be more difficult to find and almost impossible to stumble across, and when a user does find them, the uploader will not benefit from the content. This change will begin rolling out to YouTube on desktop in the next few weeks, and will hit mobile versions of the service soon after. Treatment of videos that are illegal or violate site policies outright will not change as part of this strategy shift. The move comes alongside a host of other initiatives that YouTube has been rolling out to address videos featuring hateful or shocking content, with a special focus on extremism and terrorism.

The announcement is part of YouTube's work toward protecting users from videos that are offensive, shocking, or may help radicalize viewers. The main thrust of the campaign has been to seek out and eliminate videos that are pro-terrorism or violate site policies, especially in offensive ways, before viewers can find them. To accomplish this, YouTube has been making heavy use of machine learning, and is reporting a high rate of success in finding and removing such videos before they are ever flagged by a viewer. Additionally, searching for pro-terrorism content on YouTube will, ideally, redirect the user to anti-terrorism content that could help dissuade potential extremists from joining such a cause. YouTube stated that the fight against the spread of terrorism is "challenging," but that the company is finding new ways to wage it on a constant basis.


Copyright ©2017 Android Headlines. All Rights Reserved.

Senior Staff Writer

Daniel has been writing for Android Headlines since 2015, and is one of the site's Senior Staff Writers. He's been living the Android life since 2010, and has been interested in technology of all sorts since childhood. His personal, educational, and professional backgrounds in computer science, gaming, literature, and music leave him uniquely equipped to handle a wide range of news topics for the site, including machine learning, voice assistants, and AI technology development in the Android world. Contact him at [email protected]
