YouTube wrote on its official blog that it will be taking measures to censor content it deems controversial, whether or not that content violates its policies. About a month ago, it announced four steps it would take to address such “controversial” content.
[There are four steps] we’re taking to combat terrorist content on YouTube: better detection and faster removal driven by machine learning, more experts to alert us to content that needs review, tougher standards for videos that are controversial but do not violate our policies, and more work in the counter-terrorism space.
YouTube says a mix of human review and cutting-edge machine learning technology will handle “controversial” videos, making the task easier, faster, and more accurate.
Better detection and faster removal driven by machine learning: We’ve always used a mix of technology and human review to address the ever-changing challenges around controversial content on YouTube. We recently began developing and implementing cutting-edge machine learning technology designed to help us identify and remove violent extremism and terrorism-related content in a scalable way.
YouTube also announced partnerships with more expert NGOs and institutions that support its cause.
More experts: Of course, our systems are only as good as the data they’re based on. Over the past weeks, we have begun working with more than 15 additional expert NGOs and institutions through our Trusted Flagger program, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue. These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists. We will also regularly consult these experts as we update our policies to reflect new trends. And we’ll continue to add more organizations to our network of advisors over time.
For content creators, YouTube will begin strictly enforcing tougher standards on videos that are potentially “controversial.”
Tougher standards: We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter. These new approaches entail significant new internal tools and processes, and will take time to fully implement.
This step is not limited to content creators; it also affects you as a viewer. Whenever you search for something that users have flagged and that is considered violent, you will be redirected to a playlist that debunks what you originally searched for.
Early intervention and expanding counter-extremism work: We’ve started rolling out features from Jigsaw’s Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages. We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13-18 year-olds to help them find a positive sense of belonging online and learn skills on how to participate safely and responsibly on the internet. We also pledged to expand the program’s reach to 20,000 more teens across the U.K.
This announcement, however, has stirred up YouTubers, some of whom consider the move propagandist in nature.
“If a video doesn’t break YouTube’s terms of service then they absolutely SHOULD NOT be attempting to dampen the reach of the video any further,” said YouTuber Annand “Bunty King” Virk, who raised his concerns with The Daily Caller. “Who determines what’s passable and what isn’t? At what point do we finally realize that saying the right thing isn’t always about saying what people want to hear?”
“By these standards, if YouTube existed previous to the Emancipation Act, they’d be censoring videos criticizing slave owners, since being anti-slavery wasn’t popular… at all,” he added. “The popular opinion isn’t always the right opinion.”
Matt Jarbo, who goes by MundaneMatt on YouTube, shared his views on the move with The Daily Caller. “They know it’s almost a non-issue completely,” Jarbo said. “But due to the controversies surrounding those videos, they’ve gotten a much larger spotlight than they deserve.”
“I do not trust their ability [to automatically flag extremist content],” he said. “I think they have an algorithm in place to help combat those issues, but it’s not narrow enough to not impact the skeptical/anti-SJW content.”
YouTube, however, considers this move its contribution to the fight against terrorism.