Even though Google and YouTube are private companies, there’s a free speech issue in play with today’s announcement because they function as unregulated public utilities, according to Trump policy adviser Steve Bannon.
Mike Cernovich, Mark Dice, and Paul Joseph Watson may be affected by this new policy. More hardline thinkers on the right, such as Jared Taylor, Richard Spencer, Tara McCarthy, and Varg Vikernes, are far more likely to see their ability to communicate damaged.
Since the ADL (Anti-Defamation League) considers realistic assessments of race and religion to be hate, the involvement of the ADL in this program bodes ill for our ability to find the information we need.
Content creators on YouTube who follow all of the site’s rules may still face censorship by the platform, under new plans announced by Google.
According to a post on YouTube’s official blog, videos will now be subject to the rule of the mob. If enough users flag a video as “hate speech” or “violent extremism,” YouTube may impose restrictions on the content even if it breaks none of the platform’s rules.
We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes.
YouTube has also rolled out a “trusted flagger” program, in which 15 “expert NGOs and institutions” help the company identify hate speech and extremism on its platform.
Among these organizations are the No Hate Speech Movement, a left-wing project pushed by the Council of Europe, as well as the Anti-Defamation League, an organization whose president has been accused of “manufacturing outrage” by the World Jewish Congress.
YouTube is also planning to artificially alter its search results so that searches for “sensitive” topics on YouTube no longer return the most popular videos, but a “playlist of curated YouTube videos that directly confront and debunk violent extremist messages.”
The platform also plans to artificially promote videos created via its “Creators for Change” program, which, in YouTube’s words, features creators who are “using their voices and creativity to speak out against hate speech, xenophobia and extremism.”
We’ve started rolling out features from Jigsaw’s Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages. We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13-18 year-olds to help them find a positive sense of belonging online and learn skills on how to participate safely and responsibly on the internet.
YouTube framed its blog post around fighting “terror content,” yet its announcement also strays into areas that have nothing to do with fighting terrorism, such as the company’s diversity efforts. The blog post boasts about YouTube’s involvement with the “Creators for Change workshop,” in which “creators teamed up with Indonesia’s Maarif Institute to teach young people about the importance of diversity, pluralism, and tolerance.”