A BBC Trending investigation has found inconsistencies in how YouTube deals with neo-Nazi music tracks which advocate murder and violence. The company admits that it has "more to do" when it comes to hate music videos.
The songs glorify violence and frequently talk about the mass killing of Muslims, Jews and other groups in celebratory detail. And they can be found on the world's most popular video-sharing platform, YouTube.
On the site, neo-Nazi videos rack up hundreds of thousands or even millions of views, and attract hundreds of comments.
Among the tracks found by the BBC was one titled Fire up the Ovens by a band called the Bully Boys. It celebrates the Holocaust, references anti-Semitic conspiracy theories and includes the lyric "we love to kill." Other songs by other bands call for burning Turkish people and call Muslims "sub-humans" and include other, much more derogatory slurs.
In some countries, such material is illegal. And the Google-owned company's guidelines prohibit "content that promotes violence against or has the primary purpose of inciting hatred against individuals or groups based on certain attributes" including race, ethnic origin, religion and a host of other categories.
BBC Trending identified 99 videos and one channel which were potentially in violation of local laws and/or YouTube's own rules. Using the site's reporting system, we flagged all of these links and included the time codes of specific lyrics.
After our initial reports, seven of the videos were taken down, while another 47 were made unavailable in the UK and some other countries (for instance, Germany has laws against Nazi symbols and propaganda). In some cases there were clear discrepancies in the way material was handled.
For instance, our list included multiple copies of some tracks, as the same song is often uploaded by multiple YouTube users. We reported three copies of a song called We Fly the Swastika. Two were made unavailable in the UK, and on one, no action was taken.
In other instances, some versions of a song were taken down from YouTube, while other versions remained live.
After we identified ourselves as journalists and provided YouTube with the full list, all 100 links were eventually deleted from the site for violating the company's rules or made unavailable in the UK and some other countries.
YouTube declined our interview request and instead sent a statement: "We do not allow videos that incite hatred on YouTube and work hard to remove infringing content quickly, through flagging and through advances in technology.
"We know there's more to do here and we're committed to getting better. We're making progress in our fight to prevent the abuse of our services, including hiring more people and investing in machine learning technology... we also support creators who promote tolerance on their YouTube channels."
YouTube says its reporting systems are working correctly, but the company admitted that it made mistakes in handling some of the reports.
Many of the white supremacist songs posted on YouTube are from punk and hardcore bands active in the 80s and 90s.
The scale of neo-Nazi music on various online platforms was highlighted by the website Digital Music News shortly after the "Unite the Right" march in Charlottesville, Virginia in August 2017. After bloody clashes and the death of a counter-protester, attention focused on extremist content on hosting services and social networks.
Two days after the events in Charlottesville, Digital Music News published a story with the headline "I Just Found 37 White Supremacist Hate Bands on Spotify".
"We found a surprisingly high number of neo-Nazi and white supremacist groups [online]," says the site's founder and publisher, Paul Resnikoff. "Some were easy to identify; it was really easy to say that was a white power group."
But YouTube took little action, says Resnikoff, and other experts urged the company to do more to restrict hate music.
"People have expectations of our social media platforms, that are selling us advertising all day long, to create more protective environments for their users," says Oren Segal, director of the Anti-Defamation League's Center on Extremism. "And this is true for parents whose kids are spending many hours online and they don't want them to either intentionally or accidentally have access to this form of hate.
"People holding these companies accountable is an important element of the equation if these companies are ever going to change," Segal says.
Reporting by Natalia Zuo