Camilla Thurlow: How can we stop online extremism?

Love Island’s Camilla Thurlow has been investigating how to tackle online extremism in a Special Report for BBC Radio 5 Live.

I’m Camilla Thurlow. Last year I was in series 3 of Love Island. Before that, I worked in humanitarian Explosive Ordnance Disposal, clearing landmines in places like Afghanistan, Cambodia and Zimbabwe.

I have to admit, I was never really that engaged with social media before Love Island. But suddenly I was involved in the online world like never before, and it dawned on me just how much influence it has on real-world events.

Given my history in conflict-affected areas, my focus was on how developments in technology mean conflicts are no longer restricted by geographic boundaries.

The internet, and social platforms in particular, provides a place for extreme content to spread.

So, for my BBC Radio 5 Live special report, I wanted to investigate online extremism and what’s being done to tackle it.

My report took me to the offices of the Henry Jackson Society in Westminster, where I met the truly brilliant Nikita Malik. She flags up extremist videos to platforms like YouTube with a view to getting them taken down.

Nikita showed me two extreme videos that are considered “softer” because they don’t incite violence.

One was an extreme Islamist preacher telling people not to vote in the UK election. Another was a racist video, which I found extremely uncomfortable to watch.

What struck me was how the person in the racist video made his arguments sound as if they were based on scientific research.

You could see how a young or impressionable person could stumble across it, watch it and repeat what they’d heard to their friends at school.

Nikita describes it as ‘mood music’, legitimising beliefs and providing a backdrop for more dangerous behaviour to occur.

Nikita explained: “Something like this could be qualified as an opinion or a joke.”

“Both of us can see that it is very disturbing. But there are thousands of people who would agree with this person or think it’s funny. It doesn’t particularly call for violence against a group of people but it does play to that sense of superiority over a minority group.”

Despite being flagged in July 2017, the video I saw was still up a year later.

It had been viewed more than 11,000 times and had nearly 500 comments underneath. It has now been removed.

The Home Office commissioned a company to build an AI tool to identify ISIS propaganda online.

The ambition was to identify how the government could enforce regulations around extreme content. I spoke to Dr Ilya Feige from ASI Data Science, who told me the tool's accuracy rate is astonishing.

The tool was designed specifically for IS material. However, the model itself could be adjusted to look for other extremist material.
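The report doesn't describe how the ASI tool works internally, but tools like this are typically text classifiers trained on labelled examples, which is also why the same model can be retrained to look for different material. Purely as an illustrative sketch (with neutral toy labels, nothing to do with the real tool's data or methods), here is a minimal naive Bayes classifier showing how a model learns word patterns from one set of labels and could be pointed at another simply by swapping the training data:

```python
import math
from collections import Counter, defaultdict

# Toy training data standing in for any labelled corpus; the labels
# are deliberately neutral placeholders, not real flagged content.
TRAIN = [
    ("the match kicked off at the stadium", "sport"),
    ("the striker scored a late goal", "sport"),
    ("simmer the sauce and season to taste", "cooking"),
    ("chop the onions and heat the pan", "cooking"),
]

def train(examples):
    """Count per-label word frequencies (multinomial naive Bayes)."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for text, label in examples:
        words = text.split()
        word_counts[label].update(words)
        label_counts[label] += 1
        vocab.update(words)
    return word_counts, label_counts, vocab

def classify(text, word_counts, label_counts, vocab):
    """Return the most probable label, with add-one smoothing."""
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        score = math.log(label_counts[label] / total)
        n = sum(word_counts[label].values())
        for word in text.split():
            score += math.log((word_counts[label][word] + 1) / (n + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

model = train(TRAIN)
print(classify("a goal in the final match", *model))  # → sport
```

Retargeting the model, as the article describes, would mean supplying a new `TRAIN` list labelled for the new kind of material; the learning code itself stays the same.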

YouTube: ‘We will make mistakes’

YouTube has said it ‘missed the mark’ by not taking down an extremist video.

Miriam Estrin is YouTube’s Policy Manager for Europe, the Middle East and Africa.

She told 5 Live: “With an open platform like YouTube, we will make mistakes and we have to learn from them, and we will.

“The reason we have these relationships with trusted flaggers is so they can alert us to issues and videos that violate our content policies. In this case, clearly, we missed the mark and we will run the analysis and learn from the mistake.”

Miriam said YouTube is getting better all the time at dealing with videos which violate their terms of use: “YouTube sees over 400 hours of video uploaded every single minute. It becomes a challenge of how do you detect that needle in the haystack, when you have really large haystacks.”

Nikita spends her time reading the court cases of young people who have been radicalised, trying to identify patterns.

“Things I begin to see time and again are abuse but also loneliness,” she said. “Young people in particular when they join either far right groups or Islamist extremist groups or even gangs, tend to do so because it gives them a peer network.”

To find out what’s been done to help raise awareness among young people, I went along to a voluntary session run by Prevent, the Home Office’s anti-extremism programme. The participants are volunteers and have not been identified as holding any extreme views.

The project uses technology to engage young people and was devised by youth worker James Clacey.

At Gateway College in Leicester, James and Prevent co-ordinator Sean Arbuthnot invite young people to take part in activities designed to teach them about the methods extremists use to draw people in.

“People tend to obsess over Islamist-inspired extremism – Isis, Al-Qaeda and stuff like that,” said Sean.

“But what we actually see on the ground is an increasing rise of far-right extremism, neo-Nazi, white supremacy type stuff. We’re trying to raise awareness of that.”

The optional workshop gave the young people a chance to openly discuss and debate, which is vital if they are to exercise moral judgement and protect themselves online.

One of the biggest things I'll take away from my report is how easy this content is to access, and how convincing the "softer" material can be: it comes across as innocuous, even factual.

I've also been astounded by the knowledge of those I spoke to, and by their ability to identify ways of countering extreme content.

I have no doubt in the coming years there will be further conversations about topics such as censorship, Artificial Intelligence, and the creation of geographic borders online – among many others.