Imran Ahmed Q&A: ‘Tech companies are creating a toxic environment for Muslims’

Photograph courtesy of Imran Ahmed

Nylah Salam

The proliferation of online hate on social media platforms has become a daily challenge for everyone, including governments, technology activists and members of marginalised communities. Global events such as Brexit, the Covid-19 pandemic and the ongoing refugee crisis have led to soaring levels of internet-based abuse. 

Anti-Muslim posts and campaigns are particularly prevalent on social media. A recent report from the London and Washington DC-based Center for Countering Digital Hate (CCDH) revealed that social media companies, including Facebook, Instagram, TikTok, Twitter and YouTube, had failed to act on 89% of posts reported as Islamophobic between February and March 2022. 

We spoke to the British CEO and founder of the CCDH, Imran Ahmed, about the challenges of online hate speech and what social media platforms can do to combat it. 

This conversation has been edited for length and clarity.

Hyphen: Before setting up the CCDH in 2018, you worked with the UK Labour party and alongside Jo Cox, the MP who was murdered by a far-right terrorist in 2016. Did her death play a role in the launch of the organisation?

Imran Ahmed: There were two things that really drove me in the first instance. One was seeing the rise of conspiracist antisemitism in the UK and the second was the rise of a virulent anti-Black, anti-Muslim political hatred. Then there was the murder of Jo Cox, who was my colleague at the time. Seeing the way online hate had created both the environment and the direct motivation for her killing made me realise we had been overlooking the way people were sharing information. There was something we had to do: we had to socialise social media, to make it subject to the same pressures that normal society is under. 

How does the CCDH combat the widespread misinformation that reaches millions of users globally, like climate-change denialism and anti-vaccine narratives? 

We look at bad actors on bad platforms. The truth is that the platforms that are the most responsible for tolerating and enabling the spread of malign information by bad actors are the world’s most popular ones: Facebook, Twitter, TikTok, Instagram and YouTube. Our job is to make sure they enforce the rules they themselves have set. 

To what extent does Islamophobia drive and feature in online misinformation? Can you give a few examples?

In my 20 years of experience, Muslims have certainly been the target of disinformation. This can be as grotesque as conflating Muslims with terrorism or accusing them of being deceptive, or it can be outright hatred towards Muslims — specifically Muslim migrants. The types of hatred that we see on social media platforms come from a range of actors, from Hindu nationalists to white supremacists, to garden-variety politicians. 

Holding social media platforms to account is a major undertaking. What is the process?

We start by researching, studying the phenomenon. We look at the type of hatred and who is producing it. We also look at the spaces in which it’s being spread. For example, Facebook is a platform where you drip-feed misinformation into people’s newsfeeds over a period of time to recolour the lens through which they see the world, to make them fear Muslims. This can be done by continually posting stories about dangerous Muslims or misinformation about Muslims. 

We then look at whether or not platforms are taking effective action. A very good example is a recent study that we did, looking at several hundred examples of outright hatred posted online. We reported it to the platforms using their own reporting tools. We went back and audited what action they had taken, and we found that 95% of the time they took no action, even against outright hatred that stood in contravention of their own community standards.

We also advocate and campaign for change, through the press and to politicians, not just in the UK but around the world, including in the European Union and the US.

A recent CCDH report states that more than 88% of posts containing anti-Muslim hatred had no action taken against them by social media companies, even after your researchers reported them for breaching platform standards. Isn’t that a major failing on the part of tech companies?

Tech companies are failing to protect Muslims from hatred and creating a toxic environment for them. 

Your report also found more than 20 Facebook and Instagram pages dedicated to anti-Islam content, with a combined following of 361,922 accounts. Meta, the parent company of both platforms, has come in for criticism for hosting anti-Muslim pages and groups before. Why do they still exist? 

Social media platforms don’t care what the content is, as long as they can slap an ad on it. It’s a monetisable space. They’re perfectly willing to do business with Islamophobes. 

The pandemic saw a huge rise in Islamophobia. Muslims were often blamed for spreading Covid in countries like India. Has that situation changed?

This is a very good example of Islamophobes jumping at the opportunity to inculcate Islamophobia, while Covid misinformation actors also saw an opportunity. One of the biggest problems on social media is that when you allow these groups to exist and prosper in your spaces, they will cross-fertilise each other, which drives something that we call “convergence”. These actors hybridised their ideologies, merging their narratives into sprawling conspiracy theories.

Have you seen any new types of digital hate towards Muslims online? If so, how has this been countered?

What we are seeing grow now is a form of conspiracist misinformation about Covid. That’s not something Muslims are used to facing. It’s very similar to antisemitic conspiracism. It’s a much more difficult thing to take on because at the heart of every conspiracy theory is a lie. Lies are very difficult to directly disprove, so the rise of a new form of conspiracist Islamophobia is very worrying. 

What we do is make sure that people are fully informed and we help them to produce material that inoculates them against the grand themes that underpin new Islamophobic conspiracies. 

What, in your view, is the overall picture for social media platforms with regards to digital hate and misinformation in the UK now?

Social media platforms have been exposed by three things in the UK. One is the rampant hate that’s very visible against Black football players. The second is the absolute failure to deal with the disinformation that flowed unabated about Covid and took tens of thousands of British lives. The third is the rise in conspiracies, antisemitism and Islamophobia, which have caused a lot of concern at the highest levels of government because they pose a serious threat to public order and to safety. 
