With the rise of social media over the last decade and a half, its societal effects are only now becoming apparent. We sat down with Humanity Hub member Ritumbra Manuvie, Director of The London Story (TLS), to discuss their ‘Preachers of Hate’ series that profiles the real-world effects of hate speech on Meta’s social networks (Facebook, Instagram, and WhatsApp) in India.
India, the world’s largest democracy, has over 350 million users of Meta’s various social networks. But the ‘Preachers of Hate’ story begins with earlier reports and campaigns about the hate speech problem in India, says Manuvie. “We’ve been continuously monitoring the social networks in India for a few years”, she says, “mainly to record them for posterity and transparency”.
It is now clear that hate speech online is often amplified by the social networks themselves, and this has real-world and quite violent consequences.
The London Story is uniquely positioned to look into this, with a team of data scientists, lawyers, and political minds. Also central to TLS’s work is the analysis of propaganda: how it operates in India, but also amongst the Indian diaspora abroad.
The ‘Preachers of Hate’ report uses a mixed-methods approach, combining OSINT techniques with big-data analysis to “create a digital ethnography” of the virtual landscape within which hate speech is published and amplified, says Manuvie.
What’s Meta going to do?
Meta, the company that owns Facebook, WhatsApp, and Instagram (and plenty of other companies besides), makes most of its money from advertising. The company had revenue of just under 118 billion USD in 2021, and works hard to manage its image.
“There’s not really that much pressure to act on hate speech, and in meetings we’ve had in the Netherlands, Meta often deflects the dialogue with tall talk and unsupported claims”, says Manuvie. For example, it took Meta two years to delete one video in which a speaker calls for the ‘annihilation’ of Muslims, she adds. The video (with its URL) was given to Meta’s team, and an Oversight Board case was registered on it; the case number was then relayed to Meta’s Oversight Board secretariat, and yet no action was taken during this time, says Manuvie. Meanwhile, the video accrued 32 million views.
How can we improve the situation?
Compliance with existing law is the first step, says Manuvie, and new, stronger policy is also necessary. Most often, social media companies hide behind their moderation policies, which do not always follow local, national, or international law. Companies also often create moderation policies but fail to actually apply them. What needs to be critically evaluated, for The London Story, is how, despite claims of action by social media companies, hate speech continues to grow and be amplified.
What’s worse is that companies such as Meta haven’t yet engaged with civil society work, says Manuvie. “There’s so much trauma-inducing work that has been done in identifying precise posts, comments, and ‘dangerous individuals’, but Meta’s response to these inputs is rather disappointing”, she adds.
Looking for collaborations
The London Story is open to collaborating with similarly placed organisations, says Manuvie. “We are part of networks like The People vs. Big Tech and the Global Alliance Against Digital Hate and Extremism. We are always open to building more solidarity and encourage meaningful connections and collaborations. Remember, there’s strength in numbers.”