NEW DELHI: Facebook in India has been selective in curbing hate speech, misinformation and inflammatory posts, particularly anti-Muslim content, according to leaked documents, even as the internet giant’s own employees cast doubt on its motivations and interests.
Based on research produced as recently as March of this year and company memos dating back to 2019, the internal documents on India highlight Facebook’s constant struggle to quash abusive content on its platforms in the world’s biggest democracy and the company’s largest growth market. Communal and religious tensions in India have a history of boiling over on social media and stoking violence.
The files show that Facebook has been aware of the problems for years, raising questions over whether it has done enough to address the issues. Many critics and digital experts say it has failed to do so, especially in cases where members of Prime Minister Narendra Modi’s ruling Bharatiya Janata Party are involved.
Across the world, Facebook has become increasingly important in politics, and India is no different. Modi has been credited with leveraging the platform to his party’s advantage during elections, and reporting from The Wall Street Journal last year raised questions over whether Facebook was selectively enforcing its hate speech policies to avoid blowback from the BJP.
Modi and Facebook chairman and CEO Mark Zuckerberg have exuded bonhomie, memorialised by a 2015 image of the two hugging at the Facebook headquarters.
The leaked documents include a trove of internal company reports on hate speech and misinformation in India that in some cases appeared to have been intensified by the platform’s own recommendation features and algorithms. They also include company staffers’ concerns over the mishandling of these issues and their discontent over the harmful content going viral on the platform.
According to the documents, Facebook saw India as one of the most at-risk countries in the world and identified Hindi and Bengali as priority languages for automated enforcement against hostile speech. Yet Facebook did not have enough local-language moderators or content flagging in place to stop misinformation that at times led to real-world violence.
In a statement, Facebook said it has invested significantly in technology to find hate speech in various languages, including Hindi and Bengali, which reduced the amount of hate speech that people see by half in 2021.
“Hate speech against marginalised groups, including Muslims, is on the rise globally. So we are improving enforcement and are committed to updating our policies as hate speech evolves online,” a company spokesperson said.
This report is based on disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions were obtained by a consortium of news organisations.
In February 2019, ahead of a general election and with concerns about misinformation running high, a Facebook employee wanted to understand what a new user in India would see in their news feed if all they did was follow pages and groups recommended solely by the platform itself.
The employee created a test user account and kept it live for three weeks, a period during which an extraordinary event shook India: an attack in occupied Kashmir killed over 40 Indian soldiers and sent tensions with Pakistan soaring.
In the note, titled “An Indian Test User’s Descent into a Sea of Polarising, Nationalistic Messages”, the employee, whose name is redacted, said they were shocked by the content flooding the news feed, describing it as a near-constant barrage of polarising nationalist content, misinformation, and violence and gore.
Seemingly benign and innocuous groups recommended by Facebook quickly morphed into something else altogether, where hate speech, unverified rumours and viral content ran rampant.
Published in Dawn, October 25th, 2021