Facebook did not follow its own recommendations on risks during Assam and West Bengal polls

Illustration by Shagnik Chakraborty
09 June 2022

Despite predicting that its platforms would host inflammatory content that could lead to physical violence during the 2021 assembly elections in Assam and West Bengal, Facebook, which has since been rebranded as Meta, failed to launch a majority of the preventive measures in its arsenal on schedule, according to two internal presentations by the social-media giant.

An internal presentation titled “Problemscape for 2021 India Regional Elections,” prepared by Facebook researchers on 24 March 2021, pointed to the possibility of communal violence being triggered through Facebook, Instagram and WhatsApp, and identified gaps in the company’s ability to control inflammatory content. It outlined the various threats the platforms could pose to peace and election integrity, including hate speech, misinformation, violence and incitement, and rumours about election-rigging, and suggested that the company use its “Break The Glass” measures, the most extreme steps Facebook can take to stop the spread of offensive and inflammatory content. However, according to a second document, titled “Indian Regional Elections BTG Tracker,” a majority of these measures had not been deployed by their scheduled launch date of 25 March, two days before polling began in West Bengal and Assam.

These presentations are among a large tranche of documents that Frances Haugen, a former product manager in Facebook’s civic-integrity team, submitted to the US Securities and Exchange Commission as part of a series of complaints about the technology conglomerate’s role in perpetuating misinformation, hate speech and violent extremism. Haugen’s lawyers provided legislators with redacted versions of these documents, which were reviewed by a consortium of news organisations, including The Caravan.

In the Problemscape document, Facebook’s Central Integrity Ecosystem group, which primarily works on reducing harmful content on the platform, tried to identify coverage gaps in its “integrity defence” in relation to the assembly elections in West Bengal, Assam, Kerala, Tamil Nadu and Puducherry, held between 27 March and 2 May. The presentation noted that, “due to their history of communal violence, West Bengal and Assam are of most concern.” Facebook’s qualitative research also showed that users frequently encountered inflammatory content that played on communal and religious divisions, primarily in homogenous groups. The Tracking Reach of Integrity Problems survey, which measures how users perceive what they have seen on the platform, found that over twelve percent of Facebook users in India had reported seeing something on the platform, in the preceding seven days, that could lead to violence. This figure was higher than both the corresponding number for the United States and the global average.