Facebook is under fire over new revelations about how it handles hate comments, calls for violence, and other socially explosive content. This time the spotlight is on India, the US company's largest market, with around 340 million users of the social network and almost 400 million users of its messaging service WhatsApp.
According to reports from US media and news agencies, the company failed to take appropriate action against harmful content on its Indian platforms despite internal warnings. Facebook employees in India reportedly noticed a sharp increase in rumors and incitement to violence from December 2019 onwards.
The situation is said to have been particularly dramatic on WhatsApp in February of last year, when many people died in violent clashes between Hindus and Muslims on the subcontinent. The platform operator then sent dozens of its own analysts to the country to ask users about their experiences with algorithmic recommendations.
Back in February 2019, a Facebook researcher had created an account for a new member to find out how using the network feels for someone living in the Indian state of Kerala, as The New York Times describes the experiment. The account followed a simple rule: “Follow all recommendations generated by Facebook’s algorithms to join groups, watch videos, and explore new pages on the site.”
Facebook shows pictures of the deceased
The result was “shocking,” according to an internal report written that same month: the researcher had “seen more pictures of dead people in the last three weeks than in my entire life.” The test user’s news feed had become “an almost constant flood of polarizing nationalist content, misinformation, and violence and bloodshed.”
According to the reports, the files show that Facebook has been aware of the problems for years. Critics complain that the company did too little about them, especially in cases involving members of Prime Minister Narendra Modi’s ruling party. Above all, anti-Muslim propaganda by Hindus was able to spread almost unhindered via the platforms.
The documents allegedly show that the Californian corporation viewed India as one of the “most vulnerable countries” in the world and identified Hindi and Bengali as priority languages for automated detection of hostile statements that violate its community rules.
Too few moderators
Nevertheless, there were neither enough moderators for the local languages nor sufficient labeling of content to curb misinformation. The internal papers are part of a larger collection of materials known as “The Facebook Papers.” They come from Frances Haugen, a former product manager at the company who recently went public as a whistleblower.
References to India appear in several places in the documents Haugen filed earlier this month in a complaint with the US Securities and Exchange Commission, according to reports. They state that 87 percent of Facebook’s global budget for classifying disinformation is earmarked for the US, with only 13 percent for the rest of the world.
A company spokesman countered that these figures do not include fact-checking partners, most of whom are based outside the United States. Facebook, he said, has invested heavily in technology to find hate speech in different languages, including Hindi and Bengali, and has halved the amount of such content that people around the world see this year.