June 5, 2023


Facebook is reluctant to fight misinformation in India

New Delhi (AP) – Facebook's India operation has been selective in curbing the spread of misinformation and inflammatory content, especially anti-Muslim posts, according to leaked documents obtained by The Associated Press.

The internal documents, covering India over the past two years, highlight Facebook's ongoing struggle to quash misinformation on its platform in the world's largest democracy and the company's fastest-growing market, where political and religious tensions play out on social media and stoke violence.

The documents show that Facebook has been aware of the problems for years, raising questions over whether it has done enough to address them. Many civil society watchdogs and digital experts say it has not, especially in cases involving Prime Minister Narendra Modi's Bharatiya Janata Party (BJP).

Around the world, Facebook has become an important political tool, and India is no exception.

Modi is said to have used the platform to his advantage during elections, and reporting in The Wall Street Journal last year cast doubt over whether Facebook was selectively enforcing its policies on hateful content for fear of retaliation from the BJP. A 2015 photo of Modi and Facebook CEO Mark Zuckerberg hugging at Facebook headquarters suggested a warm relationship between the two.

The leaked documents include internal reports on hate speech and misinformation in India, which in some cases were amplified by Facebook's own recommendation features and algorithms. They also reflect staff concerns over the company's handling of these issues and their discomfort with the inflammatory content circulating on the platform.


According to the documents, Facebook considered India one of the most "at-risk" countries in the world and designated Hindi and Bengali as priority languages for detecting violations of its rules on hostile content. Yet Facebook did not have enough moderators or staff fluent in those languages to catch the misinformation that has at times spilled over into real-world violence.

In a statement sent to the AP, Facebook said it had "invested significantly in technology to detect hate speech in various languages, including Hindi and Bengali," which in 2021 had "halved the amount of hate speech that people see."

This and other stories about Facebook being published this week are based on disclosures made to the U.S. Securities and Exchange Commission by former Facebook employee Frances Haugen. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

In February 2019, just before the general election, as concerns about misinformation were mounting, a Facebook employee decided to test what a new user in India would see if all they did was follow the pages and groups recommended by the platform itself.

The employee created a new account and kept it active for three weeks, a period during which 40 Indian soldiers were killed in a militant attack in Kashmir and the country came to the brink of war with Pakistan.

The employee, whose name was not released, said they were "shocked" by what they saw, describing the feed as "an almost constant barrage of divisive nationalist content, misinformation, violence and gore."


The seemingly innocuous groups recommended by Facebook quickly filled with hateful content, unverified rumors and viral posts.

The recommended groups were awash in fake news, anti-Pakistan rhetoric and Islamophobic content. Much of the content was extremely graphic.

One post showed a man holding the bloodied head of another man draped in a Pakistani flag, with an Indian flag partially covering it. Facebook's trending news section carried a flood of unverified content about India's retaliation after the attack, including an image of a napalm bombing taken from a video game clip that Facebook's fact-checkers had debunked.

The employee who ran the experiment wrote, "I have seen more images of dead people in the past three weeks than I have seen in my entire life."

A Facebook spokesperson said the test "inspired deeper, more rigorous analysis" of its recommendation systems and "contributed to product changes to improve them."


Sam McNeill contributed to this report from Beijing.