Facebook’s misinformation problems are worse in India
The Frances Haugen leaks suggest that Facebook’s problems with extremism are particularly dire in some regions. Documents provided by Haugen to The New York Times, The Wall Street Journal, and other media outlets indicate that the company was aware it had promoted disinformation and extremist violence in India.
The social network also reportedly lacked the resources to deal with the spread of harmful content in the densely populated country, and it did not respond with adequate action when tensions erupted.
A case study from early 2021 indicated that much of the harmful content from groups like Bajrang Dal, a Hindu nationalist militant organization, was not flagged on the social network or its apps because the company lacked the technical capability to detect content written in Bengali and Hindi.
Meanwhile, the company reportedly declined to designate Rashtriya Swayamsevak Sangh, a right-wing Hindu nationalist volunteer paramilitary organization, for removal due to political sensitivities.
Bajrang Dal, which is linked to Prime Minister Modi’s party, had not been removed despite an internal Facebook call to take down its material. The company also maintained a whitelist of politicians exempt from fact-checking.
According to the leaked documents, the company had been struggling for five months to combat hate speech. Internal research also showed how quickly Facebook’s recommendation engine could surface toxic content.
A test account that followed Facebook’s recommendations for three weeks was exposed to a near-constant stream of divisive nationalism, disinformation, and violence.
The company said the leaks do not tell the full story. Spokesman Andy Stone argued that the data was incomplete and did not account for the company’s extensive use of third-party fact-checkers outside the United States.
He added that the company has invested heavily in hate-speech detection technology for languages such as Bengali and Hindi, and that it continues to improve that technology.
Read also: Facebook wants to expose intellectual property violations
Facebook says its corrective efforts are a work in progress
Facebook defended its practices, saying it has an industry-leading process for reviewing and prioritizing countries at high risk of violence every six months.
The company noted that its teams consider long-term and historical issues alongside current events, and added that it works with local communities, continually improving its technology and refining its policies.
However, the response did not address some of the concerns. India is the company’s largest single market, with 340 million people using its services, yet 87 percent of the company’s misinformation budget is focused on the United States.
Even accounting for third-party fact-checkers, this suggests that India is not receiving a proportionate amount of attention.
The company also did not address concerns that it was turning a blind eye to certain people and groups, beyond a previous statement that it enforces its policies without regard to position or affiliation.
In other words, it’s not clear that Facebook’s problems with misinformation and violence will improve in the near future.
Read also: Comprehensive review of the Portal Go from Facebook