Rohingya seek redress from Facebook for role in Myanmar massacre

With roosters crowing in the background as he speaks from the overcrowded refugee camp in Bangladesh that has been his home since 2017, Maung Sawyeddollah, 21, describes what happened when violent hate speech and misinformation targeting Myanmar’s Rohingya minority began spreading on Facebook.

“We were fine with most of the people there. But some very nationalistic and narrow-minded guys escalated the hate against the Rohingya on Facebook,” he said. “And the people who were good, in close communication with the Rohingya, changed their minds against the Rohingya and it turned into hate.”

For years, Facebook, now called Meta Platforms Inc., pushed the narrative that it was a neutral platform in Myanmar that was misused by malicious people, and that despite its efforts to remove violent and hateful material, it unfortunately fell short. That narrative echoes its response to the role it has played in other conflicts around the world, whether in the 2020 U.S. election or hate speech in India.

But a new and comprehensive report from Amnesty International claims that Facebook’s preferred narrative is false. The platform, says Amnesty, was not simply a passive site with insufficient content moderation. Instead, Meta’s algorithms “proactively amplified and promoted content” on Facebook, inciting violent hate against Rohingya since 2012.

Despite years of warnings, Amnesty found, the company not only failed to suppress violent hate speech and misinformation against the Rohingya, but actively spread and amplified it until it culminated in the 2017 massacre. The timing coincided with the growing popularity of Facebook in Myanmar, where for many people it served as their only connection to the online world. That effectively turned Facebook into the Internet for a large number of the Myanmar population.

More than 700,000 Rohingya fled to neighboring Bangladesh that year. Myanmar’s security forces were accused of mass rape and murder and burning down thousands of homes owned by Rohingya.

“Meta, through its dangerous algorithms and relentless pursuit of profit, substantially contributed to the serious human rights violations perpetrated against the Rohingya,” the report says.

A Meta spokesperson declined to answer questions about the Amnesty report. In a statement, the company said it “stands in solidarity with the international community and supports efforts to hold the Tatmadaw accountable for its crimes against the Rohingya people.”

“Our security and integrity work in Myanmar continues to be guided by feedback from local civil society organizations and international institutions, including the United Nations Fact-Finding Mission on Myanmar; the Human Rights Impact Assessment we commissioned in 2018; as well as our ongoing human rights risk management,” Rafael Frankel, director of public policy for emerging markets, Meta Asia-Pacific, said in a statement.

Like Sawyeddollah, who is quoted in the Amnesty report and who spoke to the AP on Tuesday, most of the people who fled Myanmar (about 80% of the Rohingya living in Myanmar’s western Rakhine state at the time) are still living in refugee camps. And they are asking Meta to pay reparations for its role in the violent repression of Rohingya Muslims in Myanmar, which the U.S. declared a genocide earlier this year.

Amnesty’s report, released Wednesday, is based on interviews with Rohingya refugees, former Meta employees, academics, activists and others. It also draws on documents disclosed to Congress last year by whistleblower Frances Haugen, a former Facebook data scientist. It notes that digital rights activists say Meta has improved its engagement with civil society and some aspects of its content moderation practices in Myanmar in recent years. In January 2021, after a violent coup toppled the government, it banned the country’s armed forces from its platform.

But critics, including some of Facebook’s own employees, have long argued that such an approach will never really work. It leaves Meta playing whack-a-mole as it tries to remove harmful material, while its algorithms, designed to push “engaging” content that is more likely to rile people up, essentially work against it.

“These algorithms are really dangerous to our human rights. And what happened with the Rohingya and Facebook’s role in that specific conflict risks happening again, in many different contexts around the world,” said Pat de Brun, a researcher and adviser on artificial intelligence and human rights at Amnesty.

“The company has been completely unwilling or unable to address the root causes of its human rights impact.”

After the UN’s Independent International Fact-Finding Mission on Myanmar highlighted the “significant” role Facebook played in the atrocities against the Rohingya, Meta admitted in 2018 that “we were not doing enough to help prevent our platform from being used to foment division and incite violence offline.”

In subsequent years, the company “touted certain improvements in its community engagement and content moderation practices in Myanmar,” Amnesty said, adding that its report “concludes that these measures have been wholly inadequate.”

In 2020, for example, three years after violence in Myanmar killed thousands of Rohingya Muslims and displaced 700,000 more, Facebook investigated how a video of a prominent anti-Rohingya hate figure, U Wirathu, was circulating on its site.

The investigation revealed that more than 70% of the video’s views came from “chaining” — the video was suggested to people who had watched a different video, shown under what will “play next.” Facebook users were not searching for the video; it was fed to them by the platform’s algorithms.

Wirathu had been banned from Facebook since 2018.

“Even a well-resourced approach to content moderation, in isolation, would likely not have sufficed to prevent and mitigate these algorithmic harms. This is because content moderation fails to address the root cause of Meta’s algorithmic amplification of harmful content,” Amnesty’s report says.

Rohingya refugees are seeking unspecified reparations from the Menlo Park, California-based social media giant for its role in perpetuating the genocide. Meta, which is the subject of twin lawsuits in the U.S. and U.K. seeking $150 billion for Rohingya refugees, has so far refused.

“We believe that the genocide against the Rohingya was only possible because of Facebook,” Sawyeddollah said. “They communicated with each other to spread hate, they organized campaigns through Facebook. But Facebook was silent.”
