The leaked Facebook document shows how often the company restores posts it had initially deleted, and how those rates change depending on where you live

Facebook removes tens of millions of posts each month over concerns that they advocate violence and hatred, spread misinformation related to the pandemic, encourage self-harm or show nudity.

If users believe their post has been unfairly removed, they can appeal.

Leaked internal documents reveal that the success of those appeals varies widely between countries: deleted content is up to five times more likely to be restored if you live in North America or Europe than if you live in Asia, South America or the Middle East.

This is according to disclosures made to the US Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen. The redacted versions received by Congress were reviewed by a consortium of news organizations, including the Toronto Star and La Presse.

An internal note titled “Appeal Rate by Country” includes a table of the top 38 countries by number of posts removed by Facebook in a single month in 2019. The United States tops the table with nearly 2.6 million “removals” during that period; Canada ranks 38th, with just over 123,000.

The graph shows a large discrepancy between the percentage of removed posts that are restored in Europe and North America (where it ranges from 2.8 per cent to 5 per cent) and countries in Asia, South America and the Middle East (where it ranges from 0.8 per cent to 1.8 per cent).

In a comment accompanying the graph, a Facebook employee writes that high restore rates in the US, Germany, the UK and Canada “probably mean that our products work better for these countries, and that they don’t work as well for countries like Myanmar, Dominican Republic and Cambodia.”

The employee also noted that different types of posts are appealed at different rates.

“If a country has a lot of hate speech, it could have a higher appeal rate than a country with a lot of nudity (which is traditionally appealed less often, probably because it is less subjective),” the memo says.

Meta, the new name for Facebook’s parent company, did not respond to questions about this global disparity in restoration rates.

Experts say this discrepancy could be due to a lack of human resources dedicated to content moderation outside of the world’s richest countries and an over-reliance on artificial intelligence to get the job done.

Heidi Tworek, a professor of science and technology studies at the University of British Columbia, wonders whether these restore rates indicate overzealousness on Facebook’s part in removing suspect posts.

“I don’t know. But it does tell us that mistakes are being made,” she said.

Tworek noted that Germany’s Network Enforcement Act and Canada’s proposed online harms bill threaten social media companies like Facebook with large fines if certain types of harmful content are not removed within 24 hours. This encourages social media companies to adopt a delete-first attitude. Whether something is restored depends more on users’ propensity to appeal, and on the ability of Facebook staff to understand the language and context of the post and the appeal, she said.

“Perhaps the high restoration rates are what we would see everywhere if FB had the right number of human moderators, with the right language skills,” Tworek said.

“It gives us insight into the mindset of what FB is researching and what it is not. They want to find out why people aren’t appealing more, but not whether this tells us something about the quality of the AI or whether it means more content moderators are needed.”


The leaked document is part of the Facebook Papers, which were first reported on by the Wall Street Journal and have now been shared with media around the world. Haugen, the former Facebook employee behind the leak, testified before the US Senate and UK Parliament last month, claiming that Facebook prioritized profits over user safety and withheld its own research from investors and the public. The leak contains documents covering a variety of topics, from efforts to increase Facebook’s audience, to how its platforms could harm children, to its alleged role in inciting political violence.

“We are a company and we make a profit, but the idea that we do it at the expense of the safety or well-being of people misunderstands where our own business interests lie,” a Meta spokesperson said in a written statement. The company is on track to spend $5 billion this year on “safety and security,” including employing 15,000 people to review content in more than 70 languages.

In the leaked appeals document, there is also a large discrepancy in the number of removed posts that are appealed, ranging from a low of 6.6 percent in Cambodia to a high of 28.1 percent in the US.

“Appeal rates are high in the US + other similar countries (Germany, UK, Canada) and very low in some other countries (India, Indonesia, Myanmar and Cambodia),” the memo reads. “If we want to have more equitable appeal rates across countries, we may need to customize the appeal experience for certain types of countries.”

These comments reveal an internal mindset within the company that appears to prioritize hitting target numbers rather than addressing the underlying issues of the spread of hate, violence and misinformation, said Fenwick McKelvey, a professor of information and communication technology policy at Concordia University who studies social media.

“Facebook is grouping these issues as if they could all be tackled in the same way,” he said. “It is a managerial response to real social problems that require more nuance.”

While it’s difficult to draw conclusions from a single document, McKelvey said this approach is typical of Facebook’s previous actions to address abuse as a technical issue rather than a social one.

“These automated systems seem optimized to protect Facebook’s legitimacy in the eyes of governments that want to regulate its behaviour. They are not designed to combat the harms the systems were supposed to address,” he said.

“This seems insincere for a company that claims to have a global commitment to human rights.”


