A leak of internal documents, published by the US press, has forced Facebook to issue a string of explanations.
The Wall Street Journal and other US outlets last week published internal Facebook documents leaked by someone inside Mark Zuckerberg's company.
One of the most striking findings was a study indicating that Facebook knew its social network Instagram is "toxic" for some teenagers.
The company has countered that this is a biased interpretation of the data presented by the media.
Here are five things the leaked documents revealed, and how the company responded.
1. Facebook knew Instagram was ‘toxic’ to teenage girls
One of the most significant revelations of the so-called "Facebook Files" was internal research into how Instagram influences the lives of teenagers.
In a survey, the social network found that the platform was a “toxic” place for many young people.
According to slides from the research, published by The Wall Street Journal, 32% of the teenage girls surveyed said that when they felt bad about their bodies, Instagram made them feel worse.
In response, Facebook said that this characterization "is not accurate."
“In fact, the research showed that many adolescents we spoke to feel that using Instagram helps them when they go through difficult times and problems that adolescents have always faced,” the social network said in a statement.
According to the statement, of 12 problems adolescents face – such as loneliness, anxiety, sadness and eating disorders – the social network made teenagers "feel better instead of worse" on 11. Only a third of those with body image problems said the social network made them feel worse, per the statement.
Amid this scrutiny, Facebook announced Monday that it has put on hold its plans to launch Instagram Kids, a version of its social network for children under 13.
2. Celebrities were treated differently by Facebook
According to the documents presented by The Wall Street Journal, many celebrities, politicians and high-profile Facebook users received special treatment regarding what content they could post, under a system known as XCheck (cross-check).
For example, Brazilian soccer player Neymar was allowed to show photos of a naked woman “to tens of millions of followers before Facebook removed the content,” says the US newspaper.
Although Facebook admits that the criticism was "fair," it points out that the XCheck system facilitates the dissemination of content on complex topics.
“This could include activists raising awareness of violence or journalists reporting from conflict zones,” says Facebook.
Facebook notes that many of the documents cited by The Wall Street Journal contain "outdated information stitched together to create a narrative that misses the most important point: Facebook itself identified the problems with cross-check and has been working to address them."
Despite this rebuttal, Facebook's own Oversight Board, which rules on complicated content-moderation decisions, has demanded more transparency.
In a blog post this week, the board said the disclosures had "drawn renewed attention to the seemingly inconsistent way the company makes decisions."
It cautioned that the lack of clarity around cross-checking could contribute to the perception that Facebook is "unduly influenced by political and commercial considerations."
Since it began its work investigating how Facebook moderates content, the Oversight Board, funded by Facebook, has made 70 recommendations on how the company should improve its policies.
3. The “weak” response to human trafficking
Documents presented in the US press also suggested that Facebook employees regularly detected posts about drug cartels and human traffickers, but the company’s response was “weak.”
In November 2019, BBC News Arabic released a report highlighting the problem of domestic workers put up for sale through Instagram.
According to internal documents, Facebook was already aware of the problem. The Wall Street Journal reported that Facebook only took limited action until Apple threatened to remove its products from its App Store.
In its defense, Facebook said it had a “comprehensive strategy” to keep people safe, including “global teams with native speakers covering more than 50 languages, educational resources, and partnerships with local experts and external fact-checkers.”
Critics warn that Facebook does not have the means to moderate all content on its platform and protect its 2.8 billion users.
David Kirkpatrick, author of the book The Facebook Effect, told the BBC that the social network appeared to have no motivation "to do anything to remedy harms" outside the United States.
“They have done a lot of things, including hiring tens of thousands of content reviewers,” he said.
"But a statistic that caught The Wall Street Journal's attention was that, despite all its work against disinformation in 2020, only 13% of that work was done outside the United States," he added.
"For a service that is 90% outside the United States, and that has had a huge impact, in a very negative way, on the politics of countries like the Philippines, Poland, Brazil, Hungary and Turkey, they are doing nothing to remedy all that."
Kirkpatrick suggested that Facebook only “responded to public relations pressures” in the United States because it could affect its stock price.
4. Facebook faces a major lawsuit from shareholders
Zuckerberg’s company is also facing a complex lawsuit from a group of its own shareholders.
The group alleges, among other things, that the $5 billion payment to the US Federal Trade Commission to resolve the Cambridge Analytica data scandal was so high because it was designed to protect Zuckerberg from personal liability.
Facebook said it had nothing to say about the ongoing legal matter.
5. Facebook promotes itself
In another report, the newspaper The New York Times indicated that Facebook had started a campaign to push positive news in order to improve its image.
According to the newspaper, the Project Amplify program was designed to "show people positive stories about the social network."
Facebook said there have been no changes to its news rating systems.
In a series of tweets, spokesman Joe Osborne said that a trial of what he called "an informational unit on Facebook" was very small and limited to "three cities." He assured that the posts were clearly labeled as coming from the company.
For Osborne, it was “similar to the corporate responsibility initiatives that people see in other tech companies and consumer products.”