
Facebook: More and more insiders describe devastating corporate management

Whistleblower Frances Haugen has made thousands of pages of internal Facebook documents public. These Facebook Files are explosive: according to Haugen, they prove that the company “has repeatedly misled the public about what its own research reveals about safety for children, about the effectiveness of its artificial intelligence and about its role in disseminating divisive and extremist messages.”

Haugen joined Facebook in 2018. From 2019 she worked as a senior product manager on the “Civic Misinformation” team, whose task was to counter false information and agitation. By her own account, however, she soon no longer believed that her team could make a difference. She decided to act as a whistleblower and inform the general public about the grievances at Facebook. In May 2021, she left the company with large amounts of material that she has since made available to various media.


In addition, she has filed eight complaints with the US Securities and Exchange Commission. In her view, Facebook willfully misrepresented to investors, in public statements, the extent of serious problems with its products and its knowledge of them, thereby violating securities laws. Haugen was also invited to appear before the US Senate and the British Parliament to present her view of the problems at Facebook to politicians in both countries.

Some of the allegations from the Facebook Files in a nutshell:

  • The same rules do not apply to all of the nearly three billion users of Facebook’s platforms, even though Facebook boss Mark Zuckerberg repeatedly claimed otherwise, including under oath before the US Senate. Facebook created a system called XCheck that exempted almost six million users, mostly celebrities, from the content rules. These users could allow themselves missteps that would have been sanctioned in the case of ordinary members.
  • The content that young people see on Instagram endangers their self-esteem and can trigger depression or even suicidal thoughts. A third of girls who feel uncomfortable in their bodies feel worse when they use Instagram. This is just one of many figures from internal studies on the negative effects on adolescents.
  • Facebook has set itself the goal of attracting more younger children. “Why should we care about tweens?”, The Wall Street Journal quotes from a document from 2020, which also immediately gives the answer: “They are a valuable but untapped target group”. “Tweens” are children between the ages of 10 and 13. With Messenger Kids, Facebook already operates a first product for under-13s. A similar product was to follow for Instagram; after criticism, the project is currently on hold.
  • Facebook’s newsfeed algorithms favor posts that generate negative emotions such as outrage and sensationalism. The Wall Street Journal detailed a change to the newsfeed algorithm in 2018 that was meant to strengthen bonds between users and encourage interactions between friends and family. Facebook was supposed to become a more peaceful place, and at the same time the measures were supposed to drive up interaction rates. When it became clear that the changes led to more interaction but also more outrage, Mark Zuckerberg personally objected to some of the proposed corrections. He feared they would reduce people’s interaction on Facebook again.
  • In so-called engagement-based ranking, algorithms decide on the basis of comments, likes and shares who gets to see what on Facebook and Instagram. The problem: Facebook does not disclose the algorithms, and they are not audited by any independent body: “Only Facebook knows how to personalize the news feed for you.” According to Haugen, this can no longer be tolerated for such a powerful and extremely profitable corporation.
  • Facebook’s AI does little to help identify harmful content. “We will probably never have a model that detects the majority of integrity violations, especially in sensitive areas,” the Süddeutsche Zeitung quotes a Facebook researcher as writing in a 2019 memo. He spoke of a detection rate of two percent; another internal study puts it at three to five percent.
  • There are too few moderators for malicious content, especially outside the English-speaking world. For example, moderators are reportedly lacking for many countries where Arabic is spoken.

Frances Haugen raises serious allegations against Facebook at her hearing in the US Senate.

(Image: Alex Brandon – AP – dpa)

Facebook vehemently defends itself against these and other allegations. Mark Zuckerberg wrote in a blog post that his company does not put profit above safety and well-being. For example, his network changed the news feeds to show fewer viral videos and more content from friends and family. Facebook introduced this even though it was clear to the company that people would then spend less time on the network.

Facebook says it employs 15,000 moderators covering more than 70 languages around the world, including Arabic in Yemen, Libya, Saudi Arabia and Iraq, to combat fake news. According to Guy Rosen, VP of Integrity, Facebook has reduced the prevalence of hate speech viewed on the platform by 50 percent over the past three quarters. Only 0.05 percent of the content viewed is hate speech.

Frances Haugen is not the first Facebook whistleblower. Sophie Zhang had already denounced systematic problems and dubious decisions at Facebook. Zhang had worked as a data scientist on the so-called Site Integrity Fake Engagement team since 2018. Her job was to fight bot accounts.

She reported many suspicious bot networks in more than twenty countries. In the case of a network that supported Honduran President Juan Orlando Hernández, it took Facebook more than a year to finally delete the affected profiles. In many other cases, Facebook did nothing because the networks did not meet Facebook’s changing criteria for being prioritized for bans, Zhang said. In general, she describes the decision-making processes as hasty and haphazard. In 2020, she was fired from Facebook.

In April 2021, an internal investigation into the role Facebook played in the attack on the US Capitol on January 6 was leaked to the online platform Buzzfeed. Although the company had spent months preparing for a possible delegitimization of the election by Trump and his supporters, according to Buzzfeed the report found that those efforts were outmaneuvered by a powerful network of coordinated accounts.

Could Facebook have acted more effectively against the “stop the steal” movement of Trump supporters? The non-profit Media Matters for America notes that Facebook did not even have to resort to its own safeguards to recognize that this was a concerted action to delegitimize the election. Numerous media outlets had already pointed out in November how the slogan was being used to spread false information en masse in Facebook groups.

With Messenger Kids, Facebook already operates a product for under 13-year-olds.

In the course of the publications relating to the Facebook Files, other whistleblowers who report on grievances have approached the media, albeit anonymously so far. In a complaint to the US Securities and Exchange Commission, one former employee likewise accuses the company of putting profit over integrity.

According to a report in the Washington Post, he worked for Facebook’s “Integrity” team, which is supposed to protect users of the platforms from harmful and illegal content. According to him, Facebook executives repeatedly undermined these efforts so as not to jeopardize the growth of the platform or anger the US administration under Donald Trump. Among other things, there was a “white list” of media outlets excluded from Facebook’s internal review process. The right-wing conservative portal Breitbart News, which Steve Bannon ran before he became Trump’s advisor, is also said to have been on this list.

In another report on a fairly recent case, the Washington Post cites three unnamed insiders. In the run-up to the national congress of Vietnam’s ruling Communist Party, Mark Zuckerberg personally arranged for the censorship of articles critical of the government to be significantly tightened. The company, which otherwise flies the flag of freedom of expression, thereby made itself an accomplice to Vietnamese censorship for profit reasons.
