Leak shows Facebook’s business model needs regulating, says MEP
The European Parliament’s lead and shadow rapporteurs for a major reboot of the bloc’s digital rulebook have called for an investigation following the Facebook whistleblower leaks.

One of the MEPs has also called for incoming EU rules to directly tackle business models that favor “disinformation and violence over factual content”.

In a joint statement, the lead rapporteur for the EU’s Digital Services Act (DSA), Christel Schaldemose (S&D), and Alexandra Geese (shadow rapporteur for the Greens/EFA), said they are in touch with the former Facebook employee turned whistleblower, Frances Haugen.

In an interview with 60 Minutes today, Haugen revealed herself as the source of a raft of recent leaks to The Wall Street Journal which has reported on the internal documents for a number of stories — including that Facebook’s internal research suggested Instagram made teenage girls’ anxiety and body image issues worse and that the tech giant operated policy carve-outs for whitelisting celebrities.

The two MEPs said the leaks make it clear that Big Tech must not be allowed to continue to regulate itself.

The EU’s executive moved forward last December with a major reboot of the digital rulebook, introducing the DSA alongside another piece of regulation specifically targeted at tech giants’ market power (aka the Digital Markets Act), and kicking off an (ongoing) process of negotiations between EU institutions to amend and adopt legislation extending platforms’ accountability.

The support of the European Parliament is required to pass the digital policy packages. And Geese is unlikely to be alone in calling for stronger measures than were contained in the Commission’s original DSA proposal in light of the latest ugly Facebook revelations.

In the joint statement, Schaldemose said that large tech companies have shown they are “simply not capable” of responsible self-regulation.

“The governing of our shared spaces on social media must be done through democratically controlled institutions just as we have done in the parts of our society that do not lie in the digital realm. We must demand transparency from the tech companies and we must allow civil society, law makers and scholarly experts to have insight into the building blocks of the algorithms. This is the only way that we can have a public debate about the effects of these algorithms,” she also said. 

“Today, we know this from the files, there are arbitrary protections of celebrities and a huge focus on negative, wrong and conflict-ridden content that threaten to undermine the very democratic conversation that we once hoped the social media platforms could strengthen. To keep that hope alive and to allow all voices the ability to join in on the conversation, we must put firm demands to the companies governing these spaces.”

Geese went further — calling for the DSA to be strengthened in light of Haugen’s whistleblowing — arguing that the exposures are game-changing and make the case for regulating whole business models when they benefit from the amplification of disinformation at the expense of truthful content.

“I am extremely grateful for the courage of the whistleblower that finally gives us insights we need to effectively legislate. The revelations couldn’t be more timely for the work on the DSA,” said Geese. “The huge volume of documents and the person’s deep expertise are impressive. Until now, neither the public nor legislators have been able to gain such a deep insight into the mechanisms that have become far too powerful. The documents finally put all the facts on the table to allow us to adopt a stronger Digital Services Act.

“The conversation confirms my view that we need strong rules for content moderation and far-reaching transparency obligations in Europe. In a democracy we cannot tolerate an internet where some people have the right to promote violence and hatred in spite of the rules and others see perfectly legal content taken down by automated filters.

“We need to regulate the whole system and the business model that favours disinformation and violence over factual content – and enables its rapid dissemination. We also need consistent enforcement in Europe. It is naïve to appeal to corporate self-regulation and responsibility. We as elected politicians have the responsibility for democratic discourse and must exercise it in the legislative process.”

In her interview with 60 Minutes, Haugen was quizzed about a complaint made to Facebook in 2019 by major political parties across Europe — which were said to have raised concerns with the tech giant that its algorithmic preferences were forcing them to “skew negative” in their communications on its platforms and that was leading them to adopt more extreme policy positions.

“You are forcing us to take positions that we don’t like, that we know are bad for society, we know if we don’t take those positions we won’t win in the marketplace of social media,” said Haugen, summarizing the parties’ concern in the interview.

Facebook was contacted for a response to the MEPs’ joint statement.

In a statement to Reuters, the tech giant reiterated its customary claim that it has “been advocating for updated regulations where democratic governments set industry standards to which we can all adhere”.

Haugen has said that she made the decision to turn whistleblower after becoming frustrated that Facebook was not responding to such concerns and that executives at the company were instead prioritizing its financial performance over making changes to its content-sorting algorithms that could reduce the platform’s negatively polarizing effects on society.

“Facebook has thousands of [content] options it could show you. And one of the consequences of how Facebook is picking out that content today is it optimizing for content that gets engagement or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing — it’s easier to inspire people to anger than it is to other emotions,” Haugen also told 60 Minutes.

A year ago the European Parliament voted to back a call for tighter regulations on behavioral ads — such as those which power Facebook’s content-sorting social media business — advocating for less intrusive, contextual forms of advertising and urging EU lawmakers to consider further regulatory options, including asking the Commission to look at a phase-out leading to a full ban.

With ever more ugly revelations coming out of Facebook — seemingly on a weekly basis — momentum could well build in the European Parliament for taking a far tougher line on engagement-based business models.

Facebook founder Mark Zuckerberg got a frosty reception from MEPs back in 2018 — the last time he took an in-person, publicly streamed meeting with a part of the institution, in that case in the wake of the Cambridge Analytica data misuse scandal.

Asked about the MEPs’ statement today, a Commission spokesperson told the Reuters news agency that its position in favor of regulation is “clear”, adding: “The power of major platforms over public debate and social life must be subject to democratically validated rules, in particular on transparency and accountability.”