How Europe's data law could make the internet less toxic
Brussels
Sweeping new European Union legislation seeks to "revolutionize" the internet, forcing social media giants including Facebook and YouTube to take steps to tamp down the spread of extremism and disinformation online.
Known as the Digital Services Act (DSA), it is likely to create ripple effects that could change how social media platforms behave in America, too.
In one of the most striking requirements of the new law, Big Tech companies with more than 45 million users will have to hand over access to their so-called algorithmic black boxes, lending greater clarity to how certain posts, particularly divisive ones, end up at the top of social media news feeds.
Why We Wrote This
A new EU law calls on Big Tech companies to open up their algorithmic "black boxes" and better moderate online speech. The goal is no less than preserving the public square on which democracies depend.
Companies must also put in place systems designed to speed up how quickly illegal content is pulled from the web, prioritizing requests from "trusted flaggers."
And if platforms recognize patterns that are causing harm and fail to act, they will face hefty fines.
"We need to get under the hood of platforms and look at the ways in which they are amplifying and spreading harmful content such as hate speech," says Joe Westby, deputy director of Amnesty Tech at Amnesty International in London.
"The DSA is a landmark law trying to hold these Big Tech companies to account," he adds.
Unlocking the Big Tech business model
Big Tech companies have long endeavored to shrug off regulation by invoking freedom of speech. The DSA takes the tack that while ugly and divisive speech shouldn't be policed, neither should it be promoted or artificially amplified.
But in order to sell ads and collect user data, which they also sell, the big online platforms have been doing precisely this.
The key to this business model is keeping users online for as long as possible, in order to collect as much data about them as possible.
And research has shown that what keeps people reading and clicking is content that makes them mad, notes Jan Penfrat, senior policy adviser at European Digital Rights, a Brussels-based association.
This in turn gives Big Tech companies an incentive to prioritize and push out anger-inducing content that provokes users "to react and respond," he says.
This point was driven home last year through a trove of internal documents made public by whistleblower and former Facebook data engineer Frances Haugen.
In leaked company communications, an employee laments that extremist political parties were celebrating Facebook's algorithms, because they rewarded their "provocation strategies" on subjects ranging from racism to immigration and the welfare state.
It was one of many examples in those documents of how Facebook's algorithms appeared to "artificially amplify" hate speech and disinformation.
To address this, the DSA will require Big Tech companies to conduct and publish annual "impact assessments," which will examine their "ecosystem of users and whether or not – or how – recommendation algorithms direct traffic," says Peter Chase, senior fellow at the German Marshall Fund in Brussels.
"It's asking these large platforms to think about the social impact they have."
There are insights to be had from these sorts of regular exercises, analysts say.
Twitter, which has a reputation for publishing self-critical research, made public an internal evaluation last October that found its own algorithms favor conservative rather than left-leaning political content.
What it couldn't quite figure out, the company admitted, was why.
The DSA aims to provide some clarity on this front by requiring Big Tech companies to open up their algorithmic black boxes to academic researchers approved by the European Commission.
In this way, EU officials hope to glean insights into, among other things, how Big Tech companies moderate and rank social media posts. "On what basis do they recommend certain types of content over others? Hide or demote it?" Mr. Penfrat asks.
And under the law, if Big Tech companies discover patterns of artificial amplification that favor hate speech and disinformation pushed out by bad actors and bots (what social media companies call "coordinated inauthentic behavior") and don't take action to stop it, they face devastating fines.
These could run up to 6% of a company鈥檚 global annual sales. Repeat offenders could be barred from operating in the EU.
"They have to do something about it, or they can get caught," says Alex Engler, fellow in Governance Studies at the Brookings Institution.
"So they can't just shrug their shoulders and say, 'We don't have a problem.'"
"Weaponized" ads and the law's response
Up until now, such evasiveness is precisely what has characterized Big Tech companies, and analysts say it's largely because promoting divisive content has been so wildly profitable.
Mr. Penfrat recalls the surprise of EU policymakers he lobbied when he would explain the nearly unfathomable amount of personal data that tech giants commodify, and how they often tap the emotional power of anger through a "surveillance-based" advertising model.
"Every single time you open a website, hundreds of companies are bidding for your eyeballs," he says. In a matter of "milliseconds," the ads pushed by data brokers who have won the bid are loaded for web users to view.
But it's not just goods and services that advertisers are selling. "Anyone can pay Facebook to promote certain types of content; that's what ads are. It can be political and issues-based," Mr. Penfrat says.
And bad actors have taken advantage of this, he notes, pointing to how the Russian government "weaponized" ads to push its preferred candidates in U.S. elections and justify war in Ukraine.
The DSA will ban using sensitive data, including race and religion, to target ads, and prohibit ads aimed at children as well. It also makes it illegal to use so-called dark patterns, manipulative practices that trick people into things such as consenting to let online companies track their data.
What's more, it requires Big Tech companies to speed their processes for taking down illegal posts, including terrorist content, so-called revenge porn, and hate speech in some countries that ban it, in part by prioritizing the recommendations of "trusted flaggers," which could include nonprofit groups approved by the EU.
Likewise, if companies remove content that they say violates these rules, they must notify people whose posts are taken down, explain why, and have appeals procedures.
"You've got these mechanisms today, but they're very untransparent," Mr. Penfrat says. "You can appeal but never get a response."
A European law with U.S. effects
The DSA has been received by data-policy experts with a mix of skepticism and praise, with some voicing worry about unintended harm to competition or the diversity of online speech.
Yet the DSA is expected to drive policy in the United States as well as in Europe, says Mr. Engler, who studies the impact of data technologies on society.
"This is a classic example of the 'Brussels effect': the idea that when Europe regulates, it ends up having a global impact," he adds. "Platforms don't want to build different infrastructure based on whether the IP address is in Europe."
And as academics delve into Big Tech's black boxes, the mitigating measures they suggest will not only be a good starting point for public debate, but could also provide inspiration for America.
During Ms. Haugen's whistleblower testimony, U.S. lawmakers signaled that they could be open to the sorts of regulations that the DSA puts in place.
At a press conference following the congressional testimony last October, Sen. Richard Blumenthal, a Connecticut Democrat, marveled at the bipartisan agreement on the need for reform.
"If you closed your eyes, you wouldn't even know if it was a Republican or a Democrat" speaking, he said. "Every part of the country has the harms that are inflicted by Facebook and Instagram."
American and European regulators say this is true on both sides of the Atlantic.