The European Union (EU) has reached agreement on the final version of the Digital Services Act (DSA), its landmark regulation setting rules of responsibility and accountability for internet companies that facilitate user content and communications online. Although the final text, which is not yet public, has not been officially approved by the European Parliament, experts consider that step a formality in this case. The DSA has been compared to the EU’s General Data Protection Regulation (GDPR) in that it goes beyond established norms to chart a new course for government intervention, with rules likely to be treated as the new standard across the Western internet world going forward.

At the outset of this process, in December 2020, R Street’s blog set out some expectations for what the legislation would do. Looking back a year and a half later, those expectations appear to have been realized, while a few surprises have made their way into the mix:

  1. The DSA will “establish a new roadmap for transparency and accountability for online platforms.”

Without question, the DSA imposes substantial new transparency and accountability requirements, with the heaviest weight of regulation falling on companies with more than 45 million monthly active users (MAU) in the EU. For reference, that threshold was calculated as approximately 10 percent of the EU’s total population of roughly 450 million; for Q1 2022, the estimated MAU of the Facebook service was approximately 90 percent of the EU’s population. While the official release mentions transparency only in the context of recommender systems, Mathias Vermeulen of AWO, a European data rights agency, calls the bill “a data-gathering machine” for the scale of its increased transparency obligations. Additionally, while specific content mandates (see below) will draw greater scrutiny and discussion, the centerpiece of the DSA lies in its provisions requiring self-assessment and risk mitigation and allowing those internal workups to be reviewed by independent auditors and researchers.

  2. The DSA will require platforms to:

    • “take down material determined to be illegal,” including harmonizing illegal content processes across the EU to reduce compliance costs;

    • “adopt and publish content policies, have appeals procedures, and provide transparency so that their users can understand and know what will happen with their content and activity online” (with the DSA itself refraining from regulating specific practices or moderation outcomes where content is lawful but harmful); and

    • “be consistent in the execution of all of these policies and procedures.”

The final DSA text will include specific mechanics and processes for complying with law enforcement and court orders, standardizing processes across the EU in a long-needed procedural update. Fortunately, the EU continues to steer away from upload filters or any similar unrealistic technology mandate that would place impossible content-policing burdens on technology intermediaries. Instead, the DSA includes a raft of substantial procedural obligations related to content moderation, as Daphne Keller’s analysis articulates. As she notes, small enterprises can be exempted, and medium enterprises can be given some flexibility in compliance. Consistency in execution is a matter for enforcement: the official release states that obligations specific to the largest service providers will be enforced directly and exclusively by the European Commission, though the details of that mechanism are both uncertain and highly important. Without the full text, it is difficult to say anything at this point about enforcement mechanisms for the remainder of the statute’s provisions.

  3. The DSA risks:

    • Aggressive timetables;

    • Lax procedural safeguards; and

    • Structural advantages for large companies.

Five years ago, the German government’s landmark NetzDG legislation included an aggressive 24-hour timetable within which social media companies would be legally required to take down any online content that was “obviously illegal” (alternatively framed as “manifestly unlawful”). Such an obligation effectively delegates a fundamentally governmental activity to a private actor, a transfer that is dangerous as a matter of principle and, in practice, virtually certain to result in overcompliance, as numerous advocates have argued. While the DSA appears to avoid such hard requirements, early indications suggest that comparable provisions have arisen in the context of responses to notices from end users regarding illegal hate speech.

As with most EU-level laws, the mechanisms and procedural safeguards themselves are well articulated. Whether they will be applied in lax ways, resulting in economically inefficient or societally harmful outcomes, will largely depend on enforcement mechanisms, which will remain unclear for some time to come.

Some critics argue that the GDPR empowers big tech at the expense of its competitors. Certainly, to the extent that legislation imposes expensive compliance obligations across an industry, the biggest companies can hire the most expensive compliance lawyers. New tech laws can often appear to be motivated principally by the behavior of a few large companies, if only because media attention and debate gravitate toward those few; the DSA is no exception. It endeavors to address this through explicit tiering of obligations as well as various exemptions and relaxations for smaller companies. While this approach is, in some sense, a refreshing dose of honesty, it is too soon to gauge its effectiveness at mitigating unintended consequences.

But ultimately, it is the new parts of the DSA that will gain the most attention and that may have the most practical impact: the obligations on which consensus was not reached in 2020 but managed to develop over the past year and a half. Three notable additions stand out:

  1. A ban on dark patterns, via a prohibition on “online interfaces” designed “in a way that deceives, manipulates or otherwise materially distorts or impairs the ability of recipients of their service to make free and informed choices.” This is sweeping language. Absent an enforcement process that surfaces particularly juicy internal memoranda, such behavior seems difficult to prove in practice, even if the spirit is commendable. Notably, in the United States, the Federal Trade Commission is currently working to use its existing authority over deceptive practices to go after dark patterns; in 2019, bipartisan legislation was introduced to ban dark patterns, but it did not advance.

  2. A requirement to offer non-profiling-based recommender systems, applicable both to so-called Very Large Online Platforms and to Very Large Online Search Engines, both set at the aforementioned 45 million MAU threshold. Active discussion ensued on Twitter as to the meaning of “profiling-based,” such as whether the requirement amounts to forcing a linear chronological feed option in the context of social media, which some, but not all, services already offer (see the sketch after this list). Other open questions persist as well, including whether it would suffice to permit a user to run search queries from a web browser in private mode while not logged in.

  3. A prohibition on the use of minors’ personal data in targeted advertising. This was apparently one of the most hotly contested political issues during the final negotiations; in particular, the Parliament wanted to expand the GDPR’s limits on the use of sensitive data, an expansion not reflected in the EU’s press release. The final text will reveal the outcome and the rules going forward, but the practical consequences for the internet industry could be significant.
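To make the recommender-system distinction concrete, below is a minimal Python sketch of the difference between a profiling-based feed, which ranks content using a per-user profile, and a non-profiling chronological alternative. The Post structure, the engagement-scoring stub and all names here are hypothetical illustrations, not anything drawn from the DSA text.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime

# Hypothetical stand-in for a machine-learned model that predicts how much
# a specific user will engage with a post, based on that user's profile.
def predicted_engagement(post: Post, user_profile: dict) -> float:
    interests = user_profile.get("interests", set())
    return sum(word in interests for word in post.text.lower().split())

def profiling_feed(posts: list[Post], user_profile: dict) -> list[Post]:
    """Profiling-based: rank by per-user predicted engagement."""
    return sorted(posts, key=lambda p: predicted_engagement(p, user_profile),
                  reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    """Non-profiling: rank by recency only; no personal data is consulted."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

posts = [
    Post("alice", "new climate policy analysis", datetime(2022, 4, 25, 9, 0)),
    Post("bob", "weekend cat photos thread", datetime(2022, 4, 25, 12, 0)),
    Post("carol", "eu tech regulation explainer", datetime(2022, 4, 24, 18, 0)),
]
profile = {"interests": {"eu", "policy", "regulation"}}

print([p.author for p in profiling_feed(posts, profile)])  # ['carol', 'alice', 'bob']
print([p.author for p in chronological_feed(posts)])       # ['bob', 'alice', 'carol']
```

A strictly chronological option like this consults no personal data at all; whether other non-personalized ranking signals, such as global popularity, would also satisfy the requirement is one of the open questions noted above.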

As previously noted, the DSA should be understood in the context of the EU’s parallel efforts to adopt the Digital Markets Act (DMA) and the more framework-like Democracy Action Plan. The DMA outraced the DSA to the regulatory finish line by one month and will enter a similar phase of finalizing and publishing the official language alongside the massive challenge of developing and executing enforcement capabilities. The Democracy Action Plan, while not tied to major legislation in the same way, has nevertheless made significant progress on a number of fronts.

Meanwhile, in the United States, despite several high-profile hearings, no bills comparable to the DSA have made significant progress, although many ideas have been put on the table. Some traction has been seen in bills that skew away from intermediary responsibility and toward calls for transparency and accountability in the use of “algorithms.” In computer science, an algorithm is something akin to a recipe or formula that describes how a piece of software will work, as the example below illustrates. In modern and especially political parlance, however, the term has become shorthand for a small subset of computer programs: typically, machine-trained automated decision-making systems that filter and prioritize the content presented to users.
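For illustration, here is a minimal Python example of an algorithm in the classic computer-science sense: an explicit, auditable recipe whose every step can be read and checked, in contrast with the machine-trained ranking systems that dominate the political debate. The keyword-filter task and all names are our own hypothetical choices, not taken from any bill.

```python
# An "algorithm" in the computer-science sense: an explicit, step-by-step
# recipe. Each rule below is visible and auditable, unlike the millions of
# learned weights inside a machine-trained recommender system.
def flag_posts(posts: list[str], banned_words: set[str]) -> list[str]:
    """Return the posts that contain at least one banned word."""
    flagged = []
    for post in posts:
        words = set(post.lower().split())   # step 1: normalize and tokenize
        if words & banned_words:            # step 2: check for overlap
            flagged.append(post)            # step 3: record the match
    return flagged

print(flag_posts(["buy pills now", "lunch was great"], {"pills"}))
# ['buy pills now']
```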

Because the underlying artificial intelligence technologies are often difficult for non-experts to understand, and because their outputs are often difficult for even experts to explain, given the massive training sets that power them and the complex iterative feedback loops that develop over time, the word “algorithm” has become something of a lightning rod. Despite what feels like bipartisan agita over the practices of big technology companies, as the summer legislative recess and midterm elections approach, it is far from certain that anything substantive will pass in this legislative session, leaving the EU, again, out in front of the rest of the world.
