While Congress and the President spent an entire election pushing bad ideas to rein in big tech, the world stopped waiting for America to lead.
This week, the European Commission plans to advance its Digital Services Act (DSA) for the European Union. The DSA’s vision for rules governing online platforms stands in stark contrast to the United States, where Section 230 of the Communications Decency Act––the infamous immunity law that protects internet services that engage in content moderation––has been a lightning rod for political attention, high-profile attacks, and bad policy proposals from both parties. So now, the EU has stepped up to fill an internet policy leadership void in an ambitious way.
As we head into a new administration and Congress, American policymakers must seize the opportunity in front of them: learn from this process and weigh the consequences of trying to regulate content online, including the threats to markets and free speech that come with it. At the same time, the United States urgently needs to encourage innovation and agility in content management practices by strengthening non-governmental mechanisms that will help shape evolving norms and best practices. This is also a golden opportunity to pare back overly aggressive regulatory burdens, strengthen under-protective due process safeguards, and set a good example through our own laws. Tracking the development of DSA standards will undoubtedly be critical for American lawmakers as they undertake these challenges.
Just as the EU’s General Data Protection Regulation (GDPR) rewrote the playbook for internet privacy and data protection, so too will the DSA transform content policy. It will establish a new roadmap for transparency and accountability for online platforms like Facebook, Twitter, and YouTube that act as intermediaries for internet users who create and share content online. While the law’s implementation details are in places quite specific and prescriptive, the DSA’s core requirements are fairly straightforward in principle: platforms must take down material determined to be illegal, adopt and publish content policies, provide appeals procedures, and offer enough transparency that their users can understand what will happen with their content and activity online––and then be consistent in the execution of all of these policies and procedures.
Three pieces of this framework are worth a deeper examination: how the DSA regulates illegal content; how it approaches lawful but harmful content; and how it works alongside two other major EU initiatives––the Digital Markets Act and the Democracy Action Plan.
First, the DSA seeks to harmonize the management of content that is illegal to post or distribute across EU member states––“illegal content,” in the imperfect shorthand. In its current form, the DSA follows the global “notice and takedown” paradigm, in that it would require platforms to remove content after notice backed by meaningful due process, a common feature of other areas of law such as copyright. By taking steps to standardize illegal content processes across the 27-country bloc, the DSA should reduce the compliance costs associated with the current patchwork approach. It will also put the burden of determining legality where it belongs, with the government, rather than shifting that onus to the platforms themselves and creating incentives to suppress legitimate speech.
Second, for content and conduct online that is considered lawful but harmful (sometimes shortened to “harmful content”), the DSA’s authors appear to have opted to hold platforms responsible primarily through transparency and moderation consistency, similar to the PACT Act introduced in Congress in 2020. Rather than regulating specific platform practices or moderation outcomes for lawful but harmful speech—both of which would risk harming free expression—the DSA shifts the frame of online content moderation to one of consumer protection, and explicitly identifies this as its goal.
There are risks to the DSA’s approach. For example, aggressive timetables or lax procedural safeguards can skew platform behavior towards over-blocking, or can create structural advantages for large platforms that can scale resources for compliance more effectively. But if it works, this will be a step in the right direction. In content policy, government-mandated standards of conduct raise many concerns, including that they would risk being out of date before the (virtual) ink dried. A focus on transparency, consistency, and process would provide platforms with necessary flexibility, and users with key protections for their rights.
Finally, the DSA is not the only proposal in the EU’s legislative agenda for big tech. The Digital Markets Act will address concerns over concentrated power in the space, and the Democracy Action Plan will focus on disinformation in elections. While there is overlap among all of these agendas, the EU is keeping the policy processes separate to gain more traction in these complex and challenging problem spaces. The same issues the United States is grappling with—interference in elections, targeted disinformation campaigns, the size and scope of particular platforms—are under consideration by EU policymakers as well.
The pandemic has shone a bright spotlight on both the strengths and weaknesses of the internet ecosystem. Investing in getting the future of internet governance right will help preserve the former and mitigate the latter. And now is the time to engage, both in Europe and in the United States, to help prevent mistakes and regulatory overreach, to restore constructive problem solving and consensus building to our political processes, and to develop values and norms through the input and collaboration of all the diverse stakeholders in the internet ecosystem.