
Whether and how technological changes reshape free speech rights remain contested questions across the Atlantic.
The European Commission imposed a €120 million fine on social media platform X in December 2025 for violating the Digital Services Act (DSA), marking the first penalty issued under the European Union’s landmark law aimed at protecting users of online platforms.
Several officials from the Trump Administration condemned the penalty as targeting U.S. tech companies and undermining free speech rights. U.S. Secretary of State Marco Rubio described the EU’s decision as “censoring Americans.”
In defending its decision to fine X, the EU stressed that the DSA has “nothing to do with censorship”—that its purpose is to promote users’ free speech rights by holding online platforms accountable.
The reaction from both U.S. and EU officials, which focused on whether the fine constitutes “censorship,” underscores the fundamental divide between the two dominant actors in global tech governance as to how they understand the relationship between digital regulation and free speech rights—specifically, whether constitutional free-speech protections generally restrict the government from regulating online platforms.
In the United States, the state action doctrine provides that constitutional rights—including First Amendment free speech rights—apply only to government action, not to private conduct. When privately owned online platforms manage the content on their services, including deciding what content users may post, they are protected rather than restricted by the First Amendment.
Another principle counseling against government regulation of online platforms is the “marketplace of ideas.” Under this principle, government intervention is disfavored because it disrupts the marketplace’s ability to correct itself. Instead, online platforms should self-regulate, as they are responsive to public demands in a competitive market.
Given the United States’ preference for market solutions over legal intervention, Anu Bradford of Columbia Law School describes its approach as the “market-driven regulatory model.” Section 230(c) of the Communications Decency Act—designed to deregulate rather than regulate—exemplifies this model by shielding online platforms from liability for their content-moderation choices.
In contrast, the EU has adopted the “rights-driven regulatory model,” under which protection of platform users’ fundamental rights shapes its regulatory design.
The EU rejects the state action doctrine and recognizes the “horizontal effect” of constitutional rights, according to which constitutional limitations also apply to private parties. The rationale behind this approach is that extremely powerful private actors are just as likely as the government to threaten fundamental rights. Acknowledging that the power to control online speech on a global scale is concentrated in a small number of tech giants, the EU concluded that governmental intervention is necessary and enacted the DSA.
Recognizing that government can restrict online platforms’ free speech rights to ensure those of their users, however, is not to say that government can regulate platforms without constitutional constraints. Rather, it means that government and courts cannot focus solely on platforms’ rights but must weigh the competing interests of both platforms and users.
Consider the DSA. Commentators argue that it strikes a proper balance between platforms’ and users’ free speech rights by introducing procedural, rather than content-based, requirements. The European Commission fined X not because it failed to suppress certain content but because X violated the DSA’s transparency requirements by, among other failures, not providing researchers access to public data.
In recent years, platforms’ failure to self-regulate has prompted some scholars and policymakers in the United States to reassess the market-driven regulatory model.
Jack M. Balkin of Yale Law School, for example, argues that existing First Amendment doctrines fail to account for users’ free speech interests. Similarly, Philip M. Napoli of Duke University contends that First Amendment law should prioritize users’ rights, given the central role that digital platforms play in modern communication.
The Trump Administration, however, remains committed to the market-driven approach. In his first week in office, President Donald J. Trump signed an executive order titled “Removing Barriers to American Leadership in Artificial Intelligence,” underscoring the critical role of free markets in driving technological innovation. The Administration’s stance was also reflected in its consideration of imposing additional tariffs on the EU and sanctioning its officials for implementing digital regulations.
Courts continue to disagree over whether online platforms’ unique role in shaping online discourse justifies digital regulations.
The U.S. Court of Appeals for the Eleventh Circuit ruled that a Florida statute requiring platforms to carry speech from certain speakers, such as journalists and political candidates, is likely unconstitutional because platforms’ decisions about what speech to host are themselves protected expressive choices.
The Fifth Circuit took the opposite view. Characterizing platforms’ content-moderation decisions as “censorship,” the Fifth Circuit upheld a Texas statute that introduced a must-carry requirement prohibiting online platforms from moderating content based on its viewpoint, reasoning that platforms’ moderation practices are “not speech” at all.
The circuit split led to the U.S. Supreme Court’s 2024 decision in Moody v. NetChoice. Rather than settling the core constitutional question of whether the First Amendment protects or constrains private platforms’ content-moderation choices, however, the Court remanded the case to the lower courts for a more comprehensive analysis of the laws’ impact.
Still, the opinions in Moody reveal that the Justices disagree on this issue. Justice Elena Kagan, writing for the majority, seems to reject the idea that the novelty of technologies warrants revisiting settled First Amendment doctrines. Drawing on longstanding precedents, Justice Kagan stressed that the government may not compel platforms to carry content they would otherwise exclude in order to “rejigger the expressive realm.”
In contrast, Justice Samuel Alito emphasized in a concurring opinion the transformative impact of social media platforms on modern communication, describing them as the “modern public square.” He signaled that the Florida and Texas laws are likely justified, considering how denying users access to online platforms substantially diminishes their ability to speak and to obtain information.
Ultimately, with the Court leaving the issue unresolved, it remains to be seen whether the United States will move away from its market-driven approach as debates over censorship, constitutional limits, and platform power continue.


