A Role for Antitrust in Online Content Moderation

Scholar argues that updates to antitrust regulation could support healthy online discourse.

Two years ago, Facebook banned then-President Donald J. Trump from its platform. This year, the U.S. Department of Justice sued Google, alleging that the company illegally monopolized digital advertising technology.

These two actions—one by a private company, the other by the federal government—might seem unrelated. But according to a recent article by Nikolas Guggenberger of the University of Houston Law Center, they both signal the vast power held by today’s online platform companies. Guggenberger argues that unhealthy online discourse—such as calls for violence, misinformation, and hate speech—will continue to plague platforms as long as big tech companies wield excessive power over the exchange of ideas.

Guggenberger contends that because just a few platforms dominate the online marketplace, content moderation decisions are both conspicuous and contentious. Platforms also lack sufficient incentives to invest in better moderation processes because they have no real competition. And users lack influence when these few platforms condone or perpetrate discrimination because they have few alternatives.

Guggenberger advocates strengthening antitrust laws, which seek to prevent companies from monopolizing, or gaining too much control of a market through unjustified or illegal means. If more companies entered the online platform market, they would reduce the power of any one platform and could encourage innovation through competition. Guggenberger concludes that greater competition would foster healthier discourse in the “digital public sphere,” the online space where people exchange views about society.

For context, Meta, Google, and Apple have grown over the past few decades to control the digital public sphere by dominating social media, video sharing, search engines, and app stores. According to Guggenberger, they attained this high level of control due to network effects. On a network-based platform such as social media, the more users a platform obtains, the more valuable the platform becomes, because users can connect with more of their friends, and advertisers can reach more consumers. When a majority of people congregate on one platform, other platforms become less attractive.

Although social media platforms compete early in their development, once one company achieves a big enough size advantage, network effects create a huge barrier to new companies entering the market. Extreme market concentration, Guggenberger notes, creates interrelated problems that threaten healthy online debate, including content moderation challenges and threats to balanced discourse.

Content moderation decisions carry extremely high stakes, not only for the companies but also for the public. For example, Guggenberger argues that when Mark Zuckerberg and Jack Dorsey banned President Donald J. Trump from Facebook and Twitter, they “singlehandedly redefined national discourse” by preventing a sitting president from communicating with his followers. Leaving such power in the hands of two corporate executives is dangerous, Guggenberger maintains, whether or not their decision was correct, because executives lack public accountability.

Companies’ tight control of the market also leaves little incentive to improve content moderation, a dynamic that Guggenberger argues contributes to platform frailty. Without real competition, companies have less motivation to invest in better moderation processes because size, not quality, dictates which platform attracts users. And because users face so few alternatives, he notes, they must endure exclusionary content moderation practices to participate in the digital public sphere at all.

Furthermore, as a company’s platform grows, misinformation reaches even more users. Before the 2016 U.S. presidential election, for example, various groups manipulated Facebook’s algorithms to distribute election-related misinformation on a huge scale, and Facebook’s failure to remove this content allowed it to reach many people. Market concentration amplifies divisive content because a platform’s business model depends on attracting as much user engagement as possible.

Guggenberger emphasizes that current antitrust laws are ill-equipped to confront the challenges posed by digital monopolies. Because antitrust regulation focuses on protecting consumers from higher prices, Guggenberger argues that regulators under-scrutinize markets that operate efficiently by that measure but could still threaten democracy.

Antitrust doctrine also generally requires a company to control between 70 and 90 percent of a market before it qualifies as a monopoly. As a result, Guggenberger explains, market concentration among digital platforms can reach levels too high to allow balanced discourse yet too low to attract the attention of antitrust enforcers.

Finally, antitrust does not grant relief unless the monopolizer has acted unjustly or illegally. Consequently, when lawful “organic growth” leaves just three companies dominating the digital public sphere, Guggenberger notes, regulators are ill-equipped to curb their power.

To address the enforcement conundrum that organic growth poses, Guggenberger proposes a “doctrinal pivot”: lawmakers should update antitrust statutes and guidelines to impose no-fault liability on tech giants.

The actions of online platforms could then be challenged based on market share alone, without any required showing of bad conduct. Guggenberger argues that this reform would comport with antitrust law’s historical goal of “providing an environment conducive to the preservation of our democratic political and social institutions.” Broadening antitrust law’s scope beyond consumer price protection to cover practices that hinder healthy democratic function would also check private power.

Guggenberger suggests a second reform that would also help newer online platform companies compete in the digital market: regulators could require interoperability between platforms. Guggenberger argues that if platforms were required to exchange information freely, the network effects that currently lock users into the biggest platforms would no longer prevent them from switching. This change could foster healthier competition, reducing market concentration and remedying many of the content moderation problems that flow from it.

According to Guggenberger, content moderation will not succeed in encouraging more inclusive online exchanges until a greater number of competitive companies share market power. Because platforms cannot achieve this realignment unilaterally, policymakers must step in to impose antitrust liability and interoperability requirements on market giants.