Scholars discuss regulating digital platforms through the lens of data privacy.
Google handles more than 90 percent of the world's internet searches. Less well known, however, is that every day, Google collects data on billions of consumers worldwide. In the United States alone, Google has data on 72 percent of its users. Consumers rely on digital platforms like Google for a wide range of services. At the same time, consumers report feeling little control over what happens to their personal data. As Google and other digital platform companies grow, so too do data privacy concerns. How can regulators respond?
A digital platform is a website, app, or digital venue that interacts commercially with users or groups of users. Popular platforms include YouTube, Uber, TikTok, and Spotify. The largest and most dominant digital platforms in the world are Facebook, Amazon, Microsoft, Apple, and Alphabet—Google’s parent company. These digital giants are known collectively as “Big Tech.”
The types of data collected by Big Tech companies and other digital platforms can include location, search history, purchase history, emails both written and received, and more. For example, companies can use data internally to figure out who to send coupons to or how to improve their efficiency. Another relevant—and controversial—practice involves selling data to third parties such as auditors, credit agencies, marketers, and other service providers. As this information comes to light, a growing number of platform users, members of Congress, interest groups, and even some platforms themselves are calling for some type of data privacy regulatory regime.
Many countries recognize the need for some form of privacy law. South Korea and the European Union have both passed comprehensive data privacy regulations, known as the Personal Information Protection Act (PIPA) and the General Data Protection Regulation (GDPR), respectively. The United States currently lacks a federal privacy regime, although members of Congress have proposed several privacy-related bills. At the state level, the California Consumer Privacy Act (CCPA) and the newly passed California Privacy Rights Act (CPRA) serve as templates for other states seeking to create their own privacy regimes.
In the face of growing consumer dissatisfaction and several antitrust lawsuits, Big Tech itself has signaled a preference for comprehensive privacy regulation. Tech executives at the World Economic Forum in 2020 expressed a preference for uniform regulation and support for industry involvement in policy developments.
Antitrust law creates another point of intersection between Big Tech regulation and data privacy. Because data can improve a company's products and performance, some scholars see data sharing as a way to level the playing field between Big Tech companies and smaller platforms. Specifically, an interoperability policy would require tech platforms to make their systems work together and exchange information. Although it may alleviate antitrust concerns, mandated data sharing can raise privacy issues: data privacy scholars point out that this increased competition comes at the cost of consumers' data privacy.
This Saturday Seminar explores the intersection between digital platform regulation and consumer data privacy.
- In an article for the European Competition Journal, Katharine Kemp of Australia’s University of New South Wales argues that the “notice and choice” consent model popularized in the 1970s is no longer realistic in today’s world. She states that rapid technological development has reduced consumers’ ability to understand what they are consenting to in privacy policies. She also explains how companies today have an incentive to collect extremely large sets of data because it fuels machine learning and can provide companies with industry insight and a competitive edge. She argues that these competitive incentives ultimately harm consumers and that regulators should take them into consideration.
- In an article published in the Santa Clara High Technology Law Journal, Andrew W. Bagley and Justin S. Brown of the Georgia Tech School of Public Policy explore how broad terms in click-wrap agreements allow consumer data to be shared with third parties. Bagley and Brown explain that when an internet user clicks "I agree," they often consent to the "lowest common denominator" for privacy protections and unknowingly permit disclosure to third parties. Furthermore, these click-wrap agreements do not address what happens to data after it has been shared. In the absence of regulatory or legislative solutions, Bagley and Brown urge corporations to implement self-regulatory schemes for handling consumer data.
- In an article for the Stanford Technology Law Review, Dina Srinivasan of Yale University argues that Google relies on user data to gain a competitive advantage in the online advertising space. Srinivasan explains that Google, in the name of data privacy, denies outside parties full access to user data profiles. Google and Google-owned intermediaries, however, retain full access to that user data themselves, and the time restrictions Google imposes on outside parties give Google affiliates both more comprehensive user data and a speed advantage in online advertising. Srinivasan recommends that policymakers require Google to provide outside parties fair access to data and transaction speed, limit the kinds of consumer data that can be used for targeted advertising, and mandate an opt-out for behaviorally targeted ads.
- Europe’s attempts to regulate the tech industry have led to unintended consequences, Michael G. Jacobides of London Business School, Martin Bruncko, and Rene Langen argue in a paper for Evolution LTD. Jacobides, Bruncko, and Langen, all of whom are senior advisors for Evolution LTD, explain how the EU's General Data Protection Regulation (GDPR) was meant to protect consumer privacy but instead allowed Google to cement itself as a leading advertising partner. By pushing GDPR's consent requirements onto publishers and labeling itself a "controller" of personal data, Google obtained increased access to personal data with fewer restrictions on its use. Jacobides, Bruncko, and Langen predict that large platforms' advantages in size and strategy will allow their continued domination of the advertising market, despite any future EU regulations restricting data sharing.
- In a Bank of Italy occasional paper, Oscar Borgogno and Michele Savini Zangrandi of the Bank of Italy evaluate the global data governance regime for digital platforms. Borgogno and Savini Zangrandi categorize the national considerations that drive digital data regulation into three buckets: data control, national security, and competition policy. They argue that these considerations produce complex national regimes that require coordination at the international level. Borgogno and Savini Zangrandi contend that international guiding principles for digital data governance are the ideal solution but also the most difficult to attain. They posit that smaller groups of countries, such as the G7, could instead coordinate their data frameworks.
- In an article for the Yale Law Journal, Herbert Hovenkamp of the University of Pennsylvania Law School discusses how to align antitrust remedies with the desired policy outcomes in the context of platform regulation. He defines a successful antitrust regulation as one that can increase output, decrease prices, improve product quality, or spur innovation. He identifies interoperability and information pooling as promising ways to even competition. He also discusses how a platform itself could serve as a “market” in which antitrust law may apply to internal decisions.
The Saturday Seminar is a weekly feature that aims to put into written form the kind of content that would be conveyed in a live seminar involving regulatory experts. Each week, The Regulatory Review publishes a brief overview of a selected regulatory topic and then distills recent research and scholarly writing on that topic.