With Great Power Comes Great Loyalty


Scholars propose a new framework of data loyalty to reenergize data privacy regulation.


When visiting a car dealer, you do not give away all your personal details—your birthday, your interests, the names of your family members and closest friends, or pictures of your children.

Why, then, do you tell Facebook?

According to a forthcoming article by Woodrow Hartzog of the Northeastern University School of Law and Neil Richards of the Washington University in St. Louis School of Law, people share this deeply personal information because they relate to online services more like intimate friends than like other commercial businesses. As people share their personal data, however, companies gain ever greater access to, and power over, their information, experiences, and safety.

According to Richards and Hartzog, companies may not always use this power for good.

Recognizing that these relationships can easily lead to harm, Richards and Hartzog recommend both a general duty of loyalty that would apply to all entities that collect data and a second layer of subsidiary fiduciary duties that target specific problems.

A general duty of loyalty would require companies to act in the best interests of their customers, with lawmakers defining “best interests” to include both collective and individual good. This duty could require, for example, that companies collect data only for specific purposes and minimize collection to what those purposes demand.

Building on previous arguments for regulating digital companies as information fiduciaries, Richards and Hartzog expand on the importance of a duty of data loyalty. This duty is critical, they argue, because of the vast imbalances of information and power between people and platforms.

Richards and Hartzog explain that these imbalances occur because platforms have ongoing relationships with their users. Those users may visit the platform dozens of times each day and interact with an environment constructed and customized specifically for them.

Every time people interact with websites and mobile apps, they expose their identity and personal story for companies to extract. Richards and Hartzog contend that this practice endangers both individual and collective well-being because companies collect and use expansive amounts of information with little to no regard for users’ interests.

Based on users’ unique profiles and clicks, platforms tailor user experiences and maximize profit by capturing users’ attention, labor, and time. But users themselves cannot change what they see on a screen, nor can they see how their experiences differ over time or from other people’s. Because users lack control over their data and their online experiences, Richards and Hartzog urge lawmakers to attack this power imbalance by implementing a duty of data loyalty.

Companies currently approach data privacy through a “notice and choice” model, under which they provide notice of their privacy policies and ask individuals to consent. But Richards and Hartzog argue that this model has failed.

Individual decisions to accept or reject a privacy policy cannot account for larger, collective harms to society and democracy. Richards and Hartzog claim that individual companies’ data-driven business decisions can endanger civil rights, mental health, relationships, and democratic self-governance. A duty of loyalty could help expand privacy law to address not only individual harms caused by platforms, but also systemic harms.

To prevent some specific harms, Richards and Hartzog propose four subsidiary data loyalty rules.

First, companies should personalize online experiences carefully so as not to discriminate against vulnerable groups. Second, companies should refuse to give the government and other third parties access to user data. Third, companies should design their websites and applications so as not to take advantage of people’s limitations or vulnerabilities. Finally, companies should refrain from allowing harmful, dangerous content to go viral.

Tech companies and their supporters portray themselves as champions of innovation, but Richards and Hartzog reveal that they do not always act responsibly. If companies owed a duty of loyalty and failed to honor it by using data against their users’ best interests, those actions could be recognized for what they are: betrayal and exploitation.

But not everyone agrees with Richards and Hartzog’s assessment. One critique of the duty of loyalty is that imposing it would give corporate directors incentives to favor users’ interests over shareholders’ interests. Richards and Hartzog counter, however, that companies can balance their loyalty among users by focusing on the best interests of a reasonable user rather than of every individual user.

Other critics suggest that a duty of loyalty would not be effective in practice because violations are difficult to detect. But Richards and Hartzog contend that the duty would impose substantive limits on companies’ data practices. Currently, a company can profit from user data at the user’s expense while claiming that users have read the company’s inscrutable privacy policies and consented to its exploitative data practices. With a duty of loyalty, companies would have reason to use data only for purposes that users actually approved or that benefit them.

The rhetoric of loyalty carries political and moral force that can help the public hold companies accountable for how they collect and use data. Although no silver bullet will solve every challenge confronting privacy regulation, Richards and Hartzog conclude that data loyalty reframes the relationship between platforms and people and has revolutionary potential to inspire privacy reform.