Preventing the Sharing Economy From Abusing Your Data

Scholars argue that consumer protection law can address sharing economy firms’ potential to abuse information.

Imagine that you have just arrived in an unfamiliar city and need to get to your hotel downtown. Your phone’s battery is dying, so you would do almost anything to get there and recharge it. Would it be wrong for your Uber app to recognize your phone’s low battery level and inflate the price of a car ride downtown?

It would, according to a recent paper highlighting the potential for firms like Uber or Lyft to abuse consumer data. Ryan Calo, a professor at the University of Washington School of Law, and Alex Rosenblat, a scholar at the Data & Society Research Institute, argue that this potential stems from imbalances in power and information that firms in the “sharing economy,” a growing market of transactions between strangers conducted through digital platforms, can leverage to their benefit. The authors argue that consumer protection law, which has combated such imbalances since the early 20th century, offers the right framework for addressing this kind of data abuse.

All modern businesses analyze consumer data. Calo and Rosenblat note, for example, that Facebook and Google users tacitly agree to let those firms monetize their data and display targeted ads in exchange for “free” use of those firms’ services. And exploiting the flaws in human reasoning revealed by consumer data far predates the sharing economy: consumers find a product priced at $9.99 more enticing than one priced at $10.00, to a degree far out of proportion to the one-cent difference.

The authors contend, however, that unlike firms that rely on targeted ads and strategic pricing, sharing economy firms enjoy two key advantages that allow them to identify and exploit consumer behavior far more effectively than was possible in the past. First, sharing economy firms have exclusive and limitless access to consumer data. Second, they can tailor their software applications to take advantage of that data. As a result, these firms can identify new patterns of behavior and exploit them with much greater ease.

Recall the example of arriving in a new city with an almost-dead cell phone. A sharing economy firm might discover a pattern of increased willingness to pay higher prices when a consumer has a low cell phone battery. Indeed, although Calo and Rosenblat note that Uber denies adopting this practice, it was Uber’s own research team that identified this pattern of consumer behavior in the first place.

Researchers have also found that the Uber app interface depicts numerous drivers around a consumer’s location, even though none of these “phantom cars” represents the location of an actual Uber driver. Although Uber contends that the display merely signals that unoccupied drivers are currently on the road, Calo and Rosenblat suggest that it is a deliberate design choice meant to entice consumers to choose Uber over hailing a taxi.

Moreover, Calo and Rosenblat note that the potential for abuse of service providers, such as Uber drivers, is arguably even greater than the potential for abuse of consumers. The contracts that Uber requires drivers to sign are exceptionally complex, and their terms change frequently. Drivers therefore often work under terms unknown to them and, should a conflict arise, may be unable to identify which terms apply at a given moment.

Drivers are also enticed by heat maps indicating the level of activity in a given area, but they are not told the specific price they can command there. Uber thereby limits drivers’ ability to make informed decisions about where and when to work, exploiting their patterns of behavior much as the phantom cars exploit consumers’. Furthermore, there is the very real possibility that Uber will ultimately use the limitless data it gathers on its drivers to replace them with self-driving cars.

Despite these dangers, sharing economy firms present a compelling story of market disruption and competition, and their services have become indispensable to consumers and service providers alike. Firms leverage both points to resist government regulation. Nonetheless, Calo and Rosenblat argue that the investigative and regulatory powers of consumer protection law offer a viable solution.

Using consumer protection law to regulate sharing economy companies raises two challenges: detecting the harms, and addressing them without stifling legitimate innovation. Regulators can detect harm through direct investigation, such as compelling a firm to produce documents or visiting the firm to learn more about industry practices. They can also encourage third parties such as academics to investigate.

Once a regulator identifies questionable practices, it must decide whether to address them by creating incentives that discourage such behavior or by drawing clear lines between permissible and impermissible activity. Because potentially questionable behavior spans a wide range, line-drawing would be tedious and inefficient. Calo and Rosenblat therefore argue that setting incentives, such as creating a board to oversee research conducted on consumer data, is the more attractive approach.

Alternatively, an approach that combines incentives with line-drawing is to treat firms as fiduciaries. Calo and Rosenblat suggest that because consumers entrust firms with their data, firms owe them certain obligations in return, such as a duty of loyalty. Fiduciary duties apply to a wide range of conduct and have the advantage of being a well-established area of law. Imposing such obligations on firms, then, offers an attractive and proven way to address the imbalances of power and information between sharing economy firms and their consumers and service providers.

Calo and Rosenblat’s paper, “The Taking Economy: Uber, Information, and Power,” appeared in the Columbia Law Review.