The Reopened Window of Tech Regulation

Scholar analyzes regulation gaps and argues that COVID-19 created a unique opportunity to regulate technology overuse.

On average, children in the United States spend more time in front of a screen for entertainment each day than they spend in school.

In a recent article, Gaia Bernstein, a professor at Seton Hall University School of Law, argues that the COVID-19 pandemic has reopened the window of possibility for the United States and other countries to regulate the overuse of digital technology.

COVID-19 intensified the technology overuse problem as screentime enveloped almost every aspect of life for adults and children. Bernstein argues that this escalation made screens “visible” again, as people began to notice acutely the effects of prolonged screentime. She contends that, as a result, COVID-19 has reopened a window of opportunity for technology regulation.

According to Bernstein, regulators have failed to protect against the risks of excessive screentime because of the invisibility of digital technology, the rapid entrenchment of social norms and business interests, and institutional resistance to early intervention.

Bernstein explains that when a new technology is adopted, a window of opportunity opens: a period of “interpretive flexibility” in which inventors and users explore different options for the technology’s design and use. Eventually, the window shuts and society reaches a stage of closure in which all “interpretive flexibility” is lost.

Bernstein depicts closure as the point at which a society no longer examines potential designs, uses, or social norms for a technology. At that point, further change becomes improbable and the technology becomes invisible, as people no longer think about or notice how they use it. The common electric toaster, for example, is now “invisible” because people use it without considering substantially different designs or uses.

Bernstein acknowledges that the process of technology adoption is far from linear. Major events can shake up entrenched norms and practices, she notes, citing 9/11 and World War II as examples.

Sometimes the window of possibility stays open for a long time; at other times, users accept a new technology as it is from the start, without reflecting on choices of design and use, because the technology is “invisible” at the outset. She argues that the internet is an example of an initially “invisible” technology because users could not see the choices tech companies made behind the scenes, such as the collection of personal data.

Invisibility at inception does not render a technology permanently immutable. But Bernstein asserts that it does increase the risk that society will not engage in the process of “interpretive flexibility.” Although computer and device screens themselves are visible, the designs that make websites and apps so addictive have been “invisible” from the start, she argues.

Bernstein dissects designs advanced by tech companies that release intermittent “rewards,” such as a notification or a new post, on an unpredictable schedule. These intermittent “rewards” trigger the release of more dopamine in users’ brains. In turn, this biological reinforcement leads users to spend more time on devices and apps than they otherwise would.
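
To make the mechanism concrete, the short Python sketch below simulates a generic variable-ratio reward schedule of the kind described above. It is an illustration only; the function name and the mean_ratio parameter are hypothetical and do not come from Bernstein’s article or any company’s actual code.

    import random

    # Illustrative sketch only: a generic variable-ratio schedule in which a
    # "reward" (a new post or notification) appears unpredictably.
    def simulate_checks(num_checks: int, mean_ratio: float = 4.0, seed: int = 1) -> list[bool]:
        """Return one outcome per app check; True means a reward appeared."""
        rng = random.Random(seed)
        # Each check pays off with probability 1/mean_ratio, so rewards arrive
        # on average every mean_ratio checks but never on a fixed schedule.
        return [rng.random() < 1.0 / mean_ratio for _ in range(num_checks)]

    if __name__ == "__main__":
        for i, rewarded in enumerate(simulate_checks(20), start=1):
            print(f"check {i:2d}: {'new post!' if rewarded else 'nothing new'}")

Because the payoff is probabilistic rather than fixed, a user can never learn when checking is futile, which is the unpredictability that the paragraph above links to increased dopamine release and longer use.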

Altogether, design choices that keep users on their devices longer enable technology companies to collect more data on users, which allows for more accurate targeted advertisements and for more advertisements overall to be shown. Because many technology companies do not charge for their services and instead earn their revenue from advertisements, Bernstein concludes that the business interests of the entire internet economy rely on design strategies that produce technology overuse. More screentime translates into more personal data and more exposure to targeted advertisements; reducing it would threaten the very core of this business model.

Although society has started to realize the extent and impact of the time that people spend online, the window of “interpretive flexibility” has now shut, according to Bernstein. The invisibility of companies’ designs ensured that small choices, such as joining a social network, were seldom evaluated accurately.

Even though some people have begun to question the prevalence of, and reliance on, technology, closure has been achieved without debate or reflection, and technology overuse is deeply ingrained across demographic groups. According to Bernstein, this entrenched social norm explains in part the lack of regulatory intervention to protect users from excessive screentime.

Bernstein contends that U.S. regulators’ failure to intervene stems from an institutional resistance to early intervention in technological developments. She links this hesitation to a strong social ethos that assumes innovation promotes progress and human welfare.

Early intervention aims to protect the public from the potential risks of a new technology; however, this approach means that some precautions will prove unnecessary over time. Opponents of early intervention warn that moving too soon could preclude unanticipated uses of a technology and produce poorly designed laws. Bernstein argues that U.S. regulators seek to promote innovation and fear regulating in the dark, leading them to adopt a “wait and see” approach.

She suggests that this regulatory approach, combined with the invisibility of information technology, resulted in inaction and entrenched the social norms and business interests that exacerbate technology overuse.

Despite this regulatory void, she maintains that the recent increase in visibility, coupled with a growing body of scientific research highlighting the detrimental effects of excessive screentime, underscores the need for regulation. Bernstein concludes that the COVID-19 pandemic has reopened the window of opportunity for regulators to tackle technology overuse and to create a better offline-online balance.