Utilizing Legal Action to Mitigate Addictive Technology

In today’s digitally driven world, our relationships are increasingly becoming as thin as the devices we hold. Despite growing awareness of technology’s harms, screen time keeps creeping up: adults average five hours a day on their phones, while teens spend over eight hours on screens. On April 12, legal academics, attorneys, regulators, and health professionals convened at Seton Hall Law’s “Legal Responses to Addictive Technologies: Addressing the Impact of Screens, Social Networks and Online Games on Kids” conference. Hosted by the Institute for Privacy Protection and the Gibbons Institute of Law, Science and Technology, the conference illuminated the pervasive and debilitating influence of social networks on the younger generation, prompting a call for collective legal action to combat the escalating crisis.


New Jersey Attorney General Matthew J. Platkin Underscores Big Tech’s Deceptive Practices


New Jersey’s Attorney General (AG), Matthew J. Platkin, opened the conference with a keynote address calling social media’s effect on emotional and physical well-being the “healthcare crisis of our time.” Like the tobacco industry in the ’90s, big tech companies are aware of their products’ harm but deny any responsibility. Curated algorithms, quantified liking, and haptic notifications are employed to increase engagement, particularly among children who lack the impulse control to turn away. Depression, low self-esteem, and self-harm have followed, but platforms have suppressed research to conceal these harms. Concerningly, children under 13 are viewed as a valuable and profitable market for big tech. Platkin has co-led a bipartisan coalition of 41 AGs across the U.S., suing Meta for purposefully designing features that fuel addiction among children. Meta is only the first of the bad actors to be held accountable, with the coalition actively investigating TikTok. While big tech defends its practices as an exercise of free speech, the First Amendment has limits for deceptive and unconscionable business practices. AGs across the country are working to hold these companies liable for lying about harms that cause serious mental and emotional challenges to kids.


The Vast Web of Online Harms


The discussion turned to the extent of online harms, exploring how online platforms activate psychological vulnerabilities. Because big tech’s business model is personalized marketing, platforms are filled with curated content and low-friction rewards that keep us online longer. TikTok, for example, holds back a video it knows you’ll love, showing it to you when the app senses you are about to stop scrolling. Like slot machines, where uncertainty about the next outcome encourages addiction, viewing a likable video after a stream of boring content triggers a dopamine hit that keeps kids scrolling.


In addition to rewiring dopamine receptors, the content consumed on social media deteriorates mental health. One in six teens has had an unwanted sexual encounter on social media, one in six has seen self-harm content, and 22% have been cyberbullied. Snapchat’s “Quick Add” feature, for example, encourages users to accept friend requests from strangers. Kids receive over 200 notifications each day from Meta, and a third of teens wake up in the middle of the night to check their phones. We’ve all experienced the difficulty of falling back asleep after looking at our phones; for kids, the displacement of sleep stunts growth. By communicating principally online, kids fail to learn the value of face-to-face compromise and negotiation. Instead, they shallowly equate popularity with the number of followers they have and devalue friendships with those who don’t like their posts. Reversing these social norms is critical if we want to promote a genuine and collaborative public sphere. While it’s easy to blame parents for enabling their children’s technology use, big tech ultimately has the responsibility to change the addictive nature of its products.


How Do We Escape the Grip of Addictive Technology?


With the problem clearly identified, panelists turned to approaches for regulating addictive design, advocating a “wrongful engagement” framework grounded in the recognition that technology platforms limit autonomy. Rather than viewing engagement as a business metric, it can be construed as a legal wrong that opens the door to lawsuits. A wrongful engagement theory holds companies liable for intentionally designing tools that encourage technology use in a manner that is unfair, deceptive, or abusive. Utilizing this framework may pressure big tech to prioritize healthy consumer loyalty over wrongful engagement. Big tech claims that people freely consent to social media use, but this overlooks the societal and infrastructural forces that heavily influence human behavior. Understanding the enormous pressure kids feel to be online may overcome First Amendment arguments that regulation interferes with individual freedom. Litigation plays an essential role in setting standards, but transformative change requires collective recognition of how technology encroaches on our autonomy.


Utilizing the Federal Trade Commission to Hold Big Tech Accountable


The conversation continued with a keynote address from Serena Viswanathan, Associate Director for Advertising Practices at the Federal Trade Commission (FTC). The effects of screen time vary among children, requiring a delicate balance between encouraging children’s independence and protecting them from harm. Proving the likelihood of harm through causation, not mere correlation, has been difficult. Moreover, the line between platform design and content is not always clear, particularly as social media influencers blur the boundary between advertising and content. Online safety, however, is a high priority for the FTC, which sees privacy standards as a means to address the problem of profiting from data collection. The FTC has several legal tools to shift the burden from parents to providers, including bringing cases for deception and unfair practices.


The Legal Approach to Regulating Technology


The conference concluded with a conversation on the legal challenges of regulating addictive design. To combat big tech’s weaponization of the First Amendment, the panelists urged courts to apply more scrutiny when balancing the competing interests of free speech and social harm. Others argued that strategic litigation aimed at creating good precedent is the first step. This would enable attorneys to defeat the pre-enforcement facial challenges waged by big tech and reach discovery. Once at the discovery stage, attorneys could collaborate with researchers to strategically acquire useful data. In addition to legal measures, taxes could be imposed on platforms and advertising sales, with the resulting revenue allocated toward combating online harm. Regulatory approaches should focus on increasing friction in digital architecture, such as time delays in posting and age-gating. Both practices support increased deliberation, self-governance, and the genuine exercise of free will in technology use.


The Path Forward


Considering the novelty of legal action to contain technology addiction, Professor Gaia Bernstein, the conference’s organizer, summarized the day: “The conference gave academics an opportunity to evaluate where to focus their scholarship so it could contribute most to the policy debate and gave activists access to the newest ideas just percolating in legal academia that usually take years to get to the publication stage.” 


As concerns over children’s digital well-being continue to mount, the conference served as a crucial catalyst for galvanizing stakeholders towards enacting concrete measures to safeguard the health of future generations. “Many seeds for future collaborations were planted at Seton Hall Law School during this conference,” said Bernstein. The stage is set for a concerted push towards a more responsible and ethical digital landscape. Let’s close the curtain on the 21st century’s most pressing healthcare crisis.
