Big Tech Faces Reckoning, But Profit Motive Escapes Scrutiny
Analysis of: Starmer vows to tackle social media’s ‘addictive features’ to protect children
The Guardian | March 26, 2026

TL;DR

UK government signals tech regulation after US courts hold Meta and YouTube liable for designing addictive products that harm children. The state mediates between tech capital and public outrage, but proposed reforms target symptoms while leaving the profit-driven extraction of attention intact.

Analytical Focus: Material Conditions, Contradictions, Historical Context


The US court verdict holding Meta and YouTube liable for designing addictive technology represents a significant moment in the evolving relationship between tech capital, the state, and civil society. Yet the dominant framing—shared by governments, campaigners, and media alike—treats platform addiction as a problem of corporate negligence rather than an inherent feature of surveillance capitalism's business model. The extraction of user attention is not a design flaw but the core mechanism through which these companies generate value; addiction is the product working as intended.

The proposed regulatory responses, from age verification to banning 'addictive features,' represent the state's attempt to manage the contradiction between capital's need for maximized engagement and the social reproduction costs this creates. Parents, educators, and health systems bear the burden of managing children's screen time and mental health—unpaid reproductive labor subsidizing platform profits. The $6 million verdict, while symbolically significant, amounts to minutes of Meta's daily revenue, illustrating the vast asymmetry between the costs externalized onto families and the penalties capital faces.

Historically, this moment echoes previous regulatory interventions targeting consumer products—tobacco, automobiles, pharmaceuticals—where public health crises eventually forced state action against resistant industries. The tech sector's invocation of Section 230 immunity and appeals to innovation follows the same playbook of delay and deflection. What distinguishes this phase is the unprecedented concentration of capital in a handful of platforms and their structural integration into daily life, making meaningful reform more difficult even as the harms become more visible.

Class Dynamics

Actors: Tech capital (Meta, Google/YouTube), State actors (UK government, EU Commission), Professional-managerial class (researchers, campaigners), Working-class families, Children and young people, Aristocratic celebrities (Duke and Duchess of Sussex)

Beneficiaries: Tech platform shareholders and executives, Legal professionals handling litigation, Child safety NGOs gaining prominence, Politicians positioning as child protectors

Harmed Parties: Children subjected to addictive design, Parents bearing costs of managing screen addiction, Workers in precarious content moderation roles, Mental health services absorbing externalized costs

Tech capital maintains structural power through platform monopoly, lobbying capacity, and ability to appeal verdicts indefinitely. The state mediates between accumulated public anger and capital's interests, proposing reforms that address symptoms without threatening the attention-extraction business model. Working-class families lack organized political voice in this debate, represented instead by professional campaigners and aristocratic figures who frame the issue in moralistic rather than structural terms.

Material Conditions

Economic Factors: Surveillance capitalism's reliance on maximizing user engagement for advertising revenue, Externalization of social reproduction costs onto families and public health systems, Legal liability as emerging threat to platform business models, Concentration of digital advertising markets in Meta and Google duopoly

Users—including children—function as both the raw material and the product in platform capitalism. Their attention is extracted through algorithmically optimized engagement features, transformed into behavioral data, and sold to advertisers. This represents a novel form of value extraction in which the labor of producing content and generating data is unwaged. Content moderation, meanwhile, is performed by low-paid workers in the Global South who face severe psychological harm—the hidden labor behind platform 'safety.'

Resources at Stake: Children's attention as extractable commodity, Behavioral data for advertising markets, Platform market share and advertising revenue, Legal precedent affecting future liability

Historical Context

Precedents: Tobacco industry litigation and eventual regulation (1990s-2000s), Automobile safety regulation following Nader's Unsafe at Any Speed, Pharmaceutical industry oversight after thalidomide scandal, Child labor laws during industrial capitalism

This represents a recurring pattern where capitalism generates social harms that eventually require state intervention to manage the resulting legitimacy crisis. The tech industry has enjoyed an extended period of minimal regulation justified by innovation rhetoric—a feature of neoliberal governance that prioritizes capital accumulation over social protection. The current moment suggests this exceptional status is eroding, though history shows such regulatory victories are often partial, captured by industry influence, and require constant defense against rollback.

Contradictions

Primary: The fundamental contradiction between platforms' profit imperative (maximizing engagement through addictive design) and the social need to protect children's development and mental health. These cannot be reconciled within the existing business model—attention extraction IS the product.

Secondary: State's dual role as protector of citizens and facilitator of capital accumulation, Tech companies' claims to be 'responsible' while designing for addiction, Age verification proposals that expand surveillance in the name of protection, Individual liability verdicts versus systemic business model immunity

Short-term: expect incremental regulations that impose compliance costs without transforming the business model, likely benefiting larger platforms that can absorb costs while eliminating smaller competitors. Medium-term: mounting litigation may force design changes, but platforms will seek to preserve engagement metrics through new mechanisms. Long-term resolution requires either public utility models for digital infrastructure or platform cooperativism—neither currently on the mainstream policy agenda.

Global Interconnections

The regulation of tech platforms cannot be understood apart from the broader crisis of neoliberal governance. After decades of privatizing public goods and celebrating market solutions, states now face the consequences of allowing private capital to control essential social infrastructure. The EU's more aggressive regulatory stance reflects different state-capital relations than the US, where platforms emerged and where regulatory capture is more advanced. The global nature of these platforms creates jurisdictional arbitrage opportunities—threatening to relocate or reduce services in response to regulation.

The attention economy also connects to labor market transformations. As stable employment becomes scarcer and gig work expands, platforms fill the void of social connection and meaning previously provided by workplaces and communities. Children's 'addiction' to social media reflects their immersion in a society where digital platforms mediate nearly all social interaction—a structural condition, not individual moral failure.

Conclusion

The court verdict and government responses represent a potential inflection point, but one whose trajectory depends on political organization rather than legal proceedings alone. The framing of platform harm as a matter of 'protecting children' obscures the broader class dynamics: these platforms exploit users across all ages, extract value from unwaged digital labor, and externalize costs onto working families. Meaningful reform would require challenging the surveillance capitalism model itself—treating attention as a public good rather than a commodity for extraction. Without organized working-class pressure for structural alternatives, regulatory responses will likely remain within capital-friendly bounds: compliance costs that consolidate monopoly power while leaving the extraction apparatus intact.

Suggested Reading

  • The Age of Surveillance Capitalism by Shoshana Zuboff (2019). Zuboff's analysis of how tech platforms extract behavioral data as their primary accumulation strategy directly illuminates why 'addictive design' is a feature rather than a bug of platform capitalism.
  • Capital, Volume 1 by Karl Marx (1867). Marx's analysis of how capitalism externalizes social reproduction costs onto workers and families provides the theoretical foundation for understanding why platforms profit while parents bear the burden.
  • Prison Notebooks (Selections) by Antonio Gramsci (1935). Gramsci's concept of hegemony helps explain how platform companies maintain legitimacy through innovation rhetoric while the state manages the resulting social contradictions through limited regulatory intervention.