By Adam Levitt and Amy Keller
Published in Law360 on May 5, 2026. © Copyright 2025, Portfolio Media, Inc., publisher of Law360. Reprinted here with permission.
Recent jury verdicts against big tech companies demonstrate a trend: Consumers are tired of being spied on, exploited, analyzed and monetized without meaningful limits or accountability.
In the last year alone, juries have held Google LLC liable in major privacy cases involving app activity tracking and Android data transfers. Likewise, juries have held Meta Platforms Inc. liable in multiple litigation actions concerning social media addiction, as well as in litigation over the collection of sensitive reproductive health data through the Flo period-tracking app.[1]
Against that backdrop, the U.S. Congress has once again turned to comprehensive privacy legislation, with the Securing and Establishing Consumer Uniform Rights and Enforcement Over Data Act, or the SECURE Data Act, positioned as a step toward reining in data practices that have long operated in the shadows.
While the bill has the veneer of the practices experts have long advocated — data minimization, purpose limitation and heightened protections for certain categories of information — advocates question whether, at its core, it meaningfully constrains the collection and use of personal information, or whether it instead codifies a system that continues to rely upon obfuscation and confusion.
Lawmakers face a basic choice: Set a real floor for consumer protection or wipe out stronger state laws in favor of a weaker federal compromise.
The newly introduced SECURE Data Act doesn’t fix what’s broken in American data privacy law; it locks it in. It elevates a model that has consistently failed to constrain how companies collect and exploit personal data, while stripping away the few mechanisms that have begun to work.
At its core, the bill relies on a familiar model: Companies disclose what they collect and consumers are expected to protect themselves by reading the fine print. That approach may look protective on paper but, in practice, it has too often left consumers with zero control over how their data is collected, used and shared.
By preserving a framework that is built around disclosure and flexible standards, the bill allows companies to continue expansive data collection and use, so long as they can justify it as being reasonably necessary within their own business models. That elasticity isn’t a constraint; it’s an invitation.
Firms would retain broad discretion to define the scope of permissible data practices — including the aggregation, retention and monetization of consumer information — with limited external checks and minimal exposure.
The bill’s most consequential feature is preemption. It would displace state privacy laws at the very moment that states have been doing a lot of the most meaningful work in this area — including expanding what counts as personal data, narrowing permissible uses and creating real consequences for misuse.
If Congress is going to wipe away stronger state protections, it should replace them with something at least as strong. This bill doesn’t do that.
Meanwhile, smaller companies may gain little from the federal promise of uniformity. While federal preemption removes the need to navigate a patchwork of state laws, it also eliminates stronger state-level protections that could otherwise build consumer trust and level the playing field.
The result is a system that favors scale — that is, larger corporations — and existing data advantages, reinforcing the dominance of companies that are already best positioned to exploit personal data at volume.
A weak federal standard wouldn’t create clarity so much as lock in rules that may quickly become outdated as data practices continue to evolve.
Further, uniformity has value, but not if it comes at the expense of accountability. Privacy violations are often small, widespread and difficult for regulators to catch one by one. If companies face little real risk when they stretch vague standards like those spelled out in the bill, those limits will mean very little in practice.
Regulators may not challenge most violations — not because they are indifferent, but because the scale of the data economy makes comprehensive oversight impossible. A weak enforcement structure, like the one proposed in the SECURE Data Act, significantly reduces the likelihood that companies will face substantial consequences for privacy violations.
American law often relies on a mix of public and private enforcement to make rights real, and privacy rights should be no different. Yet this bill weakens that backstop and goes further by requiring notice and an opportunity to cure before regulators act, which risks turning violations into warnings rather than any kind of meaningful deterrence.
Without a meaningful private right of action and with procedural hurdles that delay regulatory intervention, compliance risks become more theoretical than real. For many organizations — particularly large, well-resourced technology companies — this creates a predictable environment in which aggressive, ever-expanding data strategies remain economically rational.
The same problems appear in the bill’s substantive rules. While the bill nods toward familiar privacy principles, like data minimization and purpose limitation, it ultimately routes them through elastic qualifiers, like what a company deems reasonably necessary or technically infeasible.
These are not hard limits so much as permission structures. They risk turning baseline protections into self-judged standards, where the scope of collection, retention, profiling and downstream use expands to meet business incentives, rather than being meaningfully constrained by law for the protection of consumers.
This dynamic becomes even more pronounced in the bill’s treatment of pseudonymous data. By positioning it as categorically less sensitive, the bill understates how modern data practices work.
Information that has been stripped of obvious identifiers can often still be linked back to individuals, across datasets and over time, and be used to profile them at scale.
In that environment, pseudonymization is less a safeguard than a speed bump. It may obscure identity at the margins, but it does little to prevent profiling, inference or reidentification when paired with today’s analytics capabilities, let alone even modest advances in machine learning.
In the end, for a framework that appears to regulate data practices, the bill leaves the most consequential decisions — such as what to collect, how long to keep it, and how and whether to analyze it — largely in the hands of the entities that the law is meant to constrain, which brings us to the central trade-off.
If Congress were offering a strong federal standard with robust rights, real enforcement and clear limits on data use, there would be a legitimate debate about national uniformity. But that is not what is being offered here.
Instead, the SECURE Data Act tries to sell weaker rights, weaker enforcement and fewer avenues for accountability, while eliminating stronger alternatives. That’s not harmonization; it’s rollback.
A serious federal privacy law would set strong baseline protections nationwide while preserving the role of states as innovators and enforcers. It would recognize that rights without remedies are ineffective, and would include a meaningful private right of action.
Further, it would impose enforceable limits on how personal data is collected, used and retained — limits defined by law, not corporate discretion or confusing statements in fine print.
This bill does none of those things. Instead, it resolves a decade-long debate over privacy law in favor of the least accountable model: rights on paper, paper tiger enforcement, and preemption of stronger protections.
Privacy is the boundary between individuals and an economy that is built on extracting and monetizing their data. When that boundary weakens, everything downstream weakens with it, including competition, security and autonomy.
Congress still has a choice. It can write a privacy law that imposes real limits and real accountability, or it can pass a law that ensures neither and still call it reform. The SECURE Data Act does the latter. It would entrench a system that protects data extraction over consumer protection and must be rejected.
Adam Levitt is a founding partner at DiCello Levitt LLP.
Amy Keller is a managing partner of the Chicago office and chair of the privacy, technology and cybersecurity practice at the firm.
The opinions expressed are those of the author(s) and do not necessarily reflect the views of their employer, its clients, or Portfolio Media Inc., or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.
[1] https://www.npr.org/2026/03/25/nx-s1-5746125/meta-youtube-social-media-trial-verdict.