Understanding the constellation of data laws that affect your business is harder than locating Libra at night in LA. It’s more like finding Ophiuchus. Never heard of it? Exactly.

Here are some top data threats to keep in mind in 2026 and the years ahead. Talk to counsel about whether and how they affect your business.

Zombie privacy claims---still undead

For all the foofaraw over CCPA and its peer-state brethren, the real privacy threat remains zombie laws long predating 21^st^ century statutes.

Only the government can enforce CCPA and its peers; plaintiffs’ lawyers can’t. But zombie laws allow private lawsuits---and many open up fixed per-person fines. Zombie claims argue that the common web tech on your website constitutes surveillance devices or wiretaps your visitors. It’s your domain, so you’re liable.

In the last five years, zombie cases have outnumbered CCPA enforcements 90 to 1. And that doesn’t include state-court claims, or the out-of-court settlements that in all likelihood outstrip all court cases combined.

...& new automated scanners mean no escape

If you operate online and haven’t faced a zombie demand, your luck may soon run out. Software firms are launching products that scan websites automatically to provide all the audit trails and visual artifacts necessary to backstop a claim. To date, that’s been a laborious process for the small group of firms responsible for most zombie claims.

As the floodgates open, courts may finally put up barriers to zombie claims (in the memorable 2025 words of a federal judge, “this is yet another pixel case”). Until then, smart mitigation steps can reduce your risk profile without sacrificing the tools that power your sales & marketing.

Outbound AI requests? Input perils for privacy & IP

By now, you’re hopefully licensing paid AI tools. If you or your employees aren’t paying in dollars, you’re paying in data. That’s table stakes, but standard license terms and an internal AI Use Policy leave gaps:

  • Your employees are probably using additional AI tools because yours don’t meet their needs. Outside the big-tech and large-enterprise space, many white-collar employees will admit as much to friends. You circulated an AI Use Policy? That alone won’t stop them.
  • Payment alone doesn’t address all risk. Oh, “the contract says no training on our data”? You’re probably still sending sensitive and proprietary business information to an unknown destination without confidentiality protection. (Running inference in a dedicated cloud? Never mind.)

Developing products and discussing legal or compliance issues with AI are particularly risky. The first can undermine registered-IP protections. The second produces conversations and usage records that are discoverable and unlikely to be privileged. By default, many services keep them indefinitely.

In H1 2023, OpenAI got just 6 reportable government requests for data (that excludes other legal contexts, like lawsuit subpoenas). In H2 2024? 61. In H2 2025? 309. At that pace, 20,000 is a fair estimate for the next two years.
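One way to sanity-check that ballpark: assume requests keep compounding at the most recent year-over-year rate. A minimal sketch (the function and the growth assumption are ours, not from any official disclosure):

```python
# Hypothetical extrapolation of the request counts cited above.
# Reported half-year figures: H2 2024 = 61, H2 2025 = 309.

def extrapolate_requests(last: float, yearly_growth: float, half_years: int) -> float:
    """Project cumulative requests over future half-year periods,
    assuming counts compound at `yearly_growth` per year
    (i.e. sqrt(yearly_growth) per half-year)."""
    half_factor = yearly_growth ** 0.5
    total = 0.0
    value = last
    for _ in range(half_years):
        value *= half_factor
        total += value
    return total

growth = 309 / 61  # roughly 5x year over year (H2 2024 -> H2 2025)
projection = extrapolate_requests(309, growth, 4)  # the next two years
# Lands in the low five figures: the same order of magnitude
# as the article's ~20,000 estimate.
```

Whether growth actually stays exponential is anyone's guess; the point is that even modest assumptions put the total far beyond today's numbers.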

Selling user data for training: cash now, liability later

Data-hungry AI developers see untapped opportunity in modeling human behavior to improve advertising and marketing products. If you’re approached about licensing user content or behavioral data for training, a cash-heavy, low-burden deal can be tempting to sign on the spot. But understanding the market and implementing data hygiene are key to preserving future value and minimizing long-tail liability risk.

Don’t mess with Texas (or Connecticut)

About 20 states now follow California's lead by enacting a disclosure-and-opt-out privacy regime. Early on, most states watered down CCPA's requirements. But the big-tech backlash of the last few years created strong rules in surprising states like Texas.

Start at the beginning:

  • CCPA only kicks in the year after a business first hits $26.5m in revenue or has 100,000 Californians’ data.
  • Texas law kicks in when a business ceases to be a “small business” under the U.S. SBA definition (and sells something “consumed by” Texans). That threshold fluctuates by industry NAICS code. Are you a loan brokerage? $15m revenue. Interior design shop? $9m. Law firm? $15.5m.
The difference: Texas and a few others require affirmative, opt-in consent to sell or share sensitive data (like precise geolocation) or to process data for previously undisclosed purposes. If you want to share data (...or realize you already do) and haven’t said so before, an email update may not be enough.
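Purely as an illustration of the thresholds above (not legal advice; the function names and inputs are hypothetical, and real applicability turns on statutory definitions, exemptions, and timing):

```python
# Hypothetical sketch of the applicability triggers described above.

def ccpa_may_apply(prior_year_revenue: float, ca_consumer_records: int) -> bool:
    """CCPA kicks in the year AFTER a business first crosses either
    trigger: ~$26.5M in annual revenue, or data on 100,000 Californians."""
    return prior_year_revenue >= 26_500_000 or ca_consumer_records >= 100_000

def texas_may_apply(is_sba_small_business: bool, sells_to_texans: bool) -> bool:
    """Texas's law applies once a business stops being an SBA
    'small business' (a NAICS-code-specific threshold) and sells
    something 'consumed by' Texans."""
    return (not is_sba_small_business) and sells_to_texans
```

For example, `ccpa_may_apply(30_000_000, 5_000)` trips the revenue trigger even with only a few thousand Californians’ records. The real analysis is far messier; this just shows that the two regimes key off entirely different numbers.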

Age verification grows up

  1. For 30 years, age verification was an afterthought unless your business targeted kids under 13. A checkbox ("I represent I'm at least 13 years old") at registration offered protection.
  2. State efforts to broaden verification requirements have grown beyond pornography offerings---some can kick in if you offer user-to-user messaging. Meanwhile, generally applicable privacy laws almost always require consent to sell or share kids' data, and increasingly set the upper limit at 16 (CA) or 18 (DE, FL).
  3. If your products or services attract a teen audience and you haven't considered verification measures, it's high time to look at that afresh.

Risk assessments reach our shores

State privacy laws require businesses to document internal risk assessments before exposing consumer data to risky processing. Texas joined California in requiring these cost-benefit analyses before ‘sharing’ data online via adtech.

As states with newer laws finally ramp up enforcement, targets should expect higher fines and longer scrutiny if they can’t cough up their paperwork.

CCPA security audits and AI burdens loom

Not to be outdone, California's latest privacy regulations went live January 1. In addition to risk assessments, further compliance burdens attach to businesses that:
  • maintain 250,000 Californians’ data or 50,000 Californians’ sensitive data → a ‘cybersecurity audit’
  • use AI to help reach decisions with legal or similarly significant effects → pre-use notice & opt-out

A cybersecurity audit can be internal, and audits under existing standards suffice (like a SOC 2 Type II, which in practice tests the controls you describe to the auditor rather than independently probing your defenses).
