Understanding the constellation of data laws that affect your business is harder than locating Libra at night in LA. It’s more like finding Ophiuchus. Never heard of it? Exactly.

Here are the top threats to keep in mind for 2026 and the years ahead. Talk to counsel about whether and how they affect your business.

## Zombie privacy claims---still undead

For all the foofaraw over CCPA and its peer-state brethren, the real privacy threat remains zombie laws that long predate 21st-century statutes.

Only the government can enforce CCPA and its peers; plaintiffs' lawyers can't. But zombie laws allow private lawsuits---and many provide fixed per-person statutory fines. Zombie claims argue that the common web technologies on your website constitute surveillance devices or wiretap your visitors. It's your domain, so you're liable.

This stark comparison understates the situation, as most zombie demands quietly settle out of court. Since 2021:

- CCPA: 10 enforcement actions; $10.6 million.

- California zombie laws:

## ...& new automated scanners mean no escape

If you operate online and haven’t dealt with a zombie demand, your luck may soon run out.

## Outbound AI requests? Input perils for privacy & IP

By now, you probably know your business uses AI tools. If you or your employees aren't paying in dollars, you're paying in data. Your employees are probably using free services because yours don't meet their needs. Outside of big tech and large enterprises, some or even most white-collar employees will admit as much to friends.

Payment alone doesn't address all the risk; it's high time to follow the data. Oh, "the contract says no training on our data"? You're probably sending sensitive and proprietary data without confidentiality protection, and all but certainly to a destination unknown. (Running models in a dedicated cloud? Good on ya.) Are you and your employees discussing internal legal or compliance issues with AI? Those conversations are discoverable and unlikely to be privileged.

In H1 2023, OpenAI received just 6 reportable government requests for data (excluding other legal contexts, like lawsuit subpoenas). In H2 2024? 61. In H2 2025? 309. At that growth rate, the total could near 20,000 within a few years.
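Those figures imply roughly fivefold year-over-year growth. As an illustrative back-of-the-envelope check (assuming that rate simply holds, which it may not), a quick sketch:

```python
# Illustrative extrapolation from the three figures cited above.
# Assumption: half-yearly request counts keep growing at the same
# year-over-year rate observed from H2 2024 to H2 2025.
h2_2024 = 61
h2_2025 = 309

annual_growth = h2_2025 / h2_2024  # roughly 5x year over year

count = h2_2025
year = 2025
while count < 20_000:
    year += 1
    count *= annual_growth
    print(f"H2 {year}: ~{count:,.0f} projected requests")
# The projection crosses 20,000 between H2 2027 and H2 2028.
```

Crude, yes---but it shows why "near 20,000" is a plausible near-term figure, not hyperbole.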

## Selling user data for training: cash now, liability later

AI models aren't magic. They are fundamentally just the result of software learning from data, and the software techniques that build them are not new at all. So what's with the massive capital raises in the AI sector? Sure, training is expensive, but training a model alone doesn't attract investment.

To build a standout model, developers need good data. That increasingly means licensed data: scraped and torrented data isn't just a liability risk, it's a risk plenty of competitors will take anyway. So the best kind is data competitors don't have or can't access.

For years, rightsholders like academic publishers have licensed content and data for training.

## Age verification grows up

## Risk assessments reach our shores

## Don't mess with Texas (or Connecticut)

## CCPA security audits and AI burdens loom

## Forecast: torrential downpour of rights requests
