Connecticut Just Drafted the Most Aggressive AI Deepfake Liability Bill in the Country
Last updated: April 25, 2026
Connecticut Attorney General William Tong and the legislature’s Judiciary Committee announced HB 5312 on April 23. The bill creates a private right of action for any Connecticut resident who is the subject of a nonconsensual AI-generated sexual image. It also gives the AG authority to sue platforms that fail to remove flagged images within 48 hours, with fines up to $25,000 per day. Two-year statute of limitations from discovery. Plaintiff anonymity available on request. This is not Connecticut’s first deepfake law. It is the first one with real teeth.
The federal Take It Down Act, signed last year, criminalized nonconsensual AI-generated sexual content and required platforms to remove flagged material within 48 hours. Federal enforcement has been slow. Connecticut’s HB 5312 is what happens when a state attorney general loses patience.
What HB 5312 Actually Does
Three operative provisions. First, an individual whose likeness appears in an AI-generated sexual image gets a private cause of action against the person who publicized it. Damages are not capped in the draft, and the plaintiff can proceed anonymously. Second, the Connecticut AG gets standing to sue companies whose platforms host such images and refuse to remove them on request. Third, the bill imposes a $25,000 daily fine on platforms that miss the 48-hour takedown window after receiving a valid takedown request.
The two-year discovery rule is the structural punch. AI-generated images can sit on obscure forums for years before a victim finds them. Most state revenge-porn statutes start the clock at the date of dissemination, which is often unknowable. HB 5312 starts it at discovery. That is a much friendlier rule for plaintiffs and a much harder one for defendants who hoped to run out the clock.
Why $25,000 Per Day Is the Number That Matters
The takedown fine is what platform GCs need to focus on. Forty-eight hours is short. A platform receiving 100 valid takedown requests in a quarter, missing the window on 20 percent of them by an average of three days, is looking at $1.5 million in exposure. That math compounds fast at scale.
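To make the arithmetic concrete, here is a back-of-the-envelope sketch. The request volume, miss rate, and average overrun are the hypothetical figures from the paragraph above, not anything in the bill; only the $25,000 daily fine comes from the draft.

```python
# Quarterly fine exposure under HB 5312's draft takedown penalty.
# Volume, miss rate, and overrun are hypothetical planning inputs;
# only the $25,000/day figure comes from the draft bill.

DAILY_FINE = 25_000          # dollars per day past the 48-hour window

requests_per_quarter = 100   # valid takedown requests received
miss_rate = 0.20             # share that miss the 48-hour window
avg_days_late = 3.0          # average overrun on missed requests, in days

missed = requests_per_quarter * miss_rate
exposure = missed * avg_days_late * DAILY_FINE

print(f"missed requests:    {missed:.0f}")       # 20
print(f"quarterly exposure: ${exposure:,.0f}")   # $1,500,000
```

Scale the volume up tenfold at the same miss rate and it is a $15 million quarter. That is the compounding the paragraph above is pointing at.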
More importantly, the fine puts per-image stakes high enough to justify funding a real moderation operation and low enough that plaintiffs and the AG will actually pursue it. State AG offices love statutes that pay for themselves. Tong's office spent the press conference walking through specific use cases: school photos manipulated and distributed in classrooms, Telegram group chats trading nonconsensual material, and a CNN-documented network of men sharing videos of drugging their own wives. The political optics of underenforcement are brutal. This statute will be used.
The Compliance Architecture Most Platforms Do Not Have
If you advise a platform that hosts user-generated content (anything with image uploads, comment threads that accept attachments, or AI generation features), the federal Take It Down Act already requires a takedown process. Most platforms built one. Few built it well.
A Take It Down Act-compliant intake form is not the same as a 48-hour-with-fines workflow. The differences:
Triage speed. The 48-hour clock starts at receipt of the takedown request, not at first human review. If requests land in a generic abuse@ inbox that is read Monday through Friday, a Friday-evening filing burns through the entire window before anyone opens it Monday morning; the sketch after this list walks through that scenario. Build a same-day routing path or pay the fine.
Verification standards. Federal law allows reasonable verification of the requestor’s identity. Connecticut’s bill does not waive that, but a state AG with a $25,000-per-day hammer will not be sympathetic to platforms that demand notarized affidavits before pulling material. Reasonable means reasonable.
Documentation. Every removal decision needs a timestamp, a reviewer, and a rationale. Without that record, you cannot prove compliance when the AG sends a subpoena six months later.
Appeals for over-removal. If a platform pulls content that turns out to be authentic and consensual, the original poster has a defamation claim brewing. Build the reverse process or face the second lawsuit.
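For teams building this out, a minimal sketch of what the clock math and the documentation record look like together. The TakedownRequest shape and its field names are illustrative assumptions, not anything HB 5312 or the Take It Down Act prescribes; the scenario is the Friday-evening filing from the triage item above.

```python
# Minimal sketch of a takedown record plus the 48-hour deadline check.
# The TakedownRequest shape and field names are illustrative assumptions,
# not anything HB 5312 or the Take It Down Act prescribes.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

WINDOW = timedelta(hours=48)  # the clock starts at receipt, not at first review

@dataclass
class TakedownRequest:
    request_id: str
    received_at: datetime             # when the request hit any intake channel
    removed_at: datetime | None = None
    reviewer: str | None = None       # who made the call
    rationale: str | None = None      # why: the record a later subpoena will ask for

    @property
    def deadline(self) -> datetime:
        return self.received_at + WINDOW

    def breached(self, as_of: datetime) -> bool:
        # breached if removal happened late, or hasn't happened and the deadline passed
        done = self.removed_at or as_of
        return done > self.deadline

# Friday 6pm filing against an inbox first read Monday 9am:
req = TakedownRequest(
    request_id="TDR-0001",
    received_at=datetime(2026, 10, 2, 18, 0, tzinfo=timezone.utc),   # Friday
)
monday_review = datetime(2026, 10, 5, 9, 0, tzinfo=timezone.utc)     # Monday
print(req.deadline)                 # 2026-10-04 18:00 UTC: Sunday evening
print(req.breached(monday_review))  # True: the window closed over the weekend
```

The reviewer and rationale fields are the documentation item above in miniature: if a breach ever has to be explained in a subpoena response, those are the fields that explain it.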
The 35-State AG Letter Is the Real Signal
Tong is one of 35 state attorneys general who signed a March letter to xAI demanding that Grok stop generating nonconsensual sexual images of public figures, take down what it has already created, and suspend the users who prompted the content. Thirty-five AGs is a coalition large enough to coordinate enforcement across most of the country. HB 5312 is Connecticut’s version of putting state law where the coalition’s letter already pointed.
Expect similar bills in New York, California, Massachusetts, Illinois, and Washington within the next 90 days. The drafting language is close enough across states that compliance teams can build one workflow that satisfies all of them, but only if the workflow is built to the strictest standard. Right now, that is Connecticut.
What GCs Should Do This Week
Three things, in order.
Audit your takedown workflow against a 48-hour clock. Time every step from inbound request to removal decision. If your median is over 24 hours, the workflow is broken. If your 95th percentile is over 48, you have a fine problem coming. A measurement sketch follows this list.
Map your platform’s AI generation features against the bill’s definition of “AI-generated sexual image.” Any image generation tool that does not have prompt filtering and output review is an enforcement target. The Connecticut AG’s office has been clear that platforms that build the tool also own the output.
Update your terms of service. The bill creates user liability for publication, not just platform liability. If your TOS does not explicitly prohibit nonconsensual AI-generated sexual content and reserve termination rights, fix it. The platform that can demonstrate clear user agreement, prompt enforcement, and good-faith takedown compliance has a much better defense profile than the one that cannot.
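For the audit in the first item, one way to run the numbers: pull receipt-to-removal turnaround from your own logs and check the median and 95th percentile against the thresholds above. A minimal sketch with invented durations; swap in real data.

```python
# Turnaround audit against the 24-hour median / 48-hour p95 thresholds above.
# The sample durations are invented; feed in hours from receipt to removal
# pulled from your own takedown logs.
import statistics

def p95(hours: list[float]) -> float:
    # quantiles(n=20) returns 19 cut points; index 18 is the 95th percentile
    return statistics.quantiles(hours, n=20)[18]

turnaround_hours = [6, 9, 12, 14, 20, 22, 26, 30, 33, 41, 47, 52, 70]

med = statistics.median(turnaround_hours)
tail = p95(turnaround_hours)

print(f"median: {med:.1f}h (threshold: 24h)")
print(f"p95:    {tail:.1f}h (threshold: 48h)")
if med > 24:
    print("median over 24h: the workflow is broken")
if tail > 48:
    print("p95 over 48h: fine exposure incoming")
```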
The bill is moving. Connecticut’s session runs into early summer. If it passes in current form, it takes effect October 1. That gives platforms and content companies five months to build the operational muscle. Anyone hoping the federal Take It Down Act would be the ceiling on this kind of liability now has a clear answer. It is the floor.