
Amazon Is Using a Hacking Law to Kill AI Shopping Agents. The Ninth Circuit Will Decide If That's Legal.

By Don Ho, Esq. | April 8, 2026

Last updated: April 8, 2026

Amazon is using the Computer Fraud and Abuse Act — a 1984 federal hacking statute — to block Perplexity AI’s Comet browser from helping consumers shop on its platform, and the Ninth Circuit’s upcoming ruling in Amazon v. Perplexity (No. 26-1444) will determine whether any website can use criminal fraud law to kill competing AI agents that its own customers want to use.

The dispute centers on Perplexity’s Comet browser, a Chromium-based browser with a built-in AI assistant that can search, compare, and purchase products on Amazon at the user’s direction. Amazon sued in November 2025, won a preliminary injunction on March 9, 2026, and then watched the Ninth Circuit stay that injunction on March 30 pending appeal. Amazon has until April 22 to respond.

Here is what makes this case matter beyond two companies fighting over market share: the legal theory Amazon is using, if it holds, would let any website operator block any third-party software that accesses their platform in ways they don’t approve, using criminal fraud statutes designed to prosecute hackers.

What Comet Actually Does

Comet launched in July 2025, initially available only to Perplexity’s $200/month Max subscribers. It became broadly available in October 2025 after millions joined a waitlist. To build it, Perplexity acquired the browser company Sidekick and deployed nearly 100 engineers.

The browser includes an optional AI assistant. A user clicks a tab to activate it. Once active, the assistant can browse websites, search for products, and complete purchases at the user’s instruction. The technical pipeline works like this: the assistant takes a screenshot of what the user sees in the browser, sends an encrypted version to Perplexity’s servers, receives instructions for the next step, and executes that step in the browser on the user’s computer.

The critical technical fact, which Amazon’s own expert confirmed in the district court: no Perplexity computer ever connects directly to Amazon’s servers. All data flows through the user’s browser first. The assistant cannot log into a user’s Amazon account independently. Purchases require manual user authorization.
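That pipeline can be sketched in a few lines of Python. Everything here is illustrative — the class and function names (`StubBrowser`, `plan_next_step`, `Step`) are hypothetical, not Perplexity’s actual code or API — but the data flow matches the description above: a screenshot is captured on the user’s machine, sent to the assistant’s servers, and a single instruction comes back to be executed locally in the user’s browser.

```python
# Hypothetical sketch of a Comet-style agent loop. Not Perplexity's code;
# it only models the data flow described above: the assistant's server
# sees screenshots and returns instructions, but every request to the
# retailer originates in the user's own browser.

import base64
from dataclasses import dataclass


@dataclass
class Step:
    action: str        # e.g. "type", "click", or "done"
    target: str = ""   # element the action applies to
    value: str = ""    # text to enter, if any


class StubBrowser:
    """Stand-in for the user's local browser session."""

    def __init__(self):
        self.log = []

    def screenshot(self) -> bytes:
        return b"fake-png-bytes"  # what the user currently sees

    def execute(self, step: Step) -> None:
        self.log.append(step)     # all retailer requests start here, locally


def plan_next_step(screenshot: bytes, goal: str, turn: int) -> Step:
    """Stands in for the round trip to the assistant's servers: an
    encrypted screenshot goes up, one instruction comes back. The
    server never contacts the retailer directly."""
    encoded = base64.b64encode(screenshot).decode()  # placeholder for encryption
    assert encoded  # pretend this payload was POSTed to the assistant service
    if turn == 0:
        return Step("type", target="search-box", value=goal)
    if turn == 1:
        return Step("click", target="first-result")
    return Step("done")


def run_agent(browser: StubBrowser, goal: str, max_steps: int = 20) -> None:
    for turn in range(max_steps):
        step = plan_next_step(browser.screenshot(), goal, turn)
        if step.action == "done":
            break
        browser.execute(step)  # executed on the user's computer
```

Running `run_agent(StubBrowser(), "AA batteries")` loops twice (type, then click) before the planner returns “done” — and every `execute` call happens client-side, which is the fact Amazon’s expert conceded.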

In Perplexity’s framing, a Comet user accessing Amazon is no different from a Safari user accessing Amazon. The user runs the software. The user’s computer makes the requests. Perplexity provides the assistant that helps the user navigate.

Why Amazon Is Fighting This

Amazon generated $68.6 billion in advertising revenue in 2025. When a user activates Comet’s AI assistant to shop on Amazon, the assistant bypasses sponsored products, recommendations, and other advertising elements. It goes straight to what the user asked for.

Amazon’s advertising business depends on attention. Every product search on Amazon is an ad impression. When AI agents shop on behalf of users, those impressions evaporate.

Amazon acknowledged this internally. According to Perplexity’s brief, Amazon’s own complaint noted that its advertisers do not pay for ads shown to “automated agents.” The company formed a group focused on agentic AI in March 2025 and launched its own suite of AI tools, including “Buy for Me” (which completes purchases on third-party retailer websites using an AI agent) and Rufus (an AI shopping assistant that reached 300 million users and generated nearly $12 billion in incremental sales in 2025).

Amazon has also blocked dozens of outside AI agents from its platform, including OpenAI’s ChatGPT. Perplexity, for its part, already faces a class action over user data practices, so it is navigating legal exposure from multiple directions. And the platform-dependency risk is not abstract: Anthropic’s recent OAuth lockdown showed how quickly a provider can cut off access for an entire ecosystem.

So the picture is clear: Amazon is building its own AI shopping agents while using federal hacking law to block competitors’ AI shopping agents from touching its platform.

Amazon’s CFAA Theory

Amazon’s primary claim is under the Computer Fraud and Abuse Act, 18 U.S.C. § 1030(a)(2). The statute was enacted in 1984 to prosecute computer hacking. A claim requires five elements: (1) intentional access to a computer, (2) without authorization or in excess of authorized access, (3) obtaining information, (4) from a protected computer, and (5) at least $5,000 in loss.

Perplexity’s brief argues Amazon cannot satisfy any of the five.

On access: Perplexity’s computers never touch Amazon’s servers. The user’s browser makes every request. Suing Perplexity for the user’s browsing is like suing Apple because a Safari user visited Amazon.

On authorization: Amazon’s own account holders authorized the assistant to help them shop. The district judge herself acknowledged the conduct is potentially “beneficial” to everyone involved. The key precedent Amazon cited, Facebook v. Power Ventures, involved a company accessing third parties’ private data without those users’ permission. Here, every piece of information the assistant touches belongs to the user who activated it.

On information: Perplexity obtains nothing directly from Amazon. Data arrives at Perplexity’s servers only after passing through the user’s browser.

On losses: The Supreme Court’s Van Buren decision (2021) and the Ninth Circuit’s hiQ Labs v. LinkedIn ruling (2022) established that CFAA “loss” requires technological harms like corrupted files, not lost advertising impressions. Amazon offered no evidence of any technological damage. Its real complaint is lost ad revenue.

The District Court’s Reluctance

Judge Maxine Chesney issued the preliminary injunction on March 9 after a hearing in which she repeatedly acknowledged the case was unprecedented. Her statements from the bench are telling. She said it was “almost as if what [Perplexity is] doing shouldn’t be covered” by the CFAA, and that “people ought to be allowed to bring these kinds of shopping assistants in and have them talk.” She described herself as “kind of stuck” because the statute “is so broad and covers people who may be performing a beneficial act.”

A judge who grants an injunction while openly questioning whether the statute should apply is a judge handing the appeals court a roadmap to reverse.

What This Means for Every Platform and Every AI Builder

If Amazon’s theory holds, any website operator can block any third-party AI agent from interacting with their platform by sending a cease-and-desist letter and then suing under the CFAA. That creates a legal framework where incumbents control which AI tools consumers can use on any website, regardless of what those consumers actually want.

Walmart and Target have taken the opposite approach, testing ways to work with third-party AI shopping tools. According to PYMNTS Intelligence, 70% of consumers say they are open to using AI agents for shopping, and more than half of consumers who rely primarily on AI platforms prefer to complete purchases inside those environments.

The Ninth Circuit’s ruling will set the terms for whether agentic AI can operate on the open web, or whether platforms can use a 1984 hacking statute to wall off their ecosystems from AI competition. This case sits at the intersection of the AI regulatory patchwork — where existing statutes get stretched to cover AI use cases they were never designed for — and the real-world safety risks of agentic AI acting autonomously on behalf of users.

What to Do Now

If you’re building AI agents that interact with third-party platforms, watch this case. If the Ninth Circuit reverses (likely, given the stay and the district judge’s own skepticism), you will have strong precedent that user-initiated AI browsing does not violate the CFAA. If it affirms, you need to reassess every agent that touches a platform with restrictive terms of service.

If you’re a platform operator, this case defines the boundary of your control. Amazon’s approach, blocking outside agents while building its own, will face antitrust scrutiny regardless of the CFAA outcome. The DOJ and FTC have already taken active positions in cases involving AI and competition.

If you’re a general counsel at any company using AI shopping or procurement agents, audit your exposure. The CFAA carries both civil and criminal penalties. Even if the Ninth Circuit narrows the statute, the legal uncertainty alone creates risk for any deployment that touches a platform with a published prohibition on automated access. The FTC’s case-by-case enforcement approach means you can’t rely on clear agency guidance either — you’re building compliance strategy in real time.

Amazon has until April 22 to file its response. Oral arguments are expected this summer. The Ninth Circuit’s decision will be the first major appellate ruling on whether AI agents have the legal right to shop on behalf of consumers, or whether platforms can lock them out with a criminal statute designed for a different era.


If you’re deploying AI agents that touch third-party platforms, your legal exposure isn’t hypothetical — it’s one cease-and-desist letter away. Kaizen AI Lab helps companies build agentic AI strategies that don’t end in a courtroom. Let’s talk.


Don Ho, Esq. is Founder & CEO of Kaizen AI Lab, advising companies on operational growth strategies and the legal aspects of AI integration in their businesses.
