A Dog Custody Case Just Exposed the Biggest AI Risk in Court: The Judge Didn't Check
By Don Ho, Esq. | April 6, 2026
California’s 4th District Court of Appeal sanctioned an attorney and issued a first-of-its-kind footnote requiring judges to independently verify case citations in proposed orders, after a San Diego trial court based its ruling on two AI-hallucinated cases that did not exist.

A San Diego couple broke up. They fought over the dog. An attorney cited two cases to support her client’s position. The judge relied on those cases in ruling against the ex-boyfriend. There was just one problem: the cases didn’t exist.
California’s 4th District Court of Appeal, Division 1, confirmed in a unanimous March 5 opinion that the citations on which the trial court’s order was based were AI hallucinations. One was “not a real case.” The other shared a name with a real case but had a different citation and covered spousal support, not pet custody. Neither supported the propositions for which they were cited.
The appellate court sanctioned the attorney $5,000. But the real warning landed on judges: a footnote, added one week after the opinion was issued, states that “it is equally important that judicial officers and court staff who are not themselves using generative AI verify the citations contained in proposed orders submitted to them by counsel.”
That footnote is a quiet earthquake.
What Actually Happened in the Courtroom
Roxanne Chung Bonar represented the woman who had custody of Kyra, the dog at the center of the dispute. During litigation, Bonar cited two cases she said were relevant precedent for the custody determination. The San Diego Superior Court commissioner presiding over the case noted those citations in siding with Bonar’s client, granting her full rights to Kyra and denying the ex-boyfriend’s visitation request.
The losing side’s attorney was then directed to draft a written order reflecting the ruling. That attorney included the citations as the commissioner had noted them, not knowing they were fabricated.
On appeal, a new attorney named David Beavans took over for the ex-boyfriend. Beavans discovered that the two precedents Bonar cited were not real. “Never before have we seen cases invented out of whole cloth,” Beavans told the San Diego Union-Tribune.
Bonar initially called the fabrication claim “baseless” and questioned opposing counsel’s competence for not being able to find the cases. When confronted at oral argument, she said she didn’t remember where the fictitious citations came from. She acknowledged not having a paid subscription to a legal research service and said she had been using online resources, including AI, for her research.
The appeals court noted that Bonar “doubled down” before eventually acknowledging the citations’ provenance was uncertain. She was sanctioned $5,000.
The Judge Problem Is Worse Than the Lawyer Problem
Lawyers submitting AI-hallucinated citations is a known crisis. One widely followed AI sanctions tracker documents more than 1,200 such cases worldwide. Courts have responded with disclosure requirements, sanctions, and new standing orders.
What makes this case different is the failure at the bench. The trial court judge relied on these citations in making her ruling. The order that came out of her courtroom was based, in material part, on fictional law.
The appellate court said it clearly: “It is an abuse of discretion for a court to rely in material part on fictional case authorities in rendering a decision or making an order.”
That language is significant. It puts judges on notice that they cannot passively accept citations submitted by counsel without verification. Pet custody case law is admittedly thin, so it’s understandable that a commissioner might not recognize unfamiliar citations off the top of their head. But that’s exactly the scenario where verification matters most: when you’re relying on cases you haven’t personally read and can’t evaluate from memory.
The Appellate Court Didn’t Overturn the Ruling. That’s the Strange Part.
Despite finding that the trial court’s order was based on fabricated cases, the appeals court declined to reverse the ruling. The panel reasoned that the attorney on the losing side should have caught the fake citations earlier, before they were incorporated into the order. The court also noted it was possible the trial court could have reached the same conclusion without the hallucinated cases.
This creates an awkward precedent. A ruling based on fictional law stands because the opposing side didn’t catch the fiction in time. The implication is that attorneys now bear the burden of verifying not just their own citations but also every citation submitted by opposing counsel, including those the judge has already accepted.
That’s a heavy burden. But given the current state of AI-generated legal research, it’s also unavoidable.
What Every Practitioner Needs to Do Right Now
Verify every citation from opposing counsel. This was always good practice. Now it’s a matter of survival. Run every case citation through Westlaw, LexisNexis, or another authoritative legal database. If a cited case doesn’t appear, flag it immediately. Do not assume the court will catch it.
Object before the order is drafted. If you lose a motion and the court asks you to prepare the order, do not include citations you haven’t independently verified, even if the judge referenced them from the bench. If you can’t find a case the court cited, raise it with the court before the order is entered.
Watch for the AI hallucination tells. AI-fabricated citations often have real-sounding case names paired with wrong reporters, wrong volumes, or wrong years. The case name might exist, but the citation doesn’t match. If something feels too perfectly on point for an obscure legal question, that’s worth a second look.
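For firms that want a first-pass screen before the manual database check, a short script can pull every reporter-style citation out of a draft and turn it into a checklist. The sketch below is a minimal illustration, not a verification tool: the reporter patterns are assumptions covering only a few common formats (U.S., F.3d, Cal. App., Cal. Rptr.), and the script verifies nothing on its own. It only produces the list you still have to run through Westlaw, LexisNexis, or another authoritative database.

```python
import re

# Illustrative patterns for a handful of common reporters, e.g.
# "560 U.S. 538", "123 F.3d 456", "45 Cal. App. 5th 789", "210 Cal. Rptr. 3d 12".
# Real-world citation formats vary far more than this; treat it as a rough net.
REPORTERS = (
    r"(?:U\.S\.|S\. ?Ct\.|F\. ?(?:2d|3d|4th)|F\. ?Supp\. ?(?:2d|3d)?"
    r"|Cal\. ?App\. ?(?:2d|3d|4th|5th)?|Cal\. ?Rptr\. ?(?:2d|3d)?|Cal\. ?(?:2d|3d|4th|5th)?)"
)
CITATION = re.compile(rf"\b\d{{1,4}} {REPORTERS} \d{{1,5}}\b")

def citation_checklist(draft_text: str) -> list[str]:
    """Pull reporter-style citations out of a draft order or brief so each one
    can be looked up by hand in an authoritative database before filing."""
    return sorted(set(CITATION.findall(draft_text)))

if __name__ == "__main__":
    draft = (
        "Petitioner relies on Smith v. Jones, 45 Cal. App. 5th 789, and "
        "Doe v. Roe, 123 F.3d 456, for the proposition that ..."
    )
    for cite in citation_checklist(draft):
        print(f"[ ] verify in Westlaw/Lexis: {cite}")
```

A script like this will flag an obviously malformed citation, but it cannot catch a real-looking citation to a case that doesn’t exist. That is exactly why the manual lookup step can’t be skipped.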
Prepare for the “your attorney should have caught it” defense. This appellate court essentially held that the opposing attorney’s failure to catch fake citations earlier meant the ruling could stand. That reasoning may or may not hold in other jurisdictions, but the practical lesson is clear: you are your own last line of defense.
The Footnote That Changes Everything
The appellate court’s added footnote is the most consequential part of this opinion. By stating explicitly that judges must verify citations in proposed orders, the court extended the AI verification obligation from attorneys to the bench.
This matters because judges are increasingly using AI themselves. A Northwestern University study released the same week found that more than 60% of surveyed federal judges use AI tools in their work. If judges are both receiving AI-generated citations from attorneys and generating their own AI-assisted analysis, the verification gap compounds. Nobody is checking the checker.
The California State Bar announced its first discipline of an attorney for AI-hallucinated citations earlier this year. The San Diego case is the first California appellate opinion to directly address a judge’s responsibility in the same chain of failure. It fits a pattern playing out in courtrooms across the country, from New Mexico to Nebraska to Georgia. And the stakes are rising: a Walmart attorney was just caught using AI-generated citations in a court filing. The shortcuts keep getting taken, and the consequences keep getting worse.
One year from now, every court will need a clear policy on how judges verify citations in orders, regardless of whether those citations came from attorneys, clerks, or AI tools. The San Diego dog custody case is a small-stakes dispute with enormous procedural implications.
The fight over Kyra the dog is over. The fight over who’s responsible for checking AI-generated law is just getting started.
When the judge doesn’t check and the lawyer doesn’t check, nobody catches the hallucination. AI hallucinations in court filings are no longer a novelty; they’re a pattern. If your firm doesn’t have a verification protocol, you’re one bad citation from sanctions. Book a diagnostic to build one before it’s your name on the order.