AI-Hallucinated Case Law in Canada: What We Found, What It Means, and What Needs to Change

I. Summary

We spent the past year looking for AI-hallucinated cases submitted as real law in Canadian courts. We found too many.

Between January 1, 2024, and March 10, 2026, Canadian courts and tribunals caught parties citing at least 211 non-existent cases across 111 separate decisions. Those decisions span 42 courts and tribunals nationwide. In 82 of the 111 decisions, the court found or presumed that AI tools generated the fictitious cases.

The growth has been rapid. We identified 7 decisions in 2024. That rose to 80 in 2025. In just the first ten weeks of 2026, we have already found 24 more.

These numbers are conservative. They count only what judges and adjudicators caught, remarked on, and published. In nearly half the decisions (54 of 111), the court flagged fictitious citations without naming the specific cases. That means the true count per decision is often unknown, and citations that went undetected never enter the data at all.

The 211 figure is a floor. The ceiling is anyone’s guess. Our full dashboard is available at courtready.ca/fictitious-citations-in-canadian-courts.

II. Who is affected?

In 87 of the 111 decisions, the fictitious cases appeared in the submissions of self-represented litigants. If you are representing yourself and using AI to help with your legal research, this matters to you directly. AI tools do not just occasionally make mistakes. They fabricate entire cases. They will give you a citation, a case name, a year, and a court. Everything will look real. But when the judge searches for that case and finds nothing, the consequence falls on you.

Legal professionals have submitted fictitious cases, too. This problem is not limited to any one group. But self-represented litigants are the most exposed. Many may not even know that AI has a tendency to “hallucinate”.

III. What needs to change?

We built CaseCheck to address part of this problem. It lets users upload a list of cases they plan to cite, extracts each citation, and prepares it for verification against a Canadian case law database. It keeps a human being in the loop: rather than letting AI check AI, the tool ensures that a real person makes the final call on whether each case exists.
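To make the workflow concrete, here is a minimal sketch of the extract-and-prepare step. It is an illustration, not CaseCheck's actual implementation: the regex, function names, and the tiny in-memory "database" are all assumptions made for the example. Canadian neutral citations do follow the "year, court code, number" pattern the regex targets.

```python
import re

# Canadian neutral citations follow "<year> <court code> <number>",
# e.g. "2023 ONSC 1234" or "2024 FC 567".
NEUTRAL_CITATION = re.compile(r"\b(?:19|20)\d{2}\s+[A-Z]{2,6}\s+\d{1,6}\b")

def extract_citations(text: str) -> list[str]:
    """Pull candidate neutral citations out of a block of submission text."""
    return NEUTRAL_CITATION.findall(text)

def prepare_for_review(citations: list[str], database: set[str]) -> list[tuple[str, str]]:
    """Pair each citation with a preliminary status.

    The tool only flags; a human makes the final call on each citation.
    The `database` set here is a stand-in for a real case law database.
    """
    return [
        (c, "found in database" if c in database else "NOT FOUND - verify manually")
        for c in citations
    ]

submissions = "The applicant relies on 2023 ONSC 1234 and 2099 XX 999."
known = {"2023 ONSC 1234"}
for citation, status in prepare_for_review(extract_citations(submissions), known):
    print(citation, "->", status)
```

Note the design choice the sketch preserves: the code never auto-rejects a citation, it only marks it for manual verification, keeping the human in the loop.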


But a verification tool alone is not enough. Courts and tribunals need to take three proactive steps to protect self-represented litigants before fictitious citations end up in their submissions.

First, educate self-represented litigants about the specific risks of using AI for legal research, including the risk of AI-hallucinated citations. Put simply: AI tools fabricate cases. They invent case names, citations, and holdings that sound real but are not. Courts should say this plainly, in intake materials, on their websites, and at the start of proceedings.

Second, point self-represented litigants to where they can actually do legal research. Ask yourself: how many Canadians know that CanLII exists? Lawyers take access to case law databases for granted. We learned about these databases in law school and use them in our practice. A self-represented litigant searching for legal information may find ChatGPT long before they find CanLII. Courts should be actively directing people to free, reliable legal research tools.

Third, help self-represented litigants understand what happens when they rely solely on AI to generate their legal arguments. We are not talking about fines or penalties. We are talking about the fact that AI does not just fabricate citations. It fabricates entire legal arguments. It will build an argument that sounds persuasive and is completely wrong. The consequence is not just punishment from the court. It is that you lose your case because your arguments never had a legal foundation to begin with.

IV. The data is public.

We have published the full dataset as a freely accessible, bilingual database at courtready.ca/fictitious-citations-in-canadian-courts. We update it weekly. Researchers, journalists, lawyers, judges, and self-represented litigants can all access it.


If you are preparing legal submissions, use CaseCheck to verify your citations before you file. If you are a court or tribunal, consider what you are doing to warn Canadians appearing before you.

The problem is growing faster than the system’s ability to catch it. The issue is here to stay, so waiting is not a strategy.

V. Methodology

Courtready conducted manual, targeted keyword searches of decisions published on the Canadian Legal Information Institute (CanLII) from January 1, 2024, to March 10, 2026. Search terms were designed to capture judicial language indicating that a cited authority could not be verified, as well as decisions that explicitly discuss the use of artificial intelligence in legal proceedings.
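The flagging step of that search can be sketched as follows. The phrases below are hypothetical examples of the kind of judicial language described above, not Courtready's actual search terms, and the function is an illustration of the approach rather than the study's tooling.

```python
# Illustrative stand-ins for judicial language that signals an unverifiable
# authority or AI use; NOT the actual search terms used in the study.
FLAG_PHRASES = [
    "could not be located",
    "does not exist",
    "fictitious citation",
    "artificial intelligence",
]

def flag_decision(decision_text: str) -> list[str]:
    """Return the flag phrases found in a published decision's text.

    Any hit means the decision warrants manual review; keyword matching
    alone cannot confirm that a citation is fictitious or AI-generated.
    """
    lowered = decision_text.lower()
    return [phrase for phrase in FLAG_PHRASES if phrase in lowered]

decision = "The cited authority could not be located on any database."
print(flag_decision(decision))
```

This also illustrates why the estimate is conservative: a decision that discusses a fake citation without using any of the searched phrases would never be flagged.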

The findings represent a conservative estimate. Decisions not published on CanLII, or that do not use language captured by the search methodology, would not be reflected in this data. Not all fictitious citations identified in the study are necessarily AI-generated: in 82 of 111 decisions, courts found or presumed AI involvement. In the remaining 29 decisions, courts did not conclusively establish the source of the non-existent cases.

VI. FAQs

How many fictitious citations have been found in Canadian courts?

As of March 10, 2026, Canadian courts and tribunals have flagged at least 211 non-existent cases cited as real law across 111 decisions and 42 courts and tribunals. This figure is conservative and reflects only cases that judges caught and wrote about in published decisions.

What are AI-hallucinated citations?

AI-hallucinated citations are case references generated by artificial intelligence tools that look real but refer to cases that do not exist. They typically include a plausible case name, citation, year, and court, making them difficult to identify without verification.

Do AI tools create fake legal cases?

Yes. Generative AI tools such as ChatGPT can fabricate case names, citations, and legal holdings that appear authentic. In 82 of 111 Canadian court decisions flagging fictitious citations, the court found or presumed that AI tools generated them.

How can I verify if a Canadian legal citation is real?

CaseCheck (casecheck.courtready.ca) lets users upload their case list and verify each citation against a Canadian case law database. Users can also search for cases directly on CanLII, Canada’s free legal research database.

What is CaseCheck?

CaseCheck is a Canadian legal citation verification tool built by Courtready. Users upload a list of cases they plan to cite, and the tool extracts each citation and prepares it for verification, keeping a human being in the loop.