17 July 2025

Artificial Advocacy: How Canadian Courts And Legislators Are Responding To Generative AI

Osler, Hoskin & Harcourt LLP

Introduction

Generative artificial intelligence is steadily making its way into daily legal practice. While this evolving technology offers incredible potential to optimize legal processes and improve access to justice, it must be used responsibly. Among other things, the increased use of this type of AI has resulted in inaccurate or non-existent legal citations in court materials (colloquially referred to as "hallucinations"). One independent global database has tracked more than 150 known cases of hallucinations in court to date.

As the AI landscape evolves, Canadian courts and legislatures have been grappling with the use of AI in the courtroom. Below, we provide a brief guide to the different rules and directions issued to date to address AI in Canadian courts, as well as a review of several recent decisions involving hallucinations in Canadian court filings.

Canadian courts' approaches to AI

Thus far, several Canadian courts and legislatures have adopted approaches to combat hallucinations in court materials, and three general frameworks have emerged:

  1. The requirement for certification of the authenticity of authorities
  2. The requirement for disclosure if AI is used to prepare court materials
  3. Encouragement of the responsible use of AI in preparing court materials

Certifying authenticity

Ontario is presently the only jurisdiction with a legislative requirement addressing AI in the courtroom. Amendments to the Ontario Rules of Civil Procedure in O. Reg. 384/24, introduced on December 1, 2024, require:

  • Certification of legal authorities: litigants must certify that authorities cited in factums submitted to the court are authentic. Authorities from certain sources, such as government websites or CanLII, are presumed authentic unless proven otherwise.
  • Certification of expert reports: experts must certify the authenticity of every authority, document or record referred to in an expert report. As with legal authorities, sources from government websites, scholarly journals and commercial publishers of research are presumed authentic unless proven otherwise. Experts must express any doubts they have about the authenticity of sources within the certification. However, experts need not certify the authenticity of evidence provided to them for analysis.

Certification is required for every factum or expert report submitted to Ontario courts, irrespective of whether AI was used. Certification is in the form of a signed statement that the party "is satisfied as to the authenticity" of the authorities in a factum or expert report.

Requirements to disclose AI

Various Canadian courts (including those in Manitoba, the Yukon and Nova Scotia, as well as the Federal Court) require written disclosure if AI is used in court filings. Unlike Ontario's Rules of Civil Procedure, which require certification by all parties relying on authorities, "disclosure" jurisdictions require action only from parties who use AI to prepare court materials. The particular disclosure requirements vary depending on the court:

  • The Court of King's Bench of Manitoba issued a Practice Direction on June 23, 2023 [PDF] requiring that "when artificial intelligence has been used in the preparation of materials filed with the court, the materials must indicate how artificial intelligence was used". The Manitoba Provincial Court and Court of Appeal remain silent on the issue.
  • The Supreme Court of Yukon issued a more specific Practice Direction on June 26, 2023 [PDF] requiring litigants who employ artificial intelligence for legal research or submissions in any matter or form before the court to disclose the tool used, and the purpose for which it was used. The Yukon's Territorial Court and Court of Appeal have not issued similar directives.
  • Nova Scotia's Provincial Court [PDF] and Registrar in Bankruptcy [PDF] issued directives requiring disclosure when AI is used, including how the AI tool was used; the Provincial Court also requires litigants to specify which AI tool was used.

The Federal Court's Notice to the Parties and the Profession on May 7, 2024 [PDF] formalizes the Court's expectation that submissions containing "content created or generated" by AI must contain "a Declaration in the first paragraph stating that AI was used in preparing the document, either in its entirety or only for specifically identified paragraphs". In the notice, the Court also issued the following additional guidance:

  • The declaration is required when AI generates the content itself; it is not required where a human authors the content, an AI tool suggests revisions, and a human considers and implements those revisions.
  • Lawyers joining an ongoing matter must use best efforts to ascertain whether content in materials drafted by the party's previous representative was AI-generated. If so, a declaration must be provided in respect of those materials.
  • The notice also confirmed that paragraph 3(i) of the Code of Conduct for Expert Witnesses applies to disclosing when AI was used in the expert's methodology.

Encouraging the responsible use of AI in court

Certain courts in Newfoundland and Labrador [PDF], Quebec (including the Court of Appeal [PDF], Superior Court [PDF] and Court of Québec [PDF]), Alberta, and Nova Scotia (including the Supreme Court [PDF] and Court of Appeal [PDF]) have issued guidance encouraging the principled use of AI. Unlike the jurisdictions discussed above, these courts do not require certification or disclosure. Rather, they recommend the following practices:

  • Exercise caution when referencing authorities derived from AI in submissions.
  • Rely exclusively on authoritative sources, such as court websites, CanLII, and commercial publishers.
  • Verify AI-generated content using human control (i.e., maintaining a "human in the loop").

Evaluating the need for rules and regulations on AI in court filings

There is controversy in the profession about whether AI-specific rules and regulations are required. Arguably, existing obligations and general duties to the court should be sufficient. For example, the British Columbia Court of Appeal has not issued specific guidance regarding AI but, at section 7.3 of its March 12, 2024 Filing Directive [PDF], reminds litigants of their existing obligations with respect to the authenticity and accuracy of all materials filed with the Court.

Moreover, it is unclear whether litigants are actually complying with AI practice requirements. In February 2025, the Canadian Bar Association reported that the Federal Court had received only three to four AI disclosures out of almost 28,000 legal filings in 2024.

Sanctioning irresponsible use of AI in Canadian courts

Notwithstanding the above-noted initiatives, Canadian courts have recently had to deal with the fallout of the improper use of AI technology. For example, in three recent reported decisions, courts have sanctioned parties for relying on inaccurate or non-existent legal authorities:

In Zhang v. Chen, 2024 BCSC 285, the applicant's counsel cited two non-existent authorities. She ultimately admitted that the citations had come from ChatGPT and that she had not verified them.[1] The British Columbia Supreme Court considered imposing "special costs" against the lawyer, a sanction typically reserved for "reprehensible conduct or an abuse of process".[2] While the Court ultimately declined to award special costs, the lawyer was held personally liable for all or part of the costs awarded to the opposing party.

Special costs were imposed in Hussein v. Canada (Immigration, Refugees and Citizenship), 2025 FC 1060. In Hussein, the applicant's counsel submitted several cases that either did not exist or were inaccurately cited for specific propositions of law.[3] The lawyer admitted to relying on artificial intelligence and failing to verify the sources independently; however, this admission came only after he had produced incomplete books of authorities, on two separate occasions, in response to a Court direction.[4] The Court found that counsel's concealment of his reliance on artificial intelligence amounted to misleading the Court and consequently ordered special costs.[5]

In Ko v. Li, 2025 ONSC 2766, the Ontario Superior Court of Justice ordered a lawyer who had cited non-existent cases in written submissions to attend a contempt of court hearing. Ultimately, the lawyer's admission, apology and corrective steps, including attending professional development programs specific to the risks of AI in legal practice,[6] were found to have adequately purged any possible contempt.

To date, there have been no reported Canadian cases involving hallucinations in an expert report. However, the United States District Court for the District of Minnesota recently addressed this scenario in Kohls v. Ellison.[7] In Kohls, the Court excluded an expert report and denied leave to file an amended report after learning that the report contained hallucinated sources. Ironically, the expert report in question opined on the dangers of AI and deepfakes to democracy.

Implications

Canadian courts are responding to the use of AI in court filings in a range of ways. These responses indicate that litigants must balance the use of AI with their duties to the court; the failure to do so can result in costly, and embarrassing, consequences. In an effort to curb the improper use of AI, Canadian courts have demonstrated a willingness to sanction parties who fail to verify the authenticity of the authorities they cite to the court.

While the use of AI in the legal profession has exciting potential, it is imperative that its use be subject to appropriate safeguards so as to ensure that it does not undermine the administration of justice in Canada.

Footnotes

1. 2024 BCSC 285 at para. 12.

2. 2024 BCSC 285 at para. 26.

3. 2025 FC 1060 at para. 39.

4. 2025 FC 1060 at paras. 35-38.

5. 2025 FC 1060 at paras. 41-43.

6. 2025 ONSC 2766 at para. 24.

7. 2025 WL 66514.
