A lawyer in Melbourne, Australia, has been referred by a federal circuit judge to the Office of the Victorian Legal Services Board and Commissioner after filing a summary of authorities that do not exist, which he prepared using the Gen AI assistant Leap LawY.
The case, which took place over the summer but which we first became aware of thanks to coverage by The Guardian newspaper on 10 October, was heard by Justice Amanda Humphreys. Subsequent research confirms that yet another lawyer has fallen foul of Gen AI hallucination, filing case law without verifying the accuracy of the information.
Justice Humphreys adjourned divorce proceedings in July after the fake citations came to light, ordering the anonymous solicitor ‘Mr B’ of ‘D Law Firm’ to email the court with reasons why he ought not be referred to the regulator for his conduct.
In an order dated 27 August, Justice Humphreys ruled that Mr B (whom she ordered remain anonymous) must be investigated, after he admitted to filing the fictitious output generated by Leap LawY without reviewing it.
In his defence, Mr B said that he didn’t fully understand how the research tool worked and offered to pay the costs of the adjourned July hearing. He also gave personal details of the stress that the July hearing had caused him.
Justice Humphreys referred to the US case of Mata v Avianca, in which Steven Schwartz famously filed case law that didn’t exist after using ChatGPT. She also referred to the guidelines issued by the Supreme Court of Victoria and the County Court of Victoria, which say that parties using AI tools in the course of litigation need to understand how they work; that the use of AI programs must not mislead other parties and should be disclosed; and that the use of AI remains subject to normal legal practitioner obligations, including candour to the court.
“Importantly in the context of this matter, the guidelines issued by the Supreme Court and County Court of Victoria explain that generative AI and large language models create output that is not the product of reasoning and nor are they a legal research tool,” the judge said. “Generative AI does not relieve the responsible legal practitioner of the need to exercise judgment and professional skill in reviewing the final product to be provided to the court.”
The judge also noted that solicitors have a duty not to engage in conduct which is likely to diminish public confidence in the administration of justice or bring the legal profession into disrepute.
While the judge accepted Mr B’s apology and the steps he had taken to mitigate the impact of his conduct, and acknowledged the stress he had experienced and the fact that he was unlikely to make the same mistake again, she considered that the regulator was the appropriate body to determine whether further action was required.
“I also consider it is in the public interest for the Victorian Legal Services Board and Commissioner to be aware of the professional conduct issues arising in this matter, given the increasing use of AI tools by legal practitioners in litigation more generally,” she said.
Leap partnered with LawY earlier this year to provide its customers with a generative AI tool to help them with legal research and drafting. Leap LawY offers a second layer of human verification by local lawyers, and a spokesperson for Leap told Legal IT Insider that Mr B did use this verification step: he submitted the cases for verification at 9pm his local time, and a reply with the corrected cases was sent within four hours, but those corrections were not used, the spokesperson told us.
Leap said in a statement: “LawY is a market-leading, powerful AI research tool, not legal advice. LawY sits outside of the lawyer-client relationship because that should rightly stay between lawyers and their clients.
“Whether a lawyer is asking a colleague or using a tool like LawY in their work, due diligence remains a key part of their ethical obligations. That’s why LawY provides users with a free verification process that is underpinned by experienced, local lawyers. This unique feature of LawY is the best way to ensure AI is used to provide accurate answers.
“Despite the legal professional using LawY’s verification process, which sent the user the correct information just four hours after requesting it and well before appearing in court, the user unfortunately did not utilise this correct information in court. This example provides a timely reminder that AI is a powerful tool but must be used appropriately by users to add value to legal practice.”