Family lawyer Chong Ke has been ordered to pay court costs to opposing counsel for what B.C. Supreme Court Justice David Masuhara called an “alarming” use of the artificial intelligence program ChatGPT.
Last year, Ke was representing a father who was disputing custody and child support with his ex-wife, West Vancouver resident Nina Zhang, who was represented by family lawyer Lorne MacLean, K.C.
In the course of the litigation, Ke submitted two fake, AI-generated cases to MacLean and his legal team in support of her arguments. Although Ke subsequently withdrew the submissions and the cases never entered into evidence in court, MacLean had already spent time determining they were fake.
Zhang subsequently applied for special costs to be awarded against Ke for the error.
“As this case has unfortunately made clear, generative AI is still no substitute for the professional expertise that the justice system requires of lawyers. Competence in the selection and use of any technology tools, including those powered by AI, is critical. The integrity of the justice system requires no less,” wrote Masuhara in a Feb. 20 decision.
However, Masuhara ultimately ruled that Ke only be on the hook for four half-days of court costs, stopping short of awarding special costs, which he said would be an “extraordinary step” against a lawyer.
Special costs against a lawyer “requires a finding of reprehensible conduct or an abuse of process by the lawyer. The authorities make it clear that special costs against a lawyer are appropriate only where there has been ‘a serious abuse of the judicial system... or dishonest or malicious misconduct… that is deliberate,’” explained Masuhara.
Ke asserted in court that the insertion of the fake cases was an “honest mistake based on a lack of knowledge of the risks of ChatGPT,” noted Masuhara.
“I am remorseful about my conduct,” Ke told the court.
While MacLean questioned in court how someone like Ke, with a PhD in law from the University of Victoria, could make such a mistake, he stopped short of calling her dishonest when asked by Masuhara, the ruling states. MacLean also noted lawyers have been given guidance by the Law Society of British Columbia on how to properly use ChatGPT, should they choose to do so.
Masuhara noted Ke was “not aware of the various notices from the society.”
And Ke stated she was unaware ChatGPT could generate fake cases.
Masuhara also noted that the damage was limited, as the fake cases were caught by MacLean before they entered the court record.
Had they not been caught, Masuhara suggested, the consequences could have been much more serious: “Citing fake cases in court filings and other materials handed up to the court is an abuse of process and is tantamount to making a false statement to the court. Unchecked, it can lead to a miscarriage of justice.”
Furthermore, Ke has been subject to “significant negative publicity” over the submissions to MacLean, which has caused her anxiety and stress.
Ke was represented by lawyer John Forstrom, who submitted that the matter is one for the Law Society of British Columbia, which has acknowledged an ongoing investigation into Ke’s use of ChatGPT.
“Mr. Forstrom was critical of the public ‘whistle blower’ campaign of Mr. MacLean, K.C. to ‘expose and vilify’ Ms. Ke’s mistake and argued that the steps taken to investigate the existence of the cases were not necessary to dispose of the key issue; namely, a father’s application to have parenting time with his children in China,” noted Masuhara.
In addition to ordering court costs, Masuhara directed Ke to review all of her files that are before the court.
And “it would be prudent for Ms. Ke to advise the court and the opposing parties when any materials she submits to the court include content generated by AI tools such as ChatGPT,” the judge concluded.