1 February 2025
An Australian lawyer has found himself at the center of a controversy after using ChatGPT to write court filings in an immigration case. The artificial intelligence platform generated case citations that did not exist, leading to concerns about the accuracy and integrity of the legal process.
Justice Rania Skaros referred the lawyer to the Office of the NSW Legal Services Commissioner (OLSC) after the court, hearing an appeal of an administrative appeals tribunal ruling, found that he had filed amended applications and outlines of submissions containing nonexistent case citations and purported quotes from the tribunal’s decision. The lawyer initially claimed the errors were unintentional but later admitted to using ChatGPT to write the documents.
Skaros expressed concern about the lawyer’s conduct and his failure to verify the accuracy of what had been filed with the court, noting that she and her associates had spent significant time checking the citations and attempting to locate the purported authorities. According to an affidavit provided to the court, the lawyer had turned to ChatGPT because of time constraints and health issues.
The lawyer stated that he had accessed the platform, entered some words, and received a summary of cases in return. Because the summary read well, he incorporated the authorities and references into his submissions without checking the details. The incident has sparked debate about the use of generative AI in legal proceedings.
Counsel for the immigration minister argued that the lawyer had failed to exercise adequate care and that the conduct should be referred to the OLSC to deter the misuse of AI in future cases. Skaros agreed that the use of generative AI is a live and evolving issue and said it was in the public interest for the OLSC to be made aware of such conduct.
This is not the first time a lawyer has been referred to a regulatory body over the use of AI; last year a Melbourne lawyer was referred to the Victorian legal complaints body after admitting that AI software he had used in a family court case generated false case citations. The NSW supreme court has issued a practice note limiting the use of generative AI by NSW lawyers, stipulating that it must not be used to generate affidavits, witness statements, character references, or other material tendered in evidence or used in cross-examination.
The note comes into effect on Monday and aims to ensure that lawyers use AI responsibly. As generative AI becomes more widespread across industries, including law, it is essential that professionals understand its limitations and risks. The incident highlights the need for lawyers to exercise caution when using AI tools and to verify the accuracy of any information they generate.
The Australian government has also expressed concern about the misuse of AI in legal proceedings, emphasizing the public interest in preventing such conduct. As the legal sector relies increasingly on technology, professionals must adhere to strict guidelines and regulations to preserve the integrity of the justice system.