Australia – NSW Supreme Court Practice Note on AI & Litigation

Change is on the way for litigators when it comes to the use of Gen AI.

The Supreme Court of NSW practice note comes into effect on 3 February 2025.

It contains several prohibitions regarding the use of Gen AI in litigation.

How will this impact your practice? Will it lead to more efficiency or create additional layers of complexity?

Also read

NSW Chief Justice Issues Guidance on Use of Generative Artificial Intelligence in Court

By Paul Gregoire and Ugur Nedim

Generative artificial intelligence, or Gen AI, is a branch of the broader field of artificial intelligence, a computer science geared to machine learning and problem-solving. Gen AI focuses on the creation of new content, which can include visual images, audio or text, and a number of well-known platforms can be used to produce such material, with ChatGPT being the most widely recognised.

AI is a technology in its early stages, which triggers concerns about attempts to apply it in official settings where it cannot adequately replicate the work of a human. And one sector where false information produced by Gen AI could have serious consequences is the courts.

A case that came before Federal Circuit and Family Court Judge Amanda Humphreys in July 2024 was demonstrative of this, as the proceedings involved a family law lawyer producing a document for the court that listed authorities, or prior court cases, supporting the position he would argue, which her Honour then found, on closer perusal, to be a list of past cases that didn’t exist.

And whilst this instance of a legal practitioner applying AI to generate details in a hurried manner might not have caused any substantial issues later in the proceedings, it’s the suspect AI-generated materials that don’t get called out that might have more detrimental impacts at a later point.

So, in an effort to prevent the misapplication of Gen AI in the NSW court system, NSW Chief Justice Andrew Bell released Supreme Court Practice Note SC Gen 23 – Use of Generative Artificial Intelligence on 21 November 2024. This document sets out the limits on how Gen AI can be applied within this state’s court system, and it was accompanied by general AI guidance for NSW judges.

A note on practice

“Generative AI is a form of artificial intelligence that is capable of creating new content, including text, images or sounds, based on patterns and data acquired from a body of training material,” begins the practice note. “That training material may include information obtained from ‘scraping’ publicly and privately available text sources to produce large language models.”

“Gen AI is capable of being used to assist legal practitioners and unrepresented parties with various tasks, including drafting documents and summarising information,” the guidance continues. “This practice note is directed to the circumstances where such use is acceptable.”

The guidance outlines that Gen AI does not include applications that correct spelling or grammar, assist with transcription or generate chronologies from preexisting source documents, while it further makes clear that the note has no bearing on the use of internet search engines, like Google, and no implications for “dedicated legal research software”.

A key warning in respect of legal issues that could arise from the use of this cutting-edge technology is that data fed into “chatbots” can be used to train the large language model underlying the program and, therefore, confidential information may become available to others.

So, there is a general prohibition regarding Gen AI and the NSW courts, which bans entering into a Gen AI program any information that is the subject of a non-publication or suppression order, any information produced specifically for the courts unless there is approval to use it, any material produced on subpoena, and anything that is the subject of a statutory prohibition.

The potholes involved in Gen AI

The document warns legal practitioners and unrepresented parties about the “limits, risks and shortcomings” involved in Gen AI that they ought to be aware of.

These include “hallucinations”, the generation of “apparently plausible coherent responses” that are actually “inaccurate or fictitious”; as examples, the note points to false citations and fabricated legislative or case references.

Further, Gen AI relies on preexisting datasets, which can be outdated, incomplete or riddled with misinformation, which means these programs can propagate falsehoods already circulating in the public sphere. And the nature or scope of the datasets underlying Gen AI content can lead to “biased or inaccurate output”.

Another issue is that, unless the user prevents it, the searches and content generated via artificial intelligence go into the greater pool of data that can be used to produce more computer-generated content.

The NSW Chief Justice’s note further warns that there is a “lack of adequate safeguards to preserve the confidentiality, privacy or legal professional privilege that may attach to information or otherwise sensitive material submitted to a public Gen AI chatbot”. And further, there are no guarantees that information utilised in generating content is not the subject of copyright.

Limits and allowances

Chief Justice Bell then notes other limits to how Gen AI can be applied to NSW court work. These involve a prohibition on generating “affidavits, witness statements, character references or other material” that is supposed to reflect witness opinion, as these opinions should reflect a witness’ own knowledge and not that generated by a machine.

AI should not be used to alter, embellish, strengthen or dilute witness statements. So, as of 3 February 2025, when the practice note takes effect, all affidavits, witness statements or character references “must contain a disclosure that Gen AI was not used in generating” them.

Under exceptional circumstances, however, Gen AI can be used to produce affidavits, witness statements or character references, and for this to be done, reasons for its use must be detailed, the Gen AI program and version used to produce the document must be specified, and it must be disclosed whether open-source or closed-source materials have been involved and whether any of the content is confidential.

In terms of written submissions, summaries or skeletons of arguments produced by Gen AI, lawyers must verify all citations, legal or academic authorities, case law and legislative references to ensure that they exist, are accurate and are relevant to the proceedings, and this verification must be done by a person.

AI use doesn’t absolve lawyers from being held accountable for any negative outcomes produced.

Expert reports, however, cannot be generated by AI, unless a lawyer seeks leave to do this and stipulates the benefits that would be gained by doing so.

If Gen AI is used in the production of an expert opinion, that use should be specifically explained to the court and any references within it should be verified. And if a report relates to a professional negligence claim, any use of Gen AI must be raised at the first directions hearing on the matter.

“Legal practitioners and unrepresented parties must draw the requirements of this practice note to the attention of experts when instructing them,” the NSW Chief Justice adds.

A note to those on the bench

Chief Justice Bell then issued a further document titled Guidelines for New South Wales Judges in Respect of Use of Generative AI, which was released on the same day as the lengthier NSW Supreme Court practice note and was produced with input from heads of jurisdiction, or the most senior judicial officers from each of the courts.

The guideline is clear that NSW judges cannot use Gen AI to produce their reasons for a judgement or to assess evidence in order to produce their reasons, nor can artificial intelligence be utilised in editing or proofing judgements or any part of a judgement.

If judges seek to use this technology for secondary sources, they ought to familiarise themselves with the pitfalls of Gen AI use, while anything produced as a result of AI use should be verified afterwards, regardless of how polished it appears. And judges should require their assistants to disclose any AI use, and they can quiz litigants and lawyers over any use.

AI red flags that judges ought to be aware of include inaccurate or false case citations, dodgy assessments, case references that aren’t suitable for the jurisdiction involved, out-of-date references, submissions that diverge from general understandings, repetitive language and the use of terms more closely associated with other jurisdictions.

“Due to the rapidly evolving nature of Gen AI technology, these guidelines will be reviewed on a regular basis,” NSW Chief Justice Andrew Bell concludes.

https://nswcourts.com.au/articles/nsw-chief-justice-issues-guidance-on-use-of-generative-artificial-intelligence-in-court/

 

 

Also read

13 December 2024

Generative AI and a new era of legal technology governance in the Supreme Court of NSW

Piper Alderman

Authors: Robert Riddell, Pouyan Aski

On 21 November 2024, the Chief Justice of New South Wales issued Practice Note SC Gen 23, which provides comprehensive guidelines for the use of Generative Artificial Intelligence (Gen AI) in legal proceedings. This Practice Note will take effect on 3 February 2025.

Alongside the Practice Note, amendments to the Uniform Civil Procedure Rules will also come into force.  These measures are aimed at ensuring the responsible use of Gen AI in the legal domain while addressing potential risks and ethical considerations.

Generative AI: Benefits and Challenges in the Legal Sector

Generative AI, leveraging large language models, can generate a wide range of content including text, images, and sounds.  Prominent platforms like ChatGPT, Google Bard, and Llama, alongside legal-specific tools such as Westlaw Precision and Lexis Advance AI, have demonstrated their utility in drafting documents, summarising information, and analysing extensive datasets.  However, this advancement introduces challenges:

  1. Accuracy Issues: AI tools may generate “hallucinations,” producing outputs that seem credible but are factually inaccurate or entirely fabricated.
  2. Data Bias: The quality and scope of training datasets may lead to biased or incomplete outputs, impacting the relevance and applicability of AI-generated content.
  3. Confidentiality Concerns: Public Gen AI tools often lack adequate safeguards, raising the risk of sensitive data exposure or misuse.
  4. Ethical and Legal Risks: The use of copyrighted material in AI training datasets could result in unintended breaches of intellectual property laws.

 

Key Provisions in Practice Note SC Gen 23

The Practice Note addresses multiple facets of Gen AI’s integration into legal processes, providing specific guidance and restrictions:

General Prohibition

Restricted Information: A prohibition upon entering information subject to non-publication or suppression orders, or other sensitive material, into any Gen AI program.

Use in Affidavits and Witness Statements

Authenticity: A prohibition upon using Gen AI to generate content for affidavits, witness statements, character references, or other evidentiary materials.  The intention is that such documents reflect the genuine knowledge and opinions of the individuals involved.

Disclosure: Any affidavit, witness statement, or character reference must include a disclosure statement that Gen AI was not used in its preparation.  In exceptional cases, leave may be sought to use Gen AI for specific purposes, subject to strict conditions and disclosures.

Expert Reports

Restrictions: Gen AI must not be used to draft or prepare any part of an expert report without prior court approval. Such leave applications must include detailed information about the proposed use, the specific Gen AI program, and the expected benefits.

Transparency: If Gen AI is used in preparing an expert report, the expert must disclose its use and keep detailed records of how the tool was employed.

Written Submissions and Summaries

Verification: When Gen AI is used to prepare written submissions or summaries of arguments, the author must verify all citations, legal authorities, and references for accuracy and relevance.  This verification must be conducted manually, without relying on Gen AI.

Professional Obligations: The use of Gen AI does not absolve legal practitioners of their professional and ethical responsibilities to the Court and the administration of justice.

Scope Limitation

Exclusions: The Practice Note does not apply to tools that merely correct spelling or grammar, provide transcription, or assist with formatting.  It also excludes traditional search engines and dedicated legal research software that do not generate substantive content.

Judicial Guidelines

In addition to the Practice Note, guidelines have been issued for judges regarding the use of Gen AI.  These guidelines caution against using Gen AI for formulating judgments or analysing evidence.  Judges are encouraged to familiarise themselves with the limitations of Gen AI and to ensure any AI-generated research is thoroughly verified for accuracy and relevance.

Moving Forward

The introduction of Practice Note SC Gen 23 and the accompanying judicial guidelines represents a significant step towards integrating Gen AI into legal practice responsibly.  These measures aim to harness the benefits of Gen AI while mitigating potential risks, ensuring the integrity of legal proceedings and the administration of justice.

https://piperalderman.com.au/insight/generative-ai-and-a-new-era-of-legal-technology-governance-in-the-supreme-court-of-nsw/