The Gen AI PN provides (at clause 23) that “…Gen AI must not be used to draft or prepare the content of an expert report (or any part of an expert report) without prior leave of the Court”.
The primary position is that an expert witness provides evidence based on their own knowledge, skill and experience. Further, as reflected in the principles stressed by the New South Wales Court of Appeal in Makita v Sprowles, the expert must disclose their reasoning process. These basic principles of expert evidence underlie the Court’s view that a cautious approach should be adopted to the use of Gen AI by experts.
If an expert wishes to use Gen AI in the preparation of an expert report, then an application for leave from the Court must be made. An application for leave must identify:
- the proposed use of Gen AI in the report preparation;
- which Gen AI program is intended to be used, including whether it is closed-source or open-source and what privacy and confidentiality settings it contains; and
- any documents which will be submitted to the Gen AI program for the purposes of generating any part of the expert report.
Where leave is obtained to use Gen AI (the exact circumstances in which leave will be granted remain unclear), the expert witness will also be obliged to disclose the use of Gen AI in the expert report itself, to keep records of how Gen AI was used in preparing the report (e.g., the prompts used and any variables applied), and to annex those records to the report.
While the Gen AI PN does not come into effect until 3 February 2025, an expert report prepared between 21 November 2024 (the date the PN was released) and that effective date must identify which parts of the report, if any, used or relied on Gen AI in their preparation.
The Gen AI PN is supported by a set of guidelines for New South Wales judicial officers on the use of Gen AI, which similarly adopt the position that judges in NSW should not use Gen AI in the preparation of judgments or the analysis of evidence.
A briefing was held on 2 December 2024 to provide the legal profession and wider public with further information on the Gen AI PN. The briefing was led by Chief Justice Bell, along with Justice Garling (as the head of the Court’s Technology Committee) and the Director & Prothonotary, Rebel Kenna.
Chief Justice Bell acknowledged the many benefits of Gen AI but also stressed that, “…at this stage of its maturity, it [also] carries risks and limitations.” Examples of the risks the Court sees associated with the use of Gen AI are noted in clause 7 of the Gen AI PN.
The briefing emphasised that the legal profession, and those who provide the Court with evidence and opinions, are ‘critical thinkers’ with overarching duties to the Court. Those duties of critical thinking, and the accompanying ethical obligations, require an independent mind and a moral commitment to the Court which cannot be delegated to a computer.
The Gen AI PN is ‘nuanced’ in that it does not seek to proscribe the use of Gen AI for legal research. It does, however, adopt a conservative approach: it applies to the use of Gen AI to generate the content of primary witness evidence and documentary evidence, to its use by experts, and to any material presented in written submissions and summaries of argument.
The approach of the Court in the Gen AI PN—including the requirement that Gen AI not be used in expert reports, affidavits, witness statements and similar—differs from the approach adopted in other Australian courts and by State Law Societies and the New South Wales Bar Association. These bodies have provided cautionary guidelines for the use of Gen AI which highlight the limitations of its use but have generally stopped short of restricting its application in the manner above.
The impact of Gen AI on the legal profession and court systems has already been seen. For example, in the US matter of Mata v. Avianca, Inc., No. 1:2022cv01461 – Document 54 (S.D.N.Y. 2023), the use of Gen AI produced false case citations in court filings. In DPP v Khan [2024] ACTSC 19, Mossop J indicated that AI-assisted testimony or character references should carry little probative value unless the use of AI is fully disclosed. The privacy concerns raised by Gen AI were illustrated in Victoria, where the Department of Families, Fairness and Housing banned its use after the name of an at-risk child (and other information) was uploaded into ChatGPT for a report submitted to the Children’s Court.
It must also be acknowledged that Gen AI technology has already become embedded in many aspects of professional life. In ExpertsDirect’s recent ‘Expert Witness Survey Report’, 68% of surveyed lawyers believed that experts may use AI to assist in writing their reports, and some believed this was already occurring. In contrast, 60% of experts said they would not use AI in preparing a report. Now, with the Court’s adoption of the Gen AI PN, experts must not use Gen AI unless leave has been granted (effectively before the preparation and presentation of any such report).
Further information on the practical application of the Gen AI PN will become available in the coming weeks and months as the Supreme Court Rules are amended to reflect the content of the Gen AI PN.
The Court acknowledges that this is an area of rapid technological advancement, and the application of the Gen AI PN will be reviewed periodically in light of growing experience and further technological developments.
Practice Note Gen 23 – Use of Generative Artificial Intelligence – link
Makita (Australia) Pty Ltd v Sprowles [2001] NSWCA 305 – link
Mata v. Avianca, Inc., No. 1:2022cv01461 – Document 54 (S.D.N.Y. 2023) – link
Victorian Department of Families, Fairness and Housing Gen AI ban – link
DPP v Khan [2024] ACTSC 19 – link
ExpertsDirect, Expert Witness Survey Report – link