ChatGPT and the Law
ChatGPT has the potential to revolutionise our industry. Even for students, it has become a valuable tool for assisting with assignments or summarising readings that would otherwise take hours to get through. But is using this software effectively replacing our own work, or are there significant holes in the information we so heavily depend on that we don’t realise are there?
ChatGPT has developed so rapidly in the last two years that not only is it able to summarise large academic documents generally, but now a new specialised version has emerged: Law ChatGPT. Claiming to generate high-quality legal text output that is both accurate and natural-sounding, the platform allows you to produce legal documents within seconds and even allows the user to select the country of their jurisdiction to ensure the content produced is suitable.
The tool recently passed four different law exams at the University of Minnesota, performing at a C+ average, which, while not perfect, is a passing mark that outranks many students, as reported by the Law Society Journal. CNN journalist Samantha Kelly reported that a law professor at the university, Mr Jon Choi, said ‘ChatGPT struggled with the most classic components of law school exams, such as spotting potential legal issues and deep analysis applying legal rules to the facts of a case [but] could be very helpful at producing a first draft that a student could then refine’.
While ChatGPT can now draft documents and interpret legal information, it cannot completely replace lawyers at this stage. Dentons associate Santiago Paz argued that the software can only be considered reliable as a starting point, given significant limitations in accuracy, application, and confidentiality. Garling Wu further pointed to the software’s lack of internet connectivity and its training data cut-off in 2021, which can result in out-of-date legal advice. Cross-checking by practitioners is therefore still required, as responses are ‘too ridden with errors including fake case references to be used directly in legal advice’, as Sam McKeith put it. Additionally, because the software learns from the answers it provides, difficulties arise in more nuanced situations where there is a lack of precedent.
The most prominent concern regarding the use of ChatGPT in the legal field is the set of ethical and confidentiality issues arising from the software’s storage of information. OpenAI states that ‘when you use our services, we may collect personal information that is included in the input, file uploads, or feedback that you provide to our services’, and in situations where high levels of confidentiality are required, these terms limit practitioners’ ability to use the tool.
Removing the human element from the law is not feasible at this stage, so our jobs are safe for now, but how the software will develop in the future is unknown. For the near future, ChatGPT responses require human revision and are useful only as a starting point.