‘Severe sanctions’ for lawyers who fail to verify AI research, with a warning for managers

A High Court ruling related to the use of AI in legal proceedings has warned there will be ‘severe sanctions’ for lawyers who fail to comply with their professional obligations when using the technology.

Managing partners and heads of chambers must demonstrate that ‘practical and effective measures’ have been taken to ensure every person providing legal services complies with legal and professional obligations, and should ‘expect the court to inquire whether those leadership responsibilities have been fulfilled’.

The comments were made in a judgment related to two cases in which the use of AI had resulted in the inclusion of non-existent citations and quotations.

In one case, the claimant’s barrister – Sarah Forey – had cited five non-existent cases and misstated the effect of a section of the Housing Act, in submissions featuring American spellings and ‘formulaic’ prose.

Although the court considered the threshold for initiating contempt proceedings against Ms Forey had been met, it declined to do so, in part because of the ‘potential failings on the part of those who had responsibility for training Ms Forey’ and for supervising an ‘extremely junior lawyer operating outside her level of competence’.

In the second case, the claimant had carried out his own research using AI and passed it to his solicitor, Abid Hussain of Primus Solicitors. Mr Hussain failed to notice that 18 of the 45 citations did not exist, and that others either did not contain the quotations claimed or had no relevance to the subject matter.

The judgment noted the ‘lamentable failure to comply with the basic requirement to check the accuracy of material that is put before the court’, but did not consider that the threshold for initiating contempt proceedings had been met.

Both Ms Forey and Mr Hussain were referred to their regulatory bodies.

Although contempt proceedings were not initiated in either case, Dame Victoria Sharp stressed in the written judgment that this should not be treated as a precedent. Lawyers who fail to comply with professional obligations when using AI, she said, ‘risk severe sanction’.

As well as the initiation of contempt proceedings, sanctions available to the court include public admonition of the lawyer, the imposition of a costs order, the imposition of a wasted costs order, striking out a case, referral to a regulator and – ‘in the most egregious cases’ – referral to the police.

The judgment also highlighted the wider implications of the cases for the legal profession. Freely available generative AI tools such as ChatGPT ‘are not capable of conducting reliable legal research’, Dame Victoria Sharp said, and those who use AI ‘have a professional duty therefore to check the accuracy of such research by reference to authoritative sources, before using it in the course of their professional work’.

“This duty rests on lawyers who use artificial intelligence to conduct research themselves or rely on the work of others who have done so. This is no different from the responsibility of a lawyer who relies on the work of a trainee solicitor or a pupil barrister for example, or on information obtained from an internet search.”

Ian Jeffery, CEO of the Law Society of England and Wales, said the judgment ‘lays bare the dangers of using AI in legal work’.

“We need to ensure the risks are properly addressed. The Law Society of England and Wales has already provided guidance and resources to the legal profession. We will continue to develop and expand them as part of our wider programme of supporting members with technology adoption.

“Artificial intelligence tools are increasingly used to support legal service delivery. However, the real risk of incorrect outputs produced by generative AI requires lawyers to check, review and ensure the accuracy of their work.

“Whether generative AI, online search or other tools are used, lawyers are ultimately responsible for the legal advice they provide. The High Court judgment in this case reinforces that responsibility, grounding it in established rules of professional conduct and setting out the consequences of breach.

“Public trust is of paramount importance for upholding the rule of law. Our Law Society AI Strategy reinforces the need to ensure that AI is used in a responsible way while we support the sector through technological change. The legal profession has a key role to play in maintaining confidence in our legal system.”
