Warning that AI doesn’t replace legal duty
A High Court ruling in Britain has laid bare the dangers of using AI in legal work, Law Society of England and Wales chief executive Ian Jeffery has said.
Commenting on the ruling, which concerned lawyers' use of AI, he said the very real risk of incorrect outputs produced by generative AI means lawyers must check, review and ensure the accuracy of their work.
Real risks
“We need to ensure the risks are properly addressed,” he commented.
“The Law Society of England and Wales has already provided guidance and resources to the legal profession. We will continue to develop and expand them as part of our wider programme of supporting members with technology adoption,” Jeffery said.
Whether generative AI, online search or other tools are used, lawyers are ultimately responsible for the legal advice they provide, the chief executive added.
“The High Court judgment in this case reinforces that responsibility, grounding it in established rules of professional conduct and setting out the consequences of breach.
"Public trust is of paramount importance for upholding the rule of law. Our Law Society AI strategy reinforces the need to ensure that AI is used in a responsible way while we support the sector through technological change.
“The legal profession has a key role to play in maintaining confidence in our legal system," he concluded.
Meanwhile, Britain's Civil Justice Council has created a working group to examine the use of artificial intelligence in preparing court documents and consider amendments to procedure rules.
Judges in England and Wales now have access to large language model (LLM) artificial intelligence software on their own personal computers.
Revised guidance to the judiciary now covers terms including ‘hallucination’ and ‘AI agent’, and offers tips for spotting submissions produced by AI.
Failing to check work
A recent panel discussion in London heard that the emerging problem of fake citations was a consequence of lawyers failing to check their own work.
"You should be taking personal responsibility for what goes in your name, and that applies whether you’re a judge or you’re a lawyer," said Court of Appeal judge Lord Justice Birss.
"Looked at that way round, lawyers producing documents with hallucinated case references [would not be] a problem. You shouldn’t be putting anything to a court that you’re not prepared to put your name to."
He said that AI should never be used to summarise a document that hasn't been read.
“You can use AI [to summarise], but you can only use it if you have read the document. What you can’t do is not read the document, and then get AI to summarise it. That’s crazy,” the England and Wales Gazette reports.
Gazette Desk
Gazette.ie is the daily legal news site of the Law Society of Ireland