
AI is a tool, not a substitute for legal expertise – MHC

01 Dec 2025

AI is a tool, not a substitute for expertise, MHC lawyers have said in a note on the firm’s website.

It points to the Workplace Relations Commission's (WRC) publication of guidance on the use of artificial intelligence (AI) tools in preparing written submissions and documents for employment and equality-law cases.

The move follows a highly publicised case involving Ryanair and a former member of its cabin crew, in which an adjudication officer sharply criticised the complainant’s reliance on AI-generated legal arguments.

The case, Fernando Oliveira v Ryanair DAC, drew attention when it emerged that the complainant had used an AI drafting tool to prepare his written submissions.

Non-existent cases

According to the adjudication officer, this resulted in the inclusion of citations that were “not relevant, misquoted and in many instances, non-existent”.

At least two of the decisions cited were found to be AI “hallucinations”, referring to cases that did not exist in reported WRC decisions.

In response, the WRC has published detailed guidance aimed at helping parties understand the risks of using AI systems for legal drafting.

Though acknowledging that AI tools may assist in structuring submissions or producing early drafts, the WRC warns that such tools should never be treated as a substitute for legal advice or specialist knowledge of Irish employment law.

Appearance of 'polish'

According to the guidance, most general-purpose AI tools are not trained on Irish employment or equality legislation and lack familiarity with WRC procedures.

As a result, they may produce content that appears polished and confident but does not reflect legal reality. The guidance emphasises that legally inaccurate or misleading information remains the responsibility of the party submitting it – even where that information originated from an AI system.

Another significant concern is data protection and confidentiality. Many popular AI systems store or process user inputs to improve their models, meaning that sensitive personal data or commercially confidential information could be inadvertently disclosed.

The WRC cautions users to avoid inputting such material into public AI platforms.

The commission also stresses that inaccurate citations or irrelevant arguments may cause wasted time, delays, and potential prejudice to the other party, as occurred in the Oliveira case, where both Ryanair and the adjudication officer had to spend time verifying the authenticity of the cited decisions.

Optional disclosure of AI use

In a move aimed at promoting transparency, the WRC has proposed an optional disclosure statement that parties may include in their submissions:

“Parts of this submission were drafted using an AI writing tool. I have reviewed and confirmed the accuracy of all content.”

While voluntary, the WRC says this statement can assist adjudication officers in understanding how submissions were prepared and may help avoid misunderstandings about the provenance of legal arguments.

Consequences of misuse

The guidance follows comments from the High Court in Erdogan v Workplace Relations Commission, in which Mr Justice Simons affirmed that adjudication officers were entitled to ensure that hearings proceeded efficiently by requiring parties to confine themselves to relevant issues.

Submissions containing inaccuracies – whether created by humans or AI – may be deemed inadmissible, undermine a party’s credibility and weaken their case.

The WRC notes that parties must be able to fully stand over their submissions.

The guidance offers several practical recommendations for parties preparing submissions:

  • Understand your content: Do not include material you cannot explain or justify.
  • Avoid sensitive data: Do not input personal or commercially sensitive information into AI tools.
  • Do not rely on AI for legal analysis: AI cannot assess case strengths or weaknesses.
  • Use AI only as an assistant: Treat AI as a drafting aid, not a legal adviser.
  • Seek expert advice: Legal professionals remain essential in ensuring accuracy and compliance.

While AI tools can be used to support litigation preparation, they are not a substitute for understanding the law and applying it accurately, the MHC lawyers say.

Gazette Desk
Gazette.ie is the daily legal news site of the Law Society of Ireland
