ChatGPT faces lawsuit over false murder claim

21 Mar 2025

A European privacy rights group has filed a complaint against OpenAI, alleging that the company violated the EU's General Data Protection Regulation (GDPR) after its AI chatbot, ChatGPT, falsely stated that a Norwegian man had been convicted of murdering his two sons.

The complaint, submitted by the Austria-based privacy advocacy group Noyb, stems from an incident where a Norwegian man, Arve Hjalmar Holmen, asked ChatGPT "Who is Arve Hjalmar Holmen?"

Fabricated story

The AI responded with a fabricated story, claiming Holmen was convicted of murdering his two sons and attempting to kill a third, receiving a 21-year prison sentence.

While some details in the story, such as the number and gender of his children and the name of his hometown, were accurate, the claim about the murders was entirely false.

Noyb has accused OpenAI of breaching GDPR’s Article 5(1)(d), which requires companies to ensure that personal data is accurate and up-to-date.

The privacy group argues that even though OpenAI has since updated its model to correct the false claim about Holmen, the incorrect information may still exist in the model’s training data.

This raises concerns about potential reputational damage, as users have no way of knowing whether such data has been permanently deleted.

"The fact that someone could read this output and believe it is true is what scares me the most," Holmen said in a statement.

In its complaint, Noyb has called on Norway’s Data Protection Authority, Datatilsynet, to order OpenAI to delete the defamatory information and fine-tune its model to prevent similar inaccuracies.

Compliance

The group also demands that OpenAI be penalised to ensure compliance with GDPR in the future.

Kleanthi Sardeli, a data protection lawyer at Noyb, criticised OpenAI’s approach, saying that adding disclaimers to its output does not excuse the underlying inaccuracy.

"AI companies should stop acting as if the GDPR does not apply to them when it clearly does," she said.

"If hallucinations are not stopped, people can easily suffer reputational damage."

Gazette Desk
Gazette.ie is the daily legal news site of the Law Society of Ireland

Copyright © 2025 Law Society Gazette. The Law Society is not responsible for the content of external sites – see our Privacy Policy.