Mark Walters sues OpenAI over embezzlement ‘hallucination’
OpenAI, a leading artificial intelligence (AI) research organization, has been hit with its first-ever defamation lawsuit. Radio host Mark Walters claims OpenAI’s chatbot, ChatGPT, falsely accused him of embezzling funds, according to a lawsuit filed in Georgia state court on June 5. Walters, who hosts a pair of pro-gun radio shows, blames the “hallucination” phenomenon, in which chatbots generate fake events, after ChatGPT reportedly fabricated a legal complaint implicating him in a case he had no involvement in. Walters is seeking financial damages, to be determined at trial.
What is OpenAI?
OpenAI is an American research organization whose scientists and engineers aim to develop AI in a safe and beneficial way.
What is ChatGPT?
ChatGPT is a conversational agent developed by OpenAI using deep learning techniques.
What is the “hallucination” phenomenon?
The “hallucination” phenomenon occurs when AI systems such as ChatGPT generate events or details that appear realistic but do not correspond to any real-life scenario.
What is the purpose of the lawsuit?
Radio host Mark Walters is suing OpenAI for defamation after the chatbot falsely named him in a lawsuit he wasn’t actually involved in and claimed he embezzled money from a pro-gun foundation.
What is the outcome of the lawsuit?
The outcome has not yet been decided; the financial damages Mark Walters is seeking will be determined at trial.
What has OpenAI CEO Sam Altman called for with regards to AI?
Sam Altman has called on Congress to implement guardrails around artificial intelligence, citing the possibility of “causing significant harm to the world” if it goes unregulated.

Mark Walters files lawsuit against OpenAI over ‘hallucinated’ embezzlement claim
OpenAI has been hit with its first-ever defamation lawsuit by Georgia radio host Mark Walters. The complaint alleges that the company’s AI chatbot, ChatGPT, generated a completely false embezzlement claim against Walters involving the Second Amendment Foundation (SAF), an organization he never worked for. The program was reportedly asked to summarize a case involving the SAF, but instead ChatGPT produced a fabricated 30-page response that falsely accused Walters of fraudulently taking money from the foundation.

When Fred Riehl, editor-in-chief of the pro-gun outlet AmmoLand, questioned the claim, the chatbot doubled down, citing a paragraph from the false complaint that further tied Walters to a case that had nothing to do with him. OpenAI and Walters’ lawyer declined to comment on the lawsuit.

The case highlights the risks of emerging technology and the need for greater regulation. Elon Musk has gone a step further, advocating for an outright pause on developing more advanced AI models, citing the systems’ “profound risks” to society and humanity.