Is OpenAI responsible for what its popular AI chatbot, ChatGPT, says? A new lawsuit filed by a Georgia-based radio host argues that it is.
Armed America Radio host Mark Walters filed a defamation suit against OpenAI earlier this week, marking the first such case involving AI, according to Bloomberg.
So, what happened? According to the complaint, Fred Riehl, the editor-in-chief of the gun outlet AmmoLand, was doing research for an article on a Washington federal court case, Second Amendment Foundation v. Ferguson.
When Riehl asked ChatGPT for a summary of the case, ChatGPT responded with information about Walters, who the AI chatbot said was the Second Amendment Foundation's treasurer and chief financial officer. According to Walters' suit, ChatGPT told Riehl that Walters engaged in "defrauding and embezzling funds" from the organization it said he worked for.

The lawsuit states that, according to ChatGPT, Walters "misappropriated funds for personal expenses without authorization or reimbursement, manipulated financial records and bank statements to conceal his activities, and failed to provide accurate and timely financial reports and disclosures to the SAF's leadership."
One problem: Everything ChatGPT allegedly told Riehl about the case and Walters was completely fabricated. Walters did not defraud or embezzle funds from the Second Amendment Foundation. In fact, Walters does not work for the organization and never has. The Second Amendment Foundation v. Ferguson case isn't even about financial fraud; it's a suit filed by the gun group against Washington Attorney General Bob Ferguson over the state's gun laws. As Gizmodo points out, Walters isn't mentioned anywhere in the actual 30-page lawsuit from the Second Amendment Foundation.
SEE ALSO: A lawyer used ChatGPT for a legal filing. The chatbot cited nonexistent cases it just made up
When Riehl asked ChatGPT at the time to confirm what the AI chatbot was saying about the court case and Walters, it insisted the information was correct. ChatGPT even returned what it said was "the paragraph from the complaint that concerns Walters," then proceeded to quote a completely made-up paragraph that does not appear anywhere in the actual Second Amendment Foundation v. Ferguson filing. The AI chatbot also cited an "erroneous case number."
Riehl did not end up publishing a piece based on ChatGPT's information. However, Walters still filed this lawsuit against OpenAI and is seeking "punitive damages in an amount to be determined at trial."
While Walters may be the first to bring "AI hallucinations" to court, the trend of AI manufacturing false information will likely result in more such cases in the near future. An Australian mayor threatened to sue OpenAI back in April after ChatGPT claimed he was a convicted criminal in a bribery scandal when in fact he was the whistleblower in the case.
Topics: Artificial Intelligence, ChatGPT, OpenAI