
Air Canada's argument that its AI-powered customer chatbot was solely liable for its own actions didn't hold up in civil court (thank goodness), and now the airline must refund a customer who was given incorrect information about a refund on his airfare.
The 2022 incident involved an Air Canada customer, Jake Moffatt, and the airline's chatbot, which Moffatt used to get information on how to qualify for a bereavement fare for a last-minute trip to attend a funeral. The chatbot explained that Moffatt could retroactively apply for a refund of the difference between a regular ticket cost and a bereavement fare cost, as long as he applied within 90 days of purchase.
But that's not the airline's policy at all. According to Air Canada's website:
Air Canada’s bereavement travel policy offers an option for our customers who need to travel because of the imminent death or death of an immediate family member. Please be aware that our Bereavement policy does not allow refunds for travel that has already happened.
When Air Canada refused to issue the reimbursement because of the misinformation mishap, Moffatt took the airline to court. Air Canada's argument against the refund included claims that it was not responsible for the "misleading words" of its chatbot. Air Canada also argued that the chatbot was a "separate legal entity" that should be held responsible for its own actions, claiming the airline is also not responsible for information given by "agents, servants or representatives — including a chatbot." Whatever that means.
"While a chatbot has an interactive component, it is still just a part of Air Canada’s website," responded a Canadian tribunal member. "It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."
As the first case of its kind, the Canadian tribunal's decision may have down-the-road implications for other companies adding AI- or machine-learning-powered "agents" to their customer service offerings.