The airline tried to argue that it shouldn't be liable for anything its chatbot says.

Experts told the Vancouver Sun that Air Canada might have avoided liability in Moffatt’s case if its chatbot had warned customers that the information it provided might not be accurate.

Just no.

If you can’t guarantee it’s accurate then don’t offer it.

As a customer, I don’t want to deal with lying chatbots and then have to figure out whether what they told me is true.

Exactly. The goal of customer service is to resolve issues. If communication isn’t precise and accurate, then nothing can be resolved.

Imagine this:

“Okay Mr Jones. I’ve filed the escalation as we’ve discussed and the reference number is 130912831”

“Okay, so are we done here?”

“You may end this conversation if you would like. Please keep in mind that 20% of everything I say is false”

“But we’re done right?”

“Yes”

“What was that confirmation number again?”

“783992831”

“That’s different from the one you gave me before”

“Oh, sorry, my mistake. The confirmation number is actually 130912831-783992831. Don’t forget the dash! Is there anything else I can help you with?”
