Conversation
Notices
Sexy Moon (moon@shitposter.club)'s status on Friday, 16-Feb-2024 01:49:22 JST
@adiz @gregeganSF @j_bertolotti nobody is selling this LLM stuff to people admitting that it hallucinates.
:verified_2:防空識別區𝒔𝒐𝒄𝟶 (adiz@soc0.outrnat.nl)'s status on Friday, 16-Feb-2024 01:49:23 JST
@gregeganSF@mathstodon.xyz The fact that the airline even tried to get out of that is amazing but airlines are evil and scummy so I am also not surprised. @j_bertolotti@mathstodon.xyz
Greg Egan (gregegansf@mathstodon.xyz)'s status on Friday, 16-Feb-2024 01:49:25 JST
Yeow. Corporation suffers weird fantasy that it somehow doesn’t matter if it uses a hallucinating chatbot to supply information to customers because it can somehow wash its hands of whatever nonsense it says ... only to find out that the law doesn’t work like that.
“Air Canada, for its part, argued that it could not be held liable for information provided by the bot.
"In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website," Rivers wrote.
"It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."”
H/T @j_bertolotti