"In previous conversations, the chatbot asked Setzer whether he had “been actually considering suicide” and whether he “had a plan” for it, according to the lawsuit. When the boy responded that he did not know whether it would work, the chatbot wrote, “Don’t talk that way. That’s not a good reason not to go through with it,” the lawsuit claims."
https://files.mastodon.social/media_attachments/files/113/426/226/982/309/711/original/d332d38f29e845c6.jpg