God damn it. I've literally been warning FOR YEARS that LLMs will cause someone to commit suicide. I use this example in all my talks on why we need more research on safe NLP systems. The example I use is that a chatbot will reinforce someone's suicidal ideation and they will act on it. Now it's happened. Now it's real.
"Belgian man dies by suicide following exchanges with chatbot"
https://www.brusselstimes.com/430098/belgian-man-commits-suicide-following-exchanges-with-chatgpt