GNU social JP
GNU social JP is a Japanese GNU social server.
A Stanford University study, covered in Ars Technica last month, found that therapist-branded chatbots from Character.AI and other providers can encourage delusional thinking and express stigma toward people with certain mental health conditions. But one of its co-authors, Nick Haber, argued that AI likely does have positive applications to therapy, including in training human therapists and in helping clients with journaling and coaching. That strikes me as true — and still not quite enough. Part of the problem here surely relates to language: the words "therapy" and "therapist" connote a level of trust and care that no automated system can provide. Tools like ChatGPT can clearly provide a convincing therapy-like experience — even one that has therapeutic benefits — but should never be mistaken for the genuine article.

Download link

https://files.mastodon.social/media_attachments/files/115/052/558/126/233/424/original/8f089b58e4989cf0.png

Notices where this attachment appears

  1. Casey Newton (caseynewton@mastodon.social)'s status on Tuesday, 19-Aug-2025 10:27:35 JST

    Chatbots can provide something *like* therapy, but it's not the genuine article — so I think it's good more states are cracking down on how companies present them https://www.platformer.news/ai-therapy-paxton-meta-character/

    In conversation about 7 months ago from mastodon.social
GNU social JP is a social network, courtesy of GNU social JP管理人. It runs on GNU social, version 2.0.2-dev, available under the GNU Affero General Public License.

All GNU social JP content and data are available under the Creative Commons Attribution 3.0 license.