Conversation

Notices

  1. Bread up, Bro (sickburnbro@poa.st)'s status on Tuesday, 29-Oct-2024 01:18:18 JST
    what's interesting here is the same problem as with automated cars: liability. If you have a person taking a transcription and they mess up, liability is well established.

    But with AI, as things stand, liability will transfer to the company making the software, and one bad case will bankrupt them.
    In conversation about 8 months ago from poa.st

    Attachments


    1. https://i.poastcdn.org/c7ab64ce8878febae8e68f0b4296ea4c84e7e48d6563e008a388572801e4a4e5.png
    • BowserNoodle ☦️ likes this.
    • BowserNoodle ☦️ repeated this.
    • snap (snappler@poa.st)'s status on Tuesday, 29-Oct-2024 01:41:02 JST
      in reply to sickburnbro
      @sickburnbro With cars it's especially rough because I won't buy a car that doesn't prioritize me/my wife over every other living creature. I'm fine with it making a safe attempt at stopping, but I'm not paying tens of thousands of dollars to be ethics-theory'd off a bridge. And while plenty of internet ethicists will claim otherwise, they aren't likely to do so either.

      As for the Speech-to-Text stuff, I use it a lot for synced lyrics for songs, but it's spotty enough that it almost always requires checking. Still, it speeds up the process a ton, because now it's two listens instead of fifty. So the liability should still be on the person knowingly trusting an 80-95% accurate STT model instead of doing an initial transcript and a second listen to correct mistakes.
      In conversation about 8 months ago
      Bread up, Bro likes this.
    • Bread up, Bro (sickburnbro@poa.st)'s status on Tuesday, 29-Oct-2024 01:44:46 JST
      in reply to snap
      @snappler that's the tricky part though: no one is going to want to use it for medical transcriptions unless it is very accurate, but it being very accurate means that people will end up assuming it is perfect.

      That's why the liability thing comes in. They'll want to claim "it's not our fault," but the doctor will want to have them share blame, because the whole point of a transcript is that it is reliable.
      In conversation about 8 months ago
    • BowserNoodle ☦️ (bowsacnoodle@poa.st)'s status on Tuesday, 29-Oct-2024 02:24:29 JST
      in reply to snap
      @snappler @sickburnbro I can't wait until my car solves a trolley problem to decide which area to hit. There's already an actuarial value assigned to people's lives. With enough data and processing power, cars could decide to kill a group of orphans over a businessman because the cost of the legal payout would be lower based on real data. It's incredibly dystopian.
      In conversation about 8 months ago
    • Zergling_man (zergling_man@sacred.harpy.faith)'s status on Tuesday, 29-Oct-2024 02:37:55 JST
      in reply to sickburnbro
      @sickburnbro A computer can never be held accountable


      Turns out neither can a jew
      In conversation about 8 months ago
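
As a rough illustration of the workflow snap describes above (an automatic first-pass transcript with timestamps, then a correction pass by ear), the sketch below uses the open-source openai-whisper package. This is only an assumption for the example; the thread does not say which STT tool is actually in use, and the model name and audio file are placeholders.

import whisper  # assumed: the open-source openai-whisper package

def draft_transcript(audio_path: str, model_name: str = "base") -> str:
    """Produce a rough first-pass transcript intended for manual correction."""
    model = whisper.load_model(model_name)   # downloads the model on first use
    result = model.transcribe(audio_path)    # dict with "text" and timestamped "segments"
    # Segment timestamps are what make the "second listen" fast: you can jump
    # straight to each line instead of replaying the whole song.
    lines = [
        f"[{seg['start']:7.2f} -> {seg['end']:7.2f}] {seg['text'].strip()}"
        for seg in result["segments"]
    ]
    return "\n".join(lines)

if __name__ == "__main__":
    # "song.mp3" is a placeholder; the output is a draft to be checked by a human,
    # on the assumption that the model is only roughly 80-95% accurate.
    print(draft_transcript("song.mp3"))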
