GNU social JP

GNU social JP is a Japanese GNU social server.
Conversation

Notices

  1. AI6YR Ben (ai6yr@m.ai6yr.org)'s status on Wednesday, 18-Mar-2026 13:42:26 JST

    Former Uber self-driving chief crashes his Tesla on FSD, exposes supervision problem

    https://electrek.co/2026/03/17/former-uber-self-driving-chief-tesla-fsd-crash-supervision-problem/

    #tesla #crash

    In conversation about 2 days ago from m.ai6yr.org
    • Paul Cantrell (inthehands@hachyderm.io)'s status on Wednesday, 18-Mar-2026 13:43:48 JST
      in reply to Johanna, CanCon variety

      @johannab @ai6yr

      Classic among classics.

      Also, there’s a 99pi about exactly this: https://99percentinvisible.org/episode/children-of-the-magenta-automation-paradox-pt-1/

      In conversation about 2 days ago

      Attachments

      1. Children of the Magenta (Automation Paradox, pt. 1) - 99% Invisible
        from Katie Mingle
        On the evening of May 31, 2009, 216 passengers, three pilots, and nine flight attendants boarded an Airbus 330 in Rio de Janeiro. This flight, Air France 447, was headed across the Atlantic to Paris. The take-off was unremarkable. The plane reached a cruising altitude of 35,000 feet. The passengers read and watched movies and slept.
    • AI6YR Ben (ai6yr@m.ai6yr.org)'s status on Wednesday, 18-Mar-2026 13:43:50 JST

      LOL this is the problem with relying on AI tools, as well...

      "...His core argument: Tesla is asking humans to supervise a system that is specifically designed to make supervision feel pointless. As he puts it, an unreliable machine keeps you alert, and a perfect machine needs no oversight, but one that works almost perfectly creates a trap where drivers trust it just enough to stop paying attention.

      The research backs this up. Psychologists call it the “vigilance decrement”, monitoring a nearly perfect system is boring, boredom leads to mind-wandering, and drivers need 5 to 8 seconds to mentally reengage after an automated system hands control back. But emergencies unfold faster than that...."

      #AI

      In conversation about 2 days ago
    • Johanna, CanCon variety (johannab@cosocial.ca)'s status on Wednesday, 18-Mar-2026 13:43:50 JST

      @ai6yr every time

      This publication comes to mind:

      https://how.complexsystems.fail

      As does a Human Factors lecture I attended last century (ugh) on the amount of money spent on psychological research to make fighter plane cockpits human-goof-proof, ON TOP of the extended, intense, and repeated training pilots go through.

      One of the points in the early 90's was cars were becoming too complex for mere untrained humans to cope with, with next to no thought about the human-tech interface required.

      In conversation about 2 days ago
    • AI6YR Ben (ai6yr@m.ai6yr.org)'s status on Wednesday, 18-Mar-2026 13:43:51 JST

      VERY glad the guy and his kids are okay, but it would have been something else if the Uber self-driving chief had been incinerated or killed by a self-driving car. 🤔

      In conversation about 2 days ago
    • AI6YR Ben (ai6yr@m.ai6yr.org)'s status on Wednesday, 18-Mar-2026 13:43:51 JST

      "...What makes this account particularly striking is Krikorian’s background. At Uber’s Advanced Technologies Center, he ran the team building autonomous vehicles and trained human safety drivers on exactly when and how to intervene when a self-driving system fails...."

      🤔

      In conversation about 2 days ago
    • Paul Cantrell (inthehands@hachyderm.io)'s status on Wednesday, 18-Mar-2026 13:51:18 JST
      in reply to Johanna, CanCon variety

      @johannab @ai6yr

      I’ve listened to almost every episode by now, and I can recommend the experience.

      In conversation about 2 days ago
    • Johanna, CanCon variety (johannab@cosocial.ca)'s status on Wednesday, 18-Mar-2026 13:51:20 JST
      in reply to Paul Cantrell

      @inthehands @ai6yr oh, cool, thanks! I love 99pi, except for the fact that their back-catalogue is longer than I have years left to live, I suspect. I've listened on-and-off for over a decade and they had quite the archive when I started!

      In conversation about 2 days ago
    • Paul Cantrell (inthehands@hachyderm.io)'s status on Thursday, 19-Mar-2026 02:01:04 JST
      in reply to Johanna, CanCon variety and Raglan Niall :lk: :tinoflag:

      @johannab @Niall @ai6yr
      Yeah. In multiple spheres, I’m increasingly resigning myself to “people are going to have to learn it for themselves” mode — shifting focus from global prevention to local mitigation, away from trying to control others and toward protecting what’s already in my sphere of personal control.

      In my software consulting days, I often found myself trying to get companies not to hit themselves in the head with a hammer, but often it turned out to be best to just go ahead and let them do that and then ask “How’d that work out for you?” It’s painful to see in advance the needless damage, the waste, but sometimes it’s the only thing that works.

      In conversation about 2 days ago
    • Johanna, CanCon variety (johannab@cosocial.ca)'s status on Thursday, 19-Mar-2026 02:01:05 JST
      in reply to Paul Cantrell and Raglan Niall :lk: :tinoflag:

      @Niall @inthehands @ai6yr

      No kidding, I had apparently actually listened to those ones but forgotten, and now I've listened again AND need to pile those into my "slow the fuck down with AI in everything" references pile. "automation paradox" is being baked in to the whole stack right now. Particularly terrifying when I think of my previous medical systems roles, because we seem to have dropped any idea of regulating life-endangering tech, too.

      In conversation about 2 days ago
    • Raglan Niall :lk: :tinoflag: (niall@mastodon.nz)'s status on Thursday, 19-Mar-2026 02:01:06 JST
      in reply to Paul Cantrell and Johanna, CanCon variety

      @inthehands @johannab @ai6yr that two-parter was awesome. While the tech may have improved between then and now, no decent solution to the fundamental problem of paying attention/ being ready for failing automation has been proposed.

      In conversation about 2 days ago
    • Tim Ward ⭐🇪🇺🔶 #FBPE (timwardcam@c.im)'s status on Thursday, 19-Mar-2026 06:55:26 JST

      @ai6yr It's solved for pilots. It's called "training" and it works fine.

      Oh wait ...

      https://www.youtube.com/watch?v=5ESJH1NLMLs

      In conversation about 2 days ago
    • Tim Ward ⭐🇪🇺🔶 #FBPE (timwardcam@c.im)'s status on Thursday, 19-Mar-2026 06:55:26 JST

      @ai6yr Having said which:

      (1) When I've taken passengers flying I've trained them in opening the door as part of the safety briefing.

      (2) When I have overnight visitors I show them how to open all the outside doors in case of fire.

      (3) It has *never* occurred to me that I need to give *car* passengers a safety briefing on how to open the door.

      In conversation about 2 days ago
    • Dina (soundstruck@m.ai6yr.org)'s status on Thursday, 19-Mar-2026 06:55:26 JST
      in reply to Tim Ward ⭐🇪🇺🔶 #FBPE

      @TimWardCam @ai6yr I recently thought about that while hitching a ride with a friend in a Mustang EV (and they actually weren't 100% sure about the manual override instructions!). Here's an article with instructions for most of the EVs on the market. https://www.consumerreports.org/cars/car-safety/how-to-escape-your-car-if-the-electronic-door-release-fails-a8152892189/

      In conversation about 2 days ago

      Attachments

      1. How to Escape Your Car If the Electronic Door Handle Fails - Consumer Reports
        From Tesla to Lexus, more and more vehicles have doors with an electronic button instead of a manual latch. CR lets you know how to open them in an emergency.

GNU social JP is a social network, courtesy of the GNU social JP administrator. It runs on GNU social, version 2.0.2-dev, available under the GNU Affero General Public License.

All GNU social JP content and data are available under the Creative Commons Attribution 3.0 license.