In the piece — titled “Can You Fool a Self Driving Car?” — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car crashing not only through the styrofoam wall but also through a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

  • NotMyOldRedditName@lemmy.world
    21 hours ago

    Well, it’s not that it was a crash caused by a Level 2 system, but that they’ll investigate it.

    So you can’t hide the crash by disengaging it just before.

    Looks like it’s actually 30 seconds, not 10; or maybe it was 10 once upon a time and they changed it to 30? (The condition is sketched just after the link below.)

    The General Order requires that reporting entities file incident reports for crashes involving ADS-equipped vehicles that occur on publicly accessible roads in the United States and its territories. Crashes involving an ADS-equipped vehicle are reportable if the ADS was in use at any time within 30 seconds of the crash and the crash resulted in property damage or injury

    https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf
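
    (Purely as an illustration of the quoted rule, nothing official: a minimal Python sketch of the reportability condition as described in the General Order text above. The function name and parameters are made up for the example.)

    from typing import Optional

    # Window quoted from the Standing General Order: the ADS/Level 2 system must
    # have been in use at some point within 30 seconds of the crash.
    REPORTING_WINDOW_SECONDS = 30

    def is_reportable(seconds_since_disengagement: Optional[float],
                      property_damage: bool,
                      injury: bool) -> bool:
        """Hypothetical check of the quoted rule: reportable if the system was in
        use within the 30-second window and the crash caused damage or injury."""
        system_in_use_recently = (
            seconds_since_disengagement is None  # still engaged at impact
            or seconds_since_disengagement <= REPORTING_WINDOW_SECONDS
        )
        return system_in_use_recently and (property_damage or injury)

    # Disengaging a few seconds before impact doesn't dodge the reporting requirement:
    print(is_reportable(seconds_since_disengagement=3, property_damage=True, injury=False))  # True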

    • oatscoop@midwest.social
      1 hour ago

      I get the impression it disengages so that Tesla can legally say “self driving wasn’t active when it crashed” to the media.

    • FuglyDuck@lemmy.world
      20 hours ago

      Thanks for that.

      The thing is, though, the NHTSA generally doesn’t make a determination on criminal or civil liability. They’ll make the report about what happened, keep it to the facts, and let the courts sort out who’s at fault. They might not even actually investigate a crash unless it comes to that. The rule is just saying “when your car crashes, you need to tell us about it,” and they kinda assume companies comply.

      Which Tesla doesn’t want to comply with, and that’s one of the reasons Musk/DOGE is going after them.

      • NotMyOldRedditName@lemmy.world
        19 hours ago

        I knew they wouldn’t necessarily investigate it, that’s always at their discretion, but I had no idea there was no actual bite to the rule if they didn’t comply. That’s stupid.

        • AA5B@lemmy.world
          6 hours ago

          Generally things like that are meant more to identify a pattern. It may not be useful to an individual, but it’s very useful for determining a recall or supporting a class action.