• Pika@sh.itjust.works

    Furthermore, with the amount of telemetry those cars have, the company knows whether it was in self-drive mode or not when it went onto the track. So the fact that they didn’t go public saying it wasn’t means it was in self-drive mode, and they’re keeping quiet to save face and limit liability.

    • IphtashuFitz@lemmy.world

      I have a nephew who worked at Tesla as a software engineer for a couple of years (he left about a year ago). I gave him the VIN of my Tesla, and the amount of data he could pull up was crazy. He warned me that one of my brake lights was regularly logging errors. If their telemetry includes that sort of information, then clearly they are logging a LOT of data.
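
      Just to illustrate what a per-component fault log implies about the pipeline (a hypothetical sketch; the field names here are invented, Tesla’s internal schema isn’t public):

      ```python
      from dataclasses import dataclass
      from datetime import datetime, timezone

      # Hypothetical shape of a per-component fault event in a vehicle
      # telemetry stream; names are invented for illustration only.
      @dataclass
      class FaultEvent:
          vin: str             # vehicle the event is keyed on
          component: str       # e.g. "brake_light_rear_left"
          fault_code: str      # manufacturer-specific error code
          occurred_at: datetime
          recurring: bool      # set after repeated occurrences

      event = FaultEvent(
          vin="5YJ3E1EA7KF000000",  # placeholder VIN
          component="brake_light_rear_left",
          fault_code="BRK_LAMP_FAULT",
          occurred_at=datetime.now(timezone.utc),
          recurring=True,
      )
      ```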

        • Pika@sh.itjust.works

          Dude, in today’s world we’re lucky if the data stops at the manufacturer. I know of a few insurers that have contracts with major dealers and automatically receive the data registered by the car’s systems, so they can make better decisions about people’s car insurance.

          Nowadays it’s a red flag if you sign up with a car insurer and they don’t offer a discount for turning on something like drive pass, which logs your driving, because it probably means your car is already sending them that data.

    • catloaf@lemm.ee

      I’ve heard they also like to disengage self-driving mode right before a collision.

      • sylver_dragon@lemmy.world

        That actually sounds like a reasonable response. Driver assist means that a human is supposed to be attentive and ready to take control. If the system detects a situation where it’s unable to make a good decision, dumping that decision on the human in control seems like the closest thing they have to a “fail-safe” option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention a long time ago. So, maybe a “human take the wheel” alert followed by a “slam the brakes” if no input is detected within 2-3 seconds. While an emergency stop isn’t always the right choice, it probably beats leaving a several-ton metal object hurtling along uncontrolled in nearly every circumstance.
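
        Something like this escalation logic, as a minimal sketch (the vehicle API and the grace window are assumptions for illustration, not anything from Tesla’s actual stack):

        ```python
        import time

        ALERT_GRACE_SECONDS = 2.5  # assumed "human take the wheel" window

        def on_self_drive_disengage(vehicle):
            """Fallback for when the system can't make a good decision."""
            vehicle.alert_driver("TAKE THE WHEEL")   # hypothetical API
            deadline = time.monotonic() + ALERT_GRACE_SECONDS
            while time.monotonic() < deadline:
                if vehicle.driver_input_detected():  # steering/brake/pedal
                    return  # human responded; hand over control normally
                time.sleep(0.05)
            # No driver response: an emergency stop isn't always right,
            # but it beats an uncontrolled several-ton object.
            vehicle.emergency_brake()
        ```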

        • zaphod@sopuli.xyz

          > That actually sounds like a reasonable response.

          If you give the driver enough time to act, which Tesla doesn’t: they turn it off a second before impact and then claim it wasn’t in self-driving mode.

          • whotookkarl@lemmy.world

            Not even a second; it’s sometimes less than 250-300 ms. If I hadn’t already been anticipating it to fail and disengage as it went through the two-lane-wide turn, I would have gone straight into oncoming traffic.
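
            For a sense of scale (rough numbers I’m assuming, not anything measured): a typical driver perception-reaction time is around 1-1.5 s, so a 300 ms handoff window is gone long before anyone can respond.

            ```python
            # Rough arithmetic on how short a 250-300 ms handoff is.
            speed_kmh = 100                    # assumed highway speed
            speed_ms = speed_kmh / 3.6         # ~27.8 m/s

            handoff_s = 0.3                    # disengage-to-impact window
            reaction_s = 1.5                   # common reaction-time estimate

            print(f"Covered during handoff window: {speed_ms * handoff_s:.1f} m")
            print(f"Covered before a typical driver reacts: {speed_ms * reaction_s:.1f} m")
            # ~8 m vs ~42 m: impact arrives long before a human can act.
            ```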

      • GreenBottles@lemmy.world

        That sounds a lot more like a rumor to me… it would be extremely suspicious and would leave them open to GIGANTIC liability issues.

        • sem@lemmy.blahaj.zone

          It’s been well documented. It lets them say in their statistics that the owner, not the software, was in control of the car at the moment of the crash.
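
          To make that statistical effect concrete (a toy model with invented numbers, not real crash data): counting only crashes where the system was active at the instant of impact excludes anything that disengaged moments earlier, whereas a lookback window, like the roughly 30-second one NHTSA’s Standing General Order reporting uses, would still count them.

          ```python
          # Toy model of how the attribution rule changes the count.
          # Seconds between self-drive disengagement and impact
          # (None = still engaged at impact). All numbers invented.
          crashes = [None, 0.25, 0.3, 1.0, 45.0, None]

          active_at_impact = sum(1 for t in crashes if t is None)
          within_30s = sum(1 for t in crashes if t is None or t <= 30.0)

          print(f"'Active at instant of impact' count: {active_at_impact}")
          print(f"30 s lookback window count:          {within_30s}")
          # 2 vs 5: disengaging just before impact moves crashes out of
          # the strict count, which is exactly the accusation here.
          ```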