Self-driving cars are often marketed as safer than human drivers, but new data suggests that may not always be the case.

Citing data from the National Highway Traffic Safety Administration (NHTSA), Electrek reports that Tesla disclosed five new crashes involving its robotaxi fleet in Austin. The new data raises concerns about how safe Tesla’s systems really are compared to the average driver.

The incidents included a collision with a fixed object at 17 miles per hour, a crash with a bus while the Tesla vehicle was stopped, a crash with a truck at four miles per hour, and two cases where Tesla vehicles backed into fixed objects at low speeds.

  • dogslayeggs@lemmy.world · 1 day ago

    It’s important to draw a line between what Tesla is trying to do and what Waymo is actually doing. Tesla’s crash rate is about four times the human average, while Waymo’s is lower than it.

    • merc@sh.itjust.works · 1 day ago

      Not just lower, a tiny fraction of the human rate of accidents:

      https://waymo.com/safety/impact/

      Also, AFAIK this includes cases when the Waymo car isn’t even slightly at fault. Like, there have been 2 deaths involving a Waymo car. In one case a motorcyclist hit the car from behind, flipped over it, then was hit by another car and killed. In the other case, ironically, the real car at fault was a Tesla being driven by a human who claims he experienced “sudden unintended acceleration”. It was driving at 98 miles per hour in downtown SF and hit a bunch of stopped cars at a red light, then spun into oncoming traffic and killed a man and his dog who were in another car.

      Whether or not self-driving cars are a good thing is up for debate. But, it must suck to work at Waymo and to be making safety a major focus, only to have Tesla ruin the market by making people associate self-driving cars with major safety issues.

        • merc@sh.itjust.works · 24 hours ago

          Well, Waymo’s really at 0 deaths per 127 million miles.

          The 2 deaths are deaths that happened near Waymo cars, in collisions involving the Waymo car. Not only did the Waymo not cause the accidents, it wasn’t even involved in the fatal part of either event. In one case a motorcyclist was hit by another car, and in the other a Tesla crashed into a second car after it had hit the Waymo (and a bunch of other cars).

          The IIHS number takes the total number of deaths in a year, and divides it by the total distance driven in that year. It includes all vehicles, and all deaths. If you wanted the denominator to be “total distance driven by brand X in the year”, you wouldn’t keep the numerator as “all deaths” because that wouldn’t make sense, and “all deaths that happened in a collision where brand X was involved as part of the collision” would be of limited usefulness. If you’re after the safety of the passenger compartment you’d want “all deaths for occupants / drivers of a brand X vehicle” and if you were after the safety of the car to all road users you’d want something like “all deaths where the driver of a brand X vehicle was determined to be at fault”.

          The IIHS does have statistics for driver death rates by make and model, but they use “per million registered vehicle years”, so you can’t directly compare with Waymo:

          https://www.iihs.org/ratings/driver-death-rates-by-make-and-model

          Also, in a Waymo it would never be the driver who died, only other vehicle occupants, so I don’t know if that data is tracked for other vehicle models.
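
          To make the unit mismatch concrete, here is a rough, purely illustrative sketch (none of these numbers come from the article, IIHS, or Waymo); the per-vehicle annual mileage in particular is just an assumption needed to bridge the two metrics:

          ```python
          # Why "deaths per 100M vehicle miles" and "driver deaths per million
          # registered vehicle years" aren't directly comparable. All numbers
          # here are illustrative placeholders, not measured data.

          HUMAN_DEATHS_PER_100M_MILES = 1.3        # assumed ballpark overall US rate
          DRIVER_DEATHS_PER_M_VEH_YEARS = 30.0     # hypothetical IIHS-style make/model figure
          ASSUMED_MILES_PER_VEHICLE_YEAR = 12_000  # assumption needed to convert units

          # Converting the registered-vehicle-years figure into a per-mile rate
          # requires assuming how far each registered vehicle drives in a year:
          deaths_per_mile = DRIVER_DEATHS_PER_M_VEH_YEARS / (1_000_000 * ASSUMED_MILES_PER_VEHICLE_YEAR)
          converted = deaths_per_mile * 100_000_000  # scale to deaths per 100M miles

          print(f"Overall rate:          {HUMAN_DEATHS_PER_100M_MILES:.2f} deaths / 100M miles")
          print(f"Converted IIHS-style:  {converted:.2f} driver deaths / 100M miles")

          # Even after converting the denominator, the numerators still differ:
          # the make/model stat counts only driver deaths, while the overall rate
          # counts every death in every crash -- the mismatch described above.
          ```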

          • hector@lemmy.today · 14 hours ago

            I seem to recall a homeless woman who got killed pretty much right away when they released these monstrosities on the road, because why pay people to do jobs when machines can do them for you? I’m sure that will work out great for everyone with investment income.

            • JcbAzPx@lemmy.world · 7 hours ago

              That was Uber’s attempt at self-driving, after they had to give up the stolen Waymo data. Waymo is probably the best at self-driving, but even they spend too much time blocking traffic when they can’t reach the Indian call center to fix the situations they’ve gotten themselves into.

        • 73ms@sopuli.xyz · 24 hours ago

          When there are two deaths total, it’s pretty obvious that there just isn’t enough data yet to draw conclusions about the fatal accident rate. Also, FWIW, as was said, neither of those was in any way the Waymo’s fault.
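
          As a rough illustration of how little two fatalities over ~127 million miles can tell us, here is a small back-of-the-envelope sketch (not from the thread; the human baseline rate is an assumed ballpark figure):

          ```python
          # Exact Poisson 95% confidence interval for a small count of fatalities.
          # The observed count and mileage are the figures quoted upthread; the
          # human baseline is an assumed ballpark, not an official statistic.
          from scipy.stats import chi2

          observed_deaths = 2            # deaths in collisions involving a Waymo
          miles = 127_000_000            # mileage figure quoted upthread
          HUMAN_RATE_PER_100M = 1.3      # assumed ballpark US fatality rate

          # Garwood exact 95% CI for a Poisson count k:
          #   [chi2.ppf(0.025, 2k) / 2,  chi2.ppf(0.975, 2k + 2) / 2]
          lo = chi2.ppf(0.025, 2 * observed_deaths) / 2
          hi = chi2.ppf(0.975, 2 * observed_deaths + 2) / 2

          scale = 100_000_000 / miles
          print(f"95% CI: {lo * scale:.2f} to {hi * scale:.2f} deaths per 100M miles")
          print(f"Assumed human baseline: {HUMAN_RATE_PER_100M} per 100M miles")
          # The interval spans well below and well above the baseline, so the
          # fatality numbers alone can't show Waymo is better or worse than average.
          ```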

          • hector@lemmy.today · 14 hours ago

            That’s the problem: you can’t trust these companies not to use corrupt influence to blame others for their mistakes. It’s you versus a billion-dollar company with everything at stake, one that owns (or at least holds senior tiered leasing rights to) your politicians locally, at the state level, and federally, and by extension the regulators up and down the line.

            Do you not know how things work in this country? Given their outsized power, we don’t want them involved in determining blame for accidents, dash cam footage or no. We’ve seen that irrefutable evidence is no guarantee of justice, even when it’s handed to you.

            • 73ms@sopuli.xyz · 13 hours ago

              Well, Waymo isn’t the one assigning blame; it’s a third-party assessment based on the information released about those accidents. The strongest point remains that fatal accidents are rare enough that there simply isn’t enough data to claim any statistical significance for these events. The overall accident rate, for which there is sufficient data, remains significantly lower than the US average.

              • hector@lemmy.today · 13 hours ago

                They have influence with the police, regulators, and insurance companies, which helps them avoid blame.

                They run on limited routes at lower speeds, so they won’t have a higher fatality rate. If you compared human drivers on that same stretch of road, it would also be zero. You can’t compare human drivers on expressways during rush hour with Waymo’s trip between the airport and the hotels on a mapped-out route that doesn’t go on the expressway.

                • 73ms@sopuli.xyz · 10 hours ago

                  It is obviously false that fatal accidents would be “zero” on the roads Waymos are limited to; it’s ridiculous to even suggest such a thing. What is true is that such accidents are even rarer there, though. That’s another good reason why it makes no sense to focus solely on fatal accidents: Waymos are unlikely to be involved in them anyway because of these limits, on top of the fact that statistical analysis is simply impossible with the current vehicle miles.

                  Now, I’m not saying we know for certain that Waymo is much safer than a human driver, as the current statistics imply; that is going to require more rigorous studies. I would say what we’ve got is good enough to say that nothing points to them being particularly unsafe.

                  • hector@lemmy.today · 10 hours ago

                    What do you mean? You are comparing dangerous driving spots to safe driving spots. No shit the highway entry ramps have more fatal accidents than the leisure cruise on the 8-lane road from the airport to the hotel. And yes, human drivers on that leisure cruise would have a different rate of accidents than on the death ramps of the expressways.

                    Not acknowledging that point, and misrepresenting it, doesn’t speak well to your credibility here; it’s a simple and unarguable point.

      • TooManyGames@sopuli.xyz · 1 day ago

        I immediately formed a conspiracy theory that Teslas automatically accelerate when they see Waymo cars.

        • merc@sh.itjust.works · 1 day ago

          And it’s not out of aggression. It’s just that their image recognition algorithms are so terrible that they match the Waymo car with all its sensors to a time-traveling DeLorean and try to hit 88 mph… or something.

    • ThirdConsul@lemmy.zip · 1 day ago

      Isn’t Waymo’s rate better because they are very particular about where they operate? When they are asked to operate in slightly less than perfect conditions, it immediately goes downhill: https://www.researchgate.net/publication/385936888_Identifying_Research_Gaps_through_Self-Driving_Car_Data_Analysis (page 7, Uncertainty)

      Edit: googled it a bit, and apparently this is where Waymo mostly drives:

      Waymo vehicles primarily drive on urban streets with a speed limit of 35 miles per hour or less

      Teslas do not.

      • 73ms@sopuli.xyz · 24 hours ago

        We are talking about Tesla robotaxis. They also drive only in very limited geofenced areas. Waymo now goes on freeways, though only in the Bay Area and with the option offered to only some passengers, while Tesla robotaxis currently do not go on any freeways at all. In fact, they only have a handful of cars doing any unsupervised driving, and those are geofenced to a small area of Austin around a single stretch of road.

        Tesla robotaxis also currently cease operations in Austin when it rains, so Waymo is definitely the more flexible one when it comes to less-than-perfect conditions.

      • dogslayeggs@lemmy.world · 1 day ago

        That is certainly true, but they are also better than humans in those specific areas. Tesla is (shockingly) stupid about where they choose to operate. Waymo understands its limitations and chooses to operate only where it can be better than humans. They are increasing their range, though, including driving on the 405 freeway in Los Angeles… which is usually less than 35 mph!!

        • jabjoe@feddit.uk · 1 day ago

          Are they doing FSD if there are human overseers? Surely that is not “fully”.

          So, human overseers, and not only cameras.

          • 73ms@sopuli.xyz · 23 hours ago

            All these services have the ability for a human to step in and solve issues if the FSD disengages. That doesn’t mean they’re not driving on their own most of the time, including full journeys. The remote assistance team is just ready to jump in if something unusual causes the Waymo driver to disengage, and even then they don’t usually control the car directly; they just give the driver instructions on how to resolve the situation.

            • jabjoe@feddit.uk · 14 hours ago

              I think Waymo are right to do what they do. I just wouldn’t call it “fully”. If Tesla are doing the same and still doing badly, or should be doing the same and aren’t, it still makes them worse than Waymo either way.