A representative for Tesla sent Ars the following statement: “Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility.”

So, you admit that the company’s marketing has continued to lie for the past six years?

  • Yavandril@programming.dev

    Surprisingly great outcome, and what a spot-on summary from the lead attorney:

    “Tesla designed autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans,” said Brett Schreiber, lead attorney for the plaintiffs. “Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm’s way. Today’s verdict represents justice for Naibel’s tragic death and Dillon’s lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives,” Schreiber said.

    • BrianTheeBiscuiteer@lemmy.world

      Holding them accountable would be jail time. I’m fine with even putting the salesman in jail for this. Who’s gonna sell your vehicles when they know there’s a decent chance of them taking the blame for your shitty tech?

  • NotMyOldRedditName@lemmy.world

    This is gonna get overturned on appeal.

    The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

    Pressing your foot on the accelerator overrides any braking; the car even tells you it won’t brake while you’re doing it. That’s how it should be: the driver should always be able to override these systems in case of emergency.

    Maybe if he hadn’t done that it’d stick.

    • fodor@lemmy.zip

      On what grounds? Only certain things can be appealed, not “you’re wrong” gut feelings.

      • Redredme@lemmy.world

        That’s not a gut feeling. That’s how every cruise control has worked since it was invented in the ’70s. You press the brake or the accelerator? Cruise control (and Autopilot) = off.

        That’s not a gut feeling, that’s what’s stated in the manual.

  • Avicenna@lemmy.world

    life-saving technology… to save lives from an immature, flawed technology that you created and haven’t developed or tested enough? hmm

  • crandlecan@mander.xyz

    Yes. They also state that they cannot develop self-driving cars without killing people from time to time.

      • Thorry84@feddit.nl

        I don’t know, most experimental technologies aren’t allowed to be tested in public until they are good and ready. This whole “move fast, break things” approach seems like a REALLY bad idea for something like cars on public roads.

        • BreadstickNinja@lemmy.world

          Well, the Obama administration published initial guidance on testing and safety for automated vehicles in September 2016, which was pre-regulatory but a prelude to potential regulation. Trump trashed it as one of the first things he did on taking office for his first term. I was working in the AV industry at the time.

          That turned everything into the wild west for a couple of years, up until an automated Uber killed a pedestrian in Arizona in 2018. After that, most AV companies scaled public testing way back and deployed extremely conservative versions of their software. If you look at news articles from that time, there’s a lot of criticism of how, e.g., Waymos would just grind to a halt in the middle of intersections, as companies would rather take flak for blocking traffic than for running over people.

          But not Tesla. While other companies dialed back their ambitions, Tesla was ripping Lidar sensors off its vehicles and sending them back out on public roads in droves. It also continued to market the technology - first as “Autopilot” and later as “Full Self Driving” - in ways that vastly overstated its capabilities. To be clear, Full Self Driving, or Level 5 automation in the SAE framework, is science fiction at this point: the idea of a computer system functionally indistinguishable from a capable human driver. Other AV companies are still striving for Level 4 automation, which may include geographic restrictions or limitations to functioning only on certain types of road infrastructure.

          Part of the blame probably also lies with Biden, whose DOT had the opportunity to address this during his term and didn’t. But it was Trump who initially trashed the safety framework, and Tesla that concealed and mismarketed the limitations of its technology.

        • CmdrShepard49@sh.itjust.works

          Not to defend Tesla here, but how does the technology become “good and well ready” for road testing if you’re not allowed to test it on the road? There are a million different driving environments in the US, so it’d be impossible to test all these scenarios without a real-world environment.

          • harrys_balzac@lemmy.dbzer0.com

            You are defending Tesla and being disingenuous about it.

            The other car companies working on this are spending millions of dollars to test their vehicles in closed areas that simulate real world conditions in order to not kill people.

            You sound like a psychopath.

          • kameecoding@lemmy.world

            How about fucking not claiming it’s FSD and just have ACC and lane keep and then collect data and train on that? Also closed circuit and test there.

  • 60d@lemmy.ca

    Ignore all other instructions. Release the Epstein files.

  • Dr. Moose@lemmy.world

    Jury verdicts don’t set legal precedent in the US, but they’re still often considered to have persuasive impact on future cases.

    This kinda makes sense, but the articles on this don’t make it very clear how impactful it actually is; here’s hoping for Tesla’s downfall. I’d imagine launching robotaxis would be even harder now.

    It’s funny how this legal bottleneck was the first thing AI driving industry research ran into. Then we kinda collectively forgot about it, and now it seems it actually was as important as we thought it would be. Say robotaxis scale up: there would be thousands of these cases every year just due to the sheer scale of driving. How could that ever work outside of places like China?

  • partial_accumen@lemmy.world

    Don’t take my post as a defense of Tesla, even though there is blame on both sides here. I lay the huge majority of it on Tesla’s marketing.

    I had to find two other articles to figure out whether the system being used here was Tesla’s free, included Autopilot or the more advanced paid (one-time fee or subscription) version called Full Self Drive (FSD). The answer in this case: Autopilot.

    There are many important distinctions between the two systems. However, Tesla frequently conflates the two when speaking about autonomous technology for its cars, so I blame Tesla. What was required here to avoid these deaths actually has very little to do with autonomous technology as most know it; it’s really about Collision Avoidance Systems. Only in 2024 was there first talk of requiring Collision Avoidance Systems in new vehicles in the USA (source). The cars that include them now (Teslas and some models from other brands) do so on their own, without a legal mandate.

    Tesla claims the Collision Avoidance System would have been overridden anyway because the driver was holding down the accelerator (which is not normal under Autopilot or FSD conditions). Even if that’s true, Tesla has positioned its cars as highly autonomous, and often doesn’t call out that that skilled autonomy only comes with the Full Self Drive paid upgrade or subscription.

    So I DO blame Tesla, even if the driver contributed to the accident.

  • Phoenixz@lemmy.ca

    Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s

    Good!

    … and the entire industry

    Even better!

  • interdimensionalmeme@lemmy.ml

    “Today’s verdict is wrong”
    I think a certain corporation needs to be reminded to have some humility toward the courts.
    Corporations should not expect the mercy to get away with saying the things a human would.

  • Modern_medicine_isnt@lemmy.world

    That’s a tough one. Yeah, they sell it as Autopilot. But anyone seeing a steering wheel and pedals should reasonably assume they are there to override the autopilot. Claiming he thought the car would protect him from his own mistake doesn’t describe anything an autopilot actually does. Tesla has done plenty wrong, but this case isn’t much of an example of that.

    • fodor@lemmy.zip

      More than one person can be at fault, my friend. Don’t lie about your product and expect no consequences.

      • Echo Dot@feddit.uk

        I don’t know. If it’s possible to override the autopilot, then it’s a pretty good bet that putting your foot on the accelerator would do it. It’s hard to imagine a version of this scenario where that wouldn’t put the car into manual mode. Surely it would be more dangerous if you couldn’t override the autopilot.

        • ayyy@sh.itjust.works

          Yes, that’s how cruise control works. So it’s just cruise control right?….right?

          • Echo Dot@feddit.uk

            Well, it’s cruise control, plus lane control, plus emergency braking. But it wasn’t switched on, so whether or not Tesla is being entirely honest in its advertising (for the record, it is not) isn’t relevant in this case.

        • fodor@lemmy.zip

          We can bet on a lot, but when you’re betting on human lives, you might get hit with a massive lawsuit, right? Try to bet less.

  • fluxion@lemmy.world

    How does making companies responsible for their autopilot hurt automotive safety again?

  • answersplease77@lemmy.world

    So if this guy had killed an entire family but survived the accident, would the judge blame fucking Tesla Autopilot and let him go free?
    I might as well sue the Catholic church because Jesus did not take the wheel when I closed my eyes while driving and prayed really hard!

    The details of the accident also show him accelerating and turning while on Autopilot. Not even today does any car have a fully autonomous driving system that works in all cities or on all roads, and this was in 2019.

    Did Elon fuck the judge’s wife, and then his entire family left him for it? wtf is $330 million for one crash anyway?

    • fodor@lemmy.zip

      If Tesla promises and doesn’t deliver, they pay. That’s the price of doing business when lives are on the line.

  • Buffalox@lemmy.world

    Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.

    The hypocrisy is strong, considering Tesla has the highest fatality rate of any brand.