Paraphrasing:
“We only have the driver’s word they were in self driving mode…”
“This isn’t the first time a Tesla has driven onto train tracks…”
Since it isn’t the first time, I’m gonna go ahead and believe the driver, thanks.
Furthermore, with the amount of telemetry those cars have, the company knows whether it was in self-drive mode when it went onto the tracks. So the fact that they didn’t go public saying it wasn’t means that it was in self-drive mode, and they want to save face on PR and limit liability.
I have a nephew that worked at Tesla as a software engineer for a couple years (he left about a year ago). I gave him the VIN to my Tesla and the amount of data he shared with me was crazy. He warned me that one of my brake lights was regularly logging errors. If their telemetry includes that sort of information then clearly they are logging a LOT of data.
Modern cars (in the US) are required to have an OBD-II port for On-Board Diagnostics. I always assumed most cars these days were just sending some or all of the real-time OBD data to the manufacturer. GM definitely has been.
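For anyone curious what that port actually exposes, here's a rough sketch using the third-party python-obd library and a cheap ELM327 adapter. This is just the standardized diagnostic data an owner can read locally, not the manufacturer's built-in telemetry pipeline, and whether any given EV even has a standard OBD-II connector varies.

```python
# Sketch: reading a car's own diagnostics over the OBD-II port.
# Assumes an ELM327 USB/Bluetooth adapter and the python-obd package.
import obd

connection = obd.OBD()  # auto-detects the adapter and connects

speed = connection.query(obd.commands.SPEED)    # current vehicle speed
rpm = connection.query(obd.commands.RPM)        # engine RPM
dtcs = connection.query(obd.commands.GET_DTC)   # stored diagnostic trouble codes

print("speed:", speed.value)
print("rpm:", rpm.value)
print("trouble codes:", dtcs.value)
```

The point is that even the decades-old standardized port streams this kind of data in real time; a manufacturer's own telemetry link just skips the adapter and phones it home.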
Dude, in today’s world we’re lucky if they stop at the manufacturer. I know of a few insurance companies that have contracts through major dealers, and they just automatically get the data that’s registered via the car’s systems. That way they can make better decisions regarding people’s car insurance.
Nowadays it’s a red flag if you sign up for car insurance and they don’t offer you a discount for putting on something like a drive pass that logs your driving, because it probably means your car is already sending them that data.
I’ve heard they also like to disengage self-driving mode right before a collision.
That actually sounds like a reasonable response. Driving assist means that a human is supposed to be attentive and ready to take control. If the system detects a situation where it’s unable to make a good decision, dumping that decision on the human in control seems like the closest thing they have to a “fail safe” option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention a long time ago. So, maybe a “human, take the wheel” followed by a “slam the brakes” if no input is detected in 2-3 seconds. While an emergency stop isn’t always the right choice, it probably beats leaving a several-ton metal object hurtling along uncontrolled in nearly every circumstance.
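To make that “hand it to the human, then stop if nobody responds” idea concrete, here’s a purely hypothetical sketch. The vehicle API and the grace period are invented for illustration and have nothing to do with how Tesla’s software actually works.

```python
import time

TAKEOVER_WINDOW_S = 2.5  # hypothetical grace period for the driver to respond

def on_low_confidence(vehicle):
    """Fallback described above: alert the driver, wait briefly for input,
    and if nothing comes, do a controlled emergency stop rather than
    silently dropping control. `vehicle` is a made-up placeholder object."""
    vehicle.alert_driver("TAKE OVER NOW")
    deadline = time.monotonic() + TAKEOVER_WINDOW_S
    while time.monotonic() < deadline:
        if vehicle.driver_input_detected():  # steering or pedal input seen
            vehicle.hand_over_control()      # human is engaged; disengage assist
            return
        time.sleep(0.05)
    vehicle.emergency_stop()                 # no response: brake to a stop
```

Whether a hard stop is the right move in every situation is debatable, but at least it’s a defined behavior instead of an uncontrolled one.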
That actually sounds like a reasonable response.
If you give the driver enough time to act, which Tesla doesn’t. They turn it off a second before impact and then claim it wasn’t in self-driving mode.
Not even a second; it’s sometimes less than 250-300 ms. If I hadn’t already been anticipating it failing and disengaging as it went through the 2-lane-wide turn, I would have gone straight into oncoming traffic.
Yeah but I googled it after making that comment, and it was sometimes less than one second before impact: https://futurism.com/tesla-nhtsa-autopilot-report
That sounds a lot more like a rumor to me… it would be extremely suspicious and would leave them open to GIGANTIC liability issues.
It’s been well documented. It lets them say in their statistics that the owner was in control of the car during the crash
Since the story has 3 separate incidents where “the driver let their Tesla turn left onto some railroad tracks” I’m going to posit:
Teslas on self-drive mode will turn left onto railroad tracks unless forcibly restrained.
Prove me wrong, Tesla
The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems less than half the time. Floor mats jamming the pedal, drivers mixing up the gas/brake pedals in panic, downright lying to evade a speeding ticket, etc. were the cause in many cases.
Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don’t like a company? I don’t think so. But if Tesla has proof FSD was off, we’ll know in a minute when they invade the driver’s privacy and release the driving events.
Tesla has constantly lied about their FSD for a decade. We don’t trust them because they are untrustworthy, not because we don’t like them.
I have no sources for this, so take it with a grain of salt… But I’ve heard that Tesla turns off self-driving just before an accident so they can say it was the driver’s fault. Now in this case, if it was on while it drove onto the tracks, I would think that would prove it’s Tesla’s faulty self-driving plus human error for not correcting it. Either way, it would be partly Tesla’s fault if it was on at the time.
They supposedly also have a threshold, like ten seconds: if FSD cuts out less than that threshold before the accident, it’s still counted as FSD’s fault.
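Assuming a rule like that exists (the “ten seconds” is just the number floated above, so don’t quote it), the attribution logic would be trivial to express:

```python
ATTRIBUTION_WINDOW_S = 10.0  # the threshold mentioned above; purely illustrative

def attribute_crash_to_fsd(seconds_disengaged_before_impact):
    """Hypothetical counting rule: a crash counts against the driver-assist
    system if it was still engaged at impact (None) or disengaged less than
    the window before impact."""
    if seconds_disengaged_before_impact is None:
        return True
    return seconds_disengaged_before_impact <= ATTRIBUTION_WINDOW_S
```

Under a rule like that, shutting the system off a fraction of a second before impact wouldn’t move the crash out of the self-driving column.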
How is a manufacturer going to be held responsible for their flaws when Musk DOGE’d the agency investigating his companies?
The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems less than half the time. Floor mats jamming the pedal, drivers mixing up the gas/brake pedals in panic, downright lying to evade a speeding ticket, etc. were the cause in many cases.
I owned an FJ80 Land Cruiser when that happened. I printed up a couple stickers for myself, and for a buddy who owned a Tacoma, that said “I’m not speeding, my pedal’s stuck!” (yes I’m aware the FJ80 was slow as dogshit, that didn’t stop me from speeding).
I mean… I have seen some REALLY REALLY stupid drivers, so I could totally see multiple people thinking they found a shortcut, or not realizing the road they’re supposed to be on is 20 feet to the left and that there’s a reason their phone is losing its shit, all while their suspension is getting destroyed.
But yeah. It is the standard Tesla corporate MO. They detect a dangerous situation and disable all the “self driving”. Obviously because it’s up to the driver to handle it, and not because they want the legal protection of saying it wasn’t their fault.
At my local commuter rail station the entrance to the parking lot is immediately next to the track. It’s easily within the margin of error for GPS, and if you’re only focusing immediately in front of you, the pavement at the entrance probably looks similar.
There are plenty of cues, so a driver shouldn’t be fooled, but perhaps FSD wouldn’t pay attention to them since it’s a bit of an outlier.
That being said, I almost got my Subaru stuck once because an ATV trail looked like the dirt road to a campsite from the GPS, and I missed any cues there may have been
You uh… don’t need to tell people stuff like that.
Maybe I’m missing something, but isn’t it trivial to take it out of their bullshit dangerous “FSD” mode and take control? How does a car go approximately 40-50 feet down the tracks without the driver noticing and stopping it?
Yes.
You hit the brake.
Ideally you hit the brakes before buying the Tesla.
It simply saw a superior technology and decided to attack.
That … tracks
If only there was a way to avoid the place where trains drive.
I checked first. They didn’t make a turn into a crossing; it turned onto the tracks. Jalopnik says there’s no official statement that it was actually driving under FSD(elusion), but if it was strictly under human driving (or FSD turned itself off after driving off), I guarantee Tesla will invade the driver’s privacy and slander them by the next day for the sake of the court of public opinion.
Deregulation, ain’t it great.
Teslas have a problem with the lefts.
How the fuck do you let any level 2 system go 40 to 50 fucking feet down the railroad tracks.
Were they asleep?
Damn. I hope the train is ok
Tesla’s self-driving is pretty shite, but they seem to have a particular problem with railway crossings, as also pointed out in the article. Of all the obstacles for the self-driving system to fail to detect, several thousand tons of moving steel probably makes for one of the worst outcomes.
Where’s the video?
What a cool and futuristic car. It’s all computer!
I’m still waiting for Elon’s car to drive onto train tracks.
You could not pay me to drive a Tesla.
Teslas do still have steering wheels, after all
You don’t say!
Can’t wait to hop in a Robotaxi! /s
What’s that? They’ll have human drivers in them? Still maybe no.
Working as expected then.