A 2025 Tesla Model 3 in Full-Self Driving mode drives off of a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media.
I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.
This illustrates the danger of relying on driver override to avoid accidents. If the driver has to be prepared to take control in an accident like this AT ALL TIMES, then the driver is required to be more engaged than they would be if they were just driving manually: they have to constantly anticipate not just what other hazards (drivers, pedestrians, …) might be doing, but also the ways their own vehicle may be trying to kill them.
Absolutely.
I’ve got a car with level 2 automation, and after using it for a few months, I can say that it works really well, but you still need to be engaged to drive the car.
What it is good at… Maintaining lanes, even in tricky situations with poor paint/markings. Maintaining speed and distance from the car in front of you.
What it is not good at… Tricky traffic, congestion, or sudden stops. Lane changes. Accounting for cars coming up behind you. Avoiding road hazards.
I use it mostly like an autopilot. The car takes some of the monotonous workload out of driving, which allows me to move my focus from driving the car to observing traffic, other drivers, and road conditions.
For no reason?
They are running proprietary software in the car, and people don’t even know what is happening in the background of it. Every electric car needs to be turned into an open source car so that the car cannot be tampered with, no surveillance, etc etc
Everyone should advocate for that, because the alternative is this situation with Tesla. And I know nobody wants this happening to other car manufacturers’ cars as well
Elon took the wheel because that person made a mean tweet about him
“Kill me” it said in a robotic voice that got slower, glitchier, and deeper as it drove off the road.
EXTERMINAAAAAATE!!!
I have visions of Elon sitting in his lair, stroking his cat, and using his laptop to cause this crash. /s
Why would you inflict that guy on a poor innocent kitty?
That tree cast shade on his brand.
It had to go.
The problem with automation is complacency. Especially in something that people already have a very hard time taking seriously, like driving, where cell phone distraction, conversations, or just zoning out are super common.
Why would someone be a passenger in a self-driving vehicle? Do they know that they are test subjects, part of a “car trial” (or whatever it should be called)? Self-driving is not reliable and not necessary. Too much money is invested in something that is low priority to have. There are perfectly fast and safe self-driving solutions already, like high-speed trains.
I have no idea, I guess they have a lot more confidence in self driving (ESPECIALLY Tesla) than I do.
It got the most recent update, and thought a tunnel was a wall.
… and a tree was a painting.
To be fair, that grey tree trunk looked a lot like a road
It’s fine, nothing at all wrong with using just camera vision for autonomous driving. Nothing wrong at all. So a few cars run off roads or don’t stop for pedestrians or drive off a cliff. So freaking what, that’s the price for progress my friend!
I’d like to think this is unnecessary but just in case here’s a /s for y’all.
GPS data predicted the road would go straight as far as the horizon. Camera said the tree or shadow was an unexpected 90 degree bend in the road. So the only rational move was to turn 90 degrees, obviously! No notes, no whammies, flawless
Why was the driver not paying attention, and why didn’t they just steer it back into the lane? It’s not called “Full Self Driving (Supervised)” for no reason. Hopefully Tesla gets all the telemetry and shares why it didn’t stay on the road, and also checks whether the driver was sleeping or distracted.
Watch the video. It happens insanely quickly. And on a straight road that should be no issue, so the person’s guard was down.
Watched it. Is there even any evidence that it was in FSD mode?
Can you prove that it wasn’t in self-driving mode? You’re being deliberately obtuse about a technology that has had several other reports of doing the same thing. You are defending a car that is a known POS.
The claim is that it was in FSD. Where’s the evidence?
The fact that many other Teslas have exhibited the behavior that is reported. Are you new to how facts work?
That’s not evidence lol. The claim is that FSD did this, but no evidence was provided to show that it was.
The owner of this Tesla could have also posted the internal cabin camera footage. Wonder why they didn’t…
removed by mod
That’s not how that works, New-account-with-negative-2500-karma. You supply evidence for your own claims, others can review the evidence.
The claim was that FSD did this, but no evidence was provided to say it did.
My claim is that there’s no evidence to show FSD was enabled.
“I’m confident that Save full self driving (SFSD) will be ready next year”
Don’t drive Tesla
Not really worth talking about unless the crash rate is higher than human average.
Imagine if people treated airbags that way XD
If Ford airbags just plain worked, and then Tesla airbags worked 999 times out of 1,000, would the correct answer be to say “well thems the breaks, there is no room for improvement, because dangerously flawed airbags are way safer than no airbags at all.”
Like, no. No, no, no. Cars get recalled for flaws that are SO MUCH less dangerous.
HAL 9000 had “Daisy Bell”!
Has Tesla been training their AI with the lumberjack song?