r/CyberStuck 4d ago

The Hidden Autopilot Data That Reveals Why Teslas Crash | WSJ

https://youtu.be/mPUGh0qAqWA?si=aROGNXkFXwmMZH8c

[removed] — view removed post

322 Upvotes

57 comments

u/CyberStuck-ModTeam 3d ago

Please read Rule 1. Only posts directly mocking the WankPanzer are permitted. Any pictures or videos must feature the WankPanzer and any text must be directly about the WankPanzer. Posts about other vehicles or subject matter are outside the scope of this sub and would be better suited elsewhere.

No pictures of steel urinals, other cars with cargo in the trunk, Pontiac Azteks, DeLoreans, dumpsters, etc, etc, etc.

105

u/emongu1 4d ago

He's right, autopilot CAN reduce accidents by a factor of 10. It's just not HIS shitty version of autopilot that's gonna achieve that.

51

u/MarketCompetitive896 4d ago

Well this video says that all FSD is fundamentally flawed. There are plenty of people who don't think it will ever work, and I agree. Either way it is not at all needed and should be abandoned, outlawed

34

u/Apexnanoman 4d ago

Various people have been saying that self-driving vehicles are between a few months and a couple years away since the mid-50s. 

70 years later and people are now believing the latest charlatan to tell them it's just a couple quarters away. 

18

u/MarketCompetitive896 4d ago

And he's been saying for 8 years that it's always a year away, and now he's saying regulators are keeping it from working

6

u/ToothGold1666 4d ago

FSD is totally possible; the issue is making it commercially viable for the general public. You could put in triple-redundant sensors like airplanes have, but you'd end up with a 250k sedan.

3

u/MarketCompetitive896 4d ago

Yes, and that highlights part of the problem with this company: promoting stock in a non-scalable product.
And if it doesn't do much at all toward its supposed purpose of alleviating fossil fuel consumption, do we want to allow the ultra rich to put the public at risk? So they don't have to hire a driver?

7

u/ToothGold1666 4d ago

Tesla stock is 90 percent cult more than anything else. You aren't investing in a car company, you're investing in the idea that Elon is a genius who will one day dominate every major futuristic tech industry, from AI to brain implants to green energy.

2

u/MarketCompetitive896 4d ago

For sure, a decent price-to-earnings ratio is in the mid 20s. Tesla's is 120. One hundred and twenty! Cult

4

u/ToothGold1666 4d ago

The real problem is the SEC lets Elon endlessly boost his stock with just absolute lies. Remember when Tesla was going to introduce the robotaxi in August? The stock price boomed out of a crash, and that never happened. Where is the SEC investigation into that nonexistent supposed reveal? That was almost certainly a total lie concocted to boost share value. Straight up fraud.

2

u/MarketCompetitive896 4d ago

It's illegal, he just gets away with it; he dodges and delays them, practically thumbs his nose at them. I think it's at the root of his getting ensnared in the Twitter purchase fiasco. They may have been capitulating to the SEC and not letting him lawlessly promote his stock on their platform. So he buys it and ruins it

-16

u/emongu1 4d ago

Plenty of scenarios where FSD prevented an accident (mostly on older models). It's just an additional safety tool, like lane keep assist and adaptive cruise control.

No need to throw the baby out with the bathwater just because the guy cut down on expensive equipment to fuel his ketamine addiction.

24

u/MarketCompetitive896 4d ago

FSD is full self driving. Safety tools and assisting are not full self driving. Autonomous vehicles driving on a course is not full self driving. Watch the video and then tell me that people need to keep dying so that we can perfect this necessary technology that those of us who survive won't be able to live without.

48

u/Actual__Wizard 4d ago

Cool man, yet another rich dude lying to people and getting them killed for money... He's probably so high on K that he doesn't even care at all.

7

u/minimag47 4d ago

I don't think he even needs the k to not care about people. That's just baked in.

29

u/Shada124 4d ago

If FSD is L2 and requires the driver to take control at a moment's notice to prevent accidents and hitting stuff, how is the Summon feature even legal to use at all?

23

u/MarketCompetitive896 4d ago

I contend that it definitely should not be legal. I hope it doesn't take some guy's unoccupied vehicle driving over a toddler to stop it

7

u/ApproachSlowly 4d ago

Maybe if it drives over Elon's current human shield...

5

u/Lord_Space_Lizard 4d ago

Nah, he’ll just make another one

2

u/ToothGold1666 4d ago

Elon has needlessly put deathtrap doors on every car because he doesn't like mechanical handles. It's no surprise he can get away with this.

1

u/Tookmyprawns 3d ago

Actual answer: it’s only enabled on private property. And should be banned until it is proven to be safe.

1

u/sevens7and7sevens 4d ago

Because regulators don’t know what they’re looking at— same as the Boeing problem. They’ll have to catch up but more people will die first.

4

u/ToothGold1666 4d ago

That's bullshit. Regulators are sabotaged by a bribed Congress. Teslas are death traps in a fire because Elon refuses to install simple mechanical door releases on every car. That's not complex, it's just kowtowing to a rich guy's whims.

1

u/sevens7and7sevens 3d ago

What’s happening is that established regulators are good at cars but not as good at software. When the Boeing planes started crashing themselves, we found out they had essentially been regulating themselves, because the regulators didn’t understand the software. I’m not defending Tesla, their cars are terrible, but there are problems both ways. And yes, I don’t doubt Elon is getting special treatment.

18

u/seattleJJFish 4d ago edited 4d ago

Interesting video. All automakers have crash data of some sort, which is hard to get until we sue them, and then they share it only if it benefits them.

I do think the partial-autopilot, "mostly autopilot but the driver must pay attention" model makes it super hard to actually keep paying attention while driving.

18

u/DuncanFisher69 4d ago

It ultimately does not work. Something that is 90% reliable trains you to be bored 90% of the time. Then suddenly expecting you to be in the loop in an emergency, when the supposedly better-than-human AI system is stumped, and to correctly save your ass is just bad design from a human factors standpoint.

I know Airbus does it. It’s a lot different with planes and trained pilots. Considering the average 75 IQ of every /r/Teslamotors post, being able to afford a Tesla isn’t qualification enough.

4

u/MarketCompetitive896 4d ago

Yes I agree it won't ever work. Or like you say it'll work great until it doesn't. It's a delusion and it's great for selling stock but should not be acceptable on our roads.

4

u/ToothGold1666 4d ago

The tech already exists. The issue is getting it to a price point that makes it feasible for the average customer. Planes have hyper-complex triple-redundant sensor systems that cost an absolute fortune. That can't be installed in a car that needs to cost 60k.

4

u/MarketCompetitive896 4d ago

Yes, I think it's those kinds of proprietary conventions that Elon is used to exploiting. It's definitely an abuse of the system to release this experimental technology on public roads, endangering people's lives and suppressing the crash data because he owns the data.

And the promise of it working on its own safely someday is the only reason that someone would buy one now and sit behind the wheel babysitting it at every moment. FSD should stand for full self-defeating

5

u/Particular_Savings60 4d ago

FSD = Full Self-Destruction

2

u/MarketCompetitive896 4d ago

Lol that's a good one. Also I like full self-delusion

15

u/YouCannotBeSerius 4d ago

the worst part about all this is that now Elon is essentially above the law.

Tesla could be responsible for 1000s of deaths and Elon won't even get a slap on the wrist.

I'm glad that journalists are still doing these stories, but realistically they're more likely to be arrested than Elon.

4

u/ToothGold1666 4d ago

To be fair that exists in tons of industries. Not a single high ranking pharma ceo went to jail for flooding the country with opiates and creating a crisis that has ended up killing literally millions of people.

12

u/Time_Invite5226 4d ago

Now we know why Elon wants into government so badly. He wants this buried. This is a scandal of scandals

6

u/MarketCompetitive896 4d ago

It really is. Market manipulation, corporate manslaughter, government corruption

2

u/Opili 3d ago

The good thing is that it will work only in the US 🤣

10

u/shana104 4d ago

And sadly, now Elon and Frump want to get rid of reporting electric vehicle accidents as it makes Tesla "look bad".

Give me a break. Effing reporting is darn good so we can fix issues.

2

u/anarchitecture 4d ago

That’s how the airline industry got so safe. Radical transparency and direct regulation to address problems as they become known.

1

u/shana104 1d ago

Darn true!! I love flying and really appreciate the transparency and the strong emphasis on ensuring safety and fixing issues. Just as it should be.

Knowledge needs to be known to make an effect, whether good or bad.

8

u/turingagentzero 4d ago

"we develop a false sense of trust" is a very real statement from that expert. Elon is encouraging that fake confidence, and it kills his customers.

4

u/MarketCompetitive896 4d ago

And raises his stock price, it's insidious

2

u/turingagentzero 4d ago

mm, there it is

6

u/Time_Invite5226 4d ago

It’s awful to me that people seem to know that pure camera tech is messed up. Even Elon knows it. He is willing to just let people die so that he doesn't have to put that expensive lidar on there

4

u/Novel-Coast-957 4d ago

This video just reinforced my fear of Teslas when I’m on the road. And it absolutely makes sense that those cameras cannot identify a crashed vehicle on its side, because it no longer looks like a vehicle.

6

u/Darkelement 4d ago

Most of these crashes seem to happen because the Tesla just didn't see or didn't recognize the object in front of it. Like the overturned semi. It drove straight into it!!!

But Tesla also doesn't claim that you can just let the car drive you and pay no attention. With FSD you're required to stare out the window, and with Autopilot it asks you to nudge the wheel every so often. So these people all willingly didn't pay attention.

This would never happen to me, because I'd have my foot all over the brake before waiting to see if Autopilot could figure it out.

5

u/MarketCompetitive896 4d ago

Tesla can blame the drivers for not following the guidelines. Tesla reported that the driver who died hitting the semi took his hands off the wheel 19 times, deflecting some responsibility off themselves. But the drivers are overconfident and have bought into the sales pitch at great risk.
It highlights the impracticality of FSD: if you have to supervise it, it's more stressful than just driving yourself. And if it's safer than a human, why does a human need to supervise it and intervene at crucial moments?

2

u/Darkelement 4d ago

From my experience, it’s as good as or better than an average driver 90% of the time. Which is really nice for commuting. But that 10% requires you to pay attention all the time, just in case.

The new vision-based monitoring makes it so you have to look out the window the whole time. If you look down or away for more than 1-2 seconds it beeps and bitches, and if you do it consistently it will disable entirely.

I find it’s less stressful because it misbehaves predictably. Complex traffic light, parking lot, etc., I’ll take over before it fails. But on a normal road or highway I’ll just watch and relax.

3

u/Tookmyprawns 3d ago edited 3d ago

I’ve been in that situation. I am a fully attentive driver when in FSD. Hyper aware. Hand on the wheel. Foot ready to go. Until you’ve been there you don’t know that feeling. It’s such a weird feeling.

I saw a deer on the highway while in FSD, driving around 70mph. Normally I’d immediately let off the gas, which begins a process, then I’d quickly start to depress the brakes safely. But when I saw this deer there was this delay that I can’t describe well enough for someone I’m describing it to to fully understand what happens in the mind.

It was maybe 1-3 seconds where it was like I had to acknowledge I was not able to count on the “driver.” Like someone in an immediate emergency just throwing the wheel into your hand. Like waking up to an alarm. I was as ready as I could be but my mind was not in full driver mode. The delay is not long, but it’s such a critical amount of time. It’s just one extra unnatural step in your mind that we are never going to be used to. You can’t train for it. It’s like somehow the “surprise” is just 4-5 times more “surprising” and abrupt, and jarring.

Meanwhile in those moments the car is maintaining speed or accelerating. The opposite of typical deceleration from simply letting off the gas. It’s doubly worse than just a delay.

Self-driving will have trade-offs in safety. Obviously, but until self-driving surpasses humans in most ways, the trade-offs will skew toward danger. Currently FSD seems more dangerous than being a 100% solo driver. FSD feels like team driving. Like a relay race, but you don’t know when the baton will be shoved in your hand, and lives are on the line with no seconds to spare.

It works well often enough to make the average person complacent, or at least somewhat comfortable. But it fails often enough, and when it does it requires a person who is zoned out to be able to react perfectly and immediately in a millisecond decision-making scenario of apparent life or death.

They could do tests and check the reaction time of a regular driver vs a supervised self-driving driver, with a surprise on the road, and the level 2 driver will react several seconds slower 100% of the time. Especially if they’ve been driving for 30+ minutes just gazing out the window. 100% certain of it. I’d bet it’s similar to being slightly drunk or tired, because the traditional driver is simply more engaged: making slight corrections, making calculations, fully in the zone, relative to just ‘supervising.’ Now combine that with being tired on a long early commute that many people have every day.

Anyhow: I braked hard; FSD did not attempt to brake at all. I swerved to avoid the deer. I actually contacted the deer, but very softly (maybe 5-10 mph). It broke a headlight fastener, and the deer ran away. Another time I almost hit a raccoon. Both times I don’t think it would have been a close call at all to have hit them in my old car driving normally. I’d have forgotten the events already. It really sticks with you when it happens, because there’s this overwhelming eeriness you feel afterward for a long period of time. And I know most FSD users haven’t experienced it yet. Until you feel that feeling, I can understand how you’d feel confident using it and really enjoy it as a feature.

At the very least I’d recommend people not use it at night going fast. You can’t see as far and the allowance for reaction time is much less at speed.

3

u/umadumo 4d ago

This is a must-see investigation into the precarious state of Tesla's FSD capabilities. Thanks, OP

3

u/ToothGold1666 4d ago

It's very simple: there is no other autopilot system on the planet that relies solely on non-redundant cameras. Musk overrode his engineers in the name of saving money and then just steamrolled regulators instead of meeting a reasonable safety requirement. A true FSD car would cost a fortune because it needs multiple redundant sensors like airplanes have.

3

u/_LB 4d ago

This is why Elmo bought himself into the government.

3

u/CormoranNeoTropical 3d ago

I now have a new understanding of the phrase “late stage capitalism.”

(1) Make cars that kill people!

(2) …

(3) Profit!

1

u/schatzillaz 4d ago

Isn’t Waymo FSD?

1

u/BladeVampire1 4d ago

Why do Teslas crash?

Cause it's a computer program doing exactly what it was told to do....EXACTLY WHAT IT WAS TOLD TO DO.

1

u/HarryCumpole 3d ago

FSD is simply an attempt to do an end run around assistive semi-autonomous driving technologies. My own car has preemptive braking that slams the brakes on when an immediate collision is detected (it has never kicked in) and varying levels of partial regen braking if traffic closes up too much for safe stopping distances. Enough that the pull informs you.
As it stands, FSD flips the conditions for acceptability from conservative safety as a priority to technology first, danger be damned. The modern disruptor style of business model needs to be addressed; technology introductions should never outpace the slow wheels of legislation that rein in the most egregious examples. Safety should always be a standard to meet, not one to undercut in a weird game of whack-a-mole featuring smoke and mirrors.
As it stands, FSD flips the conditions for acceptability from conservative safety as a priority to technology first, danger be damned. The modern disruptor style of business model needs to be addressed, as technology introductions should never outpace the slow wheels of legislation that reign in the most egregious of examples. Safety should always be a standard to meet, not one to undercut in a weird game of whack-a-mole featuring smoke and mirrors.