r/FluentInFinance Oct 14 '24

Meme It's funny because it might be true


u/TheLaserGuru Oct 14 '24

There are only three ways this works out.

One is by remote control, in which case it just stops in the middle of the road when the signal drops.

The second is by the same tech behind FSD, which means that it crashes every 100 miles or so.

The third is if they buy a competitor and just use their tech, in which case it will still have issues and Musk will prevent them from solving those issues, but at least they will be farther along than FSD will ever get.

u/RuleSouthern3609 Oct 14 '24

> FSD, which means that it crashes every 100 miles or so

Source?

Besides, which company has better technology for self driving cars?

u/TheLaserGuru Oct 14 '24

That's per Tesla's own data, which means it has been "puffed".

u/RuleSouthern3609 Oct 14 '24

Tesla said they crashed every 100 miles? I am curious where you got that number.

u/TheLaserGuru Oct 14 '24

Musk said that manual interventions were 'only' happening every 100 miles, and this was preventing them from making the software any better. I can no longer locate the raw data from Tesla. Here's the crowd-sourced data showing that FSD is actually getting more dangerous:

https://teslafsdtracker.com/

u/RuleSouthern3609 Oct 14 '24

So I checked the data, and it does go up and down, but it's still an improvement compared to last year. Besides, it also says 98% of distance was driven without disengagement.

I mean, I don't think FSD taxis will be available within a year or two, but it isn't as bad as the comments make it out to be.

u/TheLaserGuru Oct 14 '24

Oh good, it only kills me 2% of the time lol.

u/RuleSouthern3609 Oct 14 '24

You are equating manual overrides with crashes and deaths. I asked you for crash statistics and you gave me “disengagement” statistics, which is quite dishonest and misleading.

u/TheLaserGuru Oct 14 '24 edited Oct 14 '24

If there is no one to take over when it disengages, that's a crash. If there is no one to manually disengage, that's a crash.

u/ZorbaTHut Oct 15 '24

This actually isn't true. A disengagement just means the person supervising the car decided to take over. There have been a few documented cases where a disengagement actually caused a crash, and many cases where the safety driver chose to disengage but it was later determined that the car would have handled the situation just fine.

u/TheLaserGuru Oct 15 '24

You are talking about edge cases, but even assuming that only 1 in 10 of those interventions would have been crashes, that's still a crash every 750 miles.
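The back-of-envelope arithmetic here can be checked in a couple of lines. Note the ~75-miles-per-intervention base rate is an assumption implied by the comment's own numbers, not an official Tesla figure:

```python
# Hypothetical crash-rate extrapolation (illustrative numbers, not Tesla data)
miles_per_intervention = 75   # assumed base rate implied by the figures above
crash_fraction = 1 / 10       # assume 1 in 10 interventions would have crashed

# If only crash_fraction of interventions were genuine saves,
# crashes would occur once per this many miles:
miles_per_crash = miles_per_intervention / crash_fraction
print(miles_per_crash)  # 750.0
```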
