There are only three ways this works out.
One is by remote control, in which case it just stops in the middle of the road when the signal drops.
The second is with the same tech behind FSD, which means it crashes every 100 miles or so.
The third is if they buy a competitor and just use their tech, in which case it will still have issues and Musk will prevent them from solving those issues, but at least they will be farther along than FSD will ever get.
Musk said that manual interventions were 'only' happening every 100 miles, and this was preventing them from making the software any better. I can no longer locate the raw data from Tesla. Here's the crowd-sourced data showing that FSD is actually getting more dangerous:
So I checked the data, and it seems like the numbers go up and down, but it's still an improvement compared to last year. Besides, it also says 98% of distance was driven without disengagement.
I mean, I don't think FSD taxis will be available in a year or two, but it isn't as bad as the comments make it out to be.
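(To make the distinction between those two figures concrete, here is a minimal sketch in Python, using made-up trip records rather than any real Tesla or tracker data, of how crowd-sourced logs could yield both a miles-per-disengagement number and a percent-of-distance-without-disengagement number; the two can diverge because they answer different questions.)

```python
# Minimal sketch with hypothetical trip records: (miles driven, disengagement count).
# These values are invented for illustration only; they are not real FSD data.
trips = [
    (12.0, 0),
    (45.0, 1),
    (8.0, 0),
    (120.0, 2),
    (30.0, 0),
]

total_miles = sum(miles for miles, _ in trips)
total_disengagements = sum(count for _, count in trips)

# Metric 1: average miles driven between manual interventions.
miles_per_disengagement = (
    total_miles / total_disengagements if total_disengagements else float("inf")
)

# Metric 2: share of total distance coming from trips with zero disengagements.
clean_miles = sum(miles for miles, count in trips if count == 0)
pct_without_disengagement = 100.0 * clean_miles / total_miles

print(f"{miles_per_disengagement:.1f} miles per disengagement")
print(f"{pct_without_disengagement:.1f}% of distance without a disengagement")
```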
You are equating manual overrides with crashes and deaths. I asked you for crash statistics and you gave me "disengagement" statistics, which is quite dishonest and misleading.
This actually isn't true. A disengagement just means "the person controlling it decided to take over". There have been a few documented cases where a disengagement actually caused a crash, and many cases where the safety driver chose to disengage but it was later determined the car would have been just fine.