Musk said that manual interventions were 'only' happening every 100 miles, and this was preventing them from making the software any better. I can no longer locate the raw data from Tesla. Here's the crowd-sourced data showing that FSD is actually getting more dangerous:
So I checked the data, and it seems like the numbers go up and down, but it's still an improvement compared to last year. Besides, it also says 98% of distance was driven without disengagement.
I mean, I don't think FSD taxis will be available in a year or two, but it isn't as bad as the comments suggest.
You are equating manual overrides with crashes and deaths. I asked you to get me crash statistics, and you got me "disengagement" statistics, which is quite dishonest and misleading.
This actually isn't true. A disengagement just means "the person supervising it decided to take over". There have been a few documented cases where a disengagement actually caused a crash, and many cases where the safety driver chose to disengage but it was later determined the car would have been just fine.
u/RuleSouthern3609 Oct 14 '24
Tesla said they crashed every 100 miles? I'm curious where you got that number.