Discussion on: Is it Ethical to Work on the Tesla Autopilot Software?

BTernaryTau

I notice that for part 3, the linked article cites the weaker of the two sets of data I've seen used to defend Autopilot. The stronger set of data deals with all crashes for a larger sample size and compares crash rates from before and after the system is installed.

Blaine Osepchuk

Interesting. Thanks for sharing.

I'm not trying to bash Tesla, but the article doesn't actually contain any data (just the 40% number). How many miles were driven with Autopilot and without? How many crashes in each mode? Which versions of Autopilot? Weather factors? Time-of-day factors? Other factors?
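
Just to be concrete about what I mean, the bare minimum would be a crash rate per mile in each mode. Here's a rough sketch of that calculation; every number in it is a made-up placeholder, not Tesla data:

```python
# Rough sketch of the per-mile comparison the missing data would support.
# All inputs below are hypothetical placeholders, not real Tesla figures.

def crash_rate_per_million_miles(crashes: int, miles: float) -> float:
    """Crashes per million miles driven."""
    return crashes / (miles / 1_000_000)

# Hypothetical inputs: crashes and miles with Autopilot engaged vs. manual driving.
autopilot_rate = crash_rate_per_million_miles(crashes=130, miles=320_000_000)
manual_rate = crash_rate_per_million_miles(crashes=210, miles=290_000_000)

print(f"Autopilot: {autopilot_rate:.2f} crashes per million miles")
print(f"Manual:    {manual_rate:.2f} crashes per million miles")
print(f"Relative change: {(autopilot_rate / manual_rate - 1) * 100:+.1f}%")
```

And even that ratio would be misleading without controlling for the Autopilot version, weather, and time-of-day factors above.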

I really hope self-driving cars do in fact save lives. Everyone knows someone who has been injured or killed in a car accident, and it's just terrible. If we can use tech to prevent even some of those deaths, that would be awesome.

BTernaryTau

Unfortunately, it appears the data itself is not public, which is now really annoying me.

Despite being a Tesla supporter, I do agree with many of your concerns. It's important to minimize both type I and type II errors, and it is problematic that companies working on the technology are incentivized to downplay the former and emphasize the latter.

Blaine Osepchuk

I agree.