Friday, February 15, 2019

Self-Driving cars: unsafe at any speed

A Tesla on autopilot drove itself into a wreck. The failure mode is interesting:

Yet another example came to light on Monday when a driver in North Brunswick, New Jersey wrecked his Tesla on a highway while the vehicle was in Autopilot mode. According to a report published by News 12 New Jersey, the driver said that the vehicle "got confused due to the lane markings" at a point where the driver could have stayed on the highway or taken an exit. The driver claims that Autopilot split the difference and went down "the middle", between the exit and staying on the highway.

The car then drove off the road and collided with several objects before coming to a stop. The driver claims that he tried to regain control of the vehicle but that "it would not let him".

Insty is skeptical, but I'm not. This is exactly the kind of situation that you should suspect the software could handle badly: confusing input from signs or lane markers leading to a failure to navigate the car on a safe route. It's not a software bug, it's a gap in the algorithm used to control the car.
I'm not sure that this is solvable, either. The way software developers handle these "edge cases" is (a) ignore them if possible (I can't see Tesla being able to do that), or (b) write a special-case condition that covers the situation. The problem with the latter option is that there can be hundreds of these special cases that need to be coded in. That makes the software a huge, bloated mass whose behavior nobody can really predict. Validation becomes really hard and QA testing becomes essentially impossible.
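To make option (b) concrete, here's a minimal sketch of what that style of code turns into. Everything in it is hypothetical - the function name, the scenarios, and the thresholds are all invented for illustration, not taken from anybody's real Autopilot code - but the shape is the point: every real-world incident becomes another branch.

    # Hypothetical sketch of special-case lane logic - NOT real Autopilot code.
    # Each branch is a patch for one edge case discovered in the field.
    def choose_path(exit_ramp_ahead, marking_confidence, lane_count):
        if lane_count == 0:
            return "alert_driver_and_slow"      # patch: no visible markings
        if exit_ramp_ahead and marking_confidence < 0.9:
            return "hold_current_heading"       # patch: faded paint at forks
        if exit_ramp_ahead:
            return "follow_leftmost_marking"    # patch: don't split the gore point
        # ...hundreds more special cases accumulate here, one per incident...
        return "follow_lane_center"             # the default, and the hope

Every branch interacts with every other branch, which is why validation gets harder faster than the code grows.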
And this is without postulating software bugs - this is all just trying to make the algorithm suck less. Of course, the more code you have to write, the more bugs you will have - remember that validation becomes really hard and testing well-nigh impossible? You'll be shipping an unknown number of potentially fatal bugs, with no good way to find them.
That won't change until we have a different type of computer (probably one that is not based on the von Neumann architecture). If you want to get really computer geeky (and I know that some of you do), automotive autopilot problems are almost certainly NP-Hard. For non-computer geeks, that means that if you want to code one of these, good luck - you're going to need it.
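If you want a feel for why NP-Hard matters here, consider a toy version of the planning problem - the maneuver names and constraints below are invented for illustration: pick one maneuver per road segment so that no forbidden combination occurs. The only general-purpose approach anyone knows is some flavor of search, and brute-force search blows up exponentially:

    # Toy illustration of the combinatorial blowup in constraint-style planning.
    # Maneuvers and constraints are invented; the growth rate is the point.
    from itertools import product

    MANEUVERS = ["hold_lane", "take_exit", "brake", "change_left", "change_right"]

    def violates(plan, constraints):
        # A constraint is a set of (segment, maneuver) pairs that must not all
        # hold at once, e.g. taking the exit on segment 2 while changing left
        # on segment 3.
        return any(all(plan[seg] == m for seg, m in c) for c in constraints)

    def brute_force_plan(n_segments, constraints):
        # Tries every assignment: 5**n_segments candidate plans.
        # 10 segments is ~9.8 million plans; 20 segments is ~95 trillion.
        for plan in product(MANEUVERS, repeat=n_segments):
            if not violates(plan, constraints):
                return plan
        return None

This doesn't prove the real driving problem is NP-Hard - it just shows the kind of search space a planner faces, and why "add more special cases" doesn't scale.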
The bottom line: I have absolutely no intention of ever trusting my life to software that's trying to solve an NP-Hard problem. I know (and admire) many software developers, but this is flying too close to the sun. Someone's wings will melt.