Yet another example came to light on Monday when a driver in North Brunswick, New Jersey wrecked his Tesla on a highway while the vehicle was in Autopilot mode. According to a report published by News 12 New Jersey, the driver said that the vehicle "got confused due to the lane markings" at a point where the driver could have stayed on the highway or taken an exit. The driver claims that Autopilot split the difference and went down "the middle", between the exit and staying on the highway.

Insty is skeptical, but I'm not. This is exactly the kind of situation you should suspect the software could handle badly: confusing input from signs or lane markers leading to a failure to navigate the car along a safe route. It's not a software bug; it's a gap in the algorithm used to control the car.
The car then drove off the road and collided with several objects before coming to a stop. The driver claims that he tried to regain control of the vehicle but that "it would not let him".
I'm not sure that this is solvable, either. The way software developers handle these "edge cases" is (a) ignore them if possible (I can't see Tesla being able to do that), or (b) write a special-case condition that covers the situation. The problem with the latter option is that there can be hundreds of these special cases that need to be coded in. That turns the software into a huge, bloated mass whose behavior nobody can really predict. Validation becomes really hard and QA testing becomes essentially impossible.
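To make that concrete, here's a toy sketch of what option (b) tends to turn into - this is not Tesla's code, just an illustration with made-up condition names:

```python
# Toy illustration only: invented conditions, not any real autopilot code.
# Every "edge case" that turns up in the field becomes another hand-written branch.

def choose_path(lane_markings, exit_detected, construction_zone, snow_covered):
    """Pick a steering target. Each elif below is a patch for one reported incident."""
    if construction_zone and lane_markings == "shifted":
        return "follow_temporary_markings"    # special case #1
    elif exit_detected and lane_markings == "diverging":
        return "commit_to_one_lane"           # special case #2: the exit-split scenario
    elif snow_covered:
        return "hand_control_back_to_driver"  # special case #3
    # ...hundreds more of these, each interacting with all the others...
    else:
        return "follow_lane_center"           # the "normal" path
```

Every new branch multiplies the number of combinations a tester would have to exercise, which is exactly why the validation problem gets out of hand.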
And this is without postulating software bugs - this is all trying to make the algorithm suck less. Of course, the more code you have to write, the more bugs you will have - remember that validation becomes really hard and testing well nigh impossible? You'll have an unknown number of potentially fatal bugs that you probably won't know about.
That won't change until we have a different type of computer (probably one not based on the von Neumann architecture). If you want to get really computer geeky (and I know that some of you do), automotive autopilot problems are almost certainly NP-hard. For non-computer geeks, that means if you want to code one of them, good luck - you're going to need it.
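For a back-of-the-envelope feel for why "NP-hard" is bad news in practice (my numbers, not anything from the article): the brute-force fallback for hard problems is to check every combination of choices, and that count explodes fast.

```python
# Back-of-the-envelope illustration (illustrative numbers, not from the article):
# if each of n upcoming decision points has just 2 options, exhaustively
# checking every combination grows as 2**n.

for n in (10, 20, 40, 60):
    print(f"{n} binary decisions -> {2**n:,} combinations to check")

# 60 decisions is about 1.2e18 combinations - far more than any real-time
# controller (or any test suite) can ever enumerate.
```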
The bottom line: I have absolutely no intention of ever trusting my life to software that's trying to solve an NP-hard problem. I know (and admire) many software developers, but this is flying too close to the sun. Someone's wings will melt.
And don't get me started on self-flying air taxis.
You may like this one too:
ReplyDeletehttps://davidhuntpe.wordpress.com/2018/12/01/on-becoming-a-technophobe/
I miss a bug and robots collide. Embarrassing but survivable.
Not so much autonomous vehicles.
And you may refuse to ride in one, but the other guy's might come right atcha.
They've built a large test track about a mile from my home here in Ypsi with overpasses, a fake tunnel, and an exit to Planet B (IIRC), but I never see vehicles move on it.
I've been expecting this kind of autopilot failure for a while; the fact that we haven't seen more of them might mean they're handling it better than I guessed.
Or else people have the sense to turn off autopilot when they see "construction ahead" signs or other indicators of lanes being shifted.
Insurance and liability will put a stop to this sooner or later anyway. Insurers will not want to pay if there was a software failure, so that leaves only the company (Tesla, in this case) liable for medical and property-damage costs. That single-point liability will prove far more costly than any potential profits (the ambulance chasers will make sure of that), and these programs will go back to the drawing board. Even the litigation to defend the software will likely prove too costly unless specific legal protections are enacted for self-driving cars (which will undoubtedly leave the customer holding the bag, because there's way too much money in insurance to let the .gov foist it off on them).
Ford runs this commercial for their cars with the auto-correction feature.
ReplyDeleteThe drivers portrayed should not even be behind the wheel.
Afraid of a motorcycle passing?
Can't back up and see in the mirror that you're about to remove it violently?
Get an Uber.
Autopilot on a Tesla is in beta. It's not intended for inattentive drivers. The software and hardware are not ready for full-time control.
If you can make an aerodynamic abortion like the V-22 Osprey fly, you can make a self-driving car. If you can make anti-ballistic-missile missiles that work, you can make not only self-driving cars, but autonomous attack drones a la Skynet.
ReplyDeleteYesterday I almost slid into a friggin wreck three cars ahead. Some fuggin chinaman who couldn't see over the dashboard slid across three lanes of traffic and half way up an embankment, taking out two as he went. I passed as he was sliding down and he took out more cars behind me. Quite frankly, self driving cars don't scare me half as much as vibrant, idiotic or drunk drivers.
The V-22 killed a lot of people in its development, too. My money is on the machines.
If we can't make self-driving trains, which literally run on rails, how do we think we can make self-driving cars?
Hubris all the way down.
And what was the nut that holds the steering wheel doing?