Monday, June 11, 2018

Self Driving Car software is worse than you think

This is kind of jaw-dropping. The cars cannot avoid ramming stationary objects at high speed:
A natural reaction to these incidents is to assume that there must be something seriously wrong with Tesla's Autopilot system. After all, you might expect that avoiding collisions with large, stationary objects like fire engines and concrete lane dividers would be one of the most basic functions of a car's automatic emergency braking technology. 
But while there's obviously room for improvement, the reality is that the behavior of Tesla's driver assistance technology here isn't that different from that of competing systems from other carmakers. As surprising as it might seem, most of the driver-assistance systems on the roads today are simply not designed to prevent a crash in this kind of situation.
This is bizarre, and I strongly recommend you read the entire article. You would think that this would be a basic capability, but since the system was assembled from parts that evolved over time, this seems to have fallen through the cracks. It's highly doubtful that this is the only thing that can kill you that has fallen through the cracks.

Holy cow, what a mess.

6 comments:

Unknown said...

I've read the article and I understand that the assumptions the system is assembled under are believed to be reasonable by the manufacturers and their engineers. Sadly, the marketing departments have given the world the impression that this is full-blown self-driving and so the customers are treating them as if they were and are now starting to die as they pay the price.

As a computer programmer with over a quarter of a century writing code, I was distressed that in a couple of places I saw the word "ignore". Their assumptions about the mode of operation led them to feel that it is safe to ignore stationary objects at high speed, treating them as automatic false positives. A good computer programmer never ignores anything. Life delights in sending mangled data to your program, and the roughest thing your code will ever deal with is real data! These self-driving cars, that aren't, need to be taken off the road until the software and detectors have greatly increased in capability.
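[To make the "automatic false positive" point concrete: radar-based cruise systems report a target's speed relative to the car, and a common simplification is to discard targets whose ground speed is near zero, since overhead signs and roadside clutter also show up as stationary returns. The sketch below is a hypothetical simplification for illustration only, not any vendor's actual code; the function names and the 1 m/s threshold are invented.]

```python
def ground_speed(ego_speed_mps, relative_speed_mps):
    """Ground speed of a radar target, given our speed and its closing speed."""
    return ego_speed_mps + relative_speed_mps

def is_braking_target(ego_speed_mps, relative_speed_mps, threshold_mps=1.0):
    # Targets with ~zero ground speed (bridges, signs, stopped vehicles) are
    # filtered out as presumed clutter -- the "ignore" behavior in question.
    return abs(ground_speed(ego_speed_mps, relative_speed_mps)) > threshold_mps

# A lead car doing 8 m/s while we do 28 m/s (~100 km/h) is tracked:
is_braking_target(28.0, -28.0 + 8.0)   # True -> system will brake
# A fire engine stopped dead in our lane has ground speed 0:
is_braking_target(28.0, -28.0)         # False -> filtered out, no braking
```

The stopped fire engine and a highway overpass look identical to this filter, which is exactly why the burden falls back on the human driver.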

Beans said...

So. Still the best autopilot is the one located between one's ears. How totally... expected.

Sounds like in most respects airplane autopilots actually have fewer variables to worry about.

And the people that want self-driving cars are also the ones that want their house all internet-of-things and smart guns. The cult of non-think. Yay.

Old NFO said...

The next big thing, IMHO, will be the catastrophic failure of the self driving vehicle. There are simply too many variables for the current state of the art systems to handle quickly enough to actually be safe.

drjim said...

NFO.....define "catastrophic failure".

By my standards, these things have had SEVERAL "catastrophic failures" where people died.

The MSM just won't cover it because self-driving cars are one of their current darlings....

Eric Wilner said...

Catastrophic failure... hm.
I wonder what it would take to induce sensory overload?
Get a high enough density of self-driving cars, and irresponsible people might start experimenting with, e.g., T-shirt cannons loaded with chaff. Might make the nice, orderly automated commute suddenly exciting.

Ken said...

@Eric Wilner: Repent, Harlequin! Said the Ticktockman. :-)