Thursday, August 25, 2016

Why self-driving cars are a lot further from practical use than we think

It turns out that this is a really, really hard problem:
Rosenband added that four-way junctions with no lights are still a nightmare for the robot cars. An example junction is California and Powell in San Francisco, which has the added bonus of two cable car lines going through it. Human motorists rely on eye contact to know when it's safe to go or just take the initiative and move first. A driver-less car gets stuck trying to safely nudge its way across the box. 
"At four-way stops, oftentimes cars arrive sorta at the same time and it's a coin flip for who goes first. We have to make it comfortable for the person in the car; you don’t want the vehicle to inch forward and then slam the brakes, and you also want to be courteous to other drivers," Rosenband explained.
This is a great overview of how hard it is for computers and sensors to recognize what is trivially easy for humans.  The article is full of examples of problems we overcome instantly and naturally, but which flummox the computer: a red balloon next to a green traffic light, a traffic light partially obscured by a bus, a traffic light with the setting sun right behind it blinding the sensor.

We handle this via common sense, but you can't program common sense.  They're trying, though:
You can teach a computer what an under-construction sign looks like so that when it sees one, it knows to drive around someone digging a hole in the road. But what happens when there is no sign, and one of the workers is directing traffic with their hands? What happens when a cop waves the car on to continue, or to slow down and stop? You'll have to train the car for that scenario. 
What happens when the computer sees a ball bouncing across a street – will it anticipate a child suddenly stepping out of nowhere and chasing after their toy into oncoming traffic? Only if you teach it.
And this is the heart of the problem: you have to define literally every possible failure condition and program each one into the software.  Even with machine learning, there are too many of them for that to be practical.  If you miss one and a car kills someone, the lawsuits will be enormous.
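To make that concrete, the hand-coded version of "common sense" ends up looking something like this - a deliberately crude sketch of my own, not anybody's real architecture, with scenario names I made up for illustration:

    # Crude sketch (mine, not anyone's real system) of why enumerating
    # scenarios by hand doesn't scale: every rule you write down implies
    # a thousand you didn't think of.
    KNOWN_SCENARIOS = {
        "construction_sign":   "slow down and route around the work zone",
        "worker_hand_signals": "obey the worker",        # added after someone thought of it
        "cop_waving_us_on":    "obey the cop",           # ditto
        "ball_in_street":      "brake; expect a child",  # ditto
    }

    def decide(scenario):
        # Every branch here exists because a human imagined the case in advance.
        if scenario in KNOWN_SCENARIOS:
            return KNOWN_SCENARIOS[scenario]
        # The case nobody imagined.  "Stop and wait" sounds safe, right up
        # until the car is parked dead in a live traffic lane.
        return "stop and wait for a human"

The default case at the bottom - the one nobody imagined - is where the lawsuits live.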

This is an outstanding article on the complexity that technologists are trying to bite off.  It never says so outright, but you get a real feel for how high they want to fly - perhaps so close to the sun that their wings will melt.

12 comments:

  1. Another problem I've read about is, you can't program a car to break the law, for obvious legal reasons. So if the self-driving car comes up to an accident, double-parker, tree-cutting crew, or debris blocking the traffic lane, it will just stop and sit there, since it's not programmed to cross the centerline.
    As stupid as people are, computers are even dumber.

  2. There is an easy solution. Just make sure the self-driving cars have large signs saying "Google" or "Microsoft" or "Apple" or "Facebook" and everyone will understand that normal laws do not apply to them, and that they had better get out of the way so their betters can proceed. And that car will teach that kid not to chase their toy. Permanently!

  3. I was always under the impression that at a 4-way stop the vehicle on the right goes first. But since you can't trust the other driver, you proceed with caution (and then it gets resolved with hand signals --- something a driverless car cannot participate in).

    Driverless cars might be OK on the interstate - as long as they are banned from the left lane. But you won't find me in one anytime soon... unless I can no longer drive safely (like my father... he needs a driverless car).

  4. If it comes to the point where driverless cars ALWAYS have the right of way, we are doomed.

  5. About all that I can foresee on limited-access highways is some form of truck convoy, with robotic trucks closely following one driven by a human. But even that would be too problematic on a surface street.

  6. Driverless truck convoy, following one driven by a human. Great! Now instead of two trucks about 100' long taking 5 minutes to pass on a highway, we've got two truck caravans 500' long taking 25 minutes to pass on a highway. That's an improvement, fer shure!

  7. Juvat wins the internets for today.

  8. They're just not thinking outside the box... see, the solution is to make sure that all children's toys and clothing have some kind of V2X wearable transponder in them, so the car can see the ball, or the kid's PJs. Then all the old toys and clothes need to be Outlawed. Problem solved!

    Interesting timing, in that it conflicts with Cringely's post today, which basically says that self-driving cars are really just a big government-industry subsidy boondoggle to get Everyone to Upgrade Soon. With massive government loans, of course. Unless you want to stay shut in.

    (He didn't say it quite that bluntly, but the implication is there. On the other hand, I don't mind saying it at all.)

  9. I look at it this way: we still have problems with aircraft autopilots getting confused and telling the pilot, "I don't know... you take it." Comparatively, autopilots are a simpler problem. It's relatively constant speed and direction. There's no kid on a bike standing on a cloud and darting out in front of the plane. No truck stopped in the air to make a delivery where the plane is supposed to be going. There are far more bad corner conditions for cars. I think truly autonomous cars are a long way away.

  10. Heck, I don't like watching out for BICYCLES, and they are driven by self-aware, thinky meat sacks.

  11. Just consider... you are a computer programmer. You are told to write a program to allow a ton of metal to proceed at speeds in excess of 50 mph down roads inhabited by (a) non-programmed vehicles and (b) un-programmable humans.
    How can you write a computer program to take cognisance of these infinite variables?
    Answer?
    You can't.
    No car I ever drive will ever attempt to think for me. I am better at hazard perception and, more importantly, anticipation, than any computer can ever be.

  12. Have they been working on the computer decision tree for roundabouts, where you have to yield to semis even though you would have the right-of-way? If it were a car instead of a semi, you would have the right-of-way. And box trucks do not have the right-of-way over you.
    That's kind of confusing until you look at the vehicle and make the decision.
    The semi thing is because in many roundabouts the semis will take up a lane and a half while going around.

