Last night a woman was struck by an autonomous Uber vehicle in Tempe, Arizona. She later died of her injuries in the hospital.
The deadly collision—reported by ABC15 and later confirmed to Gizmodo by Uber and Tempe police—took place around 10PM at the intersection of Mill Avenue and Curry Road, both of which are multi-lane roads. Autonomous vehicle developers often test drive at night, during storms, and in other challenging conditions to help their vehicles learn to navigate in a variety of environments.
Think about that last sentence for a moment. The implication is that these things aren't very good in a variety of environments.
I'm not blaming the car software, at least until we know more than we do now. But as someone who has made a career out of the unanticipated consequences (or purely lousy coding) that software people produce, I'm really unimpressed with the safety claims of the marketroids here.
The people who have built these devices simply do not know as much as they think they do. After all, there are known things, unknown things, and things that we do not even know exist. The designers clearly understand that there are unknown things, which is why they test in (ahem) "challenging conditions". The more honest among them might even admit that this testing could reveal unknown unknowns.
But not the marketroids. And today a woman is dead.
It will be a cold day indeed before I ever get into one of those things. I understand too much about how high tech products are created, and how they fail. And about the kind of things that programmers don't know.
8 comments:
The hubris of the folks promoting these cars never ceases to amaze me. If the squares in your grid were scaled to relative sizes, I see that black square as bigger than all the others, with top left being the smallest.
The family must have to put up an electric fence to keep the lawyers out.
I think this whole thing is going to be killed off once enough lawsuits start.
SiGraybeard, I see the black square as bigger than the other three combined.
And as to the lawsuits, you bet.
Been a software engineer all my adult life, and I can tell you without a doubt there is no such thing as bug-free software. When lives are in the balance, I can never trust a machine to do a man's work; the programs are only as good as the people who wrote them.
These "self driving" cars will come to be looked upon as "It Seemed Like A Good Idea At The Time"......
Time to ban all fully automatic high capacity vehicles.
There is a large-scale autonomous test track being built next to Willow Run Airport, just a couple of miles from my home.
They've created an overpass, a tunnel, and a roundabout.
Why are these things on the street?
Currently being lost in the story is the fact that there was a "Safety Driver" in the car at the time of the accident. Investigators have not yet said whether the "Safety Driver" was impaired or distracted (the question being: if the car is doing the driving, what are you doing, Safety Driver? Drinking your adult beverage? Smoking your medicinal herb? Playing World of Basement Dweller Warrior XXVII on your phone?)
While being inside these "autonomous vehicles" might not be safe, being outside doesn't seem much better.
Watch the flow of the investigation change when reps from Uber meet with the politicians from Tempe.
And just wait till Wally World turns their robo-trucks loose on the highways!
The robo-trucks are what's on my mind, too.
I need to go back and look at what's going on in the world of ag, where I know the technology has been tested (there's no avoiding a 20-hour workday in ag, sometimes).
Apparently the woman who was hit came out from between two parked cars and was not at a crosswalk. Neither the driverless car nor the "safety driver" was cited, and the pedestrian was found to be at fault in this instance.
I expect that long-haul double and triple trailers will be the first to be "automated" ... or at least semi-automated, with a ride-along safety driver who is allowed to sleep through Kansas, Nebraska, Texas, New Mexico, etc.