Last night a woman was struck by an autonomous Uber vehicle in Tempe, Arizona. She later died of her injuries in the hospital.

The deadly collision—reported by ABC15 and later confirmed to Gizmodo by Uber and Tempe police—took place around 10 PM at the intersection of Mill Avenue and Curry Road, both of which are multi-lane roads. Autonomous vehicle developers often test drive at night, during storms, and in other challenging conditions to help their vehicles learn to navigate in a variety of environments.

Think about that last sentence for a moment. The implication is that these things aren't very good in a variety of environments.
I'm not blaming the car's software, at least until we know more than we do now. But as someone who has made a career from the unanticipated consequences (or purely lousy coding) of the software people write, I'm really unimpressed with the safety claims of the marketroids here.
The people who have built these devices simply do not know as much as they think they do. After all, there are known things, unknown things, and things that we do not even know exist. The designers clearly understand that there are unknown things, which is why they test in (ahem) "challenging conditions". The more honest among them might even admit that this testing could reveal unknown unknowns.
But not the marketroids. And today a woman is dead.
It will be a cold day indeed before I ever get into one of those things. I understand too much about how high-tech products are created, and how they fail. And about the kinds of things that programmers don't know.