Thursday, May 3, 2012

In theory, there is no difference between Theory and Practice

In practice, this is not the case.

Differ writes about the latest on the Air France 447 crash:
Now in the Airbus we have the Pilot directing the FMS [Flight Management System computer - Borepatch] which directs the autopilot which is actually a flight control computer that has overriding authority in several areas of the flight regime. Who or what is really flying the plane?  Even when he takes manual control, the pilot is not really flying the plane, only telling the FCCs what he wants to do, and the FCCs can respond with different behavior depending on the situation.  If he doesn't understand the situation he will expect a different response than what the FCCs command.
This is actually a very French approach to the problem: experts (almost certainly from the elite Ecole Polytechnique) design a delightfully complicated system to theoretically eliminate (well, reduce) the possibility of human error.  The system becomes undelightfully complex as it is exposed to the Real World, with all sorts of bolt-on software modifications added after planes flew into Greek mountains and the like.  After five years, even the original designers don't understand how the system really works, assuming that they ever did in the first place.

But the political establishment circles the wagons, because we can't have a bunch of Ecole Polytechnique types losing business to American Cowboys.  Differ again:
I read a report in the Daily Telegraph today which covered some of the latest findings in the investigation into the Air France A330 crash in 2009.  Nothing especially startling and the article is pitched at non-aviators, but quite interesting all the same.

http://www.telegraph.co.uk/technology/9231855/Air-France-Flight-447-Damn-it-were-going-to-crash.html

The loudly spoken narrative is that all Airbus pilots love their planes and think they're the best thing since sliced bread; however, as with the political narrative in the MSM, there's more than an undercurrent of dissension.
Because in Practice, there is a difference between Theory and Practice.  I've spent most of my career in a field that wouldn't exist if software architects could design products without gross errors and software engineers could implement those designs without even more gross errors.

Any by "gross", I don't mean any disrespect to the people involved.  What I mean is people will die because of the errors.

Speaking professionally, there is simply an irreducible number of software bugs per thousand lines of code.  So sorry, that's our reality.  It's also a truism that a software architecture is like a battle plan - it never survives its first encounter with the enemy (in the case of software, that would be the end users).  Again, that is our reality.

The idea that a design can be intuited by Really Smart Elite Engineers™ and then implemented by less smart, less elite engineers is very European.  It's also very wrong, and it's why I really don't like flying on Airbus.  If everything goes to plan, it's not a problem.  If something unforeseen comes up, well, the designers just assume that everything will go to plan.
Dr. Evil: Scott, I want you to meet daddy's nemesis, Austin Powers.

Scott: What? Are you feeding him? Why don't you just kill him?

Dr. Evil: I have an even better idea. I'm going to place him in an easily escapable situation involving an overly elaborate and exotic death and just assume it all goes to plan.
And it's all fun and games until the pilot inadvertently has selected the wrong flight mode and really really really needs the engines spooling up to 110% rated power to miss the ridge top, and the FMS overrules him to keep everything within tolerance.
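To make the failure mode concrete, here's a toy sketch in Python of what "envelope protection" logic boils down to. The names are hypothetical and this is nothing like real avionics code; it just shows how a clamp that's sensible in normal flight quietly eats the margin the pilot needs in an emergency:

```python
# Toy sketch of envelope-protection logic (hypothetical names, not real
# avionics): the flight computer clamps the pilot's throttle command to
# what it considers the safe limit, even when the pilot genuinely needs more.

def fcc_throttle(pilot_command: float, max_rated: float = 1.0) -> float:
    """Return the throttle the engines actually get: the pilot's request,
    capped at the computer's rated limit."""
    return min(pilot_command, max_rated)

# Pilot firewalls the throttles asking for 110% to clear the ridge;
# the computer quietly hands the engines 100%.
fcc_throttle(1.10)   # -> 1.0
fcc_throttle(0.65)   # -> 0.65 (normal flight: no difference, no complaints)
```

In "normal" situations the clamp is invisible, which is exactly why nobody notices the tradeoff until the one flight where the extra 10% was the difference.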

The typically American solution, of course, is to let the pilot actually fly the plane.  Sure, he will make mistakes.  The approach we take is to have a co-pilot, who also may make mistakes.  Designs are about tradeoffs.  Our way is to err on the side of death via (highly trained) human error because it does better in emergency situations.  The French approach is to err on the side of death via computer error because it does better in "normal" situations.

Me, I know about software bugs.  I'll take the pilot who's allowed to fly.

13 comments:

Bob said...

Besides, you can pin an Ordre national de la Légion d'honneur on a Sully Sullenberger, but not on a piece of computer software.

Dave H said...

If everything goes to plan, it's not a problem.

This is my beef with code written by freshly minted CS graduates. They can design and implement wonderfully capable systems, but the first time a user pushes on the knob that says "Pull" it all goes down in flames.

That's fine for a paint program, but not for power plant controls.

After a while in the trenches, if you're paying attention you learn some of the dumb things users can do. If you're really good you start asking yourself what will happen if the user does them right here.
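Dave H's "knob that says Pull" is the classic argument for defensive input handling. A minimal sketch (hypothetical function names, not from any real power plant) of the difference between the fresh-graduate version and the trench-hardened version:

```python
# Minimal defensive-input sketch (hypothetical names). The naive version
# trusts the caller; the defensive one expects users to push on the knob
# that says "Pull".

def set_power_naive(percent: int) -> int:
    # Freshly minted version: assumes input is always sane.
    return percent

def set_power_defensive(percent) -> int:
    # Trench version: coerce, clamp, and fail safe on garbage.
    try:
        value = int(percent)
    except (TypeError, ValueError):
        return 0          # fail safe, not down in flames
    return max(0, min(100, value))

set_power_defensive("110")   # string, out of range -> 100
set_power_defensive(None)    # garbage -> 0, plant stays up
```

For a paint program, crashing on bad input is an annoyance; for power plant controls, the defensive version is the only acceptable one.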

Ed Skinner said...

In all forms of engineering, whether software, mechanical, electrical, etc., designers can screw up. Wires vibrate and bang against bare metal and, eventually, the insulation breaks down and the wire shorts out. *Any* engineering discipline can eventually fall short of perfection.

Having said that, let me now add that I teach some of those software engineers at some of those companies that are building avionics. Computers make it really, really easy to achieve levels of complexity, with layer upon layer upon layer, so much so that, hell, it's anybody's guess what will happen when the unexpected crops up.

Ultimately, the computer *must* let the pilot fly the plane because, in extreme situations, the pilot wants to survive whereas the computer just doesn't understand the concept of life.

Faced with circumstances never seen before, pilots can be extremely creative -- IF GIVEN THE OPPORTUNITY.

Of course, not all will succeed.

But you gotta give them a chance to fly the frickin' airplane!

(When I travel, I strongly prefer Boeing over Airbus because, while both are heavily automated, the former understands who has the best motivation to survive, and it ain't the computer.)

Dave H said...

who has the best motivation to survive, and it ain't the computer.

Now there's an urban myth in the making. A while back the Mythbusters addressed the idea that the crash position jetliner passengers are instructed to use will actually cause death from a severed spine in the case of an impact. The claim is that an airline pays less if a passenger dies than if they survive but are maimed or crippled. (Figures the Mythbusters uncovered show this is likely. I think on average airlines paid out $1.5 million per dead passenger versus $5 or $6 million for crippled ones.)

Faced with those kinds of economics, what airline wouldn't want HAL 9000 flying the plane?

NotClauswitz said...

I'm really glad that United only flies Boeings to Hawaii!

ScottH said...

Your approach is a North American thing (ex: Vimy Ridge); Maetenloch wrote something about it last night:

http://ace.mu.nu/archives/328938.php

I like the Demotivational poster!

Goober said...

I nearly wrecked a pickup truck one time because it tried to outsmart me. The new traction/stability control devices suck if you aren't expecting them to be there.

Driving on a closed gravel road, way too fast because I was enjoying myself a lot. Pitched into a corner that I tried to square off, and really needed horsepower to the ground to make that work - driving hard on gravel requires you to oversteer as you enter a corner, then steer out of the oversteer as you apply massive amounts of wheel-spinning horsepower to the ground beneath your rear wheels, ensuring that you "drift" around the corner as planned.

The traction control sensed the rear wheels spinning out, and de-rated the horsepower to the rear wheels to keep them from spinning. Anyone who knows how to drive hard on gravel knows that this is catastrophic. If you get in trouble on gravel, the old saw of "when in doubt, throttle it out" is absolutely true. The truck, however, tried to outsmart me and cut the throttle, and the truck pushed through the corner wide and nearly went off the road.

Okay, it did go off the road, but I managed to recover, all the while swearing my ass off at the truck's computer for doing this to me.

I can only imagine this same situation in an airplane.

Rev. Paul said...

Brigid told me some time ago about Airbus, including the whole tail-becomes-delaminated-in-flight thing. We won't go anywhere near one of those, and you've provided even more reason.

Goober said...

Okay, so I just read the article again, and I have several questions that I feel should be answered by reviewing authorities:

1.) I am not a pilot. I've never flown a plane. That being said, even I know that if you lose airspeed and your stall warnings start to shriek at you, you throttle up and level out your flight (or even descend if you have altitude, which they did at 38,000 feet!). Why a man with nearly 3,000 hours of flight time did not understand this, I am at a loss to explain. Why he kept pulling back on the lever to climb when he was at 38,000 feet, we will probably never know.

2.) Why, in the situation of an emergency, was the junior guy flying the plane? They had a pilot on break, I understand that, but you've got a guy with 7,000 hours and a guy with 3,000 hours, and when the shit hits the fan, you let the less experienced guy fly? And worse, you don't actively check to see what the hell he's up to?

3.) Even if the side-stick feedback is not evident, the second guy had to know what was happening. If you apply throttle, and you keep getting stall warnings, the only option left is that you are climbing and need to do something about that.

4.) These guys needed to trust their instruments. The horizon indicator said that they were climbing. That is an instrument that is on every single plane ever built. It is flying 101 - airspeed, flight level indicator, and altitude. Why the hell weren't they checking that?

I don't really see this as a failure of the computer systems. They did exactly what they were supposed to do - they told the pilot all along that he was doing something stupid, and he overrode the computers and kept doing something stupid. Yeah, they probably would have caught it on a Boeing plane, but I don't blame the plane. There was a series of shitty decisions made by these guys that even I, a total non-pilot, know better than to make.

Jake (formerly Riposte3) said...

Goober: You're pretty much exactly right - it sounds like there was plenty of stupid to go around in that cockpit. I have to wonder why, when the warnings persisted, the more senior pilot didn't take charge. And you're also right that it would seem nobody bothered to look at the actual instruments. Even with the airspeed indicator buggered they should have still had the artificial horizon to tell them that they were nose up, the engine instruments to tell them that the engines were working and what they were set for, the vertical speed indicator to tell them they were descending (despite the plane's nose up attitude), and the altimeter (which would tell them both that they were descending and that they had altitude to work with). Nobody paid attention to the most basic and most important instruments until it was too late.

I also have to wonder why any pilot with 3,000 hours of experience would pull back on the stick when he's getting a stall warning. That's the exact opposite of what you should do. The lack of feedback to tell one pilot what the other was doing was just the final nail in the coffin built by stupidity.

I will say one thing, though: basic inputs, like pushing the throttles to full power, should override the computer and give you full power, every time. Things like "autothrust" should either move the controls they're linked to, or be completely separate and automatically overridden by manual input. And whether it's a sidestick or column, if the guy in the left seat moves his, the one for the right seat should move with it.
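Jake's rule - manual input always wins - is easy to state in code. A toy sketch (hypothetical names, not real Airbus or Boeing software) of that priority scheme, the mirror image of the envelope-protection clamp:

```python
# Toy priority scheme (hypothetical names): when a manual input is present,
# it always overrides the automation's setting - the opposite of a design
# where the computer clamps the pilot.

def effective_throttle(auto_setting: float, manual_input=None) -> float:
    """Return the throttle the engines get: the pilot's input if he's
    commanding one, else the autothrust setting."""
    if manual_input is not None:
        return manual_input   # the hand on the levers wins, every time
    return auto_setting

effective_throttle(0.65)        # autothrust flying -> 0.65
effective_throttle(0.65, 1.0)   # pilot firewalls the throttles -> 1.0
```

The design choice is one line: which side of the `if` gets priority. Everything else in the argument is about which failure mode you'd rather die from.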

Rabbit said...

I'd heard somewhere that Airbus used a particular OS for its FCC which was prone to 'deciding for itself' as well as pulling an occasional BSoD. Seems like a nice new Air France Airbus A320 flew into the treeline while the pilots were trying to climb out of a high-speed low pass for a beauty shot.

Found the video. Pilots trying to set attitude to climb out, computer overrides.

http://www.youtube.com/watch?v=_EM0hDchVlY

http://en.wikipedia.org/wiki/Air_France_Flight_296

Flying is an unnatural act for humans. I don't fly as much as I did, but when I do, it's on anything but Airbus.

Anonymous said...

Having created quite a bit of code myself, and fixed - or sometimes, "attempted to fix" - even more code written by others, my code philosophy has become "it's going to be broken somewhere, I just hope that somewhere isn't super critical before we find it." Usually it is, though, and we find it because something super critical just happened. Which is why I try to have as many poorly-trained users as possible thrash it before production release. Code Wizards usually treat it too gently.

There's a YouTube video showing cockpits during arrivals at a bunch of South American airports; I noticed how much control movement pilots used in Airbus planes, in both amplitude and frequency. It was a lot of both, which struck me as a concern: what's going on in the black boxes to require that much control movement?

When I learned to fly, way back when, I had drummed into my head that the aircraft is smarter than you: properly trimmed, removing one's hands and feet from the controls will quickly allow the aircraft to assume a normal or near-normal attitude. Meaning, stop the yanking and banking (attempting to over control) and it'll settle into pretty close to horizontal flight, and from there even a 100-hour idiot should be able to maintain a stable attitude. I'm wondering if the massive infusion of artificial smarts in aircraft has changed all that.