Tuesday, November 25, 2008

Confirmation Bias, the media, and the Academy

People process information in a peculiar manner. When we're introduced to a new situation, we gather much more information than we do in a familiar one. It's sort of a heightened state of alert, as the mind gathers data to make sense of the situation. Once the mind has figured out a pattern that explains the situation, it issues a "Stand Down" order to the conscious intellect, and we can focus on other things.

This is why you want a driving instructor riding along with new drivers: new drivers don't always process data with the same reaction time that an experienced driver does. As the mind constructs mental patterns of behavior, it can shift more to an "auto pilot" mode and still handle things competently.

Normally, this is a good thing. There are far too many things going on, and living in the heightened state 24x7 would be exhausting. The brain's mental patterns help us live more productive, happier lives.

It's enough of a value that people very much resist changing established patterns. Marketeers understand this, and it's why it costs so darn much to introduce a new brand into the market. It's also why most of those new brands fail, and fail fast. People essentially say "Look, I'm sure that your toothpaste is very nice and everything. But I've already picked a toothpaste that I'm happy with, and it's simply not worth the mental effort to honestly evaluate your stuff." Of course, they don't say this in so many words, but it's there anyway.

The brain essentially sorts data into "relevant" and "irrelevant" buckets. "New Toothpaste Brand" is often put into the "irrelevant" bucket, while "That car doesn't look like it's going to stop at the red light" goes into the "relevant" bucket. After a little training, it all happens automagically. Again, this is usually a good thing.

It goes wrong when the mind puts data into the wrong bucket, and it does this a surprising amount of the time. If the cost of a miss like this is low, then it really doesn't matter. The new toothpaste may be the shizzle Flippity Floppity Floop, but so what? The mental model works well enough, and the new data is rejected, even if it would be superior.

The rejection of data that does not fit the mental model, even when accepting it would lead to a better model, is Confirmation Bias. We all have it; it's one of those facts of life. Adam Shostack has a good post over at Emergent Chaos on this, with a very funny example of Confirmation Bias:
There's a really funny post on a blog titled "Affordable Indian Astrology & Vedic Horoscope Provider:"
Such a choice of excellent Muhurta with Chrome release time may be coincidental, but it makes us strongly believe that Google may not have hesitated to utilize the valuable knowledge available in Vedic Astrology in decision making.
This is a beautiful example of confirmation bias at work. Confirmation bias is when you believe something (say, Vedic astrology) and go looking for confirmation. This doesn't advance your knowledge in any way.
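To make that last point concrete, here's a toy sketch (mine, not Shostack's, with all the numbers made up for illustration): a simulated believer who only counts the "hits" for a worthless theory ends up more certain than ever, while an honest updater who also counts the misses correctly abandons it.

```python
# A toy illustration of why "looking only for confirmation" doesn't
# advance your knowledge. A believer tracks the probability that a pet
# theory works. The theory is actually worthless: outcomes are chance.
import random

random.seed(42)

P_SUCCESS_IF_TRUE = 0.8   # what the theory predicts, if it really worked
P_SUCCESS_IF_FALSE = 0.5  # plain chance -- the actual state of the world

def update(prior, success):
    """One step of Bayes' rule on a single observed outcome."""
    like_true = P_SUCCESS_IF_TRUE if success else 1 - P_SUCCESS_IF_TRUE
    like_false = P_SUCCESS_IF_FALSE if success else 1 - P_SUCCESS_IF_FALSE
    return (prior * like_true) / (prior * like_true + (1 - prior) * like_false)

honest, biased = 0.5, 0.5  # both start out undecided
for _ in range(200):
    success = random.random() < P_SUCCESS_IF_FALSE  # reality: pure chance
    honest = update(honest, success)
    if success:                     # the biased believer only "sees" the hits
        biased = update(biased, success)

print(f"honest updater: P(theory works) = {honest:.3f}")  # drifts toward 0
print(f"biased updater: P(theory works) = {biased:.3f}")  # climbs toward 1
```

The biased believer filters the evidence before it ever reaches the updating step, so every observation that survives the filter points the same way. That's the "Stand Down" order doing its damage.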
Groups have it, too, when a mental model is transmitted within the group. This is where Confirmation Bias becomes damaging: Groupthink can form because of competing reward systems. Group members accept the maladaptive data categorization because the group dynamics reward conformity. It doesn't happen all the time, in all groups, but when it does it can produce spectacular failures.

We're seeing one right now with the media. There are far too many examples to cite, but they all have a characteristic in common:
Bad news that damages Democrats and good news that would help Republicans are both suppressed.
A rational thinker would wonder why they would do this - certainly the loss of perhaps half of their market (so far) would cause the Adult Supervision at the media companies to make some serious changes. They haven't, in any way that matters. So how come?

Partly, it's just bad luck that their industry is already in a disruptive transition. One of the most interesting business books of the last decade is The Innovator's Dilemma, by Clayton Christensen. Even good companies get in trouble with extremely disruptive market transitions, and so we wouldn't expect media companies like The New York Times to be any different.

We would expect that a well-run media company would resist driving half their potential customers away. We're sure not seeing that resistance. The 2008 election was stunning in its display of media bias (again, there are too many examples to cite here; use your Google-fu). I have no explanation, but I do have a suggestion:
Once you have more than a certain portion of a group all exhibiting the same philosophy, Confirmation Bias becomes institutionalized, as there aren't enough remaining members to say "wait a second - that doesn't make sense."
It felt right to the NYT to sit on the story of John Edwards's illegitimate child, to the point where they were scooped by the National Enquirer. It made sense to the NYT to run three front page stories about Bristol Palin's pregnancy. I'm not saying it makes any sense to me, but it did to them. In their mental model, reversing these choices (which the rest of the world would consider the rational thing to do) would be like changing brands of toothpaste. Why would you bother? The newsroom already has a perfectly functional (to them) view of how the Universe works.

Adam Shostack has another interesting post on what this all means for the media, because now it's possible that we (you and I) have constructed our own model of how the media work:
We've been talking a lot lately about confirmation bias. It turns out that newspaper endorsements are more influential when they are unexpected.
The Boston Globe endorses Obama? Didn't see that one coming. The media runs a story on a shooting that's pretty straight up? The Gunblogosphere posts about it.

Universities have a big dose of this as well. Second Amendment types will hold up Bellesiles and the Harvard study claiming a 40:1 family-to-Bad-Guy casualty ratio for guns in the home as Exhibit A. The Global Warming fraud is Exhibit B.

If it's true that group dynamics reinforce confirmation bias once a certain threshold of conformity is reached, then the media (and Academia) have to take a big, unpopular step: Intellectual Diversity has to become a top priority. The NYT has actually tried this, bringing on conservative writers like Bill Kristol.

It's not enough. The centers of groupthink need to be integrated, and for the NYT that isn't the Op-Ed page, it's the Newsroom. For Universities, it means the faculty lounge, particularly in the liberal arts. This will be terribly unpopular in the newsrooms and faculty lounges, and will require pretty forceful leadership. The media will get that - there's a Bad Moon Rising in media company boardrooms; if the board doesn't take care of this, the shareholders will ultimately get a board who will. Or the companies will go out of business.

Academia is a more interesting case. Much of the University system in this country is publicly funded. If a significant portion of the population decides that it's not worth the candle, then the tipping point will arrive suddenly. Right now, confirmation bias runs in favor of the Academy: most people still think that the University is about learning truth. It's not, and that's a major risk. If lots of people find that they've spent a ton of dough for a diploma that doesn't help them get a job that gives them a return on that investment, all sorts of questioning will begin. The confirmation bias will be shattered, and people will revert to a heightened state of alertness for things dealing with education. At that point, the new brand of toothpaste may be welcomed.
