When do 1/2 and 1/2 equal 1/4?
Watching the MacNeil/Lehrer NewsHour a while back, I saw two members of the Senate Intelligence Committee talking about the report they completed regarding intelligence failures relating to the war in Iraq. They claimed part of the problem was a groupthink mentality in which everyone viewed the evidence with the predisposed conclusion that weapons of mass destruction must exist in Iraq. My immediate reaction was how 1984-ish the term “groupthink” is and whether I should just tune out the report altogether.
But then I listened a little more and was surprised at what was said. Apparently all the reports about unmanned vehicles spraying deadly chemicals or reconstituted nuclear arms programs or mobile biological weapons factories were tagged with caveats that were ignored to reach the conclusion that Iraq must be doing something bad. In other words, there was a possibility — or better yet a probability — that there were no unmanned vehicles or nuclear bombs and the like.
And finally the reasoning and logical part of my brain kicked in. If there were warnings that these reports could be false, then the probability of them all being true is less than the probability of any one of them being true. Remember probability? Take two bins, each with half red and half blue socks. You take a sock from one, then a sock from the other. The probability of picking two red socks is 25%: it’s 50% for each bin, and you multiply the two together (0.5 × 0.5 = 0.25). In other words, it’s less likely you’ll draw two red socks than one when you combine the probabilities.
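For the skeptical, the sock arithmetic checks out in a few lines of Python (a toy sketch — the 50/50 bins and independent draws are just the setup of the example):

```python
# Two bins, each half red and half blue socks: a red draw from either bin is 50%.
p_red_bin1 = 0.5
p_red_bin2 = 0.5

# The two draws are independent, so the joint probability is the product.
p_two_reds = p_red_bin1 * p_red_bin2

print(p_two_reds)  # 0.25 -- drawing two reds is less likely than either draw alone
```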
In my example above, substitute “true” for “red” and “false” for “blue.” If the reports of Iraq’s weapon stockpiles were possibly true or possibly false, then the likelihood that all of the reports were true is less than the probability of any one being true. This triviality of mathematics didn’t stop the government from presenting all of this evidence to the United Nations as fact and reason for war. Nor did I see any other nation call the U.S. on this. I suppose there weren’t any math majors working as analysts in the U.S. or any other country.
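To see how quickly this compounds across many reports, here’s a quick sketch. The per-report probabilities below are made up purely for illustration — the actual confidence behind each intelligence report was never quantified:

```python
# Hypothetical confidence that each independent report is true (invented numbers).
report_probabilities = [0.8, 0.7, 0.9, 0.6, 0.75]

# Assuming independence, the chance that ALL the reports are true
# is the product of the individual probabilities.
p_all_true = 1.0
for p in report_probabilities:
    p_all_true *= p

print(round(p_all_true, 4))  # 0.2268 -- well below even the shakiest single report
```

Even when every individual report is more likely true than not, the claim that all of them are true can end up being the least likely story in the room.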
All you conspiracy theorists need to take some math lessons too. Recent scares are building up to the moment when a world government will form and stamp out all freedoms? Chemicals in the water and subliminal messages on TV keep us subdued and pacified? Tin foil hats can reflect electromagnetic waves that aliens send down to brainwash us? That’s about as likely as frozen Walt Disney driving around America with Spuds MacKenzie and zombie Ronald Reagan running people down in their ’68 Cadillac Eldorado convertible.
We as humans somehow buy into these conspiracies. Magic bullets, the Illuminati, Santa Claus — they’re exciting compared to the dullness of reality. We’re willing to suspend our disbelief even if a situation is completely improbable. We’re creatures of suspicion; the simpler the explanation for a situation, the less likely we believe it to be the real answer.
Rather than appeal to simple reason, we argue from fear, misunderstanding, and complexity. We’ve been doing this for ages, holding on to ideas that we laugh at today — that the sun revolves around the earth; that UFOs crashed in the desert and were taken to Area 51; that Bert and Ernie are gay.
It’s harder to accept the simpler explanation when the pressures of the time keep the complicated one enshrined as the “truth” — the earth was created by God and therefore must be at the center of the universe; the government would obviously cover up an alien landing with a blanket of lies; two men who share a bed for decades (yet miraculously stay the same age) must be gay.
The reality of the situation may be boring, but at least we’re more sure of it than of the previous theories — the earth revolves around the sun; a high-altitude weather balloon crashed in the desert; Bert is a figment of Ernie’s imagination, like the character Brad Pitt played in Fight Club.
Besides the human instinct to believe the unbelievable, two other related culprits are at work here. The first is an error of selective judgement where, given a set of facts and observations, you come to a conclusion that isn’t supported by those facts and observations. You may omit parts of your observations when coming to this conclusion, but the error is entirely in your reasoning about those observations rather than in the observations themselves. You know, like how O.J. Simpson got off for murdering his wife. Of course he did it. All the evidence pointed to him. The gloves were “too small”? Yeah, right. The only person who didn’t know that O.J. did it was… well… who didn’t believe that O.J. did it? I rest my case.
The second and far worse error is selective observation. Rather than coming to the wrong conclusion from a set of facts, the result of selective observation is a set of facts and observations that can lead only to a specific conclusion. Often this information is skewed: observations that don’t support the conclusion are removed, or fabricated information is inserted when the results didn’t come out as expected. How about the Kennedy assassination? We all know there had to be multiple gunmen, but the government only believes in magic bullets, so that’s what it concluded. Maybe once all the people involved die we’ll find out the truth.
The reason selective observation is worse than selective judgement is that when you make the wrong judgement, you can always go back to the facts and draw a new conclusion. When your observations are skewed, there’s no way to guarantee a correct (or at least a better) judgement from those observations. In other words, because some information was never recorded (or was recorded incorrectly), all results based on that information are flawed.
Taking this back to the case of war in Iraq, certainly there was selective judgement on the part of the Bush administration in taking and presenting the intelligence as fact without caveat. From the perspective of everyone else, we cope with the observations given to us by Bush et al.; if we’re to believe what the government tells us, we have no other conclusion to draw except that Iraq has weapons of mass destruction. Unless, of course, we believe the U.N. weapons inspectors.
Maybe given more time, the U.N. inspectors would have turned up something. So here’s the final flaw that felled Bush’s arguments. Given the premise — Iraq has weapons of mass destruction — proving the negative is much more difficult than proving the affirmative. A word-bender for you: we can never be certain that Iraq does not have weapons of mass destruction. Think that over a few times. Said differently, we can never be certain that Saddam was right when he said Iraq destroyed all its WMD. However, we could easily prove Iraq did have WMD simply by finding them. Some might claim that’s impossible too, but I say proving Saddam right is the more difficult proposition of the two. We can keep searching Iraq forever for WMD and never find any — and still never be sure that there weren’t any.
Most people probably won’t think this deeply about Iraq, probability, logic errors, and the like (nor work themselves into confusion like I did in that last paragraph). However, I desperately hope that our government is doing this kind of thinking. I like to believe that the steady decline in Bush’s job approval, and in agreement on whether the country is headed in the right direction, is the result of the American people grading him on his logic and coming to a new conclusion of their own.
My political persuasions aside, the lesson here is to please take the time to become more math- and logic-literate. There is no better pleasure than laughing in the face of a person who can’t make a coherent argument or understand facts, statistics, and probabilities. Or who makes up words, like “Kosovians” or “resignate” and “subsidation” or “subliminabable.” Because even if George W. Bush doesn’t excel at logic or statistics or forming cohesive statements, at least we know he’s creative.