Free Men Defend their Positions and Know Why They Believe in Them
Relying solely on the opinions of others leaves a free society incapable of free thought and at the mercy of cognitive bias
When I was a newly minted United States Army Second Lieutenant going through training to become a Military Intelligence officer at Fort Huachuca, Arizona, long ago in the fall of 2008, I thought I would land knee-deep in some pretty intense stuff. Many of my peers were already seasoned combat veterans and others were on their way to units that would soon be deploying to Iraq or Afghanistan. Surely, we wouldn’t be in for a snooze-fest of a curriculum, right?
Wrong.
While the title of the book escapes me, everyone in the class was handed a small blue handbook outlining the concept of critical thinking. The moment was a bit like when your father hands you a state guide to road signs and traffic signals when you’re ready to get behind the wheel for the first time. This guide was intended to push people to think outside their preconceived notions and make legitimate, realistic, and diverse assessments. Military Intelligence professionals in a combat environment are expected to assess enemy strength and disposition in terms of Most Likely and Most Dangerous courses of action, for no other purpose than to aid their commanders in formulating a mission plan. Good intelligence officers (S2s) have their assessments taken seriously; those with a penchant for incompetence don’t.
This handbook warned specifically against the false-consensus effect and confirmation bias. The former is when a person believes practically everyone else shares his point of view; it is pervasive in the political world and is exacerbated by factional divides. The latter is when an analyst or researcher searches only for information that backs up his or her hypothesis. For example, someone guilty of confirmation bias who thinks bacon is bad for you will only consider perspectives that align with that hypothesis and avoid those that are contrary, even if the contrary evidence is more scientific, factual, and provable.
A tactical analyst unable to form his own opinions and points of view will eventually be caught off guard, and the results may prove catastrophic. Successful combat strategy requires changing tactics to keep the enemy off guard, or adjusting when the current strategy is ineffective or unsustainable. An analyst who goes to work every day without asking himself how the enemy might change tactics, and without watching for the signs of that change in the various forms of transmitted intelligence, will potentially miss key information and leave his unit exposed to harm or tragic loss.
Nothing will change, they’ve always done it this way.
They can’t attack at this time because they don’t have night vision capabilities.
Those thoughts can be dead wrong if an analyst doesn’t consider that the enemy may have changed leadership, or received a financial windfall that allowed for the procurement of new equipment and the subsequent adoption of a new strategy. Failure is static. Success involves change, whether in business, war, or relationships.
In my political world, I find many who are unwilling to dig into why they believe the way they believe. Some call it polarization, others call it tribalism, and others blame factions for allowing these divides to form. While these things may be contributing factors, ultimately, I believe people fear that examining their belief systems may create the internal conflict we call cognitive dissonance – the discomfort felt when beliefs and actions don’t line up. For example, a man who knows the health risks associated with smoking but smokes anyway experiences cognitive dissonance.