It is said that during colonial times in India, the British rulers, bothered by the abundance of cobras in Delhi, set a bounty for each snake handed in, dead or alive.
This worked just fine for a while, until some of the more business-minded locals came up with the lucrative idea of setting up breeding farms specializing in serpents.
Predictably, this eventually came to the government's attention and resulted in the revocation of the reward scheme.
Rather unpredictably, however, at least from the government's point of view, the disappointed breeders released their snakes into the wild upon learning of the cancellation, causing a spike in the number of cobras around Delhi.
This interesting, albeit likely apocryphal, tale is now referred to as the Cobra Effect in economic circles, and it illustrates one of the greatest pitfalls of decision-making: static thinking.
To avoid mishaps similar to what was outlined above, one must understand that business ecosystems, markets, and the larger world are ever-changing landscapes, which are altered by each action a player takes—often in unpredictable ways.
The Lebanese-American author Nassim Nicholas Taleb believes that leaders and decision-makers often fail to “think in second steps.” The only antidote to static thinking, he argues, lies in appreciating the fact that “real life happens to have second, third, fourth, nth steps.”
In the anecdote about cobras, for instance, predicting that serpent breeders would set up shop around colonial Delhi in response to the monetary incentive would be thinking in the second step, while foreseeing that the breeders would one day close down would be the third step.
It is therefore always a good idea to model all conceivable scenarios that may unfold as a result of a decision.
Decision trees are handy tools in this regard, enabling us to map out and take into consideration all possible courses of action, chance events, and outcomes—at least to the best of our ability.
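The mechanics of a decision tree are simple enough to sketch in a few lines of code. The evaluator below is a minimal, illustrative Python sketch: decision nodes pick the branch with the best expected value, chance nodes average their branches by probability, and leaves carry payoffs. The node structure, probabilities, and payoffs for the cobra-bounty decision are invented for this example, not drawn from any real analysis.

```python
def expected_value(node):
    """Recursively evaluate a tree of decision nodes, chance nodes, and leaves.

    A decision node picks the branch with the highest expected value;
    a chance node averages its branches, weighted by probability;
    a leaf is simply a numeric payoff.
    """
    kind = node.get("kind")
    if kind == "decision":
        return max(expected_value(child) for child in node["branches"])
    if kind == "chance":
        return sum(p * expected_value(child) for p, child in node["branches"])
    return node["payoff"]  # leaf node

# Hypothetical payoffs for the cobra-bounty decision (negative = cost
# to the government). All numbers here are made up for illustration.
bounty_tree = {
    "kind": "decision",
    "branches": [
        {   # Option A: offer a bounty per cobra.
            "kind": "chance",
            "branches": [
                (0.6, {"payoff": -10}),  # locals hunt cobras as intended
                (0.4, {"payoff": -50}),  # breeding farms appear (the second step)
            ],
        },
        {"payoff": -20},  # Option B: do nothing; cobras remain a nuisance.
    ],
}

print(expected_value(bounty_tree))  # → -20 (with these made-up numbers,
                                    #   doing nothing beats the bounty)
```

Note that the tree only captures the second step here; modeling the third step (breeders dumping their stock) would mean grafting further chance nodes onto the branches, which is exactly how such trees grow complicated.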
Decision trees, however, are notorious for getting complicated rather quickly.
Coming to terms with the complex nature of the real world and the repercussions of our deeds may thus lead to another set of fatal flaws: over-complication, information overload, and analysis paralysis.
And the paralysis caused by over-complicating matters is almost as destructive as static thinking. Studies have shown that information overload generally pushes managers to make poorer decisions, with worse financial outcomes.
You may have experienced this kind of headache when making small personal decisions, say, when shopping.
Back in simpler times, we would go to a local shop and buy the product that we needed, choosing from a few options on display, and we would generally be happy with it.
But these days, with the multitude of alternative products, stores, review websites, and user feedback available online, we simply cannot decide, and we increasingly end up buying the wrong can opener, a keyboard that just doesn't do the trick, or a toy that our cat will not even look at. “Complexity kills,” as the saying goes.
To make matters worse, mathematical simulations using cellular automata, such as those drawing on John Conway's (1970) Game of Life, have shown that it is theoretically impossible to see more than a few steps ahead, even in modestly complex systems.
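This sensitivity is easy to demonstrate. The sketch below implements the standard Game of Life rules (a dead cell with exactly three live neighbours is born; a live cell with two or three live neighbours survives) and shows that adding a single cell to a chaotic starting pattern already changes the very next generation, after which the divergence only compounds. The seed pattern and the perturbation are chosen arbitrarily for the example.

```python
from collections import Counter

def step(live):
    """Advance one Game of Life generation; `live` is a set of (x, y) cells."""
    # Count, for every cell, how many live neighbours it has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is live next generation if it has exactly 3 live neighbours,
    # or has 2 live neighbours and is already live.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

r_pentomino = {(1, 0), (2, 0), (0, 1), (1, 1), (1, 2)}  # famously chaotic seed
perturbed = r_pentomino | {(3, 1)}                       # one extra cell

print(step(r_pentomino) == step(perturbed))  # False: one generation is enough
```

A single flipped cell, and the two boards are already on different trajectories; no amount of cleverness recovers a reliable long-range forecast short of simulating every step.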
It is not a question of more intelligence, artificial or otherwise, or of more information; it is simply impossible to predict, with any clarity, the future state of a complex system.
In one episode of the American television sitcom The Big Bang Theory, Sheldon Cooper, an idiosyncratic Caltech physicist, realizes as much and declares: “From here on in, I've decided to make all trivial decisions with a throw of the dice, thus freeing up my mind to do what it does best—enlighten and amaze.”
Striking a balance between the necessity of thinking in second steps and the avoidance of analysis paralysis may therefore sound paradoxical. But there may be a compromise: developing a set of intuitive heuristics.
Heuristics are the product of intelligence accumulated over many years of lived experience, in the case of an individual, or over several generations, in the case of a society or a large firm.
As practical models of reality created by trial and error, heuristics rarely produce optimal results, but they have stood the test of time and are often the safest bet, at least in the contexts in which they are applied.
“Never change a winning team,” for instance, is a heuristic favored by businesspeople, sports commentators, and even politicians. We do not know for a fact that following it will be the best decision in each and every case. For all we know, substituting a few team members could set an already winning team on fire.
However, people have learned over time that not tinkering with the lineup of a winning team is the lowest-risk option, especially since those who have tinkered and failed have been scorned, while those who have tinkered and succeeded have received little credit for making an already winning team win.
We should also remember that heuristics may be difficult to grasp and subject to many exceptions. As Taleb tweeted in 2013, “It is remarkable how many people confuse heuristic with law and propose exceptions to invalidate a heuristic: only laws refuse exceptions.”