A fundamental organizational challenge is establishing the causes and consequences of decision-making patterns that interfere with vigilant information processing. The research question is: when and why do decision makers fail to examine available alternatives with care, even when vital issues are at stake? The most explicit discussions are found in the writings of management and organizational scientists, who have long recognized the dysfunctional constraints imposed by organizational rituals, bureaucratic procedures, and the rigorous demands of the executive role.
Bureaucratic constraints prevent decision makers in a corporate setting from applying their full resources to intensive search and deliberation on core problems. Kurt Lewin, the great pioneer who first analyzed decision making in terms of psychological conflict, pointed to the psychological consequences of idiosyncratic commitment to a decision: commitment has a freezing effect that inclines the actor to be highly resistant to any new knowledge or information that might induce a change of mind. The logic is that once a person has committed to a given set of ideas, objectivity receives less emphasis, and the commitment colors how the person views and assesses the available alternatives.
Organizations have to work very hard to overcome people's vulnerability to gross decision errors arising from superficial search or biased information processing. Flaws and limitations in human information processing and knowledge acquisition point to the propensity of decision makers to be distracted by irrelevant aspects of the alternatives, which leads to loose predictions and poor outcomes. A related tendency is for key actors to be swayed by the form in which information about risk is prepared and presented, or to rely on faulty interpretations and stereotypes, either of which can lead to critical mistakes. This is what some researchers refer to as the illusion of control: the mentality that we are doing right, or that we do not want to hear any different perspective or assessment.
One misconception about how chance operates is what has been called the gambler's fallacy: clear-cut information about the probability of an event is not taken into account because people believe that chance is a self-correcting process. "It rained more than the average the past few months, and therefore it will rain less than the average next month, so let's decide to take our vacation then."
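The fallacy can be made concrete with a small simulation. The sketch below (a hypothetical illustration, not from the original text) flips a fair coin many times and estimates the probability of "heads" immediately after a streak of five heads. For independent events, a prior streak does not shift the odds; chance does not self-correct.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

def next_after_streak(p=0.5, streak=5, trials=100_000):
    """Estimate P(heads on the next flip) given that the previous
    `streak` flips were all heads. For independent flips this stays
    close to p, regardless of the streak."""
    hits = count = 0
    run = 0  # length of the current run of heads
    for _ in range(trials):
        flip = random.random() < p
        if run >= streak:      # the previous `streak` flips were heads
            count += 1
            hits += flip       # did the "self-correction" happen? (no)
        run = run + 1 if flip else 0
    return hits / count

print(next_after_streak())  # remains near 0.5 despite the streak
```

The estimate stays near 0.5: past rainfall, like past coin flips, carries no corrective force over the next independent outcome.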
Another source of error is exposing executives to irrelevant (not to say incorrect) information, which leads them to regard the chances of the outcome as 50:50; when no such information is presented, they are more likely to take correct account of prior knowledge about the baseline probability, which may be considerably greater or less than 50 percent. Further sources of error are attributable to insensitivity to sample size, or to inaccuracy of the criteria or sources used to make the decision.
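The pull toward 50:50 can be quantified with Bayes' rule. The sketch below uses invented numbers for illustration: a venture with a 10 percent baseline success rate and a favorable but noisy signal (80 percent true-positive rate, 30 percent false-positive rate). Even after the favorable signal, the correct probability is nowhere near 50 percent.

```python
def posterior(prior, true_pos=0.8, false_pos=0.3):
    """Bayes' rule: P(success | favorable signal) for an assumed
    signal quality. true_pos and false_pos are illustrative values."""
    p_signal = true_pos * prior + false_pos * (1 - prior)
    return true_pos * prior / p_signal

# With a 10% base rate, a favorable-but-noisy signal does NOT make
# the odds 50:50; the prior still dominates.
print(round(posterior(0.10), 2))  # 0.23, not 0.50
```

Treating the decision as a coin toss more than doubles the true odds of success; the baseline probability should anchor the estimate, not the most recent piece of information.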
Finally, when the degree of complexity of an issue exceeds the limits of cognitive and emotional abilities, there is a marked decrease in the adequacy of information and reality processing, a direct effect of information and emotional overload and the ensuing fatigue.
In short, beware of rituals and habits, lack of standards, frozen ideologies, superficial knowledge and information, bias and stereotypes, procrastination, the gambler's fallacy, and psychological stress. There are, at least, two kinds of decisions: those that are expensive and those that are not… Raising good knowledge is cheaper than preserving ignorance…
Copyright 2001 QBS, Inc.