Mitigating Decision-making Errors Along a Transformation Journey 


In Part A of this two-part article on decision-making errors, the main categories and types of decision and judgement errors were reviewed, along with some associated logical fallacies.

 So What?

Two practical questions emerge. First, what can we do to improve our judgement? A combination of antidotes is often recommended to mitigate the untoward effects of these decision traps; the top three are being humble and aware, knowing yourself and knowing others, and following a process. The first, being aware, is like telling a pitcher to “throw strikes”: well-intended, but of little practical help, since throwing strikes is exactly what the pitcher is already trying to do. The second, knowing oneself, is as hard as diamonds and steel, according to Benjamin Franklin. The third, following a process, offers the most tangible promise – something we can actually do that can consistently make a difference.

One tool for doing this is to post a list of typical decision errors in the meeting or board room. Then, after an orientation to the definitions and examples, commit to ensuring that, for any significant decision, team members review the list and call out the ‘traps’ for which the team is most at risk. Campbell, Whitehead, and Finkelstein, in their HBR article ‘Why Good Leaders Make Bad Decisions,’ call these “red flag conditions.” They assert that we analyze situations by recognizing patterns and attaching emotional tags, and on that basis arrive at a decision to act or not. But because we tend to run the two processes almost instantaneously (rather than sequentially), our brains leap to conclusions and are reluctant to consider alternatives, as researcher Gary Klein has shown. And we are particularly bad at revisiting our initial “frame,” or assessment of a situation. For the ‘red flags’ the team identifies, it can then refer to a set of remedies and mitigations. These include involving a more independent source, clarifying the relevance of existing information (or identifying what information is still needed), referring the decision to a higher or governing group, and so on. The nature of the process should produce a “pause and shift gears” awareness and action.

Daniel Kahneman’s ‘3 Key Questions’ are instructive:

1.     Is there any reason to suspect the people making the recommendation of biases based on self-interest, overconfidence, or attachment to past experiences?

2.     Have the people making the recommendation fallen in love with it?

3.     Was there groupthink or were there dissenting opinions within the decision-making team? 

The second question is: how do we enlist others? Since most decision-making involves (or should involve) the judgement and processing of many people – senior team leaders, physicians, board members – how do you raise the collective judgement and decision-making literacy? A common governance error is deferring to (or over-processing) a single comment from one person perceived as a subject matter expert, even when the comment is not a matter of technical competence but a personal statement or opinion. This is the trap of being ‘personality’ driven rather than process driven. It can lead to a bias of seeing and hearing only a narrow view or side of a story (selective perception), which produces false confidence because other information and perspectives are deliberately and systematically excluded. People who hear a conclusion first typically become postured to seek only other information that supports it. It is therefore critical to determine what information is NOT being considered, rather than relying only on what is presented, available, or ‘spoon fed.’ Asking “What could we be missing?” or “If we are missing something – information or an alternate view – what would it be, and who would provide it?” may mitigate this error. Asking the presenter of a position to take the opposite side can also help elicit bias: if the presenter offers only clearly weak contrary arguments, that is a red flag. While such due process seems obvious, comfort, convenience, and expediency – as well as groupthink – lobby against it. It is easy to be blinded by the ‘light’ of one personality or story, regardless of whether the source has a conscious or unconscious conflict of interest.

Simply asking, “What is that based on?” is often enough to separate opinion and motive from fact and to prevent “false narratives” from going viral. Another check-and-balance question that can elicit better perspective is, “On a one-to-ten scale, how confident are you/we in this decision, and why?” The classic governance error of “getting into the kitchen” is often based on the well-intended notion that being closer to the action improves judgement. In fact, such proximity typically does the opposite. It narrows the board’s perspective and exposes it to a higher risk of hearing only a negative view (most often from a vocal minority who are underperforming, more resistant to change, or protective of the status quo) at the expense of the broad and “whole” perspective required for healthy governance. When a biased and narrow view is mistaken for the common view – mistaken for gospel – then bad decisions get made. As Kurt Gödel put it, “You can’t be in a system while at the same time understanding the system you’re in.”

Transformation in a VUCA world (volatile, uncertain, complex, and ambiguous, per the U.S. Army War College) demands that we elevate our leadership decision-making game. Transforming, indeed transcending, our own inherent decision-making flaws requires that we

1)    actively raise our vigilance by seeking out decision-making traps, and

2)    apply our remedies with stronger decision-process discipline.

Read Part A: Healthcare Transformation and Decision-making Errors.