
People are often their own worst enemies when it comes to managing projects. The research on project outcomes is indisputable: human misjudgment is the single most critical factor in project failures. More specifically, it is the way in which human psychology frames and limits how we interpret our environment, and the decisions that follow, that matters. This is the underlying premise of our book “Project Think: Why Good Managers Make Bad Decisions”. Humans are rife with biases and heuristics, so many in fact that I could fill a year’s worth of blogging on the subject, so I will not. Instead, it is a better use of my time to examine a couple that you will recognize and to offer some suggestions about how you can minimize their impact on your decisions.

A couple of years ago, I was travelling to Toronto to attend a meeting in late June. I always travel light, no more than one carry-on plus a laptop, so I do not have to check any luggage. It is cheaper, and it eliminates lost luggage and waiting at the baggage carousel. It also means that I can only pack a limited wardrobe. In this case, I remembered the last time I had visited the area in June: the weather was perfect, warm and sunny, so I packed accordingly. Of course, when I arrived, it was relatively cool and rained incessantly. In retrospect, as I pondered my lack of a good rain jacket, I recalled other times that I had visited the region in the summer, and rain gear or at least water-repellent shoes were needed about half the time. I had fallen victim to the “availability bias”. Because the weather had been so kind to me on my last trip, it had left a lasting impression. This is sometimes referred to as the vividness of the event. In any case, I applied my experience from that trip to the upcoming one and, in effect, discounted all of the other experiences that I should have taken into account. Under the availability bias, people assess the probability of events based not on how often those events have actually occurred, but on how easily they come to mind.

If this sounds familiar, it should, as it happens all the time when we assess the likelihood and impact of events on our projects. Our brains are tuned to recall events that were either very pleasant or very unpleasant, as opposed to the ordinary or routine. We have a limited ability to store and recall events; therefore, it makes sense that we tend to have better recall of those that could have the biggest impact on our survival. In prehistoric times, this mechanism allowed us to keep vivid memories of great hunting grounds or of which plants were poisonous. However, when you are identifying and assessing your project risks, these same mechanisms that kept us from starving often lead us to over- or underestimate the impact of risks. If you have never been involved in a project in which a particular risk occurred, there is a good chance that you will discount its impact on your current project. The reverse is also true: if you had a project fail because of a risk, you may overestimate its probability and impact on future projects.

Related to the availability bias, we often falsely link events because of their close proximity. This is referred to as an illusory correlation. Here is an example: fires and firefighters are inextricably linked, like baseball and hotdogs. You might see where I am going with this: while there is an extremely strong correlation between firefighters and fires, it is not causal. Eliminating your fire department to reduce the probability of fires will not have the desired result. While this case is obvious, we often form illusory correlations, rooted in the vividness of previous experiences, that lead us to take ineffective measures to manage risks. So what steps can you take to overcome your own biases?

Keeping archives of past projects that include a risk history is your best bet for minimizing the impact that the availability bias will have on your decisions. Risk histories should contain all of the risk data collected during the course of a project, including risk assessments at each stage gate, risk controls and strategies, risk status reports, and so on. Risk archives will tell you how often a risk was identified in the past (probability), whether the assessment was correct, and how effective the controls were. Reviewing past performance as part of new risk assessments provides a reality check in which you can compare your own perceptions against the actual record of what has happened. While past performance does not always predict the future, it does provide a solid foundation upon which to base your assessments.
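As a simple illustration of how an archive counters gut-feel estimates, here is a minimal sketch, assuming a hypothetical archive format (the project names, risk name, and `occurred` field are all invented for the example), that computes the historical frequency of a risk from past project records:

```python
# Hypothetical risk archive: one record per past project, noting whether
# a given risk was identified and whether it actually materialized.
archive = [
    {"project": "A", "risk": "key supplier delay", "occurred": True},
    {"project": "B", "risk": "key supplier delay", "occurred": False},
    {"project": "C", "risk": "key supplier delay", "occurred": True},
    {"project": "D", "risk": "key supplier delay", "occurred": False},
    {"project": "E", "risk": "key supplier delay", "occurred": False},
]

def empirical_probability(records, risk_name):
    """Frequency with which a named risk actually occurred in past projects."""
    relevant = [r for r in records if r["risk"] == risk_name]
    if not relevant:
        return None  # no history at all: fall back on expert judgment
    return sum(r["occurred"] for r in relevant) / len(relevant)

p = empirical_probability(archive, "key supplier delay")
print(f"Historical probability: {p:.0%}")  # occurred in 2 of 5 projects -> 40%
```

However vivid your memory of the one project where this risk blew up, the record says it materializes less than half the time, and that is the number your new assessment should start from.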

Bring the “Wisdom of the Crowds” into your process by soliciting input from the widest variety of individuals possible for your risk assessments. While we all share the same psychological limitations, each individual's experiences differ, and that variety serves to moderate the effect of the availability bias. There are limitations to this method, as expert opinion often outperforms groups, but because expert opinion is also tinged by personal biases, group input provides an additional reality check on individual perceptions.
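One common way to pool such input, sketched below with invented figures, is to collect independent probability estimates before any group discussion and average them, so that no single vivid memory dominates the result:

```python
# Hypothetical independent probability estimates for the same risk,
# gathered from five team members before any group discussion.
estimates = [0.10, 0.60, 0.25, 0.30, 0.35]

# A simple pooled estimate: the mean moderates extreme individual views.
# The 0.60 from someone burned by this risk, and the 0.10 from someone
# who has never seen it occur, pull each other back toward the middle.
pooled = sum(estimates) / len(estimates)
print(f"Pooled estimate: {pooled:.2f}")
```

Collecting the estimates independently matters: if the group discusses first, one vivid anecdote can anchor everyone and you lose the moderating effect.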

Improving your project risk management starts with understanding that your biggest risk is yourself. Human psychology and cognition are inherently limited; they have a profound impact on our perceptions and can skew how we recall past experience in unexpected ways. Accepting that our view of reality is the product of our biases, and managing it accordingly, is key to improving the accuracy of your risk assessments and the quality of your decision making.