No one wants to be a spoilsport. No one wants to be that person at the back of the room saying: “Wait a minute, what if...” But when it comes to managing risks, considering what could go wrong - and planning for it - can mean the difference between a project succeeding and it grinding to a halt shortly after take-off.
Examples of bad risk planning abound, particularly given the financial mess the world has been stewing in for the past few years. Despite these obvious reminders to prepare for eventualities, people still get it very wrong. Often, it's not so much a result of poor methodologies as the underlying psychological factors that influence decision-making. Past experiences, cultural values, cognitive biases and individual differences all affect our choices, and understanding their impact can provide valuable insights for project managers.
Anita Potgieter, COO at project and portfolio management company FOXit, says fear of previous failures, for example, can delay decision-making. A manager may feel paralysed or rely on support from superiors, potentially wasting valuable project time.
“If the project manager made a bad risk decision, which not only caused reputational damage to the company but was also heard about in the industry, they would most likely not be willing to make decisions and take risks,” she explains.
There are typically four approaches to a given risk: accept, transfer, mitigate or avoid. Acceptance requires the least action; transfer shifts the risk to another party; and mitigation or avoidance involves drawing up a set of actions that either prevents the risk from happening or minimises its effects.
The type of approach an organisation takes depends largely on its risk management maturity, argues Guy Jelley, CEO of project portfolio management provider Post Vision Technology. An immature environment, for example, will see a lot of risks simply accepted, meaning the project manager doesn't really have to take any further action.
“They acknowledge that there's a risk, but no one does anything about it. What happens then is there are delays or the project doesn't get delivered successfully.”
There are various reasons for adopting this attitude. “It might be that people are very optimistic and do not believe that things will go wrong, or that people are prepared to accept the risk and its consequences without really understanding the impact of it on the project.”
The more mature an environment, says Jelley, the more it will engage in risk avoidance measures, such as making provision for a second set of facilities or equipment in case the first fails.
According to Potgieter, people often exaggerate or underestimate risks due to lack of experience or a failure to understand what the project is about - and what the impact will be should the risk occur. Another cause is overconfidence.
“Psychologists have determined that overconfidence causes people to overestimate their knowledge, underestimate risks, and exaggerate their ability to control events. A project manager who is overconfident, persuasive or an extreme extrovert could easily downplay a risk and make people believe it's not a potential threat.”
The problem comes, says Potgieter, when that risk is realised and the stakeholders blame the project manager, who they feel has deceived them.
Charles Seybold, CEO of online project management solution provider LiquidPlanner, says underestimation is very common in software project work - usually by between 25% and 100%. “This is caused by three things: inexperience, unawareness, and cultural bias (it's easier to beg forgiveness later than to ask permission first).
“When a team wants to do the work, an honest estimate is almost always higher than the one actually provided. People always assume that the stars will align and disasters will never happen to their project, but realistically, that's not the case.”
Gut feeling
Much research has been done into the psychology of risk - why people exaggerate some risks while downplaying others - and why we're often such poor judges when it comes to determining the likelihood of an event.
Most people, for example, are subject to optimism bias: the belief that they'll do better at something than others. It's what supports the 'it couldn't happen to me' attitude to accidents and illnesses. We're also less afraid of risks we feel we have control over than those we don't, which is why people fear dying in a plane crash more than in a car accident, even though statistics suggest the opposite.
Liz Pearce, COO of LiquidPlanner, says recent research shows the vast majority of what we do is driven by habit. “People do what they get rewarded for and shy away from the stuff with bad outcomes. I submit that rationalisation is just what we talk about after the fact, but that the core issue is the good or bad habits of the organisation.”
She adds that justifications can often become completely illogical. “In my time managing a large project management office, there were many things that really smart and decent people did that were bizarre. A senior engineering manager once invented the concept of 'delayed delivery' to explain why a quarter of the project's deliverables were being pushed out of the release. He disclosed this just before the promised delivery date. I said: 'No...it's not “delayed delivery”, that is what we call a “miss”.' It is irrational to sit on bad news, but people do it all the time.”
Familiarity also plays a role. Potgieter says project managers are generally much more attentive to unknown risks. “As you get to know a new risk, you gradually grow accustomed to it and start to accept it. This means that, as a project manager, you will manage a familiar risk less actively than a new, unknown one.”
A Global Knowledge paper on the psychology of risk adds that when determining how probable a negative event is, people choose their behaviour according to how the risk scenario is framed, not an actual evaluation of the risk.
“For example: imagine that the US has two different alternatives to combat a new disease that is expected to kill 900 000 people. If treatment A is implemented and followed, 300 000 people will recover. If treatment B is implemented and followed, there is a 1/3 probability that all 900 000 people will recover and a 2/3 probability that no one will recover.
“When presented with these two options (A or B), almost three-quarters of the people choose treatment A, even though both have exactly the same expected outcomes. In this instance, people over-weight the possibility that no one will recover.”
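The claim that both treatments have the same expected outcome can be verified with a quick expected-value calculation (a sketch using the figures from the scenario above):

```python
# Expected recoveries under each treatment, per the disease scenario above.
population = 900_000

# Treatment A: 300,000 people recover with certainty.
expected_a = 300_000

# Treatment B: 1/3 chance all 900,000 recover, 2/3 chance no one does.
expected_b = population / 3 + (2 / 3) * 0

print(expected_a, expected_b)  # both average 300,000 recoveries
```

Statistically the two options are identical; only the framing - a certain outcome versus a gamble - differs.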
In general, says Jelley, people are rather poor performers when it comes to planning for risks. “They don't take the emotion out of the issue. Everyone wants to be the hero and deal with whatever comes along. They don't want to be pessimistic and they worry about being the one who draws attention to a potential risk.”
Pearce says bad estimation accounts for most project failures, adding that good estimates require a healthy culture and good tools. “High levels of transparency in the project management process create a more trusting environment and lead to more sharing of information.
“Risk often grows in the dark corners of the project knowledge-sphere. It's always tough for a whistle-blower because, by definition, somebody has already not done their job. Pointing out mistakes requires an extra-strong character,” she says.
“Most people are smart enough to avoid finger pointing because that costs social credibility. It's best to have an automatic or habit-based system that is always looking to catch things early.”
Fatal disconnect
Catching things early, however, is not something local practitioners seem particularly good at. A 2008 survey by researchers at Unisa and the University of Johannesburg, published by Project Management SA, set out to determine the IT project management maturity of local companies. It found that 27% of projects failed, 36% were challenged and 37% were considered successful.
“Project risk management appears to be the PMBOK [Project Management Body of Knowledge] knowledge area that has matured the least. Most projects fail because one or more of the risks that were identified were not avoided or mitigated, and then actually occurred, resulting in the need for reactive measures, which are almost always less effective than proactive ones,” it states.
“In the South African market, people just don't do risk management,” says Jelley. “In general, projects are not being delivered on time or within budget, and are not of the necessary quality.”
Jelley believes the gap between project sponsors and project managers is the main reason for failures. “Project management maturity is very low in SA because there's very little interaction from the business executives' side. Project managers often have to do everything themselves with no one taking ownership, and are told to just get on with it.”
He argues that risk mitigation should be done at the highest level. “So if, for example, there is an IT-related risk, the CIO should be overseeing that. The reality is this is not happening at all - people prepare for a project as if everything is going to go according to plan, and there isn't a comprehensive list of activities to deal with things if it doesn't.”
Pearce agrees, saying that while teams are quite good at identifying risks, the project often falls down because management cannot process the questions and requests for decisions effectively.
That said, Jelley admits he's seen much more focus on risk management in the past year than before, and that organisations seem to be waking up. “They're saying: 'The fact is, projects are not being delivered properly, things are happening that we could have planned for, and we can do better.'”
Managing IT projects in a world where the only thing that's certain is uncertainty is no easy feat. But defining risks from the get-go, and dealing with them realistically, will save much time and expense in the long run. Seybold adds that while underestimating risks can be a death blow, overstating risks is one of the most wrongly maligned concepts around.
“Research has shown that while the cost of underestimation is extremely large due to false starts, thrashing and poor quality, there is almost no downside to overestimation. If somebody gets finished with their work early, they usually just move on to get an early start on the next high priority task.
“Old school database developers always sandbag, and I wouldn't have it any other way.”