Understanding over-optimism can help organizations avoid poor decisions and create an environment for better risk management
By Jared Dempsey, Kognivate, and Tim Wigham, Exceed Performance
In general, people are moderately optimistic and feel hopeful about life. In fact, research has shown that optimism helps people to be successful at home, in business and on the sports field. Words like drive, grit and resilience are all rooted in optimism.
The drilling industry is a story of tenacity and innovation: from its early days of land drilling, to pushing the boundaries by building piers to drill just beyond the land’s edge into shallow water, and finally to deepwater drilling around the globe. The advances in the oil and gas industry have matched the technological progression of the space industry. None of this progress would have been possible without verve and optimism.
But there are downsides
While optimism can be a crucial ingredient for success, over-optimism can be an ingredient for failure. It is because of optimism that people can be convinced to invest in financial schemes that promise to make an incredible return on their money. It is optimism that causes such surprise when the investments are later exposed as fraudulent Ponzi schemes, and the investors lose their money.
The Dunning-Kruger effect is a psychological theory suggesting that people can be poor predictors of their own performance.
Over 20 years ago, two researchers – David Dunning and Justin Kruger – asked university students to estimate how well they would perform in upcoming tests. The students who performed least well on the tests strongly overestimated how well they would do; they thought they would do great. At the same time, those who performed best somewhat underestimated how well they would do; they thought they would do OK but ended up doing better.
Some commentators argue that the Dunning-Kruger effect doesn’t always happen, but there is enough evidence to suggest that people often overestimate their abilities and future performance. Experts Werner De Bondt and Richard Thaler said in a 1995 book, “Perhaps the most robust finding in the psychology of judgment is that people are overconfident.”
Illusory superiority and the illusion of control
Most people have heard the phrase “delusions of grandeur.” However, fewer people have heard of its close cousin, “illusory superiority.” The idea is that people have a tendency to see themselves as better than other people.
For example, people rate themselves as more likely than their colleagues to speak up and point out risks. Additionally, approximately 80% of drivers rate themselves as above average when asked how safely they drive.
Another type of overestimating is the illusion of control. We often think we have more control over events than we actually do. As an example, 40% of people think that they can get better at predicting the results of a coin toss through practice.
Overestimating our ability and control shows how common over-optimism is in everyday life. What makes it worse is that people are unlikely to recognize their own over-optimism.
In the workplace, over-optimism can lead to poor decisions, which in turn lead to poor risk management. Every project manager knows of the Project Management Triangle and the importance of bringing in projects on time, at full scope and within budget. Project management has been a formal discipline since the 1950s. However, despite the professionalization of project management, projects across many industries still frequently miss their targets.
A 2018 report by Wood Mackenzie stated: “Surveying the last decade of (oil and gas) project delivery, the average development started up six months later than planned and $700 million over budget… That is a huge amount of value destruction.”
So, why do so many projects go awry? Sure, project governance and control systems are important, but overconfidence is also believed to be a major influence.
Professor Bent Flyvbjerg of Oxford University says that projects often start with overly ambitious and unrealistic targets, which are doomed to failure. The unrealistic targets arise from two influences: strategic misrepresentation and optimism bias.
In cases of strategic misrepresentation, people deliberately deliver false information or estimations to gain an advantage. For example, a vendor bids low on price with the intention of winning a contract and then being able to make more money through delays.
In cases of optimism bias, people genuinely overestimate their control and ability to deliver, resulting in inaccurate estimations of time, cost and scope delivery. Inaccurate estimations can be due to a mix of inexperience and wishful thinking.
According to the planning fallacy, there can be a tendency for managers, when planning new projects, to believe that the project will be completed on time, full scope and within budget even when previous similar projects have been late, reduced in scope and over budget. Even when projects have commenced, over-optimism can lead to poor decisions, such as a belief in the eventual success of a struggling project that should actually be suspended or canceled.
Overconfidence can also occur in everyday operations – for example, pushing a piece of equipment beyond its limits. The role of overconfidence in incidents that result in damage, lost time and harm is well known. In fact, overconfidence is listed as an option in many root cause analysis tools.
In his book “Disastrous Decisions,” Andrew Hopkins points to overconfidence as a significant part of the Macondo disaster. He points to decisions, such as not performing a final cement test, as evidence of overconfidence. Safety expert Sidney Dekker makes the link between apparent success and over-optimism: When things have gone well in the past, we can be lulled into a false sense of security and lose a sense of unease, vigilance and risk aversion.
Prior to drilling the Macondo well, the Deepwater Horizon rig had drilled the 35,000-ft Tiber well in over 4,000 ft of water. In addition, it is well known that the rig had gone seven years without a lost-time incident. The rig’s previous successes may have contributed to a misplaced sense of confidence ahead of the tragic blowout in 2010.
Given that over-optimism seems to be a natural trait for most of us and can result in negative outcomes, how do we prevent it from being a problem? Here are some ideas:
1. Make over-optimism part of the vocabulary in your workplace. Educate people about how optimism can be good and bad. To be forewarned is to be forearmed. Duane McVay, a professor of petroleum engineering with Texas A&M University, not only advises educating people about optimism, but he also warns against bonuses and incentives that encourage people to be optimistic when making estimations.
2. Look widely. Get team input on estimations and decisions. Dr Flyvbjerg encourages organizations to use reference class forecasting when considering proposed projects. Reference class forecasting involves looking at similar projects, including projects outside of the organization, and seriously considering the actual performance of those projects as a basis for setting the proposed project’s performance targets. It requires forecasters to resist the urge to think “this time it will be different.”
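The reference class forecasting idea above can be sketched numerically. The approach, loosely following Flyvbjerg’s method, is to collect the cost-overrun ratios (actual cost divided by estimated cost) of comparable past projects and uplift the new base estimate by a conservative quantile of that distribution. All figures below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Sketch of reference class forecasting (hypothetical data).
# Uplift a base cost estimate using the distribution of actual/estimated
# cost ratios observed on comparable past projects.

def quantile(sorted_vals, q):
    """Linear-interpolated quantile of a pre-sorted list (0 <= q <= 1)."""
    pos = q * (len(sorted_vals) - 1)
    lo, hi = int(pos), min(int(pos) + 1, len(sorted_vals) - 1)
    frac = pos - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

# Hypothetical reference class: actual cost / estimated cost on past projects.
overrun_ratios = sorted([1.05, 1.10, 1.20, 1.25, 1.35, 1.50, 1.80])

base_estimate_musd = 500.0  # planner's "inside view" estimate, $ million
# P80 uplift: 80% of reference projects finished at or below this ratio.
uplift = quantile(overrun_ratios, 0.8)
adjusted_estimate = base_estimate_musd * uplift

print(f"P80 uplift factor: {uplift:.2f}")          # 1.47
print(f"Risk-adjusted estimate: ${adjusted_estimate:.0f} MM")  # $735 MM
```

The key design choice is taking a high quantile of the historical distribution rather than the planner’s own single-point estimate, which is exactly where the optimism bias lives.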
3. One approach to combatting the planning fallacy is to use the “inside view” and “outside view.” The inside view emphasizes how the planners think the project will unfold, while the outside view asks the planners to consider their own records in predicting performance. Forecasters are asked to reflect on their historical over-optimism or pessimism when making past project estimations to help them recalibrate their own present estimations. In other words, if a forecaster has been over-optimistic in the past, the forecaster should revise their estimations downwards and be less optimistic.
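The outside-view recalibration described above reduces to simple arithmetic: measure a forecaster’s historical bias as the ratio of actual outcomes to their past estimates, then scale the new estimate accordingly. A minimal sketch with hypothetical figures (a bias factor above 1.0 means the forecaster has historically been over-optimistic about durations):

```python
# Sketch of "outside view" recalibration (hypothetical figures).
# If a forecaster's past estimates ran optimistic, scale new
# duration estimates by their personal historical bias factor.

past_estimated_days = [90, 120, 60, 150]   # what the forecaster predicted
past_actual_days    = [110, 150, 75, 165]  # what actually happened

# Mean ratio of actual to estimated: > 1.0 means historically over-optimistic.
ratios = [a / e for a, e in zip(past_actual_days, past_estimated_days)]
bias_factor = sum(ratios) / len(ratios)

new_inside_view_estimate = 100  # days, the planner's gut feel for the new project
outside_view_estimate = new_inside_view_estimate * bias_factor

print(f"Historical bias factor: {bias_factor:.2f}")
print(f"Recalibrated estimate: {outside_view_estimate:.0f} days")
```

In practice the correction would rest on more than four data points, but the principle is the one in the text: let your own track record, not your confidence, set the final number.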
4. It’s important to create a workplace where people can share contrasting ideas in a group setting. Team members need to feel comfortable challenging other team members if they suspect overly optimistic or unrealistic estimations. Leadership coaching and purposeful 360-degree leadership feedback can be valuable in creating teams with high degrees of psychological safety, where people feel free to share their opinions. You can also use an online survey platform to gather estimations anonymously. Any gap between the anonymous and the face-to-face estimations should be revisited.
Christopher Robin supposedly said to Winnie the Pooh: “You are braver than you believe and smarter than you think.” He may have been right in Winnie’s case, but for most of us, we tend to think we are slightly more capable and in more control than we really are. In most areas of life, optimism is highly desirable, but when it comes to highly significant business decisions or safety choices, it can be beneficial to step back and ask if we are being too optimistic. There is a fine line between optimism (optimal reality) and over-optimism (oblivious to reality). DC