- It’s never what you think
Key example: “One example is the world of high-reliability organisations, a well-studied area looking at things like nuclear aircraft carriers, power plants, and air traffic control.
“These are highly complex and inherently dangerous systems, with constant activity and things that could go wrong, and yet they’re often among the safest places to work. They are among the lowest-risk environments.”
Lessons: “A lot of it comes down to culture. You manage risk at the point at which it occurs. A simple example: if you look at an aircraft coming in to land, the pilot is in charge of the aircraft… but there’s no question that if somebody down there, at the point at which the risk is occurring, calls a halt or wants to change something, they have the power to make that decision right there, and that’s a really key part of this.”
- Risk is not the effect of uncertainty on objectives
The problem: “I love ISO 31000 for its elegance and I really liked the definition of risk as the effect of uncertainty on objectives. It’s not perfect by any means, but I love that it includes a broad church… that includes positive and negative consequences.
“The problem is that outside ISO 31000 the rest of the world thinks risk is bad, risk is negative. Pick up any dictionary and look for risk and you’ll find it’s all about adverse outcomes.”
Talbot’s solution: “We need to think about how we define risk, and about a new lexicon.
“Perhaps we need to think of the medical approach where they use words like tension pneumothorax which only means one thing to a medical professional or an ambulance officer. For anybody working in that space it is immediately obvious what it is.”
- Risk management equals future management
Talbot says: “We are trying to shape the future and trying to create the future… So, for me, risk management is about future management.
“We know so little about our known universe that the potential for where we go with this discipline of risk management now is unlimited, and we need to be thinking about that.”
- Blind spots
Talbot says: “Part of my success in life and self-awareness has been this journey of constantly looking for my blind spots. My friends who know me well will see things about me that I don’t see; vice versa, I see things that seem obvious to me but are invisible to them. When we’re doing risk management it’s the same.”
Lesson for risk managers: “As part of a risk study you will be interviewing executives and frontline managers and all sorts of people to find out what’s going on in the organisation. Throw that question in there. Ask them: What are the blind spots? What is the boss not seeing? etc.”
- RIP Black swan events
Talbot says: “One of the blind spots we think about is Black Swan events: the concept of events that come out of nowhere, that are unanticipated, or that perhaps couldn’t be calculated by probability, but that have a large effect on history.
“My view is it’s 2020. We’ve had catastrophic bushfires in Australia and California, we’ve had locust plagues in East and Central Africa, we’ve had pestilence, we’ve had plague, we’ve got Covid-19. We’re facing what may be the biggest depression in recorded history. I don’t think we can afford to say anymore, ‘Oh, sorry boss, a Black Swan came out of nowhere.’”
Lesson for risk managers: “It’s our job as risk managers to make the black swans visible… Swans come in all colours and we either need to be predicting them or using our imagination to see them. It’s lazy risk management to say sorry that was a Black Swan, I didn’t see it coming boss, better luck next time. The public and the world deserve better.”
- Swiss cheese = causal chains
The Swiss cheese model: “One of the more interesting aspects is Swiss cheese as a model, with barriers modelled as slices of cheese; if all the holes line up, you get the scenario where risk manifests.
“But more interestingly, in every fatal or significant mishap that I’ve looked at, multiple things had to ‘go wrong’ or barriers had to fail, any one of which, if that link had been broken in that chain, would have reduced the magnitude.”
Example: Piper Alpha was a North Sea oil rig. In 1988, a series of failures caused first a major oil fire and then a catastrophic gas explosion. A dozen things had to go wrong before this manifested as a major risk, and 167 men died, a tragedy for their families and friends.
Lesson for risk managers: Putting this into a causal-chain model, with bow ties and a whole range of other tools, is a great way of looking at risk. Even if you use this concept to model near-miss events, a really thorough analysis is probably going to prevent major fatalities.
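The causal-chain idea can be made concrete with a little arithmetic. A minimal sketch, not from Talbot: if a mishap requires every barrier in the chain to fail, and we assume (simplistically) independent barriers with illustrative failure probabilities, the chance of the holes lining up is the product of those probabilities, and fixing any single link drives it to zero.

```python
from math import prod

def mishap_probability(barrier_failure_probs):
    """Swiss cheese model: a mishap occurs only if every barrier fails.
    Assumes, simplistically, that barrier failures are independent."""
    return prod(barrier_failure_probs)

# Twelve illustrative barriers (as in the Piper Alpha example),
# each failing 10% of the time. These numbers are made up.
barriers = [0.1] * 12

baseline = mishap_probability(barriers)          # 0.1 ** 12, vanishingly small

# "Breaking one link in the chain": make a single barrier reliable
# (failure probability 0) and the whole chain's probability collapses.
one_link_fixed = mishap_probability([0.0] + barriers[1:])   # exactly 0
```

The same structure is what a bow-tie diagram captures qualitatively: many independent preventive barriers on the left of the top event, so that no single failure is enough on its own.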
- Rock pools of risk
The concept: “The idea is that if your enterprise is a rock pool, there are high points where you can stand on a rock and you’ll be out of the water. There will be points where you’re walking through the sandy base and there’s a level of risk that’s up to your shins, and you’re happy with that. And then, as you walk through the rock pool, there are going to be bits of coral and hidden holes where you could turn an ankle or injure yourself severely.”
Applied for risk managers: “People often have this idea that risk management at an enterprise level is about documenting all the risks, putting in place a piece of software, and sending people out around the world to look at your facilities and all the exposures. That’s really playing whack-a-mole. You’re going to see death by a thousand cuts trying to treat enterprise risk that way.
“What you need to do is aggregate the risks in a way that you can say: OK, here’s my risk criteria, my risk appetite, my tolerance – and wherever my facilities are, or my people are operating, or whatever my projects are, I want them to have more or less the same risks.
“The idea is not to bring a truckload of sand to fill this rock pool up so that you’re not walking through any water, because that’s a wasteful use of resources. What you’re trying to do in this analogy is understand that topography.
“Then you can say: this is the high ground, it’s way safer than the rest of my financial portfolio, for example, so I might move some of that money or some of those resources, some of those people, by metaphorically rolling one of those rocks into a hole.”
- Thank you, Covid-19
Talbot says: “Covid-19 is tragic in terms of the deaths, the economic impact, the illness it’s causing and the stress for people… but it is preparing us for what we know is inevitably going to be a worse pandemic – a higher risk and a worse disease load.
“My key insight into this is anything we treat for a known event has got downstream consequences for a Black Swan, a Green Swan, etc. Things we don’t anticipate will always benefit from steps we take now, and sometimes it’s worth spending a little extra or thinking about the possible additional benefits of treating these risks.”
- Fooled by randomness
The problem: “We attribute success to our own skill when quite often it’s actually luck.”
Example: “When you put your money into a managed fund you try to pick one that’s been successful and has outperformed. But measured against the index for a stock market, 95% of managed funds fail to beat it.”
- Geniuses they may be
The problem: It’s tempting to believe that other people know more than you. Especially when they are very confident.
Case study: “Long-Term Capital Management was a hedge fund, where a series of geniuses, Nobel laureates and PhDs started an options hedge fund. They made 40% a year for five years, fantastic returns, money was piling in, everybody wanted to be part of this.
“After year five the Russians defaulted on the Rouble and Long-Term Capital Management almost brought down the entire financial market.”