Risk Management by Peter Parkes
Step 2: Assessing risk
And the day came when the risk to remain in a tight bud was more painful than the risk it took to blossom.
We do not have sufficient resources to deal with all possible risks, so we need a way of deciding which ones to deal with and which ones to accept.
In our log of the risks that we have identified, usually referred to as our risk register, it is sensible to include additional columns to describe features about the risks, including the following:
- A description (as the context will probably be lost after the initial risk assessment discussion or workshop)
- A guess as to what may be the cause of the risk – in other words, the root cause
- Some estimate of what the consequence of the risk may be
- An estimate of the likelihood of the risk occurring.
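The columns above can be sketched as a simple data structure. This is only an illustrative sketch: the field names and the sample entry are assumptions, not taken from the text.

```python
from dataclasses import dataclass

# A minimal risk register entry, with one illustrative field per column
# listed above. Field names are assumed, not prescribed by the text.
@dataclass
class Risk:
    risk_id: str       # unique ID, used for cross-referencing
    description: str   # context, so it is not lost after the workshop
    root_cause: str    # best guess at what could cause the risk
    consequence: str   # estimate of the impact if the risk occurs
    likelihood: str    # e.g. "high", "medium" or "low"

# An example entry (invented for illustration).
register = [
    Risk("R1", "Key supplier fails to deliver on time",
         "Single-source dependency", "Two-week schedule slip", "medium"),
]
```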
The most important aspect to deal with here is probably our assessment of the root cause. Our estimates of likelihood and consequence have little effect on whether the risk occurs, but our understanding of what may cause it to happen determines how we choose to use our resources to deal with it.
With regard to our assessment of likelihood and impact, most texts refer to qualitative methods and quantitative methods.
As the term qualitative implies, here we make a straightforward assessment of the risk, generally using simple bands – high, medium and low. Sometimes five bands are used, including very high and very low.
The various combinations of likelihood and impact can be displayed in what is sometimes called a probability impact grid.
The unique IDs (reference numbers from the risk register) of the risks are entered into the relevant boxes to give us cross-references back to the risk register and details within it and, most importantly, to indicate whether immediate action is required.
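The grid described above can be sketched as a lookup from (likelihood, impact) bands to the risk IDs that fall in each cell. The bands and the sample risks here are illustrative assumptions.

```python
# A probability impact grid as a mapping from (likelihood, impact)
# bands to the risk IDs in that cell. Bands and risks are invented
# examples, not taken from the text.
BANDS = ["low", "medium", "high"]

# Each risk ID mapped to its assessed (likelihood, impact) bands.
risks = {
    "R1": ("medium", "high"),
    "R2": ("high", "low"),
    "R3": ("high", "high"),
}

# Build an empty grid, then file each risk into its cell.
grid = {(p, i): [] for p in BANDS for i in BANDS}
for risk_id, (likelihood, impact) in risks.items():
    grid[(likelihood, impact)].append(risk_id)

# Risks landing in the high-likelihood, high-impact cell are the ones
# flagged for immediate action.
urgent = grid[("high", "high")]
```

The cell lookups then serve as the cross-references back to the full detail held in the risk register.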
Even with qualitative assessment, you will gain a much more rounded understanding of the map of risks if you involve others in the process, both to capture risks that one individual may not have thought of and to moderate your own estimates of likelihood and impact.
For quantitative assessment, we expect to have a better understanding and estimate of our risks. Sometimes, however, this is just an illusion. As in any field, you need to avoid attributing precision to a poorly defined number.
Quantitative assessment is about building and using the knowledge and data that the organisation has acquired over time. For example, if you run a factory, you could establish from your records how many times particular machines have broken down over a period of time, and hence extrapolate a likelihood that they will break down in a future time period – for example, during an important production run. Using this information, you may be able to justify replacing those machines, due to the likely cost of breakdown.
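The factory example can be sketched numerically. The figures below are illustrative assumptions, and the calculation assumes breakdowns arrive independently at a constant rate, so a simple Poisson model gives the chance of at least one failure during the production run.

```python
import math

# Illustrative figures, assumed for the sketch: what the maintenance
# records might show, and how long the important run lasts.
breakdowns_observed = 6     # breakdowns recorded over the period
hours_observed = 2000.0     # machine hours covered by those records
run_hours = 120.0           # length of the important production run

# Extrapolate the historical rate to the future run.
rate = breakdowns_observed / hours_observed   # failures per machine hour
expected_failures = rate * run_hours

# Assuming a Poisson process, the chance of at least one breakdown
# during the run is 1 minus the chance of none.
p_at_least_one = 1 - math.exp(-expected_failures)
```

A likelihood of roughly 30 per cent of losing the run, multiplied by the cost of a spoiled run, is the kind of number that can justify replacing the machine.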
Tables of such data exist and are used to support complex business cases. For example, using the statistics for accidents on lit motorways versus those on unlit motorways, we can build a safety case for lighting some stretches of new motorways where there is heavy traffic. The fact that lighting significantly reduces accidents, but is only used on busy sections of motorway, indicates that our government has chosen to accept some risks/accidents/fatalities on a cost versus benefit calculation.
Many industries, such as the construction, oil, gas and nuclear industries, have extensive risk databases and use an industry standard ‘cost of life’ to determine when mitigation is cost effective.
The basis of scenario planning is imagining a number of possible futures that would be created if a specific variable changes, for example, if the price of oil were to dramatically rise, or fall. If the price of oil is critical to your business and there is uncertainty about that price in the future, what risk does this create in the different possible futures or scenarios?
Once a particular scenario materialises, the whole risk register changes completely.
Risks with very high probability but low impact are best built into the baseline estimate for a task (so we know to allow ten minutes extra for the bus, for example, because buses are frequently late).
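The bus example amounts to folding the expected delay into the baseline estimate rather than tracking it as a risk. The figures below are illustrative assumptions.

```python
# Building a high-probability, low-impact risk into the baseline.
# Illustrative assumption: the bus is late 80% of the time, by
# about 10 minutes when it is.
p_late = 0.8          # how often the bus is late
delay_minutes = 10.0  # typical delay when it is late

# Expected delay, added straight into the task estimate.
baseline_allowance = p_late * delay_minutes
```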
Risks with very low probability but very high impact are difficult to treat. For example, an individual aeroplane accident or a nuclear accident is most unlikely. However, if such an accident were to happen – as it eventually will if the sample is sufficiently large – then the consequences will be so disastrous as to oblige drastic measures to be taken in preparation.
On a more upbeat note, high impact can make a low probability risk worthwhile: oil exploration requires companies to drill very expensive holes, knowing that the chance of finding oil from any particular test drill is almost zero. But they also know that if they drill enough test holes they will eventually find oil. The quality of the oil field, however, may be low, and conditions difficult, so that it is only worth extracting the oil if market conditions move in a certain direction – say, if the market stays above $100/barrel. They refer to the risk management – or opportunity management – around these activities as Scenario Planning.
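The exploration logic above follows from elementary probability: even if each drill is very unlikely to strike oil, enough independent drills make at least one strike likely. The per-drill probability here is an illustrative assumption.

```python
# Chance of at least one strike across many independent test drills.
# The per-drill success probability is an assumed, illustrative figure.
p_strike = 0.02   # chance a single test drill finds oil
n_drills = 100    # number of test drills in the programme

# P(at least one strike) = 1 - P(every drill misses)
p_at_least_one = 1 - (1 - p_strike) ** n_drills
```

With these figures the programme as a whole succeeds with probability around 0.87, even though each individual drill almost certainly fails, which is why the portfolio of drills can be worth the cost.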
Perception of risk
It should be recognised that humans are creatures of habit: what is familiar to us we perceive as acceptable and less risky, whereas what is strange to us we often perceive as more risky. We also tend to distort the true distribution of risk, believing that rare events happen more often than they do, and that frequent events happen less often than they do.
The public worries about the risk from nuclear waste, and fortunes are spent on informing the public that they have more risk of exposure to radiation from a plane journey or the sun. Similarly, we ignore the risks of driving to the airport, with many hundreds of road casualties each year, to worry about the prospect of an aeroplane crash, which has minimal risk (for most carriers).
The annual reports from bodies such as the Royal Society for the Prevention of Accidents (RoSPA) make interesting reading. Do you know how many people died from accidents involving table mats last year? A clue: more than deaths from the nuclear industry and airlines in the UK combined.
The moral of the story: don’t look for exotic risks. You may be blinded by familiarity to the risks most likely to affect you. Again, the accident book at the nuclear facility at Sellafield is full of the usual array of paper cuts, tripping over open drawers and falling off ladders, rather than people injured by radiation from nuclear leaks.
If you think that risk management is of particular importance to your operation or project, get some professional help to at least run a risk workshop (see Organising a risk workshop).