Measuring quality

Why bother? If you don’t, then how will you know whether your hoped-for improvement has made a difference? Like navigating at sea, you need to know where you are to start with, where you are aiming for and where you are on the journey. Sounds simple? If it really is so simple, why do so many people see management information as more of a smokescreen than a reflection of what’s happening? The short answer is that the wrong things are being measured, at the wrong level of detail, at the wrong intervals and/or for the wrong purpose. So people drift off course (and waste a lot of time and money on the measurement process).

Whatever you measure, people’s perception of how you will use it will affect the data you get. As with most things, capability and motivation are the vital ingredients for success.

The auditor’s tale

The Head of Internal Audit wanted to know what proportion of audit time each constituent task typically took up, so she set up a time recording system to show that. She told her staff that this was her purpose, but also that she expected the total time on their work sheets to equal their conditioned hours. As a result, people thought that in reality she wanted to check up on their overall working hours, and took to working backwards from their total time. By a miracle, their timesheets always added up to 40 hours! She never did find out how much time was spent on drafting audit specifications.

Measure the ‘vital few’ things

If someone asks you ‘What are your Hoshins?’ you might be tempted to reply ‘Bless you’, but all they mean is ‘What are your critical success factors?’ Hoshin is a Japanese word (as are so many quality expressions) for a method of setting strategic direction. Some of the more statistically-based quality models sound as if they advocate measuring anything that moves, but the underlying idea is to get a sound basis for evaluating progress, rather than mere subjective opinion. That means different things at different levels in the organisation.

Example

There is an amazing drawing of St Pancras railway station in London, in every gothic detail, drawn from memory by an autistic savant. It is a wonderful piece of virtuosity, which draws you magically into examining it with a magnifying glass to enjoy the minute detail. Then at the other end of the scale, there’s the slapdash or minimalist approach where you can hardly even guess at the subject of the painting.

The individual detail of St Pancras is very useful indeed when examining operational tasks. For example, reducing how much a barista has to stretch and move when preparing coffee may increase customer satisfaction by saving time and providing a hotter cup of coffee.

Customer satisfaction levels are strategically important, but if you total up all that level of detail, the board gets a picture of statistical virtuosity rather than anything they can base decisions on. Wouldn’t the necessary minimum approach be more useful for their purpose? So measure the few things that are vital at your level, and are vital from a customer viewpoint! Keep it simple, and remember that the devil is in the detail.

Be careful what you measure

Action

What you measure is what you get.

Measuring an improvement signals that you take it seriously. Measurement also tends to affect how people behave. Think of the imaginative ways that hospitals found to move up the waiting list league tables. What if that imagination had been applied to actually making a difference to patient care? Was it the measures, or the way they were used, that produced the behaviour that adversely affected public opinion?

Either way, measures are powerful influences on behaviour, and so need careful design.

Interpreting results

Question the results and evaluate the measures. Measures and indicators give questions not answers, so interpret them with care. They may be telling you something other than the obvious. So before you introduce a measure, think about what it will really tell you (or not), and after a trial period, ask yourself – is it telling you what you thought it would? Can you do anything useful with the information anyway?

How do you measure?

Most organisations have some form of performance measurement. Such systems have tended to concentrate on how things are done – measuring inputs. The current trend is to measure the effect of what is done – the process and the outputs, or enablers and results. Although some quality gurus have argued otherwise, many people agree that real, sustained improvement comes from competent, motivated people finding their own ‘best way of working’ to produce optimum results. Many factors influence that outcome, such as company culture, management and peer behaviour, and HR and financial systems. So it makes sense for any measurement of performance to take several of those factors into account, perhaps along the lines of a balanced scorecard.

Also see the topic on Performance Management.