Saturday 20 April 2013

Why iron control leads to hospital mistakes

When the management theorist Naresh Khatri went to live in the USA, having been brought up in India and Singapore, one thing particularly intrigued him about the American health system: why did everyone talk about how much it was changing when, as far as he could see, it wasn’t changing at all?

Khatri had been in the USA before as a student, and – to him at least, despite all the rhetoric and cacophony of change – nothing actually seemed to be any different. Certainly there were changes of regime. There were the new health maintenance organisations and the rationalisation of diagnoses and symptoms. But the basic feel of surgeries and doctors was much as it always had been.

There were still hugely expensive lawsuits, the same vast insurance bills, the same huge hospital corporations and the same mistakes. Nearly 100,000 patients died in the USA every year because of mistakes, more than died in car accidents or from breast cancer (in England and Wales, the equivalent figure is contested, but it is variously put at 800 or 34,000 a year).

Khatri had no inside experience of healthcare at that stage. He had worked at the Federal Bank of India for five years. But like any good academic researcher, he set out to find out why there was this mismatch between his perception and everyone else’s. 

He began to organise seminars on healthcare among his fellow management academics at the University of Missouri. He tested his ideas against similar research in other countries, and the conclusion he came to was that the hierarchical management style, the blame culture and the obsession with top-down IT systems had just carried on regardless. If the new systems, which caused so much argument, were really more efficient, you might expect mistakes to be going down too. Actually they stayed much the same.

 Health reformers were even using the burgeoning cost of hospital mistakes as a reason for standardising and controlling even more, but there was little evidence that they were right.  More targets, more numbers, more control, less change.

Khatri began to suspect that the highly controlled culture of compliance and blame was actually why the mistakes were happening in the first place. So he and his team designed an experiment to categorise 16 hospitals in Missouri by management style, to see whether it had any effect on hospital mistakes. They also built in a whole series of checks to make sure they were thinking along the right lines, including surveys of over a thousand health providers across the USA.

The results were peculiar: the expected link between medical mistakes and a culture of blame wasn't there in the sample. But what was clear was that there were fewer mistakes when the medical staff trusted and felt good about each other, and more drug-related errors in the hospital cultures that exerted the most detailed control.

“The current bias towards innovative technological solutions over those that require the transformation of current dysfunctional culture, management systems, and work processes in healthcare must be corrected if medical errors and quality of patient care are to be taken seriously,” he wrote.  More about Khatri and his research in my book The Human Element.

This confirmed that control is less effective than letting staff use their human skills, but it didn't explain why that might be. Other research going on at the same time suggested a reason: another study found that up to 80 per cent of mistakes in American hospitals had less to do with technical problems than with the personal interactions inside the healthcare teams.

Working in a blame culture forces staff to protect themselves, even if it is just against reams of paperwork. They put more effort into shifting blame than into genuinely discussing mistakes, or what are called in the jargon 'adverse events'. In other words, it is their relationships with each other – and with their managers of course – that make the difference. This isn't about face-to-face relationships with patients; it is about face-to-face working relationships.

“We tend to think that, without regulation, people will do stupid things,” says Khatri. “But actually, they don’t. And when you exercise that kind of control, then you are not using people’s ideas fully.”

I thought of Khatri's research when I read the government's response to the Mid-Staffs crisis, and in particular John Seddon's systems thinking response yesterday. More control is not a solution. More targets mean more effort going into meeting the targets and regulations, and less into patient care. It is precisely the opposite of what needs to happen.

Increasing the penalties for people who massage the target figures is the logical next step for iron control, but, as Seddon says, it is impossible to stamp the practice out. To make things work, frontline staff always massage the figures. They have to.

Whitehall has still not learned from the failure of control during the New Labour years. That kind of control is staggeringly expensive, largely because it shifts resources and imagination into meeting the regulations rather than into doing the work effectively. In fact, this gap in learning is, for me, the central misjudgement by the coalition - and I don't really understand it.

Why do people who are so determined to set the economy free - so that people's entrepreneurial skills can be used - not realise that the same applies to public services?
