I’ve recently had the privilege of two hours with Sir Brian Jarman: GP, former President of the BMA, statistician, geophysicist, senior IHI Fellow, founding member of Dr Foster and, yes, architect of the HSMR – the Hospital Standardised Mortality Ratio.
His story is fascinating – a selfless commitment to improvement, transparency and the development of balanced data.
The HSMR measures risk-adjusted in-hospital mortality, while its sister measure, the Summary Hospital-level Mortality Indicator (SHMI), captures deaths in hospital or within 30 days of discharge. SHMI excludes patients coded as palliative, so discharging a terminal patient to die at home doesn’t count.
We’ve all heard criticism of the HSMR. Indeed, it would appear at times that only patients are interested in a 25% increased probability of dying in Hospital A versus Hospital B – but then, why wouldn’t they be? After all, they are the patients, and death is something of an all-or-none event.
Sir Brian initially developed the HSMR to see whether it could be used in the formula for redistributing resources to hospitals. The square root of an area’s SMR was used to weight the allocation to that geographical area: for example, an SMR of 1.21 gives a square root of 1.1, so the area gets 10% more on its adjusted allocation.
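The square-root weighting above is simple enough to sketch in a few lines of Python. This is purely illustrative – the function name is mine, and the real resource-allocation formula contained many other factors besides the SMR:

```python
import math

def allocation_multiplier(smr: float) -> float:
    """Return the allocation weighting for an area, given its SMR.

    Hypothetical helper illustrating the square-root weighting
    described in the text; the actual allocation formula was
    considerably more involved.
    """
    return math.sqrt(smr)

# The worked example from the text: an SMR of 1.21 gives a
# square root of 1.1, i.e. 10% more on the adjusted allocation.
print(allocation_multiplier(1.21))  # → 1.1
```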
When it comes to quality assurance, the first response as clinicians when faced with comparative data is to claim that our patients are actually the sickest, or that the data is invalid. In the Bristol inquiry a higher proportion of Down’s syndrome babies was offered as such an excuse, only for the research to prove the opposite. Multivariate analysis with regression modelling is not manipulation of the data; it is adjustment to a meaningful scale. The simplest example of adjustment is a ratio. If hospitals A and B both have 50 deaths a month, but A has 500 admissions and B has 625, we can all work out which is the ‘safer’ of the two. Similar principles, with more complex methods, adjust for multiple parameters in turn, thus putting the variables back in context. These variables include:
- Age on admission (in five-year bands up to 90+)
- Admission method (emergency or elective)
- Socio-economic deprivation quintile of area of residence of patient
- Financial year of discharge
- Primary diagnosis
In fact the HSMR is the ratio of the actual number of acute in-hospital deaths to the expected number of in-hospital deaths, for the conditions accounting for 80% of inpatient mortality. In England the baseline is reset to 100 each year.
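A rough sketch of the arithmetic may help. The crude-rate comparison uses the figures from the text; the expected-deaths figure in the HSMR example is a made-up illustration, since in reality it comes from the full case-mix model (age band, admission method, deprivation quintile, diagnosis and so on):

```python
def crude_mortality(deaths: int, admissions: int) -> float:
    """Crude in-hospital mortality rate: deaths per admission."""
    return deaths / admissions

def hsmr(observed_deaths: int, expected_deaths: float) -> float:
    """Hospital Standardised Mortality Ratio, indexed to 100.

    expected_deaths is the output of the case-mix adjustment
    model; here it is simply supplied as an input.
    """
    return 100 * observed_deaths / expected_deaths

# The crude comparison from the text: 50 deaths each,
# but different numbers of admissions.
print(crude_mortality(50, 500))  # hospital A: 0.1
print(crude_mortality(50, 625))  # hospital B: 0.08

# Hypothetical: if the case-mix model expects 40 deaths at
# hospital A and 50 are observed, the HSMR is 125.
print(hsmr(50, 40))  # → 125.0
```

The point of the standardisation is visible even in this toy version: the raw death counts are identical, yet the denominators (and, in the real model, the case mix) change the conclusion entirely.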
For the full monty, here is an excellent summary from the man himself.
The dark reality is that greater attention to the HSMR could have prevented the need for the Mid Staffs and Bristol inquiries; in both cases the HSMR had been climbing steadily for the preceding decade. The data was not only ignored – it was claimed, by David Nicholson among others, that it was not available. Yet organisations in the West Midlands SHA logged on to their mortality alerts 69,000 times during the period covered by the Mid Staffs inquiry (January 2005 to March 2009).
Revised HSMR for Mid-Staffs 2003-2008
In fact in the name of transparency Dr Foster is now putting more and more data in the public domain. Go on, check out your local.
Still, understanding the data and nuances is crucial for sound judgement. So what does Jarman say when challenged about the use of HSMR? He’s pretty clear:
1. HSMR is a trigger, not the end of the story
2. If someone can improve on the clinical and statistical aspects, he welcomes their input. Unlike those of other companies, the method is open.
3. Use in league tables has never been suggested, but use for individual improvement has.
4. HSMR should not be used to generate figures on ‘preventable deaths’ or ‘avoidable mortality’. This use in the media has likely contributed to its criticism.
However, far more dangerous than a statistic that carries a 0.1% probability of being wrong is the attitude of explaining away difficult data, hiding it or denying it. The quality triangle includes quality planning, assurance and, most importantly, improvement. We’ve been infected with quality denial, but we have a real opportunity for treatment and cure.
Becoming an improvement organisation, which the NHS needs to be, requires a commitment to understanding where we are. If we don’t measure, we can’t tell whether we have improved. In addition, a cyclical system of learning is required as the engine for change.
I’ve heard people say “there will be another Mid Staffs, you can’t stop it happening”. True, if things stay the same. There’s another way though.
Although Bristol Infirmary had twice been given a clean bill of health by the Royal College of Surgeons before the reality was uncovered, real changes happened after the inquiry. It wasn’t a wonder cure but simple process improvements and getting the basics right. Resource allocation was addressed (for instance, having 3 paediatric cardiac surgeons per million versus 40 per million in Northern Europe). Re-training occurred, and system flaws (such as having the recovery area on a different level from the theatre) were dealt with.
The mortality rate for paediatric cardiac surgery fell from 24% to 10%, then to 3%. Cause for celebration? Those involved should be congratulated.
Let’s not wait for an inspection, nor concentrate our efforts on the coding department, but take the stats on the chin and be resolute about making our health system safer.