Thursday, 16 January 2014

Health statistics - can they be bad for your health?

How health statistics can misinform

We hear of and see many headlines concerning health issues, and most of them are both bad news and very dramatic.
They can be very misleading and can exaggerate what they intend to show. The method of exaggeration almost always involves percentages, rather than what are usually called raw data: ordinary numbers.


Example 1:

About 800,000 people die each year in the UK.


Let us suppose that 8 of these people die by being stung by a bee or a wasp. This would be an annual death rate of 1 in 100,000.

Let us suppose that in the following year 16 people die from the same cause. This is a death rate of 2 in 100,000.

How is it best to describe this increase in death rate?

First and most simple, there is an increase of 8 per year in the UK. 

But we need to generalise to allow comparisons between other countries, comparisons within different parts of the UK, and comparisons with other years.

And so we describe the increase as 1 in 100,000 per year, from 1 to 2 per 100,000 in this example. We can also express it as an absolute increase of 0.001% of population.

These expressions of increase would not create headlines or induce panic among the population. In order to achieve this, the increase must be presented in a much more dramatic way.

Increasing from 8 to 16, or even from 1 to 2, is doubling. This sounds quite impressive.

“Twice as many people died from bee and wasp stings this year compared to last year”. Now we are moving into headline news. It is starting to sound dangerous.

But is it twice a big number or twice a little number? This is important in translating risk or benefit to an individual or to a community. Twice something small, like a cent or a penny, is not worth worrying about. Twice something big like £100 or $100 is worth attention!

When a number doubles, the change can be expressed as a 100% increase. 100% of one is one, added to one gives two, a 100% increase.
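The two ways of expressing the same change can be checked with a few lines of Python (a sketch using the hypothetical bee-sting numbers from the text):

```python
# Hypothetical numbers from the bee-sting example above.
deaths_total = 800_000        # approximate annual deaths in the UK
bee_deaths_last_year = 8
bee_deaths_this_year = 16

change = bee_deaths_this_year - bee_deaths_last_year

# Relative increase: the change as a percentage of last year's figure.
relative_increase = change / bee_deaths_last_year * 100    # 100% ("doubled")

# Absolute increase: the same change as a percentage of all deaths.
absolute_increase = change / deaths_total * 100            # 0.001% (1 in 100,000)

print(f"relative: {relative_increase:.0f}%, absolute: {absolute_increase:.3f}%")
```

The same 8 extra deaths produce a headline-sized 100% and a reassuring 0.001%, which is the whole point of asking "100% of what?".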

Once again the question must be asked (but rarely is): 100% of what? Something big or something small?

In this example what do you think is the most useful and meaningful way of presenting the increase:
100% increase or 0.001% increase (1 in 100,000)?
You would probably agree with me that the latter expression is the more useful, giving a reasonable assessment of risk.

Example 2:

We are informed that if you are admitted to a UK hospital on a Saturday or Sunday, you will have a 16% increased risk of death during the following 30 days compared with admission on a weekday.

Dramatic and worrying. Big headlines in the national press. Questions in Parliament. “Something must be done!” “Consultants must spend more time in the hospitals at the weekends.” 

Olfactarithmetic (a term from a recent New Scientist) - does the data “smell” right, or does it smell fishy? Is it really true (as the politicians and newspapers immediately assumed) that for every 100 deaths following hospital admission on Monday to Friday, there are 116 deaths for those admitted on Saturdays and Sundays?

It is indeed dramatic but it smells fishy. There appears to be a major mortality effect and it should be obvious. The important but missing information is: how many deaths are we talking about? What are the numbers?

The study that led to this identification of 16% was based on approximately 14,217,640 (14.2 million) people admitted to hospital in the UK between April 1st 2009 and March 31st 2010. By 30 days after hospital admission there had been 187,337 in-patient deaths. The proportion dying in hospital was therefore:

187,337 / 14,217,640 = 1/75.9 = 0.013 = 1.3%

Now we can apply the 16% because now we know the answer to “16% of what?” :  it is 1.3%.

16% of 1.3 = 0.21

So in reality the chance of dying in hospital within 30 days of admission is only 0.21 percentage points greater if admitted at a weekend than on a weekday.
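The steps above can be reproduced in a short Python sketch (numbers taken from the study as quoted in the text; small differences from the rounded figures are rounding only):

```python
admissions = 14_217_640       # hospital admissions, April 2009 - March 2010
inpatient_deaths = 187_337    # in-patient deaths within 30 days of admission

baseline_pct = inpatient_deaths / admissions * 100   # ~1.3% of admissions die
excess_pct_points = baseline_pct * 0.16              # 16% of 1.3% ~ 0.21 points

print(f"baseline: {baseline_pct:.2f}%, weekend excess: {excess_pct_points:.2f} points")
```

The 16% only becomes meaningful once it is applied to the 1.3% baseline.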

The study also reports on 30-day mortality whether dying in hospital or following discharge home. There is now a total of 284,852 deaths. 

284,852 / 14,217,640 = 1/49.9 = 0.02 = 2% of those admitted died.

The excess for those admitted at the weekend is again 16% of this, which means a death rate 0.32 percentage points greater than for weekday admissions.

Another way of looking at this is that a weekday admission gives a 98% chance of surviving 30 days, compared with 97.68% following a weekend admission. Hardly a dramatic difference.
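The survival comparison works out the same way in Python (again a sketch with the text's figures; the rounded 98% and 97.68% come out slightly differently at full precision):

```python
admissions = 14_217_640
all_deaths_30d = 284_852      # deaths in hospital or after discharge, within 30 days

weekday_death_pct = all_deaths_30d / admissions * 100   # ~2.0%
weekend_death_pct = weekday_death_pct * 1.16            # 16% higher, ~2.32%

weekday_survival = 100 - weekday_death_pct              # ~98.0%
weekend_survival = 100 - weekend_death_pct              # ~97.68%

print(f"survival: weekday {weekday_survival:.2f}%, weekend {weekend_survival:.2f}%")
```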

This is clearly not going to make newspaper headlines or lead to a parliamentary debate.

But what is the best way of expressing this increased risk? Is it a 16% increase or a 0.32% increase? I suggest the latter, but then I have no interest in headlines. The absolute increase, 0.32%, is the most realistic and conveys the most useful message to the public.

(I thank Dr Steve McCabe from Portree, Skye, for drawing my attention to this) 

Data source:
Journal of the Royal Society of Medicine. 2012 February; 105: 74–84.
Weekend hospitalization and additional risk of death: An analysis of inpatient data.
N Freemantle, et al.


What is the benefit of statins?

We have seen this in an earlier post but it is another example of how statistics can, in this case, exaggerate benefit.

A clinical trial of statin therapy was conducted in the West of Scotland and published in 1994 (it was called WOSCOPS). It studied men aged between 55 and 65 with high cholesterol levels. They had the world’s highest incidence of death from coronary heart disease (CHD, heart attack). Half were given a statin every day for 5 years and the other half acted as controls, being given a placebo (dummy) tablet.

The five-year death rate in the controls (untreated) was 4.2%, and in the statin-treated group it was 3.1%.
This looks like a death rate reduction of 1.1% (i.e. 4.2 - 3.1).
This looks rather undramatic, but it means that 100 at-risk men must take a statin tablet every day for 5 years to prevent (delay) one death. Not very good.

How can we amplify the effect and make it look better?

1.1 is approximately one quarter of 4.2, in other words about 25%.

So the advertising of statins tells us that they reduce death rate from CHD by 25%.

This is much more impressive and leads to big sales. But it still means that 100 high-risk men must take a statin for 5 years to delay one death.
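The same arithmetic can be sketched in Python; absolute risk reduction (ARR), relative risk reduction (RRR) and number needed to treat (NNT) are the standard names for the three quantities the text is juggling:

```python
control_deaths_pct = 4.2      # five-year death rate, placebo group (from the text)
statin_deaths_pct = 3.1       # five-year death rate, statin group (from the text)

arr = control_deaths_pct - statin_deaths_pct   # absolute risk reduction, 1.1 points
rrr = arr / control_deaths_pct * 100           # relative risk reduction, ~26% ("25%")
nnt = 100 / arr                                # number needed to treat, ~91 ("100")

print(f"ARR: {arr:.1f} points, RRR: {rrr:.0f}%, NNT: {nnt:.0f}")
```

The advertising quotes the RRR; the ARR and NNT are what tell an individual patient how much the tablet is likely to do for them.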

There remains the question: 25% of what?

It is about 20 years since the study was performed. Since then the death rate from coronary heart disease has gone down by a factor of 40, from 800 to 20 per 100,000 per year. Therefore it will now require 4,000 men (40 x 100) to take a statin for five years to delay one death. This, however, is kept quiet; the information has been hidden.
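The scaling argument can be made explicit: if the background death rate falls by a factor of 40 while the relative reduction stays at 25%, the number needed to treat rises by the same factor (a sketch using the text's round numbers):

```python
nnt_in_trial = 100        # men treated for 5 years per death delayed (text's rounding)
rate_then = 800           # CHD deaths per 100,000 per year, around 1994 (from the text)
rate_now = 20             # CHD deaths per 100,000 per year, around 2014 (from the text)

fall_factor = rate_then / rate_now      # 40
nnt_now = nnt_in_trial * fall_factor    # 4,000 men for five years per death delayed

print(f"NNT now: {nnt_now:.0f}")
```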

We are still told of the 25% reduction of deaths, but not that it is now 25% of something only one fortieth of what it was 20 years ago. Is it worthwhile?

Beware statistics when expressed as percentages. 
Remember to use olfactarithmetic - does it smell right?


