Monday, 23 June 2014

There is more to the sun than vitamin D

A conference on the subject of Vitamin D took place in London on May 23-25 2014. It was a big conference with presenters and delegates from all over the world.

Several of the presentations were of particular importance.

Observational studies have consistently shown that people with high blood levels of vitamin D have many health advantages. A clear example of this from the UK is shown in Figure 1.



We can see in the first group that the “all-causes” death rate is almost twice as high in those with the lowest blood levels of vitamin D as in those with the highest levels. Death rates from cardio-vascular disease, cancers and infectious diseases are also much higher in those with the lowest levels of vitamin D.

If we are interested in good health and long life, we would want to be in the groups with the highest blood levels of vitamin D, greater than 30 ng/ml (75 nmol/L). It would appear that the sensible way forward would be to give vitamin D supplements by mouth so as to improve blood levels. This has been done on several occasions, but the results have been disappointing. This could be for a number of reasons.

First the dose might have been inadequate to achieve a satisfactory blood level, but this has been checked in recent studies. One piece of information that came out of the conference is that there does not appear to be an advantage in achieving blood levels of vitamin D above about 40 ng/ml (100 nmol/L).

Second it might be necessary to start with a high blood level of vitamin D early in life, but it would take many decades for this to be investigated further.

Third, it might be necessary to give the supplement for longer than has been the case in the trials. Time will tell as more publications appear.

The fourth explanation, and the most interesting, is that the blood level of vitamin D is a measure of the exposure of an individual to the sun. This suggests that vitamin D, being an index of exposure to the sun, might not be a substitute for the sun itself, which is of fundamental importance to human health (and to that of all animals).



The action of the sun on the skin converts 7-dehydrocholesterol (7-DHC) into vitamin D (cholecalciferol), which is then activated in the liver to 25-hydroxy vitamin D (calcidiol).

A study from Edinburgh described in an earlier Post (Nitric Oxide) showed that exposure to ultraviolet light reduced the blood pressure of the subjects tested. However there was no change in the blood level of vitamin D, indicating that the beneficial effect of ultraviolet light on the blood pressure must have been mediated by something other than vitamin D. The sun acting on the skin releases nitric oxide from nitrates in the skin, and it is this that has a beneficial effect on the circulation.



If people are deficient in vitamin D then a supplement by mouth will be of some benefit. It will heal the bone disease osteomalacia and there might be other benefits. However it would appear that vitamin D taken by mouth does not have the full range of benefits of the sun.

It is of course possible that the beneficial effects of the sun on the skin include not only vitamin D and nitric oxide but also other actions that have not yet been recognised.





Wednesday, 7 May 2014

Is Obesity Bad for You ?


The generally held view is that we are witnessing an epidemic of obesity which will lead to countless deaths from diabetes and heart disease. This is one of many health warnings that we hear, all of which deliver dire predictions of impending doom. Slimness is all-important. The overweight are made to feel guilty. It is claimed that the obese are, or shortly will be, the majority of the population.

Obesity is usually measured by the BMI (Body Mass Index), which is body weight (in kilograms) divided by the square of height (in metres). A tall person could have the same weight as a short person but would have a smaller BMI, and would need a greater weight before being labelled as overweight or obese.

A person of height 1.8m is said to be underweight if less than 60kg, normal if 60-80kg, overweight if 80-100kg, and obese if greater than 100kg. The weakness of BMI is that it reflects not just the fat content of the body but also its protein (muscle) content, and so it is not a reliable indicator of obesity. The definitions of normal and obese have been redefined to make more people “obese”.
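For anyone who wishes to check the arithmetic, here is a minimal sketch of the BMI calculation, assuming the standard category thresholds of 18.5, 25 and 30 (these thresholds are an assumption here; the 60/80/100 kg cut-offs quoted above for a person of 1.8 m are approximately these thresholds rounded).

```python
# A minimal sketch of BMI and the usual categories, assuming the standard
# thresholds of 18.5, 25 and 30 (an assumption, not figures from this post).

def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight in kg divided by the square of height in metres."""
    return weight_kg / height_m ** 2

def classify(b: float) -> str:
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"
    return "obese"

# For a person of height 1.8 m these example weights fall into the four categories.
for weight in (58, 65, 85, 98):
    b = bmi(weight, 1.8)
    print(f"1.8 m, {weight} kg -> BMI {b:.1f} ({classify(b)})")
```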

Figure 1


The forecast epidemic of obesity is not happening: the proportion of the population said to be obese has not changed during the past ten years and is at present about 35 per cent. However, the prevalence is not distributed evenly across society.

Obesity in the UK is much more common in the poor, the socio-economically disadvantaged, than in the wealthy, and this is especially true in women. The reason is not known but it is somehow linked to other diseases. It is not clear whether it is behavioural, the result of faulty eating or lack of exercise. It could be biological, perhaps linked to the bacteria living within the intestine, which are related to and possibly causative of obesity. It could be something to do with low levels of vitamin D reflecting lower exposure to the sun.

Figure 2

We can see in this UK study from 1976 (Department of Health and the Medical Research Council) that in all groups obesity was more common in the "lower social groups", what we would today call, rather cumbersomely, the "socio-economically disadvantaged".

It is said that the obese will die from heart disease and heart failure. However this has not been the experience during the past twenty years. The survival of people with heart failure is surprisingly better in the obese than in the thin. This has been demonstrated in Sweden, in people following a heart attack. In the USA it has been shown that the long-term survival of normal-weight people with type 2 diabetes was only half of that of overweight or obese people with type 2 diabetes. 

It is the very underweight who have the worst survival but this appears only in older people. Young people have such a low death rate that differences will not be apparent. 

Figure 3  Relative death rates related to BMI


Figure 3 shows that the relationship between death rate and BMI is what is called a “U-shaped” curve. This is quite common in health-related biological measurements, for example death rates and alcohol consumption. It means that extremes are not good and it is best to be in the middle, in this case normal weight or overweight. 

Being grossly overweight is bad, but normal weight people should not strive to be thin.

There is also a difference between the “healthy obese” and the “unhealthy obese”, as judged by cardiovascular parameters such as blood pressure and insulin resistance. Strict reliance on BMI does not seem to be sensible.

See also: 
New Scientist 2014; Volume 222, Number 2967, Page 44.


Thursday, 17 April 2014

White skin - Neanderthal inheritance



My younger son Daniel pointed out to me an error in my last post, the topic of which was a recent increase of the frequency of rickets in children in the UK. As would be predictable from previous experience, the incidence was higher in children of South Asian and Black ethnicity than in white children.

I mentioned in the Post that the dark skin of people living in tropical parts of the world is an evolutionary adaptation to the high intensity of the sun. Daniel pointed out that, strictly speaking, this is not correct, as Homo sapiens originated in Africa and the vast majority of the world population is dark-skinned, living in tropical or semi-tropical zones.

It is the white-skinned population of the world that is a small minority, differing from the majority, and it is this group that is an evolutionary variant, and adapted to living only in the northern temperate area of north-west Europe. What was the origin of this small group?

Neanderthal (Neander tal) is a valley close to Dusseldorf in Germany. It is indicated by a sign on the autobahn viaduct that crosses the valley. Skeletal remains of a humanoid species were first discovered in the valley in the 19th century and have since been discovered in other parts of Europe. The humanoid became known as Neanderthal Man, scientifically Homo sapiens neanderthalensis.

Recent genetic studies have shown that the white-skinned variant of Homo sapiens carries in its genome some DNA derived from the now extinct Neanderthal Man. In other words, about 60,000 years ago there was a small amount of interbreeding between the new Homo sapiens spreading out of Africa and the older Neanderthals living in Europe. Few genetic advantages were passed on to Homo sapiens, but one was a genetic factor for fair skin, an advantage when living in northern Europe. Another was a genetic factor for fair, straight hair.

Darwin introduced the idea of survival of the fittest, indicating that if a new gene gives an advantage, then those with that gene will thrive. In north-west Europe a pale skin gave an advantage in the form of enhanced vitamin D synthesis.

And so, thanks to the Neanderthal humanoid, white-skinned people are genetically adapted to living very distant from the equator. Dark skin is the global norm of pure Homo sapiens. The Neanderthals became extinct as Homo sapiens proved to be superior, but their genetic influence lives on in white-skinned people.

 Reference:


http://www.newscientist.com/article/mg22129542.600-neanderthalhuman-sex-bred-light-skins-and-infertility.html#.U0WpP16AQpE

Wednesday, 2 April 2014

Rickets - a recent increase in children

Rickets is a disease of children. Formation of bone is impaired so that the bones become soft. When the child starts to walk, the bones of the legs bend, giving rise to “bow-legs”. In severe cases the pelvis becomes contracted, and in a girl this will lead to subsequent difficulty in labour.

It is due to deficiency of vitamin D, which is essential for the process of “ossification”, the incorporation of calcium into bone. Rickets was very common in industrial cities in the 19th century. A hundred years ago it was recognised as being due to lack of sunlight, which acts on the skin to produce vitamin D. 

It was also discovered that vitamin D is present in fish oils, and this gave an opportunity to treat or prevent rickets despite the lack of sunlight penetrating the heavily polluted atmosphere. Cod liver oil became a standard medication for children in northern Europe, but it fell out of fashion in the latter half of the 20th century when clean air legislation and holidays in the sun became effective.

Rickets became a disease of history, but in the 1960s there were medical case reports of rickets in children of immigrants from South Asia into the northern parts of the UK. This became a more widespread problem as the result of increased immigration into the UK. In the USA rickets was identified in Black children.

Dark pigmented skin is an evolutionary adaptation to tropical and sub-tropical regions where sunlight intensity at ground level would cause severe burn in non-pigmented white skin. The pigmented skin is less efficient in the synthesis of vitamin D but this is not important when sun intensity is high.  It is of great importance when the individuals move to live in north-west Europe. The northern parts of the UK are further north than anywhere in China, and sun intensity is relatively low, even in the summer.

South Asian people are adapted culturally to living in a hot, sunny climate by dress that covers virtually all of the body. This is an important additional factor leading to vitamin D deficiency when resident in north-west Europe.

The trickle of case reports has now reached a significant number, and now includes white children as well. A study from Oxford has shown a significant increase in the incidence of rickets during the past 15 years. Incidence means the number of new cases each year. The numbers are small, increasing from one to three per 100,000 children aged less than 15 years. The study reports the incidence both within Oxford and within England as a whole.


Increase in rickets in recent years

The incidence is lower in Oxford than in England overall, probably the result of a smaller immigrant population. Ethnicity is important in the development of rickets, but in recent years it has also been occurring in white children.

Overall, an important part of the increase in the incidence of rickets is an increase in the proportion of the UK population from ethnic minorities, especially those of South Asian origin. During the past few years the birth rate of the indigenous population has decreased by 2%, whereas that of the ethnic minorities has increased by 19%.


The “proportion” shown is the proportion of children who have rickets

But the incidence of rickets has also increased in white children, and this requires an explanation. There is no suggestion of a decrease in the intensity of the sun, and so we must look for behavioural factors.




Saturday, 8 March 2014

Should there be population screening for coeliac disease ?


This Post follows two previous posts on the subject of coeliac disease:

http://www.drdavidgrimes.com/2013/12/is-there-really-epidemic-of-coeliac.html

http://www.drdavidgrimes.com/2014/02/how-common-is-coeliac-disease.html


Anaemia is one of the classical features of coeliac disease and so it follows logically that if someone is found to be anaemic, that person might have coeliac disease. It is suggested that 5% of people with iron deficiency anaemia have coeliac disease.

Anaemia, a low haemoglobin concentration in the blood, has many causes. In most cases there is an impairment of the production of red blood cells. This can be the result of specific disease of the bone marrow, where red blood cells are produced. Anaemia can also be due to the absence of an essential factor in the synthesis of haemoglobin, the oxygen-carrying complex within the red cells. The three essential factors are iron, folic acid and vitamin B12. Shortage of iron is usually due to blood loss, occasionally to dietary shortage of iron, and rarely to malabsorption of iron from food through the intestinal wall into the blood-stream. This is what happens in coeliac disease.

Causes of iron deficiency anaemia:
          Menstruation             20%
          Cancer of the colon      5–10%
          Cancer of the stomach    5%
          Coeliac disease          5%
          Poor diet                5%
          Pregnancy

Not all people diagnosed with coeliac disease develop anaemia. The coeliac disease blood test is commonly positive in people in whom the condition had been unrecognised, and so it does not follow that coeliac disease, when diagnosed, will necessarily be the cause of an anaemia.

It is simple to treat deficiency of iron or folic acid with a course of tablets. It then follows that an additional life-long gluten free diet might give no advantage.

An 86 year old man was found to have anaemia. His family doctor referred him for a gastroscopy, wondering quite rightly whether or not he had a bleeding lesion in the stomach. He did not, but the endoscopist rather enthusiastically took a biopsy from the duodenum, knowing that coeliac disease might cause anaemia. The biopsy showed partial villous atrophy, the typical appearance of coeliac disease, and the man was sent to my outpatient clinic for further evaluation. His anaemia had been treated and he felt as well as he could expect for his age. Should he be advised to take a gluten free diet? The answer was clearly “No”. It is important to treat a person and not just a biopsy specimen. There would be nothing to gain from him taking a gluten free diet and in this particular case there would be a great deal to lose. He lived alone and he could not cook – he had depended on his wife for sustenance until her death a few years earlier. His diet seemed to consist mainly of bread, either as toast or sandwiches. No wonder he had developed iron deficiency anaemia. A gluten free diet would have brought about his death from starvation.



Is there a place for screening?

I have noticed that paediatricians are keen on case-finding. It is known that although coeliac disease does not follow a Mendelian inheritance pattern, it is more common among family members. After diagnosing coeliac disease in a child, a paediatrician might tell the family doctor to arrange a screening blood test for other family members. I remember well the father of a child newly diagnosed with coeliac disease being sent to me with a positive coeliac blood test. He was a fit and healthy young man without anaemia and without symptoms. Was there any point in trying to coerce him into a life-long gluten free diet?

This was a long time ago but it made me stop and think about case-finding among the asymptomatic. It is very simple, just a blood test, much simpler than mammography or cervical cytology. It leads to “diagnosis” but to no more than that. There is certainly no national enthusiasm for screening for coeliac disease, but very enthusiastic individuals might emerge – they usually do and cannot be kept down.

Wheat intolerance 

Many people with intestinal disorders are classified as having irritable bowel syndrome. They often find that bread makes them worse and that avoidance of bread and other wheat products helps. Testing for coeliac disease will occasionally be positive, but it will be negative in the majority. This might come as a disappointment, as a diagnosis of coeliac disease is at least something definite. Furthermore, with this diagnosis comes an entitlement to certain gluten free food products on prescription.

What is happening in people with wheat intolerance but negative coeliac testing is that the intestinal tract cannot handle wheat (not just gluten), probably because of an imbalance of intestinal bacteria. They might take a gluten free diet, better called a wheat free diet, but they should not be labelled as having coeliac disease.

It is estimated that in the past only about ten per cent of people with coeliac disease were diagnosed. The simplicity of diagnosis since the introduction of the coeliac disease blood test makes it possible for many more to be diagnosed. These could be people with minimal “illness” such as mild anaemia, usually without symptoms but noted on routine testing. Population screening for coeliac disease would be based on people with no symptoms and no suspicion of illness. Positive testing would lead to the imposition of an unwelcome diet with no significant advantage. Although there will inevitably be enthusiasts for coeliac disease screening, the objective view is that although it would be very easy, it would be of no value.

Sunday, 9 February 2014

How common is coeliac disease?


This follows from the previous post “Is there really an epidemic of coeliac disease?”

The characteristic syndrome of coeliac disease is rare. It usually will develop in children when, following a milk-only diet, weaning to cereals takes place. The child will fail to gain weight, develop diarrhoea and be generally unwell. After coeliac disease is diagnosed the change to a gluten-free diet will be followed by a rapid improvement.

If coeliac disease is diagnosed in adult life it is assumed that it has been present since early childhood and has been clinically latent. It can be diagnosed at any age, even among the elderly. The presentation would be with anaemia, shortage of iron and folic acid, low weight and diarrhoea.

The new simple blood test allows the testing of people in whom there is no suspicion of coeliac disease, and this allows an estimate of its frequency in the general population. The result is quite remarkable and it now appears that about 90% of people with coeliac disease are undiagnosed. They are walking around happily and taking a normal diet.

But many of them might not be as well as they think. Some will have low-grade abdominal symptoms, often classified as an irritable bowel. If they are demonstrated as having coeliac disease, then a gluten-free diet might give them a sense of health better than they had previously experienced.

Testing for coeliac disease in adults should be considered in circumstances such as irritable bowel syndrome, symptoms being abdominal discomfort after eating, with bloating and diarrhoea.  The testing should be considered especially if there are other pointers such as low weight or weight loss, and iron deficiency anaemia. It can however be diagnosed in people who are overweight.

Irritable bowel symptoms are so common in society that many people accept them as just part of life, and indeed they are often just features of normal intestinal physiology. There comes a time when they interfere with life and this is when a visit to the doctor comes about. Hitherto undiagnosed coeliac disease might then be diagnosed by the simple blood test. 

What harm does it do?

It is vital that this is understood, otherwise justification for “treatment” will make no sense.

In the early 1970s, when I was taking Crosby capsule biopsies, case reports appeared in the medical journals of various gastro-intestinal cancers in adults who had been diagnosed with coeliac disease in childhood. This was thought to be a big problem, and very strict adherence to a gluten free diet was considered imperative to avoid cancer. We thus entered an era of medicine by fear, and this still exists: “You must keep to a very strict gluten free diet or you will develop cancer”.

Experience now tells us that these case reports were exceptional and that there is no excess of common gastro-intestinal cancers in coeliac disease. There is however an increase of a very rare form of intestinal lymphoma, but it remains rare. A large community based study of coeliac disease from Nottingham UK, as opposed to previous and biased hospital based reports, indicates a very minimal increase in deaths from intestinal lymphoma, and rather surprisingly a significant reduction of deaths from lung and breast cancers. So coeliac disease is far from being all bad news.

The risk of the rare lymphoma is now known to be far less than originally thought. In the past, without up-to-date information, we were basing risk on only the 10% of coeliac disease patients who had been diagnosed. When we consider all people now estimated to have coeliac disease, the risk of lymphoma immediately falls by a factor of ten, and this is a big factor.
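A small worked example may make the denominator effect clearer. The case numbers below are invented purely for illustration; only the assumption that roughly 10% of coeliac disease is diagnosed comes from the text above.

```python
# Illustrative only: the counts are hypothetical, chosen to show the
# denominator effect, not taken from any study.
lymphoma_cases = 5          # hypothetical lymphoma cases observed
diagnosed_patients = 1_000  # coeliac patients actually diagnosed
fraction_diagnosed = 0.10   # roughly 10% of true cases are diagnosed

all_patients = diagnosed_patients / fraction_diagnosed

risk_diagnosed_only = lymphoma_cases / diagnosed_patients  # risk based on diagnosed cases only
risk_all_coeliac = lymphoma_cases / all_patients           # risk spread over all true cases

print(f"Apparent risk (diagnosed only): {risk_diagnosed_only:.3%}")
print(f"Estimated risk (all coeliac disease): {risk_all_coeliac:.3%}")
print(f"Reduction factor: {risk_diagnosed_only / risk_all_coeliac:.0f}")  # -> 10
```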

The purpose of treatment therefore changes from cancer prevention and life prolongation to treatment of the illness, if indeed there is an illness. When coeliac disease has been diagnosed it is worth trying a gluten free diet. If it improves the abdominal symptoms then the person diagnosed will want to continue the diet. Some will find that they are extremely sensitive to gluten and that just a tiny amount will cause symptoms that might persist for several days.


On the other hand some people will notice no benefit. Although they have undoubtedly been diagnosed as having gluten enteropathy (disease of the intestine caused by gluten), strictly speaking do they really have coeliac disease if they are coming to no harm?

Thursday, 16 January 2014

Health statistics - can they be bad for your health ?

How health statistics can misinform - 

We hear and see many headlines concerning health issues, and most of them are both bad news and very dramatic.
They can be very misleading and they can exaggerate what they intend to show. The method of exaggeration almost always involves the use of percentages rather than what are usually called the raw data, the ordinary numbers.


Example 1:

About 800,000 people die each year in the UK.


Let us suppose that 8 of these people die after being stung by a bee or a wasp. This would be a rate of 1 in every 100,000 deaths.

Let us suppose that in the following year 16 people die from the same cause. This is a rate of 2 in every 100,000 deaths.

How is it best to describe this increase in death rate?

First, and most simply, there is an increase of 8 deaths per year in the UK.

But we need to generalise to allow comparisons between other countries, comparisons within different parts of the UK, and comparisons with other years.

And so we describe the increase as 1 per 100,000 per year, from 1 to 2 per 100,000 deaths in this example. We can also express this as an absolute increase of 0.001%.

These expressions of increase would not create headlines or induce panic among the population. In order to achieve this, the increase must be presented in a much more dramatic way.

Increasing from 8 to 16, or even from 1 to 2, is a doubling. This sounds good, quite impressive.

“Twice as many people died from bee and wasp stings this year compared to last year”. Now we are moving into headline news. It is starting to sound dangerous.

But is it twice a big number or twice a little number? This is important in translating risk or benefit to an individual or to a community. Twice something small, like a cent or a penny, is not worth worrying about. Twice something big like £100 or $100 is worth attention!

When a number doubles, the change can be expressed as a 100% increase. 100% of one is one, added to one gives two, a 100% increase.

Once again the question must be asked (but rarely is): 100% of what? Something big or something small?

In this example, which do you think is the more useful and meaningful way of presenting the increase:
a 100% increase or a 0.001% increase (1 in 100,000)?
You would probably agree with me that the latter is the more useful, giving a reasonable assessment of the risk.
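For anyone who wishes to check the arithmetic, here is a minimal sketch of the two ways of expressing the increase, using the figures of this example.

```python
# A minimal sketch of absolute versus relative increase, Example 1.
deaths_total = 800_000       # approximate deaths per year in the UK
sting_deaths_year1 = 8
sting_deaths_year2 = 16

rate1 = sting_deaths_year1 / deaths_total   # 1 in 100,000 deaths
rate2 = sting_deaths_year2 / deaths_total   # 2 in 100,000 deaths

absolute_increase = rate2 - rate1              # 0.00001, i.e. 0.001%
relative_increase = (rate2 - rate1) / rate1    # 1.0, i.e. 100%

print(f"Absolute increase: {absolute_increase:.3%}")   # 0.001%
print(f"Relative increase: {relative_increase:.0%}")   # 100%
```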

Example 2:

We are informed that if you are admitted to a UK hospital on a Saturday or Sunday, you will have a 16% increased risk of death during the following 30 days compared to if you had been admitted on a week-day.

Dramatic and worrying. Big headlines in the national press. Questions in Parliament. “Something must be done!” “Consultants must spend more time in the hospitals at the weekends.” 

Olfactarithmetic (a term from a recent New Scientist) - does the data “smell” right, or does it smell fishy? Is it really true (as the politicians and newspapers immediately assumed) that for every 100 deaths following hospital admission on Monday to Friday, there are 116 deaths for those admitted on Saturdays and Sundays?

It is indeed dramatic but it smells fishy. There appears to be a major mortality effect and it should be obvious. The important but missing information is: how many deaths are we talking about? What are the numbers?

The study that led to this figure of 16% was based on 14,217,640 (14.2 million) people admitted to hospital in the UK between April 1st 2009 and March 31st 2010. By 30 days after hospital admission there had been 187,337 in-patient deaths. The proportion dying in hospital was therefore:

187,337 / 14,217,640   =  1 /  75.9  =  0.013   =  1.3%

Now we can apply the 16%, because we now know the answer to “16% of what?”: it is 1.3%.

16% of 1.3% = 0.21%

So in reality the chance of dying in hospital within 30 days of admission is only 0.21 percentage points greater if admitted at a weekend than on a weekday.

The study also reports on 30-day mortality whether dying in hospital or following discharge home. There is now a total of 284,852 deaths. 

284,852 / 14,217,640   =  1 /  49.9 =  0.02   =  2% deaths of those admitted.

The excess for those admitted at the weekend would again be 16% higher, which means 0.32 percentage points greater than for weekday admissions.

Another way of looking at this is that a weekday admission gives a 98% chance of survival at 30 days, compared to 97.68% following a weekend admission. Not a dramatic difference.

This is clearly not going to make newspaper headlines or lead to a parliamentary debate.

But what is the best way of expressing this increased risk? Is it a 16% increase or a 0.32% increase? I suggest the latter, but then I have no interest in headlines. The absolute increase, 0.32%, is the more realistic and conveys the more useful message to the public.
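The arithmetic above can be reproduced in a few lines. This is only a sketch following the simplification used in this post, in which the 16% relative excess is applied to the overall 30-day death rate of about 2%.

```python
# A minimal sketch of the weekend-admission arithmetic, Example 2,
# using the figures quoted above from the Freemantle study.
admissions = 14_217_640
deaths_30day = 284_852          # all deaths within 30 days of admission
relative_excess = 0.16          # the headline "16% increased risk"

baseline_rate = deaths_30day / admissions           # about 0.02, i.e. 2%
absolute_excess = baseline_rate * relative_excess   # about 0.0032, i.e. 0.32 percentage points

print(f"Baseline 30-day death rate: {baseline_rate:.2%}")
print(f"Absolute excess for a weekend admission: {absolute_excess:.2%}")
print(f"Survival: weekday {1 - baseline_rate:.2%}, "
      f"weekend {1 - baseline_rate - absolute_excess:.2%}")
```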

(I thank Dr Steve McCabe from Portree, Skye, for drawing my attention to this) 

Data source:
Journal of the Royal Society of Medicine. 2012 February; 105: 74–84.
Weekend hospitalization and additional risk of death: An analysis of inpatient data.
N Freemantle, et al.


What is the benefit of statins?

We have seen this in an earlier post but it is another example of how statistics can, in this case, exaggerate benefit.

A clinical trial of statin therapy was conducted in the West of Scotland and published in 1995 (it was called WOSCOPS). It studied men aged between 55 and 65 with high cholesterol levels. They had the world’s highest incidence of death from coronary heart disease (CHD, heart attack). Half were given a statin every day for 5 years and the other half acted as controls, being given a dummy placebo tablet.

The five-year death rate in the controls (untreated) was 4.2%, and in the treated group it was 3.1%.
This looks like a death rate reduction of 1.1%  (ie 4.2-3.1)
This looks rather undramatic, but it means that about 100 at-risk men must take a statin tablet every day for 5 years to prevent (delay) one death. Not very good.

How can we amplify the effect and make it look better?

1.1 is approximately one quarter of 4.2, in other words about 25%.

So the advertising of statins tells us that they reduce death rate from CHD by 25%.

This is much more impressive and leads to big sales. But it still means that 100 high-risk men must take a statin for 5 years to delay one death.

There remains the question: 25% of what?

It was more than 20 years ago that the study was performed. Since then the death rate from coronary heart disease has gone down by a factor of 40, from 800 to 20 per 100,000 per year. Therefore it would now require about 4,000 men (40 × 100) to take a statin for five years to delay one death. This is however kept quiet; the information has been hidden.

We are still told of the 25% reduction of deaths, but not that it is now 25% of something only one fortieth of what it was 20 years ago. Is it worthwhile?
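The statin arithmetic can be set out in the same way. This is a sketch using the figures quoted above; the final adjustment simply assumes that the same relative reduction applies at today's much lower death rate, which is the argument made in this post rather than a result from the trial itself.

```python
# A minimal sketch of absolute vs relative risk reduction and the number
# needed to treat (NNT), using the WOSCOPS figures quoted above.
control_rate = 0.042    # 5-year death rate without statin (4.2%)
treated_rate = 0.031    # 5-year death rate with statin (3.1%)

arr = control_rate - treated_rate   # absolute risk reduction: 1.1 percentage points
rrr = arr / control_rate            # relative risk reduction: ~26%, marketed as "25%"
nnt = 1 / arr                       # ~91, i.e. roughly 100 men treated for 5 years per death delayed

print(f"Absolute risk reduction: {arr:.1%}")
print(f"Relative risk reduction: {rrr:.0%}")
print(f"Number needed to treat for 5 years: {nnt:.0f}")

# If the background death rate has since fallen by a factor of 40, and the
# same relative reduction is assumed still to apply, the absolute benefit
# shrinks by the same factor:
nnt_now = nnt * 40
print(f"NNT at today's much lower event rate: about {nnt_now:.0f}")
```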

Beware statistics when expressed as percentages. 
Remember to use olfactarithmetic - does it smell right?