Monday, 4 August 2014

Vitamin D is best given frequently in moderate doses

We normally synthesise vitamin D through the action of the sun on our skin during the summer, and the reserves built up during this time last us through the winter. This tells us that vitamin D can be stored within the body: because vitamin D is fat-soluble, it is stored in the liver and in body fat.

This has led to vitamin D supplements sometimes being given in a single large dose, a “bolus”. This can be an injection, and the dose of vitamin D as ergocalciferol is usually 300,000 or 600,000 units. It is given only once or twice each year, on the assumption that it will go straight into the body fat and then be released gradually over several months. It appears to be very convenient.

But the metabolically active environment of the human body is not quite so simple, and this is not the way in which nature works. The normal pattern is a gradual accumulation of vitamin D during the summer months. Experience now informs us that a vitamin D supplement given as a single large dose is not as effective as the same amount given in frequent smaller or moderate doses.

Vitamin D is synthesised by the action of ultra-violet light from the sun on 7–dehydrocholesterol (7–DHC) in the skin. 7–DHC is a four-ring structure, but the sun converts it into a three-ring structure: the vitamin D molecule, which has one hydroxyl group (–OH). It then enters the blood stream and the circulation takes it through the liver, where a second hydroxyl group is added. The molecule is then called calcidiol, indicating two hydroxyl groups, and it is also known as 25(OH)D because the second hydroxyl group is in the 25 position.

Vitamin D is a pre-hormone, meaning that it must be converted into an active hormone. This occurs when a third hydroxyl group is added, this time in the 1 position, forming calcitriol, 1,25(OH)2D. This process takes place predominantly in the kidneys, but also in some inflammatory cells.

Calcitriol is biologically highly active, and this activity must be carefully controlled. If the blood level of calcitriol becomes too high then the amount of calcium in the blood and urine will increase to a level that might cause problems. Therefore to maintain a precise balance the amount of calcitriol produced must be matched by the same amount being de-activated.

The de-activation of calcitriol is, like its synthesis, controlled by a genetically determined enzyme. This enzyme also adds a hydroxyl group to the molecule, now a fourth group and this time in the 24 position. The result of this hydroxylation is 1,24,25(OH)3D, which has no biological activity.
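For readers who like a compact summary, here is a minimal Python sketch that simply prints the hydroxylation cascade described above. It is an aide-memoire rather than a model; the step names follow the text, and the de-activating enzyme is discussed below.

```python
# The vitamin D activation and de-activation cascade, as described above.
# Each entry is (substrate, site and step, product).
PATHWAY = [
    ("7-dehydrocholesterol (7-DHC)", "skin: ultra-violet light from the sun",
     "vitamin D (cholecalciferol), one -OH group"),
    ("vitamin D (cholecalciferol)", "liver: 25-hydroxylation",
     "calcidiol, 25(OH)D"),
    ("calcidiol, 25(OH)D", "kidneys: 1-hydroxylation",
     "calcitriol, 1,25(OH)2D - the active hormone"),
    ("calcitriol, 1,25(OH)2D", "24-hydroxylation (de-activation)",
     "1,24,25(OH)3D - no biological activity"),
]

for substrate, step, product in PATHWAY:
    print(f"{substrate}\n   --[{step}]-->\n{product}\n")
```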



The enzyme concerned appears to be switched on by the earlier production of calcidiol. This enzyme, the 24-hydroxylase controlled by the gene CYP24A1, is responsible for the –24 de-activation of both calcitriol and surplus calcidiol. A large input of vitamin D is rapidly converted into calcidiol, and the body’s metabolism will anticipate correspondingly high levels of calcitriol. The –24 hydroxylation de-activation of calcitriol is switched on and the rate of de-activation of calcitriol is increased. The enzyme also directly de-activates surplus calcidiol.

All is well when vitamin D synthesis develops gradually during the summer months. But when we interfere with the process of nature things can go wrong. 

The synthesis of vitamin D in the adult is in the range of 2,000 to 3,000 units per day, depending on location of residence and lifestyle. If vitamin D is synthesised for about six months of the year, this amounts to between about 300,000 and 600,000 units per year.
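The arithmetic is easy to check. A minimal sketch, taking six months as roughly 180 days of synthesis:

```python
# Annual vitamin D synthesis from the daily figures quoted above.
daily_low, daily_high = 2_000, 3_000   # units synthesised per day
days = 180                             # about six months of the year

print(f"low:  {daily_low * days:,} units per year")   # 360,000
print(f"high: {daily_high * days:,} units per year")  # 540,000
# Rounding outwards gives the quoted range of 300,000 to 600,000 units.
```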

The body does not expect to receive six months’ or a year’s vitamin D in a single dose. When an injection of 300,000 or 600,000 units is given, the metabolic assumption will be that this is the start of a regular large input of vitamin D, and so it must be matched by a similar level of inactivation.

Initially this works out well. The high rate of conversion of vitamin D to calcidiol and calcitriol is matched by an increase in the –24 hydroxylation process, leading to inactivation of calcitriol. However, when the rapid peak of vitamin D and calcidiol/calcitriol has passed, the enzyme CYP24A1 remains at a high level of activity. The absence of a large amount of vitamin D does not matter at this stage, but the high level of de-activation of calcitriol persists. The result is a major reduction in the blood levels of calcidiol and calcitriol until the next large injection is given.
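The dynamics can be illustrated with a toy simulation. This is emphatically not Dr Vieth's own calculation: the rate constants below are invented purely for illustration. But it shows the qualitative point, namely that the degrading enzyme induced by a bolus outlives the bolus itself, so that blood calcidiol eventually falls below what steady daily dosing achieves.

```python
# Toy model only: all rate constants are invented for illustration.
# It compares a single 600,000 unit bolus with 2,000 units per day.

DAYS = 365
MAKE = 0.03        # fraction of the vitamin D pool converted to calcidiol daily
ENZ_DECAY = 0.99   # assumed slow decay of the induced 24-hydroxylase
INDUCTION = 2e-6   # assumed enzyme induction per unit of calcidiol made

def run(dose_on_day):
    d_pool, calcidiol, enzyme = 0.0, 0.0, 0.01
    history = []
    for day in range(DAYS):
        d_pool += dose_on_day(day)
        made = MAKE * d_pool                    # vitamin D -> calcidiol
        d_pool -= made
        enzyme = max(0.01, ENZ_DECAY * enzyme + INDUCTION * made)
        calcidiol += made - enzyme * calcidiol  # production minus degradation
        history.append(calcidiol)
    return history

bolus = run(lambda day: 600_000 if day == 0 else 0)
daily = run(lambda day: 2_000)
print(f"calcidiol pool at day 300: bolus {bolus[300]:,.0f}, daily {daily[300]:,.0f}")
# The bolus curve peaks early, then falls well below the daily curve,
# because the induced enzyme remains active after the large dose has gone.
```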

The message is that vitamin D should not be given in large bolus doses as this will result in a low blood level of vitamin D. It should be given gradually as in nature. Daily or weekly doses of vitamin D are fine, but single doses at intervals of a month or more are to be avoided.

This information has been presented by Dr Reinhold Vieth of the University of Toronto, Canada.



Monday, 23 June 2014

There is more to the sun than vitamin D

A conference on the subject of Vitamin D took place in London on May 23-25 2014. It was a big conference with presenters and delegates from all over the world.

Several presentations were of particular importance.

Observational studies have consistently shown that people with high blood levels of vitamin D have many health advantages. A clear example of this from the UK is shown in Figure 1.



We can see in the first group that the “all-causes” death rate is almost twice as high in those with the lowest blood levels of vitamin D as in those with the highest levels. Death rates from cardio-vascular disease, cancers and infectious diseases are also much higher in those with the lowest levels of vitamin D.

If we are interested in good health and long life, we would want to be in the groups with the highest blood levels of vitamin D, greater than 30 ng/ml (75 nmol/L). It would appear that the sensible way forward would be to give vitamin D supplements by mouth so as to improve blood levels. This has been done on several occasions, but the results have been disappointing. This could be for a number of reasons.

First, the dose might have been inadequate to achieve a satisfactory blood level, but this has been checked in recent studies. One piece of information that came out of the conference is that there does not appear to be an advantage in achieving blood levels of vitamin D above about 40 ng/ml (100 nmol/L; the conversion between these units is sketched below).

Second it might be necessary to start with a high blood level of vitamin D early in life, but it would take many decades for this to be investigated further.

Third, it might be necessary to give the supplement for longer than has been the case in the trials. Time will tell as more publications appear.

The fourth explanation, and the most interesting, is that the blood level of vitamin D is a measure of the exposure of an individual to the sun. This suggests that vitamin D, being an index of exposure to the sun, might not be a substitute for the sun itself, which is of fundamental importance to human health (and to that of all animals).
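Incidentally, the two units used above for blood levels of vitamin D, ng/ml and nmol/L, are easily interconverted. A minimal Python sketch; the factor 2.496 follows from the molar mass of 25(OH)D3, about 400.6 g/mol:

```python
# Converting blood levels of calcidiol, 25(OH)D, between common units.
NG_TO_NMOL = 2.496  # 1 ng/ml = 2.496 nmol/L for 25(OH)D3

for ng_per_ml in (30, 40):
    print(f"{ng_per_ml} ng/ml = {ng_per_ml * NG_TO_NMOL:.0f} nmol/L")
# 30 ng/ml = 75 nmol/L; 40 ng/ml = 100 nmol/L
```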



The action of the sun on the skin converts 7-dehydrocholesterol (7-DHC) into vitamin D (cholecalciferol), which is then activated in the liver to 25-hydroxy vitamin D (calcidiol).

A study from Edinburgh described in an earlier Post (Nitric Oxide) showed that exposure to ultraviolet light reduced the blood pressure of the subjects tested. However there was no change in the blood level of vitamin D, indicating that the beneficial effect of ultraviolet light on the blood pressure must have been mediated by something other than vitamin D. The sun acting on the skin releases nitric oxide from nitrates in the skin, and it is this that has a beneficial effect on the circulation.



If people are deficient in vitamin D then a supplement by mouth will be of some benefit. It will heal the bone disease osteomalacia and there might be other benefits. However it would appear that vitamin D taken by mouth does not have the full range of benefits of the sun.

It is of course possible that the beneficial effects of the sun on the skin include not only vitamin D and nitric oxide but also other actions that have not yet been recognised.





Wednesday, 7 May 2014

Is Obesity Bad for You ?


The generally held view is that we are witnessing an epidemic of obesity which will lead to countless deaths from diabetes and heart disease. This is one of many health messages that we hear, all delivering dire warnings of impending doom. Slimness is all-important. The overweight are made to feel guilty. It is claimed that the obese are, or shortly will be, the majority of the population.

Obesity is usually measured by the BMI (Body Mass Index), which is body weight in kilograms divided by the square of the height in metres. A tall person could have the same weight as a short person but would have a smaller BMI, and would need to have a greater weight before being labelled as overweight or obese.

A person of height 1.8m is said to be underweight if less than 60kg, normal if 60-80kg, overweight if 80-100kg, and obese if greater than 100kg. The weakness of the BMI is that it reflects not just the fat content of the body but also its protein (muscle) content, and so it is not a reliable indicator of obesity. The definitions of normal and obese have also been redefined so as to make more people “obese”.
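For anyone who wants to check the weight bands quoted above, here is a minimal Python sketch using the standard BMI cut-offs of 18.5, 25 and 30. For a height of 1.8m the exact thresholds come to about 60, 81 and 97 kg, which the text rounds to 60, 80 and 100 kg:

```python
# BMI = weight (kg) divided by the square of height (m),
# categorised with the standard cut-offs of 18.5, 25 and 30.

def bmi_category(weight_kg: float, height_m: float) -> str:
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi < 25:
        return "normal"
    if bmi < 30:
        return "overweight"
    return "obese"

for weight in (55, 70, 90, 105):
    print(f"{weight} kg at 1.8 m: {bmi_category(weight, 1.8)}")
```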

Figure 1


The forecast epidemic of obesity is not happening and the proportion of the population said to be obese has not changed during the past ten years. It is about 35 per cent at present. However the prevalence is not distributed evenly across society. 

Obesity in the UK is much more common in the poor, the socio-economically disadvantaged, than in the wealthy, and this is especially true in women. The reason is not known, but obesity is somehow linked to other diseases. It is not clear whether the link is behavioural, the result of faulty eating or lack of exercise. It could be biological, perhaps linked to the bacteria living within the intestine, which are associated with and possibly causative of obesity. It could also be something to do with low levels of vitamin D, reflecting lower exposure to the sun.

Figure 2

We can see in this UK study from 1976 (Department of Health and the Medical Research Council) that in all groups obesity was more common in the “lower social groups”, what we would today call, rather cumbersomely, the “socio-economically disadvantaged”.

It is said that the obese will die from heart disease and heart failure. However this has not been the experience during the past twenty years. The survival of people with heart failure is surprisingly better in the obese than in the thin. This has been demonstrated in Sweden, in people following a heart attack. In the USA it has been shown that the long-term survival of normal-weight people with type 2 diabetes was only half of that of overweight or obese people with type 2 diabetes. 

It is the very underweight who have the worst survival but this appears only in older people. Young people have such a low death rate that differences will not be apparent. 

Figure 3  Relative death rates related to BMI


Figure 3 shows that the relationship between death rate and BMI is what is called a “U-shaped” curve. This is quite common in health-related biological measurements, for example death rates and alcohol consumption. It means that extremes are not good and it is best to be in the middle, in this case normal weight or overweight. 

Being grossly overweight is bad, but normal weight people should not strive to be thin.

There is also a difference between the “healthy obese” and the “unhealthy obese”, as judged by cardiovascular parameters such as blood pressure and insulin resistance. Strict adherence to BMI does not seem to be sensible.

See also: 
New Scientist 2014; Volume 222, Number 2967, Page 44.


Thursday, 17 April 2014

White skin - Neanderthal inheritance



My younger son Daniel pointed out to me an error in my last post, the topic of which was a recent increase of the frequency of rickets in children in the UK. As would be predictable from previous experience, the incidence was higher in children of South Asian and Black ethnicity than in white children.

I mentioned in the Post that the dark skin of people living in tropical parts of the world is an evolutionary adaptation to the high intensity of the sun. Daniel pointed out that strictly speaking this is not correct, as Homo sapiens originated in Africa and the vast majority of the world population is dark-skinned, living in tropical or semi-tropical zones.

It is the white-skinned population of the world that is the small minority, differing from the majority; it is this group that is the evolutionary variant, adapted to living in the northern temperate area of north-west Europe. What was the origin of this small group?

Neanderthal (Neander tal) is a valley close to Dusseldorf in Germany. It is indicated by a sign on the autobahn viaduct that crosses the valley. Skeletal remains of a humanoid species were first discovered in the valley in the 19th century and have since been discovered in other parts of Europe. The humanoid became known as Neanderthal Man, scientifically Homo sapiens neanderthalensis.

Recent genetic studies have shown that the white-skinned variant of Homo sapiens carries in its genome some DNA derived from the now extinct Neanderthal Man. In other words, at some time in the distant past, about 60,000 years ago, there was a small amount of interbreeding between the new Homo sapiens spreading out of Africa and the older Neanderthals living in Europe. Few genetic advantages were passed on to Homo sapiens, but one was a genetic factor for fair skin, an advantage when living in northern Europe. Another was a genetic factor for fair, straight hair.

Darwin introduced the idea of survival of the fittest, indicating that if a new gene gives an advantage, then those with that gene will thrive. In north-west Europe a pale skin gave an advantage in the form of enhanced vitamin D synthesis.

And so, thanks to the Neanderthal humanoid, white-skinned people are genetically adapted to living very distant from the equator. Dark skin is the global norm of pure Homo sapiens. The Neanderthals became extinct as Homo sapiens proved to be superior, but their genetic influence lives on in white-skinned people.

Reference:


http://www.newscientist.com/article/mg22129542.600-neanderthalhuman-sex-bred-light-skins-and-infertility.html#.U0WpP16AQpE

Wednesday, 2 April 2014

Rickets - a recent increase in children

Rickets is a disease of children. The formation of bone is impaired so that the bones become soft. When the child starts to walk the bones of the legs bend, giving rise to “bow-legs”. In severe cases the pelvis becomes contracted, and in a girl this will lead to difficulty in labour later in life.

It is due to deficiency of vitamin D, which is essential for the process of “ossification”, the incorporation of calcium into bone. Rickets was very common in industrial cities in the 19th century. A hundred years ago it was recognised as being due to lack of sunlight, which acts on the skin to produce vitamin D. 

It was also discovered that vitamin D is present in fish oils, and this gave an opportunity to treat or prevent rickets despite the lack of sunlight penetrating the heavily polluted atmosphere. Cod liver oil became a standard medication for children in northern Europe, but it fell out of fashion in the latter half of the 20th century, when clean air legislation and holidays in the sun became effective.

Rickets became a disease of history, but in the 1960s there were medical case reports of rickets in children of immigrants from South Asia into the northern parts of the UK. This became a more widespread problem as the result of increased immigration into the UK. In the USA rickets was identified in Black children.

Dark pigmented skin is an evolutionary adaptation to tropical and sub-tropical regions, where the sunlight intensity at ground level would cause severe burning of non-pigmented white skin. Pigmented skin is less efficient in the synthesis of vitamin D, but this is not important when sun intensity is high. It becomes of great importance when individuals move to live in north-west Europe. The northern parts of the UK are further north than anywhere in China, and sun intensity is relatively low, even in the summer.

South Asian people are adapted culturally to living in a hot, sunny climate by dress that covers virtually all of the body. This is an important additional factor leading to vitamin D deficiency when resident in north-west Europe.

The trickle of case reports has now grown to a significant number, and it now includes white children. A study from Oxford has shown a significant increase in the incidence of rickets during the past 15 years. Incidence means the number of new cases each year. The numbers are small, increasing from one to three per 100,000 children aged less than 15 years. The study covers the incidence both within Oxford and within England as a whole.
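To put the incidence figures into absolute numbers, here is a rough sketch, assuming for illustration about 10 million children aged under 15 in England (a round figure; the study itself gives the exact denominators):

```python
# What "one to three per 100,000 per year" means in absolute numbers.
children = 10_000_000  # illustrative round figure for England

for per_100k in (1, 3):
    cases = per_100k / 100_000 * children
    print(f"{per_100k} per 100,000 -> about {cases:.0f} new cases a year")
```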


Increase in rickets in recent years

The incidence is lower in Oxford than in England overall, probably the result of a smaller immigrant population. Ethnicity is important in the development of rickets, but it has been occurring in white children in recent years.

Overall, an important part of the increase in the incidence of rickets is an increase in the proportion of the UK population from the ethnic minorities, especially the South Asian. During the past few years the birth rate of the indigenous population has decreased by 2%, whereas that of the ethnic minorities has increased by 19%.


Proportion here means the proportion of children who have rickets

But the incidence of rickets has also increased in white children, and this requires an explanation. There is no suggestion of a decrease in the intensity of the sun, and so we must look for behavioural factors.




Saturday, 8 March 2014

Should there be population screening for coeliac disease ?


This Post follows two previous posts on the subject of Coeliac Disease

http://www.drdavidgrimes.com/2013/12/is-there-really-epidemic-of-coeliac.html

http://www.drdavidgrimes.com/2014/02/how-common-is-coeliac-disease.html


Anaemia is one of the classical features of coeliac disease and so it follows logically that if someone is found to be anaemic, that person might have coeliac disease. It is suggested that 5% of people with iron deficiency anaemia have coeliac disease.

Anaemia, a low haemoglobin concentration in the blood, has many causes. In most cases there is an impairment of the production of red blood cells. This can be the result of specific disease of the bone marrow, where red blood cells are produced. Anaemia can also be due to the absence of an essential factor in the synthesis of haemoglobin, the oxygen-carrying complex within the red cells. The three essential factors are iron, folic acid and vitamin B12. Shortage of iron is usually due to blood loss, occasionally to dietary shortage of iron, and rarely to malabsorption of iron from food through the intestinal wall into the blood-stream. This last is what happens in coeliac disease.

Causes of iron deficiency anaemia:
    Menstruation            20%
    Cancer of the colon     5–10%
    Cancer of the stomach   5%
    Coeliac disease         5%
    Poor diet               5%
    Pregnancy

Not all people diagnosed with coeliac disease develop anaemia. The coeliac blood test is commonly positive in people in whom the condition had been unrecognised, and so it does not follow that coeliac disease, when diagnosed, is necessarily the cause of the anaemia.

It is simple to treat deficiency of iron or folic acid with a course of tablets. It then follows that an additional life-long gluten free diet might give no advantage.

An 86-year-old man was found to have anaemia. His family doctor referred him for a gastroscopy, wondering quite rightly whether he had a bleeding lesion in the stomach. He did not, but the endoscopist rather enthusiastically took a biopsy from the duodenum, knowing that coeliac disease might cause anaemia. The biopsy showed partial villous atrophy, the typical appearance of coeliac disease, and the man was sent to my outpatient clinic for further evaluation. His anaemia had been treated and he felt as well as he could expect for his age. Should he be advised to take a gluten free diet? The answer was clearly “No”. It is important to treat a person and not just a biopsy specimen. There would be nothing to gain from him taking a gluten free diet, and in this particular case there would be a great deal to lose. He lived alone and he could not cook – he had depended on his wife for sustenance until her death a few years earlier. His diet seemed to consist largely of bread, either as toast or as sandwiches. No wonder he had developed iron deficiency anaemia. A gluten free diet would have brought about his death from starvation.



Is there a place for screening ?

I have noticed that paediatricians are keen on case-finding. It is known that although coeliac disease does not follow a Mendelian inheritance pattern, it is more common among family members. After diagnosing coeliac disease in a child, a paediatrician might ask the family doctor to arrange a screening blood test for other family members. I remember well the father of a child newly diagnosed with coeliac disease being sent to me with a positive coeliac blood test. He was a fit and healthy young man without anaemia and without symptoms. Was there any point in trying to coerce him into a life-long gluten free diet?

This was a long time ago but it made me stop and think about case-finding among the asymptomatic. It is very simple, just a blood test – much simpler than mammography or cervical cytology. It leads to “diagnosis” but to no more than that. There is certainly no national enthusiasm for screening for coeliac disease, but very enthusiastic individuals might emerge – they usually do, and cannot be kept down.

Wheat intolerance 

Many people with intestinal disorders are classified as having the irritable bowel syndrome. They often find that bread makes them worse and that avoidance of bread and other wheat products helps. Testing for coeliac disease will occasionally be positive, but it will be negative in the majority. This might come as a disappointment, as the diagnosis of coeliac disease is something very definite. Furthermore, with this diagnosis comes an entitlement to certain gluten free food products on prescription.

What is happening in people with wheat intolerance but negative coeliac testing is that the intestinal tract cannot handle wheat (not just gluten), probably because of an imbalance of the intestinal bacteria. They may take a gluten free diet, better called a wheat free diet, but they should not be labelled as having coeliac disease.

It is estimated that in the past only about ten per cent of people with coeliac disease were diagnosed. The simplicity of diagnosis since the introduction of the coeliac blood test makes it possible for many more to be diagnosed. These could be people with minimal “illness”, such as mild anaemia, usually without symptoms but noted on routine testing. Population screening for coeliac disease would mean testing people with no symptoms and no suspicion of illness. A positive test would lead to the imposition of an unwelcome diet with no significant advantage. Although there will inevitably be enthusiasts for coeliac disease screening, the objective view is that although it would be very easy, it would be of no value.

Sunday, 9 February 2014

How common is coeliac disease?


This follows from the previous post “Is there really an epidemic of coeliac disease?”

The characteristic syndrome of coeliac disease is rare. It usually develops in children when weaning from a milk-only diet to cereals takes place. The child will fail to gain weight, develop diarrhoea and be generally unwell. Once coeliac disease is diagnosed, the change to a gluten-free diet will be followed by a rapid improvement.

If coeliac disease is diagnosed in adult life it is assumed that it has been present since early childhood and has been clinically latent. It can be diagnosed at any age, even among the elderly. The presentation would be with anaemia, shortage of iron and folic acid, low weight and diarrhoea.

The new simple blood test allows the testing of people in whom there is no suspicion of coeliac disease, and this gives an estimate of its frequency in the general population. The result is quite remarkable: it now appears that about 90% of people with coeliac disease are undiagnosed. They are walking around happily and eating a normal diet.

But many of them might not be as well as they think. Some will have low-grade abdominal symptoms, often classified as an irritable bowel. If they are demonstrated as having coeliac disease, then a gluten-free diet might give them a sense of health better than they had previously experienced.

Testing for coeliac disease in adults should be considered in circumstances such as the irritable bowel syndrome, the symptoms being abdominal discomfort after eating, with bloating and diarrhoea. Testing should be considered especially if there are other pointers such as low weight or weight loss, and iron deficiency anaemia. Coeliac disease can however be diagnosed in people who are overweight.

Irritable bowel symptoms are so common in society that many people accept them as just part of life, and indeed they are often just features of normal intestinal physiology. There comes a time when they interfere with life and this is when a visit to the doctor comes about. Hitherto undiagnosed coeliac disease might then be diagnosed by the simple blood test. 

What harm does it do?

It is vital that this is understood, otherwise justification for “treatment” will make no sense.

In the early 1970s, when I was taking Crosby capsule biopsies, case reports appeared in the medical journals of various gastro-intestinal cancers in adults who had been diagnosed with coeliac disease in childhood. This was thought to be a big problem, and very strict adherence to a gluten free diet was considered imperative to avoid cancer. We thus entered an era of medicine by fear, and this still exists: “You must keep to a very strict gluten free diet or you will develop cancer”.

Experience now tells us that these case reports were exceptional and that there is no excess of common gastro-intestinal cancers in coeliac disease. There is however an increase of a very rare form of intestinal lymphoma, but it remains rare. A large community based study of coeliac disease from Nottingham UK, as opposed to previous and biased hospital based reports, indicates a very minimal increase in deaths from intestinal lymphoma, and rather surprisingly a significant reduction of deaths from lung and breast cancers. So coeliac disease is far from being all bad news.

The risk of this rare lymphoma has become far less than originally thought. In the past, without up-to-date information, we based the risk on only the 10% of coeliac disease patients who had been diagnosed. When we consider all people now assessed as having coeliac disease, the risk of lymphoma immediately falls by a factor of ten, and this is a big factor.
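The denominator effect is simple arithmetic. A minimal sketch, with invented illustrative numbers:

```python
# If lymphoma cases were counted only among the ~10% of coeliac patients
# who had been diagnosed, spreading the same number of cases over all
# patients cuts the apparent risk tenfold. The numbers below are invented.
cases = 5                    # observed lymphoma cases
diagnosed = 1_000            # patients known at the time
true_total = diagnosed * 10  # including the ~90% undiagnosed

print(f"apparent risk: {cases / diagnosed:.3%}")   # 0.500%
print(f"true risk:     {cases / true_total:.3%}")  # 0.050%
```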

The purpose of treatment therefore changes from cancer prevention and life prolongation to treatment of the illness, if indeed there is an illness. When coeliac disease has been diagnosed it is worth trying a gluten free diet. If it improves the abdominal symptoms then the person diagnosed will want to continue the diet. Some will find that they are extremely sensitive to gluten and that just a tiny amount will cause symptoms that might persist for several days.


On the other hand, some people will notice no benefit. Although they have undoubtedly been diagnosed as having gluten enteropathy (disease of the intestine caused by gluten), strictly speaking do they really have coeliac disease if they are coming to no harm?