Tuesday 27 August 2013

We are only 10% human

Looking at some of our fellow citizens you might not be surprised that they are only 10% human, but it actually applies to all of us.

Each of us is composed of about 100 trillion cells, but only 10 trillion of these are human cells. About 90 trillion cells are micro-organisms, much smaller than animal cells. They sit happily on us or, mainly, inside us, the majority in the intestinal tract. The complexity is beyond current understanding and very few of these organisms can be identified; only a tiny minority can be grown outside the body. The huge numbers might be appreciated when we realise that dead bacteria make up about 95% of our faeces. There is very little food residue, as our digestive system is extremely efficient and micro-organisms digest virtually all of the small amount that is left over.

The vast number of micro-organisms can be identified by their DNA. Each human cell has about 25,000 genes, 10,000 of which are shared by a banana! There are not enough genes to explain all our inheritance and functions, so either more genes will be identified in the future or some genes will be found to have multiple functions. The micro-organisms of the gastro-intestinal tract of an individual have in total up to 3.3 million genes. It is likely that this genetic presence is of considerable importance to us and it certainly cannot be ignored. It is interesting to note that in a number of animals, and in particular in easily-studied insects, microbial parasites can influence behaviour to the advantage of the parasite. It has been suggested that in human beings the later stages of syphilis cause a behaviour change to a higher level of less discriminating sexual activity, so as to enhance the spread of the causative parasite, Treponema pallidum.

Our microbiota might have interesting effects on body function in ways that are not yet fully appreciated. It becomes part of our inheritance. This is a very active research area.

We inherit a great deal from our parents, but not all of it is genetic. Think of money, social background, place of residence and so on. But we also inherit micro-organisms, and most of these come from our mother. It has been assumed that the foetus, safely within its amniotic sac, is sterile, which of course means 100% human. However, recent research suggests that there might be transfer of micro-organisms across the placenta. This is not altogether surprising, as we know that certain micro-organisms can seriously damage the foetus. Historically the most important has been syphilis, more recently rubella, and very rarely toxoplasma and listeria.

But the foetus is generally well protected. It is during the process of birth that the baby comes into contact with its mother's micro-organisms. An outstanding example of this is Hepatitis B virus, which is transmitted during birth following mixing of blood, but these days the baby can be given immunological protection. This has been a particular problem in China where a significant proportion of the population carries the hepatitis B virus.

The vast majority of the micro-organisms that the baby will acquire from its mother during the next few months or years do no harm at all, and they form an important part of the body. They will do no harm as long as the body can defend itself against them, that is, keep them in their place. Part of this involves physical barriers. These are metabolically active and depend on blood flow and associated metabolism, the vitality of the body. The skin is a strong barrier but it is easily damaged. The intestine has a much weaker barrier, but defence of the gastro-intestinal tract is helped by the acidity of the stomach. The vagina also produces defensive acid. When we die the barriers break down and the organisms take over.

The immune system is designed to protect us from micro-organisms that might have passed the physical barriers. It is vitally important, but there can be failures. Rarely there might be a genetic defect of immune activity. HIV is an acquired form of immunological failure. Much more common is vitamin D deficiency. A bacterium such as the one that causes tuberculosis might lie dormant on or in the body, but failure of immunity due to shortage of vitamin D might lead it to become active and cause disease.

Although this vast number of micro-organisms can live happily on us and in us, they must be kept in place. At times of weakness they can cause disease and this is most likely to happen in the newborn and in the very elderly, in both groups defence mechanisms being sub-optimal. For example the ability to cough is of great importance in keeping the lungs clear of inhaled infection. If the ability to cough fails then pneumonia is likely to occur and this is seen very commonly in the very elderly and people weakened by other illnesses.

The micro-organisms are always poised to take over.

Wednesday 21 August 2013

Vitamin D testing

Blood testing for vitamin D was until recently an uncommon request, undertaken in the setting of a research laboratory. In the past few years there has been a greatly increased awareness of the importance of vitamin D, and this has been associated with a great increase in blood testing.

We can see here experience from the USA, tests funded by Medicaid. We are looking not at all blood tests but at those concerned with immunology and nutrition. In the year 2000 vitamin D testing was at the bottom of the list, but since then the increase in the number of tests has been enormous.

The question is whether this is just a new fad or whether it is of clinical importance. Low blood levels of vitamin D have been linked to a variety of diseases including coronary heart disease, stroke, diabetes, Crohn's disease, rheumatoid arthritis, breast cancer, colon cancer, prostate cancer, multiple sclerosis, and above all early death. It would therefore seem sensible to identify and correct low blood levels of vitamin D, even though the long-term benefits are not yet clear.

The extent of vitamin D deficiency is far greater than anticipated. It is more common in northern Europe than in southern Europe, similarly in France and the UK. Locations with the lowest levels have the worst health. The majority of people appear to have low levels of vitamin D, especially Asian and Black ethnic minorities in the UK and the USA.

It is necessary to note that unfortunately there are two units of measurement in use: nanogrammes per ml and nanomols per litre. When you receive your result it is essential to know which of the two measurement units is being used. The ideal levels are shown here, but very few people achieve these levels naturally.
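Since confusion between the two units is so easy, the conversion is worth spelling out. The factor of roughly 2.5 comes from the molar mass of 25-hydroxyvitamin D (about 400.6 g/mol), the form usually measured; a minimal sketch:

```python
# Conversion between the two vitamin D reporting units mentioned above.
# 1 ng/ml of 25-hydroxyvitamin D corresponds to 2.496 nmol/L (often
# rounded to 2.5); the factor derives from its molar mass (~400.6 g/mol).

NMOL_PER_NG_ML = 2.496

def ng_ml_to_nmol_l(ng_ml: float) -> float:
    return ng_ml * NMOL_PER_NG_ML

def nmol_l_to_ng_ml(nmol_l: float) -> float:
    return nmol_l / NMOL_PER_NG_ML

# A result of 30 ng/ml and a result of 75 nmol/L are essentially the
# same level, despite the very different-looking numbers.
print(round(ng_ml_to_nmol_l(30)))  # → 75
```

So a laboratory report must always be read together with its unit; the same blood level looks two and a half times larger when reported in nmol/L.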

Most vitamin D comes from the action of the sun on the skin, and few people achieve adequate sun exposure as judged by vitamin D levels in the blood. Air conditioning in hot countries makes this worse on an international scale as industrialisation expands, with people spending more time indoors. It is necessary to eat a great deal of fish each day to achieve a good vitamin D level by diet. Sun-bed use leads to good levels of vitamin D, and sun-beds used without sunburn are not dangerous to adults.

If vitamin D deficiency is so common it could be argued that we should all take a supplement without blood testing. This is safe with a supplement of 2000 units per day. "Overdose" of vitamin D is extremely rare and does not occur if exposure to the sun is good, as the sun acting on the skin appears to de-activate circulating vitamin D. Overdose can occur if calcium is given with vitamin D, and calcium supplementation is not necessary.

It is reasonable to recheck the blood vitamin D level after about 3 months of taking a supplement, although experience now tells us that this is probably not necessary, as the response is predictable, with stabilisation within the ideal range.

We can see that with a supplement, in this case 20,000 units each week, the blood level does not keep increasing. It rises into the ideal range and then stabilises. Vitamin D is consumed by the body: it is utilised and then inactivated. The supplement dose that achieves a steady state is evidently the amount that the body requires each day.
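The plateau described here is exactly what a simple first-order model predicts: if a fixed amount is added each day and a fixed fraction is cleared each day, the level settles where intake and elimination balance. A toy simulation (the numbers are purely illustrative, not clinical pharmacokinetic parameters):

```python
# Illustrative model of blood vitamin D with a constant daily intake and
# first-order (proportional) elimination. The numbers are invented for
# demonstration; they are not measured clinical values.

def simulate(daily_rise=1.0, clearance=0.02, days=365):
    """Return daily blood levels (arbitrary units, starting from zero)."""
    level = 0.0
    levels = []
    for _ in range(days):
        level = level * (1 - clearance) + daily_rise
        levels.append(level)
    return levels

levels = simulate()
# The level climbs at first, then flattens out near daily_rise / clearance
# (here 1.0 / 0.02 = 50): intake and elimination balance, so the blood
# level stabilises rather than rising indefinitely.
print(round(levels[89], 1), round(levels[-1], 1))  # → 41.9 50.0
```

This is why the three-month check is informative: by then the level is most of the way to its plateau, and the plateau itself does not move with further time on the same dose.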

We can also see that although multiple measurements were performed on this particular subject, experience now tells us that if the ideal range is achieved after three months that will become the steady state.

Although blood testing of vitamin D is increasing, many people who are at risk of vitamin D deficiency are still not being tested. We can anticipate that the number of tests will increase. Perhaps vitamin D testing should be part of routine testing in early pregnancy, as if a child starts life with a good level of vitamin D, that is likely to continue through life. It appears that vitamin D is most important in the early stages of life, particularly before birth and shortly after.

Tuesday 13 August 2013

Diabetes is usually the result of a visit to the doctor

Patient mongering is the process by which the medical profession creates new patients out of people who have considered themselves to be normal. There are many examples of this and one is diabetes, of which there is alleged to be an epidemic.

It could be argued that the commonest cause of diabetes is going to see the doctor. Any of us might be summoned for a "routine check". We go in perfectly well but we come out with diabetes. In fact it has been part of UK government policy to encourage diabetes case-finding by family doctors. By WHO standards the diagnosis of diabetes has been based on the finding of a blood sugar concentration of 11 mmol/L on at least two occasions. However the goalposts have been widened, and a blood glucose above 8 now appears to have become the threshold for diagnosis. We can take things a stage further: people (now patients) are told that they have diabetes even without any measurement of blood glucose.

For many years in the control of diabetes, especially brittle type 1 insulin-dependent diabetes, it has been realised that a blood glucose recording is only a snapshot and not a very good basis for adjusting treatment. Then came along a new test, HbA1c, a measure of glucose attached to haemoglobin. This was found to be a better measure of diabetes control, reflecting glucose control over a long period of time.

When a well person attends the family doctor (more likely the practice nurse) for the "routine check" a series of blood tests will be ticked off on the request form and HbA1C is now likely to be included. If it is marginally elevated then the person will become a patient with diabetes, and probably an unhappy patient who continues to feel perfectly well.

It is little surprise that we have a declared epidemic of diabetes. Is it an epidemic of disease or an epidemic of diagnosis with a lower threshold? It certainly has an effect on the person diagnosed. There will inevitably be anxiety, with concerns about future blindness or leg amputation, consequences that in reality we do not see today. The financial consequence is an increase in insurance premiums, and possibly detrimental effects on employment.

Does the diagnosis of diabetes benefit the person in any way? A community based study of 15,000 people in the east of England received the award from the British Medical Association this year for its investigation of this. [1] The study involved enthusiastic case-finding of diabetes in certain general practices but not in the others. The health outcomes of the two groups turned out to be identical after 10 years, no difference in total deaths, heart deaths, cancer deaths etc. Case finding was clearly shown to be of no benefit. However negative results of this sort do not deter enthusiasts, especially when there is financial reward to the doctors and other health professionals, and also of course the pharmaceutical industry.

The disadvantage of type 2 diabetes has been the cardiovascular associations, mainly coronary heart disease (CHD). In fact it has often been following a heart attack (myocardial infarction, MI) that type 2 diabetes has been identified. More recently it has been by case-finding described above. Now that the epidemic of CHD is virtually over (see previous post, An epidemic of CHD) the disadvantage of type 2 diabetes is also diminishing. The prognosis for people with type 2 diabetes is now little different from the general population.

An interesting Viewpoint on type 2 diabetes appeared in the Lancet recently, written by Edwin Gale, professor emeritus at Bristol University, UK [2]. Although in some respects a bit heavy and philosophical, it is also like a breath of fresh air. Gale points out that in 1907 it was said that diabetes could only be defined in terms of glucose, and that this view has never been successfully challenged. He argues that if after more than a century type 2 diabetes can still only be defined as "idiopathic hyperglycaemia" (high blood glucose of unknown cause) then let us call it that. The term diabetes implies a specific disease process, but type 2 diabetes is not one. "Diabetes" itself is a translation of "plentiful urine", of which there are two types: "mellitus" means sweet or honey-like, containing sugar, and the other, much rarer, type is "insipidus", meaning insipid or tasteless. Such a descriptive term for type 2 diabetes is inappropriate in the present era. Now that we are in the era of case-finding by blood testing, we rarely find glucose in the urine, and no excess of urine production.

In the absence of coronary heart disease the disadvantage of a high blood glucose is not clear and the Cambridge study above suggests that there is no disadvantage. Gale has a long interest in the epidemiology of diabetes and he emphasises the continuum of blood glucose levels in the population. Demarcation between normal and diabetes is artificial. He goes on to indicate that the level that determines whether or not a person has "diabetes" is decided by a committee rather than by a function of nature, and so as mentioned above the prevalence of the condition can be changed overnight, thereby creating a pseudo-epidemic and robbing many people of their perception of good health. Doctors are very good at this.


1. Simmons RK et al. Screening for type 2 diabetes and population mortality over 10 years (ADDITION-Cambridge): a cluster-randomised controlled trial.  Lancet 2012; 380: 1741-48.

2. Gale EAM. Viewpoint: is type 2 diabetes a category error? Lancet 2013; 381: 1956-57.

Thursday 8 August 2013

Nitric oxide: there is more to the sun than Vitamin D

From the evolutionary point of view the skin can be seen to be a bit of a disaster. Although when unblemished it is remarkably beautiful, it is also very delicate and is easily damaged by scratches, insects, cold and by overexposure to the sun. Most animals have protection from scales, feathers, hair or particularly thick skin. We human beings have none of these, and it is difficult to understand why in terms of evolution our skin is so delicate and defenceless. It needs protection by the use of clothes. Some protection from excess sun exposure is provided by skin pigmentation, and evolution has determined that the tropical zones are inhabited by people with heavily pigmented skin, with fairer-skinned people living further from the equator.

The skin can be viewed as a large endocrine organ. In it is synthesised a steroid compound 7-dehydrocholesterol (7-DHC). The sun converts this into Vitamin D. This is a pro-hormone which is activated in the liver and the kidneys (for further details see eBook “Vitamin D and Evolution” Medical Briefs series, Kindle and iTunes £0.99). 

In its activated form vitamin D has many important actions. The best known is its vital effect on bone maturation. It is involved in the activation of immunity. It is also known to activate many of our genes.

There are however several advantages from the sun that are not easily explained by vitamin D. One of these concerns blood pressure, which is on average higher in the winter than in the summer. We can of course construct an elaborate hypothesis to involve vitamin D, but this would not be particularly helpful. It is better simply to start with the sun as the hypothetical controlling factor, as the sun's energy at ground level is the factor that defines the summer and the winter.

The next step would be to give controlled ultraviolet radiation as a simulation of sunshine, to see if it reduces blood pressure and also to see whether or not blood levels of vitamin D increase as blood pressure falls. This has been undertaken by researchers in Edinburgh.

It was noted that UV radiation did indeed lower blood pressure, but the wave-length was controlled so that there was no change in vitamin D level. This means that there is more to the benefit of the sun than vitamin D, and another physiological mechanism needs to be identified.

It has been known for some time that nitric oxide (NO) is a normal physiological mediator of arterial function (Nobel prize 1998). The research has now identified that nitric oxide is produced in the skin under the influence of ultraviolet light from the sun. It is this that appears to have the effect of lowering blood pressure. It is suggested that the sun acts on nitrates and similar nitrogen-containing compounds (from the diet) in the skin to release nitric oxide, which then has beneficial effects on the circulation. The blood level of nitric oxide rises very rapidly after UV exposure. There might be other beneficial effects of the sun mediated by nitric oxide, and no doubt further research is under way.

The practical results of this are quite profound. We find many people with severe vitamin D deficiency resulting from little or no exposure to the sun. These are mainly people who almost completely cover their skin with clothes, very orthodox Muslim and Jewish women in particular. We are giving them vitamin D supplements by mouth to correct their vitamin D deficiency, but this new research from Edinburgh suggests that this is not enough and that there is no real substitute for the effect of the sun on our skin.

It means that vitamin D deficiency is in part a correctable risk factor for a variety of diseases, but is also just an index of inadequate exposure to sunlight. 

For further details of this research please view the TED lecture by Richard Weller.

Monday 5 August 2013

What your doctor probably knows about cholesterol

The YouTube video that I invite you to watch is to my mind hilarious. Human experience never ceases to provide us with a constant supply of comedy. Sadly, underlying most comedy there is tragedy.

The realities of life can be bewildering, and those of you who visit doctors will understand this. I recently met in the outpatient clinic a lady who had just spent about ten days in hospital, during which time she had been seen by 18 doctors (she counted them carefully, being surprised by the profusion). She told me that she had been discharged from hospital without any comprehension of her illness and no clear diagnosis. "If only two doctors had told me the same thing I might have been satisfied, but I received 18 different opinions". When I told her the retrospective diagnosis she informed me that one of the 18 doctors had said the same thing, and so she believed me. Perhaps she was also confident of my knowledge and plausibility.

You might be bewildered about what we are told about cholesterol. Most doctors will conform to existing popular “wisdom” (= misinformation), that cholesterol is poisonous, toxic or generally bad for you. You might also find a few cholesterol sceptics but no-one is likely to sit on the fence. Who do you believe?

What it is important to understand is that most post-graduate medical education is funded by the pharmaceutical industry. This might be just free lunches at a regular hospital meeting (without the free lunch the doctors would be unlikely to attend) but it extends to major international conferences. Cardiology in particular is funded to a major extent in this way, including financial support for some academic departments. Major medical journals must now insist on conflicts of professional interest being declared by the authors. Extravagant coercion by pharmaceutical companies has been constrained by government action in recent years.

It is sad that so few doctors seem to keep up-to-date with original articles in the medical journals, but to do so is a big undertaking. The internet gives immediate access to the papers but not the time to read and digest them. The doctor is therefore likely to give an uncritical popular view, reinforced by what he or she has been told by the pharmaceutical representative.

So enjoy watching the YouTube video about the patient who visits his doctor concerning his cholesterol level. He is being encouraged to take Lipitor (atorvastatin), one of the major statin medications.

The blood cholesterol level of 225 is in units of mg/100 ml as used in the USA, corresponding to about 5.8 mmol/L in the UK and European SI units.
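For readers who want to translate between the two scales, the divisor is derived from the molar mass of cholesterol, 386.65 g/mol; a minimal sketch:

```python
# Convert cholesterol between US units (mg/100 ml, i.e. mg/dl) and
# SI units (mmol/L). The divisor 38.67 comes from cholesterol's
# molar mass of 386.65 g/mol.

MG_DL_PER_MMOL_L = 38.67

def mg_dl_to_mmol_l(mg_dl: float) -> float:
    return mg_dl / MG_DL_PER_MMOL_L

def mmol_l_to_mg_dl(mmol_l: float) -> float:
    return mmol_l * MG_DL_PER_MMOL_L

# The patient's 225 mg/100 ml in SI units:
print(round(mg_dl_to_mmol_l(225), 1))  # → 5.8
```

As with vitamin D, a cholesterol result is meaningless without knowing which unit the laboratory has used.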

The Lipitor paradox:

Friday 2 August 2013

Cholesterol - evidence suppressed for 25 years

 “After age 50 years there is no increased overall mortality with either high or low serum cholesterol levels.”

This is the conclusion reached after a 30 year follow-up study of the citizens of the town of Framingham, Massachusetts, USA.

At the onset of the study the conventional wisdom was that coronary heart disease (CHD) was the result of excessive cholesterol in the diet, leading to excessive cholesterol in the blood, and as a consequence heart attacks due to the arteries becoming "furred up". This was the diet-cholesterol-heart hypothesis, but it needed to be legitimized by a prospective study. The National Heart, Lung, and Blood Institute of the USA chose the town of Framingham for this long-term research project, which started in the mid 1950s.

The publication at the end of the study, entitled "Cholesterol and Mortality", contained the conclusion quoted above. It was met by what can only be termed a stunned silence, and then was quickly put away before the conclusion could enter the public domain. It has remained put away for 25 years.

It is a bit like the long-term research project described in "The Hitch-hikers' Guide to the Galaxy". The objective was to provide the answer to life, the universe and everything. Before releasing the answer the supercomputer (Deep Thought) warned the project team that "You are not going to like this". The answer was 42, an answer that was not expected, and not liked. As with Framingham, the expensive project gave the "wrong" answer and huge amounts of money were effectively wasted.

Remember that at least 90% of CHD deaths occur after the age of 50 years. Think how many people above the age of 50 years during the past half-century have had their serum cholesterol level tested to try to predict coronary risk when it doesn’t. 

Of course this finding does not fit in with the diet-cholesterol-heart hypothesis, and therefore cholesterol screening continues without critical assessment. At huge expense efforts are made to reduce the cholesterol level of the population to no avail. 

Here we can see the distribution of cholesterol levels in young men aged 30-49 years. We can see that those who turned out to have CHD had on average slightly higher levels than those without CHD. This was expected at the onset of the project.

The data from the Framingham study were presented in a series of survival curves, one for each age range and sex, each grouping the subjects into bands of blood cholesterol level.

We see the survival curves of the young men, aged 31-39. At the end of the study they were aged 61–69 and we can see that 68% of those with the highest cholesterol were still surviving compared to 84% with the lowest cholesterol levels. So far so good.

Things were rather different with young women and we know that overall their health is excellent with an advantage over men. Cholesterol level in the blood made very little difference and overall about 90% survived the thirty year study. Cholesterol measurements were not helpful in young women.

We must remember that death from CHD has been rare below the age of 40, and now that the epidemic of CHD is virtually over (see Post 5) it does not occur at all. Most deaths have occurred above the age of 70, but it is important to look at the age-group between 50 and 70, those who have not yet had their three score and ten.

The first thing to notice, published in 1978 but apparently untouched by human brain, is that the distribution curves in men aged 50–62 are identical for those with and those without CHD. In other words blood levels of cholesterol are of no value and do not predict CHD.

And then we look at the survival curves. As we might expect, only 10% of men aged 50–65 survived the 30 years of follow-up. However the important thing is that the blood levels of cholesterol had no influence whatsoever on survival.

So, what is the point of cholesterol testing? The beneficiaries are not the population tested, but think what it has done for the anti-cholesterol industries, pharmaceutical and food production.

The only interesting observation is the over-expression of cholesterol in young men with CHD. This brings us to basic science of the defensive value of cholesterol and the investigation of this has been neglected. 

We have seen in Post 8 (29-07-2013) that to give statins to men in this age-group would necessitate treating 40,000 men for five years to delay one death, a cost of £14.7M. The best thing is to forget about cholesterol.
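Taking the quoted figures at face value, the arithmetic behind them can be checked in a couple of lines (the per-person breakdown below is my own calculation, not from the original post):

```python
# Rough check of the statin treatment figures quoted above.
men = 40_000
years = 5
total_cost_gbp = 14_700_000  # £14.7M, as quoted

person_years = men * years
cost_per_person_year = total_cost_gbp / person_years

print(person_years)          # → 200000
print(cost_per_person_year)  # → 73.5
```

So the quoted total amounts to roughly £73.50 of treatment per man per year, all of it spent to delay a single death.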


Anderson KM, Castelli WP, Levy D. Cholesterol and mortality: 30 years of follow-up from the Framingham study. JAMA 1987; 257: 2176-2180.