The UK debate over folic acid highlights science’s role in public health ethics
By the time a human embryo is the size of a blueberry, it has already laid the foundations of its central nervous system. Just four weeks after conception, a flat plate of cells in the embryo has folded over and fused into a tube, the beginnings of its brain and spinal cord. If this neural tube does not close properly, the foetus may develop spina bifida – literally ‘split spine’ – potentially causing a range of disabilities. Such neural tube defects (NTDs) affect hundreds of thousands of births every year, making them one of the most common types of birth defect.
To avoid NTDs, women are advised to take folic acid supplements if they are planning to become pregnant. Folate, also known as vitamin B9, plays a key role in DNA synthesis and repair, making it an essential ingredient of a successful pregnancy. But many conceptions are unplanned – about half in the UK – and by the time a pregnancy is confirmed, NTDs may already have formed. That is why more than 80 countries require flour to be fortified with folic acid. By ensuring that most of the population consumes enough folate in their diet, these countries have dramatically reduced the incidence of NTDs. It is a simple, inexpensive and remarkably effective public health intervention.
But the UK, along with all other EU countries, does not mandate folic acid fortification. Here, the fortification debate has raged for more than two decades, and last month a scientific review of the evidence made a renewed call for the government to take action. Yet the review is unlikely to shift the government’s position. The reasons behind this intransigence offer a classic case study in the limitations of scientific evidence in policymaking.
Evidence versus ideology
Back in 1991, a clinical trial run by the UK’s Medical Research Council concluded that giving folic acid supplements to women before conception and during pregnancy reduced NTDs by about 80%. Some health authorities acted quickly on the finding. In 1992, for example, the US Public Health Service recommended that all women of childbearing age take folic acid supplements, whether or not they intended to become pregnant. But it soon became obvious that these public health campaigns were not gaining much traction. So in 1998, the US government ruled that most flours must contain adequate amounts of folic acid. Canada mandated fortification around the same time, and it had a striking effect: by 2000, the incidence of NTDs there had roughly halved to 0.86 cases per thousand births.
Successive UK governments have been barraged with scientific advice recommending that they follow suit. The medical establishment overwhelmingly supports the move. And fortification is hardly unprecedented in the UK: non-wholemeal flour must be fortified with iron, calcium and other B vitamins, for example.
Objections to mandatory folic acid fortification have shifted over time. Initially, there were concerns that high doses of folic acid could mask the symptoms of vitamin B12 deficiency in some elderly people. Then there was a purported link between folate and bowel cancer. In both cases, clinical studies have proved those fears to be unfounded.
What remains is an ideological argument against fortification. In 2016, Lord Prior, responding on behalf of the government, said that it had no plans for mandatory folic acid fortification: ‘It is about doing all we can to help people make the right choice, but ultimately accepting, outside of extreme circumstances, that the final choice has to be made by them and not by the government.’
The debate has reached a stalemate because the participants are arguing past each other. The unstoppable force of clinical evidence has met the immovable object of moral philosophy. So how can scientists who advocate fortification break through the impasse?
Although clinical evidence alone may not be enough to win the argument, it can help to support a strong ethical case for fortification. One recent study estimated that if the UK had mandated fortification in 1998, it would have prevented about 2000 NTD pregnancies. This demonstrates that the government’s current position on mandatory fortification has moral consequences.
We should also question whether the government really is ‘doing all [it] can to help people make the right choice’. A survey of half a million women in the UK found that the proportion who took folic acid supplements before pregnancy actually declined from 35% to 31% in the decade up to 2012. So arguments for fortification should go hand in hand with demands for a much stronger programme to encourage women to take folic acid supplements – the two strategies are complementary, rather than mutually exclusive.
One ethical objection to mandatory fortification is that it offers the wider population no benefits to outweigh any potential risks. But clinical studies suggest that fortification in the US has almost eliminated folate-deficiency anaemia in the elderly. And in countries with widespread dietary folate deficiency, fortification can offer a way to improve the nutrition of the entire population.
Campaigning has also played an important role in persuading lawmakers to adopt mandatory fortification. International agencies such as the World Health Organization have tended to target low- and middle-income countries, which stand to benefit most from fortification. If advocates of fortification in the UK could unite behind a similar long-term, strategic and sustained campaign, it might shift the public mood and persuade policymakers to act.