This increase in function is followed by a plateau across the range of adequate intake, meaning that increases or decreases within this range result in no observed benefit or harm. With further increases in intake, function becomes impaired, as the mechanisms regulating that nutrient are overwhelmed by excessive intake.
An important point in this respect is that the biological activity of nutrients is absolutely specific: deficiency can only be prevented by that micronutrient, not by another, even if the two are chemically related. For example, iron and zinc may both be metal minerals, but zinc cannot prevent or reverse iron-deficiency anaemia. Calcium may be crucial for bone health, but only vitamin D adequacy prevents rickets. The relevance of this fact is that conditions arising from a deficiency in a single micronutrient can be remedied by the provision of that nutrient in adequate amounts. Indeed, it was in the context of widespread diseases associated with single-nutrient deficiency that modern nutrition science was born.
History of Single-Nutrient Deficiency
Technically, the first evidence of a single-nutrient deficiency disease being cured by provision of the particular nutrient came in 1747, in the Royal Navy. A naval surgeon named James Lind conducted an experiment aboard the HMS Salisbury, in which he assigned sailors with scurvy to consume a range of supposed natural remedies: mustard, vinegar, garlic, oranges, and lemons. Only the sailors prescribed the citrus fruits recovered from scurvy.
However, the prevailing medical thought at the time was the humoral model of medicine, in which conditions arose due to imbalances in four bodily fluids (blood, phlegm, yellow bile, and black bile). Lind believed scurvy was a digestive issue that arose from wet and cold conditions, and that the corresponding therapy should reverse these conditions: he thus boiled the juice of the citrus fruits. Unfortunately, boiling destroys the ascorbic acid (vitamin C) responsible for preventing scurvy, and in the process Lind confounded his own experiment. It would be another 42 years before the Royal Navy officially adopted citrus fruits for the prevention of scurvy at sea.
By the late 19th century, a more formal science was emerging. The first example of converging lines of evidence informing a food-based conclusion in relation to deficiency states is that of beriberi in Asia, which was widespread at the time.
In 1886, Christian Eijkman (who subsequently won a Nobel Prize) conducted an experiment in Jakarta (then the Dutch East Indies), feeding chickens either polished rice or wholegrain rice. The chickens fed polished rice developed symptoms similar to those of beriberi in humans, suggesting a hypothesis for the origins of the disease.
This finding was carried into an observational study by Adolphe Vorderman, a Dutch doctor in the Dutch East Indies, who gathered data from several prisons throughout the islands on food intake and records of beriberi. Despite the observational nature of the study, Vorderman was acutely conscious of the potential for bias, and so blinded prison officials to the purpose of the study to prevent them from improving the prisoners' diets. He found a strong association between the type of rice and incidence of the disease, with white rice correlating with increased prevalence of beriberi, an observation not seen with red rice.
These findings were then tested in a Malaysian asylum population in 1905 by Dr William Fletcher, who assigned inmates to either white or red rice, publishing his findings in The Lancet in 1907 that red rice prevented beriberi while polished white rice did not.
However, it is important to note that although the research into beriberi pointed to the type of rice, it was not known what in the rice might explain the observed associations, or why. It was not until 1912 that the biochemist Casimir Funk coined the term 'vitamin' to describe a set of compounds vital to life.
In 1913, Funk discovered vitamin B1 (thiamine), identifying the nutrient whose deficiency explained beriberi. This discovery led to a domino effect, with vitamin A discovered in the same year, vitamins E and D in 1922, and vitamin B9 in 1933. The evolution of nutrition science at the time reflected the public health concerns of the day, typified by the beriberi example: identifying constituents in foods that provided a remedy for disease states associated with a deficiency in a given micronutrient. Vitamin C was isolated in 1932, finally identified as the causative deficiency in scurvy, some 185 years after Lind's experiment. Niacin deficiency was identified as the cause of pellagra, iron deficiency for anaemia, iodine deficiency for goitre, and vitamin D deficiency for rickets. Interventions targeted these isolated nutrients, including the fortification of salt with iodine, of milk with vitamin D, and of flour with vitamins B1 and B3.
Collectively, the discovery of micronutrients, and their association with specific disease states, was a significant step forward for public health, and ignited a focus on preventative measures in public health nutrition to achieve nutritional adequacy in the population. However, in the modern developed world deficiency states do remain - iron deficiency anaemia, for example - and there may also be specific contexts in which the risk of nutrient deficiency is greater.
This Sigma Statement will discuss current evidence for nutrient deficiencies in the developed world specifically. This is not to dismiss the critical role of nutrient deficiencies in the developing world, and the scourge of stunting, which increases mortality risk 5.5-fold compared to non-stunted children. However, to properly address nutrient deficiencies in the developing world would require consideration of a range of factors which influence nutritional inadequacy: water, sanitation, hygiene, access to healthcare services, and food security, to note a few. Many of these issues are not factors in the developed world. The unifying cause of nutrient deficiencies globally is poverty, and thus low- and middle-income countries bear the greatest burden of micronutrient deficiencies. Nonetheless, poverty, food insecurity, and poor diet quality in developed countries continue to contribute to micronutrient deficiencies.
Globally, the micronutrient deficiencies of most concern are iron, vitamin A, zinc, folate (vitamin B9), and iodine. The prevalence of vitamin A deficiency is primarily confined to the developing world; however, there are varying degrees of nutritional inadequacy for the remaining micronutrients in the developed world. There may also be potential for vitamin D insufficiency, given the reliance on sunlight exposure for its synthesis, especially at extreme latitudes where there is insufficient UV exposure for several months of the year. Given all this, the current Statement will focus on the following nutrients:
- Iron
- Folate (vitamin B9)
- Iodine
- Zinc
- Vitamin D
- DHA
Iron
Iron deficiency is a concern in both sexes, occurring in up to 16% of children under the age of 5 in the UK population, with diagnosed iron deficiency anaemia in 8%. This prevalence of iron deficiency in pre-school-aged boys and girls reflects the fact that through childhood (prior to the onset of puberty), nutritional considerations remain largely the same for both sexes. Cause for concern in this age group in relation to iron deficiency is warranted, given the associations between iron deficiency and impaired growth, development, and cognition.
However, the onset of puberty spurs developmental differences with important implications for nutrition during adolescence (defined as ages 10-24). The onset of the menstrual cycle increases the risk of iron deficiency in females. Iron deficiency and iron deficiency anaemia are the most significant micronutrient deficiencies in females at all stages of adolescence (early adolescence being ages 10-14 and late adolescence ages 15-19), and into early adulthood (ages 20-24).
The recommended daily intake for iron in the UK is 14.8 mg/d for adolescent girls (11-18 yrs) and 11.3 mg/d for same-aged boys. National Diet and Nutrition Survey (NDNS) data in the UK indicate that ~8% of girls in this age group have iron deficiency, compared to only 1.8% of boys. Furthermore, 46% of girls had iron intakes below the lower reference nutrient intake level. Both diet and (if warranted) supplementation may assist in achieving nutritional adequacy for iron.
The differences in iron requirements between women (particularly pre-menopausal) and men are reflected in the Dietary Reference Intakes (DRIs) published by the National Academies of Sciences, Engineering, and Medicine, which set the recommended dietary allowance (RDA) for adults as:
- Men = 8 mg/d
- Women up to 50 years old = 18 mg/d
- Women more than 50 years old = 8 mg/d
- Pregnancy = 27 mg/d
A relevant consideration for iron status is the source of iron, as heme iron from meat, fish, and poultry is more readily absorbed than non-heme iron found in plant foods. However, there are factors which enhance non-heme absorption, in particular ascorbic acid (vitamin C). In females with iron deficiency consuming a low-bioavailability diet (i.e., majority non-heme sources), the addition of ascorbic acid (from citrus fruit) to each meal significantly increased the absorption rate, to 22.9%.
Diagnosed iron deficiency anaemia is treated with supplementation of 65 mg elemental iron (200 mg ferrous sulfate) three times per day, and this should be overseen by an individual's medical and nutrition professional advisors.
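The dosing above can be sketched as simple arithmetic. This is an illustrative calculation only, assuming the commonly cited figure that 200 mg of ferrous sulfate provides ~65 mg of elemental iron (~32.5%); it is not dosing advice.

```python
# Illustrative arithmetic for the iron-supplementation regimen described above.
# Assumption: 200 mg ferrous sulfate ≈ 65 mg elemental iron (~32.5%).

ELEMENTAL_IRON_FRACTION = 65 / 200  # ~0.325

def elemental_iron_mg(ferrous_sulfate_mg: float) -> float:
    """Approximate elemental iron content of a ferrous sulfate dose."""
    return ferrous_sulfate_mg * ELEMENTAL_IRON_FRACTION

per_dose = elemental_iron_mg(200)  # ~65 mg elemental iron per 200 mg tablet
daily_total = per_dose * 3         # three doses per day

print(f"{per_dose:.0f} mg elemental iron per dose, {daily_total:.0f} mg/day")
```

Taken three times daily, this regimen delivers roughly 195 mg of elemental iron per day, which is why such dosing warrants professional oversight.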
Folate
Folate (or vitamin B9) occurs naturally in foods and is ubiquitous in foods with green pigmentation. Folic acid is the synthetic form of folate, used in supplements and in food fortification. Folic acid is more bioavailable than folate from foods, as factors like incomplete release of folate from the food matrix and destruction in the gastrointestinal tract may yield a lower absorbable uptake of dietary folate. Once absorbed, folic acid is metabolised into tetrahydrofolate, identical to the compounds derived from natural dietary folate intake.
For adults generally, the recommended daily intake of folate is 200 µg/d. While most adults do appear to achieve this, based on sampling of blood folate levels, requirements for folate double during pregnancy. Current advice for women intending to become pregnant is to take 400 µg/d of folic acid prior to conception and until the 12th week of pregnancy.
A long line of evidence stretching back to the early 1960s established unequivocally that folate deficiency causes neural tube defects. As a result of this accumulated evidence, countries began to enact food fortification programs, adding folic acid to cereal grain products; the first mandatory fortification program was introduced in the US in 1998. Public health recommendations for pregnant women to supplement with folic acid had started in a number of developed countries in the early 1990s; however, these recommendations had not led to any noticeable decline in the prevalence of neural tube defects. This is believed to be because folate repletion is required pre-conception, while supplementation was likely only being initiated after confirmation of pregnancy.
In developed countries that have initiated mandatory or voluntary folic acid fortification programs, the current prevalence of folate deficiency is now very low. However, there is some evidence that where folic acid fortification is voluntary (i.e., at the discretion of industry), folic acid levels in foods may decline, resulting in an increasing incidence of neural tube defects. Thus, preventing inadequate folic acid/folate levels in the population continues to be an important public health nutrition goal.
Genetic variability in the expression of the enzyme methylenetetrahydrofolate reductase (MTHFR) influences dietary folate requirements. Specifically, compared with the 677C genotype, the 677T variant leads to reduced MTHFR activity, and therefore requires increased folate intake to compensate. The other common MTHFR variant, A1298C, also influences folate requirements. In subjects with genetically reduced MTHFR activity, supplemental folic acid of 400 µg/d normalises plasma folate.
Iodine
Iodine is a trace mineral which is primarily utilised for the synthesis of thyroid hormones, and the majority of iodine in the body is stored in the thyroid gland. In 2007, a World Health Organisation report declared that:
"On a worldwide basis, iodine deficiency is the single most preventable cause of brain damage."
Iodine Deficiency Disorders (IDD) represent a spectrum of disorders, including (but not limited to) goitre, reduced IQ, and cretinism. Globally, iodine deficiency accounts for population-level intelligence quotient (IQ) scores up to 13.5 points lower than in areas with no iodine deficiency.
As a trace mineral, the iodine content of most foods and beverages is low, with seaweed, milk, and fish providing the best sources. General recommendations for iodine are 150 µg/d, increasing to 220 µg/d for pregnant women.
As a result of the low levels in foods, up to 120 countries globally have fortified table salt with iodine, which is an effective means of reducing iodine insufficiency. Iodine insufficiency may be defined as a population average urinary iodine concentration of <100 µg/L, which is increased to <150 µg/L for pregnancy (more on this below). Universal salt iodisation, which fortifies all salt for human and livestock consumption with iodine, including food industry products, is the most efficacious strategy to reduce iodine deficiency, and has been enacted in up to 30 countries.
Europe has the highest prevalence of iodine deficiency and the lowest coverage of salt iodisation in the world, with up to 52% (~460 million people) of the population having insufficient iodine intake. This may reflect the fact that most European countries have not enacted universal salt iodisation, that the majority of salt intake is derived from processed foods rather than home-cooking additions, and that limited numbers of foods are fortified with iodised salt. It also appears that, because overt signs of iodine deficiency such as goitre have become rare, a degree of public health complacency took hold in many European countries with regard to iodine deficiency and the need for more comprehensive programs like universal salt iodisation. For example, recent research in UK adolescent girls revealed a median urinary iodine concentration of 80 µg/L, with those who do not consume milk or fish, and vegans, at higher risk of insufficiency.
Due to cow's milk providing a substantial contribution to population iodine intakes, vegans are at a higher risk of iodine insufficiency. Cow's milk may contain up to 430 µg/L, while in the UK only 3 of 47 commercially available plant "milk" substitutes were fortified with iodine, and the average iodine concentration in plant "milk" substitutes was 1.7% of the value for cow's milk.
Until such time as fortification is more widespread, it may be prudent for individuals following a vegan diet to look for iodine-fortified "milk" substitutes.
Maternal iodine requirements increase by over 50% during pregnancy, particularly during the first trimester, when maternal thyroid hormones also cover thyroid hormone requirements for the developing foetus. Mild to moderate iodine deficiency in pregnancy, defined as a urinary iodine concentration <150 µg/L, is of concern here, as the increased iodine requirements of pregnancy pose a greater risk for relative iodine deficiency. Mild iodine deficiency in the first trimester may result in lower cognitive capacities in children, as measured at 8-11 years of age.
Recommendations for pregnant women include an intake of 220 µg/d of iodine, to cover the increased maternal requirements. Given the importance of iodine repletion during the first trimester, it may be prudent to ensure sufficient iodine pre-natally. The recommended intake can be made up through a combination of supplementation and foods. Supplementation of 150 µg in the form of potassium iodide (KI), in addition to dietary intake, would ensure a 220 µg/d intake. It is also recommended to continue with 220 µg/d through breastfeeding, with the RDA in North America set at an increased level of 290 µg/d for the lactation period. This is important given that only around 20% of women in a UK survey sample were aware of the increased requirements for iodine.
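The supplement-plus-diet arithmetic above can be made explicit. This is a minimal sketch using only the 150 µg supplement and the 220/290 µg targets stated in the text; the idea is simply that diet must cover the remaining gap.

```python
# Hedged sketch of the iodine arithmetic described above: a 150 µg/d potassium
# iodide supplement plus dietary iodine should together reach the pregnancy
# (220 µg/d) or lactation (290 µg/d, North American RDA) targets.

PREGNANCY_TARGET_UG = 220.0
LACTATION_RDA_UG = 290.0
SUPPLEMENT_UG = 150.0

def dietary_iodine_needed(target_ug: float, supplement_ug: float = SUPPLEMENT_UG) -> float:
    """Dietary iodine (µg/d) needed on top of the supplement to reach a target."""
    return max(0.0, target_ug - supplement_ug)

print(dietary_iodine_needed(PREGNANCY_TARGET_UG))  # 70.0 µg/d from food
print(dietary_iodine_needed(LACTATION_RDA_UG))     # 140.0 µg/d from food
```

On this arithmetic, a supplemented pregnant woman needs only ~70 µg/d from food, an amount achievable from milk or fish for most omnivorous diets.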
Zinc
The fact that 10% of all human proteins bind zinc indicates the wide range of physiological functions under the influence of zinc proteins. These diverse biological processes include the activity of growth factors, cell-signalling molecules, transcription factors, cytokines, receptors, and more than 300 enzymes.
Zinc is absorbed through "zinc importer proteins" in the small intestine, and is tightly regulated in the body with only a small pool of available zinc. Consequently, consistent dietary intake of zinc is required to ensure adequate supply of this mineral for the myriad physiological functions for which zinc is required.
Zinc is not generally thought of as a nutrient of concern in the developed world. However, plasma zinc, the primary biomarker of zinc status, is non-specific and not a robust indicator, and may remain in the normal range despite marginal zinc deficiency. As the primary food sources of zinc are animal-origin foods, this is relevant to shifting dietary patterns in certain demographics in developed countries. As a metal mineral, zinc shares some similarities with iron in the factors that influence its absorption, with phytates, fibres, and calcium potentially impairing absorption.
The recommended daily intakes of zinc are 9.5 mg/d and 7.0 mg/d in males and females, respectively (with the RDA in North America set at 11 mg/d and 8 mg/d). UK data do indicate that these targets are being met by the majority of the adult female population. However, there was a shortfall in requirements observed in males, particularly in the 20-59 age group.
Given that zinc is both easier to obtain in animal foods and that compounds and nutrients in plants can inhibit absorption (e.g. phytate and fibre), this poorer bioavailability is reflected in recommendations that vegans and vegetarians may need 50% more zinc than non-vegetarians. So male vegans are recommended to consume up to 16.5 mg/d of zinc (vs. the RDA of 11 mg/d), and females up to 12 mg/d (vs. 8 mg/d).
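The 50% uplift described above is straightforward to express. This is an illustrative sketch applying that uplift to the North American RDAs cited in the text; the function name and structure are my own, not from any dietary guideline.

```python
# Minimal sketch of the ~50% zinc uplift suggested for vegans/vegetarians,
# applied to the North American RDAs cited above (11 mg/d men, 8 mg/d women).

RDA_ZINC_MG = {"male": 11.0, "female": 8.0}  # North American RDA, mg/d
PLANT_BASED_UPLIFT = 1.5  # up to 50% higher, to offset phytate/fibre inhibition

def zinc_target_mg(sex: str, plant_based: bool = False) -> float:
    """Approximate daily zinc target (mg/d), with optional plant-based uplift."""
    base = RDA_ZINC_MG[sex]
    return base * PLANT_BASED_UPLIFT if plant_based else base

print(zinc_target_mg("male", plant_based=True))    # 16.5 mg/d
print(zinc_target_mg("female", plant_based=True))  # 12.0 mg/d
```

The computed targets (16.5 mg/d and 12 mg/d) match the figures quoted for vegan males and females in the text.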
Vitamin D
Vitamin D is obtained from skin exposure to ultraviolet B (UVB) sunlight, dietary sources, and supplements. Vitamin D comprises:
- Vitamin D2 - which is found in plants and fungi (including UV-irradiated mushrooms), formed from the irradiation of ergosterol
- Vitamin D3 - which is synthesized in human skin and found in limited food sources, primarily fatty fish and egg yolks.
The primary source of vitamin D3 synthesis is human skin exposure to UVB sunlight, from which photons are absorbed by 7-dehydrocholesterol and converted into previtamin D3.
Both vitamin D2 and D3 undergo conversion in the liver to 25-hydroxyvitamin D (25(OH)D), the primary circulating form of vitamin D. 25(OH)D is metabolised in the kidney via the 1-α-hydroxylase enzyme (CYP27B1) to the active hormonal form, 1,25-dihydroxyvitamin D (1,25(OH)2D). The main physiological function of vitamin D is to maintain calcium and phosphate homeostasis, with the primary action of 1,25(OH)2D being to regulate calcium absorption from the gut; however, 1,25(OH)2D is active in almost all body cells. Vitamin D acts on bone both indirectly, via intestinal calcium and phosphate absorption, and directly, through stimulation of osteoblast activity and osteoclast-mediated mineral resorption. In addition, vitamin D plays a vital role in stimulating the immune system and influencing neuromuscular function. The clinical consequences of deficiency, of which the most common are rickets and osteomalacia, are observed with prolonged serum 25(OH)D below 25 nmol/L.
Vitamin D deficiency is a public health concern of global scope, and is associated with the pathogenesis of skeletal and extra-skeletal disease states. Characteristic risk factors for vitamin D deficiency include winter season, lack of sun exposure (including covered clothing styles), increasing age, female sex, dark skin pigmentation, malnutrition, and obesity.
Data from the National Health and Nutrition Examination Survey (NHANES) has suggested that over 75% of the US adult population is insufficient in vitamin D (defined by the Endocrine Society as serum 25(OH)D levels of <75 nmol/L), independent of ethnic and environmental characteristics.
However, in the UK and EU a different threshold is applied. In the UK, <25 nmol/L is considered deficient, but public health authorities have not defined sufficiency. In the EU, 25-50 nmol/L is considered a range of insufficiency, while >50 nmol/L is deemed sufficient.
In the UK, based on recent National Diet and Nutrition Survey data, low blood levels of vitamin D were found in:
- 23% of adults (19 to 64yrs)
- 21% of adults >65yrs
- 22% of children (11 to 18yrs)
For those living at latitudes of > 40°N or 40°S, there is a greater risk of deficiency (especially in winter) due to lack of sufficient UV exposure, even when outdoors. In such cases, supplementation may be required to sustain appropriate blood levels.
As there is insufficient evidence to set an RDA, an Adequate Intake (AI) has been set at:
- 19-50 years = 5.0 µg (200 IU)/d
- 51-70 years = 10 µg (400 IU)/d
- >70 years = 15 µg (600 IU)/d
[Adequate Intake (AI) is an intake (not a requirement) that is likely to exceed the actual requirements of almost all individuals in an age, sex, and life-stage group; established when scientific evidence is not sufficient to determine an RDA.]
Note: For details on research looking at the impact of vitamin D supplementation on health outcomes, check out episode 393 of the podcast.
DHA
Although DHA (docosahexaenoic acid) is a polyunsaturated omega-3 fatty acid rather than a micronutrient, it is a nutrient worth discussing, especially in specific contexts; namely, pregnancy and vegan diets.
In addition to the importance of folate and iodine described above, the long-chain polyunsaturated fatty acids docosahexaenoic acid (DHA) and arachidonic acid (AA) are critical nutrients during pregnancy. AA levels are, however, maintained at a relatively constant level through endogenous triglyceride lipolysis, and so additional supplementation for general perinatal care is not required. However, from 20 weeks' gestation, additional DHA becomes important for the crucial period of infant brain growth through the last trimester.
To maintain sufficient DHA levels, current recommendations include consumption of a maximum of 2 servings (12 oz total) per week of low-mercury oily fish, with additional supplementation of 200-300 mg/d DHA. However, in cases where there is no direct dietary source of DHA (whether because an individual avoids fish during pregnancy or due to dietary restrictions), the advised dose would be 600 mg DHA/d from 20 weeks' gestation.
On a vegan diet, while omega-3 fatty acids can be obtained from plant foods in the form of alpha-linolenic acid (ALA), the marine omega-3s (EPA and DHA) cannot be obtained from a direct dietary source. An algae-based form of the latter may therefore be prudent, particularly for pregnancy, as conversion from plant-derived ALA is insufficient to meet DHA requirements. A dose of 500 mg DHA would equate to 1-2 g of microalgae oil.
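The oil-to-DHA conversion above can be sketched as follows. This is an illustrative calculation only: the implied DHA concentration range (~25-50%) is inferred from the 500 mg per 1-2 g figures in the text, not from any product specification.

```python
# Rough sketch of the microalgae-oil arithmetic above: if 1-2 g of algae oil
# provides ~500 mg DHA, the implied DHA concentration is ~25-50% by weight.
# The concentration values below are assumptions inferred from the text.

def algae_oil_grams(dha_mg: float, dha_fraction: float) -> float:
    """Grams of microalgae oil needed for a given DHA dose at a given DHA fraction."""
    return (dha_mg / 1000.0) / dha_fraction

print(algae_oil_grams(500.0, 0.50))  # 1.0 g of oil at 50% DHA
print(algae_oil_grams(500.0, 0.25))  # 2.0 g of oil at 25% DHA
```

In practice, the label on a given algae-oil product states its DHA content per serving, which is the figure to use.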
Conclusions: Key Points
- Due to early advances in nutrition science, diseases of single-nutrient deficiency have largely been eradicated from populations in the developed world.
- There remain concerns about insufficient intakes of a number of important micronutrients, which may differ by diet or life stage.
- Globally, the micronutrient deficiencies of most concern are iron, vitamin A, zinc, folate (vitamin B9), and iodine.
- The prevalence of vitamin A deficiency is primarily confined to the developing world.
- In the developed world, iron, folate, iodine, zinc, and vitamin D, are among the more common nutrient insufficiencies.
- These can all be addressed through diet, supplementation, or a combination thereof.
- Specific considerations are required for pregnancy and also for vegan diets, which again may require a combination of diet and/or supplementation to maintain nutritional adequacy.
Statement Primary Author: Alan Flanagan
Alan is the Research Communication Officer at Sigma Nutrition. Alan is currently pursuing his PhD in nutrition at the University of Surrey, UK, with a research focus in chrononutrition. Alan previously completed a Masters in Nutritional Medicine at the same institution.
While this article does a great job outlining the prevalence of vitamin D deficiency, it does not really discuss the indications for supplementation. Barbell Medicine recently released a podcast, and the major thrust I took away is that, except for a few very specific clinical scenarios, and although vitamin D deficiencies are associated with various disease states observationally, the interventional work mostly does not show a significant effect on cancer risk, bone health, depression, or sports performance. Would you generally agree with that summary statement?
We did a podcast a few months ago that is largely in agreement with that, yes.
Would you recommend screening everyone for deficiency in these nutrients on a regular basis?
In short, no. Especially if they have no symptoms and their dietary pattern is a healthy one.
For deeper discussion of why over-testing and over-screening is problematic, see this episode with Dr. Austin Baraki: https://sigmanutrition.com/episode334/