Micronutrient Deficiency Conditions: Global Health Issues
Recommended citation: TH Tulchinsky. Micronutrient deficiency conditions: global health issues. Public Health Reviews 2010;32:243-255.
Abstract
Micronutrient deficiency conditions are widespread among 2 billion people in developing and in developed countries. These are silent epidemics of vitamin and mineral deficiencies affecting people of all genders and ages, as well as certain risk groups. They not only cause specific diseases, but act as exacerbating factors in infectious and chronic diseases, greatly impacting morbidity, mortality, and quality of life. Deficiencies in some groups of people at special risk require supplementation, but the most effective way to meet community health needs safely is by population-based approaches involving food fortification. These complementary methods, along with food security, education, and monitoring, are challenges for public health and for clinical medicine. Micronutrient deficiency conditions relate to many chronic diseases, such as osteoporosis, osteomalacia, thyroid deficiency, colorectal cancer, and cardiovascular diseases. They also increase the severity of infectious diseases, such as measles, HIV/AIDS, and tuberculosis. Fortification has a nearly century-long record of success and safety, proven effective for prevention of specific diseases, including birth defects. Understanding the pathophysiology and epidemiology of micronutrient deficiencies, and implementing successful methods of prevention, both play a key part in the New Public Health, as discussed in this section, citing the examples of folic acid, vitamin B12, and vitamin D.
Keywords: Micronutrient deficiency conditions, global health, folic acid, vitamin D, vitamin B12, deficiency
Theodore H. Tulchinsky, MD, MPH - Braun School of Public Health and Community Medicine, Hebrew University-Hadassah, Hadassah Ein Karem, Jerusalem, Israel.
Micronutrient deficiencies (MNDs) are of great public health and socioeconomic importance worldwide. They affect low-income countries most heavily, but are also a significant factor in the health problems of industrialized societies, with impacts on wide vulnerable groups in the population, including women, children, the middle-aged, and the elderly. They affect all populations in Europe, and more severely those in the transition countries of Central and Eastern Europe (CEE), the former Soviet Union, and the Central Asian Republics (CAR). They contribute significantly to the chronic diseases that are the major causes of morbidity and mortality in these countries.
The World Health Organization (WHO) considers that more than 2 billion people worldwide suffer from vitamin and mineral deficiencies, primarily iodine, iron, vitamin A and zinc, with important health consequences.1
In 2006, WHO published a landmark document entitled Guidelines for Food Fortification with Micronutrients, and introduced the publication as follows:
“Interest in micronutrient malnutrition has increased greatly over the last few years. One of the main reasons for the increased interest is the realization that micronutrient malnutrition contributes substantially to the global burden of disease…. In addition to the more obvious clinical manifestations, micronutrient malnutrition is responsible for a wide range of non-specific physiological impairments, leading to reduced resistance to infections, metabolic disorders, and delayed or impaired physical and psychomotor development. The public health implications of micronutrient malnutrition are potentially huge, and are especially significant when it comes to designing strategies for the prevention and control of diseases such as HIV/AIDS, malaria and tuberculosis, and diet-related chronic diseases.” 2
This WHO publication goes on to emphasize that micronutrient malnutrition is not, as was widely assumed, only a problem of developing countries. WHO defines food fortification as the practice of deliberately increasing the content of an essential micronutrient, i.e., vitamins and minerals (including trace elements) in a food, in order to improve the nutritional quality of the food supply and provide a public health benefit with minimal health risk.
THE GLOBAL SCOPE OF MICRONUTRIENT DEFICIENCY CONDITIONS
Globally, the problem is enormous. Micronutrient deficiencies are not always clinically apparent, nor always dependent on food supply and consumption patterns. They are associated with physiologic effects that can be life-threatening or, more commonly, damaging to optimal health and functioning. Iron deficiency is the most prevalent nutrition problem in the world. Folic acid deficiency remains responsible for excess birth defects, and many other micronutrient deficiencies affect populations at risk from growing obesity and poor habits of physical exercise. Vitamin D deficiency, once pandemic among children in the industrialized countries, is again extremely widespread; it can lead to osteoporosis and bone fractures, which may be life-threatening or leave an elderly person permanently handicapped, reducing both length and quality of life.
Individual medical care or nutrition counseling cannot always deal effectively with micronutrient deficiencies of such widespread proportions. Some are intrinsic to current dietary patterns and some result from life situations determined by place of residence, religious practices, and recreational activities. They can directly affect the severity of communicable diseases, as in the case of HIV, tuberculosis, and measles, and can greatly affect quality of life.2
The responses to these challenges, as discussed in the section by Harrison, include dietary diversification, food fortification, and direct supplements to specific risk groups, such as women, infants, children, the middle-aged, and the elderly. Policies and programs to address micronutrient deficiencies depend on public health leadership and understanding of the vital role this issue plays in the policies of the New Public Health. For example, the Dietary Reference Intakes established by the Food and Nutrition Board of the US Institute of Medicine are widely used internationally. The major micronutrient malnutrition issues affecting populations in developed and developing countries addressed in the WHO Guidelines are shown in Table 1.
Table 1. Micronutrient Deficiency Conditions and Their Worldwide Prevalence
Source: Adapted from Allen L et al., Table 1.2, pp. 6-10.2
In this section, we include three articles on the vital topic of the silent epidemic of micronutrient deficiency conditions and their public health importance, addressing public health nutrition policy and the cases of folic acid, vitamin B12, and vitamin D.
Other articles in this issue, on chronic diseases,6 infectious diseases,7 public health in the US,8 and global health,9 also address micronutrient deficiencies and their prevention as fundamental to the New Public Health.
These topics are included in this introductory issue of Public Health Reviews because they currently receive wide attention in public health, with an array of successful interventions (fortification, supplementation, and food-based strategies) for alleviating these conditions as an essential element of the New Public Health.
The earliest recognized clinical trial in micronutrient deficiency was conducted by James Lind on sailors aboard the HMS Salisbury, leading to his famous report on scurvy in 1753.10 His findings eventually led to the routine daily issue of lime juice to British sailors, who became known as “limeys.” This was followed over the next century by advances in scientific knowledge of the importance of iron and iodine in health. In the 1880s, Kanehiro Takaki demonstrated that dietary changes eradicated beriberi among Japanese sailors; Christiaan Eijkman (Nobel Prize, 1929), working in Java, then identified dietary factors as the cause of polyneuritis in chickens and neuropathy in humans.11 The term “vitamins” (derived from “vital amines”) was coined in 1912 by Casimir Funk and has been the subject of great scientific and public health advances, as well as common discussion and professional controversy, ever since.12
In the early decades of the twentieth century, an epidemic of pellagra was investigated by Joseph Goldberger of the U.S. Public Health Service (USPHS) and determined to be a nutritional deficiency, not an infectious disease as widely thought.13 The pellagra epidemic in the southern US in the 1920s cost thousands of lives and finally disappeared after fortification of flour with B vitamins was implemented by state law in many southern states, as dramatically shown in Figure 1.14,15
Fig. 1. Number of reported pellagra deaths, by sex of decedent and year – United States, 1920-1960.
Used by permission. © American Journal of Clinical Nutrition, American Society for Clinical Nutrition. Source: Centers for Disease Control. Achievements in public health, 1900-1999: Safer and healthier foods.15
In 1917, many recruits to the U.S. Army were rejected due to goiter, leading to investigations and the determination that iodizing salt would be the best approach to the problem. Fortification of salt with iodine was introduced in Switzerland in 1923 to prevent goiter and cretinism, and in the United States a year later. This strategy was later adopted by the WHO as a global public health measure of the highest importance.16 Despite nearly a century of use of iodine to prevent cretinism, goiter, and other manifestations of lack of iodine in soil and basic foods, iodine deficiency is still widespread in Europe and globally. An estimated 2 billion individuals have insufficient iodine intake; South Asia and sub-Saharan Africa are particularly affected, while about 50 percent of Europe remains mildly iodine deficient.17
In 1941, President Franklin D. Roosevelt held a nutrition conference in the White House, which concluded that fortification of basic foods was the best way to prevent silent malnutrition; the recommendations were implemented throughout the US, as well as in Great Britain and Canada.18,19 In the postwar period, however, fortification became less well regulated in Canada and Britain, where vitamin fortification was no longer enforced due to the seeming disappearance of clinical rickets.
In the 1990s, the issue again came to the forefront of public health policy when the UK Medical Research Council confirmed that folic acid taken before pregnancy prevented the majority of neural tube birth defects. However, giving supplements to all women capable of becoming pregnant achieved compliance of no more than one-third of the population at risk.20,21 As a result, the U.S. Food and Drug Administration mandated the addition of folic acid to the required fortificants in “enriched flour”. Canada (where fortification became mandatory in 1998), Chile, and many other countries followed suit, with mandatory fortification of flour with folic acid becoming the common approach.
The US Centers for Disease Control and Prevention reported in 2008 that the number of countries practicing flour fortification rose from 33 in 2004 to 54 in 2007. Most countries fortify with iron and folic acid, and many include thiamin, riboflavin, and niacin as well; the number of protected persons increased by 540 million in just three years. Regionally, coverage reached 97 percent of the population in the Americas and rose from 5 to 44 percent in the Middle East, from 26 to 31 percent in the African Region, and from 16 to 21 percent in the Southeast Asia Region. However, as of 2007, only 6 percent of the population in the European Region and 4 percent in the Western Pacific Region were protected by a combination of fortificants.22
Discussion of food fortification as an important public health issue in Europe is limited by national and European Union free-trade complexities.23,24 Folic acid fortification, although practiced on a voluntary basis by some food manufacturers, had not been made mandatory in any European country as of November 2009.25 The United Kingdom Food Standards Agency (UKFSA) recommended mandatory fortification of flour in 2007, but this was delayed by the Chief Medical Officer with instructions to review the evidence. The UKFSA renewed its recommendation in October 2009,26 but the matter has been referred to the political level and no decision has yet been made.
During the past decade, much attention has been given to vitamin D deficiency in the nutrition, medical, and public health literature. A growing body of evidence points to vitamin D deficiency in high percentages of the world population, associated with increased risk of osteoporosis, cardiovascular diseases, cancer, and other chronic conditions such as asthma. Specific deficiency conditions are reflected in the emergence of clinical rickets among breastfed infants of dark-skinned migrants living in northern latitudes, and in deficiency among youngsters spending more and more time in front of computer screens instead of outside on soccer fields.3 In 2008, the American Academy of Pediatrics (AAP) reviewed with concern the growing body of evidence of widespread vitamin D deficiency among infants, children, and adolescents despite longstanding fortification of milk with vitamin D. The AAP recommended raising vitamin D supplementation to 400 International Units per day and widened the recommendation to include all infants, children, and adolescents, based on historical evidence of the effectiveness and safety of supplementation.27
Dietary Reference Intakes (DRI) are the most recent set of dietary recommendations established by the Food and Nutrition Board of the Institute of Medicine, 1997-2001.28,29 These standards are under continuing review and periodic revision as the cumulative evidence base and body of knowledge evolves. The issues in food fortification are complex, often politically charged, and under continuing surveillance. Resistance to food fortification continues, with claims that it is costly, interferes with individual choice, and could potentially overdose the population with possibly harmful substances. The long delay in adoption of folic acid fortification of flour in Europe is a case in point.
Vitamin supplementation is needed for infants and children up to adolescence, women of childbearing age (iron, folic acid), middle-aged women (vitamin D), the elderly (vitamin D), patients in chronic care facilities (vitamin D), and patients with chronic diseases (e.g., HIV/AIDS), among others.30 Food supplements and fortification are used to supplement dietary intake and ensure adequate amounts of nutrients vital to a healthy life.31
Common foods that are frequently fortified are shown in Table 2, adapted from the US Dietary Supplement Fact Sheet of the Office of Dietary Supplements of the National Institutes of Health.32
This section on micronutrient deficiency conditions presents authoritative reviews with recommendations on public health nutrition policies, and the cases of folic acid, vitamin B12 and vitamin D. These issues are important for policy makers, as well as public health teachers and practitioners, and most certainly for students.
Table 2. Widely Used Fortified Foods
Adapted from: Office of Dietary Supplements, National Institutes of Health. Vitamin D and healthful diets. Dietary Supplement Fact Sheet. Available from URL: http://dietary-supplements.info.nih.gov/factsheets/vitamind.asp (Updated 13 November, 2009; accessed 9 March, 2010).
Public health nutrition addresses individual and population health needs, including those of groups at high risk for micronutrient deficiency conditions. A comprehensive food policy includes food security and distribution, with special emphasis on the elderly and low-income populations. Food fortification is necessary in both developed and developing countries to ensure essential nutrients in processed foods, improving their suitability for human nutrition. In conjunction with this, regulation of fortification is important to avoid risks due to “promiscuous” fortification.
Vitamin and mineral fortification and supplementation policies need to be promoted as the epidemiologic, nutritional, and sociological scientific basis of human nutrition expands, specifically addressing widespread deficiencies of micronutrients essential for individual and population health. Other strategies, such as preventive supplementation and the designation of foods suitable for fortification, should be mandated under governmental responsibility for safe and healthful food products. These should be regulated, and sometimes subsidized, to prevent micronutrient deficiencies among the many groups at special risk and their associated negative health and societal effects.
From these conclusions, a number of recommendations arise for national governments and international agencies, as well as for public health practitioners, teachers, and policy makers. Their importance is measured by the potential positive impact on a nation’s health, achieved safely and at low cost. They should be integral to active governmental leadership in a wider context of public health nutrition policies, including agricultural practices and subsidies, widely promoted and involving the food industry in the manufacturing, processing, and marketing of food. Food fortification and vitamin/mineral supplementation have a long tradition in public health practice, and as the scientific and evidence base grows, they will continue to evolve as a vital element of the New Public Health in the 21st century.33
Recognition of and attention to micronutrient deficiencies as a group of important public health issues are vital to preventing disease and promoting health. The science and epidemiology of the key role of micronutrients and their hidden effects are continuously evolving, but their central place should be clear in research, teaching, practice, and policy in the New Public Health.
Conflicts of interest: None declared.