Chronic disease — a broad category that includes such common conditions as diabetes, cardiovascular disease, and cancer — is the leading cause of death worldwide. According to the World Health Organization, noncommunicable conditions account for two-thirds of the 57 million deaths each year and almost half of the decreased life expectancy and disability caused by diseases around the world. In the United States and the developed world, where people tend to eat energy-dense foods and lead sedentary lives, many of the leading causes of death are chronic diseases, which in many cases can be prevented. Meanwhile, developing countries struggle to address both chronic and infectious diseases within their nascent, underfunded health care systems. Present and future public health efforts must focus on strategies to combat chronic disease, including the application of genomic research.
Indeed, the mapping of the human genome seemed to put the unreachable almost within our grasp, promising genetic interventions to halt diseases at their source. It turns out, however, that while genes play a large role in disease susceptibility, it is nearly impossible to single out a specific gene sequence that causes chronic disease.
Chronic disease arises and progresses from multiple causes. Many of the risk factors — such as poor diet, lack of exercise, and smoking — can be addressed by changes in behavior. Others are changeable in theory but not necessarily under the control of the individual, such as stress, poverty, toxin exposures, and other cultural or environmental conditions. Still other important contributing causes are known as “non-modifiable risk factors” — fundamental attributes such as genetics, gender, family history, and age. It is important to note here that many behavioral risk factors and social conditions affect how genes are expressed and thus influence the progression of chronic disease through the “non-modifiable” risk factor of our genetic code. When health care professionals prescribe lifestyle modifications to stave off chronic disease, the patient should understand that these positive health behaviors, although sometimes hard to sustain, improve health by working at the genetic level.
Thus in the realm of chronic disease prevention, genomic medicine has developed in a new direction, one that is beginning to have practical applications. Nutrigenomics is the study of hereditary factors that influence a person’s response to diet — both how genes influence nutrient absorption and metabolism, and how nutrients influence gene expression. That is, we are beginning to understand in detail how the “modifiable” lifestyle risk factors for disease are linked to and expressed through the supposedly “non-modifiable” risk factor of our genes.
This field has two main lines of inquiry. The first studies how and why individuals respond variably to food and nutrient intake. For example, when a child enters a clinic with rickets — a disease in which there is impaired mineralization of growing bones, primarily due to vitamin D deficiency — the doctor will recommend increasing vitamin D through diet and supplementation. A child who is not responsive to this treatment will be diagnosed with vitamin D-resistant rickets. Attempting to determine why the metabolism of vitamin D is blocked will then guide the course of treatment. Some possible sources of impaired mineralization include defects in the vitamin D-binding protein, which allows for transport in the blood; ineffective enzymes required for activation of the vitamin; or mutations in the vitamin D receptors required to activate or repress genes that regulate processes like bone formation. A single mutation in any of these genes can result in altered enzyme activity, ineffective nutrient metabolism, or impaired use of a specific nutrient.
This area of nutrigenomics works to determine an individual’s response to nutrients, to aid in determining appropriate dietary intake. Continued research in this field may greatly advance the personalization of modern medicine, allowing health care professionals to make individualized care decisions based on the patient’s own genetic makeup.
The other branch of nutrigenomics takes the reverse approach: it seeks to understand how food intake affects gene expression. Even with the human genome now sequenced, science is still struggling to understand how gene expression is regulated. It turns out that individual characteristics are not strictly determined by the genetic code; a host of other “epigenetic” factors influence how and which genes are expressed. [For more on the findings and significance of epigenetics, see the ongoing series by Stephen L. Talbott in these pages: “Getting Over the Code Delusion,” Summer 2010; “The Unbearable Wholeness of Beings,” Fall 2010; “What Do Organisms Mean?,” Winter 2011.]
When our bodies receive inappropriate levels of certain nutrients (such as folic acid, vitamin D, zinc, and selenium, to name just a few), epigenetic processes like DNA methylation and histone modification are affected, thus altering the expression of gene sequences. Nutrients in the diet can also affect transcription factors that regulate how genes are expressed — acting directly, by binding to specific transcription factors (which is the primary mechanism of fat-soluble vitamins), or indirectly, by activating or deactivating transcription factors through signaling molecules such as insulin, glucagon, and kinases. These changes in gene expression can in turn encourage or inhibit various internal processes.
Vitamin A is an example of a fat-soluble vitamin that regulates gene expression. In the absence of retinoic acid (the active form of vitamin A), a complex of two nuclear receptors binds to specific DNA sequences of our genes. When the complex is bound to these gene sequences, the DNA remains tightly coiled and does not allow transcription to be initiated. When retinoic acid is present, however, it binds to the complex in such a way as to relax the DNA, making the gene open for transcription and translation into proteins and enzymes. These proteins and enzymes are then involved in many functions throughout the body, including vision, cell proliferation and differentiation, embryonic development, and various immune functions. When vitamin A is deficient, blindness and increased infection-related morbidity can occur, along with other deleterious health consequences. The fat-soluble vitamins (namely A, D, E, and K), when in their activated forms, can enter a cell’s nucleus and bind to nuclear receptors, thereby modifying the way these genes are transcribed. But an overabundance of these fat-soluble vitamins, although rare, can have a toxic effect: unlike water-soluble vitamins, high concentrations of fat-soluble vitamins can be stored in the body, affecting the epigenetic regulation of all genes that contain a particular sequence, potentially disrupting normal gene transcription.
As we learn more about specific nutrient-gene interactions, a new nutrigenomics “gene therapy” may evolve that uses nutrition to properly express our genes — be that dampening the proliferation of cancer cells or up-regulating immune cells for greater protection from infection. As this field progresses, there are complicated public health and ethical issues to consider. With greater insight into nutrient-gene interactions, will the government begin implementing more population-based dietary interventions? Will fortification of the food supply expand? What will be the standards for scientific evidence used to shape such policy? Will supplement manufacturers be regulated as pharmaceutical companies are?
Food fortification already occurs in bread products with niacin, thiamine, and folic acid, and in milk with vitamin D. And at the individual level, the National Health and Nutrition Examination Survey, conducted by the Centers for Disease Control and Prevention (CDC) between 2003 and 2006, found that 53 percent of Americans use some type of dietary supplement. Since many Americans are already “fortifying” their own diets, policies related to further fortification of the food supply may have unintended consequences.
Folic acid fortification is an example of a policy that has had positive outcomes as well as unintended consequences. In 1996, the Food and Drug Administration mandated that all cereal grain products be fortified with folic acid, a synthetic form of the naturally occurring B vitamin folate. Folate and folic acid are an essential source of methyl donors for DNA, RNA, protein, and lipid methylation, and thus have global effects within the body. Deficiency in folate/folic acid can cause serious and sometimes fatal nervous system problems such as neural tube defects. When the mandate was enacted, the FDA’s primary objective was to decrease the incidence of neural tube defects, with a secondary objective of decreasing cardiovascular disease and some cancers associated with folate/folic acid deficiency. According to the CDC, neural tube defects decreased 27 percent after the mandate.
But this was not the only effect. Excess folic acid can disrupt normal epigenetic regulation and promote the progression of certain cancers. As Eoin P. Quinlivan documented in the American Journal of Clinical Nutrition, the FDA predicted that folic acid consumption would increase by roughly 100 micrograms per person per day, but the actual increase was more than twice that; and with more Americans taking dietary supplements, many are surpassing the FDA’s safe upper limit of 1,000 micrograms per day. Although folic acid supplementation can decrease cancer risk in healthy cells, when precancerous cells are present, too much folic acid can increase the risk for some cancers: folic acid can help precancerous cells synthesize and replicate their DNA more rapidly. Young-In Kim of the University of Toronto has pointed to preliminary evidence of increased risk or adverse effects of folic acid supplementation for childhood leukemia, breast cancer, and colorectal cancer. As of now, it is difficult to determine whether the public health benefits of folic acid fortification outweigh these potential negative effects.
One lesson that may be drawn from this example is that human physiology is too diverse for broad-based interventions in the food supply; personalized medicine, on the other hand, may be able to deliver appropriate levels of supplementation to people, based on individualized knowledge of the needs and susceptibilities inherent in their genetic makeup. And so, as scientists continue to discover new diet-gene interactions and develop tests that may lead to novel therapies, nutrigenomics has become a promising and exciting field. The discovery of new diet-gene interactions bears out Thomas Edison’s statement, “The doctor of the future will no longer treat the human frame with drugs, but rather will cure and prevent disease with nutrition.”
But there is a deeper lesson to be drawn from the folic acid example. In this case, the actual terms of the mandate were not as important as the public perception it created: more is better. This interpretation feeds into a disconcerting mentality that pervades our culture — namely, that a healthy diet is composed of isolated, specifically targeted nutritional components.
Even beyond the always-incompletely-understood systemic effects of radically adjusting the intake of individual nutrients, nourishing our bodies ought to be about more than achieving the proper balance of specific nutritional components. Food is cultural, symbolic, emotional, and both more complex and much simpler than it seems. Fortification and supplementation should not take precedence over consuming whole, unprocessed foods like fruits and vegetables. Enjoying a well-balanced, tasty meal with family and friends may be just as effective for chronic disease prevention as developing “personalized diets” that regulate epigenetic gene expression. When we eat a variety of minimally processed foods, we provide ourselves with the energy, the essential building blocks, and the regulatory direction to maintain and protect cells, tissues, organ systems, and our whole body from disease.
Likewise, no amount of individualized diet-tweaking can substitute for overall healthy behaviors. Even within the realm of genetic research and nutrition, paying attention to the interactions between lifestyle, food intake, and gene expression will be crucial in battling chronic disease and promoting general well-being. Health care professionals and the public must understand that chronic diseases arise from many factors; so even if a research team discovers a few specific genes responsible for a disease, attaining health will still require maintaining a healthy lifestyle.
Translating biological discoveries and tests into meaningful clinical applications will require much time and money, and hastening the process may have unintended consequences, as seen in the folic acid example. Health care professionals will have an important role to play in interpreting the implications of genetic research for their patients. If a patient is shown genetically to have an increased risk for a particular disease, will this encourage him or her to make positive lifestyle changes? Or conversely, will a protective genetic test result lead to a slackening of positive health behaviors? How we allow our understanding of nutrient-gene interactions to drive medicine and shape our behaviors is up to us. We can take comfort in the fact that the science of nutrigenomics seems to be confirming what common sense has held all along: as Hippocrates said, “Let thy food be thy medicine and thy medicine be thy food.”
Health Food and the Double Helix