Neurons Anticipate Body’s Response to Food and Water

Summary: Findings provide a new insight into how the brain regulates food and water intake.

Source: BIDMC.

Using leading-edge technology, neuroscientists at Beth Israel Deaconess Medical Center (BIDMC) gained new insight into the brain circuitry that regulates water and food intake. In a new study, the team of researchers monitored the activity of the neurons that secrete a hormone in response to ingesting food and water. In their paper, published online today in Neuron, the researchers demonstrated that a subset of neurons starts to prepare the body for an influx of water in the seconds before drinking begins. These neurons help regulate intake by anticipating the effects of drinking from the “top down,” rather than taking cues from the body.

“This study supports the view that when we suddenly detect the availability of food or water, our body starts to prepare itself within seconds for the upcoming bout of eating or drinking,” said co-corresponding author, Mark Andermann, PhD, Assistant Professor of Medicine in the Division of Endocrinology, Diabetes and Metabolism at BIDMC. “We predict that deficits in this ‘top-down’ control could lead to overshoots in eating or drinking, with many negative consequences.”


Andermann and colleagues, including co-corresponding author, Bradford B. Lowell, MD, PhD, a Professor of Medicine in the Division of Endocrinology, Diabetes and Metabolism at BIDMC, recorded the activity of neurons responsible for releasing the anti-diuretic hormone vasopressin in mice. Vasopressin plays a crucial role in regulating the body’s relative concentration of water versus salt, which eating or drinking could otherwise dramatically alter.

“It’s critical to survival that the body has ways to prevent the water concentration outside of cells from changing,” said Lowell. “Anticipating the future consequences of ingesting water helps the body get a head-start on managing water balance. The form of rapid, top-down control of this process that we discovered is one important way of managing it.”

In their experiments, Andermann and Lowell watched as the activity of vasopressin-releasing neurons rapidly decreased – within seconds – when water was presented to water-restricted rodents, before they even drank it. In contrast, activity in these neurons increased – again, within seconds – but only once the animals began eating; the sight and smell of food alone had no effect. That difference in timing suggested that separate neural networks regulate these reactions to water and to food.

“This type of rapid regulation was not known to exist and has only been discovered in the last year for hunger neurons and for vasopressin neurons,” said Lowell. “It likely occurs for all forms of homeostatic control. It’s interesting to speculate whether there are individuals out there who have abnormalities in this kind of top-down control.”

“By the same token, we may one day learn that enhancing this top-down control might be a way of regulating meal size without interfering with baseline appetite or with the pleasure of taking the first bite of something delicious,” Andermann said, adding their high-tech methodology will allow them to further investigate the neurons directly “upstream” of the vasopressin neurons. “Because we can now monitor and manipulate the activity of specific sets of neurons, we’re getting closer to being able to directly test these hypotheses and working toward strategies to improve human health.”


Study coauthors include Yael Mandelblat-Cerf, PhD; Angela Kim (undergrad); Christian R. Burgess, PhD; Siva Subramanian (undergrad); Bradford Lowell, PhD; and Mark Andermann, PhD, all of the Division of Endocrinology, Diabetes and Metabolism at BIDMC; and Bakhos A. Tannous, PhD, of the Department of Neurology at Massachusetts General Hospital.

Funding: This work was supported by a Charles A. King Trust Postdoctoral Fellowship; a Davis Family Foundation Postdoctoral Fellowship; grants from the National Institutes of Health (R01 DK075632, R01 DK096010, R01 DK089044, P30 DK046200, P30 DK05752, NIH R01 DK109930, DP2 DK105570); the Pew Scholars Program in the Biomedical Sciences; the Klarman Family Foundation, and the Smith Family Foundation.

Source: Jacqueline Mitchell – BIDMC
Image Source: image is adapted from the BIDMC press release.
Original Research: Abstract for “Bidirectional Anticipation of Future Osmotic Challenges by Vasopressin Neurons” by Yael Mandelblat-Cerf, Angela Kim, Christian R. Burgess, Siva Subramanian, Bakhos A. Tannous, Bradford B. Lowell, Mark L. Andermann in Neuron. Published online December 15 2016 doi:10.1016/j.neuron.2016.11.021


Bidirectional Anticipation of Future Osmotic Challenges by Vasopressin Neurons

•Recordings from vasopressin neuroendocrine motor neurons (VPpp) in behaving mice
•Feeding, but not cues predicting food, increases VPpp neuron activity within seconds
•Drinking and cues predicting water reduce VPpp neuron activity within seconds
•Drinking-related reductions in activity reach steady state prior to systemic feedback

Ingestion of water and food are major hypo- and hyperosmotic challenges. To protect the body from osmotic stress, posterior pituitary-projecting, vasopressin-secreting neurons (VPpp neurons) counter osmotic perturbations by altering their release of vasopressin, which controls renal water excretion. Vasopressin levels begin to fall within minutes of water consumption, even prior to changes in blood osmolality. To ascertain the precise temporal dynamics by which water or food ingestion affect VPpp neuron activity, we directly recorded the spiking and calcium activity of genetically defined VPpp neurons. In states of elevated osmolality, water availability rapidly decreased VPpp neuron activity within seconds, beginning prior to water ingestion, upon presentation of water-predicting cues. In contrast, food availability following food restriction rapidly increased VPpp neuron activity within seconds, but only following feeding onset. These rapid and distinct changes in activity during drinking and feeding suggest diverse neural mechanisms underlying anticipatory regulation of VPpp neurons.

“Bidirectional Anticipation of Future Osmotic Challenges by Vasopressin Neurons” by Yael Mandelblat-Cerf, Angela Kim, Christian R. Burgess, Siva Subramanian, Bakhos A. Tannous, Bradford B. Lowell, Mark L. Andermann in Neuron. Published online December 15 2016 doi:10.1016/j.neuron.2016.11.021

People Who Eat a Varied, Healthy Diet Have the Healthiest Sleep Patterns

First nationally representative analysis reveals people who eat a varied diet have healthier sleep durations.

“You are what you eat,” the saying goes, but is what you eat playing a role in how much you sleep? Sleep, like nutrition and physical activity, is a critical determinant of health and well-being. With the increasing prevalence of obesity and its consequences, sleep researchers have begun to explore the factors that predispose individuals to weight gain and ultimately obesity.

Now, a new study from the Perelman School of Medicine at the University of Pennsylvania shows for the first time that certain nutrients may play an underlying role in short and long sleep duration, and that people who report eating a large variety of foods, an indicator of an overall healthy diet, had the healthiest sleep patterns. The new research is published online in the journal Appetite.

The image is The Fruit Vendor by painter J. W. Godward, showing a woman sleeping next to a statue of a lion and a table of fruit.

“Although many of us inherently recognize that there is a relationship between what we eat and how we sleep, there have been very few scientific studies that have explored this connection, especially in a real-world situation,” said Michael A. Grandner, PhD, Instructor in Psychiatry and member of the Center for Sleep and Circadian Neurobiology at Penn. “In general, we know that those who report between 7 – 8 hours of sleep each night are most likely to experience better overall health and well-being, so we simply asked the question: ‘Are there differences in the diet of those who report shorter sleep, longer sleep, or standard sleep patterns?’”

To answer this question, the research team analyzed data from the 2007-2008 National Health and Nutrition Examination Survey (NHANES) sponsored by the Centers for Disease Control and Prevention. NHANES includes demographic, socioeconomic, dietary, and health-related questions. The sample for the survey is selected to represent the U.S. population of all ages and demographics. For the current study, researchers used the survey question regarding how much sleep each participant reported getting each night to separate the sample into groups of different sleep patterns.

Sleep patterns were broken out as “Very Short” (less than 5 hours per night), “Short” (5–6 hours per night), “Standard” (7–8 hours per night), and “Long” (9 hours or more per night). NHANES participants also sat down with specially trained staff who went over, in great detail, a full day’s dietary intake. This included everything from the occasional glass of water to complete, detailed records of every part of each meal. With this data, the Penn research team analyzed whether each group differed from the 7–8 hour “standard” group on any nutrients and total caloric intake. They also looked at these associations after controlling for overall diet, demographics, socioeconomics, physical activity, obesity, and other factors that could have explained this relationship.
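The four-way grouping above can be sketched as a small helper function. The cutoffs follow the published ranges; how durations falling between those ranges (e.g., 6.5 hours) are bucketed is an assumption of this sketch, since the survey recorded self-reported nightly sleep.

```python
def sleep_category(hours: float) -> str:
    """Bucket nightly sleep duration into the study's four groups.

    Cutoffs follow the published ranges; assignment of in-between
    values (e.g., 6.5 h) is an assumption of this sketch.
    """
    if hours < 5:
        return "Very Short"   # less than 5 h per night
    elif hours < 7:
        return "Short"        # 5-6 h per night
    elif hours <= 8:
        return "Standard"     # 7-8 h per night
    else:
        return "Long"         # 9 h or more per night
```

For example, `sleep_category(7.5)` falls in the "Standard" reference group against which the other groups were compared.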

The authors found that total caloric intake varied across groups. Short sleepers consumed the most calories, followed by normal sleepers, followed by very short sleepers, followed by long sleepers. Food variety was highest in normal sleepers, and lowest in very short sleepers. Differences across groups were found for many types of nutrients, including proteins, carbohydrates, vitamins and minerals.

In a statistical analysis, the research team found a number of dietary differences, but these were largely driven by a few key nutrients. Very short sleep was associated with less intake of tap water, lycopene (found in red- and orange-colored foods), and total carbohydrates; short sleep was associated with less vitamin C, tap water, and selenium (found in nuts, meat and shellfish), and more lutein/zeaxanthin (found in green, leafy vegetables); and long sleep was associated with less intake of theobromine (found in chocolate and tea), dodecanoic acid (a saturated fat), choline (found in eggs and fatty meats), and total carbohydrates, and more alcohol.

“Overall, people who sleep 7 – 8 hours each night differ in terms of their diet, compared to people who sleep less or more. We also found that short and long sleep are associated with lower food variety,” said Dr. Grandner. “What we still don’t know is if people altered their diets, would they be able to change their overall sleep pattern? This will be an important area to explore going forward as we know that short sleep duration is associated with weight gain and obesity, diabetes, and cardiovascular disease. Likewise, we know that people who sleep too long also experience negative health consequences. If we can pinpoint the ideal mix of nutrients and calories to promote healthy sleep, the healthcare community has the potential to make a major dent in obesity and other cardiometabolic risk factors.”

Notes about this neurobiology research article

Other authors for Penn include Nicholas J. Jackson and Jason R. Gerstner, PhD.

This research was supported by grants from the National Institutes of Health (T32HL007713, 12SDG9180007 and P30HL101859).

Contact: Jessica Mikulski – Penn Medicine
Source: Penn Medicine press release
Image Source: The image of J. W. Godward’s “The Fruit Vendor” is available via Wikimedia Commons and is licensed as public domain.
Original Research: Abstract for “Dietary nutrients associated with short and long sleep duration. Data from a nationally representative sample” by Michael A. Grandner, Nicholas J. Jackson, Jason R. Gerstner and Kristen L. Knutson in Appetite. Published online January 19 2013 DOI: 10.1016/j.appet.2013.01.004

Scientists Find Surprising Answers to ‘Food Coma’ Conundrum

Summary: Researchers believe they may have found a reason for that all too well known holiday phenomenon, a ‘food coma’.

Source: Scripps Research Institute.

Anyone who has drifted into a fuzzy-headed stupor after a large holiday meal is familiar with the condition commonly known as a “food coma.” Now scientists from the Florida campus of The Scripps Research Institute (TSRI), Florida Atlantic University and Bowling Green State University may have finally found a reason for the phenomenon.

Until recently, there has been little more than anecdotal evidence to suggest that “food coma” is an actual physical condition — and the scientific evidence that does exist is unable to explain why some people fall asleep immediately after eating, some later and some not at all.

“Different foods play different roles in mammalian physiology, but there have been very few studies on the immediate effects of eating on sleep,” said TSRI’s Associate Professor William Ja, who led the study, published today in the online journal eLife.

Ja and his colleagues used Drosophila, the common fruit fly, as a model, due to the well-documented sleep-metabolism interaction in which flies suppress sleep or increase locomotion when starved. They created a system called the Activity Recording CAFE (ARC), a small plastic chamber that allowed them to record fly activity before and after feeding.

Researchers found that after a meal, flies increased sleep for a short period before returning to a normal state of wakefulness. Their response varied according to food intake — flies that ate more also slept more. Further investigation of specific food components showed that while protein, salt and the amount eaten promoted sleep, sugar had no effect.

“The protein link to post-meal sleep has been mostly anecdotal, too, so to have it turn up in the study was remarkable,” Ja said. “In humans, high sugar consumption provides a quick boost to blood glucose followed by a crash, so its effect on sleep might only be observed beyond the 20 to 40 minute food coma window.”

Image shows a cooked turkey.

The fact that larger-sized meals increased sleep in fruit flies may also have parallels in human behavior: it’s known that electrical activity in the brain increases with meal size and during certain stages of sleep. Salt consumption also influences sleep in mammals.

Unpublished data suggest that the “food coma” condition might be a way to maximize gut absorption of protein and salt, two nutrients that flies might prioritize or find limited in nature, Ja added.

“Using an animal model, we’ve learned there is something to the food coma effect, and we can now start to study the direct relationship between food and sleep in earnest,” Ja said. “This behavior seems conserved across species, so it must be valuable to animals for some reason.” The study also found some intriguing physiological reasons behind after-meal fly napping.

“By turning on and off neurons in the fly brain, we identified several circuits dedicated to controlling postprandial sleep,” said TSRI Graduate Student Keith Murphy, the first author of the study. “Some of these circuits responded to protein and others to circadian rhythm, demonstrating that the behavior has a diversity of inputs.”


In addition to Ja and Murphy, the other authors of the study, “Postprandial Sleep Mechanics in Drosophila,” are Sonali A. Deshpande, James P. Quinn, Jennifer L. Weissbach and Seth M. Tomchik of TSRI; Maria E. Yurgel, Alex C. Keene and Ken Dawson-Scully of Florida Atlantic University; and Robert Huber of Bowling Green State University.

Funding: This work was supported by the National Institutes of Health (grant R21DK092735), an Ellison Medical Foundation New Scholar in Aging Award and a Glenn Foundation for Medical Research Award for Research in Biological Mechanisms of Aging.

Source: Eric Sauter – Scripps Research Institute
Image Source: image is in the public domain.
Original Research: Full open access research for “Postprandial sleep mechanics in Drosophila” by Keith R Murphy, Sonali A Deshpande, Maria E Yurgel, James P Quinn, Jennifer L Weissbach, Alex C Keene, Ken Dawson-Scully, Robert Huber, Seth M Tomchik, and William W Ja in eLife. Published online November 22 2016 doi:10.7554/eLife.19334

Scripps Research Institute. “Scientists Find Surprising Answers to ‘Food Coma’ Conundrum.” NeuroscienceNews. NeuroscienceNews, 22 November 2016.


Postprandial sleep mechanics in Drosophila

Food consumption is thought to induce sleepiness. However, little is known about how postprandial sleep is regulated. Here, we simultaneously measured sleep and food intake of individual flies and found a transient rise in sleep following meals. Depending on the amount consumed, the effect ranged from slightly arousing to strongly sleep inducing. Postprandial sleep was positively correlated with ingested volume, protein, and salt—but not sucrose—revealing meal property-specific regulation.

Silencing of leucokinin receptor (Lkr) neurons specifically reduced sleep induced by protein consumption.

Thermogenetic stimulation of leucokinin (Lk) neurons decreased whereas Lk downregulation by RNAi increased postprandial sleep, suggestive of an inhibitory connection in the Lk-Lkr circuit.

We further identified a subset of non-leucokininergic cells proximal to Lkr neurons that rhythmically increased postprandial sleep when silenced, suggesting that these cells are cyclically gated inhibitory inputs to Lkr neurons.

Together, these findings reveal the dynamic nature of postprandial sleep.

“Postprandial sleep mechanics in Drosophila” by Keith R Murphy, Sonali A Deshpande, Maria E Yurgel, James P Quinn, Jennifer L Weissbach, Alex C Keene, Ken Dawson-Scully, Robert Huber, Seth M Tomchik, and William W Ja in eLife. Published online November 22 2016 doi:10.7554/eLife.19334

Anabolic and Catabolic Processes, Hormones and Exercise


During normal metabolic function, the body regularly enters a catabolic state. In contrast to an anabolic state, a catabolic state is one in which foods and nutrients are broken down into components that can later be built up and added to muscle or tissue.


Catabolic exercises are largely aerobic, meaning they consume oxygen, and help burn calories and fat. The use of oxygen is a key factor in catabolism, as oxygen is an oxidizing agent in many chemical processes. Typical catabolic/aerobic exercises are jogging, cycling, swimming, dancing or any physical activity done for at least 20 minutes at moderate intensity. Time is a major factor in getting results because after about 15–20 minutes, the body switches from using glucose and glycogen to using fat to sustain its energy requirements. That catabolic process requires oxygen. By combining aerobic and anaerobic exercises on a consistent basis, a person can use anabolic and catabolic processes to reach or maintain an ideal body weight as well as improve and sustain overall health.

Anabolic processes

Anabolic processes use simple molecules within the organism to create more complex and specialized compounds. This synthesis, the creation of a product from a series of components, is why anabolism is also called “biosynthesis.” The process uses energy to create its end products, which the organism can use to sustain itself, grow, heal, reproduce or adjust to changes in its environment. Growing in height and muscle mass are two basic anabolic processes. At the cellular level, anabolic processes can use small molecules called monomers to build polymers, resulting in often highly complex molecules. For example, amino acids (monomers) can be synthesized into proteins (polymers), much like a builder can use bricks to create a large variety of buildings.

Catabolic processes

Catabolic processes break down complex compounds and molecules to release energy. This creates the metabolic cycle, where anabolism then creates other molecules that catabolism breaks down, many of which remain in the organism to be used again.

The principal catabolic process is digestion, where nutrient substances are ingested and broken down into simpler components for the body to use. In cells, catabolic processes break down polysaccharides such as starch, glycogen, and cellulose into monosaccharides (glucose, ribose and fructose, for example) for energy. Proteins are broken down into amino acids, for use in anabolic synthesis of new compounds or for recycling. And nucleic acids, found in RNA and DNA, are catabolized into nucleotides as part of the body’s energy needs or for the purpose of healing.

The Catabolic Idea

By understanding the catabolic state within the human body, avid fitness enthusiasts can achieve their goals more easily. For example, knowing that muscles endure a breakdown phase because of hormones released during each workout, you can counteract this phenomenon by consuming high-quality nutrient sources before, during or after your exercise sessions.

In its most basic form, the catabolic process involves anything that naturally occurs in, or induces, the breakdown of larger molecules into several smaller building blocks. These separate parts eventually recombine in a process known as anabolism, which greatly benefits muscle tissue growth.

Both catabolism and anabolism work together naturally in the human body in order to maintain a healthy energy level and durable, functional muscle tissue. However, before any muscle gains the ability to benefit from these two major processes, simple scientific factors have to take their proper course.

The Catabolic Process

When food enters the body, from the very first moment, larger molecules naturally become smaller. Digestion itself is a catabolic process. Once food particles break down into smaller nutrients, the chemical bonds that once held the larger nutrient molecules together release energy through an oxidation process.

The catabolic process releases energy that helps maintain proper muscle activity. The oxidation that occurs during catabolism helps synthesize adenosine triphosphate (ATP), the cell’s chemical energy carrier. ATP molecules then allow cells to transfer energy produced during the catabolic process to anabolic processes.

In basic terms, catabolism acts as the sole energy provider for the preservation and growth of nearly all cells.

Importance of Catabolism

Aside from helping fuel the human body with energy that’s necessary to grow and function, catabolism sometimes acts as a negative process that leads to adverse health effects. This does not occur often, but when the body has an extremely high rate of catabolism, as opposed to anabolism, muscle tissue and essential fat deposits found within the body become depleted.

For example, during rest, the body tends to recover and remain in an anabolic state. When the body does not properly rest for long periods of time, as in prolonged vigorous exercise, muscle tissue will continue to break down. Without proper nutritional intake, the natural process of tissue growth and repair will not take place.

Catabolism is the set of metabolic pathways that breaks down molecules into smaller units that are either oxidized to release energy, or used in other anabolic reactions.[1] Catabolism breaks down large molecules (such as polysaccharides, lipids, nucleic acids and proteins) into smaller units (such as monosaccharides, fatty acids, nucleotides, and amino acids, respectively).

Cells use the monomers released from breaking down polymers to either construct new polymer molecules, or degrade the monomers further to simple waste products, releasing energy. Cellular wastes include lactic acid, acetic acid, carbon dioxide, ammonia, and urea. The creation of these wastes is usually an oxidation process involving a release of chemical free energy, some of which is lost as heat, but the rest of which is used to drive the synthesis of adenosine triphosphate (ATP). This molecule acts as a way for the cell to transfer the energy released by catabolism to the energy-requiring reactions that make up anabolism. (Catabolism is seen as destructive metabolism and anabolism as constructive metabolism). Catabolism therefore provides the chemical energy necessary for the maintenance and growth of cells. Examples of catabolic processes include glycolysis, the citric acid cycle, the breakdown of muscle protein in order to use amino acids as substrates for gluconeogenesis, the breakdown of fat in adipose tissue to fatty acids, and oxidative deamination of neurotransmitters by monoamine oxidase.


There are many signals that control catabolism. Most of the known signals are hormones and the molecules involved in metabolism itself. Endocrinologists have traditionally classified many of the hormones as anabolic or catabolic, depending on which part of metabolism they stimulate. The so-called classic catabolic hormones known since the early 20th century are cortisol, glucagon, and adrenaline (and other catecholamines).

In recent decades, many more hormones with at least some catabolic effects have been discovered, including cytokines, orexin (also known as hypocretin), and melatonin.

Many of these catabolic hormones express an anti-catabolic effect in muscle tissue. One study found that the administration of epinephrine (adrenaline) had an anti-proteolytic effect, and in fact suppressed catabolism rather than promoting it.[2] Another study found that catecholamines in general (the main ones being epinephrine, norepinephrine and dopamine) greatly decreased the rate of muscle catabolism.

Catabolic hormones include:

  • Adrenaline: Also called “epinephrine,” adrenaline is produced by the adrenal glands. It is the key component of the “fight or flight” response that accelerates heart rate, opens up bronchioles in the lungs for better oxygen absorption and floods the body with glucose for fast energy.
  • Cortisol: Also produced in the adrenal glands, cortisol is known as the “stress hormone.” It is released during times of anxiety, nervousness or when the organism feels prolonged discomfort. It increases blood pressure and blood sugar levels and suppresses the body’s immune processes.
  • Glucagon: Produced by the alpha cells in the pancreas, glucagon stimulates the breakdown of glycogen into glucose. Glycogen is stored in the liver and when the body needs more energy (exercise, fighting, high level of stress), glucagon stimulates the liver to catabolize glycogen, which enters the blood as glucose.
  • Cytokines: These small proteins regulate communication and interactions between cells. Cytokines are constantly being produced and broken down in the body, where their amino acids are either reused or recycled for other processes. Two examples of cytokines are interleukins and lymphokines, most often released during the body’s immune response to invasion (bacteria, virus, fungus, tumor) or injury.


Foods with very high water content, such as celery, also have a slight catabolic effect. But the nutritional value of water and celery is not high enough to properly sustain an organism, so relying solely on these foods to lose weight can lead to serious health complications.

Liver Cancer: China Has 50% of Worldwide Cases; Molds in Food

Liver cancer is the fifth most common cancer in men and the ninth in women. An estimated 782,500 new liver cancer cases occurred in the world during 2012, with China alone accounting for about 50% of the total. Rates are more than twice as high in men as in women. Liver cancer rates are the highest in Central America, West and Central Africa, and East and Southeast Asia.


Aflatoxins are among the most potent mutagenic and carcinogenic substances known. Differential potency of aflatoxin among species can be partially attributed to differences in metabolism; however, current information on competing aspects of metabolic activation and detoxification of aflatoxin in various species does not identify an adequate animal model for humans. Risk of liver cancer is influenced by a number of factors, most notably carriage of hepatitis B virus as determined by the presence in serum of the hepatitis B surface antigen (HBsAg+ or HBsAg-). About 50 to 100% of liver cancer cases are estimated to be associated with persistent infection of hepatitis B (or C) virus.

The potency of aflatoxin in HBsAg+ individuals is substantially higher (about a factor of 30) than the potency in HBsAg- individuals. Thus, reduction of the intake of aflatoxins in populations with a high prevalence of HBsAg+ individuals will have a greater impact on reducing liver cancer rates than reductions in populations with a low prevalence of HBsAg+ individuals. The present analysis suggests that vaccination against hepatitis B (or protection against hepatitis C), which reduces the prevalence of carriers, would reduce the potency of the aflatoxins in vaccinated populations and reduce liver cancer risk.
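The prevalence argument can be made concrete with a simple mixing model. This is an illustrative sketch, not the analysis from the text: it assumes population-average potency is a prevalence-weighted blend of the roughly 30-fold potency in HBsAg+ carriers and baseline potency in non-carriers, and the example prevalences (10% and 5%) are invented for the demonstration.

```python
def population_potency(hbsag_prevalence: float, potency_ratio: float = 30.0) -> float:
    """Population-average aflatoxin potency, relative to an HBsAg-negative
    individual (potency 1.0). The ~30x carrier ratio comes from the text;
    the linear prevalence-weighted mixing is an illustrative assumption."""
    p = hbsag_prevalence
    return p * potency_ratio + (1.0 - p) * 1.0

# Halving carrier prevalence (e.g., via hepatitis B vaccination)
# from a hypothetical 10% to 5% sharply lowers average potency:
before = population_potency(0.10)  # 3.9
after = population_potency(0.05)   # 2.45
```

Under this toy model, the same aflatoxin reduction buys more cancer-risk reduction where carrier prevalence is high, which is the qualitative point the analysis makes.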

This research has led to potential chemopreventative strategies for liver cancer in populations at high risk for aflatoxin exposure. “Aflatoxin is a very lipid soluble molecule so that when we ingest it, it’s rapidly absorbed. And it goes first to the liver where there are enzymes that will chemically biotransform it into a very reactive chemical, which attacks with very high preference, our DNA, causing damage to that DNA, mutations, perhaps in genes that enhance our susceptibility to cancer production. Aflatoxin is also a very cytotoxic molecule so it will directly kill some of our liver cells creating a void, if you will, that causes the remaining liver cells to replicate and perhaps grow at a faster rate than we would like. The combination of DNA damage and cell proliferation triggers the liver cancer process.”

Mold-contaminated crops can be a serious problem especially in countries where proper storage facilities are limited.

Professor Kensler explains their clinical trials in which chlorophyllin was administered as a therapy and the resultant levels of aflatoxin DNA damage products present in urine samples.


It has been estimated that the DNA in each cell of the body suffers 10,000 oxidative hits per day, leading to the formation of more than 20 different oxidative DNA lesions.108 Human studies show lifestyle and other environmental influences may profoundly modify outcomes of aging.109,110

It is not just environmental toxins (e.g., cigarette smoke, coal dust, and diesel emission particles) that pose a concern. Foods cooked at high temperatures also inflict cellular damage. Deep-fried foods along with well-done beef steak, hamburgers, and bacon cause the formation of gene-mutating heterocyclic amines.111,112 Even so-called healthy foods contain small amounts of undesirable substances.113

Chlorophyllin has been shown to have DNA-protective and antioxidant properties, inhibiting DNA adduction.101,114-116 Chlorophyllin also quenches all major oxygen species and acts to protect mitochondria.117-119 Chlorophyllin also has a role in preventing unavoidable dietary exposure to aflatoxin, a naturally occurring mycotoxin, by reducing its oral bioavailability.120



Cancer results from a diseased genome, incited by environmental toxins

Cancer results from a diseased genome. Each tumor contains a collection of genomic aberrations that activate oncogenes and inactivate tumor suppressor genes.

A recent survey of the scientific literature identified 229 oncogenes (or “dominant” cancer genes) and 62 tumor suppressors (“recessive” cancer genes), suggesting that more than 1% of the human genome may contribute directly to carcinogenesis and/or tumor progression (Futreal 2004).

Since many tumor mechanisms likely remain undiscovered, these numbers may underestimate the full spectrum of human cancer genes.

Moreover, the path to cancer may require at least 5–10 genetic mutations (Hahn and Weinberg 2002).

Theoretically, then, the total number of different genetic combinations possible across all human cancers exceeds ten trillion and may even reach 10^18.

These estimates imply that a comprehensive genomic approach to cancer therapeutics may be exceedingly difficult to achieve.
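The scale of these estimates can be checked with a back-of-the-envelope calculation. As a rough sketch (an assumption for illustration: each tumor is treated as an unordered set of k mutated genes drawn freely from the ~291 cancer genes of the Futreal 2004 survey, ignoring mutation order and biological co-occurrence constraints), the count is a simple binomial coefficient:

```python
from math import comb

# Candidate cancer genes from the Futreal (2004) survey cited above
oncogenes = 229
tumor_suppressors = 62
genes = oncogenes + tumor_suppressors  # 291 genes

# If a tumor requires k mutated genes (Hahn and Weinberg 2002: k = 5..10),
# count the unordered k-gene combinations from the pool.
for k in range(5, 11):
    print(f"k={k:2d}: C({genes},{k}) = {comb(genes, k):.3e}")
# Already at k = 7 the count exceeds ten trillion (10**13),
# and at k = 10 it is on the order of 10**18.
```

This naive counting overstates the biologically realizable diversity, but it illustrates why the raw combinatorics look so daunting before the "hallmarks" and "dependency" arguments below are taken into account.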

Recent insights, however, suggest a more favorable conclusion:

The enormous complexity possible in theory may indeed prove both functionally reducible and therapeutically tractable in practice.

Among these is the recognition that most human cancers derive from perturbations within a finite number of fundamental physiological processes directing cellular proliferation, survival, angiogenesis, and invasion/metastasis (Hanahan and Weinberg 2000).

By itself, this conceptual framework does not completely resolve the challenge of tumor complexity, because many diverse genetic players and mutation chronologies may affect each of these properties.

Nonetheless, the notion that cancer involves definable biological hallmarks suggests that, ultimately, logic and order may be discerned from the immense genomic diversity characteristic of human cancer once the appropriate molecular contexts are more fully understood.

Consistent with this viewpoint is the recognition that cancer genomic aberrations, although complex, do not occur randomly. Instead, a relatively small number of cancer genes tend to undergo alterations at high frequencies.

The fact that cellular pathways involving RAS, p53, and pRb (among others) undergo genetic mutations so commonly (Vogelstein and Kinzler 2004) not only endorses the “hallmarks of cancer” model, but also suggests that cancers tend to employ the same genomic alterations to enact these processes. Thus, despite the inevitable complexity, an increased knowledge of cancer genomic alterations should contribute markedly to the elaboration of essential and broadly applicable tumor mechanisms.


Another pivotal insight pertaining to deconvolution of cancer genomic complexity derives from the recent observation that some tumors require continued activity of a single activated oncogene for survival (Weinstein 2002).

Termed “oncogene addiction,” this phenomenon was first demonstrated in transgenic mouse models that enabled conditional overexpression of oncogenes such as myc, ras, and bcr-abl (Chin et al. 1999; Felsher and Bishop 1999; Huettner et al. 2000; Jain et al. 2002; Pelengaris et al. 2002). In these models, induction of the relevant oncogene triggered cancer formation; however, subsequent loss of oncogene expression resulted in regression and apoptosis of tumor cells. The presence of oncogene addiction in human malignancies was first demonstrated in chronic myelogenous leukemia (CML), which harbors the BCR-ABL translocation; and in gastrointestinal stromal tumors (GIST), which contain oncogenic mutations in the c-kit gene. Targeting the tyrosine kinase activity of these oncogenes with the small-molecule inhibitor imatinib was sufficient to induce complete remissions in the great majority of patients (Druker et al. 2001; Demetri et al. 2002; Kantarjian et al. 2002). More recently, oncogene addiction was also demonstrated in a subset of lung cancers that contain base mutations or small deletions in the epidermal growth factor receptor (EGFR) gene; these alterations confer sensitivity to EGFR inhibitors such as gefitinib or erlotinib (Lynch et al. 2004; Paez et al. 2004). Thus, a single oncogenic lesion may play a decisive role in tumor maintenance, even when many additional genetic alterations have also accrued (Kaelin 2004).

A synthesis of these observations with the oncogene addiction concept suggests that the massive apparent genetic complexity of cancer may be underpinned by a much smaller collection of critical “dependencies” operant in human tumors. By this view, the predicted tumor-promoting effects of many genomic perturbations may converge onto a finite number of physiological processes, which in turn exhibit an even smaller set of limiting “nodes” or “bottlenecks” within key cellular pathways directing carcinogenesis.


This gene encodes a transcription factor that contains both basic helix-loop-helix and leucine zipper structural features. It regulates the differentiation and development of melanocytes and the retinal pigment epithelium, and is also responsible for pigment cell-specific transcription of the melanogenesis enzyme genes. Heterozygous mutations in this gene cause auditory-pigmentary syndromes, such as Waardenburg syndrome type 2 and Tietz syndrome. Alternatively spliced transcript variants encoding different isoforms have been identified. [provided by RefSeq, Jul 2008]

Tumors arise from the pigment cells (melanocytes)

GeneCards Summary for MITF Gene

MITF (Microphthalmia-Associated Transcription Factor) is a protein-coding gene. Diseases associated with MITF include Tietz albinism-deafness syndrome and Waardenburg syndrome, type 2A. Among its related pathways are IL6-mediated signaling events and Transport to the Golgi and subsequent modification. GO annotations related to this gene include transcription factor activity, sequence-specific DNA binding and RNA polymerase II core promoter proximal region sequence-specific DNA binding. An important paralog of this gene is TFE3.

UniProtKB/Swiss-Prot for MITF Gene


Transcription factor that regulates the expression of genes with essential roles in cell differentiation, proliferation and survival. Binds to symmetrical DNA sequences (E-boxes, 5'-CACGTG-3') found in the promoters of target genes, such as BCL2 and tyrosinase (TYR). Plays an important role in melanocyte development by regulating the expression of tyrosinase (TYR) and tyrosinase-related protein 1 (TYRP1). Plays a critical role in the differentiation of various cell types, such as neural crest-derived melanocytes, mast cells, osteoclasts and optic cup-derived retinal pigment epithelium.

Source: Cold Spring Harbor Symposia on Quantitative Biology, Volume LXX. © 2005 Cold Spring Harbor Laboratory Press 0-87969-773-3.

On chemicals and cancer, an eminent German oncologist says that cancer is caused by environmental toxins, and others agree. While there are obviously other factors (fungi, viruses, genetics, etc.), the major change in the world that could have led to the explosion of cancer over the last 100 years has been the introduction of tens of thousands of chemicals into the environment: chemicals we had never been exposed to before, and that our bodies don’t know how to handle. The link between toxic chemicals and cancer becomes clearer the longer we are surrounded by them.

Dr. Tuttle: Most of the time we don’t know what causes a specific patient’s thyroid cancer. The only well-accepted risk factor for the common types of thyroid cancer (papillary and follicular) is exposure to ionizing radiation, whether from nuclear reactor fallout (as after the Chernobyl accident), atomic bombs, or therapeutic uses of radiation during young childhood. However, since the incidence of thyroid cancer has dramatically increased over the last 20 years, both in the United States and abroad, many investigators are re-examining the possibility that some environmental factor may be linked to the rise. But as of now, no specific chemical or environmental factor has been demonstrated to commonly cause thyroid cancer in humans.