The average American consumes somewhere between two and three pounds of sugar each week. Over the last twenty years, our national sugar consumption exploded from 26 pounds to 135 pounds of sugar—per person—annually. Compare that to sugar consumption in the late 1800s, when the average was five pounds per person per year. That was a time, incidentally, when heart disease and cancer were virtually unknown.

While your brain requires a fairly constant supply of glucose (blood sugar) to function properly, constantly eating refined sugars and slurping down sodas is not the best way to supply it. On the contrary, researchers at the Salk Institute in California found that high glucose levels resulting from quick, easy sugar intake slowly but surely damage cells everywhere in the body, especially those in the brain.

Unfortunately, having too little glucose and having too much glucose are both problematic. When your blood sugar levels drop, your hypothalamus sends out a distress signal that triggers the release of adrenaline, which orders your liver to convert stored glycogen into glucose.

When you consume too much sugar, your pancreas secretes insulin to nudge that extra sugar into your cells, and too much insulin can deplete your normal glucose levels, depress your immune system, and lead to kidney disease.

Excess insulin also promotes fat storage, which sets up a vicious cycle. Either extreme can leave you feeling woozy, nervous, fatigued, and shaky.

Two additional reasons why excess refined sugar is detrimental to your brain:

  1. A research group at the University of Wisconsin found that the brain may react to excess refined sugars in food as if they were a virus or bacterium. The resulting immune response may cause cognitive deficits such as those associated with Alzheimer’s disease.
  2. High blood sugar coupled with performing a mentally challenging task is associated with high levels of cortisol—a stress hormone known to impair memory.

In other words, that second piece of cake at the company birthday party might stress out you, your body, and your brain . . . and affect your afternoon work efficiency!

Your Brain on Sugar

It’s pretty clear—excessive glucose in the form of refined sugar can be very detrimental to your brain, ultimately affecting your attention span, your short-term memory, and your mood stability. Excessive refined sugar can:

  • Block membranes and thereby slow down neural communication.
  • Increase free radical inflammatory stress on your brain. Free radicals can rupture cells.
  • Interfere with synaptic communication.
  • Cause neurons to misfire and send erroneous messages that take time and energy to sort out.
  • Increase delta, alpha, and theta brain waves, which make it harder to think clearly.
  • Eventually damage your neurons.

Is There Such a Thing as Healthy Sugar?

Not really . . . a simple sugar is a simple sugar. However, sugars occurring in real food, such as fructose in fruit and lactose in milk, come packaged with other nutrients, so they are slightly healthier than other sugars. And even though health food stores love to promote honey, molasses, maple syrup, or agave as natural sweeteners, they are still simple sugars, with the same fattening calories and as little nutritive value as refined white sugar. They do, however, tend to be a tad sweeter, so maybe you’ll be happier with a smaller amount, but don’t kid yourself about them being healthier. Sugar is sugar, and you need to limit how much you consume on a daily basis.

Go Light on the Honey, Honey

Although honey is a natural sweetener, 96 percent of honey consists of the simple sugars fructose, glucose, and sucrose. Honey also has the highest calorie content of all sugars with 65 calories per tablespoon, compared to the 48 calories per tablespoon found in table sugar. The increased calories are bound to cause increased blood serum fatty acids, as well as weight gain, on top of the risk of more cavities.

Why Soda Crashes and Burns Your System

Your brain uses 65 percent of your body’s glucose, but too much or too little glucose can have a detrimental effect on brain function. One can of soda contains about 10 teaspoons of table sugar, all of which floods into a bloodstream that typically contains a total of 4 teaspoons of blood sugar. The rush alerts your pancreas to release a lot of insulin. Some sugar is quickly ushered into the cells, including brain cells, and the rest goes into storage or into fat cells. An hour later, your blood sugar may fall dramatically, and these rapid swings produce symptoms of impaired memory and clouded thinking.
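As a rough check on those figures, here is a small back-of-the-envelope calculation. The 39 grams of sugar per 12-ounce can and the 4.2 grams per teaspoon are typical values I am assuming, not numbers from the text; only the 4 teaspoons of circulating blood sugar comes from the paragraph above.

```python
# Rough check of the "10 teaspoons per can" figure.
SUGAR_PER_CAN_G = 39.0      # assumed: typical 12-oz regular soda
GRAMS_PER_TEASPOON = 4.2    # assumed: granulated sugar

teaspoons_per_can = SUGAR_PER_CAN_G / GRAMS_PER_TEASPOON
print(f"Sugar in one can: ~{teaspoons_per_can:.1f} teaspoons")

# The text puts total circulating blood sugar at about 4 teaspoons,
# so one can delivers roughly this multiple of what is already in the blood:
CIRCULATING_TEASPOONS = 4.0
print(f"Ratio to circulating blood sugar: ~{teaspoons_per_can / CIRCULATING_TEASPOONS:.1f}x")
```

Running this gives roughly 9.3 teaspoons per can, consistent with the "10 teaspoons" figure quoted above.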


In fact, we have studies dating back to the 1970s and even earlier showing that the entire cycle of sugar and carbohydrate addiction is driven by a deficiency of serotonin. Serotonin, known as the “happy” neurotransmitter, can only be made in the brain from the amino acid tryptophan, which comes from dietary protein. Because sugar does nothing to replenish depleted serotonin, it’s hard to break the addiction cycle.

The natural food supplements L-tryptophan and 5-HTP provide the brain with more of the raw material it needs to make serotonin. The same studies mentioned above also showed that when people take these supplements in appropriate doses, their cravings abate and their consumption of carbohydrates and overall calories decreases. (J Pharm Pharmacol 1975 Jan;27(1):31-7; Brain Res Bull 1986 Nov;17(5):681-9; Pharmacol Biochem Behav 1986 Oct;25(4):711-6; J Neural Transm 1989;76(2):109-17).

Armed with this knowledge, physician Marty Hinz, MD, has built a large and successful practice focused entirely on treating weight problems with these types of supplements. Dr. Hinz has also stated repeatedly that in his experience – spanning more than a decade and thousands of patients – these amino acid (protein) supplements work better for appetite control than any medication, including the ill-fated fen-phen combination. For more on Dr. Hinz and his work, visit www.neuroreplete.com.


Effects of Nutrients on Neurotransmitter Release

Richard J. Wurtman

Contrary to earlier expectations, it has now become well established that the amounts of neurotransmitter released when certain neurons fire normally vary over a broad range. One process that generates such variations involves receptors on the neurons’ own presynaptic terminals: when activated by the neurotransmitter molecules that the neuron has released into the synapse, by concurrently released neuromodulators such as adenosine, or by other transmitters (e.g., the enkephalins) released at axoaxonal synapses, these receptors initiate intracellular events that diminish the number of neurotransmitter molecules released subsequently.

Another type of process that particularly affects the release of amine neurotransmitters depends on changes in the composition of the blood plasma induced by eating or by prolonged physical activity. Changes in plasma levels of choline or of certain amino acids lead to changes in brain levels of the precursors for these neurotransmitters—choline for acetylcholine, tryptophan for serotonin, and tyrosine for the catecholamines. These, in turn, regulate the rates at which the transmitters are synthesized, their concentrations within nerve terminals, and ultimately, the quantities released each time the neurons fire. For one transmitter—serotonin—the relevant variations in plasma composition probably affect most, if not all, of the neurons that release it. For other transmitters (e.g., the catecholamines), individual nerve cells can become more or less precursor dependent at any time, depending on the rates at which they happen to be firing.

Unlike the receptor-mediated presynaptic modulation of transmitter release, precursor-dependent modulation depends primarily on metabolic events occurring outside the brain and arising from a particular type of voluntary behavior, such as eating or exercise. Indeed, the primary physiological role of this dependency may be sensory (i.e., to provide the omnivore’s brain with information about what has been eaten or about important changes in macronutrient requirements, so that the individual can better decide what to eat next). However, because precursor-dependent neurotransmitters are involved in a wide variety of normal (and pathological) brain mechanisms besides those controlling food intake, this relationship may have broad physiological and medical implications. It also provides benign ways of influencing neurotransmission, and thus mental and physical performance.

FOOD CONSUMPTION, TRYPTOPHAN AVAILABILITY, AND BRAIN SEROTONIN SYNTHESIS

The initial observation that physiological changes in precursor availability (i.e., after food consumption) could affect neurotransmitter synthesis was made in studies on rats performed in 1971 (Fernstrom and Wurtman, 1971). Animals were allowed to eat a test diet that contained carbohydrates and fat but that lacked protein. Soon after the start of the meal, brain levels of the essential (and scarce) amino acid tryptophan were found to have risen, thus increasing the substrate saturation of the enzyme that controls serotonin synthesis, tryptophan hydroxylase. The resulting increase in brain serotonin levels was associated with an increase in brain levels of serotonin’s metabolite, 5-hydroxyindole acetic acid, thus suggesting that serotonin release had also been enhanced. (Direct evidence that physiological variations in brain tryptophan concentrations affect serotonin release was not obtained until 1987 [Schaechter and Wurtman, 1989].)

The rise in brain tryptophan levels after consumption of this test diet was accompanied by either a small increase (rats) or no change (humans) in plasma tryptophan levels. Both of these changes had been unanticipated, since the insulin secretion elicited by dietary carbohydrates was known to lower plasma levels of most of the other amino acids. However, the unusual response of plasma tryptophan to insulin was soon recognized as resulting from the amino acid’s unusual propensity to bind loosely to circulating albumin. Insulin causes nonesterified fatty acid molecules to dissociate from albumin and to enter adipocytes. This dissociation increases the protein’s capacity to bind circulating tryptophan; hence, whatever reduction insulin causes in free plasma tryptophan levels is compensated for by a rise in the tryptophan bound to albumin, yielding no net change in total plasma tryptophan levels in humans (Madras et al., 1974). Because this binding is of low affinity, the albumin-bound tryptophan is almost as able as free tryptophan to be taken up into the brain.

Considerably more difficult to explain were the data then obtained on what happens to brain tryptophan and serotonin levels after rats consume a meal rich in protein. Although plasma tryptophan levels were found to rise, reflecting the contribution of some of the tryptophan molecules in the protein, brain tryptophan and serotonin levels either failed to rise or, if the meal contained sufficient protein, actually fell (Fernstrom and Wurtman, 1972). The explanation for this paradox was found to lie in the transport systems that carry tryptophan across the blood-brain barrier (Pardridge, 1977) and into neurons. The endothelial cells that line central nervous system capillaries contain various macromolecules that shuttle specific nutrients or their metabolites between the blood and the brain’s extracellular space. One such macromolecule mediates the transcapillary flux (by facilitated diffusion) of tryptophan and other large neutral amino acids (LNAAs) such as tyrosine; others move choline, basic or acidic amino acids, hexoses, monocarboxylic acids, adenosine, adenine, and various vitamins. The amount of any LNAA transported by the macromolecule depends on its ability to compete with the other circulating LNAAs for binding sites. Thus, the ability of circulating tryptophan molecules to enter the brain is increased when plasma levels of the other LNAAs fall (as occurs after insulin is secreted) and is diminished when the plasma levels of the other LNAAs rise, even if plasma tryptophan levels remain unchanged. Since all dietary proteins are considerably richer in the other LNAAs than in tryptophan (only 1.0–1.5 percent of most proteins), consumption of a protein-rich meal decreases the plasma tryptophan ratio (the ratio of the plasma tryptophan concentration to the summed concentrations of its major circulating competitors for brain uptake, principally, tyrosine; phenylalanine; the branched-chain amino acids leucine, isoleucine, and valine; and methionine). This, in turn, decreases tryptophan’s transport into the brain and slows its conversion to serotonin. (Similar plasma ratios predict brain levels of each of the other LNAAs—including drugs such as levodopa (L-dopa)—following meals or other treatments that modify plasma amino acid patterns (Wurtman et al., 1980). This is why a high-protein meal interferes with levodopa’s therapeutic effect, whereas a high-carbohydrate, protein-free meal can lead to abnormal movements caused by too much levodopa suddenly entering the brain (Wurtman et al., 1988).)
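To make the ratio concrete, here is a minimal sketch of the plasma tryptophan ratio described above, computed for two hypothetical meals. The plasma concentrations are illustrative values I chose only to show the direction of the effect; they are not data from the studies cited.

```python
# Plasma tryptophan ratio: tryptophan divided by the summed concentrations of
# its main competitors for brain uptake (tyrosine, phenylalanine, leucine,
# isoleucine, valine, methionine).

def tryptophan_ratio(plasma_umol_per_l: dict) -> float:
    competitors = ["tyrosine", "phenylalanine", "leucine",
                   "isoleucine", "valine", "methionine"]
    return plasma_umol_per_l["tryptophan"] / sum(plasma_umol_per_l[a] for a in competitors)

# Hypothetical plasma values (umol/L) after two different meals.
after_carbohydrate_meal = {   # insulin lowers the competing LNAAs
    "tryptophan": 60, "tyrosine": 50, "phenylalanine": 45,
    "leucine": 90, "isoleucine": 50, "valine": 160, "methionine": 20,
}
after_protein_meal = {        # all LNAAs rise, but the competitors rise far more than tryptophan
    "tryptophan": 70, "tyrosine": 90, "phenylalanine": 80,
    "leucine": 180, "isoleucine": 100, "valine": 300, "methionine": 40,
}

print(f"carbohydrate meal ratio: {tryptophan_ratio(after_carbohydrate_meal):.3f}")
print(f"protein meal ratio:      {tryptophan_ratio(after_protein_meal):.3f}")
# A higher ratio means more tryptophan enters the brain and more serotonin is made.
```

With these illustrative numbers the carbohydrate meal gives a ratio of about 0.14 and the protein meal about 0.09, mirroring the paradox described above: plasma tryptophan rises after protein, yet less of it reaches the brain.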

The fact that administration of pure tryptophan could increase brain serotonin synthesis, thereby affecting various serotonin-dependent brain functions (e.g., sleepiness and mood), has been known since at least 1968. What was novel and perhaps surprising about the above findings was their demonstration that brain tryptophan levels—and serotonin synthesis—normally undergo important variations in response, for example, to the decision to eat a carbohydrate-rich (as opposed to a protein-rich) breakfast or in response to the administration of a very low dose of tryptophan (Fernstrom and Wurtman, 1971).

It remained possible, however, that mechanisms external to the serotonin-releasing neuron might exist. These mechanisms kept such food-induced increases in serotonin’s synthesis from causing parallel changes in the amounts released into synapses. Indeed, it was known that if rats were given very large doses of tryptophan that were sufficient to raise brain tryptophan levels well beyond their normal range, the firing frequencies of their serotonin-releasing raphe neurons decreased markedly; this was interpreted as reflecting the operation of a feedback system designed to keep serotonin release within a physiological range. Similar decreases in raphe firing had also been observed in animals given drugs, such as monoamine oxidase (MAO) inhibitors or serotonin-reuptake blockers, which cause persistent increases in intrasynaptic serotonin levels. Indeed, the administration of serotonin uptake inhibitors such as fluoxetine can cause the prolonged inhibition of serotonin release (Gardier and Wurtman, 1991). However, when rats were given small doses of tryptophan that were sufficient to raise brain tryptophan levels but not beyond their normal peaks or when they consumed a carbohydrate-rich meal, which raised brain tryptophan levels physiologically, no decreases in raphe firing occurred. Hence, food-induced changes in serotonin synthesis were found to affect the amounts of serotonin released per firing without slowing the neuron’s firing frequencies, thus “allowing” modulation of the net output of information from serotonergic neurons.

BRAIN SEROTONIN, NUTRIENT CHOICE, AND CARBOHYDRATE CRAVING

If rats are allowed to pick from foods in two pans presented concurrently and containing differing proportions of protein and carbohydrate, they choose among the two so as to obtain fairly constant (for each animal) amounts of these macronutrients.

However, if before “dinner” they receive either a carbohydrate-based snack or a drug that facilitates serotonergic neurotransmission, they quickly modify their food choice, selectively diminishing their intake of carbohydrates (Wurtman and Wurtman, 1979).

These observations support the hypothesis that the responses of serotonergic neurons to food-induced changes in the relative concentrations of plasma amino acids allow these neurons to serve a special function as sensors in the brain’s mechanisms governing nutrient choice (Wurtman, 1983, 1988).

Perhaps these neurons participate in a feedback loop through which the composition of breakfast (i.e., its proportions of protein and carbohydrate) can, by increasing or decreasing brain serotonin levels, influence the choice of lunch.

The ability of serotonin-containing neurons to distinguish between two foods (or the net compositions of two meals or snacks) depends upon the extent to which the foods produce significantly different plasma tryptophan/LNAA ratios.

Thus, a food (e.g., berries for rats or popcorn for people) which contains carbohydrates but little or no protein is easily distinguished from one (e.g., meat or eggs) that is rich in protein. Less easily distinguished would be one containing, say, 10 percent protein from one containing 15 percent protein, unless one of the foods happens to lack carbohydrates entirely (Yokogoshi and Wurtman, 1986). Perhaps the food-plasma-serotonin connection evolved because certain carbohydrates taste too good; to maintain its muscle mass, the bear must eventually stop eating honey and go catch a fish.

A similar mechanism may operate in humans and may underlie the tendency of people in all known cultures to eat about 13 percent of their total calories as protein and about four to five times as much carbohydrate as protein.

Subjects housed in a research hospital were allowed to choose from six different isocaloric foods (containing varying proportions of protein and carbohydrate but constant amounts of fat) at each meal, taking as many small portions as they liked; they also had continuous access to a computer-driven vending machine stocked with mixed carbohydrate-rich and protein-rich isocaloric snacks.

It was observed (Wurtman and Wurtman, 1989) that the basic parameters of each person’s food intake (total number of calories, grams of carbohydrate and protein, and number and composition of snacks) tended to vary only within a narrow range on a day-to-day basis and to be unaffected by placebo administration.

To assay the involvement of brain serotonin in maintaining this constancy of nutrient intake, pharmacological studies were undertaken in individuals in whom the feedback mechanism might be impaired.

These were obese people who claimed to suffer from carbohydrate craving, manifested as their tendency to consume large quantities of carbohydrate-rich snacks, usually at a characteristic time of day or evening (Wurtman et al., 1985). (Too few protein-rich snacks were consumed by the subjects to allow assessment of drug effects on this source of calories.)

Administration of dexfenfluramine, an antiobesity drug that increases intrasynaptic serotonin levels by releasing the transmitter and then blocking its reuptake, suppressed this carbohydrate craving. Other drugs thought to enhance serotonin-mediated neurotransmission selectively (e.g., the antidepressants zimelidine, fluvoxamine, and fluoxetine) have also been found to cause weight loss over the short term and may also selectively suppress carbohydrate intake. This contrasts with the weight gain (and carbohydrate craving) often associated with less chemically specific antidepressants such as amitriptyline.

Severe carbohydrate craving is also characteristic of patients suffering from seasonal affective disorder syndrome (SADS), a variant of bipolar clinical depression associated with a fall onset, a higher frequency in populations living far from the equator, and concurrent hypersomnia and weight gain (O’Rourke et al., 1989). A reciprocal tendency of many obese people to suffer from affective disorders (usually depression) has also been noted. Since serotonergic neurons apparently are involved in the actions of both appetite-reducing and antidepressant drugs, they might constitute the link between a patient’s appetitive and affective symptoms. Some patients with disturbed serotonergic neurotransmission might present themselves to their physicians with problems of obesity, reflecting their overuse of dietary carbohydrates to treat their dysphoria.

(The carbohydrates, by increasing intrasynaptic serotonin, would mimic the neurochemical actions of bona fide antidepressant drugs, such as the MAO inhibitors and tricyclic compounds [Wurtman, 1983].)

Other patients might complain of depression, and their carbohydrate craving and weight gain would be perceived as secondary problems. Another group might include women suffering from premenstrual syndrome (PMS) who experience late-luteal-phase mood disturbances, weight gain, carbohydrate craving (Brzezinski et al., 1990), and sometimes bloating and fluid retention. Yet another group includes people attempting to withdraw from nicotine (Spring et al., 1991), a drug that releases serotonin (Ribeiro et al., submitted for publication).

The participation of serotonergic neurons in a large number of brain functions besides nutrient choice regulation might have the effect of making such functions hostages to eating (seen in the sleepiness that can, for example, follow carbohydrate intake), just as it could cause mood-disturbed individuals to consume large amounts of carbohydrates for reasons related to neither the nutritional value nor the taste of these foods. In support of this view, it was observed that the serotonergic drug dexfenfluramine can be an effective treatment for both the affective and the appetitive symptoms of SADS (O’Rourke et al., 1989), PMS (Brzezinski et al., 1990), and smoking withdrawal (Spring et al., 1991).

UNDER WHAT CIRCUMSTANCES WILL NUTRIENT INTAKE AFFECT NEUROTRANSMISSION?

On the basis of the tryptophan-serotonin relationship, one can formulate a sequence of biochemical processes that would have to occur in order for any nutrient precursor to affect the synthesis and release of its neurotransmitter product.

First, plasma levels of the precursor (and of other circulating compounds, such as the LNAAs, that affect tryptophan’s availability to the brain) must be allowed to increase after its administration (or after its consumption as a constituent of foods). In other words, plasma levels of tryptophan, the other LNAAs, or choline cannot be under tight homeostatic control comparable to, for example, that of plasma calcium or osmolarity. In actuality, plasma levels of tryptophan, tyrosine, and choline do vary severalfold after the consumption of normal foods, and those of the branched-chain amino acids may vary by as much as five- or sixfold.

Second, the brain level of the precursor must be dependent on its plasma level (i.e., there must not be an absolute blood-brain barrier for circulating tryptophan, tyrosine, or choline).

In fact, such absolute barriers do not exist for these nutrients; rather, facilitated diffusion mechanisms that allow these compounds to enter the brain at rates that depend on the plasma levels of these ligands are in operation.

Third, the rate-limiting enzyme within presynaptic nerve terminals that initiates the conversion of the precursor to its neurotransmitter product must, similarly, be unsaturated with this substrate so that when presented with more tryptophan, tyrosine, or choline it can accelerate synthesis of the neurotransmitter. (Tryptophan hydroxylase and choline acetyltransferase [CAT] do indeed have very poor affinities for their substrates tryptophan and choline.)

As discussed below, tyrosine hydroxylase activity becomes tyrosine-limited when neurons containing the enzyme have been activated and the enzyme has been phosphorylated (Wurtman, 1988; Wurtman et al., 1980).

Available evidence suggests that only some of the neurotransmitters present in the human brain are subject to such precursor control, principally, the monoamines mentioned above (serotonin; the catecholamines dopamine, norepinephrine, and epinephrine; and acetylcholine) and, possibly, histamine and glycine.

Pharmacological doses of the amino acid histidine do elevate histamine levels within nerve terminals, and the administration of threonine, a substrate for the enzyme that normally forms glycine from serine, can elevate glycine levels within spinal cord neurons (and, probably, thereby ameliorate some of the clinical manifestations of spasticity [Growdon et al., 1991]).

One large family of neurotransmitters, the peptides, is almost certainly not subject to precursor control.

Brain levels of these compounds have never been shown to change with variations in brain amino acid levels; moreover, there are sound theoretical reasons why it is unlikely that brain peptide synthesis would respond.

The immediate precursor for a brain protein or peptide is not an amino acid per se, as is the case for some of the monoamine neurotransmitters, but the amino acid molecule attached to its particular species of transfer RNA (tRNA).

In brain tissue, the known enzymes that catalyze the coupling of an amino acid to its tRNA have very high affinities for their amino acid substrates, such that their ability to operate at full capacity in vivo is probably unaffected by amino acid levels (except possibly in pathological states that are associated with major disruptions in brain amino acid patterns, such as phenylketonuria).

Little information is available concerning the possible precursor control of the nonessential amino acids, such as glutamate, aspartate, and γ-aminobutyric acid (GABA), even though these are probably the most abundant neurotransmitters in the brain. It is difficult to do experiments on these relationships; the precise biochemical pathways that synthesize glutamate and aspartate within nerve terminals are not well established, and for GABA, although it is well established that its precursor is glutamate, brain levels of that amino acid cannot be raised experimentally without sorely disrupting normal brain functions.

The macromolecule that transports acidic amino acids such as glutamate and aspartate across the blood-brain barrier is unidirectional and secretes these compounds from the brain into the blood by an active transport mechanism (Pardridge, 1977). Hence, administration of even an enormous dose of monosodium glutamate will not affect brain glutamate levels unless it elevates plasma osmolarity to the point of disrupting the blood-brain barrier.

TYROSINE EFFECT ON DOPAMINE AND NOREPINEPHRINE SYNTHESIS

Because tyrosine administration had not been shown to increase brain dopamine or norepinephrine levels in otherwise untreated animals, it was initially assumed that the catecholamine neurotransmitters were not under precursor control, even though (1) plasma tyrosine levels do increase severalfold after protein intake or tyrosine administration; (2) the LNAA transport system does ferry tyrosine, like tryptophan, across the blood-brain barrier; and (3) tyrosine hydroxylase, which catalyzes the rate-limiting step in catecholamine synthesis, is unsaturated in vivo (Wurtman et al., 1980).

It did seem possible, however, that a pool of neuronal dopamine or norepinephrine might exist for which synthesis did depend on tyrosine levels, but which was of too small a size in relation to the total catecholamine mass to be detected.

Hence, studies were performed to determine whether catecholamine synthesis or release could be affected by changes in brain tyrosine concentrations. At first, catecholamine synthesis was estimated by following the rate at which dopa, the product of tyrosine’s hydroxylation, accumulated in the brains of rats treated acutely with a drug that blocks the next enzyme in catecholamine formation (aromatic L-amino acid decarboxylase). Tyrosine administration did increase dopa accumulation, whereas other LNAAs decreased both dopa accumulation and brain tyrosine levels. Catecholamine release was then estimated by measuring the brain levels of metabolites of dopamine (homovanillic acid [HVA], dihydroxyphenylacetic acid [DOPAC]) or norepinephrine (methoxyhydroxyphenylglycol sulfate [MHPG-SO4]). Administration of even large doses of tyrosine had no consistent effect on these metabolites. However, if the experimental animals were given an additional treatment designed to accelerate the firing of dopaminergic or noradrenergic tracts (e.g., dopamine receptor blockers, cold exposure, partial lesions of dopaminergic tracts, and reserpine), the supplemental tyrosine caused a marked augmentation of catecholamine release (Wurtman, 1988; Wurtman et al., 1980). These initial observations formed the basis for the hypothesis that catecholaminergic neurons become tyrosine sensitive when they are physiologically active and lose this capacity when they are quiescent.

The biochemical mechanism that couples a neuron’s firing frequency to its ability to respond to supplemental tyrosine involves phosphorylation of the tyrosine hydroxylase enzyme protein, a process that occurs when the neurons fire.

 

High-tyrosine foods:
  • Spirulina: a blue-green alga rich in tyrosine and many other nutrients
  • Soy foods
  • Eggs: the trusty egg is a powerhouse of nourishing vitamins
  • Cheese
  • Fish and wild salmon
  • Poultry
  • Meat
  • Beans and grains

This phosphorylation, which is short-lived, enhances the enzyme’s affinity for its cofactor (tetrahydrobiopterin) and makes the enzyme insensitive to end product inhibition by catechols; these changes allow its net activity to depend on the extent to which it is saturated with tyrosine.

An additional mechanism underlying this coupling may be an actual depletion of tyrosine within nerve terminals as a consequence of its accelerated conversion to catecholamines (Milner et al., 1987).

If slices of rat caudate nucleus are superfused with a standard Krebs-Ringer solution (which lacks amino acids) and are depolarized repeatedly, they are unable to sustain their release of dopamine; concurrently, their contents of tyrosine, but not of other LNAAs, decline markedly.

The addition of tyrosine to the superfusion solution enables the tissue to continue releasing dopamine at initial rates and also protects it against depletion of its tyrosine. The concentrations of tyrosine needed for these effects are proportional to the number of times the neurons are depolarized. (Of course, the intact brain is continuously perfused with tyrosine-containing blood, making it highly unlikely that tyrosine levels fall to a similar extent, even in continuously active brain neurons. However, they might decline somewhat, since tyrosine is poorly soluble in aqueous media and diffuses relatively slowly.)

More recently, in vivo dialysis techniques have been used to assess tyrosine’s effects on brain dopamine release. When otherwise untreated animals receive the amino acid systemically, there is, after 20–40 min, a substantial increase in dopamine output from nigrostriatal neurons unaccompanied by detectable increases in dopamine’s metabolites DOPAC or HVA. However, this effect is short-lived, and dopamine release returns to basal levels after 20–30 min. This latter response probably reflects receptor-mediated decreases in the firing frequencies of the striatal neurons (to compensate for the increase in dopamine release that occurs with each firing) and, perhaps, local presynaptic inhibition. If animals are given haloperidol, a dopamine receptor-blocking agent, before—or along with—the tyrosine, the supplemental tyrosine continues to amplify dopamine output for prolonged periods (During et al., 1989).

Tyrosine has now been shown to enhance the production and release of dopamine or norepinephrine in a variety of circumstances. This amino acid may ultimately have considerable utility in treating catecholamine-related diseases or conditions; it may also prove useful in promoting performance—particularly in high-stress situations.

EFFECTS OF CHOLINE ON SYNTHESIS OF ACETYLCHOLINE AND PHOSPHATIDYLCHOLINE

The amounts of acetylcholine released by physiologically active cholinergic neurons depend on the concentrations of choline available. In the absence of supplemental free choline, the neurons will continue to release constant quantities of the transmitter, especially when stimulated (Maire and Wurtman, 1985). However, when choline is available (in concentrations bracketing the physiological range), a clear dose relationship is observed between its concentration and acetylcholine release (Blusztajn and Wurtman, 1983; Maire and Wurtman, 1985). When no free choline is available, the source of the choline used for acetylcholine synthesis is the cells’ own membranes (Blusztajn et al., 1987).

Membranes are very rich in endogenous phosphatidylcholine (PC), and this phospholipid serves as a reservoir of free choline, much as bone and albumin serve as reservoirs for calcium and essential amino acids.

It has been suggested that a prolonged imbalance between the amounts of free choline available to a cholinergic neuron and the amounts needed for acetylcholine synthesis might alter the dynamics of membrane phospholipids to the point of interfering with normal neuronal functioning (“autocannibalism”) (Blusztajn and Wurtman, 1983; Nitsch et al., 1992a), for example, in patients with Alzheimer’s disease.


In that event, providing the brain with supplemental choline would serve two purposes: it would enhance acetylcholine release from physiologically active neurons and it would replenish the choline-containing phospholipids in their membranes (Wurtman, 1985).

Neurons can draw on three sources of free choline for acetylcholine synthesis: that stored as PC in their own membranes, that formed intrasynaptically from the hydrolysis of acetylcholine (and taken back up into the presynaptic terminal by a high-affinity process estimated to be 30–50 percent efficient in the brain), and that present in the bloodstream (and taken into the brain by a specific blood-brain barrier transport system). The PC in foods (e.g., liver and eggs) is rapidly hydrolyzed to free choline in the intestinal mucosa (or is broken down more slowly after passage into the lymphatic circulation). Consumption of adequate quantities of PC can lead to severalfold elevations in plasma choline levels, thereby increasing brain choline levels and the substrate saturation of CAT.

The PC molecules consumed in the diet, as well as those formed endogenously in neuronal membranes, are very heterogeneous with respect to their fatty acid compositions. Some PCs (e.g., those in soybeans and nerve terminals) are relatively rich in polyunsaturated fatty acids; others (e.g., those in eggs) are highly saturated. PCs are also heterogeneous with reference to their mode of synthesis. Brain neurons produce PC by three distinct biochemical pathways: the sequential methylation of phosphatidylethanolamine (PE), the incorporation of preexisting free choline via the CDP-choline cycle, or the incorporation of free choline via the base exchange pathway (in which a choline molecule substitutes for the ethanolamine in PE or the serine in phosphatidylserine [PS]). Quite possibly, the different varieties of PC may subserve distinct functions; for example, one type of PC, distinguished by its fatty acid composition or its mode of synthesis, could be preferentially utilized to provide a choline source for acetylcholine synthesis or could be formed preferentially during the processes of cell division or synaptic remodeling. Similarly, one particular species might be especially involved in the pathogenesis of particular degenerative diseases afflicting cholinergic neurons (e.g., Alzheimer’s disease).

Supplemental choline or PC has been used with some success in the treatment of tardive dyskinesia. A summary of related publications (Nasrallah et al., 1984) concluded that choline and the cholinesterase inhibitor physostigmine were about equally efficacious and that choline was less toxic. Most patients exhibited some reduction in the frequency of abnormal movement, but in only a few cases was there complete cessation of the movements. Choline sources have also been tried in the treatment of Alzheimer’s disease. Most well-controlled studies have treated subjects for relatively short intervals (6–8 weeks) and have focused on younger subjects, with little or no success. A single double-blind study administered the PC for 6 months (Little et al., 1985).

Improvement was noted in about one-third of the subjects; the average age of the responders was 83 years and that of nonresponders was 73 years, a relationship thought to be compatible with evidence that Alzheimer’s disease may be more restricted to cholinergic neurons in subjects who become symptomatic at a later age.

Occasional reports have also described the useful effects of choline or PC in treating mania, ataxia, myasthenic syndromes, and Tourette’s syndrome.

Very recently it has been observed (Nitsch et al., 1992a) that the brains of people dying of Alzheimer’s disease (but not Down’s Syndrome) contain reduced levels of PC and free choline (and PE and free ethanolamine) but major increases in those of the PC metabolite glycerophosphocholine and the PE metabolite glycerophosphoethanolamine.

 


These changes were not restricted to regions containing plaques, tangles, or amyloid.

Since low brain choline levels both impair acetylcholine synthesis and accelerate the breakdown of membrane PC and since adequate acetylcholine may be needed to prevent the formation of the amyloid protein of Alzheimer’s disease (Nitsch et al., 1992b), supplemental choline and ethanolamine could have a role in the prevention of this disease.

CONCLUSIONS

  • The design of experiments to display the potentially useful effects of foods and nutrients on the ability to perform well, particularly under stressful circumstances, will require considerable sophistication. These chemicals are not nearly as potent as drugs and, in fact, lack intrinsic potency, having first to be converted to a neurotransmitter within a nerve terminal and then to be released from that terminal. (Of course, they are also likely to be significantly less toxic than drugs; this is perhaps their major advantage.) Such experimental design should be entrusted to people who are well trained in studying human behavior and who also fully understand the ground rules that determine when the food or nutrient is most likely to be effective (e.g., for tyrosine, when particular catecholamine-releasing neurons are firing frequently for long periods).
  • At this point, too few adequate experiments have been done with human subjects to begin to assess the utilities of neurotransmitter precursors such as tyrosine or choline in increasing or sustaining performance; in fact, a number of poorly designed studies muddy the waters.
  • Tyrosine’s effect on performance must be examined in situations in which subjects are under real stress. Choline’s effects on memory must be studied in experiments in which the nutrient is given for a sufficiently long period of time (i.e., one compatible with what is known about the dynamics of the choline-phosphatidylcholine interaction).
  • The peripheral actions of the neurotransmitter precursors may turn out to be very useful (e.g., tyrosine’s ability to normalize blood pressure when it is both too high and too low [Wurtman et al., 1980] and choline’s ability to sustain exercise tolerance in subjects whose plasma choline levels have been reduced by, for example, long-distance running [Conlay et al., 1986; Sandage et al., 1992]).
  • The development of foods or nutrients used to sustain performance—or otherwise to improve normal behaviors—requires guidance by the U.S. Food and Drug Administration, and perhaps other agencies as well, regarding how these compounds will be regulated. It is absolutely mandatory that all such preparations be safe and of adequate purity; it is also essential that they be adequately labeled, providing the user with full information about their indications, dosages, contraindications, and side effects. However, if and when it can be shown that their use is largely nutritional (i.e., to meet the body’s needs for more of the particular nutrient because environmental circumstances have increased those needs), then perhaps they can be designated as foods.
  • Considerable additional research should be done to identify special populations with unusual responses to foods or nutrients that affect neurotransmitters (e.g., the carbohydrate cravers who overconsume carbohydrate-rich snacks in order to relieve depressive symptoms). Heterogeneity of response will doubtless also exist among people in the military (e.g., those with mild seasonal depression or premenstrual syndrome and those giving up smoking).

ACKNOWLEDGMENTS

Some of these studies were supported by National Institute of Mental Health grant MH-28783, U.S. Air Force grant AFOSR-830366, National Aeronautics and Space Administration grant NAG-2-210, and National Institutes of Health grant RR00088-24 to the Clinical Research Center, Massachusetts Institute of Technology.

Acetylcholine/Choline Deficiency in Chronic Illness – eat soft-boiled eggs

To my friends who love to drink alcohol, be it San Miguel beer or red wine: do eat protein-rich foods like soft-boiled eggs when drinking.

To my BFF with pancreas health issues: eat soft-boiled eggs.

Connie

Acetylcholine/Choline Deficiency in Chronic Illness – The Hunt for the Missing Egg.

Those who lack choline are prone to mental illness, heart disease, fatty liver and/or hemorrhagic kidney necrosis, and other chronic illness; choline is oxidized to betaine, which acts as an important methyl donor and osmolyte. With fatty liver, a person is more prone to diabetes and other chronic illness. Eggs are rich in choline. Choline is also found in a wide range of plant foods in small amounts. Eating a well-balanced vegan diet with plenty of whole foods should ensure you are getting enough choline. Soy milk, tofu, quinoa, and broccoli are particularly rich sources.

Eggs are an excellent source of choline and selenium, and a good source of high-quality protein, vitamin D, vitamin B12, phosphorus, and riboflavin. In addition, eggs are rich in the essential amino acid leucine (one large egg provides 600 milligrams), which plays a unique role in stimulating muscle protein synthesis.

We hear a lot about vitamins and minerals such as B12, folate, magnesium, vitamin C, and so on, but there seems to be very little talk these days about the importance of dietary lecithin and choline. Are you consuming an adequate amount of choline, lecithin, or other phospholipids? The odds are that you are not.

A little bit about choline

The human body produces choline by methylation of phosphatidylethanolamine (from dietary sources such as lecithin and others) to form phosphatidylcholine in the liver by the PEMT enzyme. Phosphatidylcholine may also be consumed in the diet or by supplementation. Choline is oxidized to betaine which acts as an important methyl donor and osmolyte.

For those wanting to see how this relates to the methylation cycle, there is a helpful choline metabolism diagram (courtesy of Wikipedia).

It is well known that magnesium deficiency is widespread (57% of the population does not meet the U.S. RDA according to the USDA), but the numbers for choline deficiency are even more shocking.

According to the National Health and Nutrition Examination Survey (NHANES) in 2003–2004, only about 10% of the population had an adequate intake of choline. This means about 90% of the population consumes a diet deficient in choline. Furthermore, those without an adequate intake of choline may not have symptoms.

Along with folate and B12 deficiency, inadequate consumption of choline can lead to high homocysteine and all the risks associated with hyperhomocysteinaemia, such as cardiovascular disease, neuropsychiatric illness (Alzheimer’s disease, schizophrenia) and osteoporosis. Inadequate choline intake can also lead to fatty liver or non-alcoholic fatty liver disease (NAFLD).

The most common symptoms of choline deficiency are fatty liver and/or hemorrhagic kidney necrosis. Consuming choline-rich foods usually relieves these deficiency symptoms. Diagnosing fatty liver isn’t as simple as running ALT and AST, since nearly 80% of people with fatty liver have normal levels of these enzymes, according to a population study published in the journal Hepatology. In fact, in one experiment, 10 women were fed a diet low in choline; nine developed fatty liver and only one had elevated liver enzymes.
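To see what that Hepatology figure implies in practice, here is a small illustrative calculation; the cohort size is hypothetical, and only the 80 percent figure comes from the text.

```python
# Sketch: how many fatty-liver cases an ALT/AST-only screen would miss,
# using the ~80% "normal enzymes" figure quoted above.
FRACTION_NORMAL_ENZYMES = 0.80   # from the Hepatology figure cited above
cohort_with_fatty_liver = 1000   # hypothetical cohort size

missed = round(cohort_with_fatty_liver * FRACTION_NORMAL_ENZYMES)
flagged = cohort_with_fatty_liver - missed
print(f"Flagged by ALT/AST alone: {flagged} of {cohort_with_fatty_liver}")
print(f"Missed (normal enzymes):  {missed} of {cohort_with_fatty_liver}")
```

In other words, under that figure an enzyme-only screen would miss roughly 800 of every 1,000 people with fatty liver.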


Estrogen and Choline Deficiency

Given the connection between low lipids and choline deficiency, it would be tempting to think that as long as someone has enough cholesterol and TG that they will be protected from choline deficiency.  Unfortunately this is not the case.  Having adequate lipids does indeed help support healthy choline levels, but it does not guarantee a person will avoid choline deficiency.  The truth is that choline deficiency can come from more than one source.  Both sex hormone levels and genetic SNPs may lead to a choline deficiency by influencing the PEMT enzyme – the enzyme responsible for synthesis of choline inside the body.  Recent research now confirms how hormones and genetic polymorphisms play a major role in choline deficiency.

The body can make choline only one way: by methylating a molecule of phosphatidylethanolamine (PE) into a molecule of phosphatidylcholine (PC). The body’s only method for accomplishing this is the enzyme PEMT (phosphatidylethanolamine N-methyltransferase), which is found in the liver, brain, muscle, fat, and other tissues.1,2 As with other well-known methylation enzymes like MTHFR and COMT, the PEMT enzyme can have genetic SNPs that slow it down. When this enzyme slows down, the body cannot make choline in high amounts and choline deficiency is more likely. But there is more to the story of PEMT than just polymorphisms. In addition to being slowed by SNPs, PEMT is also dependent upon the hormone estrogen for activation.1,3 What this means is that the PEMT enzyme, the body’s only means of synthesizing choline, has not one but two Achilles’ heels. The PEMT pathway and how it relates to phosphatidylcholine production is shown in Figure 1.3 below.


Figure 1.3 – PEMT is shown as the rate-limiting reaction in the production of phosphatidylcholine inside the human body. Due to genetic and hormonal variances, most people have a PEMT enzyme working too slowly and are susceptible to choline deficiency when there is not enough choline in the diet. ACoA – Acetyl-CoA; TG – Triglycerides; PE – phosphatidylethanolamine; PC – phosphatidylcholine; PEMT – phosphatidylethanolamine N-methyltransferase.

As mentioned above, the sex hormone estrogen is intimately linked with the production of choline. Women have a biological advantage here, as the premenopausal female body has much higher levels of estrogen than does the male body. When a woman becomes pregnant this advantage is taken to an extreme, as pregnancy increases estrogen levels to over 30 times normal.4 A successful pregnancy requires high amounts of nutrients delivered to the growing baby, especially choline. Since the mother’s body is building a human being from scratch, there is an added burden on her biology to provide enough nutrition to her growing baby. Viewed from this perspective, the high estrogen levels during pregnancy can be seen to act like a biochemical insurance policy. Since the PEMT enzyme requires estrogen to function, pregnancy allows a woman to make extra choline for her developing child. Furthermore, the nervous system is the first system to form in utero and is a tissue that requires high levels of choline for proper development.5,6 Choline plays such an important role in cell membranes, myelin sheaths, and nervous system tissue that the high estrogen levels during pregnancy help make sure the growing brain and nervous system are nourished. It is a genius system that assures the health and survival of the child.

Even though Nature has conferred an advantage on females by providing them with higher estrogen levels, especially during pregnancy, this alone cannot protect against a lack of choline in the diet. All the estrogen in the world will not save a woman from choline deficiency if the gene responsible for producing choline is slowed down by a polymorphism. Genetic research has shown that the gene responsible for synthesizing choline, the PEMT gene, is susceptible to common polymorphisms which alter its function by slowing it down. In a recent study looking at a population in North Carolina, men and women of various ages were placed on a choline-deficient diet. They were followed closely for up to 42 days on a low-choline diet consisting of less than 50 mg of choline per day. Throughout the study, the participants’ liver function was continuously assessed for any sign of fatty liver and damage. After eating a choline-deficient diet for just six weeks, 63% of participants developed liver dysfunction and choline blood levels dropped 30% in every single participant, including premenopausal females.7 During this six-week trial of low dietary choline the odds of developing liver dysfunction were 77% for men, 80% for postmenopausal women, and just 44% for premenopausal women.7 Based on what has been discussed so far about estrogen and choline, it makes sense that men and postmenopausal women would be more susceptible to developing fatty liver, since they don’t have high estrogen levels. And based on the fact that estrogen levels drive choline production, premenopausal women should have been protected from fatty liver, since they make higher amounts of choline – but that was not the case.

With dietary choline restricted to just 50 mg/day, approximately half of the premenopausal group also suffered liver dysfunction, suggesting that a choline-deficient diet can even harm women with higher estrogen levels. In addition, blood tests revealed that premenopausal females experienced a 30% loss of choline on a low-choline diet right along with everyone else. Despite the fact that higher estrogen levels allow fertile women to make more choline, many were not able to make enough to avoid problems. A PEMT gene polymorphism is the only mechanism that can explain how women with high estrogen levels are still susceptible to choline deficiency when placed on a low-choline diet.

Just like many individuals in the population, some of the premenopausal women inherited one or two copies of the PEMT variant that slows down the production of choline. This study showed that fatty liver occurred in 80% of the premenopausal women with two copies of the variant and in 43% of those with only one copy.8 What this means is that a premenopausal woman with two copies of the slowed PEMT variant has essentially the same risk of fatty liver as a postmenopausal woman. It is as if inheriting two copies of the variant effectively shuts off all estrogen-related choline production in the body. If a woman has only a single copy of the slowed PEMT variant, she will still have a roughly 50% chance of liver dysfunction on a low-choline diet. Thus a single copy of the variant is only slightly better than two copies, as at least some estrogen-related choline production is preserved.
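For convenience, the liver-dysfunction rates quoted in the preceding paragraphs can be gathered into one small summary sketch. The group labels are mine; the percentages are simply the ones reported above for the low-choline (<50 mg/day) study.

```python
# Approximate risk of liver dysfunction on a low-choline diet,
# taken from the figures quoted in the text above.
LIVER_DYSFUNCTION_RISK = {
    "men": 0.77,
    "postmenopausal women": 0.80,
    "premenopausal women (overall)": 0.44,
    "premenopausal, two slowed PEMT copies": 0.80,
    "premenopausal, one slowed PEMT copy": 0.43,
}

for group, risk in LIVER_DYSFUNCTION_RISK.items():
    print(f"{group:40s} ~{risk:.0%}")
```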

If a PEMT polymorphism can put one at risk for choline-related diseases like fatty liver, then it is important to know how common these variants are in the population. We know that 74% of all women in the study had a SNP in PEMT that made their PEMT enzyme unresponsive to estrogen.9 This means that only 26% of women can make enough choline on a low-choline diet, and that ability depends on whether the woman is still fertile or has entered menopause. In this way genetics can take away the biological advantage that high estrogen levels usually offer premenopausal females. Women with these PEMT variants will be at risk for choline deficiency and liver damage just like all men and postmenopausal women – two groups who don’t have enough estrogen to make choline regardless of their genes. Because of all this interference from PEMT polymorphisms, dietary choline levels must be optimized for the vast majority of our population.

Summary of PEMT and Choline Deficiency:

  • In humans, choline is only made by the PEMT enzyme
  • Estrogen is required for the PEMT enzyme to activate and function normally
  • Men and postmenopausal women have an elevated risk of choline deficiency due to low estrogen levels.
  • The PEMT enzyme is commonly slowed down by polymorphisms, making it unresponsive to estrogen levels
    • 74% of women have at least one copy of a slowed PEMT
    • Homozygous carriers of the slowed PEMT variant have a much higher risk of choline deficiency
    • Men, postmenopausal women, and premenopausal women with PEMT SNPs need to increase choline intake in the diet to offset elevated risk of liver dysfunction

The takeaway here is that studies have recently shown that, because of common genetic polymorphisms, choline deficiency is a widespread problem. Normally the hormone estrogen allows the body to make choline from scratch. However, genetic variation in the PEMT enzyme, estrogen levels, and gender differences prevent most people from making adequate choline. Realistically, then, the only group in our population protected from choline deficiency is premenopausal females without a single copy of the slowed PEMT gene. Every single male, every single postmenopausal woman, and 74% of premenopausal women require a daily intake of approximately 500 mg of choline to prevent fatty liver, organ damage, and the associated health problems.7 If the body is already depleted, then levels that simply prevent deficiency won’t be enough to replete it. In these cases, higher daily doses of at least 1 gram are needed to replenish the tissues. Choline, it seems, must be obtained from the diet by just about everyone except the few young women who have a normal PEMT gene and can synthesize choline regardless of dietary intake.
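As a rough sketch of what reaching that 500 mg figure from food might look like, here is an illustrative tally. The per-serving choline values are approximate, USDA-style estimates I am assuming, not figures from the text, so treat the total as ballpark only.

```python
# Approximate choline content per serving (assumed estimates, mg).
APPROX_CHOLINE_MG = {
    "large egg": 147,
    "3 oz beef liver": 350,
    "3 oz salmon": 55,
    "1 cup cooked broccoli": 60,
    "1 cup soy milk": 57,
}

# One hypothetical day's choline-rich choices.
day = ["large egg", "large egg", "3 oz salmon", "1 cup cooked broccoli", "1 cup soy milk"]
total = sum(APPROX_CHOLINE_MG[item] for item in day)
print(f"Approximate choline for the day: {total} mg")   # ~466 mg, near the 500 mg target
```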


7 Ways How To Cook Eggs to Maximize Nutrition – Dr. Anthony Gustin

BEST TO WORST WAYS HOW TO COOK EGGS FOR MAXIMUM NUTRITION: 1. SOFT BOILED. We’ll start out with the best way to cook eggs for nutrition: soft boiled. This is when you boil an egg, but it is still a little runny and the yolk is definitely not hard. It might take a little more work than other methods, but soft boiled …

raw yolks as choline source for ‘cetams? – Brain Health – LONGECITY

http://www.longecity.org › LONGECITY › Bioscience, Health & Nutrition › Brain Health

Oct 9, 2005 – I know that uncooked egg yolks are a source of choline (180-215mg?) and phosphatidylcholine. As I understand it, cooking the yolk kind of ‘denatures’ the choline and makes it unavailable. … Whole raw eggs are my source of choline and protein (and other stuff too).

Cooking eggs destroy it’s choline benefits?? : Nootropics – Reddit

Jan 9, 2013 – 8 posts – 5 authors

There is a lot of speculation out there as to whether cooking an egg decreases the amounts of choline it contains. Most of what I see online says that it does. However, I have yet to find any scientific evidence that is the case. In fact, the USDA has a choline fact sheet, which has cooked eggs having slightly …

Eggs, Choline, & Cancer | NutritionFacts.org

Oct 14, 2013 – soft boiled egg contains useable lysine. Boil the egg for another minute to hard state, the lysine is not useable. Lysine is needed for the body to make use of the next 5 most important essential aminoes so that other aminoes can be made by the body. Lysine is denatured at about 110 degrees. As the heat …

The Single Best Way to Eat an Egg – Health Wire

Mar 5, 2015 – If you choose not to eat your eggs raw, poached or softboiled would be the next best option. This leaves the yolk still runny and the … Eggs also contain choline, selenium, biotin, B vitamins, phosphorus, and more, making them one of the healthiest foods you can eat. And, contrary to popular belief, …

5 Ways to Get More Choline in Your Diet: Secret of Radiant Living

The ideal sources of choline are animal foods like egg yolks and liver, which contain the most concentrated amounts of this nutrient and can be easily incorporated into the diet to meet … Veggies from this group, including cauliflower, cabbage, bok choy and broccoli, boast around 65 mg of choline per cup cooked.

Soft-Boiled Science: Egg-cellently Cooked Eggs – Scientific American

Mar 28, 2013 – Hard-boiled eggs are commonly used for dying Easter eggs, but a softboiled egg can make a yummy breakfast or snack. How does … One large egg has about 75 calories, many essential nutrients, lots of high-quality protein, various vitamins, multiple minerals, choline, folate and riboflavin. Eggs can help …

How to Boil Eggs: The Hard Truth About Boiled Eggs – Dr. Mercola

Jun 7, 2014 – Eggs are a phenomenal source of protein, fat, and other nutrients, including choline and the antioxidants lutein and zeaxanthin. … While less “well done” eggs are still preferable (such as poached, soft-boiled, or over easy with very runny yolks), a hard-boiled egg makes a fine snack or source of protein for …

The Only Way to Get the Most Nutrition From Eggs – Waking Times

Feb 21, 2015 – This nutrient loss occurs regardless of whether the egg is removed from the shell (for example, during poaching) or left inside the shell during cooking (for example, during soft or hard boiling). If you compare the nutrient value of one large raw egg to one large hard-boiled egg in the latest version of the U.S. …

Eat Your Eggs! Choline and the Link With Fatty Liver – Fatty Liver Diet …


Eat Your Eggs! Choline and the Link With Fatty Liver. “Stay away from eggs if you want to be healthy. They have all that fat and cholesterol!” If you’re like most people, you began hearing that information in the 70s, 80s or 90s (depending upon your age). …. I eat boiled eggs but not the yolk because I don’t like them.

 

 

Limit choline-rich foods to prevent blood clots

Common Food Nutrient Tied to Risky Blood Clotting


By Amy Norton, HealthDay Reporter

MONDAY, April 24, 2017 (HealthDay News) — A nutrient in meat and eggs may conspire with gut bacteria to make the blood more prone to clotting, a small study suggests.

The nutrient is called choline. Researchers found that when they gave 18 healthy volunteers choline supplements, it boosted their production of a chemical called TMAO.

That, in turn, increased their blood cells’ tendency to clot. But the researchers also found that aspirin might reduce that risk.

TMAO is short for trimethylamine N-oxide. It’s produced when gut bacteria digest choline and certain other substances.

Past studies have linked higher TMAO levels in the blood to heightened risks of blood clots, heart attack and stroke, said Dr. Stanley Hazen, the senior researcher on the new study.

These findings, he said, give the first direct evidence that choline revs up TMAO production in the human gut, which then makes platelets (a type of blood cell) more prone to sticking together.

Choline is found in a range of foods, but it’s most concentrated in animal products such as egg yolks, beef and chicken.

Hazen said he and his colleagues at the Cleveland Clinic wanted to isolate the effects of choline on people’s levels of TMAO and their platelet function. So they studied supplements.

The researchers had 18 healthy adults (10 meat-eaters and eight vegetarians/vegans) take choline supplements for two months.

The supplements provided around 450 milligrams of choline daily — roughly the amount in two or three eggs, Hazen said.

One month in, the study found, the supplements had raised participants’ TMAO levels 10-fold, on average. And tests of their blood samples showed that their platelets had become more prone to clotting.

“This study gives us one of the mechanisms by which TMAO may contribute to cardiovascular disease,” said Dr. J. David Spence.

Spence, who was not involved in the study, directs the Stroke Prevention & Atherosclerosis Research Centre at Western University in London, Ontario, Canada.

For the healthy people in this study, Spence said, the TMAO rise from choline might not be worrisome. But, he added, it might be a concern for people at increased risk of heart disease or stroke.

Spence suggested those individuals limit egg yolks, beef and other foods high in choline.

Hazen had similar advice. “You don’t have to become a vegetarian,” he said. “But you could try eating more plant-based foods, and more vegetarian meals.”

He also pointed to the Mediterranean diet — rich in olive oil, vegetables and fish. In an earlier study, Hazen said, his team found that a compound in olive oil seems to inhibit TMAO formation.

The new study uncovered yet another compound that may counter TMAO: low-dose aspirin.

In a separate experiment, the researchers had some participants take 85 milligrams of aspirin (a baby aspirin) a day, in addition to choline supplements. That, it turned out, lessened the rise in TMAO and the change in platelet activity.

Doctors already prescribe low-dose aspirin to certain people at risk of heart disease and stroke.

It’s possible, Hazen said, that aspirin’s effects on TMAO are one reason it helps ward off cardiovascular trouble.

The current study is small and preliminary. But it’s the latest to suggest that the gut “microbiome” plays a key role in cardiovascular disease, Spence said.

The “microbiome” refers to the trillions of bacteria that dwell in the gut. Spence said researchers are just beginning to understand how gut bacteria and their byproducts affect the cardiovascular system.

But one hope, he said, is to figure out what balance of gut bacteria supports cardiovascular health — and possibly use probiotic (“good” bacteria) supplements to help treat people at high risk of heart disease or stroke.

Spence said his own lab is working on just that.

There are, of course, many factors in heart disease risk — from age to high blood pressure to diabetes to smoking, Hazen pointed out.

“We’re saying a portion of the risk is related to the gut microbiome,” he said.

Hazen and a colleague report potential royalty payments from several companies related to “cardiovascular diagnostics and therapeutics.” One company, Cleveland HeartLab, recently launched a test for measuring TMAO levels.

The findings appear in the April 25 online issue of Circulation.


Choline is an essential nutrient necessary for a wide range of functions from cellular maintenance to creating neurotransmitters. A deficiency in choline often appears as an increase in liver enzymes, and can lead to liver disease, heart disease, and even neurological disorders.

Health benefits of adequate choline include a reduced risk of dementia, cardiovascular disease, and cancer.2 Some studies have shown an increased risk of colon cancer with choline supplements, but natural food sources like those listed below are safe and healthy.

A daily value (DV) has not been established for choline; however, the adequate intake (AI) is 550mg/day for adult men and 425mg/day for adult women. High-choline foods include liver, eggs, cauliflower, mushrooms, dark leafy greens, shellfish, asparagus, Brussels sprouts, bok choy, and fish.


#1: Liver (Beef)

Choline per 100g: 418.2mg (76% AI); per slice (81g): 338mg (62% AI)

Other Liver Products High in Choline (%AI per 100g): Veal (73%), Chicken Liver (59%), and Chicken Liver Pate (42%).

#2: Eggs

Choline per 100g: 293.8mg (53% AI); per cup, chopped (136g): 399.6mg (72% AI); per large egg (50g): 146.9mg (27% AI)

Most of the choline in eggs is contained in the yolk. Fish roe (eggs) is also a good source of choline, providing 9% AI per tablespoon and 17% AI per ounce.

#3: Cauliflower (Raw)

Choline per 100g: 44.3mg (8% AI); per cup chopped, 1/2″ pieces (107g): 47.4mg (9% AI); per medium head, 5-6″ dia. (588g): 260.5mg (47% AI)

Broccoli provides 5% AI per cup cooked



#4: Mushrooms (Cooked Shiitake)

Choline per 100g: 59.4mg (11% AI); per cup sliced (97g): 57.6mg (10% AI); per whole piece (19g): 11.3mg (2% AI)

Other mushrooms high in choline (%AI per cup sliced): Oyster (8%), Portabella (7%), Enoki (6%), Maitake (6%), White Button (4%), and Brown Italian (Crimini) (3%)

#5: Dark Leafy Greens (Beet Greens, Cooked)

Choline per 100g: 42.5mg (8% AI); per cup, 1″ pieces (144g): 61.2mg (12% AI)

Other dark leafy greens high in choline (%AI per cup cooked): Collards (14%), Swiss Chard (9%), and Spinach (6%).

#6: Shellfish (Oysters)

Choline per 100g: 101mg (18% AI); per 3 oz (85g): 85.9mg (15% AI); per 6 medium (59g): 59.6mg (11% AI)

Other Shellfish and Crustaceans High in Choline (%AI per 3oz (85g) serving): Shrimp (21%), Scallops (17%), Crayfish (13%), Crab (13%), and Lobster (13%)

#7: Asparagus (Cooked)

Choline per 100g: 26.1mg (5% AI); per 1/2 cup (90g): 23.5mg (5% AI); per 4 spears, 1/2″ base (60g): 15.7mg (3% AI)


#8: Brussels Sprouts (Cooked)

Choline per 100g: 40.6mg (7% AI); per 1/2 cup (78g): 31.7mg (6% AI); per sprout (21g): 8.5mg (2% AI)

Cooked Cabbage provides 6% AI per cup cooked

#9: Cooked Bok Choy (Chinese Cabbage) (Pak-Choi)

Choline per 100g: 12.1mg (2% AI); per cup, shredded (170g): 20.6mg (3% AI)

#10: Fish (Cod)

Choline per 100g: 79.7mg (14% AI); per 3 oz (85g): 67.8mg (12% AI); per fillet (90g): 71mg (13% AI)

Other fish high in Choline (%AI per 3oz (85g) serving): Salmon (18%), Pollock (14%), Flounder (Sole) (12%), Haddock (12%), Perch (12%)
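To make the percentages above concrete, here is a minimal sketch in Python that tallies a day’s choline intake and expresses it as a share of the adequate intake. The per-serving values and the 550mg/425mg AI figures come from the list above; the food names, dictionary, and function are illustrative only.

```python
# Per-serving choline values quoted above (mg)
CHOLINE_MG = {
    "large egg": 146.9,
    "beef liver, 1 slice": 338.0,
    "oysters, 3 oz": 85.9,
    "cauliflower, 1 cup chopped": 47.4,
    "brussels sprouts, 1/2 cup": 31.7,
}

AI_MG = {"adult man": 550, "adult woman": 425}  # adequate intake, mg/day


def percent_of_ai(servings, who="adult man"):
    """Total choline from the given servings as a percentage of the AI."""
    total_mg = sum(CHOLINE_MG[food] * n for food, n in servings.items())
    return round(100 * total_mg / AI_MG[who], 1)


# Example day: two large eggs plus a cup of chopped cauliflower
print(percent_of_ai({"large egg": 2, "cauliflower, 1 cup chopped": 1}))
# 2 * 146.9 + 47.4 = 341.2 mg  ->  about 62% of a man's 550 mg AI
```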

—————
Connie’s comments: My 85-year-old grandma overate on mussels (seafood) the night before she died of a heart attack. Always eat light meals at dinner time and limit your calories and servings as you age.

Lipid Based Diets Effectively Combat Alzheimer’s in Mouse Models of Disease

There is accumulating evidence showing that lifestyle factors like diet may influence the onset and progression of Alzheimer’s disease (AD). Our previous studies suggest that a multi-nutrient diet, Fortasyn, containing nutritional precursors and cofactors for membrane synthesis, viz. docosahexaenoic acid, eicosapentaenoic acid, uridine-mono-phosphate, choline, phospholipids, folic acid, vitamins B6, B12, C, E, and selenium, has an ameliorating effect on cognitive deficits in an AD mouse model. In the present study we analyzed learning strategies and memory of 11-month-old AβPPswe/PS1dE9 (AβPP/PS1) mice in the Morris water maze (MWM) task performed after nine months of dietary intervention with a control diet or a Fortasyn diet to characterize diet-induced changes in cognitive performance. The Fortasyn diet had no significant effect on MWM task acquisition.

To assess hippocampus-dependent learning, the strategies that the mice used to find the hidden platform in the MWM were analyzed using the swim path data. During the fourth day of the MWM, AβPP/PS1 mice on control diet more often used the non-spatial random search strategy, while on the Fortasyn diet, the transgenic animals exhibited more chaining strategy than their wild-type littermates. During the probe trial, AβPP/PS1 mice displayed no clear preference for the target quadrant. Notably, in both transgenic and nontransgenic mice on Fortasyn diet, the latency to reach the former platform position was decreased compared to mice on the control diet. In conclusion, this specific nutrient combination showed a tendency to improve searching behavior in AβPP/PS1 mice by increasing the use of a more efficient search strategy and improving their swim efficiency by decreasing the latency to reach the former platform position.

Researchers have devised several lipid-based diets aimed at slowing down progression and relieving symptoms of Alzheimer’s disease.

Alzheimer’s disease (AD) is the most common disease underlying memory problems and dementia in the elderly. One of the invariable pathologies in AD is degeneration of cholinergic synapses in the brain cortex and hippocampus. Despite enormous effort to find an effective treatment, current pharmacological interventions are limited to a few drugs that alleviate symptoms but do not slow down the underlying disease processes. These drugs include inhibitors of cholinesterases, the enzymes that degrade the neurotransmitter acetylcholine, and memantine, a modulator of glutamate neurotransmission.

It is generally accepted that lifestyle, and particularly dietary habits, influence mental health and the prevalence and progression of AD. Numerous epidemiological studies have revealed beneficial effects of dietary intake, especially of fish oil, on cognitive decline during aging and dementia.

Within the EU-funded project LipiDiDiet (FP7-211696), which examines the therapeutic and preventive impact of nutritional lipids on neuronal and cognitive performance in aging, Alzheimer’s disease, and vascular dementia, researchers devised several lipid-based diets aimed at slowing down progression and relieving symptoms of AD. Short-term (3-week) feeding of young adult APPswe/PS1dE9 mice (a transgenic mouse model of AD) with experimental diets containing fish oil or stigmasterol reversed the decrease in responsiveness of hippocampal muscarinic receptors to acetylcholine compared to their non-transgenic littermates. Only the fish oil based diet enriched with nutrients supporting neuroprotection (the Fortasyn diet) additionally increased the density of muscarinic receptors and cholinergic synapses in the hippocampus.


These findings yield important proof-of-principle evidence that regular intake of specific dietary components may help to prevent some of the key early functional changes that take place in the Alzheimer brain. These findings support viability of the dietary approach in AD.

ABOUT THIS ALZHEIMER’S DISEASE RESEARCH

Source: Faizan ul Haq – Bentham Science Publishers
Image Source: Image is in the public domain
Original Research: Abstract for “Lipid-Based Diets Improve Muscarinic Neurotransmission in the Hippocampus of Transgenic APPswe/PS1dE9 Mice” by Helena Janickova, Vladimir Rudajev, Eva Dolejsi, Hennariikka Koivisto, Jan Jakubik, Heikki Tanila, Esam E. El-Fakahany and Vladimir Dolezal in Current Alzheimer Research. Published online February 2016 doi:10.2174/1567205012666151027130350


Abstract

Lipid-Based Diets Improve Muscarinic Neurotransmission in the Hippocampus of Transgenic APPswe/PS1dE9 Mice

Transgenic APPswe/PS1dE9 mice modeling Alzheimer’s disease demonstrate ongoing accumulation of β-amyloid fragments resulting in formation of amyloid plaques that starts at the age of 4-5 months. Buildup of β-amyloid fragments is accompanied by impairment of muscarinic transmission that becomes detectable at this age, well before the appearance of cognitive deficits that manifest around the age of 12 months. We have recently demonstrated that long-term feeding of transgenic mice with specific isocaloric fish oil-based diets improves specific behavioral parameters. Now we report on the influence of short-term feeding (3 weeks) of three isocaloric diets supplemented with Fortasyn (containing fish oil and ingredients supporting membrane renewal), the plant sterol stigmasterol together with fish oil, and stigmasterol alone on markers of cholinergic neurotransmission in the hippocampus of 5-month-old transgenic mice and their wild-type littermates. Transgenic mice fed a normal diet demonstrated an increase in ChAT activity and attenuation of carbachol-stimulated GTP-γ35S binding compared to wild-type mice. None of the tested diets, compared to the control diet, influenced the activities of ChAT, AChE, or BuChE, muscarinic receptor density, or carbachol-stimulated GTP-γ35S binding in wild-type mice. In contrast, all experimental diets increased the potency of carbachol in stimulating GTP-γ35S binding in transgenic mice to the level found in wild-type animals. Only the Fortasyn diet increased markers of cholinergic synapses in transgenic mice. Our data demonstrate that even short-term feeding of transgenic mice with chow containing specific lipid-based dietary supplements can influence markers of cholinergic synapses and rectify impaired muscarinic signal transduction that develops in transgenic mice.
