
Friday, April 8, 2011

If food addiction exists, blame the brain -- not the cookies

To the food lovers who can't deny themselves an extra cookie (or 10): The problem may begin in your brain, where, scientists say, chemical surges affect your response to food, much in the way an addict responds to alcohol or drugs.

The idea of food addiction has been around for some time, and a new Yale study gives it a boost. In that research, scientists watched the brain activity of women tantalized, and then rewarded, with a chocolate milkshake. Their neural activity was similar to that of drug addicts, scientists said, as brain imaging showed activity surging in regions that govern cravings and falling off in the centers that curb urges.

High-fat and high-sugar foods tend to trigger the strongest reward responses in the brain, said the researchers, a feature that once would have helped our species survive. In America today, we don’t have as much of a problem finding high-calorie food.

Here, essentially, is what occurs in the brain of a drug abuser. Most abused drugs work by flooding the brain’s circuits with the feel-good chemical dopamine. The feeling is pleasurable, so the body wants to get the high again. But abused drugs overload the brain’s circuits, sometimes with two to 10 times as much dopamine as is derived from natural feel-good activities such as eating and sex. The brain adjusts to the overwhelming amount of dopamine, so the user needs more of the drug to achieve the same result.

Whether the desire for sugar-filled candy is truly similar to the craving to snort another white powder is uncertain. There’s evidence that rats can be addicted to sugar, demonstrating binge eating, withdrawal and craving. To a lesser extent, animals may binge on fatty foods too. That makes sense to anyone who has struggled to put the lid back on a Pringles can. But some psychologists argue that those results don’t apply to humans, and the evidence isn’t strong enough to say for sure that sugar addiction, or any other food addiction, plays a role in obesity and eating disorders.

Further, psychologists themselves are unsure how to think about overeating and, in particular, binge eating. The field has been divided on whether binge eating should be its own psychiatric disorder in the profession's diagnostic manual or included with anorexia and bulimia as an eating disorder. By a 2007 estimate, 2% of men and 3.5% of women experience symptoms of binge eating in their lifetimes.

Whether or not overeating is truly an addiction, some sufferers treat it as such. Overeaters Anonymous is modeled on the 12-step approach of groups like Alcoholics Anonymous to help members recover from compulsive eating. Members are urged to eliminate foods that give them cravings, often processed foods with refined sugar.

If overeating is truly an addiction, that could open the door to interventions that work for drug addicts, such as medication and behavioral changes. But overeating can often be a warning sign of overwhelming stress and larger emotional problems, and simply blaming it on the brain's love of dopamine may mask other problems. Trying to cope with a job loss, stress at work or relationship troubles can all trigger overeating.


Better a Sprint Than a Marathon: Brief Intense Exercise Better Than Endurance Training for Preventing Cardiovascular Disease

ScienceDaily (Apr. 6, 2011) — Exercise is important for preventing cardiovascular disease, especially in children and adolescents, but is all exercise equally beneficial? New research published April 5 in the American Journal of Human Biology reveals that high-intensity exercise is more beneficial than traditional endurance training.

"Cardiovascular disease (CVD) is a leading cause of mortality throughout the world and its risk factors have their origins in childhood," said lead author Duncan Buchan from the University of the West of Scotland. "Our research examines the effects of brief, intense exercise when compared to traditional endurance exercise on the markers of CVD in young people."
Buchan's team recruited a group of volunteer schoolchildren, forty-seven boys and ten girls, and randomly divided the group into moderate-intensity (MOD) and high-intensity (HIT) exercise teams.
The two groups performed three weekly exercise sessions over seven weeks. The HIT group's training consisted of a series of 20-meter sprints performed over 30 seconds. In contrast, the MOD group ran steadily for 20 minutes.
By the end of the study the MOD group had completed 420 minutes of exercise, while the HIT group had trained for just 63 minutes. The estimated energy expenditure for the HIT intervention was 907.2 kcal, compared with 4,410 kcal for the MOD group.
The results revealed that both groups demonstrated improved CVD risk factors. However, total exercise time over the seven weeks was more than six times higher for the MOD group than for the HIT group. Thus, the HIT group achieved significant improvements in CVD risk factors in only 15% of the total exercise time.
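To see where the 15% figure comes from, here is a minimal Python sketch that recomputes the comparisons from the numbers quoted above; the variable names are ours.

```python
# Recompute the time and energy comparisons reported in the study.
# All input values are taken directly from the figures quoted in the article.
mod_minutes, hit_minutes = 420, 63    # total exercise time over seven weeks
mod_kcal, hit_kcal = 4410, 907.2      # estimated energy expenditure

print(f"HIT share of MOD exercise time: {hit_minutes / mod_minutes:.0%}")  # -> 15%
print(f"MOD-to-HIT time ratio: {mod_minutes / hit_minutes:.1f}x")          # -> ~6.7x
print(f"MOD-to-HIT energy ratio: {mod_kcal / hit_kcal:.1f}x")              # -> ~4.9x
```

The 15% figure and the roughly sixfold time ratio are two views of the same comparison; the energy figures show the HIT group also expended only about a fifth of the calories.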
These findings demonstrate that brief, intense exercise is a time-efficient means of improving CVD risk factors in adolescents. Although limited to relatively small samples, the findings demonstrate significant improvements in cardiorespiratory fitness, blood pressure, body composition and insulin resistance in healthy adolescents after a seven-week intervention of different exercise intensities.
"This is the first study to demonstrate the effects of a novel interval training programme on both traditional and novel CVD risk factors in adolescents," concluded Buchan. "Larger scale and extended interventions must be undertaken so that the long term impact and effects of intermittent training programmes on unfavourable metabolic profiles may be investigated further."


What Your Doctor May Not Know About Your Pain Pills

Most people who have chronic pain — a bad back, arthritis, or many other ailments — see their primary care physician for treatment. If ibuprofen doesn't ease the ache, these doctors often prescribe narcotic drugs like Vicodin, Percocet and OxyContin.
Although the drugs, which trace their roots to the opium poppy, reduce pain, they also carry significant risks and can cause breathing to stop in large doses or when mixed with other drugs or alcohol. Yet research shows that many primary care doctors aren't monitoring their patients' use of the medicines to make sure the drugs aren't abused or misused.
It's not an academic issue. More people die from accidental overdoses of prescription opioids annually than they do from cocaine and heroin combined: 11,499 in 2007, according to the Centers for Disease Control and Prevention.
 
Patient monitoring can take many forms, all generally aimed at making sure that patients take only the drugs prescribed to them — and don't share or sell them. Some doctors ask patients to sign "pain contracts" or "opioid treatment agreements" that spell out these measures. But a recent study found that three of the most common strategies to ensure patients comply with their drug regimens are underused by primary care doctors.
The study, published in February in the Journal of General Internal Medicine, examined the medical records of 1,612 chronic pain patients at eight primary care clinics in the Philadelphia area over a five-year period ending in 2008. It found that only 8 percent were given urine tests, half were scheduled for office visits at least once every six months, and 76 percent were restricted from refilling their prescriptions early.
Part of the problem is practical. "It's easy to say that it's useful to do prescription monitoring and urine screening, but building this stuff into day-to-day practice is hard," says Perry Fine, president of the American Academy of Pain Medicine.
The other sticking point is a lack of education, of physicians and the general public, about how to prescribe and take these drugs safely, experts say. "Primary care doctors haven't been taught a lot about pain management," says Penney Cowan, founder and executive director of the American Chronic Pain Association, a patient advocacy group. This leads them to sometimes undertreat pain, on the one hand, or prescribe opioids without proper monitoring, on the other.
"Those who need those drugs should be able to get access," she says. "But if a healthcare provider chooses to give them opoids, then patients need to be educated."


Common Dietary Fat and Intestinal Microbes Linked to Heart Disease

ScienceDaily (Apr. 6, 2011) — A new pathway has been discovered that links a common dietary lipid and intestinal microflora with an increased risk of heart disease, according to a Cleveland Clinic study published in the latest issue of Nature.

The study shows that people who eat a diet containing a common nutrient found in animal products (such as eggs, liver and other meats, cheese and other dairy products, fish, and shellfish) are not predisposed to cardiovascular disease solely by their genetic make-up, but also by how the micro-organisms that live in our digestive tracts metabolize a specific lipid -- phosphatidylcholine (also called lecithin). Lecithin and its metabolite, choline, are also found in many commercial baked goods, dietary supplements, and even children's vitamins.
The study examined clinical data from 1,875 patients who were referred for cardiac evaluation, as well as plasma samples from mice. When fed to mice, lecithin and choline were converted to a heart-disease-forming product by the intestinal microbes, which promoted fatty plaque deposits to form within arteries (atherosclerosis); in humans, higher blood levels of choline and of the heart-disease-forming microbial products are strongly associated with increased cardiovascular disease risk.
"When two people both eat a similar diet but one gets heart disease and the other doesn't, we currently think the cardiac disease develops because of their genetic differences; but our studies show that is only a part of the equation," said Stanley Hazen, M.D., Ph.D., Staff in Lerner Research Institute's Department of Cell Biology and the Heart and Vascular Institute's Department of Cardiovascular Medicine and Section Head of Preventive Cardiology & Rehabilitation at Cleveland Clinic, and senior author of the study. "Actually, differences in gut flora metabolism of the diet from one person to another appear to have a big effect on whether one develops heart disease. Gut flora is a filter for our largest environmental exposure -- what we eat."
Dr. Hazen added, "Another remarkable finding is that choline -- a natural semi-essential vitamin -- when taken in excess, promoted atherosclerotic heart disease. Over the past few years we have seen a huge increase in the addition of choline into multi-vitamins -- even in those marketed to our children -- yet it is this same substance that our study shows the gut flora can convert into something that has a direct, negative impact on heart disease risk by forming an atherosclerosis-causing by-product."
In studies of more than 2,000 subjects altogether, blood levels of three metabolites of the dietary lipid lecithin were shown to strongly predict risk for cardiovascular disease: choline (a B-complex vitamin), trimethylamine N-oxide (TMAO, a product that requires gut flora to be produced and is derived from the choline group of the lipid) and betaine (a metabolite of choline).
"The studies identify TMAO as a blood test that can be used in subjects to see who is especially at risk for cardiac disease, and in need of more strict dietary intervention to lower their cardiac risk," Dr. Hazen said.
Healthy amounts of choline, betaine and TMAO are found in many fruits, vegetables and fish. These three metabolites are commonly marketed as direct-to-consumer supplements, supposedly offering increased brain health, weight loss and/or muscle growth.
These compounds also are commonly used as feed additives for cattle, poultry or fish because they may make muscle grow faster; whether muscle from such livestock has higher levels of these compounds remains unknown.
"Knowing that gut flora generates a pro-atherosclerotic metabolite from a common dietary lipid opens up new opportunities for improved diagnostics, prevention and treatment of heart disease," Dr. Hazen said. "These studies suggest we can intelligently design a heart healthy yogurt or other form of probiotic for preventing heart disease in the future. It also appears there is a need for considering the risk vs. benefits of some commonly used supplements."


How The 'Pox' Epidemic Changed Vaccination Rules

Historian Michael Willrich was planning to write a book about civil liberties in the aftermath of Sept. 11 when he stumbled across an article from The New York Times archives. It was about a 1901 smallpox vaccination raid in New York — when 250 men arrived at a Little Italy tenement house in the middle of the night and set about vaccinating everyone they could find.
"There were scenes of policemen holding down men in their night robes while vaccinators began their work on their arms," Willrich tells Fresh Air's Terry Gross. "Inspectors were going room to room looking for children with smallpox. And when they found them, they were literally tearing babes from their mothers' arms to take them to the city pesthouse [which housed smallpox victims.]"
The vaccination raid was not an isolated incident. As the smallpox epidemic swept across the country, New York and Boston policemen conducted several raids, and health officials across the country ordered mandatory vaccinations in schools, factories and on railroads. In Pox: An American History, Willrich details how the smallpox epidemic of 1898-1904 had far-reaching implications for public health officials — as well as for Americans concerned about their own civil liberties.
"110 years ago, vaccination was compelled by the state," he says. "But there no effort taken by the government to ensure that vaccines on the market were safe and effective. We live in a very different environment today where there are extensive regulations governing the entire vaccine industry."
At the turn of the 20th century, explains Willrich, there was little to no regulation governing the pharmaceutical industry. Many people were forced to receive the vaccine — most of the time against their will.
"There was one episode in Middlesboro, Ky., where the police and a group of vaccinators went into this African-American section of town, rounded up people outside this home, handcuffed the men and women and vaccinated them at gunpoint," says Willrich. "It's a shocking scene and very much at odds with our daily-held notions of American liberty."
People infected with smallpox would also be quarantined against their will in large isolation hospitals called pesthouses.
"People would literally dragged there against their will," he says. "Some of the most poignant scenes are when mothers are fighting with health officials to keep their children in their own homes rather than have them be taken off to a pesthouse. People at the time rightly associated pest houses with death. That's where someone was taken to die."
Resistance To Vaccinations
From the very start of the organized vaccination campaign against smallpox, there was public resistance, says Willrich. The battle between the government and the vocal anti-vaccinators came to a head in a landmark 1905 Supreme Court decision, which upheld the right of a state to order vaccination of its population during an epidemic to protect the people from a devastating disease.
"But at the same time, the Court recognized certain limitations on that power — that this power of health policing was no absolute and was not total and there was a sphere of individual liberty that needed to be recognized," says Willrich. "Measures like this needed to be reasonable and someone who could make a legitimate claim that a vaccine posed a particular risk to them because of their family history or medical history [would not have to be vaccinated.]"
In addition, the Supreme Judicial Court of Massachusetts stipulated that a state couldn't forcibly vaccinate its population.
"[They said,] 'Of course, it would be unconstitutional and go beyond the pale for health officials to forcibly vaccinate anyone because that's not within their power,'" says Willrich. "And I think that's really a shoutout to the Boston health authorities who were employing forcible vaccination all the time in the poorest neighborhoods in the city."
Because so many refused to get vaccinated, there were isolated incidents of smallpox outbreaks in the United States until 1949, says Willrich. It wasn't until 1972 that the U.S. government decided to stop mandatory vaccination against smallpox, in part because the disease had been largely eradicated.
The Current Anti-Vaccine Controversy
In 1998, the British medical journal The Lancet published a report by Dr. Andrew Wakefield that suggested that there might be a link between autism and the measles, mumps and rubella (MMR) vaccine.
"This paper was thoroughly discredited and debunked but the idea that vaccines might somehow be the cause of autism stuck," says Willrich. "And so, according to some of the most recent studies, something like one-fifth of all American parents believe that vaccines cause autism. This is simply not true. But it's a powerful association in the public mind."
Wakefield is no longer allowed to practice medicine in England, and The Lancet withdrew the study in 2010. In January 2011, the British Medical Journal said that the study wasn't just wrong — it was "a deliberate fraud" that altered key facts to support the link between vaccinations and autism.
Even though the study was discredited, many people continue to believe in a link between vaccinations and autism, says Willrich.
"[In 2003,] according to the CDC, there was something like 22 percent of American parents of young children were refusing one or more vaccines for their children," he says. "Five years later, that percentage had nearly doubled to about 40 percent of all Americans. So the vaccine controversy today is one of the most important public health crises we face in America."
And, he says, public health officials can and should do more to inform the public that the American Academy of Pediatrics, the American Medical Association and the CDC all believe that vaccines are safe.
"I think this is the time for doubling their efforts to spread the good word about vaccines and also have a candid public discussion about the risks and benefits," he says. "There's no more opportune moment than the present to launch a new publicity campaign around vaccines. ... Viruses spread in human populations from person to person and if you have a vast majority of a community vaccinated against that virus, the virus will simply never have a toehold in that community."


Tuesday, April 5, 2011

Feline Blood Typing Methods

Seth M, Jackson KV, Giger U: Comparison of five blood-typing methods for the feline AB blood group system, Am J Vet Res 72:203, 2011.

The AB blood group system is the most clinically important blood group system in cats. Life-threatening hemolytic reactions can occur because of an A-B mismatched breeding or transfusion. Very strong naturally occurring anti-A alloantibodies in the plasma of type B cats older than 3 months also make it critical to differentiate type B cats from type A and AB cats. The prevalence of type B blood ranges from 0% in Siamese to approximately 50% in breeds such as the Turkish Van and Turkish Angora. It is therefore crucial that veterinary practitioners accurately and swiftly determine a cat’s blood type. The present study compared five feline AB typing methods for ease of use and accuracy. A total of 490 anticoagulated blood samples from sick and healthy cats were submitted. The tube agglutination assay (TUBE) was the gold standard against which the card agglutination (CARD), immunochromatographic cartridge (CHROM), conventional slide (SLIDE), and gel-based (GEL) assays were compared. Among the point-of-care typing assays, CARD had 91% agreement (53 of 58) and CHROM had 95% agreement (55 of 58). Among the laboratory typing assays, GEL had 99% agreement (487 of 490) and SLIDE also had 99% agreement (482 of 487). Four samples with discordant test results came from cats with FeLV-related anemia. The results indicate that current laboratory and in-clinic methods provide simple and accurate typing for the feline AB blood group system with few discrepancies. The authors recommended retyping with the GEL or TUBE laboratory methods after in-clinic typing. [VT]
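The agreement figures are easy to recompute from the raw counts in the abstract. Here is a minimal Python sketch; the counts are from the article, while the dictionary structure and names are ours.

```python
# Recompute each method's agreement with the gold-standard TUBE assay
# from the raw counts quoted in the abstract.
agreement_counts = {
    "CARD":  (53, 58),    # point-of-care card agglutination
    "CHROM": (55, 58),    # point-of-care immunochromatographic cartridge
    "GEL":   (487, 490),  # laboratory gel column assay
    "SLIDE": (482, 487),  # laboratory slide assay
}

for method, (concordant, total) in agreement_counts.items():
    print(f"{method}: {concordant}/{total} = {concordant / total:.0%} agreement with TUBE")
```

Note that the denominators show the point-of-care methods were compared on a 58-sample subset, while the laboratory methods were evaluated on nearly the full 490 samples.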

Related articles:
Proverbio D, Spada E, Baggiani L et al: Comparison of gel column agglutination with monoclonal antibodies and card agglutination methods for assessing the feline AB group system and a frequency study of feline blood types in northern Italy, Vet Clin Pathol 40:32, 2011.

More on cat health: Winn Feline Foundation Library


Monday, April 4, 2011

Wheat and Schizophrenia

Schizophrenia is an unfortunate disease of the brain. A progressive disorder, it often presents with social withdrawal, paranoia, hearing voices, that sort of thing. After quite a while (sometimes decades) you get a kind of "burnout" effect where the voices and whatnot lessen, but the afflicted is left with all the negative symptoms of social withdrawal, thought blocking, and an inexpressiveness known as "flat affect." MRI of the brain will show "large ventricles" at this point, meaning cell death (brain damage) has caused the active, lively part of the brain to shrink. You'll see schizophrenia in any large public park in any major city. If you ask the guy on the bench that everyone is avoiding if he wants something to eat, and he answers with paranoid meaningless word salad, that's schizophrenia, most likely. He had parents, brothers, sisters, maybe even a college degree. Even if he wanted to stay in a treatment facility or group home, in most places there aren't enough spots for all the mentally ill, so many end up homeless or in jail. A tough road for someone with an organic brain disease.

Most of the research on schizophrenia is focused on the neurotransmitters dopamine, acetylcholine, and histamine and genetic polymorphisms of transporters and receptors. The usual questions are asked about ineffective brain chemistry. The usual treatment is neuroleptic medication (hopefully decreases excess dopamine in the right place and leaves it well enough alone in other corners of the brain). And I've seen medicine do a decent job of clearing up the psychosis symptoms many times. Medicine tends to have pretty serious side effects, though, so a big push in research these days is to identify those folks at high risk for schizophrenia before it happens, hopefully to prevent the illness in the first place through various means. Often those means include more medications - but with Big Pharma funding many studies, those are the solutions that are found.



One intrepid researcher, F. Curtis Dohan, spent much of his career chasing an unlikely suspect in the pathogenesis of schizophrenia: wheat. His fascinating paper, Genetic Hypothesis of Idiopathic Schizophrenia: Its Exorphin Connection, can be found in free full text via the link.

Anyway, there's a funny thing about schizophrenia: it turns out that quite a few of the adult schizophrenics on an inpatient psychiatric unit in 1967 happened to have a major history of celiac disease (gluten/wheat intolerance) as children. As in 50-100 times the amount of celiac disease one would expect by chance. Celiac doctors also noticed their patients were schizophrenic about 10X as often as the general population. That's a lot! In addition, epidemiological studies of Pacific Islanders and other populations showed a strong, dose-dependent relationship between grain intake and schizophrenia. The gluten-free populations had an extremely rare occurrence of schizophrenia - just 2 in 65,000 versus about 1 in 100 as we have in the grain-eating West. When populations Westernized their diets (flour, sugar, and beer), schizophrenia became common. In some clinical trials, gluten made new-onset acutely ill schizophrenics much worse, but only occasional long-term patients responded to gluten restriction. The long-term sufferer has already had a lot of damage - if wheat is somehow toxic to the brain, then it would be vital to stop the insult early in the course of the disease to see improvement.

National Institutes of Health investigators looked for poisonous protein fragments derived from gluten, gliadin (wheat proteins), and casein (a milk protein). They found them - potent opiate (yes, opiate as in morphine. Or heroin) analogs they called "exorphins." Many of these studies were done in rats, and the results are very creepy if you are fond of bread and milk (or rats). Turns out, you take wheat gluten, add stomach enzymes, and you end up with fragments of proteins that are potent opiates (1). The cute thing is these fragments aren't digested by the small intestine and definitely end up in the body and brain of rats that are fed gluten orally. Inject these same proteins directly into the brains of poor unfortunate rats, and you get rat seizures. Interestingly, people with schizophrenia seem to have a lot of these opioid-like small gluten-derived peptides in their urine. Way more than people without schizophrenia.

Let me review what is perhaps the most important part of the Dohan paper - a gluten-free diet definitely improved some of the new-onset schizophrenics on the inpatient unit. Not all of them. But 2 out of 17 or so. Putting the wheat back made the affected patients a lot worse. In another study, 115 patients on a locked ward were all given a gluten-free, milk-free diet. They were released into the community on average twice as fast as similar patients on another, diet-as-usual ward (p=.009). It is of note that repeat studies didn't show the same thing, but instead of 17 or 115 patients, these studies had 4 or 8 patients, and they studied people who had schizophrenia for many years, where much damage was already done.

Historically, prior to WWII, when grain consumption was super-high and neuroleptics (those medications, as you recall, which affect brain dopamine levels and are used to treat schizophrenia) did not yet exist, there were reports of schizophrenics having marked and unexplained fluctuations in weight and gut symptoms, poor iron absorption just like celiac sufferers, and "post-mortem abnormalities like those subsequently discovered in celiac patients." Why aren't these found now? Well, Dohan contended that a side effect of these neuroleptic medications is that they decrease the permeability of the gut. Meaning gluten may not be able to weasel through quite so easily.

Which raises the question: is that the side effect? Or perhaps the principal effect? Who knows? Dohan's paper was published in 1988 and ended with some ideas about how to study the question further (such as by feeding identical twins of schizophrenics a high-gluten diet to see what happens - somehow I don't think that experimental design would pass an institutional review board nowadays). Well, nothing much happened research-wise until around 2005, but what has been discovered is interesting. There is no "smoking gun," but there certainly is a lot of smoke.

In Markers of Gluten Sensitivity and Celiac Disease in Recent-Onset Psychosis and Multi-Episode Schizophrenia, it was found that individuals with recent-onset psychosis and with multi-episode schizophrenia who have increased antibodies to gliadin may share some immunologic features of celiac disease, but their immune response to gliadin differs from that of celiac disease.

In this very clever work by Samaroo, Dickerson, and colleagues, published as Novel immune response to gluten in individuals with schizophrenia, immune responses and celiac disease biomarkers were tested in schizophrenics. It turns out that schizophrenics tended to have a lot of anti-wheat antibodies floating around in their systems, but these antibodies were nearly entirely different from the ones that people with celiac disease have. That means that the usual tests for gluten issues, the tests for celiac disease, wouldn't come up positive in schizophrenics, even though they have unusual immune reactions to wheat.

In A Case Report of the Resolution of Schizophrenic Symptoms on a Ketogenic Diet, a high-fat, low-carb, low-protein diet (and thus one very low in wheat) resulted in the remission of psychotic symptoms in a single patient.
Schizophrenia is an unfortunate disease of the brain. A progressive disorder, it often presents with social withdrawal, paranoia, hearing voices, that sort of thing. After quite a while (sometimes decades) you get a kind of "burnout" effect where the voices and whatnot lessen, but the afflicted is left with all the negative symptoms of social withdrawal, thought blocking, and an inexpressiveness known as "flat affect." MRI of the brain will show "large ventricles" at this point, meaning cell death (brain damage) has caused the active, lively part of the brain to shrink. You'll see schizophrenia in any large public park in any major city. If you ask the guy on the bench that everyone is avoiding if he wants something to eat, and he answers with paranoid meaningless word salad, that's schizophrenia, most likely. He had parents, brothers, sisters, maybe even a college degree. Even if he wanted to stay in a treatment facility or group home, in most places there aren't enough spots for all the mentally ill, so many end up homeless or in jail. A tough road for someone with an organic brain disease.

Most of the research on schizophrenia is focused on the neurotransmitters dopamine, acetylcholine, and histamine and genetic polymorphisms of transporters and receptors. The usual questions are asked about ineffective brain chemistry. The usual treatment is neuroleptic medication (hopefully decreases excess dopamine in the right place and leaves it well enough alone in other corners of the brain). And I've seen medicine do a decent job of clearing up the psychosis symptoms many times. Medicine tends to have pretty serious side effects, though, so a big push in research these days is to identify those folks at high risk for schizophrenia before it happens, hopefully to prevent the illness in the first place through various means. Often those means include more medications - but with Big Pharma funding many studies, those are the solutions that are found.



One intrepid researcher, F. Curtis Dohan, spent a lot of his career chasing an unlikely suspect in the pathogenesis of schizophrenia, wheat. His fascinating paper, Genetic Hypothesis of Idiopathic Schizophrenia: It's Exorphin Connection, can be found in free full text via the link.

Anyway, there's a funny thing about schizophrenia, turns out that quite a few of the adult schizophrenics on an inpatient psychiatric unit in 1967 happened to have a major history of celiac disease (gluten/wheat intolerance) as children. As in 50-100 times the amount of celiac disease that one would expect by chance. Celiac doctors also noticed their patients were schizophrenic about 10X as often as the general population. That's a lot! In addition, epidemiological studies of Pacific Islanders and other populations showed a strong, dose-dependent relationship between grain intake and schizophrenia. The gluten-free populations had extremely rare occurrence of schizophrenia - just 2 in 65,000 versus about 1 in 100 as we have in the grain-eating West. When populations Westernized their diets (flour, sugar, and beer), schizophrenia became common. In some clinical trials, gluten made new-onset acutely ill schizophrenics much worse, but only occasional long-term patients responded to gluten restriction. The long-term sufferer has already had a lot of damage - if wheat somehow toxic to the brain, then it would be vital to stop the insult early on in the course of the disease to see improvement.

National Institutes of Health investigators looked for poisonous protein fragments derived from gluten, gliadin (wheat proteins), and casein (a milk protein). They found them - potent opiate (yes, opiate as in morphine. Or heroin) analogs they called "exorphins." Many of these studies were done in rats, and the results are very creepy if you are fond of bread and milk (or rats). Turns out, you take wheat gluten, add stomach enzymes, and you end up with fragments of proteins that are potent opiates (1). The cute thing is these fragments aren't digested by the small intestine and definitely end up in the body and brain of rats that are fed gluten orally. Inject these same proteins directly into the brains of poor unfortunate rats, and you get rat seizures. Interestingly, people with schizophrenia seem to have a lot of these opioid-like small gluten-derived peptides in their urine. Way more than people without schizophrenia.

Let me review what is perhaps the most important part of the Dohan paper - a gluten-free diet definitely improved some of the new-onset schizophrenics on the inpatient unit. Not all of them. But 2 out of 17 or so. Putting back the wheat made the affected a lot worse. In another study, 115 patients on a locked ward were all given a gluten free milk free diet. They were released into the community on average twice as fast as the similar patients on another, diet as usual ward (p=.009). It is of note that repeat studies didn't show the same thing, but instead of 17 or 115 patients, these studies had 4 or 8 patients, and these were studies of people who had schizophrenia for many years, where much damage was already done.

Historically, prior to WWII, when grain consumption was super-high and neuroleptics (those medications, as you recall, which affect brain dopamine levels and are used to treat schizophrenia) did not yet exist, there are reports of schizophrenics having marked and unexplained fluctuations in weight and gut symptoms, poor iron absorption just like celiac sufferers, and "post-mortem abnormalities like those subsequently discovered in celiac patients." Why aren't these found now? Well, Dohan contended that a side effect of these neuroleptic medications is that they decrease the permeability of the gut. Meaning gluten may not be able to weasel through quite so easily.

Which begs the question, is that the side effect? Or perhaps the principle effect? Who knows? Dohan's paper was published in 1988 and ended with some ideas about how to study the question further (such as by feeding identical twins of schizophrenics a high gluten diet to see what happens - somehow I don't think that experimental design would pass an institutional review board nowadays.) Well, nothing much happened research-wise until around 2005, but what has been discovered is interesting. There is no "smoking gun," but there certainly is a lot of smoke.

In Markers of Gluten Sensitivity and Celiac Disease in Recent-Onset Psychosis and Multi-Episode Schizophrenia it was found that individuals with recent-onset psychosis and with multi-episode schizophrenia who have increased antibodies to gliadin may share some immunologic features of celiac disease, but their immune response to gliadin differs from that of celiac disease.

In this very clever work done by Samaroo and Dickerson et al, published as Novel immune response to gluten in individuals with schizophrenia, immune responses and celiac disease biomarkers were tested in schizophrenics. It turns out that schizophrenics tended to have a lot of anti-wheat antibodies floating around in their systems, but these antibodies were nearly entirely different from the ones that people with celiac disease have. That means that the usual test for gluten issues, the tests for celiac, wouldn't come up positive in schizophrenics, even though they have unusual immune reactions to wheat.

In A Case Report of the Resolution of Schizophrenic Symptoms on a Ketogenic Diet, a high fat, low carb, low protein diet (thus very low in wheat) results in the remission of psychotic symptoms in a single case report.


Despite Assurances on Milk, Radiation Fear Lingers

SAN FRANCISCO — By scientific standards, the radiation found over the last week in batches of milk on the West Coast was minuscule and, moreover, not dangerous to humans.

But the mere mention of any contamination in that most motherly of beverages still stirred concern in people like Marilyn Margulius, an interior designer from Berkeley, Calif., who called her daughter on Thursday and told her not to let her 10-year-old son drink milk.

“There is a big trust issue with this,” said Ms. Margulius, 71, who was shopping at a Whole Foods Market in Berkeley.

“The health department does not want people to panic. Milk is probably O.K., but who the heck knows?”

There have been repeated public assurances this week — officials said an adult would need to drink thousands of liters of the milk containing radiation at the levels found so far before it would be remotely dangerous. Officials also tried to tamp down anxiety from dairy farmers concerned about bad press.

“I’ve had members call to ask whether we’ve seen the media, and media calling to ask how this is impacting our members,” said Michael Marsh, the chief executive of Western United Dairymen, the milk industry’s West Coast trade association. Mr. Marsh said he had repeated the assurances given by officials, but he also understood the fears in the supermarket’s refrigerated aisle.

“Consumers, doubtless, when they hear about something like this, the cautionary principle kicks in,” he said. “Even if you’d have to drink an oil tanker of this.”

Dr. Elizabeth N. Pearce, an associate professor of medicine at the Boston University School of Medicine and an expert on the thyroid, where radioactive iodine is often absorbed, said there was no health risk, even for small children, from milk on American shelves.

Still, she said, “If people feel safer not drinking it, there are no long-term health effects from abstaining for two weeks, either.”

The alarm was sounded on Wednesday, when federal officials announced that tests had detected a trace amount of iodine 131 — a radioactive byproduct released by leaks at the Fukushima Daiichi nuclear plant in Japan — in a sample taken on March 25 in Spokane, Wash. The level of radiation was tiny and would have to be more than 5,000 times higher to reach the “intervention level” set by federal officials.
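To put those numbers in perspective, here is a minimal back-of-the-envelope Python sketch. The measured concentration, intervention level, and dose coefficient below are illustrative assumptions roughly in line with figures reported at the time, not data taken from this article:

    # All values are assumptions for illustration.
    measured = 0.034      # Bq/L, assumed measured I-131 concentration in milk
    limit = 170.0         # Bq/kg, approximate FDA Derived Intervention Level
    dose_coeff = 2.2e-8   # Sv/Bq, approximate ICRP adult ingestion coefficient

    # Milk is roughly 1 kg per liter, so Bq/kg and Bq/L are comparable here.
    print(f"Margin below the intervention level: ~{limit / measured:,.0f}x")

    # Effective dose from a liter of this milk a day for two weeks:
    dose_usv = measured * 1 * 14 * dose_coeff * 1e6  # in microsieverts
    print(f"Two weeks of milk: ~{dose_usv:.3f} uSv "
          "(natural background is roughly 8 uSv per day)")

Under these assumptions, two weeks of drinking the milk works out to around a hundredth of a microsievert, a tiny fraction of a single day of natural background radiation.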

Jason Kelly, a spokesman for the Washington State Department of Agriculture, said the positive sample came from a gallon of pasteurized whole milk produced at the Darigold plant in Spokane, which processes milk from a number of farms in Washington and Idaho.

Jim H. Klein, a spokesman for Darigold, said milk from the Spokane plant was distributed in the Northwest, but said there was no reason for concern.

Some people even saw a potential silver lining in their bottom line. Mike Vieira, owner of a small dairy west of Spokane that produces about 250 gallons of milk a day, said the scare might be good for his business because customers know where his milk comes from, as opposed to milk from larger dairies that is pooled at a processing plant.

“We’ve had regular customers come to the farm buying milk,” he said. “And they’re buying six or eight gallons.”

The California health department also confirmed Wednesday that it had detected a tiny amount of radioactive iodine in a sample collected Monday from a dairy in San Luis Obispo County on the state’s Central Coast.

Milk in San Luis Obispo is regularly tested because the Diablo Canyon nuclear power plant is on the county’s southern coast. Officials said that monitoring — done weekly since the crisis began — was responsible for detecting the contamination.

Dr. Penny Borenstein, the county’s public health officer, said that the tests would continue and that dairy products would continue to be safe. “The situation in Japan continues to evolve, but we are still 6,000 miles away.”


Highly anticipated Scoliosis Exercise DVD by Dr. Kevin Lau set for release March 2, 2011!



Sunday, April 3, 2011

Schizophrenia Does Not Shrink Brains – Antipsychotics Do: JAMA Study

Schizophrenia is not to be taken lightly, and sufferers often lead difficult lives ruled by the ongoing battles in their minds. Psychiatrists have labeled schizophrenia a “brain disease” capable of shrinking the mass of the brain, and that label has created a venue for manufacturing pharmaceutical chemical cocktails as treatment. Now there is evidence of the futility and damage of such claims: a new study by researchers at the University of Iowa Carver College of Medicine shows that schizophrenia itself does not damage brain tissue. Instead, it is the prescribed antipsychotics that cause this physical loss.
The most concerning victims of these chemical cocktails are children. Held captive by bogus labels of mental disease, this vulnerable population – our future – is being medicated at younger and younger ages. Recent data compiled by IMS Health show that 24 million U.S. children are on ADHD medications, almost 10 million are on antidepressants, and another six and a half million are on antipsychotics. No doubt, the medication of children has become quite an industry.[1]
The research is clear: the longer a person is on the drugs and the higher the dose, the more damage is done, damage that surpasses anything attributable to substance abuse. The outrage of this ongoing medical maiming is further evidence of the need to stop the trend of drugging rather than healing.
~Health Freedoms
One of psychiatry’s favorite claims is that schizophrenia is a “brain disease”. By that, they mean it’s caused by a physically defective brain. They try to prove it by claiming that the disease causes the brain to shrink. Of course, they make this claim without a shred of proof, and the fact that it’s an invention is now demonstrated by a study published in the Archives of General Psychiatry, an American Medical Association journal.
Here’s how authors of “Long-term Antipsychotic Treatment and Brain Volumes” described the reason for their study:
Progressive brain volume changes in schizophrenia are thought to be due principally to the disease. However, recent animal studies indicate that antipsychotics, the mainstay of treatment for schizophrenia patients, may also contribute to brain tissue volume decrement. Because antipsychotics are prescribed for long periods for schizophrenia patients and have increasingly widespread use in other psychiatric disorders, it is imperative to determine their long-term effects on the human brain.
So Beng-Choon Ho and his fellow researchers at the University of Iowa Carver College of Medicine investigated, studying 211 patients diagnosed with schizophrenia. All had received MRIs soon after diagnosis, with an average of 3 scans each over 7.2 years. The researchers examined brain volume changes over time, testing how illness duration, antipsychotic treatment, illness severity, and substance abuse each related to brain shrinkage.
Here are the results in a nutshell (a sketch of this kind of analysis follows the list):
  • The longer a patient was on antipsychotics, the more the brain shrank.
  • The more antipsychotics a patient was given, the more the brain shrank.
  • The severity of illness had little or no effect on brain shrinkage.
  • Substance abuse had little or no effect on brain shrinkage.
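As a rough illustration of how a longitudinal analysis like this can be run, here is a minimal Python sketch using a mixed-effects model. The file name, column names, and model form are assumptions for illustration, not the authors' actual code:

    # Minimal sketch: repeated brain-volume measurements per patient, modeled
    # against antipsychotic exposure, illness duration, severity, and
    # substance abuse. "scans.csv" and all column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("scans.csv")  # one row per MRI scan

    model = smf.mixedlm(
        "brain_volume ~ years_on_antipsychotics + mean_daily_dose"
        " + illness_duration + severity + substance_abuse",
        data=df,
        groups=df["patient_id"],  # repeated scans are nested within patients
    )
    print(model.fit().summary())

A random intercept per patient accounts for each person contributing several scans, so exposure effects are estimated within as well as between patients.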
Psychiatrists and pharmaceutical manufacturers have invented the claim that schizophrenia is a brain disease and that it shrinks the brain. They have trotted out pictures from brain imaging to “prove” that people they’ve diagnosed with schizophrenia need to take their brain-destroying chemicals—all the while ignoring the simple fact that the pictures were of brains that had been treated with those chemicals.
National Institute of Mental Health’s Policy
It’s interesting to note that although the study was funded by the National Institute of Mental Health (NIMH), an agency that seems to exist for the promotion of psychiatric drugs, the funder had no involvement in the manuscript’s review or approval. Considering how strongly NIMH advocates antipsychotic use in schizophrenia, among a myriad of other conditions, this detail is important. Frankly, I do not believe the study would ever have been produced if NIMH had had any control over how it was reported.
NIMH’s official policy is so antipsychotic-slanted that it states non-drug treatment should be offered only to patients who are already receiving antipsychotics. They believe that only those “who are already stabilized on antipsychotic medication” should even be considered for psychosocial treatment.(1) In other words, according to the NIMH, the problem is physical and must be treated with chemicals.
Antipsychotic Treatment of Children for an Invented Disorder
Children—people whose brains are supposed to be developing—are given antipsychotics! Worse, as with many other nonexistent diseases, psychiatry has invented another “pre” disorder: early-onset schizophrenia spectrum (EOSS) disorder. Yes, since children rarely become schizophrenic, they’ve invented pre-schizophrenia. Of course, the treatment of choice is antipsychotics.
What is happening to these children’s developing brains? How are their lives circumscribed by such brutality?
The invention of a nonexistent brain disorder has resulted in psychiatric researchers speculating that “a static lesion occurring in the perinatal period could negatively interact with developmental changes that occur much later, closer to young adulthood, when onset of the disorder typically occur”.(2) Based on absolutely nothing but wishful thinking, they postulated the existence of a brain defect that results in a nonexistent “pre” disorder that they’ve named EOSS.
And they call themselves scientists! They expect us to take them seriously—so seriously that we’re supposed to turn our children over to them to stunt their brains while they call it therapy.
Evasion of Testing Whether Antipsychotics Have Harmful Effects
According to research published in the American Journal of Psychiatry back in 2000, studies have found brain damage, that is, atrophied brain tissue, in patients with schizophrenia, but they have not investigated whether it occurred in the absence of antipsychotic drugs.
Psychiatry has hidden behind the claim that their antipsychotics are so effective and so safe that it would be wrong to use control groups in tests. The authors of the JAMA brain shrinkage study even stated in their report:
The current study could have been strengthened by having control groups, eg, schizophrenia patients assigned to deferred or no antipsychotic treatment or healthy volunteers treated with antipsychotics for comparable periods. However, ethical standards in human subject research prohibit such comparison groups.
They have no problem dividing people into drug and no-drug groups when testing for the efficacy of a chemical believed to be lifesaving, as in cancer drugs, but they claim that it’s unethical to do the same thing in people with schizophrenia! (This is the same excuse given for not testing vaccine efficacy.)
Whenever modern medicine claims that it would be unethical to study the effects of a drug or treatment against a placebo group, it is claiming that the treatment’s efficacy and relative lack of adverse effects are so dramatic that withholding it from a placebo group would be a moral offense. It’s an argument used only when there is no desire to actually find out whether the treatment is either effective or safe.
There is only one moral offense in this sordid tale. It’s the one committed by doctors, Big Pharma, and pseudo scientists who play these games with people’s lives. Children’s entire lives are distorted, disabled, diminished, and devastated by these vicious games. Adults are treated as nothing more than repositories of their profitable drugs and treatments, no matter how much harm is done. Shrinking people’s brains is an acceptable cost of scooping up profits and fees. The only real trick is to hide it from the public for as long as possible.
By: Heidi Stevenson


Artificial Dye Safe to Eat, Panel Says

WASHINGTON — There is no proof that foods with artificial colorings cause hyperactivity in most children and there is no need for these foods to carry special warning labels, a government advisory panel voted Thursday. 

The Food and Drug Administration convened the expert panel after agency scientists for the first time decided that while typical children may be unaffected by the dyes, those with behavior problems may see their symptoms worsen after eating food with synthetic color additives.
The debate over artificial dyes began in the 1970s when Dr. Benjamin Feingold, a pediatric allergist from California, had success treating the symptoms of hyperactivity in some children by prescribing a diet that, among other things, eliminated foods with artificial coloring.
But once the agency conceded that some children might be negatively affected by the foods, it had to decide what to do. The Center for Science in the Public Interest, an advocacy group, petitioned the agency to ban the dyes or, at the very least, mandate warnings that foods containing the dyes cause hyperactivity in children. Major food manufacturers staunchly defended the safety of artificial dyes and said no bans or warnings were needed.
Artificial coloring is present in popular products like Froot Loops cereal, Jell-O, Life Savers candy and Hostess Twinkies.
The F.D.A. did not ask the committee about a ban, and the committee voted 8 to 6 that even a warning was not needed.
The Grocery Manufacturers Association hailed the votes. “We agree with today’s F.D.A.’s advisory committee finding which determined that there is insufficient evidence of a causal link between artificial colors and hyperactivity in children,” it said.
Dr. Michael Jacobson, executive director of the advocacy group, said he was disappointed but pleased that the debate about the safety of artificial colorings had been renewed. “At least the F.D.A. is now acknowledging that dyes affect some children,” he said. “That’s a big change.”