Monday, 10 June 2013

Millions on Verge of Diabetes Don't Know It

THURSDAY, March 21 (HealthDay News) -- Only 11 percent of the estimated 79 million Americans who are at risk for diabetes know they are at risk, federal health officials reported Thursday.
The condition, known as prediabetes, describes higher-than-normal blood sugar levels that put people in danger of developing diabetes, according to the U.S. Centers for Disease Control and Prevention.
"We have a huge issue with the small number of people who know they have it. It's up a bit from when we measured it last, but it's still abysmally low," said report author Ann Albright, director of the CDC's Division of Diabetes Translation.
"We need people to understand their risk and take action if they are at risk for diabetes," Albright said. "We know how to prevent type 2 diabetes, or at least delay it, so there are things people can do, but the first step is knowing what your risk is -- to know if you have prediabetes."
Things that put people at risk for prediabetes include being overweight or obese, being physically inactive and not eating a healthy diet, Albright said. These people should see their doctor and have their blood sugar levels checked, she said.
There is also a genetic component, Albright said, which is why having a family history of diabetes is another risk factor. "Your genetics loads the gun, then your lifestyle pulls the trigger," she said.
According to the report, published in the March 22 issue of the Morbidity and Mortality Weekly Report, the lack of awareness of prediabetes was the same across the board, regardless of income, education, health insurance or access to health care.
One expert found the numbers troubling.
"People don't know about prediabetes, they don't exercise, they don't eat appropriate foods and we are going to have many more diabetics in the near future than we have now," said Dr. Spyros Mezitis, an endocrinologist at Lenox Hill Hospital in New York City.
The danger of prediabetes is that it can progress to full-blown diabetes, with all the complications that condition entails, including heart, kidney, circulation and vision problems.
Albright noted that 30 percent or more of those with prediabetes will develop diabetes over the course of a decade.
The number of Americans with diabetes is already staggering. According to the American Diabetes Association, 25.8 million children and adults in the United States -- 8.3 percent of the population -- have diabetes.
"The good news is we know there are things you can do to prevent or delay the development of type 2 diabetes," Albright said. "You can prevent or delay diabetes if you lose 5 percent to 7 percent of your body weight and get 150 minutes of physical activity a week."
Another expert said it starts with what you eat.
Eating a healthy diet that limits sugars and carbohydrates is important, said Dr. Joel Zonszein, director of the Clinical Diabetes Center at Montefiore Medical Center in New York City.
Exercise and diet can reduce the risk of diabetes by about 58 percent, he said, and "giving the drug metformin can reduce the risk by 31 percent. Lifestyle changes, together with metformin, which the American Diabetes Association recommends for prediabetes, will be very effective."

Sunday, 9 June 2013

Diabetes cases continue to climb in Georgia

The Atlanta Journal-Constitution
The number of Georgians with diabetes continues to rise, jumping 145 percent from 1995 to 2010, according to a new Centers for Disease Control and Prevention study.
In Georgia, 9.8 percent of residents said they had diabetes in 2010, compared with 4.0 percent in 1995, said Linda Geiss, lead author of the study, released Thursday.
Only Oklahoma and Kentucky saw a bigger jump in the number of cases during that 15-year period, according to the study.
Geiss said Southern states overall have the highest rates of diabetes in the nation because of residents’ sedentary lifestyles and poor eating habits.
Diabetes is a chronic disease resulting from the body’s inability to process sugar. It is the nation’s seventh-leading cause of death, according to the CDC. It’s also a leading cause of kidney failure, nontraumatic lower-limb amputations, new cases of blindness, heart disease and stroke, according to the CDC.
The new study is based on telephone surveys of at least 1,000 adults in each state in 1995 and 2010.
Mississippi had the highest obesity rate and the highest diabetes rate, the CDC reported. Nearly 12 percent of Mississippians say they have diabetes, compared with the national average of 7 percent.
“Unfortunately, people are still eating too much and exercising too little,” said Michael Gault, executive director of the American Diabetes Association for the Atlanta region. His organization offers numerous programs to raise awareness of the disease and to help people who have developed diabetes.
The association’s Tour de Cure bike ride attracted Scott Dube, who has Type 2 diabetes. That form of the disease accounts for 90 percent to 95 percent of U.S. cases and can often be prevented through lifestyle changes.
Dube, 51, of Kennesaw was diagnosed in 1986 during a routine physical. He has ridden in the diabetes association’s annual bike rides for many years.
Having diabetes has caused him to make better life choices, Dube said. “It’s a struggle. But it’s not a thing that should just keep you down and out.”
Dube made the necessary lifestyle changes, including cycling and other exercise and changes to his diet.
“It starts with your mind-set,” he said. “You have to be positive and aggressive to fight it.”
For more information on diabetes, go to www.yourdiabetesinfo.org. The site is part of a partnership of the CDC and the National Institutes of Health that provides resources to improve the treatment and outcomes of people with the disease, promotes early diagnosis, and works to prevent or delay the onset of Type 2 diabetes.

Thursday, 27 January 2011

The Diabetes Epidemic

The CDC just released its latest estimate of diabetes prevalence in the US (1):
Diabetes affects 8.3 percent of Americans of all ages, and 11.3 percent of adults aged 20 and older, according to the National Diabetes Fact Sheet for 2011. About 27 percent of those with diabetes—7 million Americans—do not know they have the disease. Prediabetes affects 35 percent of adults aged 20 and older.
Wow-- this is a massive problem. The prevalence of diabetes has been increasing over time, due to more people developing the disorder, improvements in diabetes care leading to longer survival time, and changes in the way diabetes is diagnosed. Here's a graph I put together based on CDC data, showing the trend of diabetes prevalence (percent) from 1980 to 2008 in different age categories (2):


These data are self-reported, and do not correct for differences in diagnosis methods, so they should be viewed with caution-- but they still serve to illustrate the trend. There was an increase in diabetes incidence that began in the early 1990s. More than 90 percent of cases are type 2 diabetes. Disturbingly, the trend does not show any signs of slowing.

The diabetes epidemic has followed on the heels of the obesity epidemic with 10-20 years of lag time. Excess body fat is the number one risk factor for diabetes*. As far as I can tell, type 2 diabetes is caused by insulin resistance, which probably develops when energy intake chronically exceeds energy needs (overnutrition); cells become insulin resistant as a defense mechanism against the damaging effects of too much glucose and too many fatty acids (3). In addition, type 2 diabetes requires a predisposition that prevents the pancreatic beta cells from keeping up with the greatly increased insulin needs of an insulin resistant person**. Both factors are required: not all insulin resistant people will develop diabetes, because some people's beta cells are able to compensate by hypersecreting insulin.

Why does energy intake exceed energy needs in modern America and in most affluent countries? Why has the typical person's calorie intake increased by 250 calories per day since 1970 (4)? I believe it's because the fat mass "setpoint" has been increased, typically but not always by industrial food. I've been developing some new thoughts on this lately, and potentially new solutions, which I'll reveal when they're ready.


* In other words, it's the best predictor of future diabetes risk.

** Most of the common gene variants (of known function) linked with type 2 diabetes are thought to impact beta cell function (5).

Two Wheat Challenge Ideas from Commenters

Some people have remarked that the blinded challenge method I posted is cumbersome.

Reader "Me" suggested:
You can buy wheat gluten in a grocery store. Why not simply have your friend add some wheat gluten to your normal protein shake.
Reader David suggested:
They sell empty gelatin capsules with carob content to opacify them. Why not fill a few capsules with whole wheat flour, and then a whole bunch with rice starch or other placebo. For two weeks take a set of, say, three capsules every day, with the set of wheat capsules in line to be taken on a random day selected by your friend. This would further reduce the chances that you would see through the blind, and it prevents the risk of not being able to choke the "smoothie" down. It would also keep it to wheat and nothing but wheat (except for the placebo starch).
The reason I chose the method in the last post is that it directly tests wheat in a form that a person would be likely to eat: bread. The limitation of the gluten shake method is that it would miss a sensitivity to components in wheat other than gluten. The limitation of the pill method is that raw flour is difficult to digest, so it would be difficult to extrapolate a sensitivity to cooked flour foods. You might be able to get around that by filling the pills with powdered bread crumbs. Those are two alternative ideas to consider if the one I posted seems too involved.

Monday, 24 January 2011

Blinded Wheat Challenge

Self-experimentation can be an effective way to improve one's health*. One of the problems with diet self-experimentation is that it's difficult to know which changes are the direct result of eating a food, and which are the result of preconceived ideas about a food. For example, are you more likely to notice the fact that you're grumpy after drinking milk if you think milk makes people grumpy? Maybe you're grumpy every other day regardless of diet? Placebo effects and conscious/unconscious bias can lead us to erroneous conclusions.

The beauty of the scientific method is that it offers us effective tools to minimize this kind of bias. This is probably its main advantage over more subjective forms of inquiry**. One of the most effective tools in the scientific method's toolbox is a control. This is a measurement that's used to establish a baseline for comparison with the intervention, which is what you're interested in. Without a control measurement, the intervention measurement is typically meaningless. For example, if we give 100 people a pill that's supposed to cure belly button lint, we have to give a different group placebo (sugar) pills. Only the comparison between drug and placebo groups can tell us if the drug worked, because maybe the changing seasons, regular doctor's visits, or having your belly button examined once a week affects the likelihood of lint.
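
To make that concrete, here is a toy simulation in Python with entirely made-up numbers (the cure rates and group size are invented for illustration, not from any study). It shows how a pill can look effective on its own while only the drug-minus-placebo comparison estimates its true effect:

import random

# Toy simulation (invented numbers): some belly button lint resolves on
# its own, so the drug group's raw cure rate overstates the drug's effect.
random.seed(0)

N = 100                 # people per group
SPONTANEOUS = 0.30      # assumed chance lint clears up on its own
DRUG_EFFECT = 0.20      # assumed extra chance of a cure added by the drug

placebo_cured = sum(random.random() < SPONTANEOUS for _ in range(N))
drug_cured = sum(random.random() < SPONTANEOUS + DRUG_EFFECT for _ in range(N))

print(f"Placebo group cured: {placebo_cured}/{N}")
print(f"Drug group cured:    {drug_cured}/{N}")
print(f"Estimated drug effect: {(drug_cured - placebo_cured) / N:.0%}")

With these invented numbers, roughly half the drug group gets better-- but nearly a third of the placebo group does too, and only the difference between the groups tells you what the pill actually did.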

Another tool is called blinding. This is where the patient, and often the doctor and investigators, don't know which pills are placebo and which are drug. This minimizes bias on the part of the patient, and sometimes the doctor and investigators. If the patient knew he were receiving drug rather than placebo, that could influence the outcome. Likewise, investigators who aren't blinded while they're collecting data can unconsciously (or consciously) influence it.

Back to diet. I want to know if I react to wheat. I've been gluten-free for about a month. But if I eat a slice of bread, how can I be sure I'm not experiencing symptoms because I think I should? How about blinding and a non-gluten control?

Procedure for a Blinded Wheat Challenge

1. Find a friend who can help you.

2. Buy a loaf of wheat bread and a loaf of gluten-free bread.

3. Have your friend choose one of the loaves without telling you which he/she chose.

4. Have your friend take 1-3 slices and blend them with water in a blender until smooth. This is to eliminate differences in consistency that could allow you to determine what you're eating. Don't watch your friend do this-- you might recognize the loaf.

5. Pinch your nose and drink the "bread smoothie" (yum!). This is so that you can't identify the bread by taste. Rinse your mouth with water before releasing your nose. Record how you feel in the next few hours and days.

6. Wait a week. This is called a "washout period"; it gives any lingering effects of the first challenge time to fade before the second one begins. Repeat the experiment with the second loaf, attempting to keep everything else about the experiment as similar as possible.

7. Compare how you felt each time. Have your friend "unblind" you by telling you which bread you ate on each day. If you experienced symptoms during the wheat challenge but not the control challenge, you may be sensitive to wheat.

If you want to take this to the next level of scientific rigor, repeat the procedure several times to see if the result is consistent. The larger the effect, the fewer times you need to repeat it to be confident in the result.
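
If your friend wants help with the bookkeeping, here's a small Python sketch they could adapt (my own illustration, not part of the procedure above; the round count and labels are arbitrary). It randomizes the order of the two challenges within each round and keeps the unblinding key on the friend's side until the end:

import random

# Hypothetical helper for the friend running the challenge: randomize
# wheat vs. gluten-free order within each round, and keep the key
# private until all symptom notes are recorded.
ROUNDS = 3  # arbitrary; more rounds give more confidence

random.seed()  # seed from the OS so the subject can't reproduce the draw

schedule = []
for rnd in range(1, ROUNDS + 1):
    pair = ["wheat", "gluten-free"]
    random.shuffle(pair)  # random order within the round
    schedule.append((rnd, pair))

print("UNBLINDING KEY (friend's eyes only until the experiment ends):")
for rnd, pair in schedule:
    print(f"  Round {rnd}: challenge 1 = {pair[0]}, challenge 2 = {pair[1]}")

As a rough guide to interpreting repeats: if your symptoms track the wheat challenge in all three rounds, the chance of that happening by luck alone is (1/2)^3, or about 13 percent, which is why larger effects need fewer repetitions to be convincing.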


* Although it can also be disastrous. People who get into the most trouble are "extreme thinkers" who have a tendency to take an idea too far, e.g., avoid all animal foods, avoid all carbohydrate, avoid all fat, run two marathons a week, etc.

** More subjective forms of inquiry have their own advantages.

Thursday, 20 January 2011

Eating Wheat Gluten Causes Symptoms in Some People Who Don't Have Celiac Disease

Irritable bowel syndrome (IBS) is a condition characterized by the frequent occurrence of abdominal pain, diarrhea, constipation, bloating and/or gas. If that sounds like an extremely broad description, that's because it is. The word "syndrome" is medicalese for "we don't know what causes it." IBS seems to be a catch-all for various persistent digestive problems that aren't defined as separate disorders, and it has a very high prevalence: as high as 14 percent of people in the US, although the estimates depend on what diagnostic criteria are used (1). It can be brought on or exacerbated by several different types of stressors, including emotional stress and infection.

Maelán Fontes Villalba at Lund University recently forwarded me an interesting new paper in the American Journal of Gastroenterology (2). Dr. Jessica R. Biesiekierski and colleagues recruited 34 IBS patients who did not have celiac disease, but who felt they had benefited from going gluten-free in their daily lives*. All patients continued on their pre-study gluten-free diet; in addition, each participant was provided with two slices of gluten-free bread and one gluten-free muffin per day. The investigators added isolated wheat gluten to the bread and muffins of half the study group.

During the six weeks of the intervention, patients receiving the gluten-free food fared considerably better on nearly every symptom of IBS measured. The most striking difference was in tiredness-- the gluten-free group was much less tired on average than the gluten group. Interestingly, they found that a negative reaction to gluten was not necessarily accompanied by the presence of anti-gluten antibodies in the blood, which is a test often used to diagnose gluten sensitivity.

Here's what I take away from this study:
  1. Wheat gluten can cause symptoms in susceptible people who do not have celiac disease.
  2. A lack of circulating antibodies against gluten does not necessarily indicate a lack of gluten sensitivity.
  3. People with mysterious digestive problems may want to try avoiding gluten for a while to see if it improves their symptoms**.
  4. People with mysterious fatigue may want to try avoiding gluten.
A previous study, published in 1981, showed that feeding volunteers a large dose of gluten every day for six weeks caused adverse gastrointestinal effects, including inflammatory changes, in relatives of people with celiac disease who did not themselves have celiac (3). Together, these two studies are the most solid evidence that gluten can be damaging in people without celiac disease, a topic that has not received much interest in the biomedical research community.

I don't expect everyone to benefit from avoiding gluten. But for those who are really sensitive, it can make a huge difference. Digestive, autoimmune and neurological disorders associate most strongly with gluten sensitivity. Avoiding gluten can be a fruitful thing to try in cases of mysterious chronic illness. We're two-thirds of the way through Gluten-Free January. I've been fastidiously avoiding gluten, as annoying as it's been at times***. Has anyone noticed a change in their health?


* 56% of volunteers carried HLA-DQ2 or DQ8 alleles, which is slightly higher than the general population. Nearly all people with celiac disease carry one of these two alleles. 28% of volunteers were positive for anti-gliadin IgA, which is higher than the general population.

** Some people feel they are reacting to the fructans in wheat, rather than the gluten. If a modest amount of onion causes the same symptoms as eating wheat, then that may be true. If not, then it's probably the gluten.

*** I'm usually about 95% gluten-free anyway. But when I want a real beer, I want one brewed with barley. And when I want Thai food or sushi, I don't worry about a little bit of wheat in the soy sauce. If a friend makes me food with gluten in it, I'll eat it and enjoy it. This month I'm 100% gluten-free though, because I can't in good conscience encourage my blog readership to try it if I'm not doing it myself. At the end of the month, I'm going to do a blinded gluten challenge (with a gluten-free control challenge) to see once and for all if I react to it. Stay tuned for more on that.

Thursday, 13 January 2011

Does Dietary Saturated Fat Increase Blood Cholesterol? An Informal Review of Observational Studies

The diet-heart hypothesis states three things:
  1. Dietary saturated fat increases blood cholesterol
  2. Elevated blood cholesterol increases the risk of having a heart attack
  3. Therefore, dietary saturated fat increases the risk of having a heart attack
To evaluate the second contention, investigators have examined the relationship between blood cholesterol and heart attack risk. Many studies, including MRFIT, have shown that the two are related (1):

The relationship becomes much more complex when you consider lipoprotein subtypes, density and oxidation level, among other factors, but at the very least there is an association between habitual blood cholesterol level and heart attack risk. This is what you would want to see if your hypothesis states that high blood cholesterol causes heart attacks.

Now let's turn to the first contention, the hypothesis that dietary saturated fat increases serum cholesterol. This idea is so deeply ingrained in the scientific literature that many authors don't even bother providing references for it anymore. When references are provided, they nearly always point to the same type of study: short-term controlled diet trials, in which volunteers are fed different fats for 2-13 weeks and their blood cholesterol measured (2)*. These are the studies on which the diet-heart hypothesis was built.

But now we have a problem. Nearly every high-quality (prospective) observational study ever conducted found that saturated fat intake is not associated with heart attack risk (3). So if saturated fat increases blood cholesterol, and higher blood cholesterol is associated with an increased risk of having a heart attack, then why don't people who eat more saturated fat have more heart attacks?

I'll begin to answer that question with another question: why do researchers almost never cite observational studies to support the idea that dietary saturated fat increases blood cholesterol? Surely if the hypothesis is correct, then people who habitually eat a lot of saturated fat should have high cholesterol, right? One reason may be that in most instances, when researchers have looked for a relationship between saturated fat intake and blood cholesterol, they haven't found one. Those findings have essentially been ignored, but let's have a look...

The Studies

It's difficult to do a complete accounting of these studies, but I've done my best to round them up. I can't claim this post is comprehensive, but I doubt I missed very many, and I certainly didn't exclude any that I came across. If you know of any I missed, please add them to the comments.

The earliest and perhaps most interesting study I found was published in the British Medical Journal in 1963 and is titled "Diet and Plasma Cholesterol in 99 Bank Men" (4). Investigators asked volunteers to weigh all food consumed at home for 1-2 weeks, and describe in detail all food consumed away from home. Compliance was good. This dietary accounting method was much more thorough than in most observational studies today**. Animal fat intake ranged from 55 to 173 grams per day, and blood cholesterol ranged from 154 to 324 mg/dL, yet there was no relationship whatsoever between the two. I'm looking at a graph of animal fat intake vs. blood cholesterol as I write this, and it looks like someone shot it with a shotgun at 50 yards. They twisted the data every which way, but were never able to squeeze even a hint of an association out of it:
Making the most out of the data in other ways- for example, by analysis of the men very stable in their diets, or in whom weighing of food intake was maximal, or where blood was taken close to the diet [measurement]- did not increase the correlation. Because the correlation coefficient is almost as often negative as positive, moreover, what is being discussed mostly is the absence of association, not merely association that is unexpectedly small.
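
To get a feel for what "no relationship whatsoever" looks like numerically, here is an illustrative Python sketch. The data are synthetic-- generated independently, only mimicking the ranges reported above-- not the actual Bank Men measurements:

import math
import random

# Synthetic data only: intakes and cholesterol values are drawn
# independently within the ranges reported for the 99 bank men
# (animal fat 55-173 g/day, cholesterol 154-324 mg/dL), so any
# correlation between them is pure chance.
random.seed(42)

n = 99
fat = [random.uniform(55, 173) for _ in range(n)]    # g/day
chol = [random.uniform(154, 324) for _ in range(n)]  # mg/dL

def pearson_r(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(f"Pearson r = {pearson_r(fat, chol):+.3f}")  # hovers near zero

Run it with different seeds and the coefficient wobbles around zero, sometimes positive, sometimes negative-- exactly the behavior the Bank Men authors describe.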
The next study to discuss is the 1976 Tecumseh study (5). This was a large cardiovascular observational study conducted in Tecumseh, Michigan, which is often used as the basis for comparison for other cardiovascular studies in the literature. Using the 24-hour dietary recall method, including an analysis of saturated fat, the investigators found that:
Cholesterol and triglyceride levels were unrelated to quality, quantity, or proportions of fat, carbohydrate or protein consumed in the 24-hr recall period.
They also noted that the result was consistent with what had been reported in other previously published studies, including the Evans county study (6), the massive Israel Ischemic Heart Disease Study (7) and the Framingham study. One of the longest-running, most comprehensive and most highly cited observational studies, the Framingham study was organized by Harvard investigators and continues to this day. When investigators analyzed the relationship between saturated fat intake, serum cholesterol and heart attack risk, they were so disappointed that they never formally published the results. We know from multiple sources that they found no significant relationship between saturated fat intake and blood cholesterol or heart attack risk***.

The next study is the Bogalusa Heart Study, published in 1978, which studied the diet and health of 10 year old American children (8). This study found an association by one statistical method, and none by a second method****. They found that the dietary factors they analyzed explained no more than 4% of the variation in blood cholesterol. Overall, I think this study lends little or no support to the hypothesis.

Next is the Western Electric study, published in 1981 (9). This study found an association between saturated fat intake and blood cholesterol in middle-aged men in Chicago. However, the correlation was small, and there was no association between saturated fat intake and heart attack deaths. They cited two other studies that found an association between dietary saturated fat and blood cholesterol (and did not cite any of the numerous studies that found no association). One was a very small study conducted in young men doing research in Antarctica, which did not measure saturated fat but found an association between total fat intake and blood cholesterol (10). The other studied Japanese (Nagasaki and Hiroshima) and Japanese Americans in Japan, Hawai'i and California respectively (11).

This study requires some discussion. Published in 1973, it found a correlation between saturated fat intake and blood cholesterol in Japan and Hawai'i, but not in California. The strongest association was in Japan, where going from 5 to 75 g/day of saturated fat (a 15-fold change!) was associated with an increase in blood cholesterol from about 175 to 200 mg/dL. However, I don't think this study offers much support to the hypothesis upon closer examination. Food intake in Japan was collected by 24-hour recall in 1965-1967, when the diet was mostly white rice in some areas. The lower limit of saturated fat intake in Japan was 5 g/day, 1/12th what was typically eaten in Hawai'i and California, and the Japanese average was 16 g, with most people falling below 10 g. That is an extraordinarily low saturated fat intake. I think a significant portion of the Japanese in this study, living in the war-ravaged cities of Nagasaki and Hiroshima, were over-reliant on white rice and perhaps bordering on malnourishment.

In Japanese-Americans living in Hawai'i, over a range of saturated fat intakes between 5 and 110 g/day, cholesterol went from 210 to 220 mg/dL. That was statistically significant but it's not exactly knocking my socks off, considering it's a 22-fold change in saturated fat intake. In California, going from 15 to 110 g/day of saturated fat (7.3-fold change) was not associated with a change in blood cholesterol. Blood cholesterol was 20-30 mg/dL lower in Japan than in Hawai'i or California at any given level of saturated fat intake (e.g., Japanese eating 30g per day vs. Hawai'ians eating 30g per day). I think it's probable that saturated fat is not the relevant factor here, or at least it's being trumped by other factors. An equally plausible explanation is that people in the very low range of saturated fat intake are the rural poor who eat an impoverished diet that differs in many ways from the diets at the upper end of the range.

The most recent study was the Health Professionals Follow-up Study, published in 1996 (12). This was a massive, well-funded study that found no hint of a relationship between saturated fat intake and blood cholesterol.

Conclusion

Of all the studies I came across, only the Western Electric study found a clear association between habitual saturated fat intake and blood cholesterol, and even that association was weak. The Bogalusa Heart study and the Japanese study provided inconsistent evidence for a weak association. The other studies I cited, including the bank workers' study, the Tecumseh study, the Evans county study, the Israel Ischemic Heart study, the Framingham study and the Health Professionals Follow-up Study, found no association between the two factors.

Overall, the literature does not offer much support for the idea that long term saturated fat intake has a significant effect on the concentration of blood cholesterol. If it's a factor at all, it must be rather weak, which is consistent with what has been observed in multiple non-human species (13). I think it's likely that the diet-heart hypothesis rests in part on an over-interpretation of short-term controlled feeding studies. I'd like to see a more open discussion of this in the scientific literature. In any case, these controlled studies have typically shown that saturated fat increases both LDL and HDL, so even if saturated fat did have a small long-term effect on blood cholesterol, as hinted at by some of the observational studies, its effect on heart attack risk would still be difficult to predict.

The Diet-heart Hypothesis: Stuck at the Starting Gate
Animal Models of Atherosclerosis: LDL


* As a side note, many of these studies were of poor quality, and were designed in ways that artificially inflated the effects of saturated fat on blood lipids. For example, using a run-in period high in linoleic acid, or comparing a saturated fat-rich diet to a linoleic acid-rich diet, and attributing the differences in blood cholesterol to the saturated fat. Some of them used hydrogenated seed oils as the saturated fat. Although not always consistent, I do think that overall these studies support the idea that saturated fat does have a modest ability to increase blood cholesterol in the short term.

** Although I would love to hear comments from anyone who has done controlled diet trials. I'm sure this method had flaws, as it was applied in the 1960s.

*** Reference cited in the Tecumseh paper: Kannel, W et al. The Framingham Study. An epidemiological Investigation of Cardiovascular Diseases. Section 24: The Framingham Diet Study: Diet and the Regulation of Serum Cholesterol. US Government Printing Office, 1970.

**** Table 5 shows that the Pearson correlation coefficient for saturated fat intake vs. blood cholesterol is not significant; table 6 shows that children in the two highest tertiles of blood cholesterol have a significantly higher intake of saturated fat, unsaturated fat, total fat and sodium than the lowest tertile. The relationship between saturated fat and blood cholesterol shows no evidence of dose-dependence (cholesterol tertiles = 15.6 g, 18.4 g, 18.5 g saturated fat). The investigators made no effort to adjust for confounding variables.
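
To see how those two analyses can tell different stories, here is a Python sketch with made-up numbers (the intakes, slope and noise are invented, not Bogalusa data). It computes a Pearson correlation and a cholesterol-tertile comparison on the same synthetic data set, where the underlying relationship is deliberately weak:

import math
import random

# Made-up numbers: a weak, noisy link between saturated fat intake and
# blood cholesterol, analyzed two ways, as in the Bogalusa paper.
random.seed(7)

n = 120
sat_fat = [random.gauss(17, 4) for _ in range(n)]              # g/day
chol = [160 + 1.0 * f + random.gauss(0, 25) for f in sat_fat]  # mg/dL

def pearson_r(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Sort subjects by cholesterol, split into tertiles, and compare mean
# saturated fat intake across them, mirroring the paper's table 6.
order = sorted(range(n), key=lambda i: chol[i])
third = n // 3
tertiles = [order[:third], order[third:2 * third], order[2 * third:]]
means = [sum(sat_fat[i] for i in t) / len(t) for t in tertiles]

print(f"Pearson r (continuous): {pearson_r(sat_fat, chol):+.3f}")
print("Mean saturated fat by cholesterol tertile (g/day):",
      ", ".join(f"{m:.1f}" for m in means))

With a weak, noisy relationship, the continuous correlation can fall short of significance even while the group means drift apart, which is one way the two tables in a paper like this can appear to disagree.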