…ask Harper et al. in their study of the effects of calorie restriction (CR) on wild-derived mice, the grandoffspring of animals caught in the field (as opposed to the standard strains commonly used in laboratory research).

The study (from the lab of veteran mouse wrangler Steven Austad) comes to a startling (and, to CR aficionados and practitioners, disturbing) conclusion: No.

To investigate whether mice genetically unaltered by many generations of laboratory selection exhibit similar hormonal and demographic responses to caloric restriction (CR) as laboratory rodents, we performed CR on cohorts of genetically heterogeneous male mice which were grandoffspring of wild-caught ancestors. Although hormonal changes, specifically an increase in corticosterone and decrease in testosterone, mimicked those seen in laboratory-adapted rodents, we found no difference in mean longevity between ad libitum (AL) and CR dietary groups, although a maximum likelihood fitted Gompertz mortality model indicated a significantly shallower slope and higher intercept for the CR group. This result was due to higher mortality in CR animals early in life, but lower mortality late in life. A subset of animals may have exhibited the standard demographic response to CR in that the longest-lived 8.1% of our animals were all from the CR group. Despite the lack of a robust mean longevity difference between groups, we did note a strong anticancer effect of CR as seen in laboratory rodents. Three plausible interpretations of our results are the following: (1) animals not selected under laboratory conditions do not show the typical CR effect; (2) because wild-derived animals eat less when fed AL, our restriction regime was too severe to see the CR effect; or (3) there is genetic variation for the CR effect in wild populations; variants that respond to CR with extended life are inadvertently selected for under conditions of laboratory domestication.
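
For those of us who don’t think in Gompertz parameters: the model the authors fit treats the mortality rate as μ(t) = A·exp(B·t), so a higher intercept (A) combined with a shallower slope (B) means the CR hazard starts out above the AL hazard but climbs more slowly and eventually crosses below it, which is exactly the “higher mortality early, lower mortality late” pattern the abstract describes. Here’s a minimal sketch of that crossover, with made-up parameters chosen only to show the qualitative behavior (they are not the paper’s fitted values):

```python
import math

# Gompertz hazard: mu(t) = A * exp(B * t), where A is the intercept
# (baseline mortality rate) and B is the slope (rate of exponential increase).
def gompertz_hazard(t, a, b):
    return a * math.exp(b * t)

# Hypothetical parameters, invented to mimic the qualitative pattern reported
# in the abstract (CR: higher intercept, shallower slope). Not fitted values.
A_AL, B_AL = 0.0005, 0.25   # ad libitum group
A_CR, B_CR = 0.0015, 0.18   # calorie-restricted group

# The hazards cross where A_CR * exp(B_CR * t) == A_AL * exp(B_AL * t):
t_cross = math.log(A_CR / A_AL) / (B_AL - B_CR)
print(f"hazards cross at t = {t_cross:.1f} time units")

for t in (5.0, t_cross, 30.0):
    print(f"t = {t:5.1f}   AL hazard = {gompertz_hazard(t, A_AL, B_AL):.4f}   "
          f"CR hazard = {gompertz_hazard(t, A_CR, B_CR):.4f}")
```

A crossover like that is consistent with both the absence of a mean-longevity difference and the fact that the longest-lived 8.1% of animals were all in the CR group.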

Why might laboratory domestication select for animals that show a CR effect?

First of all: The best lab mouse is a fertile and fecund mouse, one that reproduces early and often. These animals might have an ad libitum (AL) intake far in excess of that of their wild cousins, consistent with the observation that lab mice eat more than wild ones when confronted with an all-you-can-eat kibble buffet.

The beneficial effects of CR are generally a biphasic function of total caloric intake: there’s a sweet spot of maximum benefit at, say, 60% of AL intake, but push consumption much below that and the mice end up less healthy than the AL cohort, because they’re frankly starving.

One upshot of the selection of lab mice for efficient conversion of kibble into more mice (and their concomitant bigger appetites) is that wild mice on AL diets might already be closer to the CR “peak” than lab mice, so that cutting their intake to 60% of their own AL actually puts them below the peak, on the starvation side of the dose-response curve. If this is the case, then wild mice might well benefit from CR, just at a higher percentage of AL calories.
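
To make that concrete, here’s a toy version of the argument. Every number below is invented for illustration (the peak location, the intakes); the point is just that the same “60% of your own AL” prescription can land a big-eating lab mouse near the peak of the benefit curve while pushing an already-lean wild-derived mouse onto its starvation side:

```python
# Toy inverted-U "benefit" curve over absolute daily intake. All numbers are
# hypothetical, chosen purely to illustrate the argument; none comes from the study.
PEAK_INTAKE = 10.0  # hypothetical intake (kcal/day) at which CR benefit peaks

def benefit(intake):
    # Benefit relative to the peak; falls off on both sides
    # (starvation on the left, overfeeding on the right).
    return -(intake - PEAK_INTAKE) ** 2

LAB_AL = 17.0   # hypothetical lab-mouse AL intake (selected-for big appetite)
WILD_AL = 12.0  # hypothetical wild-derived AL intake, already nearer the peak

for label, al in (("lab", LAB_AL), ("wild-derived", WILD_AL)):
    cr = 0.6 * al  # the standard 60%-of-AL restriction
    print(f"{label:13s}  AL = {al:4.1f} (benefit {benefit(al):6.1f})   "
          f"60% CR = {cr:4.1f} (benefit {benefit(cr):6.1f})")
```

In this caricature, restriction improves the lab mice but leaves the wild-derived mice worse off than their AL counterparts, which is one way the early CR mortality seen in this study could arise.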

But it might also be, as the authors point out, that lab mice have been inadvertently selected for some trait that enhances the benefits of CR. Wild populations may be a mix of some mice that benefit from CR and some that don’t (e.g., the early-mortality subjects in this study).

This is the possibility that’s of particular concern for humans. Via the emerging field of pharmacogenomics, we’re becoming increasingly aware of the impact that genetic variation can have on the efficacy of therapeutics: Sometimes a drug will benefit only 20% of the trial population, but it later turns out that 100% of those 20% had the same haplotype at a particular locus, making the drug great for them and lousy for everyone else. This phenomenon isn’t limited to drugs; it extends (e.g.) to the benefits people reap from exercise and the risk of getting cancer after a lifetime of smoking.

What if calorie restriction is the same? I would argue that we’re more like wild mice than lab mice, and that there are already variants in the human population whose carriers will benefit differently from CR (just as with any other course of treatment). What do we do if it turns out that CR confers early mortality on one subset of the population and delayed aging on another (or on everyone who survives the increased early-mortality risk)?

We’ve based a lot of our optimism about CR in humans on data from rodents, and while the human and primate data is promising so far, it’s also limited in statistical power and temporal depth. Simple intellectual honesty mandates that these results give us pause and encourage rational reflection about all the possibilities that might result from being our own guinea pigs.