Calorie restriction—limiting the amount of energy consumed by an organism, while ensuring proper nutrition—is among the most reliable ways to extend healthspan in a wide range of organisms. A vast literature has documented the effects of CR (and related approaches such as intermittent fasting) in animals reared in the laboratory, and I have written extensively about them here. However, we still know comparatively little about the benefits of such regimens in human beings.
In some sense, this is unsurprising: humans, near and dear to our hearts though they might be, make lousy model organisms. Even within the same society, we eat differently from one another, exercise differently (or not at all), and live in wildly diverse settings, making it difficult to control for all relevant variables. To compound the problem, we already enjoy long lives—inconvenient indeed if one hopes to observe the effects of dietary changes on lifespan within a reasonable interval of time.
However, this has not stopped intrepid researchers from initiating (and following through on!) clinical trials of CR in humans. Over the past decade, the first two phases of the cleverly named CALERIE* study showed that fairly drastic, long-term CR is safe and well tolerated in human subjects. More recently, an ancillary study of the second phase of CALERIE performed a detailed examination of the changes in energy expenditure associated with CR, measuring multiple endpoints, including markers of oxidative stress and aging more generally.
The results, published earlier this month, provide further validation that CR is safe in non-obese human subjects. Although the data are too preliminary to tell us whether CR extends lifespan in humans, they are consistent with previous findings in model organisms, suggesting that (subject to heavy qualifications, and limited by the temporal scope of the study) CR may also work in people.
The measurements the authors made help us understand how it might work — and the results have ramifications for two prominent theories of aging.
The participants in the study lost weight (unsurprisingly), but their overall energy expenditure also fell, and by significantly more than would be predicted from the decrease in body mass alone. This is in line with the predictions of the “rate of living” hypothesis, which states (to massively simplify) that lifespan is inversely proportional to metabolic rate. In these human subjects, CR decreased energy expenditure per unit mass, an indicator of the metabolic rate — so, if it eventually turns out that CR extends lifespan in humans (which, again, we don’t yet know), it could be doing so by inducing a metabolic adaptation.
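To make the logic of “metabolic adaptation” concrete, here is a minimal sketch of the underlying arithmetic. The numbers and model coefficients below are entirely hypothetical (they are not CALERIE data); the point is only to show that adaptation is defined as the gap between *measured* energy expenditure and the expenditure *predicted* from body composition, so that a negative gap indicates slowing beyond what weight loss alone explains.

```python
# Illustrative only: hypothetical numbers, not CALERIE data.
# Metabolic adaptation = observed energy expenditure minus the
# expenditure predicted from body composition alone.

def predicted_ee(fat_free_mass_kg, fat_mass_kg,
                 slope_ffm=30.0, slope_fm=3.0, intercept=400.0):
    """Predict daily energy expenditure (kcal/day) from a toy
    linear model of body composition (coefficients are made up)."""
    return intercept + slope_ffm * fat_free_mass_kg + slope_fm * fat_mass_kg

# A hypothetical participant after sustained CR:
observed_ee = 1900.0  # measured expenditure, kcal/day
predicted = predicted_ee(fat_free_mass_kg=50.0, fat_mass_kg=15.0)

# Negative adaptation => expenditure fell more than mass loss predicts.
adaptation = observed_ee - predicted
print(f"predicted: {predicted:.0f} kcal/day, adaptation: {adaptation:+.0f}")
```

In studies like CALERIE, the prediction equation is fit to baseline data across participants; the sketch above just hard-codes one to show the subtraction.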
The CALERIE subjects also had lower levels of oxidative stress, as revealed by measurements of a panel of molecular markers. The authors argue that this observation supports the well-known “free radical theory of aging” (FRTA)†, which holds that aging is caused by endogenously generated oxygen radicals. We could interpret the data as showing that CR decreases reactive oxygen species (ROS) production (possibly, but not necessarily, by decreasing the overall metabolic rate). Thus, if CR also extends lifespan, it may achieve this end in part by lowering the oxidative stress burden.
However, an alternative interpretation is available. The FRTA is one of the oldest modern theories of aging, and has undergone numerous rounds of attack, defense, and revision. On this topic, my own thinking has been heavily influenced by a fantastic review by Siegfried Hekimi, which attempts to reconcile the large body of circumstantial evidence implying a causative link between oxidation and aging with a growing body of genetic and biochemical evidence that no such causal link exists. We can synthesize these two apparently contradictory positions by positing that under most circumstances, oxidative molecules such as ROS are not causes of aging, but instead act as second messengers that signal other kinds of stress, which in turn actually contribute to the aging process. From this standpoint, we would still expect to see a reduction in the levels of oxidative markers in a person who was aging more slowly, but we would not conclude that the decrease in ROS levels was causally responsible for slower aging per se.
How the effects of CR in humans influence our thinking about the causes of aging will ultimately depend on whether CR actually extends longevity. On that issue, the first primate past the post will be not H. sapiens but M. mulatta. We know that CR confers health benefits in macaques, but the final lifespan data are still pending. Given the close evolutionary relationship between great apes and Old World monkeys, we can be fairly confident that if CR extends lifespan in macaques, it will also do so in humans.
In the meantime, however, there are no red flags in the human data, and the results to date seem promising. So far, so good. Watch this space for future developments—in 20 years or so, we’ll have some lifespan data for you.
Redman et al. “Metabolic Slowing and Reduced Oxidative Damage with Sustained Caloric Restriction Support the Rate of Living and Oxidative Damage Theories of Aging.” Cell Metabolism 27(4):805–815 (2018). DOI: 10.1016/j.cmet.2018.02.019
* Comprehensive Assessment of the Long-term Effects of Reducing Intake of Energy. I’m inclined to forgive the misspelling, but would “Outcomes” have been so bad?
† Sometimes called the mitochondrial free radical theory of aging (MFRTA), which evokes fewer giggles from Scrabble fans than FRTA.