
Adv Geriatr Med Res. 2020;2(1):e200002. https://doi.org/10.20900/agmr20200002

Review

“Diet and Exercise Will Help You Live Longer”: The Meme that Turns on Housekeeping Genes

Kurt A. Escobar 1,* , Lauren M. Visconti 1, Alec W. Wallace 1, Trisha A. VanDusseldorp 2

1 Physiology of EXercise and Sport Lab, Department of Kinesiology, California State University, Long Beach, 1250 Bellflower Blvd, Long Beach, CA 90840, USA

2 Department of Exercise Science and Sport Management, Kennesaw State University, 520 Parliament Garden Way NW, Kennesaw, GA 30144, USA

* Correspondence: Kurt A. Escobar, Tel.: +1-562-985-7983.

Received: 31 August 2019; Accepted: 09 December 2019; Published: 17 December 2019

This article belongs to the Virtual Special Issue "Aging and Metabolism"

ABSTRACT

“Diet and exercise will help you live longer” is a well-known meme. While often taken for granted, its foundations span back to our evolutionary environment and its effects extend into our intracellular environment. Humans evolved under conditions of high physical activity and periodic privation, which shaped our genes. During these times of energetic challenge, an evolutionarily conserved recycling system, autophagy, would have been activated to provision energy through the degradation of intracellular proteins, organelles, and lipids. With physical activity no longer a requisite for survival and caloric abundance rather than caloric shortage defining the modern human environment, the signals for autophagy are no longer obligatory. Moreover, humans have evolved an avoidance of physical activity and caloric restriction (CR). This leads to an accumulation of intracellular components, causing degeneration and disruption of cellular homeostasis. This deleterious accrual of cellular materials also occurs during aging, driven in part by an age-related decline in autophagy. What’s more, humans live in a period of history in which advances in sanitation and medicine have allowed us to live to unprecedented ages, resulting in long-lived humans with progressive system-wide degeneration. Exercise and CR practices promote age-related health and longevity through their activation of autophagic housekeeping, but evolutionary inertia pushes us to avoid them. However, humans are unique in that we can harness our own genes as well as propagate our own memes. In order to yield the benefits of cellular housekeeping through exercise and CR practices, we should understand our genes and become memesters.

KEYWORDS: autophagy; longevity; exercise; physical activity; caloric restriction; memes; evolution; intermittent fasting; proteostasis

INTRODUCTION

Evolutionary history has been one of frequent energetic challenge. Autophagy is an energy-sensitive recycling system that life has employed for billions of years, which degrades cellular contents in response to metabolic challenges and/or nutrient restriction [1]. Humans, who evolved under high and frequent energy demands and periodic privation [2], would have used autophagy as a means to provision energy through the degradation of cellular proteins, organelles, and lipids during times of calorie shortage and to support the physical activity required for survival. With the removal of physical activity as a survival requirement, and with food excess more characteristic than food shortage in developed civilizations, the signals for autophagy are likewise removed, leading to the accumulation of intracellular components [3]. Coupled with the age-related decline in autophagy, proteostasis is ill-maintained, dysfunctional mitochondria are not removed, and intracellular lipids progressively accumulate during the aging process, precipitating degeneration and disease [4]. Maintenance of the intracellular environment through autophagy is associated with age-related health and longevity [5,6]. Increasing data show that exercise and caloric restriction (CR) increase autophagy activity in humans, which mediates their known effects on aging and longevity [3,7,8]. The human body responds positively to these stressors because they mimic signals from our evolutionary environment, the environment to which our genes are adapted. An understanding of evolutionary history spanning from early eukaryotic life to pre-modern humans serves to explain the therapeutic effects of exercise and CR on human function and aging, as well as our predisposition to avoid these behaviors and our attitudes toward them in culture at large.

Just as our genes influence our biology, our genes influence cultural phenomena [9–11]. “Diet and exercise will help you live longer” is a culturally understood phrase and idea, a meme. Coined by Richard Dawkins in 1976 in The Selfish Gene, a meme is a unit of cultural information such as a behavior, norm, or belief that may be propagated within human societies, similar to the propagation of genes in a species [12]. While the concept is not undisputed and its exact relation to genetics has been hotly contested [13–16], memetics provides an explanation for cultural and behavioral phenomena in a population. However, the need for physical activity and energy restriction is written in human genes, and their practice activates longevity-promoting processes, making “Diet and exercise will help you live longer” a meme that is, indeed, informed by human genes [2,17,18]. At the same time, physical activity avoidance and the instinct toward caloric overconsumption are also in human genes, as they would have promoted energy conservation in the pre-modern environment [2,17,18]. Consequently, the predisposition to a sedentary lifestyle and food overconsumption is part of the behavioral software human genes have handed us (encapsulated in another meme: “Diet and exercise? I’ll start tomorrow”). This is occurring as developments in sanitation and medicine allow humans to live to unprecedented ages compared to our ancestors [19], resulting in long-lived humans with progressive physical, metabolic, and cognitive degeneration [18,20].

Exercise and energy restriction practices such as CR and intermittent fasting (IF) attenuate degeneration and promote longevity [3,21,22] by mimicking the environmental stresses that shaped our genes and by activating autophagy [2,17]. While often taken for granted, the foundation of “Diet and exercise will help you live longer” spans back to our evolutionary environment, and its effects extend into our intracellular environment. Propagation of this meme may be useful in combating the evolutionary inertia of physical inactivity and calorie overconsumption to promote healthy aging of the population. This review will discuss the relationship between exercise, CR, and human genes, their effect on aging and longevity, and how this meme carries with it wisdom billions of years old.

THE BLESSING AND BURDEN OF EVOLUTION

Evolution by natural selection does not adapt a species to its current environment, but to its previous environment; to that of its predecessors [23]. Historically, the rate of a species’ environmental change has been modest enough that its subsequent adaptation renders the organism suitable for its current environment [24]. However, humans are unique in that we have developed the capability of drastically altering our physical and social environment [24,25], namely by mastering many of the natural stressors that faced our ancestors throughout our evolutionary history [18,26]. Among these are the necessity of physical activity for survival as well as food insecurity. Therefore, modern humans are not adapted for our current environment of food abundance and physical inactivity [2,18], such as that found in developed countries. As a consequence, chronic degenerative diseases as well as obesity afflict a significant portion of the population, stemming from the mismatch between our genes and our environment [27]. This is occurring during a period of unprecedented human lifespan due to modern sanitation and medicine [19], leading to long-lived humans with progressive physiological degeneration as a result of the aging process in addition to a sedentary lifestyle and overnutrition. At the cellular level, this is characterized by the accumulation of damaged and dysfunctional proteins, organelles, and lipids [8,28].

Humans are adapted for physical activity, specifically endurance-based activity [2]. Human skeletal structure, muscle fiber phenotype, thermoregulation, and metabolism are adapted for long-distance trekking, a key feature of human history in contrast to other, non-human primates [29]. The evolution of bipedalism in early hominins and the eventual hunting and gathering of early Homo approximately two million years ago required long-distance trekking [29]. Endurance running was also likely a key feature of hunter-gatherers, as persistence hunting was used as a means to acquire meat [29]. Thus, physical activity, particularly of an endurance nature, was obligatory for human survival, and endurance-favored genotypes were selected for [2,30]. A lack of endurance-based activity leads to a global degeneration of the body including the metabolic, musculoskeletal, cardiovascular, cognitive, and reproductive systems [31].

While humans have been selected for physical activity and energy restriction, we have also been selected to avoid them [2]. Amidst food insecurity and high physical demands, humans struggled to stay in energy balance and evolved to rest whenever possible in order to conserve energy [2]. Caloric expenditure that was not related to survival or reproduction was maladaptive, while avoiding voluntary physical activity was adaptive [2]. Similarly, in order to attempt to stay in energy balance, humans evolved food-seeking and (over)consumption behaviors; humans are foragers, adapted to capitalize on caloric availability [17,30]. Moreover, the ability to increase adiposity was an advantage, allowing for the storage of calories [18,30,32]. Additionally, not only was non-obligatory physical activity a waste of energy, but the maintenance of high-capacity bodily systems (i.e., musculoskeletal, cardiorespiratory) is energetically expensive, and it was maladaptive to support systems whose capacities exceeded what was necessary [2]. As such, the human body responds to physical activity in a dose-dependent manner by adjusting its capacity exactly to the imposed demands [33,34]. Accordingly, once the demand is removed, the increased capacity is lost and the body reverts to baseline status [2,35]. This explains the “use it or lose it” principle of exercise and the need for “lifelong exercise”, albeit difficult to sustain.

These evolutionary adaptations help explain the high rates of physical inactivity, especially amongst the older population. Only approximately half of US adults meet the physical activity guideline of 150 min of moderate-intensity or 75 min of vigorous-intensity aerobic activity per week [36], and only a third of US adults aged 65 years or older meet the guidelines [37]. In the United Kingdom, approximately a third of older adults aged 65–74 years are inactive, while 52% of adults aged over 75 years are inactive [38].
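For illustration only, the short sketch below operationalizes that weekly guideline as a simple check; the 2-to-1 moderate-to-vigorous equivalence used here is an assumption of the example, not a figure taken from the cited surveys.

def meets_aerobic_guideline(moderate_min: float, vigorous_min: float) -> bool:
    # Treat 1 vigorous minute as equivalent to 2 moderate minutes (assumed equivalence)
    moderate_equivalent_min = moderate_min + 2 * vigorous_min
    return moderate_equivalent_min >= 150

# Example week: 90 min of brisk walking plus 20 min of running
print(meets_aerobic_guideline(90, 20))  # 90 + 40 = 130 moderate-equivalent min -> False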

These characteristics of human physiology and neurobiology are worth grappling with when attempting to affect health through physical activity and dietary interventions across the human lifespan. While problematic in the current environment, in previous environments the biological and behavioral mechanisms that ensured energy overconsumption, minimized expenditure, and maximized storage were advantageous [2,18,30,32]. The predisposition toward physical activity avoidance and caloric overconsumption is a feature of the human species, not simply a bug of individual character. Human biology is informed by the past, looks far beyond the several decades of a single human life to the interests of the species, and dictates behavior accordingly; from a gene’s perspective, exercise and CR are maladaptive [2]. Per Richard Dawkins, we are indeed held hostage by the selfish gene [12]. However, as evolved conscious animals, humans are capable of decision making at the individual level and can influence cultural attitudes and norms [10]; it may be possible to overcome this evolutionary inertia.

AUTOPHAGY AS A MEANS TO PROVISION ENERGY

Autophagy is an energy-sensitive intracellular recycling system that has been highly conserved throughout evolution and is present in all known eukaryotic cells from yeast to humans [8,39]. During energetic stress or nutrient depletion, cellular materials are engulfed by double-membraned vesicles called autophagosomes, which then travel to and fuse with the lysosome, where the materials are degraded [1]. The constituent materials are then released back into the cytosol, where they can be used for metabolism [40]. Given the energy insecurity that organisms would face throughout evolutionary history, autophagy serves as a means to provision energy in times of starvation by allowing the cell to digest its own contents for survival. “Autophagy”, from the Greek for “self-eating”, can be divided into three primary pathways depending on the cargo and the form of transport to the lysosome: chaperone-mediated autophagy (CMA), microautophagy, and macroautophagy [1]. Currently, macroautophagy is the best understood (as well as the primary type studied in humans within the context of nutrition and exercise) and will serve as the focus of this review, being referred to as “autophagy” hereafter. Autophagy operates at constitutive levels and maintains the proteome and organelle population by sequestering and degrading damaged and dysfunctional proteins, organelles, and lipids [41,42]. Maintenance of proteome and organelle quality is key to cellular homeostasis, as the accumulation of deleterious and long-lived proteins and mitochondria leads to progressive degeneration and cell death [4]. Accumulation of intracellular lipids also disrupts cellular homeostasis and metabolism [43,44].

Caloric restriction, nutrient depletion (i.e., serum or amino acids in cell culture), and increased energetic demand (i.e., acute exercise) activate autophagy [45,46], while energy and nutrient surplus inhibit autophagy through post-translational modification and transcriptional programs [39,47] (Figure 1). Increases in adenosine monophosphate (AMP), nicotinamide adenine dinucleotide (NAD+), and calcium (Ca++) initiate autophagy [46]; however, depletion of cytosolic acetyl-coenzyme A (AcCoA) appears to be a primary regulator of autophagic induction through protein deacetylation [48,49]. In fact, AcCoA has been shown to dictate autophagy activity irrespective of nutrient status [48]. Maintenance of AcCoA levels abrogates starvation-induced autophagy, while depletion of AcCoA under non-starvation conditions initiates autophagy [48]. Though the mechanisms are unclear, AcCoA status appears to be associated with regulation of 5′-AMP-activated protein kinase (AMPK) and the target of rapamycin complex 1 (TORC1), positive and negative regulators of autophagy, respectively [48]. Along with the NAD-dependent deacetylase sirtuin 1 (SIRT1), AMPK upregulates autophagy in multiple ways, including through TORC1-dependent [50,51] and -independent pathways [46,52].

Figure 1. Exercise and caloric restriction activate autophagy through shared pathways. Solid blue lines represent positive regulation. Solid red lines represent negative regulation. The dotted blue line represents positive regulation, but unknown downstream mTORC1-mediated effects on autophagy in response to resistance training. NAD+: nicotinamide adenine dinucleotide; AMP: adenosine monophosphate; Ca++: calcium; SIRT1: NAD-dependent deacetylase sirtuin-1; AMPK: AMP-activated protein kinase; FOXOs: forkhead box transcription factors; PGC-1α: peroxisome proliferator-activated receptor gamma coactivator 1-alpha; mTORC1: mammalian target of rapamycin complex 1; TFEB: transcription factor EB.

Energetic stress elevates AMP and NAD+, activating AMPK and SIRT1, respectively, which can stimulate autophagy through their inhibition of TORC1 [46]. TORC1, which is upregulated by nutrient availability, particularly amino acids, prevents autophagosome formation through hyperphosphorylation of Atg1 (ULK1 in mammals) on Ser757 [52] and inhibits activation of the coordinated lysosomal expression and regulation (CLEAR) network by phosphorylating transcription factor EB (TFEB) on the lysosomal membrane, preventing its translocation to the nucleus [53]. The CLEAR network regulates the expression of autophagy-related genes (Atgs) [54,55]. AMPK inhibits TORC1 through activation of tuberous sclerosis complex 2 (TSC2), preventing TORC1 from binding to its activator Ras homologue enriched in brain (Rheb) on the membrane of the lysosome [51]. AMPK also inhibits Raptor, a primary regulatory protein of the TORC1 complex [56]. At the same time, during nutrient depletion and TORC1 inhibition, Ca++ is released from the lysosome into the cytoplasm, activating calcineurin, which promotes TFEB’s translocation to the nucleus to activate the CLEAR network [54]. Regulation of TORC1 by SIRT1 is less clear, but may involve TSC2 [50]. AMPK and SIRT1 also upregulate expression of Atgs through activation of the transcription factors FOXO1, FOXO3, and PGC-1α [46]. AMPK additionally initiates autophagosome formation directly through phosphorylation of ULK1 at Ser555 [52]. Expansion, recognition of autophagic cargo, and fusion of the autophagosome with the lysosome (forming the autolysosome) are mediated by microtubule-associated protein 1 light chain 3 (LC3) [1,57]. Deacetylation of LC3 by SIRT1 has been shown to be key in translocating it from the nucleus to the cytosol, allowing it to interact with Atg7 and Atg13 under nutrient-depleted conditions [49]. Formation of the autolysosome allows for the degradation of autophagic cargo by the lysosome, which releases the constituent materials back into the cytosol for cellular metabolism [40].

While the canonical, non-selective autophagic recycling of proteins and mitochondria is well known, macroautophagy has been discovered to operate selectively as well [58], including the targeting of intracellular lipid droplets, termed lipophagy [44]. Fatty acids are a primary substrate for energy metabolism, and while stored triglycerides had previously been thought to be catabolized mainly by hormone-sensitive lipase (HSL) and adipose triglyceride lipase (ATGL), a growing literature suggests lipophagy serves as a vital lipolytic system, particularly under nutrient stress [59]. Lipophagy is regulated through canonical nutrient-sensing pathways, including at the transcriptional level through TFEB [60], FOXOs [61,62], and PGC-1α, as well as through TORC1 signaling [63]. Lipophagy holds implications for metabolic disease states, as it regulates lipid accumulation, a hallmark of a number of age-related metabolic diseases and of degeneration [43,44].

AUTOPHAGY AS A TARGET FOR LIFESPAN AND LONGEVITY

Maintaining intracellular protein, organelle, and lipid quality is fundamental to cellular homeostasis and human function [4,64]. The accumulation of these materials leads to cellular dysfunction, cell death, and impaired tissue and organismal function [7,65]. In our evolutionary past, the intracellular environment would have been maintained by autophagy in response to physical activity and periodic energy deprivation. In addition to the removal of obligatory physical activity and energy insecurity leading to an accumulation of cytosolic materials, the accrual of deleterious materials is a hallmark of aging itself [4]. Thus, autophagy represents a primary effector of the aging process [8]. Decrements in autophagic function occur with advancing age; this has been observed in numerous organs and tissues in several model species, including humans [28,66,67]. The loss of autophagic clearance likely precipitates the progressive accrual of damaged proteins and organelles and contributes to the aging phenotype [5,6]. The mechanisms of the age-related impairment of autophagic function are not established. However, a reduction in Atgs at the mRNA and protein levels has been observed with advancing age [6,68,69], as has reduced SIRT1 expression [6,70]. Enhanced activity of TORC1 in aged cells has also been implicated in age-related autophagy downregulation [71,72].

It is well established that the inhibition of autophagy results in premature aging, while its enhancement postpones the aging phenotype and promotes lifespan in a number of model organisms [6,73,74]. Loss-of-function mutation and silencing of Atgs decrease the lifespan of Caenorhabditis elegans (C. elegans) [75] and Drosophila melanogaster (D. melanogaster) [71]. Further, knockout of Atgs in mice precipitates age-related dysfunction, including accumulation of defective organelles [76–78], protein aggregation [79–81], disorganized mitochondria [77,78], and endoplasmic reticulum stress [76]. Moreover, dysregulation of autophagy regulators (over-expression of mitochondrial aldehyde dehydrogenase (ALDH2) [82] and deficiency of macrophage migration inhibitory factor (MIF) [83]) has been demonstrated to be associated with advanced aging via impairments in cardiac tissue function (i.e., contractile function and myocardial remodeling). In contrast, augmented autophagy increases lifespan in simple model organisms [6,28,74] and promotes longevity in tissues of mice [67]. This was demonstrated by Ren et al. [84], who found that Akt2 ablation protects against cardiac aging and dysregulation through restored FOXO1-related autophagy as well as mitochondrial integrity. Inhibition of TORC1 is also well known to produce lifespan-extending effects in numerous model organisms, including mice [85–87]. TORC1 is a central regulator of cell growth and protein synthesis and is thus central to the accumulation of cellular protein characteristic of aged cells [88]. The lifespan-related effects of TORC1 inhibition, at least in part, involve subsequent activation of autophagy [87]; inhibition of autophagy during TORC1 inhibition abolishes the life-extending effects [89,90]. Notably, excess TORC1 activity may be implicated in the age-related decline in autophagy [91]. Thus, autophagy has become a focus as a pathological mechanism as well as a therapeutic target, both pharmacological and behavioral, in aging. Behavioral interventions that maintain or enhance autophagy hold practical potential for the promotion of healthy aging and longevity in humans.

EXERCISE ATTENUATES AGE-RELATED DEGENERATION AND PROMOTES LONGEVITY

The human body is built to undergo a greater degree of physical and metabolic stress than the modern human typically experiences [2]. Exercise represents a physiological stress to which the body (somatically) adapts in order to better accommodate that stress [34]. Repeated acute increases in the mRNA of stress-response genes lead to increased basal expression of proteins and increased capacity and function [92]. This response is not restricted to skeletal muscle; exercise elicits global responses and adaptations affecting virtually every system and organ [34,93]. This systemic response explains the ability of exercise to promote health, prevent disease, and treat symptoms in a wide swath of tissues and systems. Moreover, this phenomenon is age-independent, as older adults may also realize the therapeutic benefits of exercise [94,95]. Indeed, regular physical activity produces widespread positive effects on age-related health [31]. Recently, the underpinnings of the effects of exercise on health and aging have garnered much attention [3,96]. Autophagy has become increasingly implicated in the responses and outcomes of exercise, particularly those related to aging [7,97,98].

Humans are adapted for endurance-based activities [2]. A lack of endurance-based activity leads to progressive systemic degeneration, pathological aging, and premature death [31]. Indeed, physical inactivity increases the prevalence of over a dozen degenerative conditions spanning numerous tissues and systems, including the cardiovascular, musculoskeletal, endocrine, reproductive, and neurocognitive systems [31]. These include age-related diseases such as atherosclerosis, type 2 diabetes, sarcopenia, and osteoporosis, neurodegenerative diseases such as Alzheimer’s and Parkinson’s, and cancer [31]. Further, low aerobic capacity, or more specifically low maximal oxygen consumption (VO2max), is associated with increased mortality and may actually serve as a biomarker of premature death [31]. Reports suggest physically active people may extend lifespan by approximately 2–6 years beyond the average human lifespan [99–101]. Highly trained individuals have been shown to have lower mortality rates and higher life expectancies than non-athletic populations, as well as a lower prevalence of degenerative diseases [102,103].

Resistance-based exercise also promotes improved function and age-related health, particularly that of the musculoskeletal system. At the phenotypic level, aging is associated with decrements in skeletal muscle mass and strength leading to sarcopenia [104]. This progressive loss of muscle mass ranges from 1% to 2% per year in healthy, physically active adults aged fifty and over [105]. Skeletal muscle atrophy impairs muscular function, hinders force production, and reduces physical capacity. This decline in physical function is predictive of incident disability, nursing home admission, and, subsequently, all-cause mortality in the elderly [104]. However, resistance training can attenuate this age-related loss of muscle mass and mitigate the physical impairments concomitant with muscle wasting. It has been established that resistance training increases muscle size, muscle strength, and functional capacity [106–108]. Likewise, there is substantial evidence demonstrating that muscular fitness, particularly muscular strength, is inversely associated with all-cause mortality [104,106,107]. In men aged sixty and older, high muscular strength is correlated with a lower relative risk of all-cause mortality [104]. Additionally, the rate of strength decrement is a prominent risk factor for all-cause mortality in men aged sixty and younger, irrespective of muscle mass [104]. Furthermore, resistance training can improve physical functionality, including both neuromuscular and mass-related components, in sarcopenic adults [105,109]. This has been shown to increase performance in both simple and complex activities of daily life in the elderly [109,110]. Resistance training also produces significant benefits in glucose regulation and insulin sensitivity [111]. Moreover, resistance training may also prevent cognitive decline in older adults [112,113].
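As a purely illustrative projection of the 1–2% per year figure cited above, the short sketch below compounds a constant annual loss rate over 25 years; the constant-rate assumption and the 25-year horizon are choices made for the example, not findings of the cited studies.

def remaining_muscle_fraction(annual_loss_rate: float, years: int) -> float:
    # Compound a fixed fractional loss of muscle mass year over year
    return (1.0 - annual_loss_rate) ** years

for rate in (0.01, 0.02):  # 1% and 2% loss per year
    fraction_left = remaining_muscle_fraction(rate, 25)  # e.g., roughly age 50 to 75
    print(f"{rate:.0%}/year over 25 years -> ~{fraction_left:.0%} of starting muscle mass remains")
# Prints roughly 78% remaining at 1%/year and roughly 60% remaining at 2%/year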

AEROBIC EXERCISE ACTIVATES AUTOPHAGY

Aerobic exercise induces energetic stress, particularly in skeletal muscle [114], which mimics the physical activity demands placed on pre-modern humans. In response, autophagy is upregulated in order to meet the heightened energy demands as well as to mediate the clearance of proteins and organelles damaged by the heat, pH, and/or mechanical stresses of acute exercise [46]. Acute exercise has also been shown to induce autophagy in a number of other tissues, including the brain, heart, adipose tissue, and pancreatic β cells [115], as well as peripheral blood mononuclear cells (PBMCs) [116]. While the signaling mechanisms are not fully understood, this systemic autophagic response coincides with the ability of exercise to exert protective effects in diverse tissues, including in conditions in which autophagic function is implicated, such as neurodegeneration, insulin resistance, atherosclerosis, and inflammation [31,65,93].

Exercise activates autophagy through post-translational modification as well as transcriptional programs [46]. Elevations of AMP and NAD+ in response to acute exercise activate AMPK and SIRT1, respectively [34]. AMPK and SIRT1 upregulate autophagy through induction of the autophagic machinery, including autophagosome formation [52], activation of the Atg transcription factors FOXO1, FOXO3, and PGC-1α, and inhibition of mTORC1 (mammalian target of rapamycin complex 1) [46]. Alterations in Ca++ in response to muscle contraction also drive transcription of Atgs through the CLEAR network via TFEB, following its dephosphorylation by calcineurin, mTORC1 inactivation, and its translocation to the nucleus [117].

Tumor suppressor protein p53 has been identified as a regulator of autophagy [118,119] and is likely involved in exercise-induced autophagy [120]. p53 regulation of autophagy appears to be determined by post-translational modification [121] and/or subcellular location [118,122]. Nuclear p53 induces autophagy, whereas cytosolic p53 inhibits autophagy [118]. Acute exercise has been shown to increase nuclear p53 abundance [120] as well as increase phosphorylation of Ser15, which is indicative of increased activity [123]. In the single human study investigating the effect of acute exercise on p53 and autophagy activity, 1 h of cycling at 70% VO2max in untrained males increased nuclear localization of p53, but did not elicit changes in markers of autophagy [120].

While currently ill-characterized, muscle contraction-induced p53 activity appears to involve bidirectional cross-talk with AMPK, in which AMPK initially activates p53 via phosphorylation of Ser15 [122,124,125], while the downstream target gene Sestrin2 may act to further upregulate AMPK as well as inhibit mTORC1 [126–130]. Sestrin2 also serves as a downstream target of p53 in the modulation of autophagy activity [119,124,126,127] and has been shown to coimmunoprecipitate with AMPKα2 in response to acute exercise in mice [127]. C2C12 myotubes overexpressing Sestrin2 demonstrated increased phosphorylation of ULK1 Ser555 [126]. Further, Sestrin2 is also capable of directly associating with ULK1 and p62 to promote autophagic flux in HEK293 cells [131]. While it is unclear whether these in vitro responses reflect the signaling events that occur in exercised human skeletal muscle, Sestrin2 likely serves as a primary mode of p53–autophagy signaling, although p53 is also involved in the transcription of PGC-1α [122].

THE AUTOPHAGIC DOSE OF AEROBIC EXERCISE

The autophagy response to acute exercise appears to be regulated in a duration- and intensity-dependent manner [120,132,133]. In humans, 2 h of moderately-high-intensity cycling (70% VO2peak) resulted in greater activation of autophagy than low-intensity cycling (55% VO2peak) in the vastus lateralis [132]. While both the high- and low-intensity exercise protocols resulted in reductions of LC3II and the LC3II:LC3I ratio, only the high-intensity bout showed decreased protein levels of p62. p62 is a bridging protein involved in delivering substrates to the autophagosome [1], and decreased p62 content serves as an indicator of elevated autophagic flux [134]. Additionally, transcription of autophagy-related genes was upregulated only following the high-intensity bout, as measured by LC3, p62, GABARAPL1, and cathepsin L mRNA. Phosphorylation of ULK1 Ser317 was also significantly upregulated only in the high-intensity condition. Importantly, AMPK activity was significantly increased following the high-intensity bout but not the lower-intensity bout, implying the lower-intensity exercise did not reach a threshold of energetic stress [132].

Jamart and colleagues collected muscle biopsies from the vastus lateralis of 11 male ultra-endurance athletes 2 h prior to and immediately following a mean of ~24 h of treadmill running [135]. Energy expenditure was calculated from run distance, running time, and running economy, while energy intake was monitored throughout the bout. Energy intake during exercise covered 30% of energy needs, resulting in a caloric deficit over the 24 h. Phosphorylation of Akt (an upstream activator of mTORC1), mTORC1, and 4E-BP1 (a downstream target of mTORC1) decreased, while FOXO3a and AMPK activity significantly increased. LC3II protein expression was increased post-exercise by ~550% of baseline. This was accompanied by a significant increase in Atg12-Atg5 conjugation, another indication of autophagosome formation [135]. Similarly, the same group reported in a separate study that free-running ultra-marathon performance (lasting ~28 h) increased transcription of Atg4b, Atg12, GABARAPL1, and LC3 in the vastus lateralis, with magnitudes of increase ranging from 59% to 286% [133].
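To make the scale of such a deficit concrete, the sketch below uses hypothetical values (a 70-kg runner, 150 km covered, and the rough approximation of ~1 kcal per kg per km of running) together with the reported ~30% intake-to-expenditure ratio; none of these numbers are taken from the Jamart studies.

# Hypothetical back-of-the-envelope estimate of a ~24-h ultra-endurance energy deficit
body_mass_kg = 70.0            # assumed runner mass
distance_km = 150.0            # assumed distance covered in ~24 h
kcal_per_kg_per_km = 1.0       # rough running-economy approximation (assumption)

expenditure_kcal = body_mass_kg * distance_km * kcal_per_kg_per_km  # ~10,500 kcal
intake_kcal = 0.30 * expenditure_kcal                               # intake covering ~30% of needs
deficit_kcal = expenditure_kcal - intake_kcal                       # ~7,350 kcal deficit over the bout
print(round(expenditure_kcal), round(intake_kcal), round(deficit_kcal))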

In contrast, Moller and colleagues [136] showed that 60 min of cycling exercise at 50% VO2max resulted in a decreased LC3II:I ratio but no reduction of p62 in the vastus lateralis 90 min following exercise. No changes in Atgs were noted despite upregulation of AMPK activity and ULK1 Ser555 phosphorylation. Similarly, muscle biopsies taken before and after 20 min of low-intensity cycling (~50% VO2max) revealed no significant impact on protein expression of p62, LC3I, or LC3II, with no associated change in phosphorylated AMPK [137]. Whereas Schwalm et al. [132] showed that 2 h of cycling at 70% VO2max was sufficient to elicit increases in autophagic flux and a number of Atgs in well-trained athletes, Tachtsis et al. [120] reported no changes in ULK1, LC3I, LC3II, or p62 protein expression following 1 h of cycling at 70% VO2max in untrained males, despite an increased nuclear localization of p53. These findings highlight the importance of exercise duration and intensity in stimulating autophagic induction and point to a threshold for activation, likely involving AMPK-mediated determination of energy demand. As such, investigation into the autophagic response to high-intensity interval training (HIIT) is warranted, as scarce data exist. While not HIIT per se, Brandt and colleagues [138] showed that 60 min of cycling at ~60% VO2max interspersed with 30 s of high-intensity sprints every ten minutes increased LC3I, LC3II, and BNIP protein. However, there was no difference compared to 60 min of continuous cycling alone, suggesting an autophagic threshold had already been reached with the continuous exercise.

Interestingly, metabolic health appears to affect the autophagy response to acute exercise. Dysregulated autophagy occurs in numerous diseased states including those relating to aging such as poor glucose handling and type 2 diabetes [65]. Sixty minutes of cycling at 50% VO2max with 5 min of rest interspersed every 20 min increased autophagy in PBMCs of healthy adults, but not in prediabetic subjects [139]. Cultured PBMCs of the prediabetic subjects also showed a blunted autophagic response to rapamycin treatment.

These data are interesting in that a threshold for autophagy induction appears to exist at approximately ≥60 min of exercise at ≥50% VO2max. However, it is known that regular aerobic exercise and physical activity at lower durations and intensities, such as walking, can produce positive health outcomes and increase basal expression of Atgs (detailed in the following sections). This calls into question whether heightened activation following an acute bout of exercise is required for chronic changes in autophagy function.
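As a speculative summary of the human findings above, and not a validated rule, the snippet below encodes that apparent duration-and-intensity threshold; it deliberately ignores exceptions such as the untrained participants of Tachtsis et al. and differences in training status, tissue, and markers.

def likely_crosses_autophagy_threshold(duration_min: float, pct_vo2max: float) -> bool:
    # Crude heuristic distilled from the acute-exercise studies discussed above
    return duration_min >= 60 and pct_vo2max >= 50

print(likely_crosses_autophagy_threshold(120, 70))  # 2 h at 70% VO2peak (Schwalm et al.) -> True
print(likely_crosses_autophagy_threshold(20, 50))   # 20 min at ~50% VO2max -> False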

THE AUTOPHAGIC RESPONSE TO RESISTANCE EXERCISE

While limited data are presently available describing the relationship between endurance exercise and autophagy, even fewer exist for resistance exercise. However, given their distinct stresses, metabolic versus mechanical, it is likely that the autophagic responses to aerobic and resistance exercise, respectively, are dissimilar. Primary adaptations to resistance exercise (i.e., muscular hypertrophy and strength) rely on the activation of mTORC1 and the accretion of myofibrillar proteins, where rates of protein synthesis exceed those of degradation [140–142]. Mechanical loading of skeletal muscle activates mTORC1 for up to 48 h following acute resistance training [140,141]. And while protein degradation is also activated [140,143], autophagy activity in the post-resistance exercise period is not well characterized.

A single study has shown autophagy increased post-resistance training in older adults. Fry et al. [143] reported that 8 sets of 10 repetitions of leg extension at 70% of one repetition maximum (1RM) decreased LC3II protein content in the vastus lateralis at 3, 6, and 24 h post-exercise in older individuals (70 ± 2.0 years), as well as at 6 and 24 h in younger individuals. The LC3II:I ratio was also reduced at 3, 6, and 24 h in both young and older individuals. Smiles et al. [144] conducted a study combining resistance training and short-term CR in young, resistance-trained subjects. Biopsies from the vastus lateralis were taken 1 h and 4 h following resistance exercise, which consisted of 6 sets of 8 repetitions of leg press at 80% 1RM following 5 days of energy deficit. However, there were no changes in protein or mRNA expression of a number of autophagy signaling proteins, Atgs, or markers of autophagic flux at either post-exercise time point. Similarly, Glynn et al. [145] showed that 10 sets of 10 repetitions of leg extension at 70% 1RM did not result in any change in LC3II protein content or the LC3II:I ratio at 1 h post-exercise despite a significant increase in AMPK activity post-exercise. Presently, more data are needed to characterize the autophagy response to resistance-based exercise as well as its relationship with mTORC1 in the post-exercise period.

THE CHRONIC EFFECTS OF EXERCISE ON AUTOPHAGY

In addition to providing substrate during the energetic demands of exercise, autophagy is required for a number of metabolic responses and outcomes [116,132,135,137,146,147]. This reflects the systemic maintenance that physical activity would have instigated in pre-modern humans. In a benchmark study, He et al. [146] showed that mutant mice deficient in exercise-induced autophagy had impaired AMPK activation, GLUT4 translocation, and glucose uptake in skeletal muscle, in addition to reduced exercise capacity. Another rodent study reported that deregulation of Atg4 resulted in the accumulation of defective mitochondria, impaired oxidative energy production, and depleted muscular ATP stores in cardiac and skeletal muscle, in addition to poor exercise capacity [148]. Other work demonstrates that autophagy is necessary for aerobic training-induced oxidative shifts of muscle fibers [98], mitochondrial biogenesis [98,147,149], and angiogenesis in mice [98]. Chaperone-assisted selective autophagy (CASA) is also involved in cytoskeleton maintenance and adaptation in skeletal muscle in response to resistance training in humans [150].

Data are scarce relating to the chronic effects of training and physical activity on autophagy. However, several rodent studies point to augmented autophagic activity following regular exercise. Markers of autophagy and autophagy signaling, LC3, Atg7, beclin-1, and FOXO, were increased in skeletal muscle of mice following 4 and 8 weeks of endurance training [98,151]. Ten weeks of HIIT produced increases in LC3II, LC3I, the LC3II:I ratio, beclin-1, Atg3, Atg16, and Atg12 in cardiac muscle of mice; however, moderate-intensity continuous exercise did not [152]. Similarly, HIIT for ten weeks increased LC3II, LC3II:I, Atg3, and beclin-1 in skeletal muscle as well as myocardium in mice [153]. Moreover, nine weeks of resistance exercise resulted in increased levels of Atg5, Atg12, Atg7, and beclin-1 in aged rats [154]. This outcome was accompanied by increased total AMPK protein, increased AMPK phosphorylation, and increased FOXO3 activation, as well as downregulated mTORC1 activity.

In humans, Brandt et al. [138] showed that eight weeks of 60 min of cycling at ~60% VO2max, either continuous or interspersed with 30 s of high-intensity sprints every ten minutes, elevated expression of LC3I and BNIP in skeletal muscle of moderately trained males (19–33 years old, mean age 25 years), while the continuous training also increased beclin-1. Additionally, four weeks of resistance training increased markers of CASA in skeletal muscle of 25 ± 2 year old males [150]. Evidence suggests autophagy may be augmented in older adults as a consequence of regular exercise and habitual physical activity. In older obese women (67.1 ± 8.8 years), six months of moderate-intensity walking (bouts of 15 min totaling 150 min per week) and resistance training led to elevations of LC3, Atg7, and FOXO3 mRNA in skeletal muscle [155]. Eight weeks of 25–30 min of aerobic training at 70–75% of maximum heart rate with 1-min intervals at 90–95% of maximum heart rate (progressing from 1 to 4 intervals over the training period) increased protein expression of Atg12, Atg16, beclin-1, and the LC3II:I ratio in PBMCs of older men and women (69.7 ± 1.0 years) [156]. Eight weeks of resistance training also increased expression of Atg12, Atg16, and LAMP2 and reduced p62 content and ULK1 Ser757 phosphorylation in PBMCs of older men and women (69.6 ± 1.0 years) [157]. Additionally, expression of Bcl-2 and the Atg5-Atg12 complex was higher in older males (52 ± 11 years) who engaged in lifelong recreational soccer (football) training compared to non-active, age-matched males [158]. These training- and physical activity-related autophagic enhancements were associated with improvements in predictors of all-cause mortality and age-related health: weight loss and increased walking speed [155], increased aerobic power [156,158], and enhanced muscle strength [157].
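For context on what those relative intensities mean in absolute terms, the sketch below converts them to heart-rate targets using the common age-predicted estimate HRmax ≈ 220 − age; that estimate and the 70-year-old example are illustrative assumptions, not values from the cited trials.

def heart_rate_zone(age_years: int, low_fraction: float, high_fraction: float) -> tuple[float, float]:
    # Age-predicted maximum heart rate (a rough population-level approximation)
    hr_max = 220 - age_years
    return hr_max * low_fraction, hr_max * high_fraction

age = 70  # hypothetical participant near the studied cohorts' mean age
print(heart_rate_zone(age, 0.70, 0.75))  # base intensity: roughly 105-113 beats per minute
print(heart_rate_zone(age, 0.90, 0.95))  # interval intensity: roughly 135-143 beats per minute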

Autophagy function declines with aging, and this decline is associated with age-related pathologies [4,28,66,67]. An enhancement in autophagic function likely explains, at least in part, the robust effects of regular exercise on age-related health and longevity. Interestingly, physical activity below the durations and intensities documented to acutely activate autophagy is capable of eliciting chronic changes in Atgs [155,156]. This suggests that physical activity at doses that may be more tolerable, particularly for aging populations, can still produce changes in autophagy.

RESTRICTION OF CALORIES AUGMENTS AUTOPHAGY

Caloric restriction and nutrient depletion increase autophagy activity, as this would have provisioned substrates for metabolism during periods of energy shortage. These stresses activate autophagy in vitro and in vivo in animals and humans through signaling pathways similar to those of acute exercise, including elevated AMPK and SIRT1 activity and mTORC1 inhibition [3,159], and are associated with increased lifespan and longevity [8,22,45,90]. Caloric restriction increases the lifespan of yeast, C. elegans [90], and Drosophila two- to three-fold [160], and by 30–50% in mice [161,162]. In non-human primates, CR reduces mortality and attenuates age-related pathologies [163,164]. In humans practicing CR, risk factors for all-cause mortality are reduced and markers of longevity are increased compared to non-CR controls and populations [22,165–169]. Similar to other primates [163,164], humans may see longevity-promoting effects at ~25–30% CR, a degree of energy restriction that does not induce malnutrition [22,167,169]. Data from members of the Calorie Restriction Society, who employ a ~30% CR, show that long-term practitioners (average of >15 years) are leaner than age-matched controls and have reduced risk factors for type 2 diabetes, cardiovascular disease, stroke, cancer, and vascular dementia. Further, these individuals had healthier blood pressure, fasting glucose, insulin, and lipid levels, as well as decreased C-reactive protein, TNFα, and IL-6, and improved insulin sensitivity [167,170,171]. Calorie Restriction Society members also had better left ventricular end-diastolic function and reduced sympathetic and increased parasympathetic modulation of heart rate variability [170,172,173]. In fact, the heart rate variability of members was comparable to that of individuals 20 years younger [173]. Shorter practices of CR can also improve markers of metabolic health and longevity. For example, CR of 25% for two years has been shown to improve adiposity, circulating leptin, fasting insulin, insulin sensitivity, blood lipids, C-reactive protein, TNFα, and IGFBP-1 [172,174–176]. Notably, however, neither two-year CR nor long-term CR reduced circulating IGF-1 [166], a pro-aging biomarker. Short-duration CR has also been shown to reduce blood pressure [177] and blood glucose [177,178].

While CR produces longevity-associated effects, adherence is difficult, as it runs counter to our evolved food-seeking behaviors [17,18]. Additionally, long-term CR may introduce concerns of malnutrition when physician oversight is absent, as may be the case for healthy adults seeking to prevent disease and promote longevity [179]. Caloric restriction also produces reductions in metabolism and losses of body mass, particularly lean body mass, which may be a concern for aging individuals and/or sarcopenic adults [179]. Indeed, CR can decrease bone mass, muscle size and strength, and maximal aerobic capacity in proportion to the reduction in body weight [175,176]. However, short-duration, total restriction of calories on the timescale of hours or days, such as in IF and time-restricted feeding (TRF), appears capable of producing similar or even more robust effects without the losses of lean body mass observed with CR [180]. Short-duration restriction of calories may also be more tolerable than long-term CR. Moreover, Mitchell and colleagues [181] propose that much of the life-extending effect of CR may be attributable to TRF, following a study of single-meal feeding in mice.

A number of fasting practices exist with varying timeframes of CR. Strict fasting involves cessation of caloric intake for a prolonged period of time (i.e., days to weeks) to reduce the frequency of caloric consumption, with or without a reduction of total caloric consumption [182,183]. Alternatively, IF involves frequent, short-duration CR such as “5:2 fasting”, alternate-day fasting, periodic fasting, and daily TRF, all of which are capable of producing health benefits comparable to those of CR [183–185]. The 5:2 protocol involves ad libitum feeding 5 days per week with 2 days of severe CR (i.e., ~500 kcal/day) or complete cessation of caloric intake (i.e., a 24-h fast) on non-consecutive days [179,183]. Alternate-day fasting utilizes a 24-h fast every other day, and periodic fasting incorporates an extended fast (i.e., up to 48 h) repeated on the basis of weeks to months [186]. Time-restricted feeding is a daily fasting cycle that restricts feeding to a defined period of time (i.e., 8 h) while fasting for the remainder of the day (i.e., 16 h) [187]. Frequent, short-duration fasting practices may prove effective, as IF every other day has been shown to increase lifespan in rats more than fasting every third or fourth day [188]. Additionally, TRF promotes weight loss while maintaining lean mass in both rodents and humans [189,190].
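As a rough, back-of-the-envelope illustration of how an intermittent protocol compares with continuous CR in weekly terms, the sketch below assumes a hypothetical 2400-kcal/day maintenance intake and two 500-kcal fast days; the numbers are examples, not data from the cited trials.

maintenance_kcal_per_day = 2400.0                 # assumed daily maintenance intake
weekly_maintenance = 7 * maintenance_kcal_per_day

# "5:2" fasting: five ad libitum days (assumed at maintenance) plus two ~500-kcal fast days
weekly_5_2 = 5 * maintenance_kcal_per_day + 2 * 500
# Continuous 25% caloric restriction every day of the week
weekly_continuous_cr = 7 * maintenance_kcal_per_day * 0.75

print(f"5:2 weekly reduction: {1 - weekly_5_2 / weekly_maintenance:.0%}")                          # ~23%
print(f"Continuous 25% CR weekly reduction: {1 - weekly_continuous_cr / weekly_maintenance:.0%}")  # 25%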

Data on variations of IF show improved markers of cardiometabolic health and longevity. As highlighted by Anton and colleagues [191], IF produces decrements of 6–21% in total cholesterol, 7–32% in LDL cholesterol, 16–42% in triglycerides, 3–8% and 6–10% in systolic and diastolic blood pressure, respectively, and 3–6% in fasting glucose. Moreover, it has been observed that even incomplete restriction of calories (i.e., intermittent CR) improves a number of cardiometabolic health markers [192,193]. For example, seven weeks of a liquid-based or food-based IF-CR protocol of 880–1080 kcal for six days followed by one day of 120 kcal, totaling a 30% reduction in kcals over the seven-day protocol, improved body composition, lipid profiles, and adipokines [194]. However, the liquid-based IF-CR diet produced greater changes in LDL, LDL peak size, small LDL particles, total cholesterol, and triglycerides. The liquid-based IF-CR diet also decreased circulating leptin, IL-6, TNFα, and IGF-1, while the food-based IF-CR only reduced leptin. A fasting-mimicking diet (FMD) created by Longo’s group produces improvements in a number of age-related health markers [195,196]. The five-day FMD protocol consists of one day of ~1100 kcal followed by ~700 kcal on days two to five and has produced improvements in numerous metabolic and longevity-related parameters, including body composition, blood pressure, blood lipids, fasting glucose, C-reactive protein, and IGF-1 concentrations, after being completed once per month for three months [195,196]. Improvements have also occurred after a single monthly cycle, although these were less pronounced [195]. Interestingly, a liquid-based version of the FMD resulted in greater changes in these parameters [195]. A post-hoc analysis of one investigation [195] revealed the FMD exerted greater effects in participants with elevated risk factors or metabolic markers associated with metabolic syndrome and age-related diseases, such as high BMI, blood pressure, fasting glucose, triglycerides, cholesterol, C-reactive protein, and IGF-1. Further, a follow-up showed many of these improvements were maintained three months later.
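The arithmetic of that five-day schedule is simple to lay out; in the sketch below, the 2000-kcal/day habitual intake used for comparison is an assumed reference value rather than a figure from the FMD trials.

fmd_kcal_by_day = [1100] + [700] * 4         # day 1 ~1100 kcal, days 2-5 ~700 kcal each
fmd_total_kcal = sum(fmd_kcal_by_day)        # 3900 kcal over the five FMD days
usual_total_kcal = 5 * 2000                  # assumed habitual intake over the same five days
reduction = 1 - fmd_total_kcal / usual_total_kcal
print(fmd_total_kcal, f"{reduction:.0%}")    # 3900 kcal, ~61% reduction across those five days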

Energy deprivation stimulates autophagy [67,90,197]. Currently, however, few human data relating to CR practices exist. Five days of energy reduction from 45 kcal/kg of fat-free mass per day to 30 kcal/kg of fat-free mass per day decreased expression of Atg5, but did not produce changes in FOXO1, FOXO3, ULK1, cAtg12, beclin-1, LC3I, or p62 expression, or in mRNA abundance of beclin-1, Atg12, Atg4b, GABARAP, BNIP3, or LC3. Data from members of the Calorie Restriction Society, however, show an upregulation of autophagy activity as well as decreased markers of growth factor signaling [169]. Mercken and colleagues [169] observed increased expression of AMPK, SIRT family proteins, FOXO3A, FOXO4, PGC-1α, beclin-1, Atg4B, and LC3 in skeletal muscle of 15 lean and weight-stable members following an average of 9.6 years of CR. Growth factor signaling through PI3K and Akt was also downregulated. Yang et al. [22] reported upregulation of ULK1, Atg101, beclin-1, Atg12, GABARAP/GATE-16, Atg4B, and LC3 in skeletal muscle of very lean and weight-stable members practicing CR for 3–15 years (mean years not described). The heat shock protein response, another proteostatic system, was also upregulated.
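To put that prescription in absolute terms, the short calculation below assumes a hypothetical participant with 60 kg of fat-free mass; the fat-free mass value is illustrative and not taken from the study.

fat_free_mass_kg = 60.0                       # hypothetical participant
baseline_kcal = 45 * fat_free_mass_kg         # 2700 kcal/day at 45 kcal/kg FFM
restricted_kcal = 30 * fat_free_mass_kg       # 1800 kcal/day at 30 kcal/kg FFM
reduction = 1 - restricted_kcal / baseline_kcal
print(baseline_kcal, restricted_kcal, f"{reduction:.0%}")  # 2700.0 1800.0 ~33% reduction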

Stimulating autophagy is a primary theme of IF within the scope of health and longevity, particularly in the lay media. However, minimal human data exist, which makes IF and TRF recommendations and guidelines toward this end premature. Jamshed et al. [185] showed that four days of 6:18 TRF, in which meals were consumed between 8 am and 2 pm, increased whole blood cell levels of Atg12 in the evening fasting hours of day three, and SIRT1 and LC3 mRNA at the end of the fasting period (morning of day five). Interestingly, mTOR mRNA was also increased during the evening of day three. These data are intriguing and should encourage further investigation into the autophagic response to IF and TRF in different tissues in order to establish efficacious interventions. It is also interesting to speculate on the role of nutrient restriction (i.e., protein restriction) versus restriction of total calories in stimulating autophagy. This brings into question the influence of the macronutrient composition of feedings during CR and IF protocols.

MACRONUTRIENT CONSIDERATIONS FOR CALORIE RESTRICTION

A feature of CR implicated in its effects is a shifting of reliance from glucose to fatty acids, in which lipophagy is involved [59]. As such, carbohydrate, particularly in lay literature and media, is often targeted as the macronutrient to restrict during CR or IF in order to promote fatty acid and ketone oxidation. However, in the studies discussed in this paper, carbohydrate intake remained within the range of a normal Western diet, ~45–60% of daily energy intake, during CR and IF periods [22,166,167,169,185,194–196]. Daily protein intake, on the other hand, was kept under ~15% of daily kcals. Indeed, the Longo FMD protocol restricts protein to ~10% of daily energy intake during its five days, while carbohydrate intake stands at ~45% [195,196]. Therefore, it seems carbohydrate does not need to be restricted during CR or IF. It is not known whether higher intakes of protein, as would occur with carbohydrate restriction, during CR or IF would produce similar longevity effects, or whether a protein intake of ≤15% of kcals is necessary, as mTORC1 inhibition and suppression of IGF-1 concentrations require protein restriction. It is likely, however, that ample protein intake is warranted following a period of protein-restricted IF to replenish the amino acid pool for resynthesis of cellular components [142].

Activity of TORC1 is central to aging [72,198]. Inhibition of TORC1 activity produces robust lifespan extension in a number of model species, likely in part through the removal of its negative regulation of autophagy [89,199,200]. The TORC1 pathway is stimulated by protein intake via direct amino acid signaling from cellular uptake as well as growth factor signaling via IGF-1 [141]. The increase of intracellular amino acid concentrations causes an accumulation of amino acids in the lysosome [47]. This accumulation of lysosomal amino acids activates the Rag protein complex, causing it to bind the Raptor component of TORC1 and resulting in TORC1’s translocation to the lysosomal membrane, where it is activated by Rheb [47,141]. Protein intake also influences circulating IGF-1 concentrations [166] and stimulates TORC1 activity through Akt-PI3K growth factor signaling [141]. Protein restriction downregulates TORC1 activity by decreasing amino acid availability and is required to decrease concentrations of IGF-1 during CR. Caloric restriction for 1 and 6 years with ~24% (1.73 g/kg) and ~19% (1.16 g/kg) of daily energy from protein, respectively, did not decrease circulating IGF-1 [166]. However, when 6 of the 28 participants decreased protein intake from ~24% to ~10% (0.76 g/kg) for three weeks, serum IGF-1 was reduced by 25%. Despite the lack of IGF-1 suppression, CR still improved insulin sensitivity, circulating leptin, C-reactive protein, insulin, and triiodothyronine [166]. However, protein and amino acid intake in humans, particularly from animal sources, is associated with age-related disease and mortality [201], although there is no consensus on the strength of that association [202].
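The two ways protein intake is expressed above (percent of daily energy and grams per kilogram of body mass) are related by simple arithmetic, assuming ~4 kcal per gram of protein; the body mass and energy intake in the example below are hypothetical, chosen only to show that ~10% of energy can correspond to roughly the ~0.76 g/kg reported.

def protein_g_per_kg(pct_of_energy: float, energy_kcal_per_day: float, body_mass_kg: float) -> float:
    # Convert protein as a fraction of daily energy into grams per kg body mass per day
    protein_kcal = pct_of_energy * energy_kcal_per_day
    return protein_kcal / 4.0 / body_mass_kg   # ~4 kcal per gram of protein

# e.g., ~10% of an assumed 2100-kcal daily intake for an assumed 70-kg person
print(round(protein_g_per_kg(0.10, 2100, 70), 2))  # ~0.75 g/kg per day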

Insulin can also activate Akt-PI3K growth factor signaling [203] and inhibit autophagy [204]. However, as demonstrated by the studies cited above, carbohydrate intake and the resulting insulin signaling during CR and IF do not impair their longevity-promoting effects or autophagy activity [22,166,167,169,185,194–196]. The significance of macronutrient composition during CR practices and the interplay of protein and energy restriction should be a point of interest, particularly for the maintenance of lean body mass in older adults, adaptation to exercise, and long-term adherence.

FUTURE STUDY OF AN AGE-OLD RESPONSE

While aerobic exercise and CR are emerging as efficacious means to promote autophagy and age-related health, data are very scarce relating to IF and autophagy, despite the popularity of the meme in lay and social media. This is not to dismiss the other documented effects of IF on markers of health and longevity, nor the potential autophagic responses to IF that have yet to be documented. What is also not known is the combined effect of exercise and IF on autophagy and longevity. Much exercise research is conducted in the fasted state, which does affect the acute autophagy response [124]; however, the chronic effects of combined IF and exercise are not known. It stands to reason that engaging in exercise, whether aerobic, resistance-based, or both, in addition to an IF protocol may yield greater effects than either alone. Such conditions would mimic those to which our genes are adapted. Future research investigating the combined effects of exercise and IF should be pursued, as it would enhance our ability to design effective prescriptions to promote longevity in humans. Moreover, identifying tolerable or minimal doses of exercise and CR would prove beneficial considering our evolved avoidance of these practices.

CONCLUSION

Human evolutionary history has been characterized by frequent demands of physical activity and periodic privation, which shaped our genes [2]. Humans accommodated these stresses by degrading intracellular components for energy through a billions-of-years-old recycling system, autophagy, which promotes cellular homeostasis, organismal function, and longevity [46]. In the modern environment of sedentary lifestyles and caloric excess, autophagy is not activated, leading to the accumulation of cellular components and organismal degeneration, which compound with age [3]. Exercise and CR practices are actionable therapies that augment autophagy, attenuate degeneration, and promote longevity with virtually no side effects given proper instruction and/or supervision [22,31,96,102]. Moreover, these responses are age-independent and thus are likely to be impactful in younger adulthood to promote healthy aging as well as in older adulthood to improve age-related health. While the fine details have yet to be resolved, the meme rings true: diet and exercise will help you live longer.

However, just as we have inherited the genes to respond to exercise and CR, we have also inherited the genes to avoid them [2]. Be that as it may, humans are unique in that we are conscious of our evolutionary operating systems and can make decisions at the level of the individual rather than that of the gene. Moreover, we can influence our own cultural attitudes and behaviors. We are capable of harnessing our own genes and propagating our own memes for our own health and benefit. Clinicians and practitioners, who occupy a key position toward this end, would be well served to understand the relationship between exercise, CR, and human genes, and to become memesters.

CONFLICTS OF INTEREST

The authors declare that there is no conflict of interest.
