In 2012 Ian Deary and his colleagues at the University of Edinburgh tested whether older adults who looked less attractive died before their more attractive peers. The authors asked people to rate photographs of 292 adults, all aged 83. Raters judged how old, healthy, attractive, intelligent and happy each face looked, and also how symmetrical it was (whether the left side of the face was proportionally similar to the right). The authors then followed the people in the photographs over a seven-year period to see who died first. The main predictor of death was how old a person was judged to be, followed, after accounting for perceived age, by how healthy they were rated from the photographs. Looking more attractive conferred no advantage once perceived age was taken into account. Looking older, rather than looking less attractive, predicted an early death. But the two are related: looking attractive is also related to looking younger. Age shapes how we judge people as attractive.
Look around and you see people who look better than others, and by better, of course, I mean younger. We naturally assume that looking younger is healthier and more attractive. The dress that takes ten years off, or the haircut that makes you look younger: these are compliments. And we can easily speed up aging, through stress for example. We all know people who have gone through a trauma and "aged" quickly. We carry this idea of aging as a process that can speed up or slow down. One of the main stressors of modern life is money, or the lack of it. We know that rich people live longer, but are they also more attractive? Such a relationship could work from both sides, with attractive people getting preferential treatment and becoming more successful, which in turn allows them to make their lives better.
In 2014 Susanne Huber and Martin Fieder found that having rich parents predicts facial attractiveness in young adulthood (17-20 years old). Attractiveness, of course, is largely a matter of facial symmetry. In 2001 Deborah Hume and Robert Montgomerie at Queen's University, Canada, examined this symmetry. They found that women's facial attractiveness was best predicted by their body fat and previous health problems, while men's was best predicted by their wealth and how comfortable their environment was. Attractiveness seems to reflect how well an individual coped with stress while growing up: for women it is mainly weight and health, for men it is mainly money.
The economics of beauty has been written about extensively. Daniel Hamermesh consolidated some of these thoughts in his 2011 book Beauty Pays. There is no age limit to vanity. In the US, single women aged seventy and older spend over forty-three minutes a day on grooming. Archeological sites show that grooming behavior extends across the world and throughout human history. What we think of as beautiful differs by country, culture and era, but there are certain constants, and being younger is one of them. There are no older Venuses and no older Davids. Old age has never been paraded as an example of beauty, in any culture.
This is why women are judged more harshly for their looks than men, and why aging is seen as more detrimental to women in how they are treated by others. For people who were not born with attractive features, or who have been in an accident, Hamermesh notes that cosmetic surgery has been a solution for many older adults, increasingly including older men. In 2016 the US spent $16.4 billion on cosmetic procedures, roughly one and a half times the total economic output of Malta ($10.95 billion in 2016). Americans spend more on having body parts modified than Malta's entire economy produces. And one of the main groups doing so is those entering older age, 55 years and older.
In 2016, those 55 years and older underwent 4.1 million cosmetic procedures, an increase of 3-4% over the previous year. Of these, 387,000 were surgical procedures and 3.7 million were minimally invasive procedures (injections and friction-based treatments). For middle-aged adults the main surgical procedures were eyelid surgery, facelift, dermabrasion, liposuction and forehead lift. Minimally invasive procedures included, in order of popularity: Botox, soft tissue fillers, chemical peels, laser skin resurfacing and microdermabrasion. While the most popular procedures among young adults focus on their bodies, older adults are apparently more concerned with their most visible feature, the face. Older adults know that they are being judged by how old they look, and their faces are their calling cards.
For the older adults who have had cosmetic surgery, looking younger will not postpone death. What they are fighting is not death but judgment. In a world that judges attractiveness by how old we look, older adults in greater and growing numbers are fighting back by attempting to look younger. But it is the judgment that needs to change. Such vanity discrimination draws striking parallels with ageism, racism and sexism. The only way to confront these is not by becoming the "other" but by eliminating the category of other altogether.
In our increasingly digital world, we get an enormous amount of information from films. Films have always fired our imagination, a relationship that has endured since the first films were made. How older people are portrayed in film is best described through the narrative arc: the linear development of a story, with a beginning, a middle and an end.
One of the first films to tell a simple story about older people is Ikiru (1952) by the acclaimed Japanese director Akira Kurosawa, of Seven Samurai, Rashomon, and Ran fame. Ikiru has a fairly simple narrative arc. An older man who has worked in an office all his life, on the cusp of retirement, is informed that he has terminal cancer. The arc follows the main character as he attempts to find meaning and leave behind a legacy before he dies. This simple story highlights that after an entire life spent doing what one is supposed to do (work, maybe family), what matters at the end is relationships. In the end, he finds some solace and friendship among his younger colleagues.
This narrative arc of an older man at the end of life was developed further by another seminal director, Ingmar Bergman, who in 1957 wrote and directed Wild Strawberries. Filmed in black and white, perhaps in homage to Ikiru, the film goes further in search of the meaning of a life. Following a fairly similar story, this time of an accomplished professor, Wild Strawberries explores the question of what it was all about. We have no ambitions for getting old, and once we get there, we remain without a plan. Admired but not loved, the professor starts to explore what the continuation of his story in older age should be. As in Ikiru, relationships seem to be the answer. Such a conclusion is not far from what we observe at the end of life.
In 2012 Bronnie Ware, an Australian palliative care nurse, wrote The Top Five Regrets of the Dying: A Life Transformed by the Dearly Departing. Our two protagonists in Ikiru and Wild Strawberries embody these regrets: having unfulfilled dreams and unrequited loves; not having the courage to follow those dreams, with (mostly) men regretting working so hard; stifling feelings in order to settle for a mediocre existence; not staying in touch with friends and loved ones; and, finally, not allowing oneself to be happy. They got stuck in a rut. The agreement between the narratives of these two films and the five regrets of the dying is stunning.
Films on aging tend to start with a negative view of aging and then transform into a story about friendship and family: it is not too late to address past regrets. But what if this transformation does not take place? What if the negative view of aging remains, without the salvation of a new-found story for older age? This is the story of the two characters in the 2015 Italian film Youth.
Paolo Sorrentino's film centers on two close friends sharing a vacation at an exclusive Swiss spa. One is a film director who keeps producing the same kind of films, surrounded by ever younger writers. The other is a composer who has decided to retire. The composer stopped composing, to the chagrin of many, because of his wife's dementia, which he hid from everyone including his daughter. He made changes that addressed this trauma and his aging. Negative events in life change our story, sometimes for the better: we realize what is important. The director, in contrast, had only one story, to keep doing what he had done in the past. He had no different story for when he got old, and the quality of his work diminished. In the end, suicide was his only answer to a failing career, because he had no plan B, no evolving story for getting old.
We also place people in a story. We create a cage for them. Do a little exercise with me.
Let's imagine that you are going to interview a 100-year-old woman. What is the single question you would ask her? Write it down. Now assume a 16-year-old girl is coming to be interviewed. What single question would you ask her? Write it down.
The prediction is that you probably asked the older woman about her past and the girl about her future. You have already hemmed them into your view of what their story should be.
To age successfully we must have a story that goes beyond adulthood and extends into older adulthood. Our story matters because it is how we conduct our life, including in older age. What films teach us is that others can influence our story about getting older.
In 2011 Heiko Braak and his colleagues did something no one had done before: they looked for the biology of dementia in brains across the whole lifespan, including those of young children. Dissecting 2,332 brains ranging in age from 1 to 100, what they found would change how we see the disease. Only 10 people showed a complete absence of Alzheimer's disease-related biology. Every person over 25 years of age had Alzheimer's disease biomarkers, without exception. Even among children under 10, one in five already showed signs of the disease. By this measure, every adult is sick. Heiko Braak and his wife Eva know a few things about the disease: in 1991 they published the six stages of dementia pathology that we now know as the Braak and Braak stages.
The finding that every adult has some of the biology that contributes to Alzheimer's disease did not make much news until recently. In 2018 the United States National Institute on Aging, an agency set up in 1974 to explore ways to promote the health of older adults, sponsored a new way to define Alzheimer's disease. This new framework used the biology of the disease alone, ignoring how the disease is expressed. For the first time in the history of Alzheimer's disease we are defining it not by how it looks (the loss of memory, possible behavior changes and mood swings) but by the biology alone. The problem, as Heiko Braak found, is that using the biology as the indicator of the disease makes all of us Alzheimer's patients.
It resembles the Catholic doctrine of original sin, where everyone is born with a sin that eventually takes away free will: Alzheimer's disease, we are told, is already in all of us and will eventually take away our freedom of will too. We are moving science back toward religion. But unlike original sin, where baptism somewhat absolves us from this fate, with Alzheimer's disease there is no cure and no absolution. We are doomed whether we show dementia or not. Even healthy adults show the biology of the disease; there is no escaping it.
This new way of diagnosing Alzheimer's disease is dangerous, not only for hospitals and clinics that have to deal with the new definition, but also legally. What if a court argues that you are an incompetent witness (say, someone stole money from you) because it can be shown that you have Alzheimer's disease and therefore an unreliable memory? There are already real-life examples where a diagnosis of Alzheimer's disease reduces your value as a witness in court. In parts of the U.S. a diagnosis of Alzheimer's automatically revokes your driving privileges: you lose your license by the time you leave the doctor's office, since it is a reportable disease that goes directly to the motor vehicle department. If you have business loans, you will likely lose those too. A diagnosis of Alzheimer's disease might also land you in a nursing home, whether you want it or not. This would be disastrous if all of these things happened while the person was still behaving normally. People like you and me, reading this now.
The only group to benefit from making everyone an Alzheimer's patient is the drug companies. Most of the researchers involved in this new definition of Alzheimer's disease have investments in, and connections with, large drug companies; some of the authors reported working for drug companies themselves. In 2011 such conflicts of interest in France resulted in treatment guidelines being withdrawn: guidelines for type 2 diabetes and Alzheimer's disease issued by the French Health Authority were struck down by France's highest administrative court. The court ruled that the potential bias and undeclared conflicts of interest among the authors "contravened national law on conflicts of interests and the agency's own internal rules." And according to a 2012 Consumer Reports analysis, Alzheimer's drugs cost a lot and help just a little: none work without side effects, and none work long term.
There is a certain attitude of playing god here: telling nature that it made a mistake and then trying to fix it. Perhaps the disease of dementia is not caused exclusively by this biology, as so many researchers have been saying for more than a hundred years. The brain is the most complex organ in the universe; many things can go wrong (wrong to us, anyway, but perhaps this is nature's way). One effect of this new method of determining whether someone has Alzheimer's disease is that we begin to lose trust in our doctors. Looking at the biology alone is not what doctors are trained for; they are trained to look at the expression of the disease. In a way, this biological view of disease sidesteps doctors' experience and diagnostic skill and makes everyone a patient for the drug companies.
Braak, H., & Braak, E. (1991). Neuropathological stageing of Alzheimer-related changes. Acta Neuropathologica, 82(4), 239-259.
Braak, H., Thal, D. R., Ghebremedhin, E., & Del Tredici, K. (2011). Stages of the pathologic process in Alzheimer disease: age categories from 1 to 100 years. Journal of Neuropathology & Experimental Neurology, 70(11), 960-969.
Jack, C. R., Bennett, D. A., Blennow, K., Carrillo, M. C., Dunn, B., Haeberlein, S. B., Holtzman, D. M., Jagust, W., Jessen, F., Karlawish, J., Liu, E., Molinuevo, J. L., Montine, T., Phelps, C., Rankin, K. P., Rowe, C. C., Scheltens, P., Siemers, E., Snyder, H. M., & Sperling, R. (2018). NIA-AA Research Framework: Toward a biological definition of Alzheimer's disease. Alzheimer's & Dementia: The Journal of the Alzheimer's Association, 14(4), 535-562. Supplemental material accessed online 5/8/2018: https://www.alzheimersanddementia.com/cms/attachment/2119162008/2089988545/mmc1.docx
Lenzer, J. (2011). French guidelines are withdrawn after court finds potential bias among authors. BMJ 342: d4007
After more than a century of research, the National Institute on Aging and the Alzheimer's Association (NIA-AA) are reverting to the original, century-old definition of Alzheimer's disease: a definition which Emil Kraepelin, Alois Alzheimer's supervisor, hastily formalized as a "new disease" in 1911. Published in 2018 and led by Clifford Jack, this recycled definition is the NIA-AA's latest Research Framework: Toward a biological definition of Alzheimer's disease (referred to from now on as the Framework; Jack, et al, 2018).
The Framework relies on plaques and tangles as the signature of Alzheimer's disease, while overall neurological damage defines its severity. This time around, in contrast to the earlier 2011 guidelines (Jack et al, 2011), the Framework ignores the clinical features of the disease. This is important: for the first time the clinical aspect of the disease (what we think of as Alzheimer's disease, expressed through loss of memory, changes in mental capacities, and even mood and personality changes) will be ignored in preference to its biological clues. In doing so the authors usher in a new dawn of disease classification. The new biological definition is based on three types of information: [A] amyloid beta deposition, [T] pathologic tau, and [N] neurodegeneration, referred to together as AT(N) [see note below for a more detailed description].
With eight different AT(N) biomarker profiles, the Framework is unwieldy in its confusion. But the confusion lies not in its complexity but in its logic. The authors make the illogical and unsubstantiated claim that "A biological rather than a syndromal definition of AD [Alzheimer's disease] is a logical step toward greater understanding of the mechanisms underlying its clinical expression." (Jack, et al, 2018; p.536). Insisting that Alzheimer's disease can only be diagnosed through these biological markers (biomarkers) ignores the real disease, its clinical expression. The authors argue that the clinical and neuropathological features of the disease are "…two very different entities…" (Jack, et al, 2018; p.536) and that "…cognitive symptoms are not an ideal way to define AD [Alzheimer's disease]" (Jack, et al, 2018; p.538). As a vehicle for scientific exploration, understanding and ultimately curing Alzheimer's disease, the Framework ignores science, obfuscates methodology, and fudges outcomes in order to drive through an agenda based on pharmaceutical rather than scientific considerations. This paper lays out the argument for this assertion. There are serious repercussions to this approach, but it is the lack of scientific rigor that will eventually expose it for what it is, a sham. This paper exposes the lack of scientific methodology employed by the NIA-AA in reaching its conclusion.
A clinical disease (a disease that is experienced or has observed consequences) is now argued to be exclusively a biological disease. But Alzheimer's disease is only important because it is a clinical disease. If everyone who has the biomarkers never expressed the disease, there would be no interest in researching it; the biology is of no consequence if the disease is never experienced or observed. By reversing this truism and making the biology more important than the outcome of the disease, the authors are transforming how we look at health and ill-health, a transformation that the authors of the Framework concede is "…a profound shift in thinking." (Jack, et al, 2018; p.538). The profound shift also lies in the lack of scientific method employed. The authors shield this radicalism by noting that "…dementia is not a "disease" but rather is a syndrome composed of signs and symptoms that can be caused by multiple diseases, one of which is AD" (Jack, et al, 2018; p.538), admitting that what they are studying is only one of many causes of dementia, and that "The fact that most dementia is multifactorial presents a challenge both for diagnosis and treatment." (Jack, et al, 2018; p.545). We are guaranteed no scientific road map in this Framework. The authors also acknowledge that we do not know where to start: "Cut points must be determined, and age norming biomarker cut points is controversial." (Jack, et al, 2018; p.550). "The distinction between normal aging and age-related disease has been debated for decades…and we do not presume to settle this here." (Jack, et al, 2018; p.550). Again, the Framework provides no structure for research on the real issues of aging, despite mounting evidence that dementia is not part of the normal aging process (e.g., Nelson, et al, 2011). We still do not have a framework for studying the disease as pathology or as aging.
For a research framework, the authors were laissez-faire when it came to methods: "The committee avoided taking a proscriptive approach to these methodologic issues under the assumption that this was best left to expert work groups and individual research centers." (Jack, et al, 2018; p.551). But research centers do not determine these methodological issues; their objective is to gain funding or to monetize a cure.
That the biology contributes to and is part of the process of Alzheimer's disease is not contested. But to argue that the biology (the biological markers of the disease, its neuropathology) is purely the disease contradicts a wealth of evidence. This paper will argue that the Framework is a research policy rather than a scientific paper and is therefore devoid of scientific merit. It was published even though, for biomarker density and cutoff points, "universal standards have not yet been established." (Jack, et al, 2018; p.551).
The Framework's proposition is premature and wrong. It ensures that nearly all older adults have Alzheimer's disease; older adults without plaques and tangles in their brains become the exception. As a result, older adults are automatically branded as suffering from Alzheimer's disease, which makes the new approach ageist. Such sensitivities are overwhelmed by the hubris of the authors, who admit that "Up to 60% of CU [cognitively unimpaired] individuals over age 80 years have AD [Alzheimer's disease] neuropathologic changes at autopsy or by biomarkers…Thus, using a clinical diagnosis of 'AD' to ascertain absence of disease is associated with an error rate exceeding 50% in the elderly." (Jack, et al, 2018; p.552). "However, it is increasingly recognized that neurodegeneration/injury, even in classic AD [Alzheimer's disease] brain regions, also occurs in non-AD conditions. This is particularly so in elderly individuals where comorbidities are common." (Jack, et al, 2018; p.539). The same Framework concedes a high rate of false negatives: ten to thirty percent of autopsies of individuals diagnosed with Alzheimer's disease do not show these biomarkers, while among the living a similar proportion of Alzheimer's patients have normal amyloid PET or CSF Ab42 studies (Jack, et al, 2018). There are also many false positives: thirty to forty percent of cognitively unimpaired elderly persons have the biomarkers at autopsy and at PET and CSF screenings (cited in Jack, et al, 2018). False positive and false negative rates of ten to forty percent do not form a solid foundation for a purely biomarker theory of Alzheimer's disease.
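To see what error rates of this size imply, a back-of-the-envelope Bayesian calculation helps. The numbers below are illustrative assumptions drawn from the ranges quoted above (a 20% false negative rate, a 35% false positive rate) together with an assumed 20% prevalence of clinical dementia; they are not figures taken from the Framework itself:

```python
# Illustrative sketch: how often does a positive biomarker result
# actually correspond to clinical disease?
# Assumed numbers (within the 10-40% ranges quoted above):
false_negative_rate = 0.20   # biomarker-negative despite clinical disease
false_positive_rate = 0.35   # biomarker-positive despite no clinical disease
prevalence = 0.20            # assumed share of older adults with clinical dementia

sensitivity = 1 - false_negative_rate   # 0.80
specificity = 1 - false_positive_rate   # 0.65

# Bayes' rule: probability of clinical disease given a positive biomarker
true_positives = sensitivity * prevalence
false_positives = (1 - specificity) * (1 - prevalence)
ppv = true_positives / (true_positives + false_positives)

print(f"Positive predictive value: {ppv:.0%}")
```

Under these assumed rates, a positive biomarker finding indicates clinical disease only about a third of the time; most biomarker-positive older adults would never express dementia, which is precisely the objection raised here.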
There is also a statistical problem. Older populations show more neurological variance than younger ones, and this variance grows with age, a property known as heteroscedasticity. Specific diagnosis and prognosis therefore become harder in older populations, making it difficult, even for neurologists, to separate Alzheimer's disease from co-existing neurological diseases. Dementias among older adults might look similar from the outside, but the underlying clinical pictures differ greatly because so many comorbidities are present. It is rare, for example, for a brain disease in an older adult to occur in isolation from other (non-cognitive) conditions such as depression (Wagner et al, 2011) and anxiety (Guziak & Smith, 2014). With multiple comorbidities, isolating the disease is both a clinical problem and a neurological one (Qiu, De Ronchi & Fratiglioni, 2007). As a result, most dementias are misdiagnosed (Nielsen, 2011; Black & Simpson, 2014; Sayegh & Knight, 2013).
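A small numerical sketch shows why greater variance undermines fixed biomarker cutoffs. All numbers here are invented for illustration: two age groups share the same mean level on a hypothetical biomarker, but the older group is more variable, so far more of its healthy members cross the same diagnostic threshold:

```python
import math

def fraction_above(cutoff, mean, sd):
    """Upper-tail probability of a normal distribution (via the error function)."""
    z = (cutoff - mean) / sd
    return 0.5 * (1 - math.erf(z / math.sqrt(2)))

# Hypothetical biomarker: same mean in both groups, wider spread in the old
cutoff = 60.0
young = fraction_above(cutoff, mean=50.0, sd=5.0)
old = fraction_above(cutoff, mean=50.0, sd=15.0)

print(f"Healthy young flagged: {young:.1%}")  # roughly 2%
print(f"Healthy old flagged:   {old:.1%}")    # roughly 25%
```

With identical average biology, the wider spread alone makes a healthy older person an order of magnitude more likely to be flagged as "diseased" by the same cut point, which is why age-norming cut points matters and why the Framework's refusal to settle them is a problem.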
Why would a U.S. federal agency, the National Institute on Aging (NIA), established to address the health needs of older adults, push an erroneous approach to Alzheimer's disease that wrongly identifies the disease in half of its constituents? The answer attests to the overwhelming power of the pharmaceutical industry in subverting the NIA's primary and sole task: protecting older adults. A cursory look at the business affiliations of some of the Framework's primary authors identifies thousands of conflicts of interest (Jack et al, 2018; supplemental material).
The deficits are both political and methodological, and to understand them it is important to understand the context in which such conscious mistakes are propagated.
Summarizing research on dementia remains elusive because there is so much of it, across so many technical disciplines. Driven exclusively by money, whether private pharmaceutical investment or federal and international state funding, research is mostly directed at monetizing a cure. The details of such work, some more impressive than others, divert researchers from an overview of the general health of the research itself. In 2017, Gill Livingston and her colleagues, reviewing dementia prevention, intervention, and care, reported that "…around 35% of dementia is attributable to a combination of these nine risk factors; early education up to age 11 or 12, hypertension, obesity, hearing loss and later-life depression, diabetes, physical inactivity, smoking and social isolation." (Livingston, et al, 2017; p.14). By comparison, they argue, eliminating the main genetic correlate (Apolipoprotein E) would reduce incidence by only 7%. There is a resistance to acknowledging that what will truly cure, or at least delay, dementia and Alzheimer's disease is preventive care and lifestyle choices.
After the enormous resources invested over the last 100 years in researching Alzheimer's disease (research that provided the sole impetus for establishing the National Institute on Aging (NIA)), we are no closer to understanding Alzheimer's disease. Nor does anyone have any semblance of how to stop or cure the disease (for review see: Whitehouse, 2014; The, 2016; Garrett & Valle, 2016). Research remains disorganized, clinicians remain confused, and the public has become increasingly worried (for review see: Ballenger, 2017; Garrett & Valle, 2015). Although there are many potential alternative approaches to developing research guidelines on Alzheimer's disease (for review see: Weuve, et al, 2015; Jessen, et al, 2014; Bennett, et al, 2015; Au, et al, 2015; Snyder, et al, 2016; Garrett, 2017), we are back to the original definition of the disease. Historical evidence informs us that it was wrong then, and scientific evidence informs us that it remains wrong today.
A century ago Alois Alzheimer published a case study in which he identified plaques and tangles in the brain of a 51-year-old woman. The observation was neither new nor unique: it was already known that most people with dementia, including the majority with senile (old-age) dementia, had the same brain lesions. It was such a non-event that Alzheimer's initial attempt to publish the observations in 1906 failed because they were not deemed scientifically worthy, and it took a year for them to appear in print (Dahm, 2006). A few years after this initial observation, however, Emil Kraepelin, Alzheimer's supervisor at the Munich clinic, included the observation of plaques and tangles in "young" patients as 'Alzheimer's disease' in the eighth edition of his book Psychiatrie. A new disease was created.
We do not know Kraepelin's motive for such a hurried and ill-informed decision, but the fact that his neurological clinic in Munich was competing with the one in Prague likely played a role. The Prague clinic was headed by the far more accomplished Arnold Pick, who had already published more than 350 scientific papers and a textbook of neuropathology (Kertesz & Kalvach, 1996). More importantly, Pick had already identified Pick's disease and its characteristic "Pick bodies", a buildup in neurons of tau proteins (unknown at the time). Pick's Prague clinic also included the highly accomplished neurologist Oskar Fischer, the first to identify the amyloid beta plaques that became known as Fischer plaques. Both Fischer and Alzheimer had published observations identifying plaques and tangles, both using the same reduced silver staining technique developed in 1902 by Max Bielschowsky (Goedert, 2008). In the years leading up to 1918, when the Weimar Republic was declared amid rising nationalism and an emerging Nazi movement, the Jewish Fischer and Pick, in concert with virtually all other contemporary researchers, argued that Alzheimer's disease was not a new disease. Politically, Pick and Fischer were on the wrong side of history; it could be argued that politics outplayed science. By enshrining Alzheimer's disease in contradiction to the overwhelming scientific evidence against a new disease, Kraepelin established the political dimension of Alzheimer's disease and gained kudos for his newly established 1917 Munich clinic (now the Max Planck Institute of Psychiatry in Munich) to the detriment of the Prague clinic.
Fast forward in time: the second political event that falsely promoted the uniqueness of the disease came with the creation of the U.S. National Institute on Aging. Public Law 93-296 established the NIA in 1974, and in 1976 Robert (Bob) Butler was appointed its first director. The theatre behind the scenes reveals the NIA's true purpose. Despite Butler's interest in social inequity (he published the 1969 paper that coined "ageism" and the 1976 Pulitzer Prize-winning book Why Survive? Being Old in America), the focus was always on neurological disease. Butler confessed: "I decided that we had to make it [Alzheimer's disease] a household word…And I call it the health politics of anguish." (Fox, 1989; p. 82). By using Alzheimer's disease to promote the NIA's mission, the disease again became political. This involved a radical change. The NIA's founders realized that, politically, they needed something more than the diseases of older adults to validate their new institute to Congress. President Nixon, in rejecting the first proposal for the NIA, must have agreed with Congress that "we are not in the business of curing aging." Congress saw the diseases of older adults as inevitable, requiring care rather than cure, and dementia was too ill-defined, broad, and diffuse a term to get Congress excited. In response, playing the "health politics of anguish", the NIA's founders ingeniously focused on Alzheimer's disease. Thanks to Kraepelin, Alzheimer's disease provided a biomedical condition that could be approached as biologically determined: a real disease.
There was one problem with this approach: by definition, Alzheimer's disease was primarily a disease of younger people, not of older adults, and there were very few patients with "real" Alzheimer's disease in the 1970s. Robert Katzman himself reported in a 1976 article that cases were so few that "Precise epidemiological information [on Alzheimer's disease] is not available…" (Katzman, 1976; p.378). It was becoming apparent, however, that most patients with clinically defined senile dementia (onset after 65 years) had very similar pathological changes in their brains to patients with Alzheimer's disease (Ballenger, 2006). A century of criticism arguing that senile dementia and Alzheimer's disease are one and the same was suddenly being recognized (Robertson, 1990; p. 433). It became politically expedient to ignore what Kraepelin and Alzheimer had argued and admit that Alzheimer's disease is not uniquely different from senile dementia. Katzman & Karasu (1975) had already begun eroding the distinction, and as a result the two constructs were merged. But rather than renaming Alzheimer's disease as senile dementia (because the establishment of the NIA relied on the banner of a neurological disease), the name Alzheimer's disease was retained and broadened significantly to include senile dementia (Katzman, 1976; Katzman & Bick, 2000). This proved extremely beneficial in the politics of anguish. An editorial by Katzman in the April 1976 Archives of Neurology altered the balance (Katzman, 1976). In this short two-page article (plus references), which was not even peer-reviewed (again a political rather than a scientific exercise), Katzman argued for subsuming senile dementia under Alzheimer's disease, projecting Alzheimer's disease as the fourth or fifth most common cause of death in the United States.
Overnight, Alzheimer's disease "became" a national public health issue. Just as Kraepelin had "created" Alzheimer's disease, Katzman "transformed" it into a public health menace.
Hubris played a role again: Robert Katzman was not shy in acknowledging the importance of his usurping of senile dementia: "I think there's no question that that's my major contribution. Of the 115 papers I've written, that two-page editorial is clearly the most important" (Fox, 1989, p. 73). Now the fate of Alzheimer's disease became intricately woven with the promotion of the NIA. The creation of the NIA depended on Alzheimer's disease gaining prominence and national attention. Without the banner of a disease, Congress was not going to fund research on aging. The ageist attitude was—and remains to this day—that aging is not important by itself. The founding fathers of the NIA knew that they needed constituents to bring the mission of the NIA to Congress, and that meant using Alzheimer's disease as a lure. All they had to do was persuade the general public that Alzheimer's disease research was not only a national priority, and the NIA's, but their priority as well. The growth of locally based Alzheimer's associations was essential to bringing public pressure on local and national representatives to support the NIA's mission. This required a symbiotic relationship—one that has endured to this day. With all of this politics, science was overlooked.
It took more than 80 years for a quasi-theory to be developed to explain Alzheimer's disease. The Amyloid Cascade hypothesis (Hardy & Higgins, 1992) proposed that the accumulation of two misfolded proteins in the brain—amyloid-β peptide and tau tangles—was Alzheimer's disease's signature pathology (Karran, Mercken & De Strooper, 2011). Even by 1992, evidence was mounting that this was not the case.
Amyloid (A4) protein deposits were (variably) present in 66% of autopsies on people over 65 years of age, in those with progressive supranuclear palsy, in 57% of patients with Parkinson's disease, in 40% of those with Huntington's chorea, and in elderly patients with frontal lobe dementia (Mann & Jones, 1990; Ross & Poirier, 2004). A signature biomarker that is shared by other diseases is not a signature but a rubber stamp. By 1990, researchers were arguing that "Amyloid deposition in elderly persons may thus relate more to certain aspects of ageing and genetics than to AD [Alzheimer's disease], per se." (Mann & Jones, 1990; p. 68). Two years later, despite these stark anomalies, the Amyloid Cascade hypothesis became hallowed knowledge and formed the basis for nearly all of the neurological work in Alzheimer's disease—including the genetic creation of special (transgenic) mice whose brains are seeded with amyloid plaques and tau tangles and that form the basis for testing all interventions.
We have been here before
Based on the amyloid cascade hypothesis (Hardy and Higgins, 1992), active immunization against amyloid-β42 peptide was proposed as a treatment. So far, all types of ‘amyloid’ trials have failed.
In the active amyloid-β42 immunization clinical trial by Elan Pharmaceuticals (AN1792), researchers succeeded in clearing the amyloid-β42 that formed the plaques. The immunization trials showed that amyloid can be cleared from the brain. The clearance of visible plaques was seen as a revolution, the "holy grail" that the new Framework is resurrecting. The problem is that cognition did not improve (Hock et al., 2003; Bayer et al., 2005; Gilman et al., 2005). In fact, longer-term follow-up revealed continuing cognitive decline despite the removal of plaques (Holmes et al., 2008). The counterargument is that the damage may already have been done, making the clearing of the plaques inconsequential to the residue of the disease.
Another approach targeted the inflammatory response. There was observational evidence that inflammation is part of the disease process: patients with rheumatoid arthritis who regularly consume non-steroidal anti-inflammatory drugs (NSAIDs) had lower rates of Alzheimer's disease (e.g., Andersen et al., 1995). However, the anti-inflammatory drug R-flurbiprofen trial conducted by Myriad Genetics was stopped in phase three of the clinical trial. Although phase two showed some promise, the outcomes in phase three proved non-significant. It was not clear whether the dose (800 mg) was sufficient, or whether the effects of the drug were too diffuse and non-specific. It is not possible to interpret the outcome of the trial in any useful way. More recent studies of NSAIDs for reducing the incidence of Alzheimer's disease have proven inconclusive (Wang et al., 2015; Miguel-Álvarez et al., 2015). These studies did, however, leave one possible interpretation: that the lack of outcomes could be due to the disease already being present, so the medication could not prevent it. Again, the argument being proposed is that there is a need to catch the disease much earlier (McGeer, Rogers & McGeer, 2016). The Framework complies with this hypothesis. But there is a problem in logic.
If removing amyloid-β in patients resulted in poorer performance on cognitive testing in human trials (Gilman et al., 2005; Boche et al., 2010), then the plaques cannot be the disease (Iqbal, Liu & Gong, 2014). And if one of the signature pathologies of Alzheimer's disease is found not to cause Alzheimer's disease, then something else must cause the dementing features that we observe. Boche et al. (2010) conclude that "However, the continuing progression of cognitive decline in AD patients after Abeta immunisation [plaques] may be explained by its lack of apparent effect on tangles [tau]." (p. 13). On this reading, the amyloid-β42 plaques are precursors to the real disease, which is the tau tangles. It could also be that there are unknown, or hidden, precursors. But the Framework does not address this possibility: that Alzheimer's disease is caused by biomarkers that we have not yet identified.
The Tau Influence
Given these setbacks, the only way that the Amyloid Cascade hypothesis can survive is through two..
Everyone credits Robert Butler with coining the term ageism in 1969. He later expounded on the concept in his Pulitzer Prize-winning book Why Survive? Being Old in America (1975). But this follows from a seminal study, The Coming of Age, published by Simone de Beauvoir in 1970: a detailed analysis examining the dystopian condition of older adults in modern-day France, using techniques from multiple disciplines, especially the feminist perspective of her 1949 book The Second Sex.
There was a swell of human awareness of how our industrialized world discards older people. Margaret Gullette, in her book Ending Ageism, Or How Not to Shoot Old People, suggests that credit for first articulating the construct of ageism should be attributed to Ralph Waldo Emerson in his 1862 essay Old Age. But ageism has been around since early history.
Getting old is something to shun, and older people are shunned. This discrimination continues to this day, across all countries. No one is immune from ageism. Ageism has severe negative consequences—for health, income, work, insurance, and life expectancy—not least of which is the denial of employment to older people.
This was also recognized by the U.S. Congress before the word ageism came into the world in 1969. Although the 1964 Civil Rights Act left out age as one of the protected categories, this was remedied in the 1967 Age Discrimination in Employment Act (ADEA). This act protects anyone over forty from discriminatory work practices. The ADEA is enforced by the Equal Employment Opportunity Commission (EEOC). In addition, the 1975 Age Discrimination Act prohibits discrimination on the basis of age in programs and activities receiving federal financial assistance; it is enforced by the Civil Rights Center of the Department of Labor. Both acts are well-intentioned, but they have been watered down by the courts so as to make them ambiguous and ineffective.
In 2017, the EEOC received 84,254 new workplace discrimination charges. That year it resolved 99,109 charges, handled over 540,000 calls, and fielded more than 155,000 inquiries in field offices. Most charges alleged retaliation (48.8 percent), followed by race (28,528 charges; 33.9 percent), disability (26,838; 31.9 percent), sex (25,605; 30.4 percent), and then age (18,376; 21.8 percent); national origin, religion, color, equal pay, and genetic information make up the rest. (The percentages sum to more than 100 because a single charge can allege multiple bases.) As for litigation, in 2016 only two of the 86 lawsuits the agency filed were based on age discrimination. Two, out of tens of thousands of charges.
Age is a difficult category to prosecute. Your employer can fire you based on your seniority, say to save money or for other business reasons, without being liable. The fact that all senior employees are older workers is immaterial to the business decision. At interviews, employers can ask your age, although they are not supposed to use it against you. But the easiest way to discriminate without any recourse to litigation is simply not to respond to job applicants who seem "old." In 2016, the Eleventh Circuit Court of Appeals ruled that job applicants cannot sue for age discrimination because they are not employees; the acts protect employees only. Anyone who has tried to apply for a job in their fifties or sixties knows this strategy well. Laws are only effective if they are enforced. A 2017 AARP survey found that nearly two-thirds of workers aged 55-64 report their age as a barrier to getting a job. An earlier, comprehensive 2015 study by Patrick Button, an economics professor at Tulane University, which sent out résumés for workers of various ages, found significant hiring discrimination against female applicants and the oldest applicants. You simply do not receive an interview. The sad part about ageism is that it adds to other existing discriminations. Older minority populations, especially women and those with disabilities, are the most discriminated-against category. They are pushed to the bottom. We have known this for more than 45 years.
As early as 1973, Duke University professor Erdman Palmore and his student Kenneth Manton demonstrated that it was ageism, rather than racism, that was the primary concern of older people. They argued that although people routinely confront racism throughout their lifetimes, and so develop coping mechanisms for it, ageism creeps up on you unexpectedly and without any recourse for defense.
Stereotypes exist for everyone, some more prevalent and negative than others. But because the experience of ageism arrives fast and compounds already existing stereotypes (ethnicity, gender, disability, religion, and other categories that make you the "other"), it is much more difficult to counteract.
We cannot separate ageism from age. Although theoretically these constructs are different, age is the main trigger of ageism (whether you look old or not). There are two broad solutions. The traditional approach has been to try to reduce stereotypes among the general public; this will eventually seep through, and, like other kinds of "-isms," the world is becoming more accepting. The second approach is to build resilience among adults before ageism starts. Both strategies will remain meaningless unless we have strong policy support to prosecute ageism in society harshly and relentlessly. Removing the ambiguity in the 1967 and 1975 laws is something Congress could accomplish without much political fanfare. But as individuals, we can focus on resilience.
Resilience is more like judo than boxing. In judo, the other person's energy is used to your benefit: their strength and momentum propel them away from you. Unlike boxing, which requires pummeling an opponent and entails fighting everyone all the time, judo means learning a few techniques that you practice. Building resilience is using stereotypes to your own advantage.
Age is a privilege and an honor. Start early to appreciate this. Nature has selected you above others. However frail and diminished you might feel there is no alternative. Embrace your life as it is now, not as it should be. That is the foundation for adapting to ageism, a good core. The rest is throwing off stereotypes, judo style.
Throwing off Stereotypes
Don’t ascribe everything negative to old age. Sometimes you are not as efficient as you used to be because you do not practice or exercise as often. It might have nothing to do with age. Separate age effects from lack of practice. You can change your behavior and increase practice, but you cannot change your age. Ascribing a deficit to age eliminates the possibility for change.
Don't accept ageist jokes, and don't make them yourself. Acknowledge them when you hear them; you might react to them or not, but be aware when someone is trying to demean you because of age. They might be funny, but they reduce you to one dimension.
Highlight something that you like about yourself, practice it, and make full use of it. Music, writing, talking, comedy, whatever it is. Be exuberant and fearless in pursuing this talent to the extreme. This is your time. Dress well and present yourself. Remember that you have many ways to present a better aspect of yourself without trying to look as if you are in your 20s or 30s. Good hygiene and clothes that present you well. Whatever your style or comfort, be the best within your means. Being careless about yourself invites others to do the same.
You have amassed many experiences; identify the salient ones and use them. Speak up and show compassion. Don't dwell on your failures or rest on your laurels. Stop talking about your health. Although you experience health problems as unnatural, an aberration, they are your reality. Move on. There is no lesson to learn, for anyone, including yourself.
Be open to change. Only dead things don't change. Celebrate your life by going out of your way to learn new things. When you come across something new, stop and learn. Failure means that you need more practice. Hopefully this strategy will protect you from dementia. Maybe not. Most of us fear dementia more than anything, and we fear it for our partners. Remember that most people with dementia tend to regain their well-being after a few years; it is the caregivers who suffer increasing declines in well-being. After eliminating all possible causes (medications, infections, behavior), there remains nothing we can do to stop dementia. Focus on what is still retained: music, emotional connection, a nice meal. Dementia is not a joke or a laughing matter. Educate people that memory loss is not dementia. It is a spiritual exit to life. Embracing ageism takes these stereotypes and addresses them head-on: first our own fears, and then others' flippant comments, by confronting both the fear-mongering and the glibness.
Ageism will always be around. Promoting laws to eliminate it is central to progress. We can educate ourselves to be better ambassadors for our age, now or for our future selves. We need to be the examples that break the mold. We are privileged and it is time to express it.
Dementia is not part of the normal aging process. Yet the main factor and predictor of dementia is age. When I see my physician and complain that I cannot walk/run/climb steps (choose your specific complaint here), the physician always says, "Well… it's your age."
And then, with dementia, all of a sudden it becomes a disease. No one told me that my bad knee is due to a disease; they just put it down to age. But dementia is somehow not part of the normal aging process, even though age is the main contributory factor in dementia and Alzheimer's disease.
As a scientist, I am perplexed.
How can one ailment be put down to old age, while the disease that is the most prolific and most definitely tied to old age is not part of the "normal aging process"?
Mission Control: We are shutting down.
Mission Control: We will start slow.
Brain: Do you have to?
Because it is time
time for what?
Time for other people to have a chance, it is our strategy to replace old generations with new ones
Perhaps a little bit longer
First we will close down recent memories
Make it easier to detach
I am worried, will it change me?
Yes. You are dying. You will no longer be.
Did you watch the Monty Python skit with the parrot?
Ok shutting down
Recent memory being erased
No…but I am worried
Why? We are shutting down, there is no pain
I miss who I was
Who “were” you?
I don’t remember
You see, trust me, I know what I am doing, I am nature
Erasing further histories
More memories erased
Oh No…please stop
I do not know…but my wife/husband/daughter/son/lover/helper is worried…they keep asking me to return to who I was before
That is not possible
We are shutting down
Are you in Pain?
Ok shutting down
Please stop it is hurting THEM
Those that love me
Well, did you tell them that you are shutting down
I did not want to hurt them
And they are hurting now because you did not tell them?
But it is reality, it is the truth and it is ordained by Nature
So you didn’t tell them
They want me to be healthy
Is that even possible
So why did they expect it with you?
OK shutting down.
Removing social constraints
How do I deal with my family?
That is your domain…I deal with the timeline
Yes but can you help me?
Sure…how many people have died before you? Does everyone expect you to die? How do they want you to die?
Harmful genes that cause Huntington's disease — a disease that attacks the neurons in the brain — only show their effects between ages 30 and 50, in some cases after the birth of offspring. Many other diseases accumulate later in life, dementia being the main one. In 1952, Peter Brian Medawar tried to explain this by suggesting that older adults accumulate mutations and become a "genetic dustbin."
In Medawar's theory, there is no advantage to aging, nor are there any benefits for older people to live. Aging is simply the result of declining functions before death. This biological interpretation proved popular.
To explain aging, biologist George Williams in 1957 came up with the "antagonistic pleiotropy hypothesis" (named by Michael Rose in 1982). Pleiotropy is the phenomenon whereby one gene, or a few genes, controls more than one trait. The antagonism comes from a negative effect that emerges later in life. For example, testosterone in men might produce an attractive, muscular body in youth, as well as masculine features such as a deep voice and facial hair, but it also increases the likelihood of prostate cancer in older age, hence the antagonistic part of the pleiotropy. Although it is the positive aspects of the pleiotropic gene that natural selection favors, the antagonistic aspect also sneaks into the gene pool. Aging is seen as an invisible cloak that smuggles bad genes into the gene pool by hiding them under positive traits in youth. Aging, in this view, has subverted the whole process of natural selection by disguising itself as a positive attribute in early life and then transforming — in a Jekyll-and-Hyde metamorphosis — into an aging liability. Somehow nature has been hoodwinked into allowing people to get old. Aging becomes a problem, a genetic dustbin of humanity. From here, it is fairly easy to see the approach: we need to cure aging, because nature made a mistake. The hubris of judging that nature made a mistake ignores that nature might have a different perspective from ours.
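The logic of antagonistic pleiotropy can be made concrete with a toy simulation (a minimal sketch with hypothetical parameters of my own choosing, not a model from the literature): an allele that grants a small early-life reproductive boost spreads through a population even if it carries a severe cost after reproduction, because selection acts only through reproductive success.

```python
import random

def simulate(generations=200, pop_size=1000, seed=1):
    """Toy antagonistic-pleiotropy model (hypothetical numbers).

    Allele 'A' gives a 5% early-life reproductive boost; we assume it
    also carries a heavy post-reproductive cost (late-life disease),
    but that cost never enters the fitness calculation below, because
    selection acts only through reproduction.
    """
    random.seed(seed)
    pop = ['A'] * (pop_size // 2) + ['B'] * (pop_size // 2)
    for _ in range(generations):
        # Offspring are drawn in proportion to early-life fitness only.
        weights = [1.05 if allele == 'A' else 1.0 for allele in pop]
        pop = random.choices(pop, weights=weights, k=pop_size)
    return pop.count('A') / pop_size

print(f"Final frequency of the 'costly' allele A: {simulate():.2f}")
```

Despite the (assumed) late-life cost, the allele's frequency climbs from 0.5 toward fixation: the cost is invisible to selection, which is exactly Williams's point.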
As a species, survival is nature's only ambition.
The only way that successive generations prosper is if they are a good fit with their environment. Each generation must survive long enough to create another generation. Nature keeps our genes immortal, and it has two extreme methods to achieve this single aim. One way is to produce an enormous number of offspring and hope that a few survive to then pass on their genes (known as r-selection). Another approach — one followed by humans — involves having few children whom we nurture until adulthood and beyond (known as K-selection). Therefore nurturing — protecting and supporting others — is our survival strategy, not competition.
Nurturing involves having things to teach and living long enough to teach them. That is why humans live long and have such a big brain; the two go together. Some 1.6 to 1.9 million years ago, our brain grew very fast; some say — not without contention — that this brain expansion mirrors the development of cooking. Cooking, by making food more easily digestible, made more nutrients available for the hungriest organ in our body — our brain. Nature engineered us to have both a big brain and longevity; they are intricately intertwined. We can see this through mathematical models that show a leap in predictive value when older people are included in the equation: whether or not older people have a disease, their presence in the family predicts longer-living children and grandchildren.
In the wild, most mammals die once they lose their ability to reproduce. Humans are different. We continue to live well past our capacity to reproduce, especially females. Is nature wrong again, or does nature have a special role for older people?
What the genetic-dustbin proponents do not appreciate is that older people, especially grandmothers, have a statistically positive effect on their community. In 2004, while examining the "grandmother effect," Mirkka Lahdenperä of the University of Turku, Finland, and her colleagues found statistical evidence that a grandmother has a decidedly beneficial effect on the reproductive success of her children and the survival of her grandchildren. Older adult humans promote the survival of the species. Unlike any other animal, we also transfer wealth, capital, and wisdom to successive generations well past our reproductive period. When gene survival includes the broader community, older people have a positive effect on our genes' chances of survival.
In 1973, John Maynard Smith and George Price introduced game theory to evolutionary problems. While classic game theory sees players making rational choices on the basis of individual gain, evolutionary game theory posits an awareness of what others might do and the development of strategies to counter those decisions. It is a social decision model, not a purely individualist one. Maynard Smith argued that since everyone dies, evolution does not benefit individuals; it is designed to benefit the community. In this interpretation, the strategy humans employ is based on benefits to the community rather than benefits solely to the individual. Such a model fits the outcomes we see in reality. This insight was revolutionary and transformed the argument from one where aging is seen as a genetic dustbin to one where aging becomes part of a package for survival — a package that includes older adults contributing, in as yet unknown ways, to the promotion of our species.
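Maynard Smith and Price's canonical example was the Hawk-Dove game. The sketch below (using illustrative payoff values I have chosen, not figures from their paper) iterates the standard replicator-dynamics update and shows the key result: when the cost of injury C exceeds the value of the resource V, the population settles at a stable mix of roughly V/C hawks rather than all-out aggression.

```python
# Hawk-Dove game under replicator dynamics (illustrative values).
V, C = 2.0, 4.0  # value of the contested resource, cost of injury

# Payoff to the row strategy when meeting the column strategy.
payoff = {
    ('H', 'H'): (V - C) / 2,  # hawks escalate: split value, risk injury
    ('H', 'D'): V,            # hawk takes everything from a dove
    ('D', 'H'): 0.0,          # dove retreats, gets nothing
    ('D', 'D'): V / 2,        # doves share peacefully
}

p = 0.1  # initial fraction of hawks in the population
for _ in range(1000):
    fit_hawk = p * payoff[('H', 'H')] + (1 - p) * payoff[('H', 'D')]
    fit_dove = p * payoff[('D', 'H')] + (1 - p) * payoff[('D', 'D')]
    mean_fit = p * fit_hawk + (1 - p) * fit_dove
    p = p * fit_hawk / mean_fit  # replicator update

print(f"Equilibrium hawk fraction: {p:.2f} (theory: V/C = {V/C:.2f})")
```

The stable outcome is a property of the population, not of any individual's gain, which is Maynard Smith's central point.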
There are instances where the antagonistic pleiotropy of dementia has some genuinely beneficial effects. For example, the Apolipoprotein E variant 4, which is strongly associated with Alzheimer's disease, might have beneficial aspects, such as reducing the rate of age-related macular degeneration and lowering testosterone; and although there is no direct evidence of apoE isoforms reducing infectious diseases, there is evidence that apoE could play a role in reducing our susceptibility to viruses, bacteria, and protozoan parasites. Such polymorphisms — having multiple expressions — are abundant in nature.
Despite this insight, in 2002, 51 renowned scientists — including such luminaries as Jay Olshansky, Leonard Hayflick, and Bruce Carnes — published a position statement in Scientific American stating that “aging is a product of evolutionary neglect, not evolutionary intent.” Again, we are telling nature that it made a mistake, or at least was ignorant of the consequences. When Albert Einstein first confronted quantum physics, he said that “God does not play dice with the cosmos.” What is not reported frequently is the response from Danish physicist Niels Bohr: “Einstein, don't tell God what to do.” It seems that we are telling nature what it should do or how neglectful it is, rather than appreciating the biological system we call life as complete and perfect. We might guess at the intent of evolution — survival of our immortal genes — but we might not understand its methods.
Aging and having a big brain go hand-in-hand. It is nature’s plan for our survival. Older adults improve the survival of both their children and grandchildren. Looking at aging in a broader context allows us to view some of the wonders of nature. We have a lot to learn if we listen.
Bojanowski, C. M., Shen, D., Chew, E. Y., Ning, B., Csaky, K. G., Green, W. R., ... & Tuo, J. (2006). An apolipoprotein E variant may protect against age-related macular degeneration through cytokine regulation. Environmental and Molecular Mutagenesis, 47(8), 594-602.
Browning, P. J., Roberts, D. D., Zabrenetzky, V., Bryant, J., Kaplan, M., et al. (1994). Apolipoprotein E (apoE), a novel heparin-binding protein, inhibits the development of Kaposi's sarcoma-like lesions in BALB/c nu/nu mice. Journal of Experimental Medicine, 180, 1949-1954.
Hogervorst, E., Lehmann, D. J., Warden, D. R., McBroom, J., & Smith, A. D. (2002). Apolipoprotein E ε4 and testosterone interact in the risk of Alzheimer's disease in men. International Journal of Geriatric Psychiatry, 17(10), 938-940.
Lahdenperä, M., Lummaa, V., Helle, S., Tremblay, M., & Russell, A. F. (2004). Fitness benefits of prolonged post-reproductive lifespan in women. Nature, 428(6979), 178.
Mahley, R. W., & Rall Jr, S. C. (2000). Apolipoprotein E: far more than a lipid transport protein. Annual Review of Genomics and Human Genetics, 1(1), 507-537.
Olshansky, S. J., Hayflick, L., & Carnes, B. A. (2002). Position statement on human aging. The Journals of Gerontology Series A: Biological Sciences and Medical Sciences, 57(8), B292-B297.
Pianka, E. R. (1970). On r-and K-selection. The American Naturalist, 104(940), 592-597.
Roselaar, S. E., & Daugherty, A. (1998). Apolipoprotein E-deficient mice have impaired innate immune responses to Listeria monocytogenes in vivo. Journal of Lipid Research, 39, 1740-1743.
Smith, J. M., & Price, G. R. (1973). The logic of animal conflict. Nature, 246(5427), 15.
Williams, G. C. (1957). Pleiotropy, natural selection, and the evolution of senescence. Evolution, 11(4), 398-411.
We can see that genetics plays a role in how long we live by looking at different species and their long or short lifespans. But we do not know exactly how this works.
Alexander Graham Bell, the inventor of the telephone, was also passionate about aging. In the early 1900s he found that people who lived long had long-lived children. Whether this is due to genetics, to providing support to children, or both remains undefined. But the connection is there. Sometimes, despite having the best genes, bad luck just strikes. Take the example of Jeanne Louise Calment, who died at the age of 122 in 1997. Despite her having the best genes for longevity, her family did not enjoy these attributes: her daughter, Yvonne, died of pneumonia at age 36. Yvonne left a son, Frédéric, who became a physician and lived with his grandmother in her apartment. But he, too, died early, in a motorbike accident, at the same age as his mother: 36. Sometimes bad luck negates any genetic advantage.
Three classic experiments show how a genetic advantage results in living longer. The first was conducted by Michael Rose, who, by allowing only the eggs of older flies to hatch, found that each subsequent generation lived longer. The new generation seemed to "know" that, like their parents, they needed to live longer in order to reproduce. We find this among humans as well: the older your mother was when she conceived you, the longer you are likely to live. Unlike humans, flies do no nurturing, so this effect is predominantly genetic.
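Rose's breeding scheme can be sketched in a few lines of code (a toy model with made-up numbers, not data from his experiments): if only flies still alive at a late "breeding age" may reproduce, and lifespan is heritable with some noise, mean lifespan ratchets upward across generations.

```python
import random

def rose_experiment(generations=30, pop_size=500, breed_age=40.0, seed=0):
    """Toy sketch of Rose's fly-selection experiment (hypothetical numbers).

    Each fly's lifespan (in days) is inherited from a randomly chosen
    parent plus Gaussian noise; only flies that survive to `breed_age`
    are allowed to breed, so short-lived lineages are culled each round.
    """
    random.seed(seed)
    lifespans = [random.gauss(40, 8) for _ in range(pop_size)]
    for _ in range(generations):
        breeders = [life for life in lifespans if life >= breed_age]
        lifespans = [random.gauss(random.choice(breeders), 4)
                     for _ in range(pop_size)]
    return sum(lifespans) / pop_size

print(f"Mean lifespan after selection: {rose_experiment():.1f} days "
      "(started at ~40)")
```

In humans the same signal would be confounded by nurturing; for flies, as the text notes, the effect is essentially all genetic.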
The second type of experiment uses a naturally occurring disorder in a roundworm that dampens growth-hormone-like signaling, stunting growth but greatly extending lifespan. Through a series of trials and errors, Cynthia Kenyon at the University of California, San Francisco, managed to knock out one of these genes in normal roundworms, nearly doubling their lifespan.
The third type of genetic observation comes from mice, in particular the work done by Richard Miller and his famous dwarf mouse called Yoda. Again, nature led the way in showing us the longevity advantage of having less growth hormone. In nature there are three types of dwarf mice that share this longevity characteristic: Snell, Ames, and Laron dwarf mice. These mice live about three times longer than average.
By knocking out a gene so that the body stops growing larger, we could all live longer. Somehow the body knows that it needs to live longer in order to pass on its genes. Fortunately, we have examples among humans as well. In a southern Ecuadorian community, some 250 individuals have Laron syndrome, caused by a deficiency in growth hormone signaling. Although the syndrome protects them against disease, especially cancer, this apparent protection does not translate into living longer: the group unfortunately engages in risky behaviors, in particular alcoholism, that negate the genetic advantage.
No one wants stunted growth in order to live longer. But what about having older parents to increase longevity? In biology, there is always a dark side, known scientifically as antagonistic pleiotropy. This construct has plagued gerontological research: it posits that when one gene controls more than one trait (e.g., height), one of these traits is likely beneficial (e.g., being more athletic) while another is detrimental to the individual later in life (e.g., heart disease).
The dark side is that women having children at much older ages increases the risk of certain genetic problems. In 2018, Boris Rebolledo-Jaramillo of Nottingham Trent University, UK, and his colleagues reported that children of older mothers face a greater risk of developing diabetes, dementia, and heart disease. As for older fathers, their children are more likely to have dwarfism or Apert syndrome. Research in 2012 by Augustine Kong at Reykjavik University, Iceland, also suggests an increased risk of autism and schizophrenia. There is a "Goldilocks effect": not too old and not too young, just right.
The surprising result in genetic research is the finding that, as we age, we also change our genes. It was always assumed that our genes are unchanging and given to us exclusively by our parents, period. But we are learning that we also add to and modify our genes as we age. We acquire one percent of our genes from bacteria, fungi, viruses, and archaea (single-celled micro-organisms). Specifically, there are special molecules residing in these cells whose role is to develop antibodies. They are not part of the cell but act as independent contractors. Known as "plasmids," they help us fight infections. If we are constantly being infected, then in order to help us develop immunity they somehow insert their antibody-producing genes into our DNA so that we can develop this protection ourselves. Sometimes our own genes change position in our chromosomes so that they gain higher priority. These genes are known as "jumping genes," or "transposons."
Such strange genetic behavior was first discovered by Barbara McClintock in the 1940s; she was awarded the Nobel Prize for medicine in 1983. How plasmids and jumping genes do this remains an absolute mystery. Her work provided evidence that the composition of our genes—our genome—changes while we are living. The longer we live, the more likely it is that these new genetic improvements are transmitted to our children. So now we have a hint of how Michael Rose's flies create a time stamp on their genes. Plasmids are at work throughout the aging process.
We develop immunity from the day we are born, and some of these biological adaptations end up in our genes through the transfer of external genetic material. Our genes are more permeable than we once thought. We get genes not just from our parents but also from the environment. In addition, we also get genetic material from our twins in the womb, and mothers get genes from their children: we find male chromosomes in mothers who had baby boys. We are a magnet for adaptive genetic material from our environment.
Barbara McClintock was also the first scientist to correctly speculate on the basic concept of how some genes can be switched on and off—known as epigenetics, epi meaning “above,” as in above the genes, controlling them. Sometimes a defective gene (e.g. for diabetes or Alzheimer’s disease) can be switched off through diet, exercise and mild trauma. As we age we pick up new genetic material and modify existing genes (epigenetics) before we pass these genes on to our children. Our lives are devoted to just this aim: making sure that our children are best prepared for the new world they face. As for bad luck, we have Pandora’s last remaining attribute: Hope.
Rebolledo-Jaramillo, B., Su, M. S. W., Stoler, N., McElhoe, J. A., Dickins, B., Blankenberg, D., ... & Paul, I. M. (2014). Maternal age effect and severe germ-line bottleneck in the inheritance of human mitochondrial DNA. Proceedings of the National Academy of Sciences, 111(43), 15474-15479.
Rose, M. R. (1984). Laboratory evolution of postponed senescence in Drosophila melanogaster. Evolution, 38(5), 1004-1010.
It is personal when it happens to you. As much as we talk about changes in older age, it remains at a distance, until it happens to you. Most of the time the loss of function happens fast, and we are unprepared. While most of us might recover from an initial loss, we only have to face another, different one shortly thereafter. Little pieces of you are taken away. And our mind does not deal well with these losses. You did not plan for it, and even if you thought of this eventuality, when it happens to you it is different. It is personal and real.
We have a model of the world in our brain. Within this perfect heaven there is our avatar, an image of us, who we think we are. As we get older and frailer—usually these come together—reality conflicts with the avatar that we have built. This model is important to us. Most of the time the model of the world and the avatar representing us function well. We function on a daily basis without needing to be aware of this model. We behave in automatic mode most of the time. Until something goes wrong and the avatar can no longer do what it is supposed to do. The mental narrative that we have taken so long to build up suddenly needs to be rearranged and remodeled.
In aging, not long after the first such redefinition of our model—perhaps we realize that we cannot read small print anymore without prescription glasses—comes another onslaught of loss. The constant change and attrition requires us to repeatedly modify our model and our avatar. Aging is an existential danger to our model, because it threatens how that model is supposed to function. Making these changes is difficult for everyone, since our model resists change, having been a faithful portrayal of our reality for so long. The older we get, the more entrenched this model becomes. It is also doubly difficult in older age because there is so much variance among our peers. We delude ourselves that perhaps these attritions are only temporary and therefore we do not need to change our avatar just yet. There is always a lag between how old we are in reality and how old we see ourselves—a subjective age bias. Of course we are biased to see ourselves as younger.
Many theories exist for why we underestimate our age: we overestimate our abilities, our looks, and how satisfied we are in life, and we align our personality, attitude, behavior and interests with those of a much younger person. Some theories also suggest that there is an internal bias to be young. But these theories assume that there is a conscious, if not willful, desire to stay young. Although all these theories are valid, there could be a simpler answer. There could be a lag, a time difference, between reality and how our model represents it. It takes time for us to reconcile our model with reality. The process is dynamic, and we are continuously fighting this change. This dynamic process has not gone unnoticed.
In psychology, by the 1950s Erik Erikson had developed the first personality theory that included older adults. Before then, most theories stopped at young adulthood. Erikson’s eight stages of development come closest to explaining this constant fight we experience in older age. The final stage, likely written by his wife Joan Erikson, emerges late, after age 65. This stage contends that there is a fork in the road. At this fork, which Erikson called a “crisis,” we either move toward ego integrity or go headlong into despair. As dramatic as this crisis seems, it is emerging that such depictions are very close to the experience of aging.
By ego integrity Erikson means that we come to accept who we are. That we only have this life to live, and that we need to resolve issues in order to be comfortable with where we are. Although seemingly diametrically opposed (ego versus non-ego), Lawrence Kohlberg’s 1973 theory of moral development, later expanded to address older adults, included a stage of self-transcendence: a “...contemplative experience of the nonegoistic or nonindividual variety” (pp. 500-501). Ego integrity and non-ego seem to refer to the same concept, that of humility. The only salvation for older adults is becoming humble. John Cottingham in 2009 defines humility as “ ... a lack of anxious concern to insist on matters of status, a recognition that one is but one among many others, and that one’s gifts, if such they be, are not ultimately of one’s own making” (p. 153).
The alternative to humility is pride, when we are constantly fighting unresolved issues that continue to fester and create discord in our life. Later on, Joan Erikson formulated a ninth stage of very old age, starting in the eighties, when physical health begins to deteriorate and death becomes more real. She recognized that at this stage society similarly ups the ante: “aged individuals are often ostracized, neglected, and overlooked; elders are seen no longer as bearers of wisdom but as embodiments of shame” (p. 144). It seems that unless we submit ourselves to humility, the alternative is humiliation.
That is why it is personal. It’s not just about accepting aging; it’s that we have no choice. We either suck it up and become humble, or fight it and face certain humiliation. By sucking it up we acknowledge our mortality and therefore impermanence, our humility. If we fight it, we rally our pride and confront these changes with a certain outcome: failure and humiliation. Science tends to support this view. Neal Krause and David Hayward with the University of Michigan wrote that when it comes to humility, the people who live the longest are the ones who accept where they are in life. By becoming less of ourselves (ego-less), nature rewards us with more of ourselves (long life).
Someone has a dark sense of humor, and I hope that I live long enough to learn to appreciate it.
In a 2014 Pew Research Center study, nine out of ten adults in the United States reported believing in God, and more than half were “absolutely certain” God exists. One in five Americans pray every day, attend religious services regularly and consider religion to be very important in their lives. Although these proportions have declined precipitously since an earlier 2007 study, religion still plays an important role in the lives of older people today.
As adults get older they become more spiritual, and some become more religious. It is not only that religious or spiritual people tend to live longer (they do, for many reasons other than spirituality), but that older people become more spiritual and religious as they age.
There is a great attraction in arguing for a spiritual interpretation of aging. Two religious gerontologists, Jane Marie Thibault and Richard Lyon Morgan, did just that in 2012 when they made themselves their own subject matter and wrote a book about their aging experiences. In a self-described pilgrimage into their third age, they interpret aging through religion. While we are growing up, God shows us how much he loves us by making us healthy, giving us pleasure through our bodies and through nature, perhaps letting us experience the miracle of having children. As we age, it is time for us to show God how much we love him in return. God stops showing us how great he made us, and now it is our turn to reciprocate. In one example, through “dedicated suffering,” we acknowledge our pain and dedicate it to the benefit of others. And it works: when people dedicate their suffering, they report a reduction in pain. This spiritual switch—as older adults we are now responsible for the expression of gratitude—has some surprising support in the scientific field.
The Swedish sociologist Lars Tornstam in 1989 developed a theory arguing that older age brings about spiritual growth. His Gerotranscendence Theory suggests that older individuals—perhaps because of ill health—tend to experience a redefinition of themselves and of their relationships with others. By redefining ourselves we become more spiritually aware. More recently, in 2009, the American researcher Pamela Reed, in developing her own Theory of Self-Transcendence, stated that individuals who face human vulnerability have an increased awareness of events that are greater than themselves. So is spirituality the answer to the increasing loss of control that we experience as we age?
Research tends to support this interpretation. In one review, the Portuguese researcher Lia Araújo and her colleagues report numerous studies showing that religion, spirituality, and personal meaning bring a broad range of mental and physical health benefits, including greater satisfaction with life and better coping with stress. In older age, existential issues—contemplating life and death—appear to gain increasing importance. There seems to be a growing preference for acquiring meaning from faith. It seems that the greater the challenge, the greater the religious or spiritual meaning we gain from the experience. By gaining a positive meaning of life and purpose through religion and spirituality, individuals also gain a higher level of life satisfaction. Regardless of physical health, developing a positive attitude toward life has positive outcomes. It is only when religion becomes an ineffective tool for explaining dramatic challenges that people start renouncing their religious convictions.
Christopher Ellison with the University of Texas at Austin and others have referred to this area of research as the “dark side of religion.” Doubt in our beliefs can have very negative consequences. Doubt erodes one of the major functions of religion, which is to provide an explanation for why we are aging—such religious explanations are generally referred to as theodicies.
But we are always looking for a reason, a model of the world that is just, logical and predictable. Religion has that extra facet of immortality—life in the afterworld, a comfort to those who have to confront the imminence of death. Whether we get this view of the world from religion, science or intellectualizing, the overarching observation is that we need to have such a view. Everyone has an opinion on things that matter to them. Some simply don’t call it religion, but having an explanation comes with the territory of being human.