Ashton Applewhite is the author of This Chair Rocks, and she blogs about the upsides and downsides of aging at Yo, Is This Ageist? In her writing, advocacy, and speeches, Ashton shows that our preconceptions about aging may simply be wrong, and that aging need not be a difficult, disadvantageous process.
Testimonial & Celebration of Age Justice 2018
Over 170 people attended this historic June 6 rally organized by the Radical Age Movement, which I was honored to be part of. Check out the terrific video and stay tuned for updates on the rally to be held in New York’s Union Square this fall.
This guest post is by Louise Pendry, a Senior Lecturer in Psychology at the University of Exeter in the UK. She’s delighted to be combining her work on online communities, stereotyping, and prejudice with her long-standing personal interest in, and more recently her lived experience of, women and ageing. Currently she is exploring how online communities can help support and empower women as they grow older. You can reach her at firstname.lastname@example.org or find her on Instagram as silverserenity4.
I like to think I’ve got a pretty good attitude to growing older. I’m genuinely enjoying my life post-50. But recently, I hit a roadblock to my progress here, and that roadblock is me, or more precisely, how I sometimes think about me. That’s what I want to share with you.
As a psychology lecturer, I teach a class on stereotyping. Mostly I focus on how we stereotype other people, but lately I’ve started to look at what happens when we stereotype ourselves, especially when we do so on the basis of age (internalised ageism). I like to start each class with concrete examples of the topic at hand, to make it real for my students. This is easy when it comes to stereotyping other groups (e.g., black males shot in error by police). But this SELF-stereotyping angle was less obvious. I was struggling to find a vivid, relevant illustration. Little did I realise that I’d do something – actually in the class itself – that would give my students a perfect real-life example.
Here’s what happened. It was the week before my planned session on age stereotyping. Part of the class involves students giving presentations on research articles. This is my cue to sit down and grade their efforts. The room layout was cramped and necessitated me clambering (tripping!) around the AV equipment, before climbing inelegantly over a desk to reach a seat. All of this was achieved with much stumbling and huffing and puffing on my part. Trying to make light of it, I smiled and said “Ha! I’m not the woman I was!” Cue much laughing. At the end of class, a student approached me about her presentation the following week, ironically on an article about age self-stereotyping. “You know, you just demonstrated what that article called ‘internalised ageism’,” she said. And she was right. I had used a common negative stereotype about ageing (declining physical fitness) to explain my behaviour and it was more than likely not justified. Actually anyone would have struggled to reach their seat in that classroom, young or old.
Thinking this was just an aberrant moment on my part, I tried not to worry. But as I started to think about it, I realised this was not an isolated episode. I’ve certainly found myself joking about an unfortunate “senior moment” when I’ve mislaid my glasses and can’t recall where. Where’s the harm in that? We all do it, right? Really, though, what I’m doing when I make this kind of humorous self-deprecating remark is treating a behaviour I’ve performed as proof that my memory’s going, and attributing it to a negative and enduring part of the elderly stereotype: forgetfulness. I’ve pigeonholed myself, written myself off. This memory lapse could be a sign of impending Alzheimer’s, but it’s more likely to signify that my life is way too busy. It could happen to anyone AT ANY AGE if they had as much going on as I often do. Or it could be down to menopausal brain fog (annoying but not necessarily permanent). Minor deteriorations in cognitive function can certainly happen as we age, but that doesn’t mean every slight memory lapse is a sign of serious cognitive decline.
Perhaps you think I’m over-reacting. You might think, “Good grief, where’s your sense of humour?” Now don’t get me wrong, I like a laugh as much as anyone. But here I think it might be an issue because such self-stereotyping can have important consequences for us. Research by Becca Levy and colleagues has shown that when we THINK of old as negative, we can start to FEEL old and may even ACT in a way that confirms negative elderly stereotypes. And that can basically hijack our best efforts to age positively. What does this mean? Well, translated into everyday life, it suggests that the unconscious age self-stereotypes we hold and express (“I left my phone in the fridge, I’m having yet another senior moment!”) affect how well we approach and perform associated tasks in future (“My memory is clearly ****ed. How will I ever remember everything on my to-do list today?”). We become our own self-fulfilling prophecy.
Now I’m aware, I’m going to try to catch myself in the act when I do this, and maybe respond differently. It’s early days yet, but I would say I’m noticing this tendency more in myself and others. I met a friend for coffee recently and we were chatting about her job, and whether she wanted to apply for a new role with more responsibility. “I just don’t feel I’m mentally up to it any more,” she confessed, “I’m too old.” Mindful of all I’ve said above, I replied, “If you really don’t fancy it, that’s up to you. But if you do feel it’s not for you, is it down to your age? Remember, you moved jobs two years ago from one in which you had control over your daily routine to one which has you running around, at the beck and call of others. In your current role, you often feel overwhelmed, and that might be colouring your mindset, making you doubt your abilities. It might not be your age.” “Louise,” she replied, “you would never have said that before, you’d just have agreed with me.” And she’s right, I would have.
I’m not saying there aren’t some downsides to growing older, simply that there may be many explanations for the things we do as we grow older that are not irrevocably tied to age. Pausing to reflect on these alternatives might allow us to reappraise our achievements and move forward with this phase of our lives more positively, be that bit more resilient. As Becca Levy says: “…as all humans age, they should be aware of their own implicit negative views of their group and consciously develop an identity with old age and its positive attributes, using these to compensate for the ill effects of automatic ageism.” Reader, I’m on it!
Five years ago I got a speaking invitation from Maura O’Malley, co-founder of Lifetime Arts, a nonprofit that was creating professional arts programs for olders long before “creative aging” was a thing. (Google it now.) Last night they celebrated their tenth anniversary at a gorgeous gala in NYC, honoring Aroha Philanthropy’s Ellen Michelson for her visionary leadership and me with their inaugural Game Changer Award. (That’s Ellen below on the right—no, we didn’t coordinate our dresses—and Maura in the middle.)
One thing I turn out to be good at is recognizing my people when I encounter them, and the Lifetime Arts crew are definitely among them. They were the first organization to support me, and the first people in the creative aging field to understand that confronting ageism is central to their mission. As I said in my very short acceptance speech, “Artists like you are taking this change out into the world.” I’m grateful.
I also heard last week from San Francisco’s Yerba Buena Center for the Arts that I had been selected for their YBCA 100 list, “which celebrates the people, organizations, and movements that inspire us most. This year we continue to highlight cultural provocateurs and innovators from the Bay Area and around the globe. Welcome to a list that includes alumni such as activist Alicia Garza (2016), journalist Jose Antonio Vargas (2017), and filmmaker Boots Riley (2017).” Cultural provocateur! What an honor, and what fantastic company to be keeping.
Left: with my kids, Luke and Morgan, at the Lifetime Arts gala. Right: thanking John Leland, friend and author of Happiness Is a Choice You Make, who introduced me.
That fantastic slogan is the handiwork of Amy Gorely, an important new spokesperson for the movement to dismantle ageism. Watch her inspiring two-minute talk here. She lives in North Carolina, where I met her at the Wake Forest Aging-Reimagined symposium yesterday, and we’re cooking up ways to collaborate.
Note: if you’re new to my work, do check out the links, which explain or expand upon some key ideas.
Author and activist Barbara Ehrenreich has long been one of my heroes, and I imagine an affinity in our fondness for myth-busting. In her new book, Natural Causes: An Epidemic of Wellness, the Certainty of Dying, and Killing Ourselves to Live Longer, she describes herself as an “amateur sociologist,” and I thought, “Aha, me too!” But although I was a staff writer for a science museum for twenty years, Ehrenreich’s Ph.D. in cellular immunology leaves me in the dust. She brings deep medical expertise to her latest subject: the American delusion that we can evade aging, even death itself, via right doctoring, right acting, and right thinking. (Her previous book, Bright-Sided, skewered Americans’ outsized faith in positive thinking; here’s my take on it.)
I couldn’t agree more with the book’s exposition of the damage done by our reluctance to acknowledge aging and mortality. Age denial is where ageism takes root, and it’s fed at every turn by a culture that frames aging as failure and natural transitions as disease. Shame and fear create markets, capitalism always needs new markets, and healthcare is big business. Age denial not only fosters ageism, it makes a good death less likely. Pretending we’re not getting older, not to mention mortal, makes it harder to embark on the necessary conversations about what we think we’ll want when the time comes. It also leads us to squander resources on costly but ineffective tests and treatments, especially towards the end. Most of Natural Causes is a detailed takedown of those tests and treatments, from mammograms to mindfulness: “preventive medicine” that Ehrenreich the biologist exposes as medically useless and ethically problematic. Fitness nuts live no longer than the rest of us. Placebos work, even when people know they’re being given a placebo. There’s no evidence that meditation is more effective than an hour spent doing anything calming. Most medical screenings—for breast, colon, and prostate cancer, for example, along with annual physicals—also fail the evidence-based test.
At 76, Ehrenreich is calling it quits on all that. She declines longevity for its own sake, and demands that we forsake the illusion that we’re in charge of our biological destinies. “If anything, I hope this book will encourage you to rethink the project of personal control over your body and mind. We would all like to live longer and healthier lives; the question is how much of our lives should be devoted to this project, when we all, or at least most of us, have other, often more consequential things to do,” she writes. One of the reasons Ehrenreich can afford this argument is that she’s a longtime gym rat, and I’d argue that staying reasonably fit is consequential. We know physical activity forestalls both physical and cognitive decline, improving quality of life. But the way we grow old is governed by a whole range of variables—including environment, personality, and genes, compounded by class, gender, race, luck, and the churnings of the global economy—over which we have varying degrees of control. Available only to the well-off, the illusion of control dumps all responsibility onto the individual, conflates luck with virtue, demands optimism without end, and shames when we inevitably fall short.
I do, however, take serious issue with the ageist way in which Ehrenreich frames her central argument. Rather than falling into the anti-aging or “successful aging” camps, she explains, “I had a different reaction to aging: I gradually came to realize that I was old enough to die [emphasis hers].” Ehrenreich isn’t saying that humans come with an expiration date, pointing out mordantly that the military judges people to be old enough to face mortal danger at age eighteen. The title of her book comes from a phrase used in obituaries when the deceased is over 70: a death from “natural causes.” This raises no eyebrows, as she points out, and it shouldn’t. The death of a young person is indeed harder to bear than that of an octogenarian, because youngers have experienced less of what life has to offer and because the rest of us are robbed of the chance to witness and share those experiences.
That doesn’t make it OK to reduce aging to illness. “Even the most ebullient of the elderly eventually comes to realize aging is above all an accumulation of disabilities,” writes Ehrenreich. Above all? Hardly! Growing old also brings self-knowledge, better mental health, even liberation, and is a period of ongoing growth and development, especially for those with meaningful roles and social supports. Even the most frightened and unenlightened know that despite the loss of cartilage and comrades, aging is different—and way better—than the way it’s portrayed in the culture.
Nor is it acceptable to suggest that olders are useless and disposable. Noting that the hallmark diseases of aging (atherosclerosis, arthritis, Alzheimer’s disease, diabetes, and osteoporosis) are all autoimmune disorders, Ehrenreich proposes that instead of asking why the body attacks itself, a better question might be:
“Why shouldn’t it happen? The survival of an older person is of no evolutionary consequence, since that person can no longer reproduce, unless one wants to argue for the role of grandparents in prolonging the lives of their descendants. It might even, in a Darwinian sense, be better to remove the elderly before they can use up any more resources that might otherwise go to the young. . . . And this perspective may be particularly attractive at a time, like now, when the dominant discourse on aging focuses on the deleterious economic effects of largely aging populations. If we didn’t have inflammatory diseases to get the job done, we might have to turn to euthanasia.”
The fact that the “dominant discourse” on aging is so negative and one-sided—so ageist, in other words—is what makes it so urgent and important to challenge age bias. The conversation is anything but neutral, especially around “living too long.” An ageist culture casts the “end of life problem” in terms of increasing numbers of old people who inconveniently refuse to die, when the underlying issue is the changing nature of healthcare: the plethora of profitable, often legally mandated, high-tech medical interventions the book decries. Whose interests are in play besides those of the patient, and who is her advocate if she needs one? It’s not a particularly radical leap to conceive of assisted suicide and euthanasia as a form of discrimination against the old, the ill, the disabled, and those who are no longer economically productive, cloaked in the rhetoric of compassion. In an ageist and capitalist society, the line between “right to die” and “duty to die” can get blurry alarmingly fast.
“Ideally, the determination of when one is old enough to die should be a personal decision, based on a judgment of the likely benefits, if any, of medical care and—just as important at a certain age—how we choose to spend the time that remains to us,” writes Ehrenreich. But the right to self-determination is important at any age. It’s ageism and ableism that make the old and ill seem less entitled to it, and cutthroat capitalism that sanctions their abandonment. Small wonder that it’s become commonplace to hear even healthy, middle-aged people wondering whether the ethical alternative to “living too long” will be to commit suicide—not because they’re sick, or broke, or have no one to take care of them, but simply because they’ve grown old. That is internalized ageism of the deadliest sort. At any age and in any condition, a person has the right to want to stay alive.
“Once I realized I was old enough to die, I decided that I was also old enough not to incur any more suffering, annoyance, or boredom in the pursuit of a longer life,” writes Ehrenreich. I’d tweak her credo: replace “old enough to die” with “old enough to know better” or “old enough to choose wisely.” It’s not that she’s eager to call it quits, which very few of us do, but that she’s wise enough to spend her days doing what she likes. “As the time that remains to me shrinks, each month and day becomes too precious to spend in windowless waiting rooms and under the cold scrutiny of machines. Being old enough to die is an achievement, not a defeat, and the freedom it brings is worth celebrating.” Liberation from the specter of a long, costly, agonizing exit hooked up to machines is real, and this book liberates.
Ehrenreich is consciously opting for quality of life, which for her means a life as free of doctors and hospitals as possible. For the record, an ageist society grossly underestimates the quality of life of the very old. The bull looks different. Also for the record, much of the care doctors offer patients with terminal conditions is futile, and most doctors would themselves decline it. The medicalization of aging does make what Ehrenreich calls “the truly sinister possibility” more likely: “for many of us, all the little measures we take to remain fit—all the deprivations and exertions—will only lead to a longer chance to live with crippling and humiliating disabilities.” But it’s stigma—ageism and ableism again—that make disability humiliating. And there’s a big difference between taking reasonably good care of yourself and giving yourself over to a life defined by doctoring.
Each of us will have to decide when to abstain or indulge, whether to be scanned or scoped, and how to cope with the consequences. I had every inch of me examined last summer, when I turned 65 and went on Medicare. That was before I’d read Natural Causes. When the next decision point arises, will I have the courage and clarity to forgo tests or treatment? Will I break with my privileged demographic, buck my doctor’s advice, brave my family’s disapproval? Will I be not “old enough” but wise enough? I don’t know yet.
Not all of us will make the same choices, and none of us know what we will want at the end. But postponing those reckonings—not dealing with aging and its inevitable end—robs us of calm and contentment all the way along. I’m glad that growing hospice and palliative care movements are making it easier for more of us to forsake the futile pursuit of immortality, and hope the trend signals a growing cultural willingness to come to terms with the transitions ahead. What better way to maintain the upward trajectory of the U-curve of happiness—along with progress towards a world where ageism is as unacceptable as every other form of prejudice.
In my last post, I wrote about the regrettable tendency to act as though older people and people with disabilities form two separate groups. When groups within companies don’t share information or knowledge, it’s called a “silo mentality.” It reduces efficiency and compromises the culture. Siloing is just as damaging in the social justice sphere, where it fosters disconnection and marginalization.
Swapping silos for intersectionality
The antidote is to think and act intersectionally—a clumsy word for a powerful idea. Black feminist Kimberlé Crenshaw coined the term intersectionality in 1989, to address the ways that different forms of oppression—like racism, sexism, and ageism—interact and combine to undermine us all. It’s also a way of thinking about the relationship between identity and power: how people and institutions use identity—old, for example, or disabled, or fat, or Muslim, or crazy—to confer or withhold advantage. In Crenshaw’s words, in an article called “Why Intersectionality Can’t Wait,” “intersectionality isn’t just about identities—race, gender, class—but about the institutions that use identity to exclude and privilege.”
These relationships explain why the poorest of the poor, everywhere in the world, are old women of color. Add disability to the mix, and vulnerability increases even more. It’s why, as Crenshaw wrote, “We simply do not have the luxury of building social movements that are not intersectional, nor can we believe we are doing intersectional work just by saying words.”
Many humanitarian efforts leave people with disabilities behind.
A vivid example came my way this week in an eloquent article, “Putting inclusion into practice,” by Kate Bunting, the CEO of HelpAge USA. A term that came out of disability rights, inclusion means giving people with disabilities (PWD) full access to society, whether that means providing closed captions, building wheelchair ramps, or simply inviting PWD into the conversation. Inclusion is core to HelpAge’s mission to improve the lives of the world’s poorest olders, and Bunting wants it to be a mainstream humanitarian priority.
We’re not there yet, because we don’t collect much information on older people and what data we do collect isn’t broken down by age or disability. Without data, we can’t design programs with those populations in mind. “What this translates to in practice,” she writes, “are distribution centers reachable only by those who can walk there; food only for those capable of digesting it; and emergency warnings understood only by those who can see and hear.” During wartime or emergencies, this makes PWD, many of whom are older, the last to receive resources and the first to die.
The global discussion of gender-based violence omits older women.
Guess who else is underrepresented in data collection? Older women (because they face both sexism and ageism—hello, intersectionality). “Women over 50 have long been ignored both statistically and anecdotally—as if there is a magical age that means a woman is no longer vulnerable to violence and discrimination,” writes Bunting in another powerful post called “#MeToo has no age limit.” No longer reproductively useful, women over 49 are systematically excluded from studies of gender-based violence and health. Again, the lack of information makes it impossible to create interventions that address their needs, even though violence against older women—physical, sexual, and emotional—is an urgent health and human-rights issue. As Bunting points out, women over 50 make up nearly a quarter of women around the world, their share of the population will only grow, and many live in developing countries where social or legal recourse is inadequate or nonexistent.
Governments and organizations are beginning to heed the HelpAge call: “Remember to include older women. Remember them in your work. Remember them in your policy objectives. Remember them in your development programming,” Bunting writes. “They have said, ‘Me too.’ We just haven’t been able to hear them because we never asked.”
None of us are free until all of us are free.
What can the rest of us do? Bust out of our silos. Ask honestly whether we’ve been reaching out to those with less privilege, to people who don’t look like us, or live far away, or don’t seem to have much in common with us. At heart, there’s only “us.” I used to say that ageism was the only form of discrimination that affects everyone, but in fact all oppression burdens us all. Like the T-shirt says, “None of us are free until all of us are free.”
A movement against ageism is underway. If we want it to leave no one out—to be genuinely inclusive—it has to engage people at the margins of society, the ones that people in power deliberately overlook because they can get away with it. It’s no surprise that queer women of color are leading the charge. They’re the ones, in the words of Rutgers professor Brittney Cooper, “who meld race, gender and queer politics into an expansive, inclusive, and just vision of the world.” That world is better for all of us. It’s the world I want to inhabit and that I’m learning to work towards. As a white woman of privilege not used to abandoning her comfort zone, I have a long way to go.
People with disabilities come in all ages, and almost all of us encounter some change in physical or mental capacity as we grow old. Yet, as I wrote in this substantial post, “We act as though old people never become disabled and disabled people never grow old.” Academics and policymakers approach disability and aging as separate fields, as Ann Leahy observed in this post for the International Network for Critical Gerontology (daunting name, terrific resource). Why? Because people in the aging field are understandably leery of seeming to equate aging and disability, and because, as Leahy noted, disability activists tend to be younger and mainly focused on issues that affect people of working age. And because we’re short-sighted, and all of us are prejudiced.
This does none of us any favors, something I want to address in a new talk I’m working on. Here’s the passage-in-progress. Comments and critiques very welcome.
Ageism feeds ableism (prejudice against people with disabilities), and vice versa.
Disability and aging are different. They also overlap in important ways. Both olders and people with disabilities encounter discrimination and prejudice. And both groups face stigma. Many olders refuse to use wheelchairs or walkers, even when it means never leaving home. My uncle wouldn’t use a white cane even after he went completely blind, preferring to rely on the kindness of strangers and taxi drivers. After breaking a bone in her foot, a not-yet-forty-year-old friend likewise declined a cane, opting for crutches because they signal “injured,” not “old” or “disabled.” Cognitive impairment is even more stigmatized.
Being older or having a disability doesn’t keep us from being ageist or ableist. Age cooties! Handicapped people make me uncomfortable! That’s how prejudice works: it frames the other group—what we think of as the other group, that is—as alien and lesser than ourselves. This makes no sense, because people with disabilities come in all ages, after all, and most of us, if we live long enough, will face changes in physical or mental capacity. Ignoring the overlap also rules out collective activism.
We have a lot to learn from the activists who in the 1970s and ‘80s reframed the way we see disability. They changed it from an individual medical problem into a social problem—bingo!—and then demanded integration, access, and equal rights. We share the same goal: a culture that rejects narrow definitions of “productivity” and attractiveness, finds meaning within limitations—the bull looks different—and takes a realistic and inclusive view of what it means to be human. Let’s join forces.
What affliction do Americans fear most? Alzheimer’s disease. I’m one of them, unless so many bones give out that I have to be carried around in a shovel. But facts comfort me. Abundant new data shows that our fears are way out of proportion to the threat—and that those fears themselves put us at risk.
Fact #1: Dementia rates are falling. As I reported last April, the likelihood of you or me developing dementia has dropped—significantly—and people are getting diagnosed at later ages. That’s despite a surge in diabetes among older Americans, which significantly increases the risk. Numbers remain high—an estimated four million to five million Americans currently have dementia—but that number pales in comparison to the number of people who are worried about getting it, and about aging in general. Why is that important?
Scientists consider a gene called ApoE to be the primary genetic risk factor in late-onset Alzheimer’s disease, yet many who carry it never develop dementia. How come? Could environmental—and therefore modifiable—factors play a role? The new study, led by Yale’s Becca Levy, worked with a group of 4,765 people over age 60 who were dementia-free at the start, more than a quarter of whom carried the gene. Levy and her team interviewed them regularly over the course of four years, asking them to rate their responses to prompts such as, “The older I get the more useful I feel.” They found that people with more negative attitudes were twice as likely to develop dementia. In other words, positive age beliefs confer protection against cognitive decline—even among people who are genetically predisposed to the disease.
Both experimental and longitudinal research show that stress, which is linked to dementia, may be the mechanism. Positive attitudes reduce stress and help us cope with the negative messages about age and aging that bombard us from all directions. People assimilate cultural beliefs from early childhood on, and as these stereotypes become more relevant over time, we tend to act as though they were accurate, creating self-fulfilling prophecies. (More here about Levy’s theory of stereotype embodiment.) Positive beliefs (e.g. late life is inherently valuable, old age is a time of growth and development, olders contribute to society) help keep us healthy by buffering stress and prejudice: the effects of ageism. Negative beliefs (e.g. it’s sad to be old, old people are ugly, aging means becoming a burden) make us vulnerable to disease and decline.
It’s time for an anti-ageism public health campaign.
Reputable researchers are careful not to overstate their findings, but the scientists behind this new study note that their findings have far-reaching social implications. In personalized medicine, for example, education could bolster positive attitudes in people at higher risk of developing dementia. On a broader scale, as Levy points out, the research “lays a foundation for creating a public health campaign to beat back against ageism and negative beliefs about aging.” I’ve been making this case for years.
No matter how you feel about the longevity boom, or just about hitting that next big birthday, everyone wants olders to stay as healthy as possible for as long as possible. Imagine the benefits to health and human potential of replacing negative stereotypes about age and aging with more nuanced, positive, and accurate portrayals. The 65+ population of the US is expected to double by the year 2030. Let’s get cracking!
This project began 11 years ago as one about people over 80 who work. Upbeat! Inspirational! Safe! I didn’t realize it at the time, but the project epitomized an approach that has dominated gerontology since the 1980s: “successful aging”—also known as “active,” “healthy,” or “productive” aging. For most of human history, aging was seen as a natural process largely beyond our control. Enter the “successful aging” model, which posits something close to the opposite: eat right, stay fit, choose well, have a good attitude, be “productive,” and we can craft the old age we want. The model emerged to counter the prevailing narrative of aging as nothing but loss and decline, and it’s deeply appealing.
Something about this way of thinking made me uneasy, and I was lucky to get a gentle course correction early on from geriatrician Robert Butler, the inventor of the term ageism and one of the older workers I interviewed. “If you get up in the morning and get yourself dressed, you’re being productive,” he told me. Or, as I put it more bluntly years later, “If you wake up in the morning, you’re aging successfully.” As I came to realize, healthy behaviors and “can-do” strategies are terrific, but they can’t hold aging at bay—nor is that something we should aspire to. An active, healthy 65 is still 65, not “the new 50.” Imagining otherwise is denial—a high-end version that overlooks the very important role of socioeconomic class, along with race, gender, and just plain luck, in shaping how “successfully” we age. It leaves behind those who aren’t wealthy or healthy enough to age the “right” way, and it feeds the denial in which ageism takes root.
The model is problematic in lots of other ways as well, as I learned from reading Successful Aging as a Contemporary Obsession: Global Perspectives, a new collection of essays edited by Sarah Lamb. Lamb points out a central irony in the introduction: although “successful aging” came into being to counter aging’s negative image, this hyperpositive way of thinking “is, in ways that can be hard to recognize, in some respects profoundly ageist—resting on a deep North American cultural discomfort with aging, old age, and being old.” To age “successfully” is essentially to not age—to stop the clock—despite the fact that very few of us are going to drop dead without experiencing some kind of diminishment, whether physical, cognitive, or social. As Lamb writes, glossing over those normal transitions not only makes it all the harder to learn from and adapt to them, it sets us up to fail.
In any case, why should aging be something to succeed or fail at? That’s the why-didn’t-I-think-of-it question posed by Toni Calasanti and Neal King in the book’s first essay. We don’t talk about successful infancy or “teenagehood” or adulthood, after all, and we understand that those life stages come with both pros and cons. “No other stages are treated as if they had no value unique to them, as if no positives resulted from entry into those stages, or as if we needed to justify their existence by minimizing what is unique to them,” they observe. Why should later life be the exception? Calasanti and King also argue that urging people to take responsibility for their own aging ignores the inequities that give rise to ageism in the first place. “It does not confront the notion that old age is worse than middle age, that old people should find ways to be more like their younger selves, [and upholds] other life stages as the models against which elders will be assessed.” In other words, the successful aging model leaves ageism unchallenged or contributes to it.
What else is problematic about “successful aging”?
It’s classist. Because aging “successfully” requires education, leisure, passports, and access to good healthcare, nutrition, and exercise—all of which are expensive—it overlooks social inequalities. The successful ager is an assertive patient-consumer, upholding their civic duty by taking good care of themselves! The emphasis on personal responsibility dovetails with neoliberal and very American ideals of self-governance and independence. This relieves the state of responsibility, which in turn makes it less likely that the less well-off will receive the public support that makes it possible to age well—or even to age at all.
I knew about this class bias, but hadn’t given any thought to how it plays out in the arena of caregiving. In an essay about older Chicagoans, Elena D. Buch writes, “Efforts to promote successful aging that focus on increasing self-determination and independence implicitly prioritize the well-being of vulnerable older adults over the well-being of their also-vulnerable care workers, strengthening existing social hierarchies based on race, class, and gender.” Oof.
It’s ableist. The “successful aging” model assumes that olders are healthy and just have to stay that way. There are no canes or wheelchairs in sight. Where does that leave those with chronic conditions, or with a disability, whose numbers inevitably increase with age? An essay by Jessica Robbins Ruszkowski describes Poland’s Universities of the Third Age, a popular educational and social institution in Europe that promotes active aging. Because illness would mean entering the Fourth Age (dependence and decrepitude), “The concept of the Third Age thus makes illness unthinkable.” The result is exclusion, instead of inclusive visions of aging that “go beyond binary constructions of activity and passivity, success and failure, productivity and unproductivity, and health and illness,” Ruszkowski writes. As an alternative, what if funding weren’t restricted to “programs focusing on active aging as such, but toward ensuring that people have the ability to support whatever kinds of social activity they find meaningful?”
As Janelle S. Taylor writes in an essay about friendship in the face of dementia, the conventional “successful aging” narrative requires stopping the clock: achieving physical, cognitive and social stasis. This presupposes an “entire social world … in which other people are also not aging in complicated ways alongside one,” and in which it makes sense to step away from friends who become incapacitated. Do so and we forfeit participation in what Taylor calls a “moral laboratory.” Those who hang in “describe friendship after dementia that is capable of changing rather than simply enduring; and they describe dementia as an impetus for personal and interpersonal transformations that can involve learning, growth, and unexpected gifts in addition to very real experiences of sadness and loss.”
It’s shame-inducing. If we’re responsible for the way we age yet unable to control its course, aging becomes a source of shame and embarrassment. As Abigail T. Brooks writes in an essay about why North American women have cosmetic surgery, this “can give rise to the blaming and shaming of olders for simply being and growing old and for failing to do anything about it.” We experience natural physical transitions as betrayal. It’s an embarrassment, or worse—a moral or political failure—if the trajectory changes, as in the case of a high-achieving, active woman in her 70s who experienced her cancer diagnosis as a personal failure.
It perpetuates gender stereotypes. The advertisements and products that promote “successful aging” “reinforce white, middle-class heterosexist norms of male performance and female beauty,” writes Lamb. Women are supposed to focus on maintaining beauty and men on their capacity to perform, whether in bed or at the gym, which reinforces active and passive stereotypes—desire on the part of men, desirability for women. “Successful aging” “means accepting that how you look (i.e. having a youthful appearance) matters,” writes Brooks. This requires both personal responsibility and hard work, although the women she studied didn’t describe it as work. Those who rejected this equation of youth with beauty and appearance with value—hard work in itself—“reap rewards as they forge new relationships to their aging bodies and as they realize new avenues for self-expression outside of the body altogether.”
It medicalizes the aging process. In an essay about selling youthful sexuality as “successful aging,” Emily Wentzell defines “lifestyle drugs” as “pharmaceutical treatments for conditions that range from baldness cures to eyelash lengtheners that cause social distress rather than physical harm.” The emotional relief they provide is real, she observes, but the sources of that pain are not medical conditions but social expectations for how bodies should be—expectations that advertising and medical practice aggressively promote. “So, rather than questioning or challenging cultural expectations, people who cannot meet them increasingly turn to pills that will change their bodies,” writes Wentzell, which is dangerous, increases healthcare costs, and promotes unrealistic expectations, once again setting us up to fail as these strategies inevitably fall short.
Viagra exemplifies this trend, which, Wentzell notes, has “a range of social consequences.” While it counters the damaging stereotype that olders should not be sexual, it promotes a narrow vision of what constitutes “healthy” sex and suggests that people should want to have sex the same way throughout their lives. She found a very different attitude in the working-class Mexican men she studied, who perceived erectile dysfunction drugs as dangerous and even absurd, and were content to shift from “macho masculinities” to “being faithful, caring, and emotionally present for their families.” Many of their wives experienced the transition as “a beautiful change.” Wentzell ends the essay with an appeal to question Euro-American ideas of aging as a pathology. “We can fight against this trend by basing our ideas of successful aging on people’s diverse and culturally specific social needs, rather than on the expectation that healthy aging means ‘staying as young as possible’ for everyone, everywhere.”
It’s ethnocentric. The prevailing “successful aging” model is deeply linked to American cultural ideals about productivity, independence, and control over our bodies and our futures, even in old age. Anna I. Corwin’s study of Catholic nuns describes a very different value system. The nuns ceded control and agency to the Divine, which helped them accept changes with equanimity; they valued interdependence over independence; and they saw “being good” as more important than “doing good,” so retiring or becoming disabled carried no stigma. As a group, Catholic nuns are happier, healthier, and experience lower rates of Alzheimer’s.
A “new paradigm for well-being across the life course” proposed by Meika Loe likewise “emphasizes learning to ‘be’ in a culture of doing.” She calls it Comfortable Aging. This model embraces interdependency, accepts vulnerability and limitations, and involves coming to terms with mortality. “While we can fail at being ‘successful’ or ‘productive,’ personal comfort is subjectively defined and attainable,” Loe points out. “Importantly, most structural issues linked to Comfortable Aging are non-age-specific—social respect, affordable housing, community-oriented neighborhoods, access to transportation, dependable services, and care that honors all stages in the lifecycle—these are universal needs.” In other words, a society in which it’s OK to age comfortably is one that supports all its members all along the life course.
Another valuable aspect of Successful Aging as a Contemporary Obsession is its global perspective. Many in the majority world find these ultrapositive images of aging unrealistic and counterproductive. Lamb has done extensive fieldwork in West Bengal, where, far from being idealized, “too much independence is commonly regarded as the worst thing that can befall one in old age.” More than 80% of India’s 65+ population lives with their families, embodying “a relationship of lifelong intergenerational reciprocity.” There’s nothing demeaning about receiving care and support of all kinds, including with toileting. Imagine that!
As Lamb points out, we have much to learn about aging well from some Buddhist, Hindu and Catholic ways of thinking, which “highlight transience as a fundamental part of being human” rather than denying or stigmatizing the changes that accompany us along the life course. “Can we not accept signs of aging—even if they include declines, vulnerabilities, and ephemerality—as in some ways a meaningful part of life?” Lamb asks. “Shouldn’t it be possible to regard old age and death not as intolerable outrages, nor as failures of medicine and self, but rather as inevitable facets of life, defining in part what it is to be human?” That less “successful” world would be a better one for all: less fear-filled, more communitarian, and more open to the transcendent possibilities of life itself.