The most significant development in age studies during the last decade has been the emergence of the cultural dimension in the analysis of age and ageing. This is a promising portent of increasing resistance to ageism.
At the same time—painful facts—the age beat is still barely visible in the positivist mainstream media or the youth-oriented universities, and the general public thinks that how one feels about ageing has nothing to do with ageism. Even the Dalai Lama, 77, from a tradition that venerates ageing, articulates the casual ageism of our fallen world. Gesturing to two American priests, he said, ‘These days, I have felt I am quite old. But with them, I am young’ (Powers 2012: B1). He used ‘young’ as everyone does—to mean a desirable state of being in opposition to the unwanted state of feeling ‘old’.
Is there no escape? Biomedicine constructs us all as bodies-that-will-fail. Ecstatic announcements of ‘anti-ageing’ remedies are effective in increasing ageism, if not in producing cures. The ‘treatments’ proposed operate, if we can’t resist, because of their parasitic intimacy with the vast commerce in ageing: industries that thrive on our supposed sexual dysfunction (Big PhaRMA) and growing uglification (plastic surgeons, ‘beauty’ magazines). People who are vulnerable to the messages may become invested in options for useless somatic intervention.
Age profiteering is buoyed by trends other than the cult of youth. Age is subject to history like any other body-based attribute. The forces that promote narratives of either decline or progress in later life are not in the same equilibrium, or rather imbalance, in 2000 as they were in 1900 or 1950. Now fantasies of the longevity bonanza proliferate alongside growing terrors of living too long and becoming a ‘burden’. This savage contradiction characterizes ‘the New Longevity’. Other distinctive elements of living in a heightened-risk society also distort intergenerational relations and later life: partisan laments at the ‘greying’ of nation-states, just as capitalism is failing to produce adequate numbers of jobs even for mature workers; the alarm over midlife memory loss in the Age of Alzheimer’s; and insidious proposals to defund old age, which circulate widely atop the grey data.
Positive ageing, not to mention cultural gerontology and progressivism, pant in the rear behind the juggernaut. Just as sexism survives good news about women’s advances, ageism survives sunshiny news about ‘bonus years’ and liberal defences of old-age entitlements as due to elders who have paid into the safety nets and deserve them. Radically changing contexts give urgency to critics’ attempt to keep up with the powers playing the age cards. The task of age critics is often to show how age ‘knowledge’—publicly shared decline culture and the framing of policy debates—is constructed, in tandem, by all of the above. Understanding age means going interdisciplinary in unexpected ways, and steeling oneself for unearthing more bad news.
In the United States (where the author of this chapter lives), ageism has been growing worse, becoming in some contexts (as we shall see) lethal. Ageism is most vicious toward the most vulnerable, those who grow past the long midlife into frail old age. But ageism inevitably has effects on younger people as well—on how they anticipate their own life course and judge the value of older people. Evidence has been mounting for decades that ageist attitudes and emotions are acquired in childhood, like sexism and racism, and harden in youth.
A baby learning to walk toddles on tippy-toe across the floor and falls; anyone present applauds. An ‘old man’ falls to the floor and onlookers hold their breath.
Both participate in central cultural narratives of age and expectation. The next step for the baby is known: speaking, growing, learning. Progress. The baby stands up; her laughter when falling may become part of her legend. The old man collapses into another narrative. The next step, even if he rises making a rueful joke, appears to be a walker or a hospital. Decline.
In both cases, onlookers ‘know’ the next step. It clicks into place mechanically. To put the cultural point briefly, life-course beliefs, behaviours and narratives depend on the age of the human object in question. The fall of the baby embodies progress, while the fall of the old man embodies expectations about his failing body. As I have explored in earlier books, Declining to Decline and Agewise, even a man who comes from a highly protected subculture knows the master narrative of decline (Gullette 1997, 2011).
Sometimes people ageing-past-youth know this in excruciating detail. In Doris Lessing’s novel, Love, Again, a theatre director, age 65, sitting one night surrounded by younger actors suddenly ‘feels her age’ sexually and says to herself, harshly, that she, Sarah Dunham, is ‘in exactly the same situation as the innumerable people of the world who are ugly, deformed, or crippled… . [N]o matter how unfeeling or callous one is when young, everyone, but everyone, will learn what it is to be in a desert of deprivation’ (Lessing 1996: 141). Ageing is a lengthy process involving gains as well, but ‘ageing-as-decline’ may flare up suddenly—especially in fiction, which adores epiphanies—as a sudden loss of privilege. Decline, also an ableist and looksist discourse, invokes all the disabilities and losses linked to being no longer young.
But does the ‘old man’ who fell—how old? how strong?—apply similar ideas to himself? As he falls, he may believe that osteoporosis is only a story about old women. Race, class, access to medical care, religion, also help decide what our man thinks, lying on the floor. He may not concur, but has no control over the banal narrative onlookers impose. In nothing do people differ more than in their ageing; in nothing are they more homogenized than by ageism.
In all cultures, children are exposed early to accepted life-story formulations. Learning them is an important activity. These stories tell the meaning of time passing. My grand-daughter Vivi, at age six, has been told she can’t pierce her ears until she is ‘thirteen’. A miniature progress narrative, giving her another reason to look forward in the life course.
At the same time, scholars report that by age eight, American children have ‘well defined negative notions’ about old age (Quadagno 2011: 13). In a Michigan study (Aday et al. 1996: 44), pupils with the most negative views answered, ‘Growing old makes me scared of being sick’ and, ‘When I grow old, I don’t want to be in an old folks’ home. I want to be independent’. Schoolchildren lard their sentence completions with images that include ‘scary’, ‘weird’, ‘lonely’, ‘stingy’, ‘vulnerable’. Young children sit further from older adults than from younger ones (Isaacs and Bearison 1986). Might Vivi soon sit further from me?
Learning is constructed through overlaps among ‘information’, behaviour, semantic events, and intergenerational patterns. New public-health advice to avoid obesity is effected by warning children about ageing-as-disease: heart disease, stroke, diabetes. Parents who act to ‘minimize their own risk of ever needing health care that would be available only at market prices’ (Kunow 2010: 303) might also reinforce their children’s fears of ‘ageing’. So might adults joking about ‘old-timer’s disease’ and comic TV skits about ‘cougars’ hunting younger men. Humour is the last refuge of hate speech. Is it a surprise that increased ‘knowledge’ of the ageing process doesn’t necessarily improve participants’ attitudes toward older adults, even, disappointingly, after the gerontologists intervene? The researchers add, wearily, ‘this finding is not new’ (Cottle and Glover 2007: 511).
Many children placidly expect to be old and nice when they become grandparents. But some turn into young adults who write ugly stuff on the Internet. Speaking as if for his youth cohort, one student writes on Hatebook.com, ‘God forbid these miserable once-were-people not [sic] survive as long as possible to burden the rest of us’ (Stripling 2011). Ageism can be more vicious than racism or sexism because it has no righteously belligerent community vocally resenting it. When the economy crashed in 2008, and American media and the Republican party scandalously described old people as ‘burdens’, a teacher—not coincidentally—set a debate topic for students: ‘Why are old people a burden not only to the family but also society?’ (Yahoo Answers 2008; many people on the Yahoo website, however, defended old people). Leni Marshall (2008–9: 56), an English professor and feminist gerontologist, observes, ‘Most students arrive at college with a set of stereotypes about ageing firmly but unconsciously embedded’. Many students reject ageist hate speech but are unable to formulate good arguments against it. Since students may learn ageism from their syllabi, discussing ageism in all classes—including in classics and vampire fiction—is another useful exercise.
Starting in childhood, as age narratives accumulate in our minds (those miscellaneous repositories), we are being aged by culture. Age curricula, mediated and interpreted through everyone’s unique bodily experience, change over any lifetime. National crises force changes, absorbed in bits as we feel interpellated by them, lifelong. How narratives, language, images, institutions, and systems function to ‘re-place’ us in the life course as we think we merely ‘age’—these issues drive the age beat. In the next sections, I follow the money, critical currency, into middle and later life.
The crisis in employment can now start at 40. I invented the term ‘middle-ageism’ to focus on it, but ‘older worker’—which is misleading—is used by economists. The years from 45 to 54 are supposed to be the highest earning, at least for the middle class. The family life course is most expensive then—particularly with young-adult children staying home or boomeranging and older parents suffering longer with chronic illnesses. Women ageing-into-midlife in recent decades began to benefit from this rising age-wave curve. But the arc of the American Dream is being eroded, with enormous consequences for individuals and for our vision of the life course.
Up in the Air, the 2009 movie about a midlife man whose job is to fire others, starts by interviewing real people who had just been canned. The response of one middle-aged African-American woman ‘to her horrifically impersonal firing’, a critic writes, ‘is to quietly claim, with her eyes trained directly on the [camera], that she will jump off a bridge’ if she can’t find work. She followed through (Koresky 2009). Suicide goes up, as do heart attacks, among the unemployed. At midlife, they may rightly despair, because the consequences are dire.
Workers in prime age fall out of work—3.5 million of them in the US between the ages of 45 and 64 in 2012 (Casselman 2012). The jobless rates for men and women older than 55 are the highest they have been since the Great Depression (Cauchon 2009). In an ageist society, if you lose a job at midlife, you will be unemployed for far longer than a younger person—months longer than your own adult child. The average job search for a young woman aged 25 to 34 was 9 months in 2011, while that of a woman 45 to 54 was two and a half months longer (BLS 2011).
The situation for people 45 to 54 has worsened since 2000, when only 16 per cent of the unemployed in that age group had been out of work for over 6 months. By 2011 the share was about half (EPI 2012). If you are under 40, you have a 40 per cent higher likelihood of being interviewed than if over 40, Sara Rix of the American Association of Retired Persons (AARP) reported at the Gerontological Society of America meeting in 2011. The average duration of unemployment rises every decade over the working life course.
Midlife unemployment destroys because there are fewer ways back up. Those lucky enough to find jobs again typically lose status, pay and benefits. Their jobs may be temporary; they may be underwater with debt. A Rutgers study estimates that of the millions of American workers who have lost jobs since 2008, 28 per cent are between the ages of 45 and 59. Almost half of these prime age workers had not recovered fully by 2011—that is, had not found new, well-paying jobs and regained their standard of living. A large proportion of this age group have been ‘devastated’—selling possessions, eating less, borrowing from family or friends, feeling humiliated (Zukin et al. 2011). Of men aged 55–61, only 7 per cent are underemployed; for same-age women the percentage is almost three times higher (OWL 2012: 6).
People being cut out of employability are in their prime in terms of skills and experience. At the lower quintiles, they lose their chance to save. ‘Whites experience more rapid rates of wealth accumulation than their minority counterparts during middle and later life, resulting in accelerating wealth disparities with age’, especially for women of colour (Brown and Warner 2008: iii–iv). As of 2010, more than half of households were on track to lack enough retirement income to maintain their pre-retirement standard of living (Kleyman 2012).
Midlife discrimination, although too often accepted as rational (employers save money), is illegal. Over 40, Americans are supposed to be protected by the Age Discrimination in Employment Act, enforced by the Equal Employment Opportunity Commission (EEOC). In one typical suit that went to the EEOC, an employer said to his underlings, ‘We need young blood’. The assumption is that midlifers are not tech-savvy or as quick to learn. If we misplace our car keys, employers may expect us to declare with a grin, ‘old-timer’s disease’. If employers don’t dump midlifers, they often pay us less.
The percentage of complaints based on age has been increasing steadily, from less than 20 per cent in 1997 to 25 per cent in 2008—one in four (EEOC 2011). Women sue 10 years younger than men, in their mid-40s. Plaintiffs rarely win. OWL says only 1 per cent do.
The loss of jobs is not just an effect of the Endless Recession. It is a long-term fact of an economy failing to overcome middle-ageism. Eliminating midlife workers has become a tacit business practice and a disastrous capitalist trend. As the people long called ‘the ageing Boomers’ matured, they increasingly slammed into offshoring, weakened unions, loss of jobs in manufacturing and later in white-collar work, layoffs, ‘early retirement’ and pension defaults instituted by globalizing and privatizing capitalism. The jobs are gone. Andrew Sum, professor of economics at Northeastern University, estimates that in 2012 there were six people seeking full-time work for every one opening (personal communication). Experienced people willing to work hard become another excess labour pool.
Some elites, to borrow from Emerson, ‘would not so much as part from their ice cream’ to save others from poverty and humiliation (Emerson 1929). But even those who profit from the race to the bottom should regret this waste of the nation’s human resources, the lost purchasing power and the tax revenues lost to poverty.
Societies underestimate the historic implications of degrading the midlife. As ‘boomers’ lose work, the midlife age class as a whole loses status, dismissed as future unproductive and expensive seniors. The gravest effect on our vision of the life course comes from destroying the underlying principle of seniority: that people deserve more respect and rising wages—not automatic deflations—as they grow older. Without seniority as a set of practices and values taken for granted, ageing-beyond-youth becomes a hopeless decline. ‘Progress’ becomes a hollow phrase. Middle-ageism is indeed changing what it means to be human. This tragic side of the greying of America should appal the young, because if not stopped it is their future too. And as middle-ageism becomes harder to stymie, lethal ageism becomes harder to resist.
Mainstream discussions about America’s economic woes often get twisted into polemics against the old. Both ‘ageing Boomers’ and senior citizens who vote are represented as powerful and therefore responsible—even for downward mobility and the deficit. Ignoble motives are ascribed to them: they selfishly burden ‘our’ children.
The consequent point-of-view of the ageist deficit hawks is that old people, costing too much while staying alive on Social Security, will soon cost too much trying to postpone death on Medicare. They should go cheaply by refusing medical treatment. A bioethicist, Daniel Callahan, gained notoriety for this argument in a book sternly titled Setting Limits, although he later accepted a life-saving operation at age 80. This duty-to-die campaign has moved out of bioethics where it was contested (by philosophers such as Christine Overall in Ageing, Death and Human Longevity, 2005), and into major outlets where it can influence multitudes and be supported as the least bad final solution for Medicare’s costs. In Time, for example, Joe Klein (2012) begrudges his mother her heart surgery at 80: she lived another 10 years, but the surgery cost—he tells us—$100,000. The Atlantic Monthly gave Sandra Tsing Loh (2012) space to explain ‘Why Caring for My Ageing Father Has Me Wishing He Would Die’.
Writers pity the young, in articles such as ‘Are Millennials the Screwed Generation?’, or attack Boomers as ‘The Entitled Generation’. Some exacerbate generational warfare, arguing that older people in need of life-saving measures should be permitted to die in favour of younger people. Peter Singer (2009), a well-known philosopher, argues grotesquely on the basis of chronological age that the death of a teenager is ‘a greater tragedy than the death of an 85-year-old […] If a teenager can be expected to live another 70 years, saving her life counts as a gain of 70 life-years, whereas […] saving the 85-year-old will count as a gain of only 5 life-years’. His 1:14 ratio of value is deadly.
Expensive care is often called ‘heroic’ ‘over-treatment’ when old people are the recipients, by those who forget that ‘under-treatment’ means medical neglect or disregard of patients’ wishes. Seventy per cent of patients and their families would undergo radical operations requiring Intensive Care Unit treatment again, to achieve as little as 1 month’s survival (Gillick 2001: 247 n1). Yet rationing care by age—and gender—appears common. Of 400 oncologists surveyed, 93 per cent were likely to recommend intensive therapy for a woman with stage IIA breast cancer if she were only 63, but only 66 per cent would do so if the patient was identical but 75 (Foster 2010). Another study (Woodard 2003) found the odds of not receiving chemotherapy were seven times greater for women over 65 than for women under 50. One meta-review (Dale 2003: 11) noted ‘decreasing diagnostic zeal’ toward older cancer patients and concluded that bias may be ‘a primary reason for their having poorer outcomes’ than younger patients.
Articles complaining about over-treatment usually fail to consider under-treatment a genuine problem. An article supporting doctors who ratchet back ‘avoidable’ care ends, one critic joked, ‘by a quick hand-wave at healthcare rationing, just in case you were getting concerned. “This is not, I repeat not, rationing”, said Dr Steven Weinberger of the American College of Physicians. This affords us Preview of the Future #2: Rationing will occur, but few if any will admit it’ (Brownlie 2010). You might think that dying is tough enough without putting pressure on people to do it cheaply, prematurely, and against their wishes.
Cost is in many ways a red herring (Garson and Engelhard 2009). Of the 4,600 Medicare beneficiaries who die every day, most incur low costs. Only a small percentage incur high costs, and most are in fact men between 65 and 74, with cancer. Doctors apparently believe in ‘heroic’ measures to save them. As age at death increased, cancer deaths and Medicare per capita outlays fell. Most over 85 were women (Hogan 2001). Women’s ‘survival advantage’ means many are poorer, more likely to be alone, and living longer with chronic conditions. If people refuse life-sustaining interventions at any age, they may require care of a different kind. Funding hospice care adequately may not be cheaper.
Will Congress decide those younger men shouldn’t get the ‘aggressive’ care they want, or that older women should not receive whatever care they need? It makes ethical sense for everyone concerned to give people more options, respect their choice if they want treatment, manage their pain and avoid posthumous regrets, paying whatever it takes. Perhaps if you have not helped care for someone who is living with dying, you should stay quiet on this subject.
‘Age ideology’ now includes the duty-to-die. A Nobel Prize-winning economist, Paul Krugman (2012), summarized the 2012 Republican presidential campaign goals: ‘a literal description of [the Romney-Ryan] plan is that they want to expose many Americans to financial insecurity, and let some of them die, so that a handful of already wealthy people can have a higher after-tax income. It’s not a pretty picture’. Heidi Hartmann and Martha Burk (2012: 24) explain, ‘The right wing is enjoying the fruits of 30-plus years of work to change the dominant ideology from one of shared responsibility and sacrifice to a “winner take all” society, where social supports are minimal to nonexistent’.
The campaigns waged by this ideology are well funded; their will is relentless. The excess suffering that results from indifference, malice, ignorance, or disregard for social justice is the tragic part of the age crises. Each of my foci (ageism in children, middle-ageism in work-life, scapegoating Boomers and old people, and accelerating our deaths) compels our mission.
The great demographic irony is that the sheer numbers of the adversely-affected grow, but without enabling us to organize a countervailing collective. Many people understand that sexism and racism are constructs, but age lags. Journalists are puzzlingly wedded to a fantasy in which passive ‘Boomers’ bring about enormous changes at the last minute, a ‘tsunami’ rolling richly into old age. Such phony ascriptions endorse helplessness. A young friend writes that she fears ageing. When I say, ‘Fear ageism, not ageing’, she retorts, ‘Ageism! Yes. That’s why I fear ageing!’ A new impasse is foretold, in which we become agewise but remain impotent. In the world that most people inhabit (dominated by capitalism, positivism, and ‘anti-ageing’), realizing that we are aged by our culture is still the mental step that is hardest to take.
But suppose more people take it. Ageism-consciousness has immediate goals as well as a long-term vision of deep educational reform and public resistance to decline. The jobs crisis may force government into creative ways to provide more jobs and economic security to midlife as well as younger people. As political anger grows at those who poison age culture, more parents and teachers will refuse to let children grow up ageist.
The life course, insofar as it is socially constructed, should be something that the youngest can look forward to with anticipation, that all can relish through the long midlife and frail age, and that we can look back on with some satisfaction when dying.
Some of the research for this essay was provided by my Brandeis intern, Katey Duchin.