companies. If anything, one would expect people of different ages to be eagerly welcomed into organizations as an expression of diversity—a prime initiative of human resource departments. Yet discrimination against older people in the workplace is commonplace (and illegal), a product of our deeply embedded aversion to people considered past their prime. “It would be awkward and embarrassing to have an older person work for me,” younger friends of mine have explained when I have tried to understand the underlying reasons for ageism in the workplace. True, perhaps, but white people got used to working alongside black people, and men alongside women, making age the only remaining demographic criterion by which it is considered acceptable to discriminate (often in the name of something like “overqualification”). Imagine the legal and social consequences if millions of American employees casually mentioned their discomfort in having to supervise or work for an African American, a woman, a Latino, or a gay or disabled person! A huge class-action suit would result if the kind of bias corporate America shows toward older people were based on a job applicant’s gender, race, or some other attribute of identity, something a clever lawyer might want to think about.

      All expressions of ageism are the natural result of aging being seen in American culture as a state of decline, the downward side of the curve of life. Despite laudable attempts by AARP and some “pro-aging” advocates, the years following the age of fifty or perhaps sixty are commonly considered a kind of existential purgatory between the end of one’s active life and death. Older people are generally deemed weaker, less attractive versions of their younger selves, a terrible and simply untrue characterization. It is easy to see how seniors come to be viewed as little more than slow-walking, bad-driving, hard-of-hearing, Matlock-watching citizens. (Studies show that ageism and negative attitudes toward older people are present in young children, and that these attitudes are difficult to change by the time those children become tweens.)4 Hollywood has been especially unfriendly toward older people, either portraying them as comic foils or ignoring them completely. This attitude has reinforced cultural stereotypes related to aging and has lowered older people’s own sense of self-worth. Without a clear appreciation for what aging is and how it could be something to embrace rather than deny or ridicule, we may be headed toward a social crisis during the next few decades. Aging should be viewed in positive terms, and older people should be considered valuable parts of society. I urge fellow baby boomers to start telling that story while we still have the chance.

      There is a rich literature devoted to aging in the United States, of which space here allows only a cursory review. While leading figures in the field of gerontology may have tended to ignore its cultural dimensions, they were instrumental in forging a body of work that helps to frame this study theoretically. The legacy of Matilda White Riley is an especially important one, as she perhaps more than anyone else understood the value of bringing an interdisciplinary perspective to the field. From the late 1960s until her death in 2004, Riley, often in conjunction with her colleague and husband, Jack Riley, “presented a compelling vision of the need for other disciplines to consider the role of social forces in shaping both aging as an individual, lifelong process and age as a feature of culture and social systems,” as Dale Dannefer, Peter Uhlenberg, Anne Foner, and Ronald P. Abeles expressed it soon after her death. Gerontologists from many disciplinary backgrounds were influenced by her work, something all the more remarkable given that she did not begin to study aging until her mid-fifties.5

      It is also difficult to overestimate the contribution of physician and gerontologist Robert Butler to our understanding of aging in America. Butler, who began his career in the 1950s and died in 2010, was the founding director of the National Institute on Aging (NIA), where he made Alzheimer’s disease a primary area of research. He was also the first chair of a geriatrics department at an American teaching hospital (Mount Sinai in New York City). He coined the term “ageism” after observing the lack of respect shown in medical schools to elderly patients and their conditions, a theme that heavily informed his 1975 book Why Survive? Being Old in America. In 1963, he published a paper entitled “The Life Review: An Interpretation of Reminiscence in the Aged,” which, without exaggeration, redirected the trajectory of gerontology in this country. Via a “life review,” elderly people were offered the rare opportunity to look back on their personal past and see their “life course” in a larger context. Gerontologists found that such “memory work” proved to be a beneficial psychological process offering significant therapeutic value, so much so that mental health experts and social workers adopted the approach in their own practices. It is easy to see how conceiving of one’s life in narrative terms, with a beginning, middle, and end, rather than as a more or less random series of events, can help an older person make sense of his or her time on the planet, something that Butler keenly recognized. Butler was a “visionary of healthy aging,” wrote historian W. Andrew Achenbaum in his fine biography of the man, one who devoted his own life to improving the lives of older adults.6

      Peter Laslett, an English historian, also contributed greatly to the field after he retired from teaching in the early 1980s. Although he devoted much of his career to British political history, Laslett turned his attention to aging later in his own life. His concept of the Third Age and the Fourth Age, as outlined in his 1989 book A Fresh Map of Life, is as relevant and useful as ever. As his model makes clear, it is important to draw distinctions among older people, as there is great variation within the population based on individuals’ respective mental and physical health. Laslett posited that the Third Age is typically one of activity and fulfillment, while the Fourth Age is one of dependency and frailty, a major difference in how aging is experienced and perceived.7

      Laslett’s supposition speaks directly to the cultural dynamics of aging in the United States today. Americans are inclined to lump all older people together into one group, just one example of how aging is often overgeneralized, misunderstood, and misinterpreted in contemporary society. As Laslett’s theory implies, it is important to distinguish Fourth Agers from baby boomers, as the latter have substantially different health care and economic needs than the former. Likewise, psychologist Bernice Neugarten (a “giant” in the field in her own right) proposed a pyramid of aging composed of the “young-old, old-old, and oldest old,” which also offers a constructive way to segment the population.8 Finally, in their Rethinking Old Age, British sociologists Paul Higgs and Chris Gilleard echoed Laslett’s idea of a Fourth Age while emphasizing its serious social consequences, something the present book also attempts to do.9

      While this work is a cultural history of aging in the United States, it is impossible to ignore the universality and timelessness of the subject. Many of the great minds of their day offered key insights regarding aging that are still relevant, ranging from Cicero’s view of getting older as a natural part of life to Francis Bacon’s dream of eliminating disease and perhaps even death. In his Aging in World History, David G. Troyansky offered a whirlwind tour of aging through time and space, beginning with how hunters and gatherers understood getting older, then moving to the concept of old age in classical civilizations and to the role of later life in the Middle Ages and during the Renaissance. Troyansky traces how the modern concept of aging emerged in Europe and North America, leading to its construction as a social problem in the nineteenth and early twentieth centuries. Learning how a cross-section of civilizations over thousands of years has interpreted getting older, not to mention the wisdom of the likes of Aristotle, Socrates, Plato, and Shakespeare, is not only fascinating but has the unexpected effect of making those encroaching physical signs of aging a bit less vexing for a middle-aged reader.10

      A brief overview of aging in the United States from the beginnings of the country through the post–World War II era does much to show how we arrived at what today is arguably a troubling situation. We now take our youth-oriented culture as a given, but this was not always the case. From the seventeenth through the early nineteenth centuries in America, people who lived a long life were venerated, their advanced age seen as divinely ordained. “Old age was highly respected in early America, perhaps in part because it was comparatively rare,” wrote David Hackett Fischer in his Growing Old in America, with just 20 percent of the population living to be seventy years old in 1790. This began to change soon after the American Revolution, however, as the first Americans to be born in the new country distinguished themselves from those who had immigrated to the colonies. By the turn of the nineteenth century