appears to be unhappy and uncomfortable with old age,” said Patrick Cardinal O’Boyle, a retired bishop from Washington, D.C., noting that negative attitudes toward anyone who was not young had intensified over the past decade.1

      This sentiment, perhaps best captured by the popular counterculture phrase “Don’t trust anyone over thirty,” reflected the country’s rather recent aversion to aging. To be “old in the country of the young,” as Time magazine expressed it in 1970, was to feel like an outsider in one’s own home, an ironic state of affairs given that older citizens had of course been Americans longer than younger ones. Born around the turn of the twentieth century, the nation’s current crop of seniors had served on the front lines of the Depression and World War II, all the more reason why their treatment could be considered unfair and even immoral. In a social sense, at least, the so-called generation gap was decidedly in favor of those on the younger side of the fence; those on the older side were typically cast as out of tune with where the country was and where it was heading. People over sixty-five (the mandatory retirement age at the time) were deemed to have little or nothing to contribute, making many Americans loathe the very idea of aging. (The famous line “Hope I die before I get old” from the 1965 song “My Generation” by the Who could be considered an anthem of the times.) Over the course of the late 1960s and 1970s, aging came to be seen as nothing less than an enemy, something to be kept at bay or, if at all possible, wiped out entirely.

      Against this backdrop, it was not surprising that science launched a full-scale attack on aging, with some of the country’s best and brightest dedicating themselves to making old age a thing of the past. Defenders of America’s growing older population pointed out, however, that aging was a social problem rather than a biological one, and they urged seniors to fight for equal rights just as other marginalized groups had recently done with considerable success. Politicians of the time, including President Nixon, recognized the clout older citizens held and appealed to them as an important voting bloc. While legislation to help seniors live better lives was certainly welcome, changing Americans’ feelings toward aging in general was proving difficult, if not impossible. Only the gradual recognition that a large older population would one day become a major economic problem seemed to capture people’s attention during this era, as aging in America took a historic turn for the worse.

       The Curious Property

      Consistent with the thinking in the mid-1960s that social ills could be eliminated if Americans set their collective mind to it, many concluded that the best way to solve the problem of aging was simply to make it go away. Scientists enlisted in the antiaging cause in considerable numbers, seeing aging as a frontier that could first be explored and then conquered. In 1966, for example, the president of the American Chemical Society challenged fellow scientists to join him in what promised to be one of humankind’s greatest pursuits. A “fountain of youth” was just waiting to be discovered, William J. Sparks announced at the annual meeting of the society, with chemistry the means by which to realize this long-sought dream. More specifically, he explained, it was the kind of chemistry that had created plastics and synthetic rubber; Sparks was firmly convinced that human aging was caused by molecules not unlike those manipulated to produce these scientific wonders. While admittedly a controversial idea, his molecular theory was representative of the sort of bold approach that needed to be taken if science was to achieve the very real possibility of humans living much longer and much healthier lives. The study of aging as a whole was vastly underfunded and underresearched, he and others at the conference agreed, urging the federal government to devote much more money and effort to solving what was perhaps our most enigmatic puzzle.2

      Unless and until the United States government launched a full-scale war against aging, however, it was up to individual scientists, most of them industry personnel or university professors, to try to crack the code of how and why humans grew old. Extending the human life span was tricky enough, but some researchers in the field were interested only in lengthening the period of life that preceded old age, an ideal the fountain of youth had ingrained in the popular imagination. The Human Genome Project was decades in the future, but already a few scientists theorized that more knowledge about the structure of DNA could lead to the ability to manipulate cellular processes, including those having to do with aging. Others were focused on environmental factors that perhaps caused the body to eventually become, in layman’s terms, allergic to itself. What we interpret as the signs of old age were actually the physical wreckage left by antibodies that, owing to mutations in the body’s cells, had turned on and attacked their own host, researchers such as Roy L. Walford of the UCLA Medical School suggested. Preventing, reducing the number of, or repairing these mutations was the means by which to prolong the prime of life, Walford held, one of a growing number of scientists investing a good deal of time in unraveling this especially complex riddle.3

      With Cold War rhetoric lingering in the mid-1960s, it was not surprising to hear antiaging efforts expressed in aggressive, sometimes militaristic language. Also in 1966, Bernard L. Strehler of the National Institutes of Health (NIH) gave a talk at the New York Academy of Sciences called “The New Medical and Scientific Attack Against Aging and Death,” a fair representation of the kind of approach seen as needed to achieve victory. “An understanding of the biology of aging is within reach of this generation,” Strehler told the assembled scientists, a claim quite typical of the self-assured, rather audacious thinking of the time, when anything seemed possible. Breaking the current life span barrier of seventy to eighty years could be seen as analogous to other great contemporary scientific and technological feats like landing on the moon or harnessing atomic energy, with two things required to get it done: good brains and loads of money.4 At the very least, many scientists agreed, the human life span could and should be extended by a few decades. Scientists at the California Medical Association, for instance, believed that we all should be living a hundred to 120 years, barring the onset of a rare disease or an accident that prematurely cut life short.5

      Though almost entirely speculative at this point, such theories seemed feasible and promising to journalists covering the scientific beat. “It is not inconceivable that scientists someday may be able to control human life to the extent that old age and senility are all but eliminated,” wrote Harry Nelson, medical editor of the Los Angeles Times, in 1966. Nelson envisioned a normal period of childhood followed by “a 60- to 70-year-long plateau of maturity, health, and high performance,” cutting out the two decades or so of old age. (He did not explain how and why death would suddenly occur after so many years of wellness.)6 Other scientists, meanwhile, were focusing on a single aspect of physical aging, hoping that it would lead to an understanding of the overall process. Arthur Veis of Northwestern University’s medical school was intent on discovering the cause of skin wrinkles, for example, thinking that wrinkles might point the way to why the human body chose to get older in general.7 Scientists were frankly perplexed by the whole notion of aging because it contradicted what was widely recognized as nature’s most powerful instinct: to survive.

       While individual scientists at corporations and universities pursued their particular lines of research, the NIH, an agency of the federal government, stepped up its efforts to combat aging. In 1968 (nearly three decades after Nathan Shock, a pioneering gerontologist, began his intramural lab in the Baltimore City Hospitals), the NIH swung open the doors of its brand-new Gerontology Research Center in Baltimore, the largest federal facility for aging research in the United States. The goal of the center, whose research program Shock and others had founded in 1941, was to uncover “the mysteries and problems of aging that have perplexed our philosophers and scientists over the years,” said Wilbur J. Cohen, secretary of the Department of Health, Education, and Welfare (HEW), who dedicated the new building.8 Millions of dollars of taxpayers’ money were going toward winning the war against aging, locating the initiative within the public arena. Indeed, the dream of solving the aging problem went far beyond the halls of science, crossing over into popular culture in the late 1960s. A 1968 episode of The 21st Century, a television series hosted by Walter Cronkite, for example, explored the phenomenon of aging and the possibilities of prolonging human life. “Can we live to be one hundred?” Cronkite asked, visiting the NIH and the Institute of Gerontology in Kiev in the Soviet Union to try to determine the likelihood of most of us reaching the century mark. (Cronkite himself would make it to ninety-two.)