The Research Dilemma

Louis Lasagna

One of the legacies of the national debate over the Clinton health-care plan is a new public ambivalence about the value of medical research and technology. During that debate, Americans were told over and over—and are still being told—that the ballooning national cost of health care could be traced in part to the never-ending supply of new diagnostic and therapeutic options produced by medical science: CAT scans, MRIs, surgical procedures, medicines, prosthetic replacements for dysfunctional hips and knees, organ transplants, and so on. The co-villains in this national health-care melodrama were a medical profession profligate in its approach to medical care and a greedy, obscenely profitable health-care industry. And their sins included the promiscuous and irrational use of the new techniques and technologies.

Like all melodramas, this one is not entirely removed from reality. Medical research and technology undoubtedly have contributed to the rising cost of health care. What is often forgotten, however, is that they have also spared us incalculable expense and suffering. Vaccines have eradicated smallpox from the planet, for example, and may someday eliminate poliomyelitis. Cost-benefit analyses for individual diseases show that some treatments generate savings. Continuing digitalis therapy (which is not very costly) in patients with congestive heart failure has been estimated to prevent 185,000 clinic visits, 27,000 emergency room visits, and 137,000 hospital admissions every year. The net annual savings total an estimated $406 million.

In a perfect world, we might be able to separate "good" (i.e., cost-effective) research from "bad," but it is an essential characteristic of knowledge, especially the knowledge produced by basic research, that it refuses to follow fixed paths. Peter Medawar, who won a Nobel Prize for his research on the immune system, writes that "nearly all scientific research leads nowhere—or, if it does lead somewhere, then not in the direction it started off with.... I reckon that for all the use it has been to science about four-fifths of my time has been wasted, and I believe this to be the common lot of people who are not merely playing follow-my-leader in research."

Critics who are alarmed by the large share of national wealth claimed by health care should direct their attention to a real villain: disease. Cardiovascular ills, cancer, and Alzheimer’s disease cost the United States more than $300 billion annually in medical expenses and indirect costs such as lost work time. Add arthritis, depression, diabetes, and osteoporosis, and you rack up another $200 billion. Cutting these costs has to be considered an urgent national priority.

Medical discoveries not only reduce expenses but allow the beneficiaries to continue to live productive lives—and, not incidentally, to enjoy something to which, for better or worse, no price tag can be attached: a better quality of life. Every year, there are 500,000 new cases of duodenal ulcer in the United States and four million recurrences. While ulcers may seem to those who don’t have them to be little more than a metaphor for the condition of modern life, they are quite painful—and they cost society between $3 billion and $4 billion annually in direct and indirect costs. Until very recently, doctors believed that ulcers were caused by "stress" or some mysterious form of "hyperacidity"—and that very little could be done about them. But research has shown, as a National Institutes of Health (NIH) panel concluded in 1994, that a treatable microorganism called Helicobacter pylori is responsible. The discovery will vastly improve the quality of life for millions of people in years to come—and save the United States billions of dollars.

Until the 1980s, most medical research in the United States was funded by the federal government, chiefly through NIH, but that has since changed. More than $30 billion is now spent annually on health-related research and development, and over half of that amount comes from industry, chiefly the pharmaceutical industry, with outlays of $16 billion in 1995. (Other research is carried out by manufacturers of medical devices such as heart valves.) NIH is a $12 billion enterprise composed of 17 specialized institutes, which deal with everything from neurological disorders and stroke to dentistry. It channels about two-thirds of its money to outside researchers in universities, hospitals, and other institutions. Much of the work funded by NIH is basic research, essential but without any immediate prospect of a payoff. Despite the new budgetary constraints in Washington, Congress has continued to expand NIH’s budget modestly.

Still, much of the momentum in health-care research has shifted to the private sector. The United States is a world leader in pharmaceuticals—a lead American companies maintain in part by plowing an extraordinary 19 percent of sales income into research. Thousands of chemicals are synthesized for every one tested on humans, and of the latter, only 20 to 25 percent make it to market. The time from discovery to marketing of a new drug now averages 10 to 15 years. The average cost of bringing a new drug to market is more than $300 million (counting failures and allowing for the cost of money that could have been invested elsewhere).

The rise of managed care and the new stringency in health care have begun to alter the strategy of the drug companies. A new drug today must be either a "blockbuster" or persuasively better in some way than already available drugs to win acceptance from the formulary committees at health maintenance organizations and hospitals and the pharmaceutical benefits programs that increasingly decide which drugs are bought. Drug companies now have little incentive to develop products that are only incrementally better.

After years of criticism, the Food and Drug Administration (FDA) has speeded up drug approvals somewhat (especially for cancer and AIDS drugs), but its demands for data from tests on animals and humans still inflate costs and needlessly prolong the process of getting new drugs into circulation. Congress may soon send a bill streamlining many of the FDA’s procedures to President Bill Clinton’s desk, but it is unclear whether the legislators are going to propose dramatic changes. And they may simply run out of time, leaving the matter to be dealt with after the 1996 elections.

The FDA has until recently focused primarily on its role as protector of the public health, guarding citizens from fraud and ineffective and unsafe drugs. It needs to shift its emphasis to the promotion of public health, which means in part getting new advances onto the market as quickly as possible. Approving an ineffective drug is bad, but so is rejecting or delaying approval of a drug that is effective. While Americans are often urged to look to Europe for models of national health-care systems, relatively little is said of Europe’s speedier drug regulation processes, which frequently make new treatments available to patients long before they become available in the United States.

More flexibility at the FDA, however it is achieved, is essential to the success of America’s nascent biotechnology industry. Many of the most exciting medical discoveries of the future could come from this new field. Few of the roughly 1,300 biotechnology firms in the United States have become profitable so far, largely because it takes so much time to develop and win approval of a new drug, diagnostic kit, or vaccine. Nevertheless, the industry has already created such important laboratory-made health products as recombinant proteins (human insulin and human growth hormone), erythropoietin (for anemia from various causes), and alpha interferon for hairy cell leukemia.

Many diseases are still poorly treated: most cancers, Alzheimer’s disease, multiple sclerosis, cystic fibrosis, and muscular dystrophy, to name a few. The apples picked from the research tree thus far have been those on the lower branches. Among the most exciting prospects on the higher branches is gene therapy, a technique whereby defective genes in human beings can be repaired or replaced. A number of genes responsible for inborn diseases have been identified and isolated, and the exact nature of the defects characterized. But history also teaches us humility, or at least it should. We have known the molecular basis of sickle cell anemia for half a century, but treatment remains grossly inadequate. Even for those few diseases in which a defect in a single gene is responsible, repair or replacement of the affected gene may not provide a cure.

Inevitably, however, medical research is going to present us with painful dilemmas. We now have, finally, a treatment for a very rare genetic disorder called Gaucher’s disease, which causes the body to produce a flawed version of an enzyme needed in the metabolism of lipids. The enzyme in question is expensive to produce, and treatment at launch was estimated to cost $100,000 to $300,000 per patient per year. To my knowledge, insurers have been reimbursing for this treatment, but would they if the disease afflicted not a handful of people but millions? What if gene therapy for cystic fibrosis worked, but cost $1 million per patient? Would our health-care system pay for it? What about Alzheimer’s, that cruel disease whose victims Elie Wiesel once eloquently compared to books losing their pages one by one, leaving nothing at the end but dusty covers? What if an effective therapy is discovered but is "too expensive"?

As a society that already spends 14 percent of its wealth on health care, the United States is eventually going to confront a reluctance to pay large new sums for all of the fruits of medical research. Restraining research might allow us to avoid the creation of expensive new treatments, but it would also mean sacrificing the most affordable fruits and abandoning the prospect of unexpected breakthroughs. It is a route we cannot afford. Eventually Americans will need to confront the need for rationing—not the inescapable rationing that occurs on the battlefield or in times of natural disaster, but rationing of services we can supply but for which as a society we simply are unwilling to pay. And that is a route for which we are completely unprepared.


About the Author

Louis Lasagna, M.D., is dean of the Sackler School of Graduate Biomedical Sciences at Tufts University and has for 20 years been director of the Tufts Center for the Study of Drug Development.
