Doctors, dressed up in one professional costume or another, have been in busy practice since the earliest records of every culture on earth. It is hard to think of a more dependable or enduring occupation, harder still to imagine any future events leading to its extinction. Other trades—goldsmithing, embalming, cathedral architecture, hexing, even philosophy—have had their ups and downs and times of vanishing, but doctoring has been with us since we stumbled into language and society, and will likely last forever, or for as long as we become ill and die, which is to say forever.
What is it that we expected from our shamans, millennia ago, and still require from the contemporary masters of the profession? To do something, that’s what.
The earliest sensation at the onset of illness, often preceding the recognition of identifiable symptoms, is apprehension. Something has gone wrong, and a glimpse of mortality shifts somewhere deep in the mind. It is the most ancient of our fears. Something must be done, and quickly. Come, please, and help, or go, please, and find help. Hence, the profession of medicine.
You might expect that such a calling, with origins in deepest antiquity, would by this time have at hand an immense store of traditional dogma, volumes and volumes of it, filled with piece after piece of old wisdom, tested through the ages. It is not so. Books do exist, of course, but all of them are shiny new, and nearly all the usable knowledge came in a few months ago. Medical information does not, it seems, build on itself; it simply replaces structures already set in place, like the New York skyline. Medical knowledge and technical savvy are biodegradable. The sort of medicine that was practiced in Boston or New York or Atlanta fifty years ago would be as strange to a medical student or intern today as the ceremonial dance of a !Kung San tribe would seem to a rock festival audience in Hackensack.
Into this jumpy, always revising profession there strolled, several years ago, an inquisitive anthropologist, not totally an innocent, fresh from two years of living with a !Kung San tribe in the Kalahari, with another possible book on his mind. Melvin Konner, Ph.D., a professor at Harvard, had hankered in boyhood to be a physician but had given up the idea in his undergraduate Harvard years and gone on to graduate school instead. After he had established himself as an academic expert in comparative behavior, his curiosity about medicine returned in force, and he arranged for admission to Harvard Medical School at the age of thirty-five. He does not entirely explain the move; in part, maybe large part, he did so out of professional anthropological interest: What was the tribe of medical doctors really like when viewed closely from the inside? The thought of becoming a physician, looking after sick people, was near the top of his mind, but not, as with his classmates, the only thought.
His book deals almost exclusively with the third year of school, the time when students are done with their didactic courses in basic biomedical science and are now in white coats, out on the wards of the teaching hospital, learning the laying on of hands and discovering that the hands are to be laid on equipment more often than on patients.
Early on in this year, Konner began to encounter the aphorisms on which the practice of medicine has long been based. Unlike aphorisms in general, these nuggets of perceived wisdom are made up anew by each generation, but a few of them have old histories. The best of these, which Konner recognizes for its immense importance, is a short sequence of prescriptions given to him by one of his teachers:
“If it’s working, keep doing it.”
“If it’s not working, stop doing it.”
“If you don’t know what to do, don’t do anything.”
The third of these, Konner says he later realized, is “the most difficult one by far, the one least adhered to in common medical practice, and beyond a doubt the most important.”
Of course. Indeed, it is the ambiguity arising from this plain piece of sense that is principally responsible, in my view, for most of the problems that face contemporary medicine, including the unprecedentedly bad press, the mutual mistrust and disillusionment between doctors and their patients, the escalating costs of health care and insurance, including malpractice insurance—in short, the state of crisis in which the profession finds itself today.
I take it further. The dilemma of modern medicine, and the underlying central flaw in medical education and, most of all, in the training of interns, is the irresistible drive to do something, anything. It is expected by patients and too often agreed to by their doctors, in the face of ignorance. And, truth to tell, ignorance abounds side by side with the neat blocks of precise scientific knowledge brought into medicine in recent years.
It is no new thing. In 1876, on the occasion of the country’s first centennial, a book entitled A Century of American Medicine, 1776–1876, was published. The five authors were indisputable authorities in their several fields, from the faculties of Harvard, Columbia, and Jefferson Medical College. The book is a summary of the major achievements in American medicine of the previous century. The optimistic last sentence of the book is perhaps more telling than the writers may have realized: “It is better to have a future than a past.” A very large part of the past in that century of medicine was grim.
Early on, there was no such thing as therapeutic science, and beyond the efforts by a few physicians to classify human diseases and record the natural history of clinical phenomena, no sort of reliable empirical experience beyond anecdotes. Therapeutics was a matter of trial and error, with the trials based on guesswork and the guesses based mostly on a curious dogma handed down through the preceding centuries from Galen. Galen himself (c.130–c.200) had guessed wildly, and wrongly, in no fewer than five hundred treatises on medicine and philosophy, that everything about human disease could be explained by the misdistribution of “humors” in the body. (Konner has disguised Harvard in his book, for some reason, as the Flexner School of Medicine associated with the Galen Memorial Hospital, and I do wish he hadn’t.)
Congestion of the various organs was the trouble to be treated according to Galen, and by the eighteenth century the notion had been elevated to a routine cure-all, or anyway treat-all: Remove the excess fluid, one way or another. The ways were direct and forthright: Open a vein and take away a pint or more of blood at a sitting, enough to produce faintness and a bluish pallor, place suction cups on the skin to draw out lymph, administer huge doses of mercury or various plant extracts to cause purging, and, if all else failed, induce vomiting. George Washington perhaps died of this therapy at the age of sixty-seven. Hale and hearty, he had gone for a horseback ride in the snow, later in the day had a fever and a severe sore throat, took to his bed, and called in his doctors. His throat was wrapped in poultices, he was given warm vinegar and honey to gargle, and over the next two days he was bled from a vein for about five pints of blood. His last words to his physician were, “Pray take no more trouble about me. Let me go quietly.”
Beginning around the 1830s, medicine looked at itself critically, and began to change. Groups of doctors in Boston, Paris, and Edinburgh raised new questions, regarded as heretical by most of their colleagues, concerning the real efficacy of the standard treatments of the day. Gradually, the first example of science applied to clinical practice came somewhat informally into existence. Patients with typhoid fever and delirium tremens, two of the most uniformly fatal illnesses of the time, were divided into two groups, one treated by bleeding, cupping, purging, and other athletic feats of therapy, while the other group received nothing more than bed rest, nutrition, and observation. The results were unequivocal and appalling, and by the mid-nineteenth century medical treatment began to fall out of fashion and the era known as “therapeutic nihilism” was well launched.
The great illumination from this, the first revolution in medical practice in centuries, was the news that there were many diseases that were essentially self-limited. They would run their predictable course, if left to run that course without meddling, and, once run, they would come to an end and certain patients would recover by themselves. Typhoid fever, for example, although an extremely dangerous and potentially fatal illness, would last for five or six weeks of fever and debilitation, but at the end about 70 percent of the patients would get well again. Lobar pneumonia would run for ten to fourteen days and then, in lucky, previously healthy patients, the famous “crisis” would take place and the patients would recover overnight. Patients with the frightening manifestations of delirium tremens only needed to be confined to a dark room for a few days, and then were ready to come out into the world and drink again. Some were doomed at the outset, of course, but not all. The new lesson was that treating them made the outcome worse rather than better.
It is difficult to imagine, from this distance, how overwhelming this news was to most physicians. The traditional certainty had been that every disease was aimed toward a fatal termination, and without a doctor and his energetic ministrations, or barring miraculous intervention by a higher force, all sick people would die of their diseases, whatever they were. To recognize that this was not so, and that with rare exceptions (rabies the most notable one) many sick people could get well by themselves, went against the accepted belief of the time. It took courage and determination, and time, to shake off the old idea.
Looking back over the whole embarrassing record, the historians of that period must be hard put to it for explanations of the steadily increasing demand, decade after decade, for more doctors, more clinics and hospitals, more health care. You might think that people would have turned away from the medical profession, or abandoned it. Especially since, throughout the last half of the nineteenth century and the full first third of this one, there was so conspicuously little that medicine had to offer in the way of effective drugs or indeed any kind of technology. Opium, digitalis, quinine, and bromides (for the “nerves”) were the mainstays. What else did physicians do during all those years that kept their patients calling and coming?
Well, they did a lot of nontechnology, and it was immensely effective. Mainly, they made diagnoses, explained matters to the patient and family, and then stood by, taking responsibility. To be sure, there were skeptics and critics all around, but they had always been around. Montaigne wrote bluntly, concerning doctors, “I have known many a good man among them, most worthy of affection. I do not attack them, but their art. It is only fear of pain and death, and a reckless search for cures, which blinds us. It is pure cowardice that makes us so gullible.” Molière made delightful fun of doctors in his century. Dickens had some affection but no great respect for the doctors, most of them odd, bumbling eccentrics, who turned up as minor but essential characters in his novels. Shaw was a scathing critic of medicine and its pretensions, clear into modern times.
But the public regard, and loyalty, somehow held. It is exemplified by a memorial tablet in the north wall of St. James’s Church in Piccadilly, in honor of Sir Richard Bright (1789–1858), the discoverer of the kidney disease that still bears his name, and a not atypical Harley Street practitioner during the period of transition from the try-anything to the just-observe schools of medicine. The plaque reads, in part:
Sacred to the memory of
Sir Richard Bright, MD DCL
Physician Extraordinary to the Queen
He Contributed to Medical
Science Many Scientific Discoveries
And Works of Great Value
And Died While In the Full Practice
of His Profession
After a Life of Warm Affection
Unsullied Purity
And Great Usefulness
This is what nineteenth-century people expected their doctors to be, and believed most of them were in real life. The expectation survives to this day, but the reality seems to have undergone a change, in the public mind anyway.
There are many very good physicians around, as gifted and sought after as Bright was in his time, unquestionably better equipped by far to deal with life-threatening illnesses, trained to a level of comprehension of disease mechanisms beyond any nineteenth-century imagination, but “warm affection” and “unsullied purity” have an anachronistic sound these days, and even “great usefulness” is open to public questioning. The modern doctor is literally surrounded by items of high technology capable of preventing or reversing most of the ailments that used to kill people in their youth and middle years—most spectacularly, the bacterial and viral infections chiefly responsible for the average life expectancy of less than forty-five years in Bright’s day. But medicine’s agenda still contains a long list of fatal and incapacitating diseases, mostly the chronic disabilities of older people, and there is still no technology for these, not even yet a clear understanding of their underlying mechanisms.
The unequivocal successes include military tuberculosis, tertiary syphilis of the brain and heart, poliomyelitis, the childhood contagions, septicemias, typhoid, rheumatic fever, and valvular heart disease, and most of the other great infectious diseases, now largely under control or already conquered. This was the result of the second big transformation in medicine, starting about fifty years ago with the introduction of the sulfonamides, penicillin, and the other antibiotics, gifts straight from science. The revolution continues in full force, thanks to what is now called the “biological revolution,” but it is still in its early stages. With new technologies of fantastic power, such as recombinant DNA and monoclonal antibodies, disease mechanisms that were blank mysteries, totally inaccessible, just a few years back are now at least open to direct scrutiny in detail. The prospects for comprehending the ways in which cancer works, as well as other illnesses on what is becoming a long list, are now matters of high confidence and excitement among the younger researchers within the universities and in industrial laboratories.
But the future is not yet in sight, and medicine is still stuck, for an unknowable period, with formidable problems beyond the reach of therapy or prevention. The technologies for making an accurate diagnosis have been spectacularly effective, and at the same time phenomenally complex and expensive. This new activity is beginning to consume so much of the time of the students and interns, and the resources of the hospital in which they do their work, that there is less and less time for the patient. Instead of the long, leisurely ceremony of history taking, and the equally long ritual of the complete physical examination, and then the long explanations of what has gone wrong and a candid forecast of what may lie ahead, the sick person perceives the hospital as an enormous whirring machine, with all the professionals—doctors, nurses, medical students, aides, and porters out in the corridors—at a dead run. Questionnaires, fed into computers along with items analyzing the patient’s financial capacity to pay the bills, have replaced part of the history. Blood samples go off to the laboratory, and the CAT scan and Nuclear Magnetic Resonance machines are relied upon as more dependable than the physical examination.
Everyone, even the visitors, seems pressed for time; there is never enough time; the whole place is overworked to near collapse, out of breath, bracing for the next irremediable catastrophe—the knife wounds in the Emergency Ward, the flat lines on the electroencephalogram, the cardiac arrests, and always everywhere on every ward and in every room the dying. The old Hippocratic adage, “Art is long, Life is short,” is speeded up to a blur.
It was into this environment that Dr. Konner stepped for his third-year training, and, as an already-trained anthropologist, he took notes. The leaders of the tribe, the elders, to his considerable surprise, were the interns, only two years ahead in their training, and just beyond them the residents. Almost all the medicine he would learn in that year would be taught to him by these young, harried and hurried, often bewildered neophytes, and his annotations at this stage of his education were touched frequently by some bitterness, even subdued rancor. As sometimes happens in anthropology, the professional observer doesn’t much like the members of the society he has joined, and as also happens (one thinks of Colin Turnbull and his year among the Iks) they don’t much like him.
But personal conflicts aside, the deeper problem encountered by this student was the frenetic hyperactivity of everyone around him. His few really delightful experiences were on the one service where time held its own clock, and where the outcome was almost always decisive and satisfying: the obstetrical wards. He adored delivering babies; it is a pleasure to read of his pleasure. He might have liked surgery as a craft, but couldn’t abide most of the young aggressive surgeons, or the astonishing vulgarity of their jargon. He felt marginally at home among the psychiatrists, but was not much impressed by what they could do for their patients, especially those with severe psychoses.
The biggest difference between Konner’s clerkship in 1983–1984 and those of the students I remember from earlier decades—all the way back fifty-one years to my own third year at Harvard—seems to be the near total absence of senior physicians. As I recall, they were always near at hand, in and out of the wards, making rounds at all hours, displaying for the students’ benefit the complete repertoire of seasoned, highly skilled doctors. Where on earth were these people in Konner’s Harvard?
He mentions only a few such figures, and they seem to have left little impression on him; he does not write about their presence the way a young student would describe his role models. The real instructors were the interns and residents, and the professors were off somewhere else. This aspect of the book puzzles me. If Konner’s memory is accurate, medical school has changed a lot in recent years.
And perhaps, to some degree anyway, it has. Everyone is too busy, urgently doing something else, and there is no longer enough time for the old meditative, speculative ward rounds or the amiable conversations at bedside. The house staff, all of them—interns, residents, junior fellows in for the year on NIH training fellowships—are careening through the corridors on their way to the latest “code” (the euphemism for the nearly dead or the newly dead, who too often turn out to be, in the end, the same), or deciphering computer messages from the diagnostic laboratories, or drawing blood and injecting fluids, or admitting in a rush the newest patient. The professors are elsewhere, trying to allocate their time between writing out their research requests (someone has estimated that 30 percent of a medical school faculty’s waking hours must be spent composing grant applications), doing or at least supervising the research in their laboratories, seeing their own patients (the sustenance of a contemporary clinical department has become significantly dependent on the income brought in by the faculty’s collective private practice), and worrying continually about tenure (and parking). About the only professionals who are always on the wards, watching out for the unforeseen, talking and listening to the patients’ families, are the nurses, who somehow manage, magically, to hold the place together for all its tendency to drift toward shambles. I wish Konner had paid more attention in his notes to the central role of nurses in the modern teaching hospital; they deserve more respect and attention from everybody (and much more pay).
Konner finished medical school and wrote his book, and that seems to have been that. He decided against an internship, and is back in place as a professor of anthropology, evidently content. However, he does not have prescriptions to offer for fixing the things he perceives as wrong in medical education. Nor do I, except for two proposals, more like obsessive wishes for the future than a recipe for the present. My first hope is for removal of substantial parts of the curriculum in the first two years, making enough room for a few courses in medical ignorance, so that students can start out with a clear view of the things medicine does not know. My second hope is for more research into the mechanisms of that still-unsolved list of human diseases. The trouble with medicine today is that we simply do not know enough; we are still a largely ignorant profession, faced by an array of illnesses that we do not really understand, unable to do much beyond trying to make the right diagnosis, shoring things up whenever we can by one halfway technology or another. (The transplantations of hearts, kidneys, livers, and lungs are the only measures available when we lack any comprehension of the events responsible for the prior destruction of such organs.) A great deal of the time and energy expended in a modern hospital is taken up by efforts to put off endgame.
We will be obliged to go on this way and at steadily increasing expense, as far as I can see, until we are rid of disease—at least rid of the ailments that now dominate the roster and fill the clinics and hospitals. This is not asking for as much as one might think. We will never be free of our minor, self-limited ills, nor should we be planning on postponing dying beyond the normal human span of living—the seventies and eighties for most of us, the nineties for the few more (or less) lucky among us. But there is a great deal we will be able to do as soon as we have learned what to do, both for curing and preventing. It can never be done by guessing, as the profession learned in earlier centuries. Nor can very much be changed by the trendy fashions in changed “life styles,” all the magazine articles to the contrary; dieting, jogging, and thinking different thoughts may make us feel better while we are in good health, but they will not change the incidence or outcome of most of our real calamities. Everyone should stop smoking; but we are still obliged, like it or not, to rely on science for any hope of solving such biological puzzles as Alzheimer’s disease, schizophrenia, cancer, coronary thrombosis, stroke, multiple sclerosis, diabetes, rheumatoid arthritis, cirrhosis, chronic nephritis, and now, topping the list, AIDS. When we have gained a clear comprehension, in detail, of what has gone wrong in each of these cases, medicine will be earning its keep.
September 24, 1987