The Future of Dermatology

Recognition of skin diseases and their empiric treatment undoubtedly preceded the dawn of recorded history. It would be difficult to imagine that our preagricultural food-gathering ancestors, who surely could identify plants that were edible and nutritious and those that were toxic, were not equally capable of recognizing cutaneous abnormalities and using natural remedies for their alleviation. It would also seem reasonable that systemic disorders, unless flagrantly symptomatic or fatal, would go unrecognized. Thus, it is predictable that as human life began to be chronicled in written records, descriptions of skin diseases and their treatment would be prominent, given their ease of detection by even the untrained eye. The Ebers Papyrus and the Old Testament, as well as Hippocratic and Galenic writings, contain numerous descriptions of skin diseases and cutaneous manifestations of systemic diseases. During the next two millennia, the literature in dermatology increased, and it continues to increase at an almost exponential rate. Thus, for the medical historian, there is so much material available that writings today are much more likely to be highly selective, such as an exhaustive review of so-and-so’s disease rather than a comprehensive treatise encompassing an entire organ and its afflictions.

This is perhaps well exemplified at the annual meeting of the History of Dermatology Society (where historical vignettes are presented and discussed) as well as at the innumerable reunions at the annual meeting of the American Academy of Dermatology that provide alumni with an opportunity to reminisce about the “good old days” (historical vignettes, if you will).

The present of dermatology is about as well documented as its past. The educational programs of international, national, regional, and local academies and societies, in addition to periodicals and books, provide abundant information about the state of current knowledge in dermatology and areas worthy of further study. Thus, the past and present of dermatology, although too voluminous to be mastered in their entirety, are at least accessible.

But what of dermatology’s future? Other than plans, programs, and proposals to study or do this or that, the printed programs of major dermatology societies usually contain nary a word about the future, which presumably must be of some concern to their members. Perhaps the significant risk of being wrong dissuades any reasonable person from tackling this topic. After all, in the words of mutual fund prospectuses, “past results do not guarantee future performance.”

However, the theme of this essay is just that. I have the temerity to give it a try because I happen to know the future of dermatology or, at least, what it can or should be. I can assure the reader that it is not a manifestation of senility or the possession of a prized, particularly prescient crystal ball. Rather, it is based on the insight gained from involvement in the education of medical students, residents, and practicing physicians for more than 40 years.

Fifty years ago, when I applied for admission to medical school, acceptance was highly uncertain because of the swollen ranks of applicants whose education had been interrupted by World War II and who, by virtue of the G.I. Bill, had the financial resources to participate in a prolonged educational process that only a generation before had been available mainly to the sons and daughters of the well-to-do. Once a person was admitted, academic failure was rare, and upon graduation the choice of a specialty was virtually guaranteed regardless of a student’s class standing. If a medical school graduate wanted to be an ophthalmologist or an orthopedist or a dermatologist—or a general practitioner, for that matter—he or she became one, simply because there were at least as many approved residency positions available as there were applicants to fill them. Thus, until the past decade or so, a degree from a U.S. or a Canadian medical school was a “blank check,” with the specialty of choice to be filled in by the graduate. As a result, each specialty, on average, had as its trainees a cross-section of medical school graduates. With few exceptions, medical students who were graduated in the bottom third of the class were as likely to be represented in excellent training programs as those from the upper half or third. These were the human resources with which modern dermatology was built. And how productive they were! Beginning around 1950, there was an unprecedented crescendo of new information, both basic and clinical, that revolutionized the diagnosis and treatment of skin disease.

At the annual meeting of the American Academy of Dermatology in 1998, I was invited to give a talk at the Resident’s Colloquium. The program was entitled “The Year 2000—Emerging Trends and New Challenges for the Young Dermatologist.” It is possible that the residents in attendance represented the largest percentage of high school valedictorians, magna and summa cum laude graduates, and members of Alpha Omega Alpha at any one place and at any one time during the entire Academy meeting. Such is the demand for residency positions in dermatology that during the past decade only the upper 10% to 20% of medical school graduates stood much chance of acceptance. The potential of this group of “the best and the brightest” to contribute to our understanding of skin diseases and their diagnosis and treatment is unprecedented.

Nevertheless, this present generation of young dermatologists faces problems and challenges that would have been unimaginable 20 or 30 years ago. With the notable exception of a few spectacular advances in preventive medicine—polio immunization, for example—most of the progress in health care since World War II has resulted not in the cure or elimination of disease but in the prolongation of illness. Not so long ago, terminal renal disease was inexpensive to treat because patients did not survive it for long. With the introduction of dialysis and transplantation, however, it has become extremely costly. Multiply these costs by the economic burden of advancing therapy for cardiovascular diseases and cancer, among many others, and it is not surprising that about one seventh of the gross national product of the United States is expended on health care. It is no wonder that those who pay the bills (government, insurance companies, managed care groups, labor and industry, and patients) are anxious to reduce, or at least contain, health care costs. Absent a moratorium on research and new drug development, it is predictable that the coming years will see even greater progress in our ability to prolong illness, with its attendant rising costs. Indeed, the projected budget of the National Institutes of Health will probably increase markedly during the next 5 years, so funding for basic and clinical research will correspondingly increase. It is also highly unlikely that the major pharmaceutical companies or the myriad of recently founded niche players will curtail their search for the next breakthrough multimillion- (or billion-) dollar drug. Thus, the only practical ways to reduce or contain health care costs are to ration care or to reduce reimbursements to the physicians and hospitals that deliver it.

Given that there are far more voting patients than physicians and hospital administrators, it is likely that their legislative representatives will favor a reduction in reimbursements over rationing. Of all the functions physicians perform, cognitive activities are the ones most likely to bear the brunt of reductions in reimbursements, while procedures, whether required or elective, will continue to be more highly compensated.

These circumstances exert enormous pressure on young dermatologists who, although well trained in the cognitive aspects of the specialty, are increasingly trained to perform nontraditional, lucrative cosmetic procedures. For a young dermatologist faced with substantial debt, a young family to support, and either the start-up costs of private practice or the uncertainty of success in an academic career, it is no wonder that a commitment to cosmetic procedures is increasingly tempting.

My first prediction is that if training in cosmetic procedures begins to dominate residency education and the practice of its recent graduates, dermatology as a true specialty will cease to exist, simply because most (if not all) cosmetic procedures can be learned and performed by almost any other physician. It is not necessary to be a board-certified dermatologist to hire an aesthetician or to sell privately labeled cosmetic products in the office.

A review of the program of the most recent annual meeting of the American Academy of Dermatology provides strong evidence of how pervasive and popular cosmetics and cosmetic procedures have become. For example, although dermatologic surgeons have made many outstanding contributions to the treatment of major skin diseases—especially Mohs micrographic surgery, cryosurgery, and laser therapy of port wine stains and other significant cutaneous vascular lesions—the program contained a mere five talks on Mohs surgery, four on cryosurgery, and one on the laser therapy of vascular birthmarks. In contrast, 32 sessions were devoted to cosmetic surgical procedures, another 27 specifically to cosmetic laser procedures, and five to sclerotherapy. Medical dermatology fared as poorly: two talks on contact dermatitis, three on atopic dermatitis, three on acne, and three on psoriasis. It certainly appears that many dermatologists are far more interested in catering to how their clients would like to look than in seeing the innumerable patients with cutaneous diseases that they alone can diagnose and manage.

Despite these disparities in program content, medical dermatology continues to flourish. Not only do medical dermatologists continue to manage traditional cutaneous diseases such as eczema, acne, and warts, but they are observing an enormous increase in diseases that were unknown or unpredictable not many years ago. Virtually every patient who has HIV infection or who has undergone organ transplantation (especially bone marrow transplantation) or chemotherapy of any sort develops, at some point, one or more skin diseases, many previously undescribed. It appears that every advance in medicine leads to a new cutaneous disease. Given the thousands of new drugs under development, it is likely that many of those that eventually reach the market also will be responsible for unique cutaneous reactions. By virtue of their education and training, dermatologists have an exclusive franchise to diagnose and treat the ever-growing number of iatrogenic skin diseases, as well as the difficult task of managing more common problems (acne, psoriasis, contact dermatitis, and the like) and those less common cutaneous disorders that only dermatologists know and recognize.

A medical dermatologist also has the advantage of being a member of a team. Non-dermatologists whose treatment has produced a severe or puzzling cutaneous reaction, or who believe that a skin condition in their patient may be a clue to the diagnosis of a systemic disorder, have no recourse but to refer the patient to a dermatologist. I believe I can safely predict that these circumstances will continue, regardless of what health care delivery system develops, and that the medical dermatologist will be a valued member of the health care team.

As a result of these considerations, I foresee dermatology divided into three major parts—cosmetic, surgical, and medical. These parts will be sharply circumscribed for some, but not for the majority of, dermatologists. Medical dermatologists have always performed some surgery and would willingly prescribe a topical retinoid for a patient concerned about his or her sun-damaged face. Similarly, the Mohs surgeon will likely continue to try to devote a portion of his or her practice to cosmetic procedures rather than treating malignancies exclusively. However, given the constant rise in the incidence of skin cancer of all types, it is likely that the treatment of this disease will continue to dominate the practice of the Mohs surgeon. For the medical dermatologist and the Mohs surgeon, the future is bright indeed. The demand for their services can do nothing but increase, both because a growing number of patients will require their expertise and because more referring physicians outside the specialty will recognize it. Growing demand and the lack of significant competition provide a solid foundation for optimism. In sharp contrast, the future of cosmetic dermatology is far less secure because it is, and will become, far more competitive and will remain strongly dependent on economic factors. In a booming economy, disposable income is plentiful and elective procedures are affordable. Given a downturn, however, spider veins and wrinkles are unlikely to have a high priority in the family budget.

Overall, the future of dermatology can be bright, but only if dermatologists continue to be dermatologists and devote their efforts to patients who truly need their services.