The tragic Ebola epidemic in Africa, and its echoes on American shores, has filled the news for weeks now. Several US-based patients have been flown back from Africa, where the mortality rate for Ebola approaches 70%, to be treated in the United States, where the mortality rate is quite low when aggressive treatment is started early. An article in the Wall Street Journal this past August described physicians contracting Ebola in Africa because of a simple shortage of latex gloves. Reading that article got me thinking about personal protection in medical situations at the start of my career, and about how much things have changed since my days in medical school.
Hungarian physician Ignaz Philipp Semmelweis (1818-1865), known as “The Savior of Mothers,” was a pioneer of antiseptic methods. In the 1840s, in the obstetric hospital where Semmelweis practiced, the rate of maternal mortality from “childbed fever” was three times greater when women were delivered by physicians than by midwives. This attention-grabbing statistic is attributable to an even more astonishing fact, one that Semmelweis discovered after conducting lengthy investigations: midwives washed their hands before delivering babies and physicians did not. The established medical community did not accept Semmelweis’ premise of antisepsis when he reported the results of his study in 1847, and he was widely ridiculed for his theories and eventually forced to resign from his position in the obstetric hospital. His wife had him committed to an asylum soon after, at age 47, because of his “deranged” ravings regarding his findings on germs. He was beaten by the asylum guards some 14 days after his committal and, ironically, soon died from infection from those wounds. So I guess the dissemination of what we now consider to be basic sanitation habits didn’t result in happy endings for everyone!
Today, hand washing is as routine in a hospital as sterilizing instruments. It protects the patient from contracting bacteria or viruses from the other patients that the doctor or nurse has recently examined, as well as from all the germs carried into the hospital from the outside world. Physicians and nurses, in turn, wear non-sterile gloves when examining a patient to protect themselves from the critters carried by the patient. But that has been a surprisingly recent advancement in medical practice.
In the winter of 1975, I was taking Human Anatomy during the first semester of my first year of medical school. The privilege of being able to dissect a human body, a person who had so recently been someone’s mother, father, daughter, or son, was impressed upon us at the first session of anatomy class. We were instructed to recognize the sanctity of these bodies and treat them with utmost respect and, for the most part, we did. This is not to imply that any group of medical students qualified for sainthood and, of course, there was the occasional inappropriate joke, but we really were, on the whole, quite respectful. We were divided into groups of four students per cadaver, and we took turns dissecting, watching, taking notes, and commenting. Some groups of students did name their cadavers, and I guess calling your cadaver “Susie” probably qualified as being a bit disrespectful, but it did help lighten the emotional load of facing more dead bodies than most students could have ever imagined. Anatomy lectures occurred every morning for 50 minutes, and we were in the lab for three hours, three times each week, dissecting. That’s a lot of time spent with dead bodies.
The first day I walked into the anatomy lab was a special day, the day I remember feeling I had crossed the line from being a non-medical person to becoming a medical insider. Seeing that cadaver for the first time was the beginning of my learning to separate myself emotionally from the patient. To understand how necessary this process is, imagine how difficult it would be to dissect a body while thinking about what this person had been like during their lifetime, who their family was, and the simple fact that a very short time ago they were cheering for their favorite sports team or telling someone how much they loved them. The process of establishing distance between doctor and patient really began, for me, that day I walked into the anatomy lab.
The lab was a large room containing uniform rows of wheeled tables holding coffin-size stainless steel boxes, six boxes across the room by ten rows down its long side, with about five feet between the tables, just enough room for students to maneuver around all sides. Bright fluorescent lights glowed over each box, and the standard-issue hospital linoleum tiles added to the room’s chill sterility. If the room had held tables surrounded by small chairs instead of loaded stainless steel boxes, it would have looked exactly like the lunch room I ate in when I was in elementary school, and that grotesque layering of images hit me as I gazed upon those boxes, knowing full well what lay in wait inside each one. It was an impressive visual, and there was a palpable aura of fear and excitement among all of us as we contemplated seeing the body that we would soon be dissecting down to its smallest detail over the next five months. But even more breathtaking than the sight of these stainless steel boxes was the smell. The acrid, unforgettable smell of formaldehyde was overwhelming, sickening, nauseating, and it was inescapable. How would we ever get used to this awful smell? We walked slowly to our assigned spots, opened the lid, and saw a formaldehyde-soaked cloth covering a body. There was a crank on the end of the box, very similar to the crank that we would become accustomed to using at the end of hospital beds. That crank was used to raise and lower our patients’ heads or feet. This crank was used to raise and lower our cadaver out of its formaldehyde bath. One of my foursome slowly and silently cranked the body out of the fluid and rolled down the cloth and, for many of the medical students, it was the first time they had seen a dead body. (I had already seen more than my share of dead bodies, as I had worked for five months before medical school at Atlanta’s Grady Hospital Burn Unit, but that is a story for another blog.)
After a long soaking in formaldehyde, a human body turns an unmistakably awful grayish color and looks more like a zombie from a B movie than an actual person. But every one of us knew we were facing a real human being who had once been alive and was now dead. That first day was one of the last times we would look at this former person and feel a human spirit, an actual kinship, a measure of shared humanity. After that day, we saw little beyond networks of muscles, ligaments, nerves, and organs lying in that box before us, and from then on, we worried more about our upcoming examinations or understanding complex disease patterns than we did about the humanity of that former person. We spent countless hours dissecting and memorizing all the bits and pieces that the human body contains, and we spent countless days, weeks, and months inhaling formaldehyde and going back out into the world reeking of it.
I was chosen to take up the scalpel and make the first incision on my group’s cadaver as we dissected the neck looking for the brachial plexus, a large network of nerves that extends from the neck into the upper chest. I was both proud and apprehensive as I sliced into the skin. I remember us students talking about our not wearing gloves as we dissected, not out of any concern about catching diseases or even actually touching the innards of a dead human being, but because of the awfulness of the formaldehyde. For the next five months, the skin of my hands was constantly crinkled as if I had stayed too long in the pool, but this phenomenon was not from anything as fun as swimming; rather, it was from the hours of touching a formaldehyde-soaked cadaver with no protective barrier. No matter how much I washed, I could not get the odor of formaldehyde off my hands. Out, damned spot! Out, I say! But like Lady Macbeth, I washed and washed, to no avail. When I ate sandwiches, I could both smell and taste the formaldehyde that resisted every brand of soap I tried. But can you imagine, especially from the perspective of today, that no gloves were provided to us? As impoverished peanut-butter-eating medical students, very few of us could afford to buy our own. Our teacher suggested that we lather our hands with liquid soap, thus creating a barrier between our skin and the formaldehyde. While some of my colleagues did this, I found it impossible to accurately handle a scalpel with slippery, soapy hands. To the best of my knowledge, in my dissection year of 1975, nobody got infected from their cadaver.
It’s not possible even to imagine today’s medical students not wearing gloves during anatomy lab. The first thing I do after washing my hands upon entering many patients’ rooms is put on a pair of gloves, especially when the patient has an infection involving MRSA or other drug-resistant bacteria. This procedure seems to work quite well to prevent the spread of these infections from patient to patient. But back in the day, those really not-so-long-ago days, we rarely wore gloves to protect us. Not when we examined the inside of a patient’s mouth, not when we drew blood or gave injections, not when we changed dressings, and not when we dissected a human cadaver. (The only times we always wore gloves were for pelvic or rectal examinations.) My, how times have changed! And in such a relatively short period of time! Commonplace then, unthinkable now.