State of Nineteenth-Century Medicine
Nineteenth-century medicine was ill-prepared to deal with war. Although several important advances had occurred since the American Revolution (1775–1783), the basic cause of disease remained elusive. The role of bacteria in infection was yet to be defined and the availability of antibiotics to treat infections was far in the future. Some common symptoms, such as fever, were even assumed to be diseases themselves. Miasmas, which were thought to emanate from decaying matter, were blamed for causing most illnesses, and purging the body of these miasmas was considered essential to restoring a patient’s health. This could be accomplished by inducing the patient to vomit, urinate, and sweat profusely or by administering strong purgatives or laxatives.
Doctors considered it appropriate that these treatments also depleted the strength of their patients because fever, rapid pulse, and flushed appearance were considered signs of dangerous overstimulation. Unfortunately, most chronically ill soldiers were already weakened by exposure to the elements, inadequate diet, physical and mental stress, and, above all, dehydration from chronic diarrhea. The miasma theory did, however, have one beneficial effect. Physicians believed that low, swampy areas were a primary source of illness, and the soldiers, as a rule, avoided these areas when selecting campgrounds. This reduced their exposure to mosquitoes, which, though not known at the time, actually did carry diseases. Other insects such as flies also spread disease by contaminating food and causing chronic diarrhea. Soldiers and surgeons alike looked forward to the first frost in the fall, which would drastically reduce the incidence of “camp fever” and malaria.
Some advances in medicine were made in the decades before the war, which proved to be very beneficial to both sides. Opium and its derivatives, laudanum and paregoric, were still used for pain, but the decades before the war saw the manufacture of morphine sulfate, which proved to be a much more effective pain reliever. Ether, a common solvent, was accidentally found to have painkilling properties when its vapors were inhaled. It was adopted by the dental profession in the decades before the war and was found to be safe and effective. Other volatile fluids were tested to see if they had a similar effect and this led to the discovery of the anesthetic properties of chloroform. Chloroform was used many thousands of times by the British during the Crimean War (1853–1856), and was also quite successful. During the Civil War, chloroform was the anesthetic agent of choice because it was less volatile (making it easier to transport and store) and less likely to explode in the presence of a lantern or candle.
Another significant advance, in the decades before the war, was the extraction of the active ingredient quinine sulfate from the bark of the Peruvian cinchona tree. The Peruvian bark had been used for centuries to treat “intermittent fever,” or malaria. The availability of quinine sulfate provided a safe, predictable, and reliable method of both preventing and treating intermittent fever. Since most of the Atlantic and Gulf coasts in America were areas where malaria was prevalent, the use of quinine sulfate saved many thousands of lives.
The last significant development to occur prior to the Civil War was vaccination for smallpox. Prior to the work of the English physician Edward Jenner with cowpox, the actual smallpox virus was used to inoculate patients, producing a mild form of the disease and conferring a natural immunity. With Jenner’s research, first published in 1798, this could be done more safely with the cowpox virus. The success of vaccination and the isolation of smallpox cases prevented this ancient scourge from becoming a significant problem for Civil War soldiers. Most other contagious diseases, especially erysipelas (a streptococcal infection) and gangrene, were also controlled to a great extent with isolation techniques.
Problems with Civil War–era Treatment
Still, at the time of the Civil War, most medical treatment was not only unhelpful but could actually be harmful. This was partially mitigated by the fact that certain inappropriate treatments were applied much less vigorously than in the seventeenth and eighteenth centuries. For instance, physicians considered “blood letting” to be a proper course of treatment only under a few specific conditions associated with congestion; it was not used at all in treating wounded men, who frequently had already lost a great deal of blood. Harsh laxatives, sometimes called “drastics,” were often used and included heavy-metal salts such as mercury chloride (calomel and “Blue Mass”). Ironically, an effective treatment for chronic diarrhea—consisting of a diet limited to clear liquids and the use of opiates to slow intestinal contractions—was available and sometimes used in conjunction with other, inappropriate measures. Some surgeons appear to have learned from personal experience and gradually shifted to regimens favoring the more appropriate treatments.
In addition, soldiers faced other natural hazards, including insect- and parasite-related diseases, lightning strikes, snakebites, and drowning. Certainly the most common natural hazards were the extremes of temperature associated with the change of seasons. During the hottest months, when campaigning was most active, soldiers frequently were incapacitated and occasionally died from heatstroke. During the coldest months, cold-related injuries and even death by freezing were not uncommon. This can be attributed primarily to inadequate clothing and shelter. Although this was most common in the Confederate army, there were times in both armies when the supply system proved inadequate.
Dietary problems also occurred because soldiers’ meals consisted primarily of hard bread and some form of preserved meat. Union troops were consistently issued vegetables to prevent scurvy, now known to be caused by a vitamin C deficiency. The Confederate troops could usually procure similar dietary supplements by foraging or by paying exorbitant prices to camp merchants. This system worked well until the last year of the war when the Virginia countryside was picked bare of foodstuffs. As a result, cases of scurvy increased, as did the mortality rates of Confederate troops who underwent surgery.
The only other well-documented vitamin deficiency was night blindness, which is caused by a deficiency of vitamin A. Nineteenth-century surgeons had no idea what caused night blindness, but one theory held that it was caused by sleeping outdoors with the eyes open and exposed to moonlight (hence the popular term “moon blindness”). At night regiments were sometimes forced to march with the soldiers placing their hands on the shoulder of the person ahead of them because their vision was so impaired.
Wartime Disease
There were at least twice as many deaths from disease as from combat-related injuries during the Civil War. However, this ratio varied considerably from year to year. In 1861, deaths from disease exceeded those from combat-related injuries by at least 12 to 1, a rate similar to that during the Revolutionary War. Few large battles were fought that year, which kept combat deaths low, while life in camp fostered the conditions for disease. Meanwhile, the majority of the recruits, especially in the South, were from rural areas and had never been exposed to childhood diseases such as measles and mumps. In young adults, these diseases and their complications are often fatal, especially when combined with the decreased resistance associated with an inadequate diet and chronic exposure to the elements.
During the warm months of the year, bacterial diseases consisted mainly of intestinal infections from contaminated food and water. Flies contaminated food, causing infectious diarrhea, or “camp fever,” while mosquitoes spread malaria, or “intermittent fever.” Both of these scourges virtually disappeared during the winter months, except for a persistence of low-grade infectious diarrhea caused by direct contamination of the water supply from improperly placed latrines and poor camp hygiene. Antebellum U.S. Army regulations took into account the association of filth and disease and called for the proper location and maintenance of latrines. Inexperienced officers were usually unaware of the need to keep their camps clean, and their men tended to ignore such instructions even when given. Over time, however, the connection between cleanliness in camp and a lower rate of illness became obvious and soldiers adjusted accordingly. Aside from low-grade diarrhea, the winter months saw colds, coughs, sore throats, and pneumonia made fatal by the lack of effective treatment.
Still, by the latter part of the war, both Union and Confederate soldiers enjoyed generally good health. The weak had died or gone home, while the remainder developed natural immunities either by surviving disease or by being vaccinated. Ironically, Confederate soldiers benefited from their inability, because of the blockade, to procure harsher medications. By relying on readily available botanicals, they enjoyed similar benefits with far less drastic side effects.
Hospitals
The total absence of specific treatments for virtually all diseases, except malaria, meant that any positive results would be due to the ability of the Union and Confederate governments to provide their wounded with shelter from the elements, a nourishing diet, and a relatively clean environment. To achieve this, they established makeshift hospital systems by utilizing existing tobacco warehouses, public buildings, and the few hospitals that already existed. These systems eventually evolved to include huge complexes of pavilion-style hospitals.
Wounded or ill soldiers, especially those who simply needed a few days’ rest before returning to duty, were seen in modest facilities near camp. More serious cases were transferred to a building or a collection of tents designated as a general hospital, where they would be cared for up to ten or twelve days. Longer-term patients were transported by water or rail to cities where large, fixed general hospitals were established. Because most of coastal Virginia quickly fell under Union control, transportation of Confederate casualties was primarily by rail. Cities and towns along these rail lines became hospital centers. These included Richmond, Petersburg, Charlottesville, Gordonsville, and Liberty (now Bedford), as well as numerous smaller towns in the upper (or southern) Shenandoah Valley along the Virginia and East Tennessee Railroad. Union casualties were evacuated, primarily by water, to cities in the North. Important Union hospital centers in Virginia included Fort Monroe, Hampton and Portsmouth in the Hampton Roads area, and Alexandria.
The Confederate medical centers, most notably Richmond, soon established large pavilion-style hospitals. They consisted of a hundred or more independent wooden barracks that were kept well ventilated and drained, and could be easily isolated in case of disease outbreak or fire. The hospitals purchased foods locally; maintained vegetable gardens, herds of dairy cows, and ample supplies of fresh water; and even boasted icehouses and breweries. Ultimately, the patients who recovered did so because the hospital removed them from the environment that had contributed so much to their illness. Two of these large hospitals, Chimborazo and Winder, are still considered to be among the largest ever constructed in the Western Hemisphere.
Battlefield Casualties
The treatment of the battlefield casualty began on the battlefield itself, where the regimental assistant surgeons established aid stations close to the fighting. The wounded made their own way or were carried by litter. Initial treatment consisted of arresting bleeding, splinting broken bones, and administering stimulants and pain relievers, all aimed at improving the chances that the patient would survive until he could be treated at the larger division hospitals. The assistant surgeons and two enlisted men from each company provided this initial transportation and care, and, although there is no evidence that they were fired on intentionally, they were frequently wounded or killed while performing these duties.
The majority of wounded who made it to the larger field hospitals suffered from wounds to the extremities. Those who suffered from wounds to the head, chest, and abdomen occasionally survived, but usually not because of any surgical intervention. Wounds to the extremities fell into two categories: minor wounds that did not require amputation, and major wounds that did require it. Minor wounds could be successfully treated by thorough cleansing, removal of foreign material, and bandaging. However, wounds complicated by massive tissue destruction or involving a bone or joint usually were considered candidates for amputation, and for good reason. These wounds, which inevitably became infected, often led to sepsis, a systemic infection with a mortality rate of more than 90 percent. Even after amputation, infection was a concern, but the healthy tissue and the unimpeded drainage from the open end of a fresh stump tended to prevent sepsis and reduce the mortality rate to 20 to 25 percent. Surgeons preferred a simple circular amputation in which the tissues were divided with circular incisions at a slightly higher level in each tissue layer. This left a cuff of tissue to provide coverage of the bone without closing the end of the stump as would be done with a flap amputation. (This “circular” technique is still the procedure of choice on the modern battlefield.)
In the era before antiseptic surgery, sterility was not considered necessary and probably would have been impossible to achieve and maintain in the Civil War environment anyway. Nevertheless, some surgeons wrote that they were at least clean. They washed away visible dirt when possible and kept their instruments free of the blood and pus that would corrode the metal. Patients were most likely to survive if an amputation was performed within 48 hours of wounding; this was known as a primary amputation. At the general hospitals, the stumps would occasionally need to be revised, or re-amputated, at a higher level due to infection in the bone or soft tissues. Otherwise, they were treated with continuous water dressings (still a good method of wound care). Styptics and cautery were used to control occasional bleeding, and various chemicals considered to be antiseptics (because they prevented sepsis) also made the wounds smell better. Of course these compounds, such as iodine and carbolic acid, were actually killing bacteria and would be used in the postwar period during the early days of aseptic surgery.
Pain Control
Almost from the moment of injury, efforts were made to ensure that the wounded soldier felt as little pain as possible. Unfortunately, the first substance administered to a wounded man was usually alcohol because it was felt to be a stimulant. But alcohol actually suppresses the nervous system and dilates the blood vessels, neither of which is helpful, especially in cases involving major blood loss. The assistant surgeon, at the aid station, also used an oral narcotic, usually morphine sulfate, and might allow the patient to inhale the vapors of chloroform for additional pain relief. At the division hospital, usually prior to examination of the wound and certainly prior to any surgery, the patient was given additional alcohol, narcotic, and enough chloroform to render him insensible to pain. Many soldiers wrote that they remembered their surgery, but felt no pain. Confederate general Thomas J. “Stonewall” Jackson, who had his left arm amputated, was said to have stated that the sound of the saw on the bone was the sweetest music he had ever heard. He also described chloroform as “an infinite blessing.”
Transporting the Wounded
The transfer from the field hospital to the fixed general hospitals was an excruciating experience of several days’ duration, especially early in the war. During this transition, patients were often jostled about in poorly suited wagons or railroad cars for days with little or no additional medication and few or no attendants to see to their comfort. Sometimes even food and water were inadequate. To deal with such problems, the Union army added ambulances and personnel in order to provide better care. The Confederate government established a system of independent “corps hospitals” where the wounded could be cared for until they were better able to bear transportation to a larger hospital. They could stay at the corps hospital even after the army had moved on and surgeons had dismantled their field hospitals to follow it. Corps hospitals also had personnel who could accompany patients in transit. This development of independent hospital units that accompany combat units but are not organic to them is reflected to a great extent in modern armies.
Lessons Learned
Overall, the medical and surgical care provided to Union and Confederate soldiers in Virginia was state of the art for its time and prevented many thousands of deaths. Vaccination for smallpox, isolation of most contagious diseases, and the recognition of the importance of cleanliness and sanitation all led to public health improvements in urban and rural areas in the postwar period. Also, based on wartime experience, medicine shifted to a hospital-based system of care, thus facilitating the twentieth-century improvements in diagnosis and treatment. Most important, a large group of highly skilled surgeons, whose skills had developed throughout the war, were available to lead Virginia into the era of modern surgery.