This summer my family and I explored Fort Ligonier, an eighteenth-century British fort in western Pennsylvania, and the Bushy Run Battlefield, a historic site of the Seven Years' War (1756–1763). My children wondered aloud what it must have been like to live in those days. As we looked at the hospital buildings, however, my daughter said, “The thing that I would miss the most is 21st-century medicine.”
She is not alone. Some people attend Renaissance fairs and pretend to live in medieval Europe. Others reenact the Civil War or other major conflicts. No one I have ever spoken to, however, wants to give up modern medicine. Not that modern medicine is perfect. Too often it is impersonal, profit-driven, complicated, and expensive. Compared to much of what existed before, however, it is miraculous. We would do well to remember that, and to be thankful for it.
Early American Treatments
Much of European and American medicine in the 17th and 18th centuries was based partly on the idea that health required the removal of toxins from the body, and partly on the teachings of the Greek physician and philosopher Galen. Practicing during the Antonine Plague, Galen refined the Hippocratic theory that disease was caused by an imbalance of the four humors (blood, black bile, yellow bile, and phlegm). Because of the toxin theory, the primary treatments of “heroic” physicians of the era focused on eliminating toxins. Patients were bled with leeches or by opening veins to remove toxins from the blood, blistered by rubbing mustard plasters on the skin to draw out toxins there, and given ipecac (to induce vomiting) or calomel (to induce diarrhea) to purge the intestines. These treatments and other drugs often did more harm than good. Surgical interventions could be equally frightening; total dental extraction was a common treatment for arthritis until the early 20th century.
There were exceptions, as some drugs were effective. Quinine from cinchona bark treated malaria, digitalis from foxglove helped heart failure, colchicine relieved gout, laudanum (a tincture of opium) eased pain, and alcohol, which served as the solvent for most liquid medicines, made almost everything feel better.
Nonetheless, with disease and death decimating every population, people were desperate for almost anything they thought might help. The success of the Lewis and Clark Expedition was due in large part to the cooperation of the Native Americans they encountered, and a big reason those tribes helped the intrepid explorers was that Lewis and Clark tried to heal the Indians, as they would have called them. The two were as skilled as any physician of their day; one morning Clark treated four men, eight women, and a child, giving liniment for painful joints and laudanum for hysteria.
More science was filtering into medicine. In the early 1800s Dr. Benjamin Dudley told his colleagues that their patients would do better if they boiled their surgical instruments, and Dr. J. Crawford wrote articles suggesting that mosquitoes were the source of malaria and yellow fever.
Native American Medicine
Indian healers, or “medicine men,” used a variety of local herbs to combat illness. Sweat lodges were used for lung disease, skin disorders, and rheumatism. Native healers emphasized fluids, nutrition, and rest, and avoided some of the “heroic” treatments of the white man. They used willow bark, now known to contain salicylates similar to modern aspirin, to treat pain and fever, and medicine men were skilled at setting (reducing) broken bones. Further, medicine and religion were intertwined, so their rituals often aided the healing process. As a result of these practices, patients under the care of Native American healers sometimes did better than those under European care.
Diphtheria became a greater problem as populations grew, and it would often kill by suffocation because an inflammatory membrane blocked the patient’s airway. A medicine man would force a sinew covered in sandburs and buffalo fat down the throat of a diphtheria patient. The melting fat would adhere to the inflammatory membrane and the healer would pull it out.
Explorers, soldiers, and pioneers typically lived on hard bread, salted beef, whiskey, and coffee. Indians ate more leafy vegetables and the viscera (organs) of animals, which often contained high levels of vitamin C, so the natives were less likely to get scurvy.
Native Americans suffered terribly, however, from Old World infectious diseases. Lacking centuries of acquired immunity, whole Indian villages perished from smallpox, measles, and influenza. White mortality from smallpox was less than 30%, and mortality from measles and flu was far less. Alcohol was another scourge that afflicted Native Americans in their encounters with European settlers.
The Medical Marketplace in the 18th Century
The medical profession in the 21st century includes a bewildering variety of practitioners, from allopaths (Doctors of Medicine, MDs) and osteopaths (Doctors of Osteopathy, DOs) to physician assistants (PAs), midwives, and nurse practitioners (NPs). Modern medicine also has homeopaths, chiropractors, and others. Most of medicine, however, is highly regulated, and practitioners must meet high academic standards and undergo rigorous training for their credentials.
The medical marketplace was no less varied, albeit far less regulated, in 18th- and 19th-century America. Allopaths practiced “heroic medicine,” bleeding patients, blistering them, or giving purges such as “thunderclappers” to clean their bowels. Under the theory that “like cures like” and that small doses cure, homeopaths provided tiny doses of herbs and other chemicals in the hope of a cure. Herbalists used larger doses of botanicals to heal. Inoculators specialized in giving inoculations and vaccines, and midwives delivered babies, especially in the West. Much of this specialization was a product of advertising: Indian healers advertised natural, Indian remedies, while bone setters and “cancer specialists” proclaimed their skills in local newspapers.
There was notable overlap between the groups; for example, all used plant products to try to heal. Effective medicines at the time included colchicine (gout), digitalis (heart failure), quinine (malaria), and laudanum (pain). Many drugs were dissolved in alcohol and made into elixirs, making patients feel better even if not actually doing them any good.
Medical care was expensive and not widely available on the frontier. When a doctor (of whatever type) arrived in town, the success or failure of his first case would mark him as competent or not. In the 19th century, both before and after the Civil War, towns sprang up at rivers, railroad junctions, mining locations, forts, and many other places. If a doctor was not successful in one area he could just move on to the next. Some physicians would “circuit ride” with ministers, providing medical services while the other provided spiritual services.
Physician training was through a medical college or apprenticeship, but some “physicians” simply bought their degrees through a “diploma mill”.
Medical Care in the Civil War
At the start of the war, medical providers, like everything else, were in short supply on both sides. Both armies needed to rapidly expand their medical staffs and sometimes took on physicians with specious credentials. In time, however, the demands of combat casualty care weeded these men out, and the medical departments, Union and Confederate, became highly professional. By this time, almost all physicians had stopped bleeding their patients.
In battle, the assistant surgeon would trail about 100 yards behind the forward line of troops, followed by personnel of the infirmary corps who evacuated the wounded. Triage developed similarly on both sides, and morning sick call was done by regimental surgeons. The minimally injured were returned to duty at once, the moderately injured were treated immediately, and soldiers who were severely injured were given laudanum and placed under a tree to die. The few who managed to rally, defying the expectations of the triage staff, would receive care after the other soldiers. Wounds that penetrated the abdominal cavity were almost always fatal. Chaplains were assigned to each field hospital, where they comforted the sick and dying and mediated disputes between Union and Confederate soldiers.
Chloroform was the primary means of anesthesia, sometimes augmented by whiskey. Since the Minié ball did far more tissue damage than a traditional spherical musket ball, wounds were more serious. Men hit in the torso usually died, but those with extremity wounds could survive, and amputations were common. Germ theory would not be widely known or accepted for at least another decade, and Joseph Lister’s revolutionary surgical antisepsis with carbolic acid also lay in the future, so infections were common. For head injuries, trephination was done at the site of the skull fracture if possible, or over the parietal bone when not. Overall, 90% of combat injuries were from the Minié ball, 80% of wounds were to the extremities, 70% of soldiers with amputations survived, and 46% of soldiers undergoing trephination survived. Enlisted men and junior officers with amputations would be discharged; higher-ranking officers (those on horseback) could remain on active duty.
During the battle of Gettysburg, the Confederates had a field hospital at the Daniel Lady farm. After the battle there were 8,000–9,000 wounded Confederates and 1,300 wagons. A torrential storm hit on July 4, slowing the withdrawal but also slowing the Union pursuit. The Confederate wounded were in a wagon train 17 miles long under the command of General Imboden. At Greencastle, Union civilians attacked the wagon train; later, Union cavalry attacked and captured 100 wagons. Finally, the retreating Confederates had to wait five days to cross the swollen Potomac River. The less seriously wounded walked 45 miles to a hospital in Winchester.
As deadly as combat was, disease and non-battle injuries killed more soldiers on both sides. During the Civil War, 53 of every thousand soldiers died from disease, two to three times as many as died from battle-related injuries. Unfortunately, medical treatment was poor. Diseases such as malaria could be successfully treated with quinine, but most infectious diseases could not. Having little else to offer, physicians might use “blue mass,” a combination of mercury and flour that was made into pills and used for many diseases.
Suppositories could be made with hog fat to ensure that they would melt in the rectum, and beeswax was also used in making medications. Mercury, unfortunately, is a toxin, and “blue mass” probably hurt more than it helped. Blue mass and other chemicals found their way into pills. The druggist would combine the active ingredient with a pill base, such as flour, then roll a small amount of the mass on a pill-rolling board, forming a long strand that he would cut into individual pills. Once the mass had dried, the pills were ready to give to the patient.
Civil War medicine boasted some vivid personalities. Brigadier General William A. Hammond, an early Surgeon General of the Union Army, tried to prohibit the medical use of mercury compounds such as blue mass and calomel. He immediately faced a storm of resistance, was court-martialed, and was discharged. Jonathan Letterman was General George B. McClellan’s chief medical officer; he devised an ingenious medical evacuation and ambulance system at Antietam, and Letterman Hospital was opened in California only five months after Gettysburg. Dr. Hunter McGuire was the chief physician and surgeon of the Confederacy. Sally Tompkins, a nurse, was the only woman commissioned in the Confederate army; she was given the rank of captain.
After Edward Jenner discovered a vaccine against smallpox in 1796, vaccines became an important part of medical practice. Immunizations were not immediately accepted, quite the contrary, but medicine finally had a powerful weapon against infectious disease. Missionaries took the smallpox vaccine to Native Americans as part of their ministry. When the Gros Ventres Indians on the Milk River in Montana contracted smallpox in 1869, the US government sent in blankets, supplies, and medicine.
Babies were generally born at home under the care of female relatives and a local midwife. Complications were rare, but fetal and maternal mortality were much higher than today, especially on the frontier. Women were expected to deliver, put the baby to breast, and then get back to work. This may have been harsh for some, but such practice had the advantage of getting the mother up and avoiding both the pneumonia and the blood clots associated with bed rest.
Gonorrhea and syphilis were ubiquitous in the 19th century. Indian women, soldiers, explorers, miners, and prostitutes traded venereal diseases among themselves incessantly, and there was no effective treatment. Gonorrhea usually showed symptoms in men but not in women, who could easily become sterile from pelvic inflammatory disease. Syphilis caused symptoms in both sexes and often led to heart disease, or to neurosyphilis and insanity. Mercury compounds were used to treat chancres and other external manifestations of syphilis, but could not affect the course of the disease. No one knows whether Meriwether Lewis died by suicide or homicide when he perished in 1809, but many historians think that neurosyphilis played a role.
Unwanted pregnancies occurred as well. Abortion was practiced within both Native American and White communities, but delivering the child to other family members or to an orphanage was a common alternative.
People sometimes bemoan the weaknesses of modern medicine, failing to remember what came before. While we work to improve technology, as well as compassion, in present day healing, it is always useful to look back and see how far we have come. It is also useful to recognize the heroism as well as the frailties, and the acceptance as well as the bigotries, of those who have gone before us. We can only hope that our descendants do the same for us.