“I am badly injured, Doctor; I fear I am dying… I think the wound in my shoulder is still bleeding.” His clothes were saturated with blood, and hemorrhage was still going on from the wound… His suffering at this time was intense; his hands were cold, his skin clammy, his face pale, and his lips compressed and bloodless; not a groan escaped him—not a sign of suffering, except the slight corrugation of his brow, the fixed rigid face, and the thin lips so tightly compressed, that the impression of the teeth could be seen through them. Except these, he controlled by his iron will, all evidence of emotion, and more difficult than this even, he controlled that disposition to restlessness, which many of us have observed upon the field of battle, attending great loss of blood.

Dr. Hunter McGuire, Stonewall Jackson’s physician, gives a textbook-perfect description of the shock the general suffered after his fatal wounding in 1863. It was a condition long familiar to military surgeons. The pallid, sweaty restlessness that often precedes the death of the severely wounded was first called “shock,” in 1743, by those who had treated gunshot wounds, and who thought that these symptoms were caused by the violent, jarring impact of the bullet. They described shock eloquently as “the rude unhinging of the machinery of life” or “a momentary pause in the act of death,” but they were unable to treat it.

Even Dr. McGuire, who seems to have understood that blood loss is the cause of shock, misunderstood the condition so gravely that he thought it had certain restorative qualities, and he actually bled the general. In fact, this belief in the restorative effects of shock led some of McGuire’s colleagues to amputate without anesthesia, because they believed that the anesthesia would counteract the strengthening effects of shock.

We now understand that anesthesia, by causing the blood vessels to relax and thus the blood to pool, can reduce the circulating volume enough to cause shock. McGuire’s colleagues, therefore, made the accurate observation that men already in shock tolerate anesthesia poorly. However, their interpretation that shock has some “strengthening” effect was clearly wrong. Modern-day anesthesiologists transfuse patients with fluid or blood, depending on the circumstances, to counteract the effects of blood pooling.

It has been only since World War II that we have finally understood shock. We now know that this condition is caused by a loss of circulating blood, leading to a lack of oxygen in the tissues and eventual death. States of disease that cause failure of the circulation, such as heart failure, or loss of fluid volume, such as dehydration, can also result in shock; nevertheless, hemorrhage, particularly in wounded men like Stonewall Jackson, is the most common cause. If we are able to replace lost blood, we can perform virtual miracles. We can reverse that “momentary pause” and heal people wounded even more severely than Jackson was.

The history of the understanding of shock and development of blood transfusion is, like the history of medicine itself, intertwined with the history of war. From the earliest days of medicine, war not only has forced physicians to develop techniques for repairing bodies mutilated by ever more ingenious weapons, but has also provided great numbers of patients for the trial-and-error advancement of science. The military hospital has been like some grim laboratory where physicians, often desperately understaffed and under-equipped, must modify procedures or, because of either the novelty of the injury or the lack of standard equipment, invent new ones. They also see so many men suffering from similar injuries and diseases that they can closely monitor trends in the course of an illness and the success or failure of a particular treatment.

The catalogue of medical advances stimulated by the exigencies of war is a long one. Much of the understanding of how the human body works has been discovered in observing and trying to repair the injured. Indeed, wounded men have afforded the only opportunity to study the workings of a living human body; caring for them is the only legitimate form of human vivisection. Doctors from Galen (the second-century physician often called the “father of modern medicine,” because he insisted that an understanding of human anatomy was the necessary foundation of medicine) to the surgeons of the Vietnam War have learned invaluable lessons by observing and treating wounded soldiers.

Surgery has advanced more on the battlefield than perhaps any other medical specialty. A military surgeon is likely to treat more cases of severe trauma in the course of a single engagement than he would in years of civilian practice. A review of 20th-century surgery underscores how indebted we are to the war-wounded. Every surgical specialty has benefited from the experiences gained in war (the use of blood transfusions is a prime example), and some fields—surgery of the intestine, chest, head, and blood vessels—have been virtually transformed by them. Other disciplines, such as plastic surgery, needed the urgency of war and the plight of its survivors to catapult them into the position of full-fledged, modern specialties. Still other branches of medicine were actually born in the military field hospital—emergency-room medicine and the science of artificial-limb technology, for example. Even the modern urban trauma center, with its rapidly responding medic units providing early intervention, is an outgrowth of war. Physicians and medics with experience in Vietnam have been invaluable in organizing and staffing our large, regionally centralized emergency systems.

Significant advances have been made in other areas of medicine as well. Progress in combating infection and disease made World War II the first conflict in which an American soldier was more likely to die from his battlefield injuries than from infectious disease. Preventing and treating infectious disease, therefore, have had tremendous strategic importance: typhus, typhoid, syphilis, bubonic plague, and influenza have all played roles in the history of war by sometimes obliterating the advantages of superior force, weapons, or leadership. Military-supported studies of everything from sanitation to drugs and vaccines have benefited not only soldiers but peacetime civilians as well, especially children and pregnant women.

Penicillin, the first potent antibiotic drug, was discovered by Sir Alexander Fleming, who, distressed by the numbers of men he had seen die of infected wounds during World War I, devoted his career to research on antibacterial substances. Although he discovered penicillin before the outbreak of World War II, it was the war’s pressing need for antibacterial remedies that inspired the enormous and cooperative effort required to produce the drug. In 1941 there was only enough penicillin to treat a few patients (and it was so precious that it was recovered from the patients’ urine and reused), but by the invasion of Normandy in 1944, ample penicillin was being produced to supply the Allied armed forces. Indeed, penicillin was considered of such potential strategic importance that in the early 1940s the scientists working on its development not only made plans to destroy all documents pertaining to their research if the Germans were to invade Britain, but even spread the mold that produces penicillin into the lining of their clothes, where it could be preserved and transported undetected. Today penicillin is an indispensable part of the clinician’s armory against disease—another weapon that has made medicine in the latter half of the 20th century so powerful.

Even the language of war has infiltrated and influenced medical theory. Nineteenth-century theories on disease and the body’s defenses were couched in the bombastic rhetoric of imperialistic military strategists: Bacteria were described as rapacious invaders and the immune system as a valiant defending army. Today’s understanding of those same bodily forces incorporates the contemporary realization that excessive emphasis on military might has the capacity to be destructive: We now believe that it is the body’s own defensive actions that cause the most significant damage in some types of disease.

The specific history of blood transfusion clearly demonstrates the many ways in which war can stimulate medical progress. War’s influence was fundamental to the development of transfusion: It stimulated changes in medical theory, technology, and practice. These changes radically transformed the capacity and scope of medicine. Blood transfusion was a technique almost never used before World War I and infrequently resorted to between the wars, but now it is commonplace. It has made operations ranging from face-lifts to heart transplants possible and safe. It is hard to overstate how revolutionary the ability to transfuse a patient and the understanding of shock have been to the practice of medicine. Resuscitation of trauma victims and advances in anesthesiology, in virtually every type of surgery, and in the treatment of burns, diarrhea, renal failure, anemia, and complications of childbirth are all based, practically and theoretically, on the capacity to transfuse.

Although it seems almost incomprehensible to us now, for thousands of years blood was not regarded as a remedy for hemorrhage. It has, however, been part of the medical and mystical pharmacopoeia from time immemorial. Almost every culture has invested blood with mysterious and powerful properties. It was the food reserved for the angry biblical god, and the bath given to Egyptian pharaohs afflicted with leprosy. Blood was long thought to determine both the physical and mental attributes of a person. Drinks and baths of blood were prescribed for centuries as treatments for madness and the infirmities of old age. To rejuvenate themselves, Romans drank the blood of bulls and gladiators slain in the arena. In 1492 an attempt was made to restore the health of Pope Innocent VIII, sapped by years of debauchery, by giving him the blood of three sacrificed young boys. Blood was also used in the rituals of war and state. The Scythians sealed oaths and treaties by mixing the blood of both parties in a cup of wine and drinking. The Roman devotees of Mithra anointed themselves with bull’s blood to give them strength and prowess on the battlefield. And Herodotus describes how the Neurii tribes of central Africa drank the blood of their vanquished enemies to celebrate their victory and honor the victims.

The belief that blood might determine what were deemed to be some of a person’s intangible qualities survived to modern times. Until the Korean War, the U.S. Army blood banks classified blood by the race of the donor, and the German blood-banking program established in 1940 permitted only “Aryans” to be blood donors. It is not surprising, then, that when people did begin to consider the possibility of transfusion, it was with the aim of curing madness and decrepitude.

In 1628 William Harvey, physician to James I and Charles I, published his revolutionary book describing how the heart, arteries, and veins connect to form a circulatory system. Prior to his work, blood was thought to ebb and flow through the body like the tide, while the heart mixed the blood with a spiritual essence inhaled through the lungs. Harvey’s work established the minimum knowledge required to perform a transfusion and inspired a spate of experiments on the circulatory system.

Christopher Wren, the famous English architect and a founder and fellow of the Royal Society of London, and his friend Robert Boyle, the chemist, performed the first well-documented intravenous infusion experiments, injecting opium, wine, and beer into the veins of dogs. Their experiments caught the attention of an Oxford physician, Richard Lower, who became the first person to attempt a direct blood transfusion. In 1665 he transferred the blood from the artery of one dog into the vein of another. He describes the experiment:

The dog first set up a wailing but soon its strength was exhausted and convulsive twitchings began. In order to resuscitate this animal from such a great loss of its own blood with the blood of another, I securely bound a large hound along the smaller dog and allowed its blood to flow from the cervical artery into [the smaller dog, whose]… jugular was again [sewn up] and its chains unleashed. The animal immediately leapt down from the table and apparently forgetful of its injuries, fawned upon its master. It then cleansed itself of blood, rolled in the grass, and apparently was no more inconvenienced than if it had been thrown into a flowing stream. 

Lower was so encouraged by his success with dogs that he decided to attempt a human transfusion. (He had no way of knowing that his initial success was due in part to the fact that, unlike human blood, canine blood does not naturally contain factors that might make one dog’s blood incompatible with another’s.) Despite his observation of the marked effects of exsanguination and subsequent replacement of the blood, Lower did not consider transfusion a treatment for blood loss. Transfusion, he wrote, should be used “in arthritic patients and lunatics” who would find “perhaps as much benefit from the infusion of fresh blood as from withdrawal of the old”; exchanging the blood of “Old and Young, Sick and Healthy, Hot and Cold, Fierce and Fearful, Tame and Wild animals” would also give fruitful results. Samuel Pepys, the chronicler of 17th-century English life, echoed Lower’s theories on the nature of blood when he commented in his diaries that these experiments “did give occasion to many pretty wishes, as of the blood of a Quaker to be let into an Archbishop.”


Accordingly, Lower’s first transfusion was undertaken with the aim of calming the “too warm” brain of a “poor” and “debauched” Cambridge cleric (who was paid 20 shillings) by transfusing him with the blood of a docile lamb. The transfusion appeared to be partially successful: The cleric survived and six days later was able to report, in Latin, that he was feeling much better; but he still seemed, at least to Pepys, who heard him give his address, “cracked a little in the head.”

Lamentably, other animal-to-human transfusions did not meet with similar success; in fact, they often proved to be fatal. These early patients no doubt died from immune reactions to the foreign blood, infections, complications arising from infusing clotting blood, or from hemorrhage itself. A French physician who claimed to have preceded Lower in conducting the first animal-to-human transfusions was charged with murder when one of his patients died. He was acquitted, but transfusions were soon outlawed throughout Europe.

Blood transfusion was used to treat hemorrhage for the first time during the 19th century. Nevertheless, because nothing was understood about the incompatibility of different blood groups or blood coagulation, and aseptic medical technique was still in its infancy, the procedure was rarely successful and therefore used only in the most desperate of cases—above all in obstetrics, and occasionally, during the American Civil War and the Franco-Prussian War, to treat trauma. In these military settings transfusion was used only where there was direct, external evidence of bleeding, because, as we saw with Dr. McGuire, the connection between blood loss and shock was not understood. In The Medical and Surgical History of the War of the Rebellion (1861–65), a heroic effort at compiling the medical records and experiences of the Civil War, shock is described as “a general perturbation of the nervous system… the person affected turns suddenly pallid… the surface of the body is cooled and bathed in profuse perspiration… the circulation is feeble… the mental condition is one of agitation… this is independent of any loss of blood.” The description of men dying of hemorrhage a few pages later in the same tome comments upon their “blanched” appearance, but does not associate this with the pallor of the men in shock.

Despite the observation of one Union physician—who studied the death of men on the battlefield and performed autopsies on many of them—that blood loss was a major, if often unrecognized, cause of death in those who perished, only three cases of transfusion from that war are recorded. It is curious that only three apparently were performed: Two were clearly successful. Private G. P. Cross, a 19-year-old wounded in the right leg before Petersburg on June 16, 1864, was transfused with blood by Surgeon E. Bentley, who noted that “immediately after the injection a marked difference was noticed in the patient’s pulse, which became stronger.”

Private Cross survived his wound and the war. Private J. Mott, 37 years old, also responded well: “The man’s general condition was greatly improved. His pulse became fuller and slower, he slept well… altogether the prognosis became more favorable.” Unfortunately for Private Mott, when he hemorrhaged again a week later, he was not transfused again and died the following day. The third man died of his wounds, and no comment is made on the effects of the transfusion.

In 1900 it was discovered that not all human blood is identical, but rather can be classified into groups. Indeed, if the blood from two different groups is mixed, a reaction ensues, causing the blood cells to clump together or burst. This reaction, which was the basis for the discovery of blood incompatibilities, contributed to the failure of many early transfusions. By matching the blood type of donor and recipient, then, the problem of incompatibility is circumvented. Interestingly, the importance of this discovery was not recognized until the increased use of blood transfusions, prompted by the two world wars, made it relevant to most practitioners.

World War I dramatically increased the number of “desperate” cases and so spurred the development of blood-transfusion technology. Three main problems confronted the military surgeon performing a transfusion: blood incompatibilities, blood coagulation, and the lack of an orthodox theory of shock that justified transfusion as a rational treatment. The first two problems were overcome during World War I; the third was to have a devastating effect on how the Americans treated their war-wounded in World War II.

Minimizing the coagulation of blood as it was transferred from donor to recipient allowed blood transfusion to be useful in a military setting. Before this occurred, only direct transfusions were performed, and even these could be complicated by coagulation. Direct transfusions therefore not only could be dangerous, but also were enormously cumbersome. The procedure was not unlike that described by Lower for his dogs, with blood of the donor being transferred immediately into a vein of the recipient. In 1914 and 1915 three different researchers discovered that sodium citrate could act as an anticoagulant without dangerous side effects. Sodium citrate thus made indirect transfusion, and consequently the storage and typing of blood, possible. With stored blood more men could be transfused, and so resuscitation could begin closer to the front. In November 1917, during the Battle of Cambrai, preserved blood was first introduced into a casualty clearing station; by 1918 transfusions performed in advanced dressing stations were keeping men alive until they reached the clearing stations, where they could be operated upon.

Sir Geoffrey Keynes, brother of the famous economist, invented early standard transfusion equipment and wrote the first English-language text on blood transfusions. He described his experience in World War I:

The donors were chosen by preliminary blood grouping of both patient and prospective donor, a procedure which was still a novelty. Official encouragement took the form of allowing a fortnight’s extra leave in “Blighty” (England) to the donors chosen from among the lightly wounded men. Potential donors lined up eagerly for the test—rejection was regarded almost as a slur on their integrity. Transfusion naturally provided an incomparable extension of… life-saving surgery… A preliminary transfusion… enabled me to do a major amputation single-handed. A second transfusion then established the patient so firmly on the road to recovery that he could be dismissed to the ward without further anxiety. At other times I was greatly distressed by the state of affairs in one large tent known as the “moribund ward.” This contained all the patients regarded by a responsible officer as being probably past surgical aid, since it was our duty to operate where there was reasonable hope of recovery, rather than to waste effort where there seemed to be none. The possibility of blood transfusion now raised hopes where formerly there had not been any, and I made it my business during any lull in the work to steal into the moribund ward, choose a patient… transfuse him, and carry out the necessary operation. Most of them were suffering primarily from shock and loss of blood, and in this way I had the satisfaction of pulling many men back from the jaws of death.

Despite the dramatic results achieved by transfusions and the technical advances that helped facilitate them, not all the wounded who might have benefited from a transfusion received one. The procedure was unwieldy and blood in short supply. World War I physicians thus began transfusing the wounded with fluids other than blood. This had already been shown to be a successful treatment for those in shock due to the severe dehydration caused by cholera. In an attempt, therefore, to maintain the circulation and to use a solution with blood-like qualities, they tried infusing a number of solutions, including saline and gum acacia. These solutions were variously successful; nevertheless, it became abundantly clear that the replacement of fluid volume alone was a remarkable, if temporary, step forward in the treatment of wounded men in shock.

In an article published in 1920, Major W. Richard Ohler argued that hemorrhage is the single most important factor in shock. It was his contention, supported by extensive wartime experience with transfusions, that restoring red blood cells, with their oxygen-carrying capacity, is at least as crucial in the treatment of shock as the replacement of fluid volume. In this view, however, he was not in the majority. This was partly because, in the words of another physician, E. G. C. Bywatters, “‘shock’ [is] a mysterious condition with as many definitions as there are writers on it.” As the symptoms of shock can be caused by things other than hemorrhage, and the amount of blood lost is hard to calculate in the wounded, the importance of red blood cells was not often recognized. It was widely held that a toxin released by damaged tissue caused the lowered blood pressure, the feeble pulse, and the sweaty restlessness of shock. This theory, combined with the effectiveness of fluid replacement and the difficulty of using blood, concentrated much of the interwar research on finding a solution that would be easy to administer and store and that could treat the symptoms of shock. Plasma, the fluid part of blood with the blood cells removed, emerged from this research as the ideal substance. It does not need to be typed, can be frozen and dried, and maintains circulating volume well.


Whatever their views on shock and hemorrhage, Ohler, Keynes, and physicians like them were convinced by their military experience that blood transfusion was an invaluable medical tool. This conviction inspired them to introduce the technique to their skeptical civilian colleagues, who were afraid that transfusion might “get in the way” of surgeons. But surgeons, emboldened by the wartime successes of transfusion, now attempted longer, more complicated operations, and they were pleased with the results.

During this same period, shock continued to be a subject of intense laboratory research. Evidence began accumulating, much of it from the laboratory of Alfred Blalock, that fluid loss at the site of injury was the most important factor in producing circulatory collapse in shock. This began changing the concept of shock, but more firmly entrenched the idea that plasma was an adequate treatment for every type of shock. Although Blalock notes in one of his seminal papers on the subject that following severe trauma it is mostly blood that is lost, he adds that with less severe trauma the escaping fluid is roughly equivalent to plasma. Laboratory animals that had been bled responded nicely to plasma replacement. Therefore, the loss of fluid, not of blood, was considered the main cause of shock. These facts, and the cumbersomeness of using blood, convinced many American physicians, even some of the most learned and astute, that plasma was an adequate treatment for men with traumatic shock.

The Spanish Civil War was the first war in which blood and plasma secured from a civilian population were used to supply medical installations on the front. From August 1936 to January 1939, the Barcelona Blood Transfusion Service collected over 9,000 liters of blood for the Republican army. They maintained a roster of about 28,900 donors, aged 18 to 50, whose blood type, syphilis titers, and even psychological profiles were known. The blood was preserved with sodium citrate and glucose in sterile containers and kept under refrigeration. Refrigerated trucks and coolers were used to transport it to the field. The medical staff all had their blood types tested in case stored blood was not available.

The strict protocol for the administration of blood and plasma indicated that the distinction between hemorrhage and traumatic shock was still being made. Men with severe hemorrhage were to get only blood; those with hemorrhage and shock, both blood and plasma; those with shock alone, only plasma. But by the end of the Spanish Civil War, some Republican surgeons were not making this distinction; they realized that hemorrhage could be the cause of shock and that blood was the treatment for it. Joseph Trueta, a renowned Republican surgeon who pioneered and perfected several new surgical techniques during this engagement, wrote that “transfusion with plasma…is only a temporary measure, however: the patient needs hemoglobin to combat his anoxemia [lack of oxygen] and for this purpose the presence of red blood corpuscles… is essential.”

The evangelical zeal of British surgeons like Keynes and the success of the whole-blood service operating on the Republican side during the Spanish Civil War prompted the establishment of a British whole-blood program six months before Great Britain entered World War II. Despite this foresight, the British supply-and-distribution system was initially inadequate. At El Alamein, the first major battle in which all medical units, even the most advanced, were supplied with blood, the demand for blood outstripped available resources. In order to augment the blood supply, an enterprising officer, Major G. A. Buttle, sent his men to pick up old beer bottles in the streets of Cairo; these were sterilized, converted into containers for blood, and shipped, refrigerated, to the front. Edward Churchill, a Harvard surgeon who joined the army at the American declaration of war and was sent to North Africa in March 1943, describes visiting this transfusion unit:

I saw Egyptian civilians sitting on the floor, placing bottles of blood into containers and packing straw around them. It seemed unbelievably primitive and yet, in the opinion of the doctors who needed this blood, Buttle’s accomplishments warranted the Victoria Cross. To supply an army with a large quantity of blood in such a manner invited difficulties with putrefaction and infection.

The American preparation for war was far worse in this respect than the British. The development of an American blood service was first considered in May 1940, but not implemented. When the United States entered the war a year and a half later, only a program to collect blood for processing into plasma had begun operation. As the nation was plunged into the war, it became increasingly apparent, at least to those field surgeons who were treating the wounded, that whole blood, not plasma, was crucial to the survival of their patients. The army and the surgeon general, however, were less impressed with the reports they were getting from the North African front than with the fact that plasma was considerably less difficult to preserve, transport, and administer than whole blood. Moreover, although many military surgeons had concluded that shock resulted from loss of blood, theories abounded among civilian and laboratory practitioners supporting the view of plasma as a suitable substitute for whole blood. The use of plasma did save lives; but the use of whole blood would have saved many more. The entries in the diary of Major Kenneth Lowry dated February 2 and 3, 1943, are instructive:

To date we have lost only one case here, a lower one-third thigh amputation with multiple wounds of the left leg and thigh. He was in profound shock in spite of 1,500cc of plasma, 500cc of blood and lots of glucose. The operation did not increase his shock, but neither did he improve. Blood is so precious! so urgently needed! What we do give is being obtained from our own personnel who are most willing, but they really need it themselves after putting in long hours without rest or sleep.

We could not find a donor for a splendid chap from Maine last night. He was in severe shock and needed something in addition to plasma and glucose, so Frosty [a fellow surgeon] gave his blood, took a short rest and went back to operating again… I cannot help but add one remark which I have observed in our work. Dried human plasma is saving hundreds of lives that would surely otherwise be lost. Of course whole blood is better but is more difficult to obtain.

Within two and a half weeks after arriving in North Africa to serve as a consultant in surgery for the U.S. Fifth Army, Edward Churchill sent a memorandum to the army surgeon of the Fifth declaring that blood was the agent of choice in the resuscitation of most casualties, and that the continued dearth of blood and reliance on plasma alone would increase the morbidity and mortality of the wounded. This memorandum was the first of many; he continued to send reports, drawn from meticulous study of British and American casualties, and grew frustrated when his repeated pleas for whole blood and the equipment to procure and store it safely were ignored. His superiors further asked him not to send any personal communications to Washington, but instead to allow his information to be transmitted through the appropriate channels. Churchill describes the reception of his reports:

I was not popular when I said that wound shock is blood volume loss. It is identical with hemorrhage. The wounded require replacement of blood loss…The Theatre Surgeon General, Frederick Blesse, was placed in a difficult situation. I was a new consultant whom he had never seen before and who said: “We must have blood.” The Surgeon General had said that we must fight the war on plasma.

Churchill began to feel that a “huge vested interest… starting up from assumptions and erroneous thinking” surrounded the program providing plasma to the army. Impressed only by the need for blood and not by military hierarchy or standard procedure, Churchill felt his only recourse was to ask the New York Times to report how urgently whole blood was needed and how inadequate plasma was as a substitute. An article appeared on August 26, 1943. “The initial breakthrough,” writes Churchill, “thus came with upsetting the balance of power in Washington through the New York Times and making people in the States begin to think reasonably about the need to transfuse the wounded and realize that World War II could not be fought on plasma. Soon we were able to get refrigerators for the mobile hospitals. For a long time they had to draw their own blood, but they could draw it in advance of pressing need.”

Yet the response elicited by the Times article was insufficient; it did not fuel immediate or large-scale reform. The surgeon general therefore did not implement plans, made in 1943, for the overseas provision of blood to the Mediterranean and European theaters until the casualties of the Normandy invasion in June 1944 and a personal visit to the Mediterranean theater convinced him of the urgent need for whole blood. Finally, in August 1944, the 1943 plan was implemented to supply the European theater from the United States, and in November a similar airlift supplied the Pacific. Until then the Americans had had to rely on local blood supplies, often begged from the British. During the invasion of Anzio in 1944, for example, the Fifth Army’s surgeon complained to his superiors: “In this tactical situation we must have blood shipped in large quantities to the beachhead. If you don’t get it to us, we’ll get it from the British.” But the British had little or no blood to spare.

At the 1944 meeting of the Southern Surgical Association, Colonel P. S. Gillespie, a British surgeon, commented on the American “borrowing” of blood, aiming his remarks at the “whole blood battlefront”:

I have often wondered at the physiological differences between the British and American soldier. The former, when badly shocked, needs plenty of whole blood but the American soldier, until recently, has got by with plasma. However, I seemed to observe a change of heart when I was in Normandy recently and found American surgical units borrowing 200–300 pints of blood daily from British Transfusion Units, and I’m sure they were temporarily and perhaps even permanently benefited by having some good British blood in their veins.

Indeed, surgeons were at the time so desperate for blood and even plasma with which to treat the wounded that they tried to expand their stores by using bovine serum albumin. By so doing they saved the lives of some, but risked the lives of others. As Lower and his fellow pioneering blood transfusionists had found before them, World War II physicians encountered some patients who reacted violently to the injection of the blood products of another species. Detailed observations of this phenomenon were made, leading to a greater understanding of human immunological responses.

It was only late in the war that whole blood was readily available to American surgeons. Once implemented, the American blood program was very successful. On Okinawa the treatment of 40,000 casualties involved the use of approximately 40,000 pints of blood, all flown in from the United States. New American equipment also allowed for safer and less complicated blood-banking and was soon adopted by the British. Despite this success, at the end of World War II the Americans disbanded their blood program. They then repeated the mistakes of World War II at the outbreak of the Korean War, when it took six months to establish an adequate and constant supply of blood to U.S. troops.

It is hard to understand why the United States, ignoring the advice of its own field surgeons and of the National Research Council, an organization of scientists established to advise the government on national-security issues, was so slow to establish a whole-blood program in World War II. Perhaps the biggest stumbling block to the establishment of the program was theoretical. Before World War II the concept of shock was still very vague and erroneous in many respects. The absence of a cogent theory explaining the need for blood, combined with the expense of shipping whole blood and provisioning the army with blood-collecting equipment, prompted military administrators to ignore the reports received from their doctors at the front.

In addition, the surgeon general’s office harbored the misguided, yet typical, notion that the medical research it supported in U.S. laboratories would provide answers to the problems faced by the surgeons in field hospitals, not the other way around. And although laboratory research was enormously important to the war effort, the controlled conditions of the laboratory yielded results that were not always applicable to the hurly-burly of a field hospital. There is a big difference, for example, between a rabbit that has had 75 percent of its blood volume removed with a syringe in the quiet of its cage and a soldier shot in the gut far from the field station. Thus, the blood-transfusion story can be seen as yet another example of a problem fundamental to military operations: the gap between the front and the rear, which often makes the rear-based bureaucracies unresponsive to the needs of those engaged in combat.

Why the United States, unlike the British, had not learned from the experiences of World War I and the Spanish Civil War is less explicable, particularly when one remembers that Americans developed many of the innovations in transfusion equipment during the earlier war. American medical experience during World War I was limited compared to the British, and for some reason the British medical literature, but not the American, was full of discussion about the successful transfusion techniques used during the Spanish Civil War. Americans were reluctant to become engaged in another European conflict and thus delayed anticipating the needs of war, both medical and military. Furthermore, once it became clear that American involvement in World War II was inevitable, the American government regarded data derived from British medical experiences early in World War II as important military intelligence and so made it unavailable to the people for whom it could have proved invaluable. Edward Churchill, who was a member of the National Research Council before the United States entered the war, complained: “Any written document, any report regarding the care of the British wounded was a carefully guarded secret. The Office of the Surgeon General of the Army would not allow even the N.R.C. to see such records. All information was filed away under lock and key.”

The medical profession was also at fault. No concerted attempt had been made to evaluate the conviction that many World War I surgeons had concerning the importance of red blood cells in the treatment of hemorrhagic shock. No clinical studies comparing the virtues of plasma with those of whole blood were undertaken. And the U.S. Armed Forces embarked on their plans to provide only plasma to the troops with no evidence of protest from the ranks of their own surgeons or those convened by the NRC to advise them. One can only speculate on why it took so long for doctors to endorse the need for whole blood. It is clear that many thought that providing blood, with its need for refrigeration and typing, was impractical. Transfusion with plasma did constitute an improvement over the past in the treatment of shock; and in patients less severely wounded than a soldier—that is, most civilian patients—it must have seemed adequate. Even human error contributed to the initial lack of medical support for instituting a program for the procurement, distribution, and use of blood: In 1941 the NRC’s Committee on Transfusions agreed that the U.S. Armed Forces should use whole blood in the treatment of shock; but somehow this opinion was omitted from the minutes of the meeting. Two years later the omission was recognized.

Given the limited number of American surgeons participating in World War I, the discontinuous nature of military surgery as a specialty, and the difficulty of keeping good medical records in the midst of war, it is perhaps understandable that the conviction of those World War I surgeons was not widely appreciated. Nevertheless, the reluctance to accommodate new sources of information or challenge prevailing dogma can be as devastating medically as militarily. During World War II, good medical studies on shock proved, once and for all, that whole blood is the best treatment for hemorrhagic shock. In conducting these studies, the American medical profession finally acknowledged, in the words of medical historian Sir Clifford Allbutt, “how fertile the blood of warriors to rearing good surgeons.”

War and medicine have had a complex relationship. Medical progress has been stimulated both directly and indirectly by war. Because medical advances can be so crucial to military campaigns, medical research has found a good patron in the armed forces. But perhaps more significantly, medical progress has been a by-product of war. This progress, bought at the enormous cost in human lives that only war would afford, has been the most lasting and vital benefit of war.


Margaret B. Wheeler has published numerous articles on scientific matters. She is completing a residency in internal medicine at the University of California at San Francisco.

This article originally appeared in the Summer 1993 issue (Vol. 5, No. 4) of MHQ—The Quarterly Journal of Military History with the headline: Fertile Blood.