
Recently, we have seen a breakthrough that may satisfy both scientists and ethicists. In 2012, Shinya Yamanaka of Kyoto University won the Nobel Prize for demonstrating that adult cells could be returned to their embryonic state, only to have them develop into another type of cell. In a nutshell, he took an ordinary skin cell, reprogrammed it so that it resembled a primitive embryonic stem cell, and then reprogrammed it again along a new cellular pathway — without resorting to experimentation on embryos. In effect, he tricked a skin cell into becoming an embryonic stem cell. This heralded the creation of a new category of stem cells, known as induced pluripotent stem cells (iPSCs), which offer the benefits of embryonic stem cells without the ethical baggage.

More developments followed Yamanaka's discovery. In July 2013, the Japanese Ministry of Health, Labour and Welfare reviewed a proposal to use induced pluripotent stem cells in a clinical trial for age-related macular degeneration. It will take time to exploit all of the practical applications of Yamanaka's research, and the debate over the use of human embryos for stem cell research will not end tomorrow.

Reflecting in December 2012 on Yamanaka's Nobel Prize and on other advances in stem cell research, Matt Ridley made the following observation in the Wall Street Journal: “In the not-very-distant future, when something is going wrong in one of your organs, one treatment may be to create some stem cells from your body in the laboratory, turn them into cells of that organ . . . and then subject them to experimental treatments to see if something cures the problem.” Given our newfound ability to create induced pluripotent stem cells, Ridley suggested the debate over embryonic stem cells might be fading into history. He wrote: “If stem cells derived from the patient's own blood are to offer the same therapeutic benefits as embryonic stem cells, without the immunological complications of coming from another individual, then there would be no need to use cells derived from embryos.”

The vigour and duration of the stem cell research debate underlines, once again, how deeply we humans react to issues involving the use and treatment of our own bodies and our blood. Blood stem cells come from the marrow of the big bones buried deep in our bodies. We call them immature cells, which is true because they are so young. However, the term immature is also ironic, to my way of thinking, because stem cells can also be thought of as the mothers of all other cells in the human blood and body. Mothers have great power and importance, in our minds and evidently in our blood. Mothers produce the embryos that offer stem cells with such fantastic possibilities — embryos contemplated with equal intensity by researchers and ethicists.

SINCE KARL LANDSTEINER IDENTIFIED the main blood types in 1901, and since physicians began carrying out successful transfusions in the years that followed, the sharing of blood has offered opportunities to give in one of the most noble, selfless ways possible. When you give blood, you don't even get a grateful smile or hug from the recipient of your blood. Because your blood has been broken into parts, and there are many recipients, you will never know them, and they will never know you. You know that you are helping to preserve human life, and that general knowledge is good enough. The desire to reach out and help others in need reflects the best parts of our humanity.

Within minutes of hearing the news of the terrorist attacks in New York City and Washington on September 11, 2001, donors began lining up outside collection agencies in the United States. In Oklahoma City, for example, three hundred people were standing in line outside the Oklahoma Blood Institute by 11:30 a.m. on the day of the attacks. Nationwide, donors gave 1.5 million units of blood within two days of 9/11, creating a shortage of storage bags. The supply of blood eventually exceeded demand, and authorities asked donors to wait. After the Boston Marathon bombings in 2013, donors again turned out in such large numbers that they were asked to wait and schedule appointments to give blood later.

Since transfusions began, millions of people around the planet have received blood that helped save their lives. I am one of them, and I will be forever grateful. Today, according to the World Health Organization, people around the world donate more than eighty million units of blood annually. (A unit is about 450 millilitres, or just less than one American pint.) But the gift of blood has been a double-edged sword. It has brought out some of our worst fears and prejudices.

To set the politics of blood donation in context, I wish to go back to early- and mid-twentieth-century America, when racial segregation permeated virtually every corner of American life, and the mistreatment of blacks even spilled into the arena of health care. Take, for example, the Tuskegee Syphilis Study, which is now considered the most infamous biomedical research project in U.S. history. Doctors funded by the United States Public Health Service subjected hundreds of African-Americans to forty years of medical experiments without telling them that they had syphilis, without treating them for it, and without intervening as they died, as their wives contracted the disease, and as their children were born with congenital syphilis. The study began in 1932 and continued until 1972, when the media exposed it.

Another realm of health care in which American blacks felt the sting of rejection and disrespect relates to who was allowed to donate blood during World War II — when blood was in big demand and when the technology finally existed to rush large quantities of it to critically injured soldiers — and who was not. I will explore these politics — now almost three-quarters of a century removed — by remembering the life, words, and death of Charles Richard Drew.

Drew was born in Washington, D.C., in 1904, of black parents. His father was a carpet-layer and his mother a schoolteacher. He was a tall, light-skinned, athletic boy. He could have passed for white, had he been so inclined, but there was nothing about Charles Drew that sought to deny his heritage.

The society in which Drew grew up separated blacks from whites in schools, theatres, swimming pools, department stores, sporting events, and even in federal government cafeterias in the nation's capital. Drew attended the segregated Dunbar High School in Washington and then studied at Amherst College in Massachusetts.

Drew left the United States to attend medical school at McGill University in Montreal from 1928 to 1933, graduating second among the 137 students in his class. He went on to do two years of internship and a residency in internal medicine in Montreal. He obtained his M.D. and master of surgery degrees but had trouble finding a job in the United States. The Mayo Clinic, where he had hoped to work, turned Drew down, as did Columbia University. Allen O. Whipple, head of surgery at Columbia, gave Drew a frank explanation for his rejection.

Spencie Love, who wrote a biography of Drew in 1996, quotes a dean of the Howard medical school who witnessed the conversation and described it after Drew's death. Whipple told Drew that he was of the wrong race and economic class to treat the most wealthy and privileged of American citizens: “I am certain that you could do well with the average patient. But could you, with your background, feel at ease and render competent service if one of your patients in surgery were a Morgan, an Astor, a Vanderbilt, or a Harkness? We have such patients here. In the selection of our residents we choose only from among superior students and we take into account their family and personal background.”

Drew taught for a few years at Howard University and received an additional two years of training in surgery at Columbia at the hands of Dr. Whipple, the same man who had declined him earlier. Although blacks did not generally find openings for surgical residencies in the United States until the late 1940s and early 1950s, Drew, through a combination of persistence, personal charm, and his light skin colour, won over Dr. Whipple and gained a full training experience, including visiting hospital wards and treating patients.

The outbreak of World War II presented Drew, who at Columbia had become a leading expert on blood storage technology, with the opportunity of a lifetime. In 1940 — before the bombing of Pearl Harbor, which drew the Americans into the war — Drew was hired as the medical director of the Blood for Britain project, which was to ship liquid plasma from the United States to British soldiers who had been wounded in France. In 1941, he served for a short time as the medical director of the first American Red Cross blood bank, overseeing a pilot project involving the use of dried plasma for the anticipated wartime needs of American soldiers.

In November 1941, by which time Drew had returned to Washington, D.C., to lead Howard University's surgery program, the American Red Cross announced that it would exclude blacks from donating blood. Black leaders objected so vociferously that the Red Cross modified its policy, announcing in 1942 that “the American Red Cross, in agreement with the Army and Navy, is prepared to accept blood donations from colored as well as white persons . . . In deference to the wishes of those for whom the plasma is being processed, the blood will be processed separately so that those receiving transfusions may be given blood of their own race.”

Drew weighed in on the matter several times. In 1942, he said, “I feel that the recent ruling of the United States Army and Navy regarding the refusal of colored blood donors is an indefensible one from any point of view. As you know, there is no scientific basis for the separation of the bloods of different races except on the basis of the individual blood types or groups.” Just six years before his death, while being honoured by the NAACP for his work in blood plasma research, Drew again criticized the blood segregation policies: “It is with something of sorrow today that I cannot give any hope that the separation of the blood will be discontinued . . . One can say quite truthfully that on the battlefields nobody is very interested in where the plasma comes from when they are hurt. They get the first bottle they get their hands on. The blood is being sent from all parts of the world. It is unfortunate that such a worthwhile and scientific bit of work should have been hampered by such stupidity.” Drew also commented on how the blood segregation policy was a “source of great damage to the morale of the Negro people, both civilian and military.”

By the time of his accidental death at the age of forty-five, Charles Drew had risen to the position of head of surgery at Howard University College of Medicine. He might have continued to rise in prominence as a blood plasma expert, surgeon, university professor, and critic of discriminatory blood donation policies had he not fallen asleep while driving on a highway in North Carolina and crashed his car in 1950. After a long day of work at the hospital in D.C., Drew had been travelling with three other doctors to a medical conference in Tuskegee, Alabama. They had planned to help train black doctors and take part in a free medical clinic. The other passengers survived the accident, but Drew was thrown from the car and sustained fatal injuries.

For decades after his death, rumours persisted that the prominent surgeon and pioneer in blood plasma storage and shipping had bled to death because the Alamance General Hospital in North Carolina refused to treat him. But that is not true. Drew was rushed to Alamance General and treated promptly and thoroughly, as Love says in One Blood: The Death and Resurrection of Charles R. Drew. However, his injuries were massive, and his death could not be averted.

It's ironic that this tragedy should befall a man who had helped to save other lives by finding efficient ways to store blood plasma so it could be safely shipped from the United States to Britain in the early days of World War II. How symmetrical, and disturbing, that the surgeon who drew upon his medical training to challenge American policy barring African-Americans from donating their blood to white patients ended up being thrown from his vehicle and dying as the blood ran from his body.

It wasn't only black physicians who opposed the wartime blood segregation policy. At least one white physician, who would go on to become famous in his field, shared Drew's beliefs.

Bernard Lown, a cardiologist, creator of the direct current defibrillator, and internationally recognized peace activist, began studies in medicine at Johns Hopkins University in Baltimore in 1942, by which time the Americans had entered World War II. Lown, who was born in Lithuania in 1921 and moved to the United States at the age of thirteen, had family members who were murdered during the Holocaust, and he believed that the Allies were fighting in World War II to create a better world. During his time at Johns Hopkins, blood destined for patients in the university hospital was segregated into two categories: “colored” and “white.” Because Lown knew that these were absurd distinctions, he took great delight in mixing up the tags on “colored” and “white” blood bags.

Lown was kicked out of medical school after it was discovered that he had supplied “colored” blood to a white patient from Georgia who had insisted that he not be given any “nigger blood.” But colleagues raised a fuss on his behalf and Lown threatened to go public with the blood segregation policy; he was quickly reinstated.

Describing his acts of private rebellion at the Johns Hopkins medical school in more detail on his blog, Lown writes: “Black blood had to be kept apart from white blood. This was especially galling since apartheid in blood had no scientific basis. Yet it was being practiced in one of the leading medical schools in the country, an institution that prided itself on being a pioneer in promoting science-based medicine while it distinguished donated blood with tags labelled either C (for ‘colored') or W (for ‘white') . . . I decided not to partake in the immoral charade. Single-handedly I sabotaged the system. I did it with a black crayon. Whenever we were running low on white blood, I would take a number of bottles of black blood and add on the tag a mirror letter C to the one already there. The result resembled the letter W. Lo and behold, the blood was now white . . . For me it was both a medical and political maturing experience. I learned that if one wishes to effect social change, one must never walk alone, and that historical transformations are largely bottom up. Over my long life I have witnessed profound advances in diminishing the racist color line that pervades our country. This spurs a sense of unquenchable optimism. It affirms the poetic words of Martin Luther King Jr., that ‘the arc of history is long but it bends toward justice.'”
