SIX
In November 1915, Goodhue County, Minnesota, opened the Mineral Springs Sanatorium (“for consumptives”) at Cannon Falls, a small town about thirty-five miles south of Minneapolis. Mineral Springs had only thirty-four beds when it opened, a small hospital, even by the standards of the rural Midwest. Forty years later, the number of beds had increased, but the disease that filled them, year in and year out, was always the same: the “white plague”—pulmonary tuberculosis.
One of those beds, in October 1944, was occupied by a twenty-one-year-old woman believed to be days away from death. It’s unknown how she contracted pulmonary tuberculosis, but the overwhelming likelihood is that she breathed in aerosol droplets that had previously occupied the respiratory tract of another infected person. Perhaps that person sneezed, or coughed, or just exhaled. It didn’t matter; while a sneeze can transfer a million cells of the bacterium known as Mycobacterium tuberculosis, ten is all it takes to start a colony. The young woman, Patricia Thomas, had definitely been colonized. She had endured months of high fever, loss of appetite, weight loss, muscle atrophy, and the disease’s chronic, blood-tinged cough.
In the year of her hospitalization, seventy-five thousand Americans would be killed by the deadliest infectious disease in history. By then, tuberculosis had killed one-seventh of all the humans who had ever lived: more than fifteen billion people.
That much devastation leaves a trail. Skeletons exhumed from Egyptian gravesites more than four thousand years old have the characteristic deformities of Pott’s disease, or spinal tuberculosis, as do skeletons from Neolithic sites in northern Europe and the Mideast. Almost from the moment human civilizations began to record the stories of their lives, tuberculosis was a featured player, generally appearing in the last chapter. Assyrian clay tablets describe victims coughing blood before they died. Hippocrates treated patients wasting away with chest pain and drowning in bloody sputum. Ancient Chinese physicians called the disease xulao bing; Europeans, consumption. Recommended treatments had included insect blood, breast milk, bloodletting, high-altitude living, sea travel, drinking wine, avoiding wine, and—most famously—the “king’s touch,” which was the belief that a divinely sanctioned monarch, such as the king of England or France, could, by a laying on of hands, cure what they knew as scrofula.*
However, the recorded history of tuberculosis is only the last part of its story. For a long time, the accepted wisdom was that tuberculosis emerged around the same time that humans discovered agriculture and started forming settled communities, during the so-called Neolithic Demographic Transition, or NDT, which began some ten to twelve thousand years ago. The reason that medical historians assumed this for so long has to do with the complicated nature of tuberculosis itself. Most of the really scary infectious diseases are what demographers call “crowd diseases.” Crowd diseases tend to feature high mortality—up to 50 percent when untreated—and a very rapid means of transmission. From an evolutionary perspective, the two go hand in hand; pathogens that kill their hosts need to spread rapidly, or die out. Most crowd diseases are known to have emerged around the time of the NDT, when Homo sapiens not only experienced a rapid population explosion but an even faster increase in population density. Villages and cities, where people and domestic animals lived in far closer quarters than the nomadic hunter-gatherer societies they succeeded, created a target-rich environment for crowd disease pathogens. For that reason, a large number of crowd diseases started as zoonoses: animal diseases that were able to jump to human hosts.
But while tuberculosis is both highly dangerous and very easily spread, it also has the characteristic features of chronic, noncrowd diseases, the ones caused by pathogens that thrived in the low-density societies that preceded the NDT. Most such pathogens were not originally zoonotic, and are typically able to lie dormant within a host for years, only to be reactivated when opportunity presents. Treponema pallidum, the bacterium that causes syphilis, is a famous example, as is Borrelia burgdorferi, the bacterium behind Lyme disease. Another is Mycobacterium leprae, which is the causative agent for leprosy; and predictably enough, so is its cousin, M. tuberculosis. One implication of this fact is that tuberculosis is not, as once was thought, a disease that jumped from domestic animals to humans. Since it’s older than domestication, it’s far likelier that humans originally gave it—the version that is now known as M. bovis—to farm animals, rather than the other way around. It certainly appeared in human populations long before our ancestors first experimented with animal husbandry. Decades after Patricia Thomas encountered M. tuberculosis, genetic analyses of hundreds of different strains of the bacterium decided the issue: The pathogen had originated in Africa, somewhere between forty and seventy thousand years ago.
In the same way that H. sapiens dispersed, evolved, and expanded, so too did M. tuberculosis: from Mesopotamia eight to ten thousand years ago, to China and the Indus valley a few thousand years later, eventually to Europe and the New World. It didn’t, of course, stop there. M. tuberculosis experienced another evolutionary burst only a few centuries ago. The most successful modern lineage—the so-called Beijing lineage—experienced a population increase over the last two centuries almost exactly paralleling the sextupling of human population during the same span.
A population of pathogenic bacteria that seemed always to be growing larger was bad news for its human hosts. But sheer numbers weren’t the worst of it. As the pathogen was increasing its numbers, it was also evolving new and deadly virulence factors, which is one reason that Patricia Thomas was dying in her hospital bed. M. tuberculosis is, by the standards of bacteria, an extremely slow-evolving and stable organism: the ubiquitous bacterium known as E. coli divides and replicates about once every twenty minutes, while M. tuberculosis does so only every fifteen to twenty hours, which means that spreading an evolutionary adaptation through a population takes roughly fifty times as long. Yet the small portions of its genome that do mutate almost always increase its virulence.
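To make the arithmetic behind that “fifty times” explicit (a back-of-the-envelope illustration, not a figure from the sources above), take a midpoint of roughly eighteen hours for the M. tuberculosis doubling time:

\[
\frac{18 \text{ hours} \times 60 \text{ minutes per hour}}{20 \text{ minutes}} = 54 \approx 50
\]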
Even today, tuberculosis virulence isn’t fully understood. However, it is known that unlike most pathogens, including the strep bacterium that nearly killed Anne Miller,* M. tuberculosis doesn’t produce toxins. Instead, the microbe is ridiculously efficient at hijacking the host’s own defenses and transforming them into deadly attackers. Any time the host’s immune alarm bells go off, the immune system summons macrophages, the oversized white blood cells, to the site of infection. The macrophages, whose job it is to engulf and digest foreign objects, form cavities: vacuoles known as phagosomes that surround the invading pathogens. Once the pathogen is surrounded, the macrophage connects the phagosome to the lysosome, a chemical wood chipper that uses more than fifty different enzymes, toxic peptides, and reactive oxygen and nitrogen compounds that can, in theory, turn any organic molecule into mush.
When they attempt this with M. tuberculosis, however, things don’t work out as planned. The bacterium secretes a protein that modifies the phagosome membrane so it can’t fuse with the lysosome. Thus protected, it is able to transform the macrophage from an execution chamber to a comfortable home—one with a well-stocked larder, since another of the pathogen’s talents is the ability to shift from dining on mostly carbohydrates (which is what it eats when grown in a Petri dish) to consuming fatty acids, particularly the cholesterol that is a common component of human cell membranes.
It’s the replication that matters. Within three to eight weeks after breathing an aerosol containing a few hundred M. tuberculosis bacteria, the host’s lymphatic system carries them to the alveoli of the lungs: the tiny air sacs where carbon dioxide is exchanged for oxygen. As M. tuberculosis forms its colonies inside macrophages, they create lesions: calcified areas of the lung and lymph node. Some burst; others form a granuloma—a picket fence of macrophages—around the colony. Within three months, the interior of the granuloma necrotizes, that is, undergoes cellular death. Some of the deaths occur within the lungs, leading to the painful inflammation known as pleurisy, which can last for months. Other infested areas, known as tubercles, break off from the lungs and travel via the bloodstream to other parts of the body, becoming the frequently fatal form of the disease that physicians know as extrapulmonary tuberculosis—Patricia Thomas’s diagnosis. When it settles in the skeletal system, it can cause excruciatingly painful lesions in bones and joints. When it lands in the central nervous system, as tubercular meningitis, it causes the swelling known as hydrocephalus; on the skin, where it’s known as lupus vulgaris, it leaves tender and disfiguring nodules.
And, even when the body’s determined immune system destroys most of the granulomas, they leave behind huge amounts of scar tissue, which weaken the host’s ability to breathe. Bronchial passages are permanently blocked. Frequently, the cells needed for oxygen uptake are so damaged that victims suffocate. Sometimes the deadliest attacks of all are friendly fire. The immune system’s inflammatory response, which evolved to clear out damaged cells and allow rebuilding to follow, can overshoot the mark, especially when confronted with an especially robust (or wily) invader. When it does, histamines and the other compounds that increase blood flow and ease the passage of fluid through cell membranes cause enough fever and swelling to kill hosts as well as pathogens.
Most victims, nonetheless, survive a first bout with tuberculosis, generally because the colonies of M. tuberculosis lack the time to achieve maximum size before the host’s immune system intervenes. The problem, however, is that the disease doesn’t disappear. It stays latent, waiting for the opportunity to start the cycle of replication. One time in ten, the full-blown disease will reappear as secondary tuberculosis, either because of reexposure to the pathogen, poor enough nutrition that the host’s immune system is damaged, or even hormonal changes. The result is that the same array of symptoms attacks a much-weakened host within a few years of the initial infection. Which is what happened to Patricia Thomas.
With the disease’s talent for hijacking the body’s own defenses and capacity for remaining lethal years after it had seemingly been defeated, it’s no surprise that physicians have been battling the white plague for as long as there have been physicians. They haven’t always accurately identified it, or even agreed with one another about any of its characteristics. Hippocrates thought the disease was inherited; Galen that it was contagious. Fifteen centuries later, the debate still raged. Paracelsus, the Swiss physician whose theory of health depended on the proper balance between mercury, sulfur, and salt, couldn’t understand why, if the disease was contagious, so many people in European cities who were exposed to it exhibited no symptoms.
Even after Robert Koch identified the guilty bacterium in 1882, the controversy wasn’t settled. One reason that the belief in tuberculosis as a hereditary condition proved so durable was that environmental conditions can, indeed, affect the likelihood of activating a latent infection. Pure mountain air didn’t cure TB, but it did seem to slow it down.
As a result, sanatoriums, places where patients could cough out their lives in relative comfort, sprang up all over Europe, especially in regions near the mountains or the sea. “Climate cures” built boomtowns throughout the American West, from Albuquerque to Pasadena to Colorado Springs, each of them advertising themselves as the ideal destination for tubercular patients. Edward Trudeau, an American physician who believed tuberculosis to be hereditary, contracted the disease himself in 1873, and built a European-style sanatorium in Saranac Lake, New York, where residents* were confined to bed. Though not bedrooms. Trudeau, convinced of the curative powers of clean mountain air, required his patients to sleep outdoors, even in subzero temperatures. The Saranac Lake model actually became so popular that, in the first decade of the twentieth century, one American in 170 lived in a self-declared sanatorium.
The idea of housing consumptives together would scarcely have made sense absent the conviction that tuberculosis victims weren’t themselves infectious. It was a plausible enough notion. Because the pathogen could establish colonies without—at first—causing much in the way of symptoms, hosts could be exposed to the disease without appearing to contract it. Since living in places where the air was clean (and, more important, the sanitation well managed) seemed to relieve symptoms and even slow the progress of the disease, many physicians thought tuberculosis couldn’t be infectious (or, at least, not mostly so). At a time when the germ theory itself was still very novel, nineteenth-century European and American societies were largely ignorant of the dangers of spreading the disease. One consequence was that it became a trope for the nineteenth century’s literary romantics. Pale heroines languishing beautifully and consumptive children who “appear like fairies to gladden the hearts of parents and friends for a short season” became notorious clichés of Victorian literature. Nor was tuberculosis of interest only to romantics; the residents of the Berghof atop Thomas Mann’s Magic Mountain form a cross-section of early twentieth-century European intelligentsia, united only by metaphysics and obstructed breathing.
Like the European society they symbolized, though, their ailment was undergoing a gigantic transformation. As the germ theory took hold, tuberculosis was transformed from a romantic ailment to a contagious one. Consumptives were no longer regarded as brave symbols of individual suffering, but as a social danger, the modern version of lepers.* Public health campaigns warned everyone to take care around coughing and sneezing, and especially advocated for isolating tubercular patients. By the end of the 1920s, sanatoriums were already being transformed from comfortable resorts for the wealthy to quasi prisons for the poor.
Segregating the infected from the not yet infected was harsh, but nearly the only useful response to the disease. Koch’s tuberculin had been both a failure and a scandal; Colenso Ridgeon’s cure in The Doctor’s Dilemma was a fiction. Shaw’s play asked audiences: if “thirty men . . . found out how to cure consumption . . . Why do people go on dying of it?” Though a tuberculosis vaccine had been developed by the French bacteriologists Albert Calmette and Camille Guérin as early as 1916, and first administered to humans in 1921, it was at best only partly effective.* One popular surgical procedure, the so-called pneumothorax technique, deliberately collapsed an infected lung to allow the lesions caused by the tubercular granulomas to heal. The only characteristic all of these purported cures shared was an almost complete lack of effectiveness. Patients came to sanatoria like Mineral Springs not to be cured, but to be made as comfortable as possible while their bodies repaired themselves, or—more frequently—while they waited to die.
—
On November 15, 1944, at the Mineral Springs Sanatorium, Patricia Thomas became the first patient to receive an injection of a new compound called streptomycin. After a series of injections over the following months, she went home, cured.
Her battle with Mycobacterium tuberculosis ended, but the war over the discovery of streptomycin would rage on, never really subsiding. Truces are broken regularly, whenever advocates for the two scientists at the heart of the quarrel—Selman Abraham Waksman and Albert Schatz—square off. Entire books have been devoted to arguing one side or the other. One of the most prestigious science magazines in the world, Nature, was, for a time, a battlefield, when Schatz’s champion, the microbiologist and historian Milton Wainwright, fought on the page with Waksman’s defender, William Kingston, a professor of business and history.
Here’s what isn’t disputed in this war for credit: Albert Schatz graduated from Rutgers University in New Brunswick, New Jersey, in May 1942 with a degree in soil science, and immediately started graduate work under one of the field’s leading lights, Selman Waksman. Both professor and student specialized in actinomycetes, a suborder of soil-dwelling bacteria with thousands of known members, even in the 1940s.
Their interest was more than simple intellectual curiosity. Ever since the 1920s, actinomycetes, an unusual group of bacteria that exhibit Penicillium-like branching filaments, had been identified as powerful antagonists to other bacteria. This wasn’t really unexpected, since a teaspoon of soil can be home to a billion or more bacteria, which meant a huge number of competitors in the never-ending hunt for the raw materials essential to produce the biomass needed by all life: DNA, RNA, amino acids, fats, and the like. In a nutrient-rich environment like soil, packed with organic material and trace minerals, the competition is fierce, and actinomycetes were already known to be using some very powerful weapons indeed, each one an enthusiastic killer of other bacteria.
In 1939, the French-born microbiologist René Dubos, a former student of Waksman’s then working at the Rockefeller Institute for Medical Research in New York, had isolated two distinct compounds from another soil-dwelling bacterium, Bacillus brevis. The compounds, which Dubos named tyrothricin and gramicidin, were, like penicillin, enthusiastic killers of Gram-positive pathogens. Unfortunately, unlike penicillin, they didn’t do so by weakening the distinctive Gram-positive cell wall. Tyrothricin blocked the synthesis of proteins; gramicidin made cell membranes permeable to salts. Since animal cells have membranes and depend on protein synthesis, both activities made the compounds nearly as deadly to hosts as to pathogens.
Dubos’s discoveries weren’t inconsequential; gramicidin is still prescribed today for infections of the skin and throat. But to treat systemic infections like tuberculosis, a drug must enter the bloodstream. Since neither gramicidin nor tyrothricin could do so safely, their largest contribution to the antibiotic revolution was to hint that soil might be valuable for growing more than just crops. Somewhere in the dirt there had to be something that would kill pathogens while leaving their hosts alive.
The pursuit of a substance with the right balance of destructiveness and discretion was a full-time job at Waksman’s lab at Rutgers. Throughout the 1920s and 1930s, Waksman and his team had been collecting different soils from all over the eastern United States, isolating the varieties of actinomycetes they contained, and then growing them in Petri dishes filled with agar. Once they had a colony, the graduate students working in Waksman’s labs would expose it to another sort of bacteria, and note the results; if the newly introduced bacteria failed to thrive, then that particular colony of actinomycetes had an antibiotic property.
This is the most tedious sort of research, completely lacking in bursts of inventive genius or innovative experimental design. It did not, however, lack for institutional support.
In 1939, Merck & Co. had concluded an agreement with Selman Waksman that provided the lab with an ongoing grant for the study of antibiotics. Merck’s support included money—Waksman had initially been engaged to consult on microbial fermentation for $150 a month; later that year, Merck added another $150 for working on “antibacterial chemotherapeutic agents,” along with experimental animals, and funding for a fellowship in the Rutgers lab. (The investment in the fellowship wasn’t purely altruistic; Waksman’s first fellow, Jackson Foster, later became the director of Merck’s microbiological lab.) Waksman would do the research, and in return for the funding, and for handling “production, purification . . . and to arrange for clinical trials,” Merck would own the patents from any resulting research, and would pay a royalty of 2.5 percent of net sales to the Rutgers Endowment Foundation, a nonprofit charity originally established to solicit donations from alumni.
For its first year or two, Merck’s investment hadn’t paid much in the way of dividends. However, at the beginning of 1940, Waksman reacted to news of the progress of Florey’s team by saying, “These Englishmen have discovered [what] a mold can do. I know the actinomycetes can do better,” and proved as good as his word. By the spring of 1941, he had presented his patrons at Merck with the first fruits of his actinomycete farm: the antibacterial compounds clavacin, actinomycin,* and streptothricin. The harvest was promising if not spectacular. All three compounds were effective, but toxic; streptothricin, in particular, was frustratingly able to kill a variety of Gram-negative pathogens in mice, but had the unfortunate side effect of destroying kidney function in the four human volunteers on whom it was—prematurely, not to say irresponsibly—tested.
Waksman was undaunted. It was around this time that he coined the word by which this variety of antibacterial drugs would henceforth be known: antibiotic, which he defined as “a chemical substance produced by microorganisms”—i.e., penicillin, but not Salvarsan—“which has the capacity to inhibit the growth of and even destroy bacteria and other organisms.”
The problem was finding the antibacterial needle in the enormous actinomycete haystack. Years later, Waksman would tell people, “We isolated one hundred thousand strains of streptomycetes [as actinomycetes were then known]. Ten thousand were active on agar media, one thousand were active in broth culture, one hundred were active in animals, ten had activity against experimental TB, and one turned out to produce streptomycin.” Though the numbers are casual approximations, the technique was essentially that: Throw lots of actinomycetes up against the wall, and see which ones stuck.*
Which is how Schatz spent his days, from May 1942, when he joined Waksman’s lab, to November, when he was drafted to serve in an Army Air Forces medical detachment in Florida. It’s what he did during his off hours in Florida, which he spent finding, and sending, different soil samples back to Waksman’s lab in New Brunswick. And it’s what occupied him after he was given a medical discharge, in June 1943, and returned to Waksman’s lab.
Credit: Rutgers University
Albert Schatz (1920–2005) and Selman Waksman (1888–1973)
He did so as one of the few buck privates in the United States Army who took a cut in pay returning to civilian life. Private Schatz had been earning fifty dollars a month while in the service, along with free housing, food, and clothing. As a civilian PhD candidate, he performed the (literally) dirty work of analyzing soil samples for even less: forty dollars a month, which was well below the minimum wage for a forty-hour week. And Schatz, like PhD candidates then and now, worked a lot more than forty hours. By his own, understandably aggrieved, account, “During the four month interval between June and October, 1943, I worked day and night, and often slept in the laboratory. I prepared my own media and washed and sterilized the glassware I used.” He even cadged his meals from the stuff grown in the university’s agricultural college. He didn’t do so without complaint. But he did it, convinced that actinomycetes held the key to a yet-to-be-discovered class of pathogen killers.
And antibiotics like streptothricin, though too toxic for humans, did kill Gram-negative bacteria. The dissertation that Schatz was researching on a salary of ten dollars a week was explicit: “Two problems, therefore, appeared to be of sufficient interest to warrant investigation; namely . . . a search for an antibiotic agent possessing . . . activity . . . against Gram-negative eubacteria . . . and a search for a specific antimycobacterial agent.” The sulfa drugs and penicillin were both enzyme blockers, the first inhibiting the ones bacteria needed to synthesize an essential B vitamin, the second blocking the enzymes needed to assemble the giant molecule of amino acids and sugars that makes up the cell walls of the Gram-positive pathogens streptococci, staphylococci, and clostridia. But Gram-negative bacteria, with their very different cell walls, were largely beyond penicillin’s reach, and they include some of the most prolific killers in human history: Yersinia pestis, the bacterium that causes bubonic plague, and Vibrio cholerae, which causes cholera.*
Schatz’s phrase “specific antimycobacterial agent” was a euphemism for “TB killer.” It was also a red flag for Waksman, Schatz’s boss and thesis advisor, who recognized that his soil science lab wasn’t a secure research facility for something as dangerous as the white death bacillus. Even so, he agreed to support Schatz’s research, with the proviso that it be conducted in an isolated basement that he turned over to his graduate student, either out of an exaggerated fear of a tuberculosis outbreak—Schatz’s recollection—or a perfectly reasonable exercise of caution, since the lab lacked what were, even in 1943, state-of-the-art defenses against disease outbreaks: no ultraviolet lights that could be used to kill dangerous microbes; no negative-pressure fans and air filters to keep them from escaping.
Whatever the reason, Schatz’s “exile” (his word) produced results. On October 19, 1943, after exposing hundreds of actinomycetes from different soils to the most virulent of tuberculosis bacilli, a variant of M. tuberculosis designated H-37, he found two that were antagonistic, one taken directly from the soil, the other swabbed from the throat of a chicken. Both were variants of a bacterium known since 1914 as Actinomyces griseus, soon reclassified as Streptomyces griseus. No one knows who named the substance they produced; the first mention of streptomycin in print was in a letter from Selman Waksman to Randolph Major at Merck.
Here was the first step in finding a practical antibiotic. The next one, which Florey’s team at Oxford had faced three years before, was running a series of trials to find if it worked not just in vitro, but in vivo—in living, breathing animals. As with the first penicillin extracts, this required streptomycin to be produced in quantity. A sufficient amount for one-off experiments could be distilled in Waksman’s New Jersey lab. Or it could be, as long as Schatz was willing to be Waksman’s Norman Heatley. He set up a similarly makeshift system in his basement lab; to prevent the liquid from boiling away while he slept, the night porter at Waksman’s lab awakened him whenever the liquid fell below a red line Schatz had drawn on the distillation flasks.
However, testing Schatz’s new compound for effectiveness in vivo needed a more sophisticated facility than the young biologist’s basement lab could provide. And in 1944, there was no research hospital in America more sophisticated than the Mayo Clinic. Its founder, an English émigré doctor named William Worrall Mayo, had arrived in the United States in 1846 and practiced what was literally frontier medicine; he settled in Rochester, Minnesota, in 1864, when he joined the U.S. Army Medical Corps as a member of the state’s enrollment board, which examined recruits for the Union army.
From the time Mayo’s sons William J. and Charles H. joined him, the clinic was leading the transition from medicine as art to medicine as science. And, as the scientific advances of the nineteenth century were, by the beginning of the twentieth, almost entirely the result of collaboration between specialists, so too was medicine at the Mayo Clinic. In 1889, the Mayos joined with the Sisters of St. Francis to build Saint Mary’s Hospital, and William J. Mayo would later write, “It has become necessary to develop medicine as a cooperative science: the clinician, the specialist, and the laboratory workers uniting for the good of the patient. . . . Individualism in medicine can no longer exist.” When William Mayo first expressed this sentiment about “cooperative science,” he was really writing about a perceived deficiency in clinical practice, not medical research; he wanted to apply the best features of the latter to the former, which, in part, explains the decision to reconfigure the clinic as a not-for-profit charity in 1919. Now, staff would be salaried, not contract, physicians. This, the Mayos believed, would encourage collaboration between specialized researchers and clinicians, who would no longer run the risk of financial penalties for enlisting the best research in their practices.
This forward-thinking philosophy would transform the Mayo Clinic into a world-famous research laboratory. The transformation was already complete when the top veterinary pathologist in the country, William Feldman, joined its Institute of Experimental Medicine in 1927. His research partner, Corwin Hinshaw, had an even more unusual background: Before receiving his medical degree in 1933 from the University of Pennsylvania, he had already done graduate work in zoology, bacteriology, and parasitology. What the two had in common was an interest in lung disease, particularly bovine, avian, and especially human tuberculosis.
By the middle of 1943, Feldman and Hinshaw had performed experiments testing a variety of compounds, including several of the sulfa drugs and a related class of drug known as the sulfones, against tuberculosis in guinea pigs. Sulfones had shown some promise in treating another mycobacterium-caused disease—leprosy—and the Mayo team reasoned, eo ipso, that they might be effective against tuberculosis. They were to be disappointed; the curative powers of antileprosy medications—Promin (from Abbott Laboratories) and Promizole (from Parke-Davis)—against M. tuberculosis weren’t completely absent, but nearly so.
Far more encouraging was Selman Waksman’s work with actinomycetes. After reading his papers on streptothricin, Feldman visited Rutgers in November 1943, and encouraged Waksman to keep the Mayo team in mind for anything that showed streptothricin’s antibacterial effectiveness without its very high cost in toxic side effects.
In February 1944, Feldman and Hinshaw received an advance copy of Schatz’s first streptomycin paper. Though the paper listed twenty-two different bacteria that were either killed or inhibited in vitro by the newly discovered compound, they saw only one: M. tuberculosis. The following month, Waksman wrote a letter to Feldman asking whether he and Hinshaw would be able to perform clinical tests in vivo on the drug.
It took Schatz five weeks to distill the 10 grams that the Mayo team requested, but by the end of April, a batch of streptomycin was on its way from New Brunswick to Rochester. On April 27, Hinshaw and Feldman began testing it on a dozen different pathogens they successively injected into four very unlucky guinea pigs.
The results were more than encouraging. Streptomycin was effective against nearly all of them: bubonic plague, tularemia, even the intestinal disease known as shigellosis. Most important: By the time the tests were completed on June 20, the preliminary results were almost too good to be true. Streptomycin cured tuberculosis.
At least, it cured the disease in four rodents. To know the compound’s real effectiveness, the test would need to be replicated as a proper experiment, with more subjects and a like number of controls: guinea pigs who would be given the disease, but not the treatment. For that second, crucial bit, considerably more streptomycin was needed than Schatz and his Heatley-like factory could dream of producing. On July 9, Hinshaw and Feldman arrived in New Jersey, this time so that Selman Waksman could introduce them to his patrons from Merck at their Rahway lab.
Feldman and Hinshaw had, perhaps overoptimistically, already selected and infected twenty-five guinea pigs before leaving Rochester, but their arguments failed to persuade Merck’s team. The pharmaceutical company’s chemists and pharmacologists were extremely dubious about their ability to produce the quantities needed for a proper trial. And even if they could, they questioned the wisdom of allocating resources to an unproven drug when the need for more and better penicillin production was clearly a greater national priority. Merck—along with the other sixteen companies that the War Department had enlisted in the penicillin project—was fully committed to the Florey-Chain-Heatley process. Its manufacturing facilities (and, particularly, its fermentation vats, which would be essential for producing large quantities of either penicillin or streptomycin) were working three shifts a day to produce the penicillin demanded by the war effort. This was only sensible, since the most common infections resulting from the wounds to American soldiers, sailors, and airmen were caused by Gram-positive bacteria: staphylococci, streptococci, enterococci, and especially clostridia: C. tetani, the agent of tetanus toxemia, and C. perfringens, which caused gas gangrene. Penicillin killed them. Streptomycin—even if it worked—wouldn’t.
At that moment, it must have seemed to Waksman, Feldman, and Hinshaw (and even the absent Schatz) that their research was being sidetracked just when they were tantalizingly close to a breakthrough. Then George Merck joined the meeting. Merck’s chief executive knew, better than anyone in the room, the importance of the penicillin project to both the company and the war effort. He also possessed, more than anyone in the room, a vision of the future of drug development. And he alone had the executive authority to decide which was more important. Merck overruled his scientific staff and agreed to invest in a production line for streptomycin in the plant in Rahway; a month later, the company broke ground on a brand-new $3.5 million facility in Elkton, Virginia.
Even better: He directed Randolph Major, Merck’s research director, to assign fifty researchers to the project. Major knew precisely who he wanted to head the team. Ten years before, he had hired two Ivy League–trained chemists, Max Tishler and Karl Folkers. He assigned Tishler to work on a series of challenging problems in chemical synthesis—vitamin B2, cortisone, and ascorbic acid—while Folkers spent two years studying the poisonous alkaloids collectively known as curare as a possible anesthetic.* Both were put on the penicillin project in 1943, and were working on it seven days a week when Major reassigned them to streptomycin. Folkers became the company’s head of research, Tishler its head of development.
Tishler and Folkers were both children of immigrant families who received the most prestigious and rigorous graduate education available in the United States—Tishler was awarded his PhD from Harvard’s chemistry department, where he studied under the soon-to-be president of the university James Bryant Conant; Folkers did his postdoctoral work at Yale. Both joined Merck in the 1930s precisely because the company’s brand-new research lab promised the kind of support available nowhere else. Folkers, in particular, chose Merck over a higher-paying job working for General Electric because his “lab” at GE was more like a storeroom “with some chicken wire to separate it from the rest of the area.”
Working together as a team for the first time on streptomycin, Tishler and Folkers got on brilliantly, both with one another and with Waksman, whom Tishler remembered forty years later as an “extremely imaginative, able, wonderful scientist, and a very dedicated and prolific writer. . . . He was probably the best living scientist of the soil; no one has approached his expertise since then.” No one was better suited by training and temperament to get a sufficient quantity of streptomycin from New Jersey to the eagerly awaiting Hinshaw and Feldman.
By mid-July 1944, they had received enough streptomycin to begin their world-changing experiment. The twenty-five infected guinea pigs were injected with streptomycin every six hours for sixty-one days. Twenty-four control animals were not. The results, even now, are startlingly obvious. After forty-nine days, eighteen of the control animals showed tubercular nodes in their lungs; only one of the treatment animals did. Eight control animals had tuberculosis in their livers; none of the treatment animals did. And seventeen of the untreated animals—71 percent—had died. Of the twenty-five guinea pigs given streptomycin, twenty-three had survived to the end of the experiment.*
Waksman, Schatz, Hinshaw, Feldman, Folkers, and Tishler had their answer. Another wonder drug, one that seemed to be in every way as transformative as penicillin, had been identified, isolated, and tested.
Who would get to tell the world?
Streptomycin, like penicillin before, and tetracycline and others after, was the work of dozens of highly trained and ambitious professionals, and their motivations aren’t easily categorized. For some, it’s all about the pleasures of puzzle solving: Howard Florey famously said, “People sometimes think that I and the others worked on penicillin because we were interested in suffering humanity. I don’t think it ever crossed our minds about suffering humanity. This was an interesting scientific exercise, and because it was of some use in medicine is very gratifying, but this was not the reason that we started working on it. . . .” For others, though, Florey included, recognition was the thing. Scientists dream of the undying fame that comes with Copley Medals and Nobel Prizes, but even at more modest levels, authoring a major paper is the key to academic status and even employment. This was true even outside of universities; out of a shared conviction, promoted by Major, that Merck needed to be preserved as an attractive place for academics to work, Tishler and Folkers developed a protocol whereby Merck scientists were allowed—even encouraged—to publish results any time after a patent was filed, rather than waiting for the patent to be issued. And, let it not be said that scientists, even those not employed by profit-making enterprises like Merck, are immune to financial attractions, both from grants and the patent revenues whose loss had so enraged Ernst Chain.
What all these forms of compensation share, though, is the premium they place on priority: Recognition, prizes, grants, and patents are rewards only for first-place finishers. The problem with the news about streptomycin was that there were two important discoveries: first, the Rutgers discovery that the compound produced by S. griseus was effective against Gram-negative bacteria in vitro; and second, the Mayo results that it worked just as well in vivo. Complicating matters was the fact that Feldman and Hinshaw, because of their affiliation, were able to guarantee that their own paper could appear in the Proceedings of the Mayo Clinic—a prestigious, peer-reviewed journal—within weeks of submission, while Schatz and Waksman were still awaiting a date from the Proceedings of the Society for Experimental Biology and Medicine.
It was a polite disagreement, but a nontrivial one, and it required a third party to adjudicate it. Waksman persuaded Randolph Major to—politely—dissuade the Mayo team from publishing until the Rutgers streptomycin paper could appear. Since Merck had a long-standing relationship with Waksman (to say nothing of an interest in controlling the public release of information about a potential new miracle drug until it was closer to being produced in quantity), Major was happy to tell Feldman that while he “quite understood” his eagerness to publish, “You might care to wait until the publication of [Waksman’s] initial in vitro results.”
Major—and, presumably, his boss, George Merck—didn’t care about publishing priority. But he did have a powerful regard for the company’s relationship with Selman Waksman’s lab, which was starting to look like one of the most important research investments the company had ever made. Waksman knew this, and made his case to his corporate patrons accordingly: He needed to publish first.
It was at roughly the same time that Waksman performed an even more unlikely act of salesmanship: persuading George Merck to give up the patent on streptomycin.
Because of the 1939 agreement, Merck & Co. owned the patent on a potential super antibiotic, while penicillin could be produced by anyone as a generic drug. So the fact that Merck did, in fact, transfer ownership of the patent to the Rutgers Endowment Foundation seems, on the face of it, mystifying. Though George Merck famously, and sincerely, believed that people came before profits, he was also the CEO of a for-profit company that was certainly capable of providing streptomycin to millions of people; and since someone was going to own the patent, why not Merck?
The best guess is that, like penicillin (and so much else), the fate of streptomycin was intimately tied up with the overarching strategic objective of the time: winning the Second World War.
Even before it had become a shooting war, the nations that would become the Allied powers had been worrying—probably excessively—about the potential development of biological weapons by the Axis. It was widely believed in London and Washington that Japanese agents had tried to buy yellow fever virus as early as 1939. Other intelligence reports suggested that German biologists were secretly teaching their Japanese partners how to use anthrax and typhus militarily. The Swiss had reportedly told the United Kingdom that Nazi Germany had plans to use “bacteria of every description” in any coming war.
The identities of everyone in the United States who was privy to this intelligence aren’t known, but one was certainly George Merck, the head of the War Research Service.
As documented by the journalist and author Peter Pringle, and substantiated by the National Academy of Sciences committee on biological warfare, and the War Research Service, Merck had met with the Office of Strategic Services, a wartime intelligence agency, in 1943 in response to a request that he recommend biological weapons for use in covert operations behind enemy lines. But his more important duty was to protect the Allies against any likely biological attack; and it seems at least plausible that he decided that his responsibility to his nation trumped the one he had as the head of his family corporation. If streptomycin was a potential defense against German or Japanese biological warfare, it was imperative that production be scaled up immediately.* By the end of 1944, he agreed to forgo the monopoly granted by the 1939 agreement with Selman Waksman; within two years, six companies—Abbott, Lilly, Merck, Pfizer, Squibb, and Upjohn—were all manufacturing streptomycin.
Whatever Merck’s motives, his decision freed Waksman to secure the patent rights on behalf of Rutgers, and, on February 9, 1945, Schatz and Waksman jointly applied for a patent on streptomycin, swearing under oath that they were codiscoverers of the new drug. Three months before, Merck-manufactured streptomycin had been used, for the first time, on a human being: Patricia Thomas.
Thomas had spent more than a year at Mineral Springs, her symptoms gradually worsening, especially in her right lung. In October 1944, her physician, Dr. Karl H. Pfuetze, sent her to the Mayo Clinic for a consultation and examination by Corwin Hinshaw, who immediately started her on the course of streptomycin that would save her life. Over the next five months, she received five courses comprising multiple doses . . . doses that, in the absence of any prior human testing, were best-guess estimates by her physicians. Mayo Clinic surgeons operated on her nearly useless right lung, excising the diseased section and restoring most of its function. In the summer of 1945, she returned home, where she would marry and have three children. Streptomycin saved her life, but she was still permanently weakened by the disease, and would die in June 1966, at only forty-two years old.
For Waksman, 1946 was an annus triumphalis, the first of many. He traveled to Europe, where he was presented with the first of what would become twenty-two honorary doctoral degrees and sixty-seven prizes and awards, including the Lasker Award, the Trudeau Prize, and the Nobel Prize in Physiology or Medicine. Also, in 1946, on May 3, Waksman and Schatz both assigned their pending patent* to the Rutgers Foundation (by then the Rutgers Research and Endowment Foundation) in return for one dollar apiece.
Schatz, who had by now completed his PhD dissertation, watched from the sidelines as Selman Waksman became a national hero. In 1948, a dozen newspapers hailed Waksman as the discoverer of the new wonder drug; in April 1949, Time profiled him as the model of a humble scientist in an article entitled “Man of the Soil.” In November, it put him on the magazine’s cover, accompanied by the folksy reminder that “the remedies are in our own backyards.” Schatz was unmentioned in either article. He registered his concern in a letter to his onetime boss. Waksman’s reply deserves quoting in detail:
You know quite well that we gave you all the credit that any student can ever hope to obtain for the contribution that you have made to the discovery of streptomycin. You know quite well that the methods for the isolation of streptomycin had been worked out in our laboratory completely long before your return from the army, namely for streptothricin.
Unsurprisingly, this failed to mollify Schatz. Nor did a subsequent letter in which Waksman told his onetime grad student and co-patent holder:
You must, therefore, be fully aware of the fact that your own share in the solution of the streptomycin problem was only a small one. You were one of many cogs in a great wheel in the study of antibiotics in this laboratory. There were a large number of graduate students and assistants who helped me in this work; they were my tools, my hands, if you please.
The “man of the soil” had dug one shovel too deep. In March 1950, Schatz filed suit in federal court.
More than reputation was at stake. In the intervening years, Schatz had learned—it’s unclear how—that the 1946 assignment of patent rights to the Rutgers Foundation, for which both Schatz and Waksman agreed to accept a single dollar in compensation, wasn’t the only document signed by Waksman that month. He had, in addition, signed an agreement with the foundation that provided him with 20 percent of the royalties they would henceforth receive for licensing the rights to manufacture streptomycin. Even better (or worse), Schatz discovered that the agreement between Waksman and the foundation was contingent on Waksman persuading Schatz to sign away his rights. For a dollar.
By 1950, Waksman’s 20 percent had paid him approximately $350,000.
Though Waksman attempted a defense against the suit, he was severely handicapped by the documents he had himself signed—not merely the patent application, with its solemn oaths that he and Schatz were codiscoverers, but the original papers in which the discovery of streptomycin was described. Back in 1944, two articles had appeared in the Proceedings of the Society for Experimental Biology and Medicine entitled, respectively, “Streptomycin: A Substance Exhibiting Antibiotic Activity Against Gram-Positive and Gram-Negative Bacteria” and “Effect of Streptomycin and Other Antibiotic Substances upon Mycobacterium tuberculosis and Related Organisms.” The authors were Albert Schatz, Elizabeth Bugie (another of Waksman’s graduate students), and Selman Waksman . . . in that order. And while Waksman later asserted that it was his policy to give students first position on papers to help their careers, he did so only twice in his entire publishing life. Both were the streptomycin papers he had coauthored with Schatz.
In December, the case was settled without trial. Schatz was granted 3 percent of the foundation’s streptomycin royalties, which would amount to about $12,000 a year, and a one-time payment of $125,000 for foreign patent rights. Waksman was granted 10 percent (and 7 percent was granted to everyone else who had worked in the lab during the key months of 1943, up to and including the dishwasher).
If money had been at the heart of the dispute, the settlement should have, well, settled things. Had the judges who awarded the Nobel Prize in Physiology or Medicine in 1952 recognized Schatz as well as their honoree, Waksman, it might have done so. Or if Waksman had, in his Nobel Lecture, done more than mention Schatz, in a single sentence, as one of twenty different lab assistants. But probably not. To his dying day, Schatz refused to recognize the fact that Waksman was a far more accomplished scientist, one who had made a dozen different landmark discoveries both before and after Schatz’s tenure in the lab. In 1949, Waksman even discovered another actinomycete-derived antibiotic, neomycin. Moreover, though Schatz spent decades insisting he was academically marginalized for pursuing his just rights, it’s not hard to see why other biology departments were wary of hiring someone who had attacked his thesis advisor in print, in court, and even, astonishingly, by sending an open letter to the king of Sweden in an attempt to sabotage the Nobel Prize ceremony. In Selman Waksman’s mind, on the other hand, the great achievement was entirely the result of an experimental machine he had designed and built decades before Schatz arrived in New Brunswick—and Albert Schatz was, therefore, an easily replaceable component.
And there lies the real misunderstanding. In a letter written on February 8, 1949, Waksman—again turning up the dial on his indignation—wrote to Schatz, “How dare you now present yourself as innocent of what transpired when you know full well that you had nothing to do with the practical development of streptomycin” (emphasis added). Unwittingly, Waksman had revealed the underlying truth of the scientific discovery; and it didn’t serve either Schatz or himself. He was correct that Schatz had little to do with the “practical development” of the new antibiotic. But neither had he. William Feldman and Corwin Hinshaw at Mayo had more to do with demonstrating the therapeutic value of streptomycin than anyone at Rutgers. So had Karl Folkers and Max Tishler and the more than fifty researchers assigned by George Merck to supervise the streptomycin project. The discovery of streptomycin was a collective accomplishment dependent on dozens, if not hundreds, of chemists, biologists, soil scientists, and pathologists.
And one statistician.
—
Streptomycin was a miracle for those suffering from tuberculosis and other bacterial diseases unaffected by penicillin. It was iconic—as much, in its way, as penicillin—in demonstrating the way industrial support could turbocharge university research. But it achieved its most consequential result as the subject of the most important clinical trial in medical history: the one that, for the first time, and forever after, quantified the art of healing.
Though it is frequently described as such, the streptomycin trial of 1948 to 1950 wasn’t anything like medicine’s first clinical trial. The Bible’s Book of Daniel records King Nebuchadnezzar’s more or less accidental test of two different diets—one, by royal commandment, only meat; the other vegetarian—over ten days, after which the perceived health of the vegetarians persuaded the king to alter his required diet. The sixteenth-century French surgeon Ambroise Paré tested two different treatments for wounds—one, a noxious-sounding compound made of egg yolks, rose oil, and turpentine; the other of boiling oil, which was even worse. Two centuries later, the Scottish physician James Lind “selected twelve patients [with] the putrid gums, the spots and lassitude, with weakness of the knees” typical of the deficiency disease scurvy, and gave two of them a daily dose of a “quart of cyder,” two of them sulfuric acid, two vinegar, two seawater, two a paste of honey. The final two, given oranges and lemons, were marked fit for duty after six days, thus clearly demonstrating how to prevent the disease, a scourge of long maritime voyages.* Even fictional characters got into the act. The climax of Sinclair Lewis’s 1925 novel, Arrowsmith, takes place on the fictional Caribbean island of St. Hubert’s, where the eponymous hero gives half the population of the parish of St. Swithin’s a vaguely described antiplague serum—a “phage”—and the other half nothing at all.
Nor was the streptomycin trial the first time medicine had recognized the importance of sampling. The idea of sampling—choosing a test population so that it reflects the characteristics of the entire universe of similar people, and large enough that conclusions wouldn’t be confounded by a tiny number of outliers—was, in some sense, centuries old; the seventeenth-century Flemish physician and chemist Jan Baptist van Helmont had dared his fellow doctors to compare their techniques with his, proposing:
Let us take out of the hospitals . . . 200 or 500 poor People that have Fevers, Pleurisies, etc. Let us divide them into halfes, let us cast lots, that one of them may fall to my share, and the other to yours [and] we shall see how many funerals both of us shall have. . . .
William Farr, compiler of abstracts in Britain’s General Register Office, first identified the huge difference in childhood mortality between rich and poor, leading to thought experiments on sanitation. And the pioneers of mathematical statistics, Karl Pearson and Ronald Fisher, had applied techniques like analysis of variance and regression to a variety of health-related subjects, such as height and blood pressure. In 1943, Britain’s Medical Research Council actually funded a comparative trial to see if patulin, an extract of the mold Penicillium patulum, could cure the common cold (it couldn’t). During the first four decades of the twentieth century, more than one hundred papers cited so-called alternate allocation studies, in which every other patient admitted to a hospital or clinic was given a treatment, and the others were used as a control. But before streptomycin, the perceived efficacy of any medical treatment remained largely anecdotal: the accumulated experience of clinicians.
This is the sort of thing that works just fine to identify really dramatic innovations, like the smallpox vaccine, or the sulfanilamides, and certainly penicillin. But most advances are incremental; this is true not just for medical innovations, but all technological progress. And identifying small increments of improvement isn’t a job for clinicians. Physicians, no matter how well trained in treating disease, are just as vulnerable as anyone else to cognitive biases: those hiccups of irrationality that give an excessive level of significance to the first bit of information we acquire, or the most recent, or, most common of all, the one we hope will be true.* Medicine needs statisticians.
It was a lucky coincidence that the most influential medical statistician of the twentieth century had a special interest in tuberculosis. Austin Bradford Hill, professor of medical statistics at the London School of Hygiene and Tropical Medicine—like Selman Waksman, a nonphysician—had spent the First World War as a Royal Navy pilot in the Mediterranean, where, in 1918, he acquired pulmonary tuberculosis. Treated with bed rest and the dangerous, unproven, but nonetheless popular technique of deliberately collapsing his lung, he somehow recovered, and was granted a full disability pension, which he would collect for more than seventy years, until his death in 1991.
Hill’s personal connection with tuberculosis wasn’t the only reason that streptomycin was the ideal candidate for the then-revolutionary idea of randomizing a population of patients and carefully comparing outcomes between those who received a particular treatment and those who didn’t. Because tuberculosis was so variable in its symptomology—most people who have M. tuberculosis are asymptomatic; some will have a chronic ailment for years; and others die within weeks—it was, even more than most diseases, subject to the “most people get well” problem. And, as all those people who flocked to sanatoria proved, tuberculosis was highly susceptible to environmental differences: People really did do better at Saranac Lake than in cities, where clean air and water were still rare. Streptomycin didn’t cure tuberculosis the way penicillin cured staph infections; before Patricia Thomas left Mineral Springs, she received five courses of treatment over more than four months. Teasing out the best practices for a treatment that worked that slowly—slower than some diseases resolve in the absence of any treatment at all—really demanded sophisticated mathematics.
Despite the obvious need for statistical analysis of tuberculosis treatments, Bradford Hill had another obstacle to overcome in persuading Britain’s Tuberculosis Trials Committee, which he had joined in 1946, to fund a randomized trial to evaluate streptomycin: ethics.
Such a trial would mean denying what was widely (and correctly) believed to be a lifesaving drug to the members of a control group. But without a control group, it would be impossible to arrive at a definitive conclusion about the treatment group. This was an especially thorny issue in 1946, as the Nuremberg Military Tribunals were, at that very moment, revealing the horrific results of human experimentation in Nazi Germany, where patients were frequently denied treatment in the name of science.* Letting chance decide which tuberculosis patients got the miracle drug, and which would serve as a control group, would probably have been an insuperable problem, but for one thing: There just wasn’t enough of the drug to go around. A significant number of tuberculosis patients would be denied treatment in any case. As Hill subsequently wrote:
We had exhausted our supply of dollars in the war and the Treasury was adamant we could only have a very small amount of streptomycin . . . in this situation it would not be immoral to do a trial—it would be immoral not to, since the opportunity would never come again. . . .
The Medical Research Council’s Trial of Streptomycin for Pulmonary Tuberculosis began early in 1947 with 107 patients in three London hospitals. All of them were young, and had recently acquired severe tuberculosis in both lungs. Fifty-five of them were assigned to the treatment group, which would be given streptomycin and bed rest, while the fifty-two members of the control group would receive bed rest and a placebo: a neutral pill or injection indistinguishable to the patient from the compound being tested. The two groups were selected entirely at random, based on numbers chosen blindly by the investigator and placed in sealed envelopes, thus assuring that no selection bias would occur unconsciously. Nor were the patients themselves told whether they were receiving streptomycin or a placebo.
Hill insisted that results of the test wouldn’t depend on clinical evaluation alone, but on changes in chest X-rays, which would be examined by radiologists unaware of whether the subject had been in the treatment or the control group. The exercise was therefore a real beauty: a randomized, triple-blind (neither patients, nor clinicians, nor evaluators were told who had received the treatment) clinical trial, the first in the history of medicine.
The results were just as compelling as the methodology, though in an unexpected way. After six months, twenty-eight patients on the streptomycin regimen had improved, and only four had died. The control group, meanwhile, had lost fourteen of its fifty-two patients.
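The gap was wide enough that a modern significance test only confirms what the raw counts suggest. What follows is a minimal, hedged sketch in Python, not anything Hill's committee actually computed: it applies SciPy's Fisher exact test (an assumption of the sketch, as are the variable names) to the six-month mortality figures reported above.

```python
# Minimal illustrative sketch: is a mortality gap of 4 deaths in 55 patients
# versus 14 deaths in 52 likely to be a chance fluctuation? (Figures are the
# six-month results quoted above; the use of SciPy and of this particular
# test are assumptions of the sketch.)
from scipy.stats import fisher_exact

#                     died  survived
streptomycin_arm = [   4,     51]   # 55 patients: streptomycin plus bed rest
control_arm      = [  14,     38]   # 52 patients: the control group

odds_ratio, p_value = fisher_exact([streptomycin_arm, control_arm])

print(f"odds ratio: {odds_ratio:.2f}")      # well below 1: fewer deaths on streptomycin
print(f"two-sided p-value: {p_value:.4f}")  # comes out well under the conventional 0.05
```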
But while streptomycin “worked,” the rigor that Hill had built into the experiment’s design revealed streptomycin’s weaknesses just as clearly as its strengths, even for treating tuberculosis. Because it took months for the treatment to show a statistically demonstrable effect, some of the bacteria were certain to develop resistance to the therapy while it was still going on. And, indeed, when the subjects were revisited three years later, thirty-five of the members of the control group had died . . . but so had thirty-two who had received the treatment.
The experiment threw a massive dose of reality-based cold water on the enthusiasm of the first clinical reports. Something more than streptomycin was clearly required to redeem the drug’s initial promise. Luckily, Hill had a good idea what the “something more” should be. In 1940, a Danish physician, Jörgen Lehmann, had reasoned that, since acetyl salicylic acid, the compound better known as aspirin, increased the oxygen consumption of M. tuberculosis, its chemical cousin—para-aminosalicylic acid, or PAS—might act to inhibit oxygen consumption, thus killing (or at least disabling) the aerobic bacterium. It was a decent enough theory, and in 1946 Lehmann had published an article in the Lancet with his results, which were modest enough. By itself, PAS was only a marginally effective treatment for tuberculosis. But because its mechanism worked to inhibit oxygen consumption by the bacterium, the logic went, it strengthened the action of streptomycin, which needs oxygen to enter the bacterial cells.*
In his second trial, beginning in December 1948, Hill duplicated exactly the same experimental structure—same randomization, same X-ray evaluation—as his first. This time, however, he added Lehmann’s oxygen inhibitor to the treatment group. Less than a year later, the power of what would come to be known as RCT, for randomized controlled trials, was vindicated. The Medical Research Council announced that they had shown “unequivocally that the combination of PAS with streptomycin considerably reduces the risk of the development of streptomycin-resistant strains of tubercle bacilli.” The three-year survival rate using the combination of the two drugs was an almost unbelievable 80 percent.*
In less than three years, penicillin and streptomycin had achieved more victories in the battle against infectious disease than anything in the entire history of medicine since Galen. Both were unprecedentedly powerful weapons against pathogens; but it was streptomycin that revealed a method for finding more of the same: the combination of Selman Waksman’s protocol for finding antibacterial needles in haystacks made of soil, and Bradford Hill’s arithmetic for revealing their clinical value.