Emerging Epidemics: The Menace of New Infections

by Madeline Drexler

Paperback (Revised)

$17.00 

Overview

As timely as it is urgent, this well-researched book from veteran science journalist Madeline Drexler delivers a compelling report about today's most ominous infectious disease threats. Focusing on a different danger in each chapter, from the looming risk of lethal influenza to in-depth information on the public health perils posed by bioterrorism, Drexler takes readers straight to the front lines, where scientists are racing to catch nearly invisible adversaries superior in speed and guile. Drawing on a powerful combination of fresh research and surprising history, she warns us that the most ceaselessly creative bioterrorist is still Mother Nature, whose microbial operatives are all around us, ready to pounce whenever conditions are right.

Product Details

ISBN-13: 9780143117179
Publisher: Penguin Publishing Group
Publication date: 12/23/2009
Edition description: Revised
Pages: 336
Product dimensions: 5.30(w) x 7.90(h) x 0.90(d)
Age Range: 18 Years

About the Author

Madeline Drexler is a science and medical journalist. Formerly a medical columnist for The Boston Globe Magazine, she was a 1996-1997 Knight Science Journalism Fellow at MIT. She was awarded the 1992 International Biomedical Science Journalism Prize by the General Motors Cancer Research Foundation. Her articles have appeared in The New York Times, The American Prospect, Self, Good Housekeeping, and many other national publications.

Read an Excerpt

Bioterror



In the taxonomy of human warfare, artillery that pounds, pierces, cuts, blasts, or burns—even in the unexpected form of a Boeing 767—is deemed conventional. However horrific the events of September 11, 2001, what is considered unconventional, even beyond the pale, is attack via disease—which came just a few weeks later.

The autumn 2001 wave of anthrax cases—spread by spores that infiltrated post office sorting machines and mail storage areas, then government and executive offices and private homes, and eventually the lungs and skin and blood of defenseless victims—illustrated a danger about which terrorism experts had long warned: to those inclined to use them, viruses and bacteria make ingenious instruments of destruction. Unlike nuclear or chemical weapons, they self-propagate and adapt. They jump continents with ease. And a small amount can cause vast damage; after World War II, for instance, the U.S. Army concocted a botulinum toxin so potent that a pound, if expertly dispersed, could kill a billion people.

Chemical weapons are poisonous substances that are inorganic and manmade. Biowarfare agents, by contrast, are living organisms, or toxins secreted by living organisms, that can be used against people, animals, or crops. When TNT or chemical bombs explode, emergency crews rush to the scene. In a bioterrorist assault, health officials—as we have so dismayingly witnessed—may not notice until too late. Because pathogens require an incubation period to multiply in the body before they trigger symptoms, the first sign could occur days or weeks after the attack. By that time, the trail would be cold, and it might be impossible to determine if the epidemic was a freak event or a malicious act of aggression.

Ironically, this conundrum—the delayed onset of an epidemic and the opacity of its cause—created one of the jewels of modern public health. In 1951, the CDC's first crack team of disease detectives—its Epidemic Intelligence Service, or EIS, officers—hit the road. Twenty-two young doctors and a sanitary engineer trained in a public health boot camp, readying themselves to solve any strange or unaccountable outbreak. At the height of the Cold War, the impetus behind forming the team was the prospect of biowarfare, as the very image of an "intelligence" corps conducting disease "surveillance" suggests. "[A]ny plan of defense against biological warfare sabotage required trained epidemiologists," wrote Alexander Langmuir, the EIS founder and legendary chief of the agency's epidemiology branch, "alert to the possibilities and available for call at a moment's notice anywhere in the country."

Until September 2001, the public knew the CDC's gumshoes for their peacetime achievements—helping figure out Legionnaires' disease, the Jack in the Box E. coli outbreak, the aspirin/Reye's syndrome link, hantavirus pulmonary syndrome, and countless other mysteries. Each year, EIS officers conduct between 800 and 1,000 such field investigations. But on September 11 of that year—when terrorist-hijacked commercial airliners crashed into the World Trade Center Towers in New York City, into the Pentagon, and into a field in Pennsylvania—the EIS's original mission was dramatically revived. Within hours of the World Trade Center attacks, EIS officers fanned out to hospitals throughout the city, looking for evidence of novel disease syndromes. The CDC issued a nationwide alert to doctors and laboratories to watch for unusual symptoms and pathogens. The Department of Health and Human Services authorized shipments totaling 50 tons from the never-before-tapped National Pharmaceutical Stockpile, a massive cargo of antibiotics, ventilators, and other emergency supplies created specifically to counteract a chemical or bioweapons attack. In early October, when a Florida man became the first American in a quarter century to succumb to inhalational anthrax—the first clue to the organized dissemination of anthrax spores that soon preoccupied the country—the CDC immediately chartered a jet and sent 15 investigators, including EIS officers, to comb the area for signs of bioterrorism.

Before these cataclysmic events, the warnings that such terrorism was imminent had sounded at times like a five-alarm mantra: "It's not if," the chorus went, "but when." These Cassandra predictions came true sooner than most had expected.



The New Line-Up



Bioterrorism agents kill by suffocating pneumonia, septic shock, massive bleeding, or paralysis. At the CDC, the worst of the worst are known as Category A agents: unusually deadly organisms that are readily spread in the environment or transmitted from person to person, that could trigger public panic and social upheaval, and that require special public health precautions. Topping the list are the agents that cause anthrax, smallpox, and plague.

After October 2001, the anthrax bacterium—a marquee agent of biowarfare, oft-threatened in the United States but never employed—suddenly became a familiar and credible danger. That October was a turning point in another way as well: while inhalational anthrax had always been considered a death sentence, aggressive medical intervention was shown to save at least some patients with the dreaded pulmonary form of the disease. Anthrax begins mildly enough, with fever, fatigue, and sometimes a cough. What often follows are a few cruelly deceptive days of respite. All the while, toxins released in the bloodstream by bacteria are destroying cells and causing fluid to accumulate in tissues. Leaky blood vessels cause blood pressure to plummet and organs to fail. Then comes the fatal blow: abruptly labored breathing followed, within a day or two, by shock leading to death.

The disease does not spread person to person, but by bacterial spores—tiny, hard-coated spheres that can survive for decades in the soil. This hardiness has been demonstrated every time workers have tried to clean out anthrax spores from the environment. In 1940, the British detonated anthrax explosives on Gruinard Island, a heath-covered outcrop off Scotland; four decades later, the spores were still viable. Crews had to deluge the island with a cocktail of 280 tons of formaldehyde mixed with 2,000 tons of sea water. In the 1970s, demolition experts tackled the spore-ridden Arms Textile Company building, on the Merrimack River in Manchester, New Hampshire. After the building was decontaminated with formaldehyde, anthrax spores survived in dust that had become embedded in the structure's cracks. Eventually, workers had to disassemble the plant and incinerate it brick by brick and board by board, then treat what remained with chlorine and bury the waste. It's no wonder that soon after the 2001 terrorist attacks, the United States and Uzbekistan signed an agreement to remove anthrax bacteria from a remote island in the Aral Sea, where the Soviet Union in 1988 had dumped hundreds of tons of weaponized spores from its biowarfare program—spores that remained not only virulently alive but vulnerable to theft.

The very properties that make anthrax a good bioweapon—its stability and lethality and ease of cultivation—recommended the bacterium to be the first "germ" that Robert Koch employed to prove the germ theory. The bacillus changes into spores when faced with lack of nutrients or water, or when subjected to chemical shock. It morphs back to its rapidly multiplying vegetative form when conditions are more palmy—for instance, in the nutrient-rich blood or tissue of the lungs. For this quirk of biology alone, anthrax is a perfect terrorist tool. As Ed Regis writes in The Biology of Doom, "Sporulation . . . was God's gift to germ warfare."

But anthrax possesses other features that make it a desirable weapon. The bacterium lives in soils worldwide and in its natural hosts, grazing livestock, thus making it widely available to anyone with nefarious motives. In the United States, anthrax zones closely parallel the nineteenth-century cattle drives from Texas to Montana. Back then, anthrax was primarily a livestock disease, as it is today; in 2001, an epidemic of anthrax killed cattle and bison in the upper Midwest and Canada. Grazing animals become infected by eating and inhaling large quantities of spores in contaminated dirt. In humans, most anthrax infections erupt on the skin. Indeed, the word anthrax comes from the Greek word for coal, referring to the infection's black cutaneous lesions. In the early twentieth century, common sources of cutaneous anthrax were imported animal hides and shaving brushes made from contaminated horse hair. Another form of the disease, gastrointestinal anthrax, is contracted by eating tainted meat.

Inhalational anthrax is the rarest form of the disease, and the most deadly. Dubbed "woolsorter's disease" in nineteenth-century England, it was spread by spores nestled in contaminated sheep wool. Before the introduction of sanitation and vaccination, workers in goat-hair mills were regularly exposed to high concentrations of anthrax spores; inexplicably, however, few suffered inhalational anthrax, perhaps because there weren't enough spores in the air or because the exposed workers developed some immunity. Between 1900 and 1976, there were only 18 reported cases of inhalational anthrax in the United States. Many of these cases were bizarre: a woman who played bongo drums made of infected skin, a construction worker who handled contaminated felt, gardeners whose infections were traced to contaminated bone meal fertilizer. In 1957, four people died of inhalational anthrax in the Arms Textile Company building in New Hampshire; all had worked with fibers from a single shipment of black goat hair from Pakistan, to be used in the manufacture of lining for men's suits. Ironically, the deaths took place during the trial of an anthrax vaccine that was quietly being conducted at the plant by the CDC. None of the workers who died had been immunized; after their deaths, the trial was discontinued and all the company's workers were required to take the vaccine.

America's current stockpile of anthrax vaccine, approved only to prevent the skin form of the disease, was derived from these 1957 studies and licensed in 1970 by the Food and Drug Administration. Reserved for military use, it can be released to civilians only with permission from the Defense Department. Weeks after the first anthrax cases in 2001, the vaccine received FDA approval for use in high-risk individuals, such as lab workers and hazardous material clean-up crews. Because the vaccine can have serious side effects and must be given in a complicated regimen, health officials have not recommended immunizing the entire population in the absence of a serious danger of widespread attack. Today, only one company—BioPort Corporation of Lansing, Michigan—manufactures the drug. But the company has not made the vaccine since 1998, having failed FDA inspection standards. Government and private researchers are working on new anthrax vaccines intended to be safer and easier to administer.

If released in a fine-particle mist, anthrax spores can ride air currents for 50 miles or more. Once a person becomes infected by inhaling the invisible spores, symptoms may not show up for weeks; in an accidental release of bioweapon spores in the Soviet Union in 1979, one patient didn't become ill until 46 days after infection. This delay means that, unless forewarned about a large-scale attack, health workers would mount a hopelessly tardy response at best. Once symptoms do appear, it's often too late. Antibiotics are most effective before a person becomes ill. Untreated, inhalational anthrax infections kill 80 percent of victims. In 1993, the Office of Technology Assessment, a research arm of Congress, estimated that if 220 pounds of aerosolized anthrax spores were released over Washington, D.C., between 130,000 and three million people would die—lacking a massive program to dispense prophylactic antibiotics—making such an attack potentially as lethal as a hydrogen bomb.

Perhaps a more loathsome agent of bioterror is smallpox. Caused by the orthopoxvirus variola, smallpox was officially declared eradicated from the face of the earth in 1980, after a tortuous 11-year World Health Organization (WHO) campaign, dubbed Target Zero, to accomplish just that goal. D. A. Henderson, the imposing physician and epidemiologist who was recruited from the CDC to lead the 1966-1977 effort—and who many believe should have received the Nobel Prize for his efforts—has referred to "innumerable instances in which the program balanced on a knife edge between success and disaster." In India alone, 120,000 field workers scoured the vast nation, tracking down every last case in every last village and vaccinating everyone in contact with the patient. To make sure smallpox patients stayed put until all their scabs had fallen off, field workers nailed shut the victims' doors and posted guards around the clock. In Ethiopia, one of the last bastions of smallpox, WHO crews sometimes trekked 100 miles to vaccinate nomads, risking death during the nation's civil war. In 1976, the very moment when Henderson was about to declare victory after the disease was finally snuffed out of that country, an outbreak surfaced in neighboring Somalia, requiring another year of searching for new cases, so that every last human contact could be vaccinated. Finally, in 1977, the last known naturally acquired infection occurred in a young Somali hospital cook, who survived.

Thus smallpox, which had killed 500 million people in the twentieth century alone, and infected one-tenth of all humans since the first agricultural settlements around 10,000 BC, became the first human disease to be eradicated as a naturally spread contagion. The next year, in a bizarre and tragic epilogue, a medical photographer in Birmingham, England, became infected and died, apparently when virus particles drifted through an air duct from a research lab one floor below. She infected her mother, who lived. As a standard public health precaution, the laboratory director was quarantined. Overcome with guilt and grief, he committed suicide.

In one of the bitter ironies in public health history, D. A. Henderson is today trying to help the world prepare for an intentional release of the virus he helped wipe out. Henderson founded the Johns Hopkins Center for Civilian Biodefense Strategies, and in the fall of 2001 became director of the Office of Public Health Preparedness within the U.S. Department of Health and Human Services, organizing a plan to defend against bioterrorism. Officially, the last remaining stocks of the smallpox virus are sequestered in two facilities sanctioned by the World Health Organization: the CDC in Atlanta and the State Center of Virology and Biotechnology (VECTOR) in Novosibirsk, Russia. But most observers, including Henderson, believe that other laboratories inside and outside Russia secretly harbor the virus. The WHO, which had approved destruction of the remaining official stocks, delayed that action until the end of 2002, to allow time for scientists to further study the virus's genetic structure and tricks for subverting the immune system, discoveries that may lead to antiviral therapies, vaccines, and rapid diagnostic and analytic tools in case of attack. The prospect of destroying the official stocks of smallpox virus had riven the research community. Many scientists insisted that the benefits of research outweigh the dangers of release, and that pretending the virus has not already proliferated is an illusion that serves no purpose.

Others—notably Henderson—argued that there is little scientific insight that can't be gained using other orthopoxviruses, that the risks of accidental or intentional release eclipse any possible scientific advances, and that restricting the virus to outlaw states raises the moral bar against possible terrorist use. In late 2001, the Bush administration decided to retain U.S. smallpox stockpiles indefinitely, for research purposes.

If bioterrorists released smallpox virus, it would, by Henderson's reckoning, become a global calamity within six weeks. Before the disease was eradicated, smallpox spread naturally when an ill person released droplets from the mouth into the air. "When smallpox is epidemic," wrote a seventeenth-century observer, "entire villages are depopulated, markets ruined and the face of distress spread over the whole country." "Face" was meant literally. After the smallpox virus incubates for about 12 days in the body, victims suddenly feel weak, feverish, and achy; some become delirious. A rash appears, with especially dense lesions on the face, arms, and legs. The lesions are tense and deeply embedded in the skin. High fever, bloody sores, and seeping pustules give way to hemorrhaging, a sharp drop in blood pressure, and secondary infections leading to death. Should the victim live, the lesions will turn into scabs that eventually fall off, leaving depigmented scars. Though talk has abounded lately about suicide terrorists infected with smallpox, victims don't become contagious until the rash appears, by which time they would be far too ill to ramble about. Smallpox kills about 30 percent of its unvaccinated victims. Virgin-soil epidemics of smallpox, such as occurred in Native American tribes when European settlers came, have been known to kill as many as half their victims.

Smallpox radiates in ever-widening waves. Every silently infected person in the first swell of cases infects 10 to 15 more, who unless quarantined go on to infect 10 to 15 more. Given historical precedents, if the first generation of cases numbers 200 to 300, the next may be 2,000 to 3,000, and so on, if populations are not vaccinated and sick patients are not isolated. Henderson observed this steep rate of increase when smallpox raced through Yugoslavia in 1972. In that outbreak, a pilgrim returning from Mecca infected 11 family members and friends, who went on to spread the disease. In the four hospitals where the pilgrim stayed until he died, he infected 42 workers and other contacts. Not until four weeks after he became sick was the illness correctly diagnosed, by which time 150 people were already infected. In all, 175 people contracted smallpox and 35 died. So ominous was this outbreak in a nation that had not seen a case since 1946 that public health authorities acted with breathtaking speed. Within ten days, they vaccinated 20 million residents and quarantined some ten thousand under military guard in hotels and apartments. Neighboring countries sealed their borders. After nine weeks, the outbreak was halted.
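
The arithmetic behind those widening waves is simple geometric growth, and it runs away quickly. Below is a minimal sketch in Python (the framing and function name are mine, not Drexler's) that applies the 10-to-15 transmission figure to a first generation of 200 to 300 cases, reproducing the progression described above.

```python
# Geometric growth of an unchecked smallpox outbreak: each case in
# one generation infects r people in the next, absent vaccination
# and isolation. The 10-15 range and seed sizes come from the text.

def outbreak_generations(first_generation: int, r: int, n: int) -> list[int]:
    """Return case counts for the first n generations."""
    counts = [first_generation]
    for _ in range(n - 1):
        counts.append(counts[-1] * r)
    return counts

print(outbreak_generations(200, 10, 4))  # [200, 2000, 20000, 200000]
print(outbreak_generations(300, 15, 4))  # [300, 4500, 67500, 1012500]
```

At the roughly 12-day incubation period cited earlier, four such generations span about seven weeks, which is why even the conservative end of these figures would swamp any plausible supply of vaccine and vaccinators.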

Unlike with many diseases, immunity to smallpox begins to wane about ten years after vaccination. Here in the United States, where vaccinations ended in 1972, most residents have lost their protection, as have most people around the globe. There is less immunity in the world now than there ever has been in the history of humankind. The United States has only 7.5 to 15 million doses of vaccine—nowhere near enough to stem even a modest-sized epidemic. (The wide-ranging estimates of doses reflect the fact that administering the vaccine entails lightly puncturing the skin with a special bifurcated needle rather than giving a conventional injection, and unskilled vaccinators can waste the drug.) The most recent lot of smallpox vaccine was manufactured in 1982, and was produced in the traditional way—by scarifying the flanks and bellies of calves and harvesting infected lymph, a method that doesn't meet modern manufacturing standards for sterility. A new vaccine, in the works since 2000, will be grown in human cells suspended in large bioreactor tanks, and will be purer. After the 2001 terrorist attacks on the World Trade Center and the Pentagon, U.S. president George W. Bush decided to accelerate the expansion of the smallpox vaccine supply. That fall, with a rash of anthrax cases and the cloud of bioterrorism hanging over the country, Secretary of Health and Human Services Tommy Thompson negotiated a plan to produce enough additional smallpox vaccine to reach 286 million stockpiled doses by the end of 2002, enough to inoculate every American if necessary.

Until this ample supply is ready, the United States will depend on a federal emergency plan to respond to smallpox—a disease that last occurred in the country in 1949. According to this plan, if just one case of smallpox were confirmed, the patient would be immediately quarantined, and the CDC would dispatch vaccine from the government cache while alerting the FBI and the White House. Administered within a few days of exposure, smallpox vaccine can prevent or significantly reduce subsequent symptoms. CDC investigators would grill the victim's family about every step the patient had taken over the previous three weeks, and would ask for the addresses and phone numbers of every person who had close contact with the patient. State health officials would track down all those contacts—and their contacts—and vaccinate them. The plan is similar to the WHO's model during the smallpox eradication program—a plan specifically developed to make a little vaccine go a long way. Once there is enough vaccine for everybody, should all Americans be prophylactically immunized against smallpox? That may depend on whether smallpox materializes as a realistic threat. The smallpox vaccine produces adverse complications in approximately 1 in 13,000 people, ranging from rashes to lethal brain inflammations. Today, the risk of complications may be even higher because more people are living with immune systems weakened by conditions like cancer, HIV infection, and organ transplants. If the risk of dying from the vaccine is greater than the risk of getting the disease, mass immunization would make no sense.
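
The closing trade-off lends itself to expected-value arithmetic. The sketch below is illustrative only: the 1-in-13,000 complication rate comes from the text, while the vaccine death rate and the attack scenario are hypothetical placeholders chosen to show the shape of the calculation, not figures from the book.

```python
# Back-of-the-envelope comparison of mass-vaccination risk versus
# attack risk. Only COMPLICATION_RATE is from the text; the other
# inputs are HYPOTHETICAL, for illustration.

POPULATION = 280_000_000            # approximate U.S. population, 2001
COMPLICATION_RATE = 1 / 13_000      # adverse events per vaccinee (from text)
VACCINE_DEATH_RATE = 1 / 1_000_000  # assumed fatality rate per vaccinee

p_attack = 0.001                    # assumed chance of a smallpox release
deaths_if_attack = 500_000          # assumed toll without prior immunization

expected_vaccine_deaths = POPULATION * VACCINE_DEATH_RATE  # 280
expected_attack_deaths = p_attack * deaths_if_attack       # 500

print(f"Complications from mass vaccination: ~{POPULATION * COMPLICATION_RATE:,.0f}")
print(f"Expected deaths from vaccination:     {expected_vaccine_deaths:,.0f}")
print(f"Expected attack deaths averted:       {expected_attack_deaths:,.0f}")
```

Under these made-up inputs mass immunization barely pays; halve the assumed attack probability and it does not. The point is the structure of the decision, which is exactly the calculus health officials describe: vaccinate everyone only if the expected harm from the disease outweighs the expected harm from the vaccine.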

Other potential biowarfare agents, though not all Category A, are nearly as frightening. Take plague. When its bacterium, Yersinia pestis, infects the lungs, causing a highly contagious form of the disease known as pneumonic plague, untreated patients quickly progress from fever and cough to respiratory failure, shock, and death; antibiotics are virtually useless if taken more than 24 hours after symptoms begin. Or consider the bacterial infection brucellosis, a.k.a. undulant fever, normally transmitted from cattle or goats or unpasteurized milk. A mere ten organisms can trigger symptoms from fever and sweats to weight loss and depression lasting sometimes longer than a year, coming and going in waves (thus its historic name). The sturdy bacterium that causes tularemia can remain alive for months in subfreezing temperatures; it brings on fever, prostration, weight loss, and pneumonia, and kills about a third of those not treated. Another potential bioagent, the rickettsial organism behind Q fever, kills 4 percent of victims and leaves the rest with throbbing headaches and eye pain that can last weeks. A third of its victims also develop hepatitis. Finally, viral hemorrhagic fevers, such as the mysterious Ebola or Marburg, begin with a high temperature, fatigue, and dizziness and can progress to bleeding under the skin, in internal organs, and from the mouth, eyes, or ears, leading to shock, coma, and death. There is no cure.

Although not living themselves, other potential weapons are derived from living things. These are especially terrifying. Botulinum toxins are the most lethal compounds known—15,000 times more toxic than the nerve agent VX and 100,000 times more toxic than the nerve agent sarin used in the 1995 Tokyo subway attack—and researchers estimate that as little as one gram of aerosolized botox could kill more than 1.5 million people. Within a day after exposure, victims experience blurred vision, difficulty talking and swallowing, and paralysis that creeps down from the shoulders and stills breathing. Ricin, a potent toxin easily extracted from castor beans, can be breathed in or ingested; if breathed in, it brings death from severe respiratory distress within days, and here too there is no treatment.

All told, of the 50 top bioweapon pathogens, only 13 have vaccines or treatments. In the short term, the advantage lies with the offense.

As frightening as this array of agents may be, imagine if their lethal capabilities were mixed and matched. By inserting genes from one organism into another, scientists may someday be able to design hybrid munitions that are more lethal, more sturdy, and perhaps even capable of eluding the immune system. "As a consequence, the quaint notion that you could list all of the bad pathogens that might be made into weapons and just forbid them and scan the world for them is ridiculous," says physician Tara O'Toole, director of the Johns Hopkins Center for Civilian Biodefense Strategies. "Because now all you have to do is click in the new gene, you get a new pathogen, you get a new weapon that ain't on the list." With this malign twist on molecular biology, writes D. A. Henderson, "the potential armamentarium is all but infinite."

Until recently, skeptics dismissed the potential of such malevolent biology, noting that virtually every time an organism was genetically manipulated, it ended up less, not more, virulent. Soviet experiments to hitch smallpox and Ebola viruses seemingly came to nought. But in 2001, an accidental discovery by Australian researchers gave the lie to the idea that designer diseases were pie-in-the-sky. While attempting to make a mouse contraceptive vaccine for pest control, the scientists had inserted into a mousepox virus a gene that makes large quantities of interleukin-4, a molecule produced naturally in mice and in humans. To their surprise, the designer virus crippled the ranks of the immune system that battle viral infection. As a result, nearly all the mice died from what normally is a mild infection. The new virus also resisted vaccination. Now scientists wonder whether terrorists, using similar techniques, could fashion human viruses that would wipe out the immune system. In 1998, Russian scientists reported that when they inserted genes from the harmless bacterium Bacillus cereus into the anthrax bacterium, Bacillus anthracis, they created a new form of anthrax that resisted both penicillin and vaccines.

Bioterrorism's potential doesn't end at human disease. Agroterrorism against crops and livestock could be just as devastating—which is why, shortly after the 2001 terrorist attacks, the Bush administration suddenly proposed spending tens of millions of dollars to hire more agricultural inspectors, a long-overdue action. U.S. agriculture has become dramatically less diverse genetically, making our food commodities as defenseless before an exotic pathogen as the Incas and Aztecs were before Europeans' smallpox. The centralization and globalization of our food supply leaves foods vulnerable anywhere along the chain, from farm to fork. About half of the American meat supply, for instance, is processed by three companies, meaning that livestock diseases will more readily spread because animals are concentrated in fewer places. A planned attack with foot-and-mouth disease—the highly transmissible infection that shook the British and European livestock industry in 2001—could theoretically wipe out a big chunk of American beef. Contaminating seed supplies or fields of soybeans with spores of soybean rust would reverberate globally, since the United States raises 50 percent of the world's soybean crop. From deliberate attacks with wheat rust and rice blast to sabotage of poultry with Newcastle disease or even the contamination of salad bars, the possibilities for mayhem are endless.



A Short History of Biowarfare



Until the autumn of 2001, skeptics dismissed talk of biological warfare as science fiction, the rhetoric of self-serving bureaucrats and professional paranoids. Some heard an echo from the Cold War 1950s, when the danger was said to be nuclear attack and schoolchildren memorized useless civil defense drills.

But a look at the history of warfare—especially in the twentieth century—might reinforce our newfound fears. Biological weapons have been around since the beginning. Romans, Persians, and other ancients tossed carrion into wells and reservoirs to taint their adversaries' drinking water. In 1346, the three-year siege of the Black Sea port of Kaffa ended when attacking Tatars catapulted bodies of bubonic plague victims over the city walls; fleeing victims may have brought the Black Death to western Europe. In 1763, the commander in chief of the British forces in North America, preoccupied with a restive coalition of Indians on the Western frontier, hit on the idea of sending smallpox-infested blankets as gifts to the "disaffected tribes." During World War I, Central Powers spies infected Russian horses and mules on the Eastern front with glanders, which in turn infected soldiers.

"[P]estilences methodically prepared and deliberately launched upon man and beast . . . Blight to destroy crops, Anthrax to slay horses and cattle, Plague to poison not armies but whole districts—such are the lines along which military science is remorselessly advancing," Winston Churchill wrote in 1925. It was the year of the Geneva Protocol, the treaty that described chemical and "bacteriological" methods of warfare as "justly condemned by the general opinion of the civilized world."

Condemnation or no, biowarfare was about to enter its heyday. From 1932 to 1945, Japan carried out one of the most active biological warfare programs in history. The hub of Japan's program was Ping Fan, a small Manchurian village guarded by watchtowers, a moat, and a tall brick wall garlanded with high-voltage lines and barbed wire. There, directing the infamous Unit 731, Japanese army doctor Shiro Ishii explored the potential of plague, typhoid, paratyphoid A and B, typhus, smallpox, tularemia, infectious jaundice, gas gangrene, tetanus, cholera, dysentery, glanders, scarlet fever, brucellosis, tickborne encephalitis, hemorrhagic fever, whooping cough, diphtheria, pneumonia, meningitis, venereal diseases, tuberculosis, and salmonellosis. At its peak, Unit 731 cultivated 660 pounds of plague bacteria each month, as well as millions of fleas to carry the agent in airborne attacks. Researchers tied prisoners to stakes and detonated a shrapnel bomb to infect them with the organism that causes gas gangrene. They fed prisoners chocolates spiked with anthrax spores, biscuits laced with plague bacteria, milk tainted with cholera vibrios, dumplings contaminated with typhoid. Army physicians practiced vivisection to observe the disease process in real time. Ping Fan's researchers referred to their test subjects as "logs." Predominantly Chinese citizens, these dehumanized victims included White Russians, Soviet prisoners, criminals, and mental patients. During World War II, Japan reportedly attacked at least 11 Chinese cities with biological agents. Western historians estimate that, from 1932 to 1945, at least 10,000 prisoners died as a result of experimental infection or execution following germ experimentation. Calculations by Chinese and Japanese historians run higher; they say at least 270,000 soldiers and civilians perished in this monstrous enterprise.

In 1947, the American military debriefed Shiro Ishii and other leaders of the Japanese program and then cut an extraordinary secret deal: immunity from prosecution if the Japanese would divulge to U.S. interrogators the details of their biological experiments. Unbeknownst to most of its citizens, America had launched its own biological warfare program, led by George W. Merck, president of the pharmaceutical company bearing his name. "Biological Warfare is, of course, 'dirty business,'" Secretary of War Henry L. Stimson wrote in 1942 to President Franklin D. Roosevelt, "but . . . I think we must be prepared." A special committee within the National Academy of Sciences supported that view and concluded that militarizing microorganisms was "distinctly feasible." Indeed, committee members seemed inspired by the prospect. "Meningococcal meningitis might be spread by spraying meningococci in crowded quarters," they advised. "Typhoid could be introduced by sabotage into water and milk supplies and by direct enemy action into reservoirs. . . . Botulinus toxin might be conveyed in lethal amounts through water supplies. . . . Plague could be introduced into any of the large cities or ports by releasing infected fleas or rats. . . . Diphtheria can be spread by dissemination of cultures in shelters, subways, street cars, motion picture theaters, factories, stores, etc., by surreptitiously smearing cultures on strap handles and other articles frequently touched."

In 1943, Camp Detrick (later Fort Detrick) in Frederick, Maryland, became the headquarters of U.S. germ warfare activities. Scientists experimented with anthrax, botulinum toxin, brucellosis, tularemia, psittacosis, plague, Venezuelan equine encephalitis, Q fever, cholera, dengue, shigellosis dysentery, glanders, Rocky Mountain spotted fever, and other human scourges, as well as the animal diseases fowl pest and rinderpest, and rice, potato, and cereal blights. As an insurance policy, U.S. researchers also worked on vaccines, toxoids, and other post-exposure treatments for the very organisms they were cultivating as weapons. By the end of the European war, the United States and Britain together assembled, though never released, a bomb that would rain anthrax spores over cities, leaving them potentially uninhabitable. The Americans had also developed a weapon that spread the livestock disease brucellosis—which, while highly infectious in people, is not as deadly as anthrax, and thus was considered to be a more "humane" weapon.

During the Cold War, the geopolitical rationale for biowarfare expanded, as did the U.S. arsenal. "There is no doubt that bacteriological warfare offers a unique psychological advantage," observed U.S. Army brigadier general William Creasy. "Man's dread of disease is universal. The mysteriousness and invisibility of bacteriological warfare agents, the knowledge that they strike via the simplest and most basic sources of man's security—food, drink, and the air he breathes—and a feeling of helplessness in dealing with the unknown, all add to the psychological potential." Creasy, who led the Army Chemical Corps's germ weapons program, fully understood the practical consequences of this high-stakes mind-game. "Biological warfare," he remarked in 1951, "is essentially public health and preventive medicine in reverse." American scientists searched for agents that were virulent, compact, reliable, and stable. To insure the safety of American troops advancing on poisoned terrain, researchers focused on pathogens that would disable victims but wouldn't spread person to person and wouldn't remain noxious in the battlefield. Above all, scientists wanted to create a dry, light bacterial or viral agent whose particles were minuscule—between one and five microns in diameter (a few dozen would line up across a single human hair), the better to reach the tiny alveoli in the lungs, where they could be absorbed into the bloodstream.

The secret U.S. program staged test runs in American cities using supposedly harmless bacterial stand-ins. Between 1949 and 1969, the military conducted 239 such open-air experiments over populated areas, from San Francisco to St. Louis to Minneapolis, to track how clouds of bacteria would drift and decay in the environment. In 1950, the navy staged mock attacks off the coast of San Francisco, releasing millions of particles of the bacterium Serratia marcescens toward the city. Researchers chose Serratia because it grows in pink colonies, making it easy to detect on culture plates set up throughout the metropolitan area. Unexpectedly, however, 11 patients with urinary tract infections triggered by Serratia turned up at local hospitals. One man died at a hospital that had never until then recorded such an infection; the true source of his fatal illness would remain classified for decades. Today, doctors know that Serratia does occasionally cause disease, especially in immunocompromised or otherwise debilitated persons. According to political scientist Leonard Cole, who has written two books on the covert operations, other individuals in the path of the military's practice exercises may also have sickened or died. "But we cannot know," he says, "and never will know."

The experiments continued. In 1964, the U.S. Army sprayed Bacillus globigii—as a spore-former, a good stand-in for anthrax—in Washington National Airport, proving that infected passengers could travel to more than 200 cities. In 1966, technicians dropped lightbulbs filled with simulants through ventilating grates and onto the tracks of the New York City subway, and calculated that a real attack using dried agents would have produced 12,000 cases of anthrax, 200,000 of tularemia, and 300,000 of Q fever; the wide range reflects how many—or how few—organisms are necessary to cause infection. During the Vietnam War, Seventh Day Adventists, who were conscientious objectors and served as noncombatants, volunteered to subject themselves to airborne tularemia, Q fever, and other agents, in a project dubbed "Operation Whitecoat." All told, between 1950 and 1969, the U.S. government sank more than $700 million into secret bioweapons research. By the end, scientists had transformed into weapons of war seven biological agents: Bacillus anthracis, which causes the human and livestock disease anthrax; Clostridium botulinum, a soil bacterium that produces botulinum toxin; Francisella tularensis, the bacterial agent of tularemia, or rabbit fever; various Brucella species of bacteria, which cause brucellosis, or undulant fever; the mosquito-borne virus causing Venezuelan equine encephalitis; Staphylococcus enterotoxin B, a bacterial source of food poisoning; and Coxiella burnetii, the rickettsial organism that causes highly infectious Q fever. Their methods remain classified to this day.

"We were fighting a fire, and it seemed necessary to risk getting dirty as well as burnt," recalled a leading Ft. Detrick scientist. "We resolved the ethical question just as other equally good men resolved the same question at Oak Ridge and Hanford and Chicago and Los Alamos." Though the scientists felt they were doing their patriotic duty, says Colonel Edward Eitzen, the current commander at the U.S. Army Medical Research Institute of Infectious Diseases (USAMRIID) facility at Ft. Detrick, "From our perspective, it seems like a perversion of science." But, Eitzen adds, "Whether this is right or wrong, a lot of what we know today about how to defend against biological agents is a direct offshoot of some of what was learned during that program."

In 1969, President Richard Nixon ordered the unilateral dismantling of the U.S. biological weapons program. Stockpiles were destroyed and the facilities for developing and producing them dismantled or converted to peaceful uses. "Mankind," Nixon stated, "already carries in its own hands too many of the seeds of its own destruction." Historians apparently never asked the president why he made that decision. They speculate that it may have sprung from a desire to forge a relationship with the USSR; from the realization that bioweapons are both unreliable in the field and risky to stockpile; and especially from fears that a strong biowarfare program in the United States would encourage Third World nations to embark on programs of their own, setting off a chain of one-upmanship that would ultimately boomerang on the U.S. Why, this last argument went, should the United States, the richest country in the world, with its massive arsenal of expensive and technically complex nuclear weapons, make war cheaper and easier for other nations? In 1972, the United States and the Soviet Union signed the Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons, and on their Destruction—more conveniently known as the Biological Weapons Convention, or BWC.

The very next year, the USSR embarked on the largest bioweapons buildup in its history. At its peak in the late 1980s, the program's 50-plus labs and testing sites would employ 65,000 researchers and technicians perfecting the science and art of germ warfare. The sprawling operation was camouflaged under the name Biopreparat, with an ostensible mission of developing civilian pharmaceuticals. In reality, Biopreparat was what has been dubbed a "toxic archipelago." Scientists toiled on 52 different agents that could be used as weapons, among them the organisms causing smallpox, anthrax, plague, Ebola and Marburg hemorrhagic fevers, yellow fever, tularemia, brucellosis, Q fever, botulinum toxin, and Venezuelan equine encephalitis. Genetic hybrids were whipped up from the most deadly ingredients.

It was 1979 when the United States first got an inkling of this enterprising activity. That year, intelligence experts picked up reports of an anthrax epidemic in the town of Sverdlovsk, an industrial center in the Ural mountains. From the start they suspected a biowarfare mishap, but could not prove it. Not until 1989, when a Soviet defector told British authorities about the Soviets' top-secret germ warfare program—tales of deadly bacteria nestled in warheads and bombs, impervious to heat and cold and drugs—did the West begin to fathom the depth of the Soviet program. In 1992, the deputy director of Biopreparat—a defector who adopted the Westernized name of Ken Alibek—delivered even more shocking news. Biopreparat, he reported, had cooked up 2,000 strains of anthrax alone; a facility housing 7,000 employees worked on nothing but anthrax. According to Los Alamos National Laboratory molecular biologist Paul Jackson, "They've probably forgotten more than we'll ever know."

When the WHO declared smallpox eradicated in 1980, Soviet planners again discerned an advantage. "Where other governments saw a medical victory," Alibek wrote in Biohazard, "the Kremlin perceived a military opportunity." The USSR, he contends, produced 20 tons of smallpox virus each year—enough to kill the human population many times over—that could be mounted on intercontinental ballistic missiles and bombs. Protected by insulation and refrigeration, the viral payload would have produced an aerosolized cloud that, in theory, could have finished off any American survivors of a nuclear attack: a literal case of overkill.

Dissatisfied with the arsenal found in nature, Soviet scientists also attempted to fashion new organisms with enhanced properties for warfare. Though Western observers can still only guess at these accomplishments, accounts from former scientists who defected suggest that Soviet researchers did create bacterial and viral strains with higher virulence and stability than in naturally occurring strains, along with the capability to induce odd symptoms that would confuse diagnosis and treatment. They produced forms of plague bacteria that could secrete diphtheria toxin and that resisted antibiotics. They crafted viruses that genetically coded for bacterial toxins. They figured out how to make the fragile Marburg virus, which causes deadly hemorrhagic fever, rugged enough to be placed on a weapon. They fabricated "subtle agents" that could alter personality or make victims aggressive or sleepy. They even tried to recover influenza virus in corpses of victims from the 1918 pandemic, hoping to insert genes from the relict strain into currently circulating flu viruses. Alibek has asserted that Soviet Union researchers variously combined the viral agents of smallpox, Marburg, Ebola, Machupo, and Venezuelan equine encephalitis, though Western scientists are dubious. "Alibek has got a clear conflict of interest," says a U.S. government scientist. "He's in the United States now and has to make a living, and his area of expertise is Russian biological warfare. If that's not a threat, he's out of work." What's indisputably frightful is that, while Biopreparat sites have apparently stopped this research, today's Russian Ministry of Defense labs may still be carrying it on, under a cloak of secrecy that the West has not been able to penetrate.

As mind-boggling as these speculations are, the accidental 1979 anthrax outbreak at Sverdlovsk hinted at what real biowarfare would look like. Early on an April morning, at a military production plant for anthrax, a shift worker removed a clogged filter. The filter was part of the exhaust system of a drying machine that removed liquid from industrial-scale cultures of anthrax spores. The worker forgot to replace the filter with a new one. No one knows how much time elapsed before someone noticed the error—perhaps a few hours. During that interval, an invisible plume of anthrax spores floated out of the plant and became windborne. The wind blew in a constant direction all day long, and the plume widened as it moved over nearby villages. People on the street, workers at a nearby ceramic plant, townsfolk sitting at their windows: all inhaled the spores. Two days later, dozens of victims appeared at hospitals gasping for breath, feverish, vomiting, their lips turning blue. Four days later, the victims began to die. Soviet authorities promptly confiscated medical records and officially blamed the outbreak on consumption of anthrax-contaminated black market meat. Sixty-eight died in the epidemic, among at least 79 infected. Sheep and cattle as far as 30 miles downwind also perished. According to Harvard University biologist Matthew Meselson, the anthrax spores that drifted on the wind could have weighed a total of anywhere from four milligrams to nearly a gram. Other Western scientists put the figure in the range of grams to kilograms. If an accident could have such fatal results, what would happen in a well-stocked, deliberate attack? The Soviet Union, after all, at one time had 30 metric tons of anthrax ready to go.

The motive and the means for biological warfare have not been confined to the world's superpowers. In 1974, not long after the Soviet Union commenced its bioweapons buildup, Iraq—another signatory of the Biological Weapons Convention—began its own. When the Persian Gulf War erupted, Iraq was capable of launching missiles with biological payloads—but, for technical or political reasons, did not. Partly through United Nations inspections and through the admissions of Saddam Hussein's government, it is now known that Iraqi scientists worked on anthrax, botulinum toxin, cholera, plague, gas gangrene, Salmonella, ricin, staphylococcal enterotoxin, camelpox, cancer-causing mold toxins called aflatoxins, rotavirus, and hemorrhagic conjunctivitis virus. During the Gulf War, the Hussein regime was alleged to have had aerial bombs and missile warheads packed with botulinum toxin, anthrax, and aflatoxins. All told, Iraq possessed at least half a million liters of various agents, enough to kill the world four times over. Many of Iraq's seed strains had been purchased between 1986 and 1991 from the American Type Culture Collection, a Maryland repository for government and academic researchers that has since tightened its export rules. While some observers feared that U.S. bomb attacks on Iraqi bioweapons sites might unleash epidemics, it never happened. For whatever reason, the organisms didn't survive or spread in the sunlight and air, nor have reports of local epidemics ever surfaced.

Iraqi officials have barred UN weapons inspectors from entering the country since December 1998. Intelligence experts suspect that Saddam Hussein's regime has rebuilt factories capable of producing chemical and biological agents, and may have resumed making weapons. They suspect Iraq can now produce a high grade of dry anthrax spores. Some observers fear that Iraq is working with camelpox, either as a way to create smallpox (the camelpox virus contains all the smallpox genes), or in order to manipulate camelpox so that it is virulent in people.

Iraq and Russia are not the only nations that the U.S. government suspects of harboring biological weapons. At least a dozen countries, including Iran, Libya, Syria, China, and North Korea, are believed to possess or to be trying to acquire such armaments. Western intelligence experts have sketchy evidence that North Korea's program may actually outstrip Iraq's. According to Alibek, Moscow State University for years trained scientists from Eastern Bloc states, Iran, Iraq, Syria, and Libya. After the breakup of the Soviet Union, Iran and presumably other nations tried to recruit Russian biologists for their own germ war enterprises. Whether Russian scientists—and Russian biological matériel—made their way to nations known to support terrorism is anybody's guess. Ironically, America's military supremacy may stoke the biowarfare ambitions of countries that could not defeat the United States using the approved forms of mass murder. After his nation's 1988 cease-fire with Iraq, Iranian president Hashemi Rafsanjani described chemical and biological arms as the "poor man's atomic bombs."

The current debate over the bioterrorism threat hinges on whether the superpowers' technical expertise has leaked out. Some observers are convinced it hasn't. "It's not like Einstein and Madame Curie are rushing to join these terrorist organizations," says Amy Smithson, who directs the Chemical and Biological Weapons Nonproliferation Project at the Henry L. Stimson Center in Washington, D.C. According to Smithson, the technical hurdles that before September 2001 stood between terrorist groups and the highly efficient and effective dissemination of biological agents still exist. "Just because someone knows how to make something in a fermenter does not mean that they understand the intricate post-production steps that are required for technical dissemination. That's the reason why the former Soviet Union employed thousands and thousands of scientists: to get really good at the odious business of biological warfare."

Moreover, goes this argument, no rogue state sympathizer would be foolish enough to abet such terrorists. Even during the Gulf War, Iraqi president Saddam Hussein observed certain boundaries. He had both biological and chemical weapons, and he didn't use them. Nor did the Soviet Union draw on its vast BW stockpiles during the Cold War. "It's not the moral restraints on the terrorists. It's the moral restraints on the leaders of nation-states," says Smithson. "It's pretty clear that Al Qaeda and its types have no problem with suicidal missions. But the purpose of a nation-state is to perpetuate itself. Cooperating with terrorists on finances or on training camps or places to hide and even on conventional weapons training is one thing. Cooperating with terrorists on chemical and especially biological agents is another thing entirely. That spells hell to pay."

But just because we have no tangible proof that terrorists possess biological weapons doesn't prove that they don't. Absence of evidence isn't evidence of absence—as recent history proves. Not until the West got the lowdown from Soviet defectors in the late 1980s and early 1990s did it begin to fathom the vastness of the Soviet BW enterprise. Even the accidental release of anthrax spores in 1979 from a Soviet military facility failed to tip off Western authorities. Likewise, the dimensions of the Iraqi biowarfare machine were unknown until UN inspectors started poking around after the Gulf War. "I am not reassured that we don't know about a teeny-weeny operation, possibly happening somewhere in the Mideast in the mountains of Afghanistan," says Johns Hopkins's Tara O'Toole. "And I don't think many people believe the CIA has tremendous expertise in these matters. It's very difficult to find these kinds of weapons before they're used, as everybody will admit." Here in the United States, it wasn't until after the fact that investigators found mountains of evidence that airline hijackings were about to take place—hardly an encouraging precedent for a mission to rout out unobtrusive miscreants with a stash of microbes. "What we saw on September eleventh is that terrorists are indeed quite well organized and sophisticated and capable of carrying out complex deeds requiring planning and determination," says O'Toole, "and that they're willing to cross the so-called barrier of moral repugnance and kill thousands of civilians without warning."



Methods of Madness



The prospect of bioterrorism may be closer than ever because undertaking it is easier than ever. "The main thing driving this is the trajectory of modern biological sciences: it is going straight up and fast as a rocket. There are a lot more people out there who know these basic biological techniques than there were atomic scientists in 1945," says O'Toole. Or as Nobel laureate Joshua Lederberg puts it, "The question . . . is: What levels of insanity do we have to prepare for?"

Technical expertise isn't some futuristic hypothesis—it's present fact. "There's no question in our mind," says O'Toole, "that organized terrorists could mount at least a small bioterrorist attack now." How big is "small"? In her estimate, "dozens or hundreds of people being infected in an indoor aerosol release, maybe more than one of those. . . . I agree that it's unlikely terrorists could create the kind of very efficient and accurate weapon that the U.S. had in the nineteen-sixties right now. I'm not so sure that will be impossible five years from now, given how technology is moving forward and simplifying the steps that you would need to isolate, harden, and disperse these organisms." Microbiologist Raymond Zilinskas, a senior scientist at the Monterey Institute of International Studies, agrees that a BioUnabomber is not a farfetched possibility. "I worry about the lone operative, the disgruntled or crazed scientist," he says. "That problem is going to grow, because as the population of microbiologists and biotechnologists grows, the absolute number of people that go bad would also grow."

Aspiring criminals can easily buy growth media and fermenters, agar and second-hand incubators, seed strains for common foodborne organisms. As investigators learned belatedly in the fall of 2001, laboratory samples of anthrax bacteria had for decades moved freely and without documentation among researchers and universities—samples ripe for theft. Moreover, the soil in certain locales is loaded with anthrax spores. With heat treatment and selective culture media, a competent microbiologist can easily tease out virulent strains. Converting the bacteria to spores merely requires adding certain chemicals or starving the organisms, and then culling out the spores. To spread microorganisms and toxins, motivated attackers can purchase off-the-shelf equipment from medical, agricultural, or industrial suppliers. Such experimental technologies as aerosolizers, currently being tested to spread vaccines to chicken flocks, will soon be available to farmers and anyone else with the cash. Asthma inhalers, crop-dusting equipment, photocopiers: all can spew out respirable particles of precise size and consistent quality.

The "Method" sections of articles in scientific journals are a gold mine for potential terrorists, describing precisely where the authors bought their materials and how they carried out their experiments. As Russian defector Ken Alibek testified before Congress in 2000: "Just by reading scientific literature published in Russia in the last few years, a biological weapons developer could learn techniques to genetically engineer vaccinia virus and then transfer the results to smallpox; to create antibiotic-resistant strains of anthrax, plague, and glanders; and to mass-produce the Marburg and Machupo viruses. Billions of dollars that the Soviet Union and Russia put into biotechnology research are available to anyone for the cost of a translator." More disquieting scientific "methods" are available in biological and chemical warfare cookbooks sold on the Web.

"An effective biological weapons program can be set up in a typical suburban basement, using basic high school or college lab equipment and materials easily ordered from catalogs," epidemiologist Mike Osterholm and journalist John Schwartz write in Living Terrors. They outline an easy if hypothetical attack on a crowded suburban mall. Using rapid-fasten Velcro strips, the villain attaches what appears to be a thermostat box to the wall. Inside is a microaerosolizer no larger than a pack of gum. Powered by a small store-bought camcorder battery, the unit transforms a few tablespoons of fluid—say, smallpox virus—into an invisible mist. Results: thousands would inhale the virus in the Muzak-filled ground zero, each infecting a dozen or so more in a multiplying geometry.

Though biological agents can be spread as liquids, dry powders disseminate far more easily. Particles one to five microns in diameter act like a gas, slowly settling from the air. But once released into the atmosphere, most biologic agents die or lose their virulence, their rate of decay contingent on a host of factors, including ultraviolet radiation, temperature, humidity, and pollution. Which means that the biggest remaining secrets from the U.S. and USSR biowarfare programs are how the agents were brewed and stored. The Cold War superpowers both developed techniques for suspending or dissolving optimal quantities of agents or toxins in special solutions containing preservatives, adjuvants, and antistatic chemicals. Every weaponized pathogen and toxin has its own formulation that prevents it from losing potency, clogging nozzles, clumping and falling to the ground, drying up in the atmosphere, or being killed by ultraviolet light.

Until the anthrax cases of 2001, the looming precedent for bioterrorists was the Aum Shinrikyo cult in Japan, which staged the 1995 sarin nerve-gas attack on the Tokyo subway system that killed 12 and injured upwards of 1,000. Before that headline assault, the group had tried and failed ten times to sow disease by dispersing biological agents. Led by a former yoga teacher known as Shoko Asahara—who preached a mystical blend of Tibetan Buddhism, Hinduism, Christianity, Nostradamus, and pseudoscience—Aum Shinrikyo members came to believe that an apocalypse was just around the corner. To back up this prophecy and lend credence to their leader's divine revelations, cult members—many university-trained in the sciences—attempted to wreak havoc in cities and at U.S. military bases in Japan. Using a sprayer and fan, they repeatedly tried to disseminate anthrax from atop a cult-owned building in Tokyo. A specially fitted automobile spraying botulinum toxin through its exhaust toured downtown Tokyo during the 1993 wedding of Japan's Crown Prince. Whether because of the agent's inherent weakness or because of problems with misting devices, none of these forays succeeded. In frustration, the cult turned to a chemical for the subway attack. Sarin, after all, had a track record; in 1994, the cult's sarin attack in the town of Matsumoto had killed seven and injured 144, including several of the judges who were their targets. Just one month before the Matsumoto attack, Aum had mulled staging sarin attacks in the United States. Authorities suspect Aum also experimented with the agent of Q fever, and that a cult mission dubbed the "African Salvation Tour" traveled to Zaire in 1992 to obtain Ebola virus.

In the United States, the only successful large-scale bioterrorist attack was a decidedly low-tech incident of food poisoning in Oregon in September 1984. In this case, the culprits belonged to another cult—the Rajneeshees, led by a Nietzsche-inspired Indian guru named Bhagwan Shree Rajneesh—that had planned to take control of a rural county commission. They intended to swing the vote by sickening residents on the day of the election, having registered homeless people who would approve the cult slate. In their state-licensed clinical laboratory, called the Pythagoras Clinic, they grew a strain of Salmonella typhimurium; the organism was originally a control sample used to meet quality assurance standards for the lab. That September, cult members ran a field trial, pouring vials of the bacterium into salad dressings and coffee creamers at ten salad bar restaurants in The Dalles, a small town on the interstate. Within days, 751 people in the community became sick, though unreported cases among out-of-state travelers probably mean there were many more victims. At the time, investigators considered bioterrorism a remote possibility. Despite the successful practice attack, the Rajneeshees decided against a repeat performance during the November elections. Not until a year later, when authorities were investigating the group for other criminal activities, did a cult member confess to the crime. The CDC delayed publishing details of the incident for 13 years, for fear of spawning copycat episodes.

Prior to 2001, the United States had seen only a handful of other small-scale attacks or planned assaults using biologicals. In 1996, a disgruntled lab employee at St. Paul Medical Center in Dallas infected 12 of her coworkers with Shigella dysenteriae type 2. (She had earlier practiced the trick with her estranged fiancé.) The organism, first described after a winter epidemic in an Eastern European prison camp during World War I, is now rare in developed countries. Using her supervisor's computer e-mail account, the employee anonymously invited her workmates to the staff break room, where she had laid out an enticing display of blueberry muffins and doughnuts—injected with Shigella samples removed from the lab freezer. Everyone who partook got sick.

In 1994 and 1995, four members of the Minnesota Patriots Council, a band of antigovernment tax protesters, were convicted of conspiring to kill local and federal law enforcement agents with ricin, a deadly toxin made from the seeds of the castor bean plant. Two hundred times more potent than cyanide, and with no known antidote, ricin seems to carry a certain cachet among would-be conspirators, possibly because of the mistaken belief that it cannot be traced after it breaks down in the body. The Soviet KGB was said to favor ricin. In 1978, the Bulgarian secret police murdered dissident Georgi Markov in London by shooting a ricin-filled pellet into Markov's thigh from an airgun concealed in an umbrella. The Minnesota Patriots Council plot was somewhat less elegant. The group obtained its castor beans from a mail-order outlet called Maynard's Avenging Angel Supply. The strategy was to extract the ricin from the beans, blend it with a mixture of aloe vera and the solvent dimethyl sulfoxide (DMSO), and smear the concoction on doorknobs of the intended victims' homes or inside their shoes. A Council member called police before the plan was set in motion.

In what now seems like an innocent prelude, the United States in the late 1990s suffered an epidemic of bioterror bluffs. One of the first took place in April 1997, when the Washington, D.C., headquarters of the B'nai B'rith received a package leaking red fluid, inside which was a petri dish labeled Anthracis Yersinia: a fictional recombinant of anthrax and plague that doesn't exist in the natural world, but the invocation of which apparently terrified local authorities. Two city blocks were cordoned off, 109 people were quarantined, and 30 individuals underwent the humiliating strip-and-scrub of a hazardous materials decontamination. In the end, the suspicious substance was identified as red gelatin. That November, as part of a Clinton administration campaign to build public support for the imminent bombing of Iraq, Secretary of Defense William Cohen hoisted a five-pound sack of Domino sugar on a Sunday morning talk show, explaining that a comparable quantity of anthrax would kill half of Washington, D.C. In February 1998, Larry Wayne Harris, a loquacious "Christian Patriot" and well-known antigovernment character, was arrested in Las Vegas with what he himself touted as "military grade anthrax" (an ambiguous label), which turned out to be a benign vaccine strain. A few years earlier, Harris had purchased plague bacteria from the same company that had sold Iraq starter strains for its biological cache. The media had a field day.

Soon after Harris's arrest came a deluge of anthrax threats—all hoaxes, many giddily covered by the press, which only fueled the fire. The crime du jour, anthrax hoaxes became so mundane that a California accountant accused of embezzlement and perjury actually called in a hoax to avoid a hearing in U.S. Bankruptcy Court. Chicago's Wrigley Building was shut down for six hours in response to an agent later found to be hot cocoa mix. From 1997 to early 2001, some 13,000 hapless building occupants were ordered to strip and be hosed down as part of the misguided HAZMAT (for hazardous materials) emergency response to these ruses.

But hoaxes are just that—hoaxes—and do not necessarily reflect the real threat. Even before the attacks of 2001, an academic cottage industry had been weighing the relative bioterror dangers posed by white supremacists, Christian or Islamic fundamentalists, tax-protesting zealots, rogue dictators, disaffected employees in the life sciences, freelance extremists, wacky cults, and deranged individuals. In 1999, one scholar even warned of "pre-millennial tensions," a prophecy that seemed to spring not from a CIA briefing book but from the pages of Glamour or Redbook.

After October 4, 2001—when the first of a series of anthrax cases hit the news—the floodgates opened. In two weeks, the FBI received 2,500 ultimately unfounded reports of possible anthrax, from abortion clinics to libraries to government offices. Meanwhile, real anthrax spores were silently infecting postal workers, media employees, and other unsuspecting victims. Grappling with America's first organized bioterrorist attack, the public health system proved itself to be as unprepared as the sibyls had long warned.



The Home Front



No national emergency since the 1918-1919 flu pandemic has truly tested the U.S. health care system. During the pandemic, 28 percent of Americans became ill and 2.5 percent of those who were infected died. Just as with a broad-scale bioterrorist attack, doctors and nurses were in short supply, in part because they themselves had become sick. Medical students assumed physicians' duties. Druggists couldn't fill their orders and desperate customers quickly stripped pharmacy shelves of over-the-counter remedies. Gymnasiums, state armories, and parish halls were transformed into emergency hospitals. Caskets and burial plots ran out, and bodies were stranded in homes. At the Surgeon General's request, state and local health officials suspended public gatherings.
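
Compounding those two figures gives a sense of the scale: 0.28 x 0.025 = 0.007, or roughly seven deaths for every thousand Americans—which, in a population of just over 100 million, works out to something on the order of 700,000 people.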

What impulses, good and bad, would be unleashed today if the United States were widely attacked with a bioweapon? Would people break into pharmacies to get drugs? Would a black market in antibiotics or vaccines flourish? Would fast food restaurants be commandeered to administer drive-through prophylaxis for contagious diseases? Would citizens angrily second-guess official decisions about who gets rationed drugs? Would patients being cared for at home use the Internet to arrange for food and drugs? Would military or police authorities have to imprison contagious patients refusing to submit to quarantine, treatment, or vaccination? Would wild rumors circulate about the perpetrators of the attack?

Public reaction to the scattered but nerve-wracking cases of anthrax in 2001—with drug stores running out of the antibiotic Cipro, the worried well flooding hospital emergency rooms, and gas masks selling like hotcakes on the Internet—may be a pale presentiment of the country's response to a more widespread and murderous bioterrorist attack. Likewise, the fractures in the U.S. public health system revealed during the autumn of 2001 could become even more pronounced under a large-scale assault.

October 2001 did demonstrate that doctors' diagnostic reflexes are getting faster when faced with suspicious symptoms; like the first cases of West Nile virus in New York City, the initial cases of both inhalational and cutaneous anthrax were caught by astute physicians. But what if terrorism hadn't already been saturating the headlines? The first victims of an unannounced attack might drift in to emergency rooms, doctors' offices, and urgent care clinics. They would have vague flulike symptoms, since most infectious diseases begin with aches, fever, and chills—the immune system's first response to infectious invaders. At this stage of the outbreak, clues may be too subtle and too diffuse to pick up. Ideally, an alert doctor would raise a red flag if something were amiss and would call the local public health department. But if the early wave of patients could walk out, they would probably be sent home with a diagnosis of flu—as, tragically, was a Washington, D.C., postal worker in October 2001, who hours later died of inhalational anthrax.

To begin wrapping their minds around the practical challenges of a massive bioterrorist assault, health officials had begun staging so-called tabletop exercises even before the assaults of 2001, rehearsing their decisions and actions around a table as a hypothetical attack unfolded. The results were revealing, but not especially reassuring. In 1998, when federal officials play-acted an outbreak of a fictional smallpox/Marburg hybrid virus along the Mexican-American border, they discovered huge gaps in logistics and departmental turf wars. Hospitals sagged under the strain, federal quarantine laws failed, and an international political crisis exploded.

One of the biggest drills took place in May 2000 in Denver. Dubbed TOPOFF, because top officials from the federal government took part, the exercise simulated what would happen after a release of pneumonic plague at the city's center for performing arts. In this drill, public health officials, hospital employees, and political leaders stayed at their workplaces, as they would during a real incident. The drill unfolded according to a prewritten script that most of the participants did not know in advance. Participants learned about made-up events in this virtual attack—frightened citizens flooding emergency rooms, residents fleeing the city—from slips of paper handed to them over four days.

The drill kicked off when ten "patients"—in reality, healthy actors—began showing up at hospitals complaining of fever and cough. By the end of the first day, according to the script, there would be 783 cases of pneumonic plague and 123 deaths. Hospital staff called in sick, and antibiotics and ventilators were scarce. Though the script called for emergency "push packs" from the CDC's National Pharmaceutical Stockpile, Denver officials realized they didn't have enough people to unload the two airplanes full of drugs and supplies. In this city of one million, the official antibiotic distribution center would have been able to hand out drugs to only 140 people an hour—appallingly short of the goal of 100,000 people a day. In real conference calls during the drill, health officials took hours trying to agree on the proper dosage of antibiotics or deciding how close a potentially infected person would have had to stand to a victim in order to be considered "exposed." By day four, 3,700 people were infected and 950 were dead. Bodies piled up in morgues. In response to these scripted developments, city and state officials issued virtual orders commanding residents to remain in their homes and closing airports, bus stations, and train depots. With state borders closed, however, virtual food supplies ran out. Soon, cases emerged in other states, as well as in England and Japan.
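
The arithmetic behind that shortfall is stark: 140 people an hour, even around the clock, comes to 140 x 24 = 3,360 people a day, barely 3 percent of the 100,000-a-day target. At that pace, reaching all one million residents would take the better part of a year.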

In June 2001, in an exercise dubbed "Dark Winter," a mock National Security Council portrayed by former senior government officials wrestled with a fictional smallpox outbreak that began after a release of the virus in shopping malls in Oklahoma City, Philadelphia, and Atlanta. Over a simulated time span of about two weeks, vaccines ran out, officials bickered over quarantine measures, and 6,000 Americans were dead or dying. By the finale, the imaginary epidemic had spread to 25 states and killed several million. When U.S. Vice President Dick Cheney saw a video of the Dark Winter practice drill, shortly after the September terrorist attacks, he was so alarmed that he raised concerns about the smallpox vaccine supply that very day at a National Security Council meeting.

If the United States did face biowarfare, gas masks and private reserves of ciprofloxacin would be pretty much useless (gas masks because they would have to be donned immediately, Cipro because it doesn't work against many pathogens and it would engender antibiotic-resistant bacteria). Only a robust public health system—one that instantly registers aberrant syndromes and anomalies in infection rates, figures out the problem, and quickly intervenes—could actually curb the spread and devastation of the disease. In the ideal world, doctors and nurses would immediately recognize the unfamiliar symptoms of a wide spectrum of biowarfare agents; state and local health departments would continually collect data from hospitals about patients with suspicious pneumonias, meningitis, blood infections, diarrhea, botulism-like symptoms, rashes with fever, and fatal unexplained fevers—symptoms that, en masse, suggest deliberate infection; pharmacies would report spikes in over-the-counter drug sales; laboratories would perform rapid tests that would unmask an intentionally released pathogen; extra hospital beds and emergency supplies would be in place; a wide range of vaccines and antibiotics would be stockpiled; officials would know in advance precisely the decisions—about vaccinations, quarantines, travel restrictions, and so on—that they would make in a crisis.
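
The surveillance piece of that ideal system rests on a simple statistical idea: flag any daily count that climbs far above its recent baseline. The sketch below is a minimal illustration in Python, not any health department's actual algorithm; the seven-day window and three-standard-deviation threshold are assumptions chosen for clarity.

    from statistics import mean, stdev

    def flag_spikes(daily_counts, window=7, threshold=3.0):
        """Return the indices of days whose count spikes above baseline.

        daily_counts: daily case (or drug-sale) counts, oldest first.
        window: number of preceding days used as the baseline.
        threshold: standard deviations above the baseline mean
            that a count must exceed to raise an alarm.
        """
        alarms = []
        for day in range(window, len(daily_counts)):
            baseline = daily_counts[day - window:day]
            mu, sigma = mean(baseline), stdev(baseline)
            # Guard against a perfectly flat baseline (sigma near zero).
            if daily_counts[day] > mu + threshold * max(sigma, 1.0):
                alarms.append(day)
        return alarms

    # A quiet fortnight of flulike ER visits, then a sudden jump.
    visits = [12, 9, 11, 10, 13, 8, 11, 10, 12, 9, 11, 10, 13, 41]
    print(flag_spikes(visits))  # [13]: only the last day trips the alarm

Real syndromic-surveillance systems are far more sophisticated, adjusting for weekends, seasons, and reporting lags, but the underlying principle of comparing today's numbers against an expected baseline is the same.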

Unfortunately, public health has long been considered a poor second cousin to curative medicine. Indeed, it's a bitter joke in the profession that a master's in public health is the only degree that reduces one's salary. In CDC labs, plastic sheeting protects equipment from leaky ceilings. So strapped for money are many local health departments that no one staffs the phones on weekends. Transforming this creaky system into one that can handle germ warfare is a herculean task. "One of the biggest lessons from Dark Winter was the clamor from the participants for more information," says Tara O'Toole. "What they wanted to know was: What is the scope of this attack? Is it one attack or multiple attacks? How many are sick? Where are they? Are things getting worse? Are things getting better? Everything they wanted to know was public health data. It wasn't information that could come from FBI or CIA or the military."

An advanced public health system, of course, pays double on its investment, because while girded for the possibility of bioterrorism, it is also ready for more common contingencies, such as a schoolyard meningitis outbreak or an urban hot spot of drug-resistant TB. "An emergency system that's dusted off and used only during a rare event isn't going to work," says Tara O'Toole. "These systems have to be part of our daily routine if they're going to operate during crisis." The rapid diagnostic techniques that make it possible to swiftly detect anthrax or plague could quickly diagnose pneumonia in nursing home residents, or antibiotic-resistant staph in premature newborns. The sensor technology that can identify biowarfare agents in the body or in the environment could be used to detect Salmonella in chicken or E. coli in ground beef. The distribution systems for delivering antibiotics and vaccines after a terrorist attack could deliver antivirals and vaccine during a flu pandemic. If all public health departments shared the same electronic surveillance architecture, they could instantly mesh data on any breaking epidemic, from Cyclospora to St. Louis encephalitis. A public health system that can handle a massive anthrax assault—or, even more horrifying, a return of smallpox—should be able to respond to just about anything.

But such a system is a long way off. Thirty percent of all U.S. hospitals are in the red and nearly 60 percent of academic medical centers can't meet their operating expenses. Hospitals are short on beds; thin or nonexistent profit margins and managed care demands for cost-cutting have forced them to send more patients home right after surgery and to operate near capacity all the time. Budget pressures have also forced doctors to order fewer laboratory cultures, instead just treating unidentified infections with broad-spectrum antibiotics—bypassing potential clues to a deliberately spread epidemic. Just-in-time management also means that hospitals have small supplies of drugs on the shelves; today, they often run short of drugs for common infections. A national shortage of nurses and pharmacists would compound problems in the event of a widespread attack. There's no slack in the system—no "surge capacity," to use public health jargon—should hundreds or thousands of people suddenly get sick. If hospitals run out of beds during an unremarkable flu season, they would be hopelessly besieged after a biological weapon release. In a survey of 30 hospitals in four states and Washington, D.C., published in 2001, none were prepared to handle large numbers of casualties caused by biological, chemical, or nuclear weapons; indeed, 26 hospitals reported that they could only handle 10 to 15 victims at once.

Back in 1951, when the CDC's Epidemic Intelligence Service got off the ground, there was a mystique about "shoe leather epidemiology." There still is—and some jobs, such as interviewing victims, still must be done face to face. But shoe leather can't keep up with today's complex and speeded-up world. Some state epidemiologists are tracking cases with pushpins on paper maps, instead of databases and software that can quickly link cases. "Many of my public health colleagues," CDC director Jeffrey Koplan testified before Congress, "are still working on technologies that involve paper and pen, telephones, while their kids are at home using the Web and Internet to order from Lands' End and Toys R Us." In the event of a large-scale attack, the dearth of real-time numbers would stymie health officials. "We're not going to be able to say how many cases there are, where they are, whether everybody came from the same hockey arena, or whether this is more than one attack," says Tara O'Toole. During the deadly 1918 flu pandemic, O'Toole adds, "The public health system lost credibility overnight because they couldn't say if the epidemic was waxing or waning."

This loss of credibility was glaring in the fall of 2001, when a steady drumbeat of anthrax news unnervingly sounded through every news cycle. Federal officials publicly disagreed about key questions, from the scope of contamination to the potency of anthrax spores delivered through the postal system to exactly how victims were exposed and what their treatment should be. Authorities warned of future attacks and in the same breath minimized an individual's risk of disease. CDC administrators blamed the FBI for withholding information critical to public health decision-making. To be fair, these officials were trying to keep up with an outbreak that didn't follow the script terrorism experts had written—the premises of which were partly based on old military experiments using research animals. Even so, no single government office or official seemed able or willing to coordinate the frantic, chaotic medical and law enforcement investigations. Can the new Office of Homeland Security, created in response to the September 11, 2001, attacks, pull together the political power and personal determination needed to cut through entrenched bureaucracies' deep suspicions of change and of each other?

One reason the United States was caught off guard in 2001 was a patchwork national policy on bioterrorism preparedness. Part of the problem lay on Capitol Hill: in the competition for a windfall of counterterrorism dollars in the late 1990s, public health advocates had been forced to jostle against sharp elbows from the military, law enforcement, and intelligence communities. In Congress, oversight for counterterrorism cuts across 11 Senate committees and 14 House committees. "This new mission of civilian biodefense has been dropped in upon an organizational landscape that is uncharted and basically unfriendly to the mission," observed Richard Falkenrath of Harvard's Kennedy School of Government. Civilian biodefense, he added, was "a homeless mission"—an irony, considering it was part of the EIS's original job.

Frustrated public health officials perceived budgetary brinkmanship instead of a reasoned analysis of the problem. National security leaders apparently didn't understand the consequences of biological attack. "The notion of it being an epidemic had escaped them," says Tara O'Toole. Peggy Hamburg, while assistant secretary in the Department of Health and Human Services during the Clinton administration, couched her arguments for public health improvements in terms of the nation's safety and security. Yet at the time, opponents often asked her why the United States needed to invest tens of millions of dollars to produce a vaccine for a disease—smallpox—that didn't exist. Hamburg concedes that some of her liberal colleagues were also nervous that the mission of public health had somehow gotten mixed up with the agendas of the military and of law enforcement. In September 2001, just before the first anthrax cases came to light, a General Accounting Office report presciently stated that the U.S. public health care system was fragmented and poorly trained to respond to germ attacks—as we all saw. "Turf wars and overlapping jurisdictions," the New York Times reported a few weeks later, "are hampering progress in the investigation of the anthrax outbreaks."

Back in the 1950s, Alexander Langmuir exploited the government's fear of Communist-inspired biowarfare to build a superb system of disease surveillance. But as historian Elizabeth Fee points out, "At the same time that funding for biological warfare research was increasing in the United States, funds for local health departments were cut sharply." Will the front lines again be shortchanged? In the event of a bioterrorist attack, as the GAO report asserted, "cities would probably be on their own for the first 24 to 72 hours." According to the Henry L. Stimson Center, a public policy research organization, the federal 2001 fiscal year budget for combating terrorism was $9.7 billion; of that, less than $100 million went to public health infrastructure and surveillance. Indeed, just in the month after the initial wave of anthrax attacks in 2001, many state health departments had consumed a full year's budget. No doubt, funding will grow and shift in the coming years. In late 2001, the Department of Health and Human Services requested billions of dollars to speed production of the smallpox vaccine, boost hospital preparedness, hire more epidemiologists, and tighten security at laboratories that handle bioterrorist agents. But all public health is local, and fortifying our defenses against germ warfare will above all require routing federal money to the immediate fields of battle.

Before the 2001 spate of anthrax cases, the small amount of bioterrorism preparedness money that did trickle down to state health departments improved readiness across the board. "It's such a dry desert in public health," says Tara O'Toole, "that the capacity to buy some basic equipment, to stand up some rudimentary epidemiological programs, to hire a couple of more people, will make a difference." Because of the federal government's previous investment in preparedness, state labs now have more Biosafety Level 3 facilities for dangerous pathogens that can cause disease through inhalation. And because the CDC had spent more than $8 million in 2000 to staff and supply a network of 81 public health labs to detect bioterrorist pathogens, the agency was relieved of some of its onerous laboratory caseload when the anthrax emergency struck.

Still, as health officials realized in 2001, the U.S. public health system desperately needs faster methods of detecting and treating deadly pathogens. Traditionally, lab technicians have had to grow organisms in culture media before making an ID—a process that sometimes takes days. Around the corner are portable devices that can quickly decipher the genetic material of a suspicious agent, compare its DNA to that of known strains, and even discern single nucleotides that may be giveaways for antibiotic resistance or genetic engineering. Scientists are experimenting with biosensors that connect living tissues to electronic chips that would trip an alarm. Researchers are also working on versatile treatments for terrorist-sown diseases. At the Pentagon's Defense Advanced Research Projects Agency (DARPA)—the toils of which led to the development of the Internet and of stealth aircraft—scientists are making antitoxins to neutralize the products of deadly bacteria, and topical patches to protect exposed victims against anthrax and other diseases. True to its reputation for unorthodoxy, DARPA is even breeding strains of bees that can track and follow the sources of airborne toxins. Meanwhile, the National Institutes of Health and the Defense Department are developing vaccines against every major weaponizable germ; theoretically, the vaccines could be administered prophylactically to hospital workers and police, and distributed to the general population after an attack to stop the disease from spreading.



Fingerpointing



Although September 2001 and its aftermath were highly publicized, future acts of bioterrorism won't necessarily be. The most insidious aspect of an intentionally planted epidemic is that it could be hard to distinguish from a natural disease outbreak. Public health officials would take notice if lots of people suddenly became ill from a single disease agent or suffered pulmonary symptoms (suggesting an aerosolized microorganism or toxin); if large numbers of people became ill from an agent not previously seen in their geographic area; or if several deadly epidemics erupted simultaneously. But these same criteria also describe recent high-profile outbreaks of natural causation—outbreaks that themselves looked at first like bioterrorism but proved not to be.

The 1976 epidemic of a highly fatal respiratory infection at an American Legion convention in Philadelphia, for instance, had all the hallmarks of a deliberate chemical or toxin attack—especially since it took months to identify the bacterium that causes what we now know as Legionnaires' disease. When young Navajos started mysteriously drowning in their own lung fluid in 1993, not far from the U.S. National Laboratories at Los Alamos and at Sandia, rumors floated that the agent had escaped from the national labs or had been released as an act of genocide against the Navajo people—until it was discovered that the hantavirus later dubbed Sin Nombre virus comes from contact with rodent wastes. AIDS in Africa in the 1980s, dengue fever in Cuba in 1981, pneumonic plague in India in 1994, foot-and-mouth disease in hogs in Taiwan in 1997, Nipah virus among animals and humans in Malaysia and Singapore in 1998, West Nile virus in New York City in 1999, foot-and-mouth disease in Great Britain and Europe in 2001: all were at some point suspected of having been deliberately introduced.

Especially in nations where, unlike in the United States, infectious disease is endemic, bioterrorism is easy to suspect but hard to confirm. "Once an allegation is made, it is impossible to disprove it completely, since the nature of the weapon makes it almost invisible," writes historian John Ellis Van Courtland Moon. "If it is difficult to prove that it has ever been used, it is impossible to prove that it has not been used. Doubt is never totally exorcised."

And though the United States has been the victim of the most high-profile attack to date, it is not above suspicion as a possible culprit. Just a week before the September 2001 terrorist attacks, American media reported that the U.S. government had conducted secret research on biowarfare preparedness. The Pentagon had drawn up plans to reproduce a Russian genetically engineered strain of the anthrax bacterium in order to test the U.S. military anthrax vaccine, and had built a mock germ factory in Nevada from commercially available materials. Meanwhile, the Central Intelligence Agency had constructed a model of a Soviet germ bomblet that the agency feared was being sold on the international market.

Many experts believe these sub rosa experiments violated the spirit, if not the letter, of the 1972 Biological Weapons Convention. The treaty, at this writing ratified by 144 countries, states that signatory nations would "never in any circumstances develop, produce, stockpile, or otherwise acquire or retain" biological weapons. As prophylactic and defensive research, the two Pentagon programs were permissible under the BWC, says Barbara Hatch Rosenberg, a biologist and chairman of the Federation of American Scientists Working Group on Biological Weapons. (In October 2001, the Pentagon decided to continue its research on genetically engineered anthrax bacilli.) The CIA germ bomb, a potential delivery system for biological agents, was not.

The BWC is an agreement between nations, not international law. One of the provisions negotiated after 1972 compels signatory nations to report annually on their defensive research. "The U.S. government was incredibly shortsighted," says microbiologist Raymond Zilinskas, who served on a UN weapons inspections team in Iraq. "This kind of research is permissible under the Biological Weapons Convention if it's carried out in the open and reported as part of the confidence-building measures—neither of which was done. Here we are doing activities that, if we found out they were being done in Iraq or Iran or North Korea, we would probably immediately bomb the hell out of them."

"It makes it look like we're trying to get away with something," adds Rosenberg. "I don't believe the U.S. intends to develop or possess offensive biological weapons. But I think a lot of the world does, and I think this plays right into their hands."

Do researchers "need to create every monster bug in order to know how to defend against it?" asks Harvard University's Matthew Meselson. As he sees it, "Anything that's done in dark secrecy is going to arouse suspicion. I am worried about the defensive work that's going on. I don't see any coordination or overall safety mechanism to make sure that we don't actually stimulate the very thing we dread. If we saw anybody else doing this, we would be very upset." Indeed, Meselson is convinced that the prospect of bioterrorism "is likely to depend not so much on the activities of lone misanthropes, hate groups, cults, or even minor states as on the policies and practices of the world's major superpowers." Our government, he says, should approach the problem "as though it were a species interest and not a parochial American interest."

To fortify the BWC, delegates have tried to tack on provisions requiring on-site visits to make sure nations are obeying the treaty, as well as challenge inspections when nations are suspected of violations. In the summer of 2001, the Bush administration wiped out years of work on the protocol by being the only nation to reject the draft text, on the grounds that surprise inspections could threaten national security or reveal drug companies' commercial secrets. After September 11, it reaffirmed that stance. The decision, while roundly criticized, goes to a central dilemma: virtually all the agents and equipment needed to make bioweapons are also needed for legitimate medical and industrial purposes. Conversely, any operation that makes vaccines, antibiotics, feed supplements, or fermented beverages could be converted to making biological weapons. The razor-thin distinction is one of intent. Biowarfare materials are inherently "dual use," and thus difficult to police. According to Zilinskas, most secret biowarfare work is likely to remain hidden unless an accident tips off the world (as happened in 1979 in Sverdlovsk); a nation's defeat reveals information about its biowarfare program (as happened with Iraq after the Gulf War); or intelligence sources ferret out clues to a secret program.

Barbara Rosenberg fears that by pulling out of the protocol and conducting secret studies, the U.S. government gave license to rogue states to do their own dubious experiments—research that may eventually find its way to terrorists like the Al Qaeda organization. "I don't think terrorists can possibly launch an attack without support from a government that's carried out extensive work on biological weapons," she says. "In turning down the treaty, the U.S. has turned down one of the very few means we have for exerting pressure on foreign governments not to get into that."

While treaties and negotiations are under way to prevent an attack, the defense against either a natural or an intentional epidemic is the same: a robust global public health surveillance system, internationally financed and managed. Notes Tara O'Toole, basic research done in the name of bioterrorism preparedness—a kind of BioApollo project, as ambitious as the mission to reach the moon—could benefit all nations. "Most of the mortality in the Third World in the coming decade is going to be from infectious disease," she says. "If we become so smart about the immune system and fighting off infectious diseases that we think we can handle just about anything that gets thrown at us—because we'll have the diagnostic capability and the ability to rapidly formulate a biological response—we're going to have spin-offs that make a momentous difference to the health of the Third World. That, in turn, should narrow the gap between the haves and the have-nots—which is at least part of the reason that these asymmetric weapons are attractive: as something to use against the great Satan of the American hegemon."

Time and again, history has shown that humankind's battle against disease can unite, at least temporarily, the enemies in shooting wars. Cease-fires have been brokered in Sudan, Sierra Leone, Angola, and other countries so that immunization days could be held. During El Salvador's civil war from 1985 to 1990, there were three annual cease-fires between government and guerrilla forces to provide every child in the country with immunization and booster shots. These "days of tranquility," as they were called, give historian and political scientist Leonard Cole hope. "Those who insist that biological weapons are the weapons of the future must explain why they have not been weapons of the past. Why have these easy-to-make, easy-to-disseminate, inexpensive weapons almost never been used? The answer, at least in part, rests in reasons that inspired the days of tranquility." As he sees it, "A party that suspends fighting in order to eradicate disease one day is far less likely to spread it the next."

But after 2001, one must ask: What of groups that are willing to violate our deepest moral precepts—groups that have no political constituency, no goal of tranquility—groups that answer only to themselves?

Table of Contents

Disease in Disguise
Winged Victories
Food Fright
Superbugs
The Once and Future Pandemic
Infection Unmasked
Bioterror
Think Locally, Act Globally
Selected Bibliography

Interviews

Exclusive Author Essay
The Unwanted Guest

Being a newly published author normally makes you a coveted guest at your friends' dinner parties, at least for a week or two. Writers, after all, can unreel tales of their topic when the conversation gets slow.

The trouble is, my new book -- Secret Agents: The Menace of Emerging Infections -- is about the many and diverse infectious horrors that modern life has handed us. What's worse, I am all too willing to inform my dining companions about the disease-causing bacteria and viruses lurking in our homes, workplaces, backyards, children's day care centers, even perhaps in the very food that is now being passed around the table.

My monologue usually begins when I arrive and notice the salad being readied for the table. "Have you washed the lettuce?" I blurt out. "If not, we may be at risk for E. coli O157:H7." If some smart-aleck snorts in disbelief and tells me that E. coli is found only in raw hamburger, I'll list the dozen or so recent outbreaks traced to carrots, apples, coleslaw, unpasteurized juices, and, yes, lettuce.

Once we are settled in at the table, I am often moved to comment upon the pathogenic properties of poultry. "Ever notice that it's hard to buy a chicken these days without Salmonella or Campylobacter?" Of course, I don't mean to cast gloom on the occasion, so I'll add a helpful consumer precaution. "Always cook your bird until the juices run clear," I say, tipping the serving platter to demonstrate my point.

At this point, the guests may lapse into silence. I sense an opening for a new subject. "How many of you leave your kids in day care?" After this informal poll, I'll breezily mention that these facilities are "microbial cesspools" -- that phrase always gets people's attention -- teeming with Shigella, Streptococcus, and who knows what else.

"That's why we have antibiotics," someone might say. Of course, I quickly correct this careless delusion. "Actually," I'll respond, "we're using too many antibiotics in this country. Bacteria are becoming resistant to everything we throw at them. If we keep this up, we'll be back in the Dark Ages of medicine."

Now the plates are being cleared for dessert. Inwardly, I'm hoping that the final course does not consist of fresh raspberries from Guatemala -- surely everyone remembers the Cyclospora epidemic from a few years back? To banish that from my mind, I'll turn to current events. "If you were a bioterrorist, which weapon would you deploy: anthrax, smallpox, or plague?" My dining companions stare at me helplessly. Since I don't want them to feel embarrassed for not knowing how to answer, I describe the strategic pros and cons of each.

By now, the mood is unmistakably subdued. But I consider it a teachable moment. "Emerging infections are evolution in action," I'll explain. "Never underestimate an adversary that has a 3.5-billion-year head start."

Someone tosses down a fork in exasperation. "Can't you talk about anything besides your book?"

My mind casts about for fresh conversational tidbits. "Have you heard," I ask, "the latest theories on global warming?" (Madeline Drexler)
