The adenoviruses are a family of forty-nine viruses, identified by sequential letters and numbers, first found in adenoids (lymphoid tissue at the back of the pharynx) in the early 1950s, but definitely around for a long time before that. There are also adenoviruses that infect only animals. The ones that like people cause about 5 percent of all respiratory illnesses, ranging from mild flu-like symptoms to pneumonia; these illnesses are rarely fatal, though they are of special concern among military recruits. Some types cause gastrointestinal illnesses; others can cause conjunctivitis among swimmers in lakes or insufficiently chlorinated swimming pools. Most children around the world have been infected with the more common adenoviruses by the time they reach school age.
Outbreaks of nontrivial respiratory illnesses in new soldiers have been a recognized problem since at least the Civil War. The term "acute respiratory disease" gained acronym status as ARD during World War II. Up to twenty cases a week per one hundred recruits could be expected, creating a significant drag on training. Starting in 1971, all American military recruits were vaccinated against the most common adenoviruses, but in 1996 the sole manufacturer of the vaccine stopped production. By 1999, illness during basic training had returned to pre-vaccine era levels, and in 2000 the first two deaths since 1972 occurred.
Just why some of the adenoviruses enjoy these young soldiers so much, compared to civilians, is not clear, but crowding and stress are clearly part of the problem. Populations of recruits have traditionally been subject to diseases usually seen in childhood, such as chicken pox, mumps, and rubella (German measles), perhaps because of the mixture of immune and nonimmune bodies from many different locales. Except during massive mobilizations, troops can be separated into small units when they first arrive, which helps to avert epidemics. Then again, many of these guys and gals don't mind a few days in sick bay.
The ability of adenoviruses to infect our tonsils and adenoids for long periods of time implies that they have learned how to outwit our defenses. They do not fall into a latent state, like herpes viruses, but reproduce constantly at a snail's pace, changing the cells they enter in such a way that the immune system cannot tell the difference. Some of their tricks give them the power to produce tumors, but this has been observed only in experimental hosts such as hamsters. Why adenoviruses cannot cause cancer in people is unknown.
When the Lord visited "a very severe plague" upon Pharaoh's cattle in Exodus 9, it was surely anthrax. Tough Bacillus anthracis spores can persist for decades in alluvial soil like that of the Nile valley, ravaging herds no matter whose side God is on--in this case, the Israelites were probably spared because they camped on sandy ground above the river's floodplain. Livestock dying in the grip of anthrax's gruesome spasms and convulsions would definitely seem cursed. (The lower Mississippi River valley is another well-documented haven for the disease--the first American cases were reported among animals in Louisiana in the early 1700s.) Anthrax is known as a zoonotic disease (from the Greek prefix zo-, meaning "life" or "animal," and nosos, "disease") because people catch it from animals--primarily cattle but sometimes sheep, horses, pigs, or goats--not from other people. Spores enter the body by means of cuts (cutaneous anthrax, which causes the coal-colored skin lesions for which the microbe is named, from the Greek anthrakis, meaning "coal"); the breath (inhalation or pulmonary anthrax, rare but quite deadly); or ingesting bad meat (gastrointestinal anthrax). The spores then multiply and produce enough toxin within a week or so to injure surrounding cells and tissue. Today, of course, we know anthrax as bioterrorist fan mail.
In mid-nineteenth-century England and Germany anthrax was called woolsorter's and ragpicker's disease, respectively, because these workers caught it from spores in fibers and hides. Robert Koch helped launch the science of bacteriology in 1876 by developing a method to grow pure anthrax culture in the laboratory. When anthrax spread through sheep herds in France in 1877, Louis Pasteur began research that produced the first successful vaccination of livestock in 1881. Laws were enacted in the 1920s to require testing of shaving brushes that used horsehair or pig bristles. The largest outbreak in the United States happened in 1957, when nine employees of a goat hair processing plant became ill after handling a contaminated shipment from Pakistan. Four of five patients with the inhalation form of the disease died. In the 1970s, cases were traced to contact with souvenir drums from Haiti fitted with goatskin drumheads. An amateur weaver died after breathing spores in imported yarn. Doctors recorded hundreds of cases of cutaneous anthrax in the United States in the twentieth century, chiefly among agricultural workers; only eighteen cases of inhalation anthrax were noted during that time, with the last reported in 1978.
Because the spores are so efficient, anthrax has long been researched for use as a weapon. In 1925, drawing on the language of World War I peace treaties, the Geneva Protocol banned bacteriologic warfare and was ratified before World War II by all the great powers except the United States and Japan. After the war, President Truman withdrew the treaty from the Senate. A Soviet resolution calling on all United Nations members to ratify the ban was rejected in 1952. From then on, the United States and the Soviet Union, as well as other major nations, maintained extensive facilities for producing and testing germ weapons, including anthrax. In 1969, following an incident in which nerve gas drifted off a secret military test range in Utah and killed hundreds of sheep, President Nixon renounced chemical and biological weapons. In 1975, the United States finally ratified the international ban on such armaments.
A 1979 anthrax epidemic in Sverdlovsk (now Yekaterinburg) in the Soviet Union raised concern about whether the outbreak was natural or due to activities at a military microbiology lab. Work on offensive biological weapons was illegal, but defensive development of vaccines and protective gear was permitted by the treaty. Soviet officials claimed at the time that victims developed gastrointestinal anthrax after eating contaminated meat and cutaneous anthrax from contact with infected livestock. In 1992, Russian President Boris Yeltsin announced that the KGB had admitted that the outbreak had a military cause. Two years later, an independent team of American and Russian experts determined that a windborne aerosol of anthrax spores that had escaped from a military lab on April 2, 1979, produced the epidemic that killed sixty-eight of the seventy-nine cases, making it the largest documented outbreak ever of human inhalation anthrax.
Bioterrorist-related cases of cutaneous and inhalation anthrax in the United States in 2001 (including five deaths from inhalation) have been widely reported in the popular and professional media. Readers of all levels can stay abreast at the Centers for Disease Control and Prevention's Web site (www.cdc.gov), thereby avoiding the hysteria that understandably accrues to this subject and is, indeed, part of the psychological intent of terrorism. The book is being rewritten, so to speak, on the pathogenesis, diagnosis, clinical manifestations, therapy, epidemiology, and decontamination of anthrax. Even the number of inhaled spores necessary to produce disease--previously based on animal tests at military labs--is being reconsidered in light of what happened to postal workers and others. Most of the military scientific information from decades past, especially regarding so-called weaponized germs like the material released in U.S. Senate offices, was secret and therefore unavailable or is of dubious value to current researchers and physicians.
For the general public, the risk of getting anthrax remains negligible, at least in developed countries with modern animal husbandry and industrial hygiene. The greatest fear is fear itself, now as ever.
The word "arbovirus" derives from the phrase "arthropod-borne virus," which means viruses that propagate inside insects and other arthropods and reach us through bites. There are more than 520 known arboviruses, about a hundred of which infect humans, often without producing apparent symptoms. But encephalitis, yellow fever, dengue fever, and a veritable zoo of exotic tropical fevers and malaises--Mayaro, Kyasanur, Bunyamwera, Marituba, Punta Toro, Candiru, West Nile--give these microbes a deservedly bad reputation. Mosquitoes, ticks, and sandflies are the most common carriers, with various other flies and mites guilty to a lesser degree.
People are usually "dead-end" hosts for the arboviruses. That is, most of the germs don't need us for long-term survival--we just get in the way sometimes and then feel sick. Birds are much more crucial to arboviruses as long-term hosts. The exceptions of great significance are yellow fever, dengue, and chikungunya, where we serve as a vital link in their life cycle. Before the jet age, many of the arbovirus illnesses were encountered only by Indiana Jones types, but today they can turn up almost anywhere, at least in isolated cases. Epidemics are unlikely wherever the associated insects are kept under control.
Arenaviruses--named after the Latin word for sand because of grainy ribosomes (protein factories) they capture from host cells--are a family of viruses first identified in 1933 during an encephalitis outbreak in St. Louis, Missouri. They turned out to be innocent in this epidemic, but one was found to cause meningitis. Members of the tribe that have so far been identified sport names that would make a powerful magical incantation: Amapari, Junin, Lassa, Latino, Machupo, Guanarito, Sabia, Parana, Pichinde, Tacaribe, Tamiami. Their natural hosts would spike the brew even more: bats, rats, and mice. Some of these viruses bring on meningitis or various hemorrhagic fevers when humans come into contact with infected excreta. Feh!
They can be quite deadly. Argentinian hemorrhagic fever, caused by the Junin virus discovered in 1953 near the Junin River, may produce a 20 percent mortality rate as it disrupts capillaries and essentially causes its victims to bleed to death. The Bolivian brand (Machupo, also named after a river), first called el typho negro, the black typhus, can reach 30 percent mortality. It probably emerged via field mice of the genus Calomys as jungles were cleared for agriculture during the 1950s. Lassa fever, which first came to the attention of Western scientists when American nurses were stricken in a Nigerian village of that name in 1969, topped 60 percent mortality in one hospital outbreak. Patients are therefore attended by medical staff in biohazard moon suits--not a reassuring sight at bedside. Disinfection procedures under such circumstances are called "barrier nursing," a term that neatly captures the scene.
Fortunately, most of these killers have remained confined to certain geographic areas, though cases have appeared elsewhere through the swift vagaries of modern travel. In 1999 and 2000, three fatal cases of indigenous arenavirus infection occurred in California involving Whitewater Arroyo, a strain first detected in New Mexican woodrats in the early 1990s. One victim apparently cleaned rodent droppings at home just before getting sick, but the other two had no such history of exposure.
Direct person-to-person contagion is the exception, not the rule, so with proper surveillance these diseases should stay in the exotic category. Unfortunately, they most often occur in parts of the world where effective monitoring may be too much to expect of state medical agencies.
As many as two million Americans are bitten by dogs every year, accounting for 1 percent of emergency room visits and more than $30 million in direct medical-care costs. Most nonrabid bites are provoked, and most victims are males under twenty years old. Nearly a third of these wounds may become infected, regardless of first-aid measures like washing or applying salves. In general, bites to the hand are more serious and more frequently become infected than chomps to other areas. Infections are due mostly to germs in the dog's mouth, including Pasteurella, staphylococci, and Weeksella. Snakes account for another eight thousand punctured Americans annually. But human bites, including those self-inflicted by finger suckers, are far worse than animal bites. Many of the bacteria in human bite wounds may be resistant to penicillin. The bacteriology of clenched-fist injuries, where one bloke socks another in the teeth, is similar to that of bites and should be taken into consideration by Hollywood screenwriters. Vampires are already big money in Tinseltown, of course.
One of the planet's great ubiquitous organisms, the bacterium that causes whooping cough, Bordetella pertussis, cannot be avoided. It is everywhere and ferociously virulent, infecting 90 percent or more of susceptible individuals after exposure through close respiratory contact. It thus strikes early in life, with fatality rates highest among the youngest--a hundred years ago it was a major cause of infant mortality around the world, and it may still grimly reap wherever people are too poor or isolated to get immunized. It causes about 40 million cases of whooping cough (pertussis) and 400,000 deaths annually. A vaccine was widely available in the United States by the mid-1940s. Pertussis deaths had begun to decline for unknown reasons even before that decade, but local outbreaks still occur and increased during the 1980s in the United States, perhaps due to lax immunization. In 1999, there were 7,288 cases reported in the United States, 11 percent of which were among kids between one and five years old whose immunity should have been assured. Adolescents and adults become susceptible again as immunity from childhood vaccination weakens.
The terms "pertussis" and "whooping cough" first appeared in English in the seventeenth century. Before that the disease had been called "chin cough" or, in France, quinta (apparently referring to an interval of about five hours between coughing spasms) and coqueluche (also used for influenza). After incubating for one to two weeks, Bordetella pertussis invades the respiratory tract, causing a hacking cough. Over the next ten to fourteen days this progresses into a paroxysmal cough that may happen five to fifteen quick times in a row, followed by a high-pitched, crowing deep breath and vomiting. Convalescence begins within a month, and most people recover after a seven-week illness. Pneumonia and brain damage are possible complications. Whooping cough was epidemic in the American colonies, particularly in South Carolina. Bordetella pertussis was not isolated until 1906, by the Belgian physiologist and Nobel Prize winner Jules Bordet (1870-1961).
Fighting the germ requires diligent effort: Children in developed countries usually receive three doses of vaccine (a combination of diphtheria, tetanus, and pertussis vaccines, or the DTP shot) during the first six months of life, with boosters around fifteen months and before starting school. Since 1940, this vaccine containing whole dead bacteria has cut the incidence of whooping cough by 99 percent in the United States, but it can on rare occasion produce serious side effects, including high fever and convulsions. In 1988, the U.S. government established an $80 million fund to compensate children injured by the vaccine, prompting some 4,700 claims. Since then, research has focused on developing safer vaccines that contain antigenic components of B. pertussis, rather than whole cells.