A concise history of antimicrobial therapy (serendipity and all)


By Andrew J. Schuman, MD

Some brilliant observations and more than a little luck have given physicians their most powerful weapons against infectious disease. Can we keep the upper hand as bacteria continue to develop resistance to antibiotics?

In the course of a typical day in a pediatric office, dozens of prescriptions are written for antibiotics to treat a variety of routine infections. Yet it was not so long ago, before antibiotics were available, that pediatricians and other physicians were powerless against infectious diseases. Our repertoire of treatments was painfully limited to measures directed at reducing fever, encouraging hydration, and quarantining the sick to contain the spread of life-threatening disease.

In little more than half a century, antibiotics have radically changed the course of pediatric medicine. This review looks at the impact of infectious disease on children in the preantibiotic era and highlights the many serendipitous events that led to the discovery of pediatrics' most powerful therapeutic agents.

The bad old days

When pediatrics was in its infancy in the last quarter of the 19th century, children often died of diseases that are now preventable with vaccines or easily treated with antibiotics. Typhoid and cholera were common and marked by sporadic epidemics. Other major causes of death in infants and children included diphtheria, tuberculosis, scarlet fever, pertussis, meningitis, and pneumonia.

The 1850 United States census attributed more than 100,000 deaths to what were then called "miasmic diseases," including cholera, typhoid fever, croup, scarlet fever, diphtheria, and whooping cough. Young children were most susceptible—more than half of the deaths of children between 5 and 9 years of age were caused by these diseases. As many as 70% of children who contracted diphtheria died, and mortality from scarlet fever among children under 2 years was 55% in 1840. Even as late as 1915, mortality from scarlet fever among children under 5 years was around 30%. (Some experts believe that the scarlet fever of this period was a different disease, with a different etiology, than the illness we call scarlet fever today.) The death rate from whooping cough approached 25% among infected children, and the disease killed more than 10,000 children a year in the US as late as 1906.1

Physicians used a variety of remedies to combat illness. Bloodletting was commonplace, as was the aggressive use of emetic and cathartic agents. Bloodletting was achieved by using leeches or lancing veins with nonsterile knives and applying heated cups to the incision to encourage the free flow of blood. One of the most common purgative agents was calomel, a preparation of mercurous chloride, which was given to children in high and frequent doses.

A prevailing belief at the time was that any measure that altered a patient's physiologic status combatted infection and could rid the body of toxins responsible for disease. For this reason, ill children were treated aggressively with sweating by placing them under dozens of blankets, or with "puking or purging," as mentioned. It was also believed that the human body could harbor only one illness at a time. Introducing a second infection—via an infected blister, for example—could "displace" an existing illness with one more likely to be survivable. Accordingly, children with serious illnesses were often treated by blistering of the skin with hot pokers, boiling water, or plasters containing acids.1

As medicine evolved, so did novel treatments to combat infectious disease. Pertussis was treated by irradiating the chest with X-rays and, until the late 1800s, pharyngeal diphtheria was treated by injecting chlorine water into the diphtheritic pseudomembrane or spraying the membrane with hydrogen peroxide. In 1885, a New York pediatrician, Joseph O'Dwyer, treated laryngeal diphtheria in children successfully by intubating the larynx with what became known as "O'Dwyer tubes." Eventually, as the bacterial basis of diphtheria came to be understood, this practice was replaced by the use of diphtheria antitoxin, which reduced mortality from diphtheria dramatically. The disease was subsequently conquered by the introduction of active immunization with toxin-antitoxin preparations.

The germ theory of disease

Until the late 1800s, a time of enlightenment and transition for medicine, many different theories were advanced to explain the cause and spread of infectious disease. The Greeks introduced the concept that health resulted from the proper balance of four "humors"—blood, phlegm, yellow bile, and black bile—which led to the theory that bloodletting, emetics, and cathartics could bring the imbalance of humors that caused disease back into harmony. This idea persisted into the late 19th century. Later theories speculated that diseases resulted from the right (or wrong) combinations of climate, poor hygiene, and exposure to bad air.2

The first theory that resembles our current understanding of disease transmission was formulated in 1546 by the Italian physician Girolamo Fracastorius, who thought that disease was caused by invisible "particles" (translated into English as "germs"), which could spread through the air and produce illness by direct or indirect contact with human beings. It was not until 1683 that Antonie van Leeuwenhoek, using a primitive microscope, demonstrated the existence of microorganisms, including protozoa and bacteria, and showed that they could be observed in the saliva and stool of humans and animals. He also discovered that the microorganisms he observed were fragile, and could be killed with vinegar or heat.3

The first clinician to suspect that infection was spread from one person to another was Ignaz Semmelweis, a Hungarian physician. In 1847, he observed that the incidence of "childbed fever" in his Vienna Lying-in Hospital was dramatically lower among women whose babies were delivered by nurse midwives (3.9%) than among those whose babies were delivered by physicians (10%). He also noted that nurse midwives washed their hands between patients whereas physicians did not. He assumed that an agent responsible for infection—which he called "putrid particles"—was being transmitted by contact.

By implementing a policy of universal hand washing with lime chloride solution at the hospital, Semmelweis drastically reduced the mortality of childbed fever to 1.27%. When he published his theories, however, few physicians believed him or adopted his recommendations. He remained a "medical martyr" until his practices were corroborated by later investigators who proved that bacteria were responsible for puerperal fever.2

The next advances in the germ theory of disease came from two investigators in the late 19th century: Louis Pasteur, a French chemist, and Robert Koch, a German physician. Pasteur discovered that yeast was responsible for fermentation and that bacteria caused wine to spoil. By heating wine to kill the contaminating bacteria, he could prevent it from spoiling. This discovery eventually led to widespread "pasteurization" of wine and was later adapted to prevent milk from spoiling.

Pasteur also discovered the microbial origin of anthrax, traced the causative organism's complicated life cycle, and determined that transmission could be avoided by burning, rather than burying, the corpses of infected animals. With his discovery that chickens could be protected from fowl cholera by inoculation with old cultures of the causative bacteria, he laid the foundation for the process of attenuating bacteria to produce immunity. Pasteur's studies inspired Joseph Lister to disinfect operative wounds with carbolic acid to prevent postoperative wound infection, thereby introducing the concept of "antiseptic" surgery.2

Whereas Pasteur pioneered the concept that bacteria were responsible for human disease, his rival Robert Koch was responsible for developing the modern science of microbiology. He perfected techniques for growing pure colonies of bacteria first on potato slices, then gelatin, then agar with enriched nutrients. He invented methods of fixing and staining bacteria and techniques for photographing the bacteria he viewed through his microscopes.

Koch demonstrated that specific bacteria were responsible for diseases including tuberculosis, cholera, anthrax, and many others. With his former professor, Jacob Henle, he formulated what have come to be known as the Henle-Koch postulates for proving that a specific organism is the causative agent of a particular disease3:

  • A specific organism must be identified in all cases of a disease

  • Pure cultures of the organism must be obtained

  • Organisms from pure culture must reproduce the disease in experimental animals

  • The organism must be recovered from the experimental animals.

The search for "magic bullets"

Once bacteria were understood to cause human diseases, the search began for agents that could kill infecting organisms while leaving the affected human unharmed. Some researchers focused on mobilizing the immune system to combat infection; others sought to develop chemical agents or "magic bullets" to eradicate microorganisms. The first approach met with success initially, leading to the use of passive immunity to fight infection and, eventually, to the development of vaccines to prevent many adult and pediatric diseases.

Two students of Robert Koch, Emil von Behring and Shibasaburo Kitasato, isolated serum from animals injected with modified diphtheria bacteria and used the isolated antitoxin to treat a child with diphtheria successfully. The serum was produced commercially in 1892 and, in a short time, reduced mortality from diphtheria from 70% to 21%.1 In 1901, von Behring received the first Nobel Prize for medicine for his discovery of the first effective agent to treat infection.

In the first decade of the 20th century, Paul Ehrlich, a German physician, developed the first "magic bullet," a drug to treat syphilis. At that time, 10% of the adult male population of the US had syphilis, and congenital syphilis accounted for 1% of infant mortality. Before Ehrlich's discovery, the only treatment for syphilis was mercury, a poison that caused hair and tooth loss, mouth ulcers, and abdominal pain. Many patients considered the treatment worse than the disease.

Ehrlich began to search for chemicals capable of killing microbes in 1906, and his research eventually led him to investigate the antibacterial properties of an arsenic-containing synthetic compound called atoxyl. In 1909, after 605 failed attempts to develop an effective drug from atoxyl, Ehrlich's 606th experiment with an atoxyl derivative, arsphenamine, succeeded. The compound was marketed in 1910 under the name of salvarsan and subsequently came into widespread use. In 1912, several American pediatricians, including L. E. La Fétra and Luther Emmett Holt, began to use salvarsan to treat infants with congenital syphilis.1

The first sulfa drugs

With the success of salvarsan, the search for other "magic bullets" intensified, but the next antibiotic was not discovered for almost two decades. In 1932, Gerhard Domagk, a physician and research director at a German chemical company, began to experiment with textile dyes to see if any could effectively treat streptococcal infection. He discovered that a sulfa compound called Prontosil cured mice that had been injected with a lethal dose of streptococci.

Shortly afterward, Domagk's daughter became violently ill with a streptococcal infection. When all other remedies failed her, Domagk administered Prontosil, and she recovered rapidly and fully. One year later, Domagk published a clinical report describing how Prontosil cured a 10-month-old child with staphylococcal septicemia.3 Studies subsequently showed that Prontosil was effective in vivo, but not in vitro, because the body metabolized it into sulfanilamide, the agent that is active against bacteria. Still later, sulfa drugs were found to interfere with the metabolism of para-aminobenzoic acid, thereby stopping the growth of bacteria and exerting a bacteriostatic rather than a bactericidal effect.

Prontosil was introduced into the US in 1935 to treat a child with Haemophilus influenzae meningitis at Babies Hospital in New York. Over the next several years, pharmaceutical companies produced many new sulfonamide antibiotics, including sulfapyridine and sulfadiazine. They proved more effective against pneumococcal infection than sulfanilamide, but were associated with nausea and kidney stones. Eventually, Hoffmann-La Roche developed the soluble antibiotic sulfisoxazole, which became the most commonly prescribed sulfa drug.

In the early 1940s, huge quantities of sulfa drugs were prescribed in the US because they were found to be effective against pneumonia, meningitis, gonorrhea, and puerperal infection. When taken in a small dose daily, they reduced the recurrence of rheumatic fever. During World War II, American troops were given prophylactic doses of sulfa drugs to prevent streptococcal infection. Eventually, sulfa-resistant strains emerged.

In 1968, pediatricians began to prescribe a suspension of sulfamethoxazole and trimethoprim as an alternative to amoxicillin for otitis media and urinary tract infections. It is still a commonly used antibiotic for urinary tract infection.

The discovery of penicillin

The initial focus of antibiotic research was on synthetic chemicals with antimicrobial properties, but microbiologists next began to search for "natural" antibiotics. It had long been observed that one bacterial species inhibited the growth of others when introduced into the same culture medium. Researchers assumed that one species produced antibiotic substances that assured its survival at the expense of potential invaders.

The British physician and surgeon Alexander Fleming had gained modest recognition in 1922 by discovering that tears and nasal secretions could inhibit the growth of bacteria. He subsequently identified and isolated the enzyme lysozyme from these secretions as well as from saliva, hair, and skin, and from eggs, flowers, and vegetables. He speculated that lysozyme was part of a universal defense mechanism that all living creatures possessed to prevent invasion by bacteria. Unfortunately, lysozyme had no effect on pathogenic bacteria, and Fleming eventually abandoned research on the enzyme to study staphylococci.

Fortunately, Fleming was somewhat untidy in his laboratory. One day in 1928, he noticed that a mold (Penicillium notatum) that had contaminated old culture dishes in which staphylococci were growing produced a zone of inhibition where it grew.3 He later isolated a substance, which he called penicillin, from the mold and found that it effectively eradicated many different types of bacteria. He published his findings in 1929 but never attempted to administer penicillin to lab animals inoculated with bacteria.

Had it not been for other investigators at Oxford University in England who chanced upon Fleming's original paper, penicillin might never have been introduced as an antibiotic. The investigators—Howard Florey and Ernst Chain, a German Jew who had fled from Germany to England as Hitler rose to power—set out to produce enough penicillin to determine its potential utility as an antibiotic. With the aid of a $5,000 grant from the Rockefeller Foundation, Florey and Chain increased the yield of penicillin by growing P notatum in porcelain bedpans, and Chain produced small quantities of purified penicillin for testing. The penicillin he prepared was 1,000 times more potent than Fleming's original "mold juice extract" and appeared to have at least 10 times the antibiotic activity of sulfa drugs.

In a now-famous study published in the Lancet in 1940, Florey and Chain injected 50 mice with streptococci and treated half of them with penicillin. At the end of the experiment, all the untreated mice were dead; 24 of the 25 penicillin-treated mice survived.

One year later, Florey and Chain produced enough penicillin for a clinical test. The antibiotic was first administered to a policeman with streptococcal septicemia, who improved while receiving penicillin but eventually died once the penicillin supply was exhausted after five days of treatment. The next beneficiaries were children—a 15-year-old with hemolytic streptococcal septicemia and a 4-year-old with cavernous-sinus thrombosis and sepsis. The 15-year-old survived; the 4-year-old died from a ruptured aneurysm after being cured of infection by the penicillin.

Unlike the sulfa drugs, which could be manufactured in quantity at reasonable expense, penicillin was difficult to mass-produce, and researchers labored for years to devise methods to produce useful quantities of it. Florey and his associates left Britain for the US in 1941 because Britain could not allocate the resources Florey required to expand production while the country was at war. In the US, Florey and other researchers discovered that another fungus, Penicillium chrysogenum, produced twice as much penicillin as the original strains of P notatum—two units of penicillin for each milliliter of medium. When P chrysogenum was irradiated with ultraviolet light or X-rays, it could produce as much as 1,000 units of penicillin for each milliliter of medium. By 1944, more than 20 billion units of penicillin were being produced in the US, and production increased to more than 6,000 billion units the next year.

The first pediatrician to study the efficacy of penicillin in children was Roger L. J. Kennedy, who treated 54 children—with dramatic results. He used penicillin G, the first available therapeutic penicillin, which had to be administered intravenously or intramuscularly because gastric acid destroyed it rapidly when it was given orally. Within a few years, methods for manufacturing semisynthetic penicillins, which could withstand stomach acid and were effective when given orally, were developed. In 1953, penicillin V was prepared merely by adding phenoxyacetic acid to the growth medium.

In the years that followed, the pharmaceutical industry learned to modify penicillin by adding side chains of molecules. Ampicillin, introduced in 1961, was the first penicillin to have efficacy against gram-negative bacteria. Methicillin (1960), nafcillin (1961), and oxacillin (1962) were effective in treating infection caused by penicillinase-producing Staphylococcus species. Carbenicillin (1964) proved to be effective against Pseudomonas species. Amoxicillin, which could be given every eight hours, was introduced in 1969.3

Antibiotics in abundance

Following the success of penicillin, pharmaceutical researchers began to investigate a variety of fungi and bacteria to determine whether other useful antibiotics could be isolated. In 1945, Giuseppe Brotzu, an Italian researcher, isolated three cephalosporin compounds from the fungus Cephalosporium acremonium. It was not until 1964, however, that the first two therapeutic cephalosporins—cephalothin and cephaloridine—were introduced. They were effective against gram-positive and gram-negative bacteria and penicillin-resistant staphylococci. The first effective oral cephalosporin, cephalexin, was introduced in 1967. It was of limited use in children because the suspension tasted horrible and was not very effective in treating otitis media. Pediatricians had to wait until 1979 for an effective (and palatable) oral cephalosporin suspension, when Eli Lilly and Company marketed cefaclor. It rapidly became one of the most popular antibiotics for otitis media, strep pharyngitis, and skin and respiratory infections.

When the American microbiologist Selman Waksman—who, by the way, is credited with coining the word antibiotic—first studied Actinomyces, a genus of soil bacteria, the result was several antibiotics that were too toxic for therapeutic use. In 1943, however, Waksman isolated streptomycin from Streptomyces griseus, an actinomycete. This first aminoglycoside antibiotic was found to be effective against tuberculosis. Other aminoglycosides soon followed. Neomycin was discovered in 1949; kanamycin, in 1957; and gentamicin, in 1963.

Many other actinomycetes were found to produce antibiotics as well. Chloramphenicol, derived from Streptomyces venezuelae, was released in 1948 as one of the first broad-spectrum antibiotics with antirickettsial activity. It remained popular for decades but is used rarely today because of its association with aplastic anemia.

Tetracycline—derived from hydrogenolysis of chlortetracycline, which was produced from yet another actinomycete—became commercially available in the early 1950s. Doxycycline followed in 1966; minocycline, in 1972. Erythromycin, a macrolide antibiotic derived from Streptomyces erythreus, was isolated in 1952, and vancomycin, a glycopeptide antibiotic derived from Streptomyces orientalis, was isolated in 1956. Today vancomycin is the drug of choice for treating methicillin-resistant Staphylococcus aureus infections.

In 1962, researchers identified nalidixic acid, a byproduct of chloroquine synthesis, as being a potent "quinolone" antibiotic, but it was not until 1982 that fluorinated quinolone compounds, including ciprofloxacin, became available for general use. A new antibiotic of the oxazolidinone class, called linezolid, was recently approved by the FDA and should be available by the time you read this. Representing the first "new" antibiotic class in decades, linezolid was actually discovered more than 30 years ago but was never produced commercially. It is being introduced now because of the growing threat of antibiotic resistance.

The bugs strike back

It was Alexander Fleming who first warned the medical community to use antibiotics cautiously. Overzealous, indiscriminate use would encourage the evolution of bacteria resistant to "magic bullets." His prediction came true not long after sulfa drugs and penicillin were introduced. By 1946, just three years after penicillin use became widespread, hospitals began reporting a rising occurrence of penicillin-resistant staphylococci. By the 1960s, penicillin-resistant pneumococci and gonococci were reported as well.

Penicillin was originally available without a prescription and was often used to treat nonbacterial infection, often in subtherapeutic doses. This undoubtedly contributed to the appearance of penicillin resistance.

Today, according to the Centers for Disease Control and Prevention (CDC), as many as 30% of infections caused by pneumococci are not susceptible to penicillin, and data gathered from intensive care units around the country indicate that 28% of nosocomial infections are resistant to the preferred antibiotic treatment.4 Most worrisome is the appearance of vancomycin-resistant strains of S aureus in Japan and the US since 1997.

To combat the evolving threat of antibiotic resistance, the CDC recently released a plan—Preventing Emerging Infectious Diseases: A Strategy for the 21st Century—which can be viewed on the agency's Web site (www.cdc.gov).4 The plan involves improving surveillance for drug-resistant infections, accelerating research that focuses on understanding antimicrobial resistance, developing infection control strategies to prevent disease transmission, developing new vaccines, and educating physicians to prescribe antibiotics more prudently. The Food and Drug Administration recently addressed the issue of drug resistance with a new rule for labeling antibiotics (see "How the FDA evolved").

Safeguarding the advantage

It is easy to forget the years of research and fortuitous events that have armed pediatricians with an arsenal of antibiotics to cure infections that were once life-threatening. Our obligation now is to preserve the efficacy of these drugs by adopting judicious prescribing habits. If we do that, we will have effective antibiotics for years to come.

REFERENCES

1. Cone TE: History of American Pediatrics. Boston, Little, Brown, and Company, 1979

2. Duin N, Sutcliffe J: A History of Medicine. New York, Barnes and Noble Books, 1992

3. Cowen DL, Segelman AB: Antibiotics in Historical Perspective. Merck and Company, 1981

4. Centers for Disease Control and Prevention: Preventing Emerging Infectious Diseases: A Strategy for the 21st Century. www.cdc.gov/ncidod/emergplan/

Dr. SCHUMAN is adjunct assistant professor of pediatrics at Dartmouth Medical School, Lebanon, N.H., and practices pediatrics at Hampshire Pediatrics, Manchester, N.H. He is a contributing editor for Contemporary Pediatrics. He has nothing to disclose in regard to affiliations with, or financial interests in, any organization that may have an interest in any part of this article.

How the FDA evolved

The Food and Drug Administration, which oversees the safety of food, human and veterinary drugs, medical devices, and cosmetics, began life in 1862 as a section of the United States Department of Agriculture charged with investigating "adulteration of agricultural commodities." It was staffed by a lone chemist. Today the FDA has more than 9,000 employees and an annual budget that exceeds $1.2 billion.

In 1906, Congress passed the Federal Food and Drug Act to prevent the unlawful adulteration of food and medicine and to ensure that food and drug labels were accurate. But 30 years later, as the food and drug markets changed, the FDA found itself without the authority to prevent the sale of many products that made false claims and others that were inherently hazardous. In 1937, Elixir Sulfanilamide was marketed as an over-the-counter remedy. It contained diethylene glycol, a chemical used in antifreeze, and killed 107 people, many of them children. This tragedy prompted passage of the 1938 Food, Drug, and Cosmetic Act, which mandated that manufacturers prove to the FDA that products are safe before they bring them to market. Modifications to the law in 1951 enabled the FDA to differentiate drugs that could be sold over the counter from those that require a prescription.

Since then, the FDA has assumed responsibility for regulating clinical drug trials, safeguarding the blood supply, implementing the Clinical Laboratory Improvement Amendments of 1988, and overseeing the introduction of new vaccines. This year, the FDA published a rule requiring that antibiotic labels carry warnings that advise patients of the societal and individual hazard of antimicrobial resistance—the hope being that this small measure will reduce the estimated 50 million "unnecessary" prescriptions for antibiotics written by physicians every year.

The pediatrician's antibiotic trivia quiz

1. Pediatricians prescribe amoxicillin more often than any other antibiotic. What percentage of all prescriptions written for children in the United States are for amoxicillin?

a. 75% b. 50% c. 25% d. 15%

2. Which nongeneric antibiotic is prescribed most often for children in the US?

a. Cefzil b. Zithromax c. Biaxin d. Omnicef

3. For which diagnosis are antibiotics most often prescribed for children in the US?

a. otitis media b. pharyngitis c. sinusitis

4. How much do pharmaceutical companies spend each year promoting their products, including antibiotics, to pediatricians?

a. $17 million b. $77 million c. $117 million

5. Which of the following nongeneric antibiotics is the least expensive alternative for an uninsured patient for a typical course of treatment for otitis media in a 10-kg child?

a. Cefzil b. Zithromax c. Biaxin d. Omnicef

Answers

1. c. According to data provided to Contemporary Pediatrics by Verispan, Inc., approximately 25% of all antibiotic prescriptions written for children are for amoxicillin.

2. b. According to Verispan, Zithromax is the most-often-prescribed nongeneric antibiotic for children, accounting for about 10% of all pediatric antibiotic prescriptions. Cefzil is next with 5%; Omnicef and Biaxin each have about 3%.

3. a. According to Verispan, otitis media is the condition for which antibiotics are most often prescribed, with 29% of the total. Pharyngitis follows with 10% and sinusitis is third with 8%.

4. c. Again according to Verispan, pharmaceutical companies spend more than $117 million each year promoting their products to pediatricians!

5. c. According to the 2003 edition of Mosby's Drug Consult, the least expensive alternative is Biaxin ($15.53 retail). A typical course of treatment with Zithromax costs $28.60 retail, with Cefzil $32.65, and with Omnicef $43.19.

The short but eventful history of antibiotics

1909
Ehrlich discovers salvarsan (arsphenamine), the first "magic bullet"
1928
Fleming discovers penicillin
1932
Domagk discovers sulfonamides
1940s
Penicillin becomes commercially available
Cephalosporins are isolated
1950s
Tetracycline becomes available
1952
Erythromycin is isolated
1953
Penicillin V becomes available
1956
Vancomycin is isolated
1957
Kanamycin becomes available
1961
Ampicillin becomes available
1962
Quinolone antibiotics are discovered
Oxacillin is introduced
1963
Gentamicin is introduced
1968
Combination of trimethoprim and sulfamethoxazole becomes available
1969
Amoxicillin is introduced
1979
Cefaclor becomes available
1980s
Fluoroquinolones become available
2003
Linezolid becomes available

 

Andrew Schuman. A concise history of antimicrobial therapy (serendipity and all). Contemporary Pediatrics October 2003;20:65.
