Antibiotics have been a crucial (and controversial) component of meat production for decades. In the late 1940s, biologists inadvertently discovered that feeding livestock tetracycline made the animals grow faster, and from that revelation an industry was born. Today, food-producing animals raised and eaten in the United States receive almost 30 million pounds of antibiotics per year. That’s several times what our country’s 300 million humans take, and, unlike humans, animals receive antibiotics when they are not sick.
Most of those 30 million pounds are given to pigs, chickens and cattle in small doses every day, for growth promotion and disease prevention — that is, to fatten them and protect them from the conditions in which they are raised.
This routine drugging has been debated almost since farmers began the practice in the 1950s. The doses given to livestock to promote growth or prevent disease are smaller than the amount it would take to cure sick animals; they kill only the weakest bacteria, letting stronger, drug-resistant ones survive and spread. British scientists detected a spike in antibiotic-resistant infections in humans in the 1960s, and in 1977 the U.S. Food and Drug Administration tried to ban some routine animal dosing, blaming it for rising antibiotic resistance. Since then, hundreds of scientific studies have traced a link between antibiotic use in livestock and antibiotic-resistant bacteria on farms and in the outside world.
At the same time, antibiotic-resistant human illnesses have been worsening around the world, producing what the director of the U.S. Centers for Disease Control and Prevention calls “nightmare bacteria” that cannot be treated by traditional methods.