In a blog posted at the time, I registered my surprise and disappointment that CDC had failed to include animal settings in its 2012 campaign and urged the agency to expand the focus this year. I’m sorry to report that CDC did not get the message. Get Smart About Antibiotics Week 2013 (November 18-24) again focuses almost exclusively on human medical use.
CDC’s Dr. Tom Chiller explained (in a letter to a colleague) that the agency’s educational site aimed at food animals, Know When Antibiotics Work on the Farm, is not currently funded and doesn’t have any dedicated staff.
Given the urgency of the issue, why, for heaven’s sake, not?
How many swine producers know that routine use of antibiotics doesn’t deliver economic benefits in finisher pigs? Dan Charles of National Public Radio did a great piece highlighting this fact, but it needs to be hammered home across farm country. Swine producers, just like human patients, need to be educated about opportunities to reduce the demand for antibiotics.
There are many avenues animal producers could take to avoid unnecessary use of antibiotics, among them encouraging appropriate weaning times, all-in-all-out animal management systems, and clean, uncrowded facilities. Consumers could be urged to do their part by looking for and purchasing meat produced with fewer antibiotics.
Ignoring the problem of animal antibiotic use has not made it go away. FDA 2010 data reveal that 80 percent of antibiotics sold in the US were intended for use in animals. Many of those drugs–for example, penicillins, tetracyclines and erythromycin–are in the same classes as drugs critical for human medicine.
Feeding enormous quantities of antibiotics in feedlots and poultry and swine houses generates huge populations of bacteria in animal guts that are resistant to human drugs. Resistant bacteria have easy routes to humans: on food, through the environment, and via people who work with and around the animals.
Antibiotic-resistant food-borne illness is becoming all too common. Recently, Costco announced an expanded recall of Foster Farms chicken implicated in an outbreak of antibiotic-resistant Salmonella that has already sickened over 320 people.
Scientists agree that antibiotic use on farms contributes to increasing levels of severe and difficult-to-treat human (and for that matter, animal) disease. Just this month, two former Commissioners of the FDA, Donald Kennedy and David Kessler, wrote to the Executive Office of the President urging “swift action to curb unnecessary use of medically important antibiotics in food animal production.”
In September of this year, CDC itself issued a report, Antibiotic Resistance Threats in the United States, 2013, calling antibiotic use the single most important factor leading to antibiotic resistance and noting that there is more use of antibiotics in food production than human medicine. The report endorsed phasing out the use of antibiotics for growth promotion, saying the drugs should “only be used to treat infections.”
Just like the CDC’s human campaign, the CDC campaign on animals should be aimed at unnecessary uses: feed efficiency and growth promotion (purely economic uses) and routine disease prevention, which often compensates for stressful conditions in crowded animal facilities.
The best approach is to legally restrict the sale of antibiotics for routine purposes, a goal that would be accomplished with the passage of the Preventing Antibiotic Resistance Act of 2013 introduced by Senator Dianne Feinstein (D-CA) and the Preservation of Antibiotics for Medical Treatment Act of 2013 introduced by Representative Louise Slaughter (D-NY). But even with successful legislation, a broad appreciation of the opportunities to reduce antibiotic use in food production would be important.
I applaud the CDC’s campaign to address the overuse and inappropriate use of antibiotics in human medical settings. Such use is a major driver of antibiotic-resistant disease. But the CDC should be campaigning with equal vigor against overuse and inappropriate use in food animal production.
It simply makes no sense to urge the parents of sick children to forgo unneeded antibiotics, while silently standing by as producers of cattle, swine and poultry continue to overuse the same drugs just to avoid the transition to modern management systems.
Expanding CDC’s Get Smart campaign to include animal settings would have been a smart idea last year–and it still will be next year. CDC should find the funds to make it happen.
While the forum could have been more comprehensive in its selection of respondents, it provided an interesting and timely array of views.
Against that background, I was very disappointed that Dr. Ronald replied to my comment with gratuitous attacks on UCS and an unprofessional dismissal of three groundbreaking reports produced by my colleague, Dr. Doug Gurian-Sherman, as “widely discredited.” This remark, which is wholly untrue, originally appeared without any explanation or citations to back it up.
The unfair characterization of Doug’s reports has now been removed from the online version of the Boston Review forum. In its place are two paragraphs of specific criticisms of one of the reports, Failure to Yield. Those criticisms are primarily complaints about what was not in the report—for example, studies from the developing world—rather than scientific criticisms of methods or findings. Strangely, although the other two reports were covered in both her original and revised characterizations, Dr. Ronald provides no discussion of them, much less specific criticism.
Doug has ably responded to Dr. Ronald’s specific criticisms elsewhere in comments on the forum.
Here I would like to reiterate why these reports, whose findings have stood the test of time, are so important.
Failure to Yield
Failure to Yield (2009) presented a detailed analysis of the studies, done primarily by land grant university scientists, comparing yields of GE versions of corn and soybean and their closely matched non-GE counterparts. The availability of closely matched, non-GE varieties of crops allowed scientists to tease out the contribution of the GE traits from other contributors to yield performance.
That was important because, before Failure to Yield, the possibility that anything other than GE could contribute to yield was rarely discussed. It was easy to think that all of the 1 to 2% yearly yield increases observed in corn and soybeans were the result of the introduction of biotechnology crops. The question was whether that was true…and the report found it wasn’t.
The analysis presented in Failure to Yield differentiated between intrinsic and operational yield. Intrinsic yield is the yield under the best possible conditions—good soil, good weather and no pests—and is the bedrock of gains in agricultural productivity. Operational yield, also very important, refers to yield in the presence of stress. Pest resistance, for example, can enable a farmer to harvest more when insects threaten, but the yield is bounded at the upper end by the intrinsic characteristics of the crop.
The studies showed that the predominant GE traits found in corn and soybeans—herbicide tolerance and BT toxin—did not contribute to the annual increases in intrinsic yield observed in corn. Rather—and here’s the important part—the increases in intrinsic yield were the result of conventional breeding and agronomic practices.
No one has ever challenged this key finding of Failure to Yield. In part, this is because it simply makes sense. The steady 1-2% increases in intrinsic yields in corn were evident long before the advent of GE and so couldn’t be solely attributable to GE. Also, it’s hard to imagine how a trait that enabled farmers to use a different herbicide could affect intrinsic yield.
Nevertheless, the result caused a stir.
This may be why. As long as no one highlighted the central role of conventional breeding in increasing intrinsic yield, promoters of biotechnology were able to imply it was their doing.
But even more importantly, here was proof from land grant scientists that conventional breeding and agronomics were outperforming genetic engineering on a fundamental agronomic trait. That doesn’t comport with the image of biotechnology as a transformational technology essential to solving all the world’s problems.
Doug’s analysis of the contribution of biotechnology crops to operational yield found that BT crops had indeed increased yields by fending off pests. He estimated a range of operational yield increases for the corn rootworm and corn borer traits of about 3-4% over 13 years–a genuine benefit on the GE side of the ledger. This was a rigorous and fair analysis.
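It helps to put the two kinds of gains side by side over the same window. A quick sketch of the compounding arithmetic (the growth-rate calculation is mine; only the 1-2 percent annual and 3-4 percent total figures come from the reports):

```python
# Compounding the steady conventional-breeding gains in intrinsic yield.
# Even the low end (1 percent per year), compounded over the same 13 years,
# exceeds the total ~3-4 percent operational-yield gain attributed to the
# BT traits. The function itself is just standard compound-growth math.

def compound_gain(annual_pct, years):
    """Total percent gain from a steady annual percent gain, compounded."""
    return ((1 + annual_pct / 100.0) ** years - 1) * 100.0

print(f"1%/yr over 13 years: {compound_gain(1.0, 13):.1f}% total")  # ~13.8%
print(f"2%/yr over 13 years: {compound_gain(2.0, 13):.1f}% total")  # ~29.4%
```

In other words, even conventional breeding’s slowest pace delivered several times the cumulative benefit of the BT traits over the same period.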
Doug’s other two reports, No Sure Fix (2009) and High and Dry (2012), are also carefully done and give transgenic crops credit where credit is due. Both reports aim to assess the performance of the biotechnology industry in producing crops with the long promised traits of drought resistance and increased water use efficiency (High and Dry) or increased nitrogen use efficiency (No Sure Fix).
The two reports identify products approved for commercialization or in the pipeline by analyzing USDA databases and the peer-reviewed scientific literature.
No Sure Fix revealed that there are no crops genetically engineered to increase nitrogen use efficiency on the market and relatively few in the pipeline.
High and Dry found no crop varieties on the market genetically engineered for water use efficiency (despite Monsanto’s early extravagant claims) and only one variety engineered to be drought tolerant near to commercialization. (That crop is now on the market).
In both reports, Doug scoured the scientific literature to assess the prospects for transgenics in the future, and noted that investigators continue to work in these fields. After careful analysis, he concluded that science might yet produce successful nitrogen-use-efficient, drought-tolerant or water-use-efficient varieties. But for now, twenty years into the biotechnology era, the performance record outside herbicide tolerance and BT is slim. This is not because of regulation…transgenic crops that increase intrinsic yield, nitrogen use efficiency or drought tolerance would sail through the US regulatory system. These disappointing results simply signal that transgenic technology is more complicated and difficult than early proponents expected.
But just as with intrinsic yield, Doug documented that conventional breeding has already produced many drought-tolerant and nitrogen-efficient plants. Together, the three reports provide a compelling picture of the power of conventional breeding to solve major agricultural problems. Many varieties are available right now and many more could be if our agricultural research establishment were to prioritize conventional breeding (including marker-assisted techniques) and agroecology over genetic engineering.
What part of yes don’t we understand?
The truth about GMOs is that GE is not the transformational technology promised in its early days and that it cannot begin to match the record of conventional breeding for producing fundamentally important traits in crops. By consuming so much attention, biotechnology impedes better solutions, like those laid out by UCS in The Healthy Farm: A Vision for U.S. Agriculture. In our world, faced with growing populations and climate change, finding the right solutions matters.
Of course, this is not to say that transgenics will not play important roles in the agriculture of the future. They well may, but for now a little humility is in order. Proponents of GE should step back a bit and let conventional breeding and agroecology take center stage.
Maybe the next Boston Review forum should focus on conventional breeding!
It seems to come up with depressing regularity to justify, among other things, pesticides, industrial-scale monoculture, and biotechnology, all of which we must embrace—all together now—to feed the world. What gets under my skin is that the phrase is so often used by advocates of high-input American corn and soybeans, who otherwise seem not terribly concerned about problems of hungry people or farmers in developing countries.
A recent example is Farmers Feeding the World, an industry-wide campaign that “educates the general public about U.S. agriculture’s role in feeding a hungry world.” The fact that the campaign funnels money into worthy organizations doesn’t obscure its focus on “the unique interests of people and organizations aligned with U.S. agriculture.”
But feeding the world doesn’t have much currency among those dedicated full-time to fighting hunger.
The hunger organization, Bread for the World, talks not of how U.S. agriculture will feed the world, but of agricultural development for small-scale producers and women, improving nutrition for women and young children, and ensuring that efforts are “country-led”—meaning the communities, constituencies, and countries affected by hunger are setting priorities.
The ambitious U.S. initiative called Feed the Future does not use the phrase either but instead talks about “supporting countries in developing their own agricultural sectors to generate opportunities for economic growth that can help reduce poverty and hunger.”
Likewise, the recent Food and Agriculture Organization of the United Nations’ (FAO) report on world agriculture and malnutrition, The State of Food and Agriculture: Food Systems for Better Nutrition, doesn’t use the term. FAO would eradicate malnutrition by integrating agriculture into local and regional food systems, “from inputs and production, through processing, storage, transport, and retailing, to consumption.”
Maybe the phrase is falling from favor. I, for one, would welcome its retirement.
The term has enjoyed a long run. It gets almost 2 billion hits when Googled. Some of those hits relate to Bob Geldof’s 1984 Band Aid charity effort, but most are about U.S. crops—more precisely, the export crops soybean and corn. The phrase got a big boost in the 1970s when Secretary of Agriculture Earl Butz used it to advocate for fence-row-to-fence-row agriculture. He knew that new uses and increased exports would be necessary to absorb all that production without lowering prices. “Feed the world” became a rallying cry for export-oriented agricultural policy. (It still is, despite the fact that almost 40% of our corn acreage is devoted to producing ethanol.)
One reason the phrase is so favored is that feeding is an essential and benevolent activity that conjures comfortable memories of preparing, serving, and enjoying meals. To satisfy this basic need for the whole world is a noble endeavor. And, of course, there are grains of truth here. U.S. farmers can feel good that they are helping to meet the food needs of those who can afford to buy their products.
But the phrase conflates the important issues of food production and hunger alleviation. It implies that producing corn and soybeans is the equivalent of putting food into the mouths of hungry people. But there is no direct connection between U.S. corn and soy production and ending hunger elsewhere (or, for that matter, in the U.S.). In fact, high production in the U.S. can depress world grain prices and throw developing-country farmers off the land.
It is time to separate the issues of hunger alleviation and crop production.
Despite decades of surplus commodity crop production, world hunger has been, and remains, an acute problem. In its recent report, FAO estimates that 868 million people (12.5% of the world population) are undernourished in terms of energy intake. (That’s only a part of the hunger problem. The full global burden of malnutrition would include the 26% of the world’s children who are stunted, the 2 billion people suffering from one or more micronutrient deficiencies, and the 500 million people who are obese.)
Simply increasing crop production in the U.S. won’t help feed those people because insufficient production—and certainly insufficient production in the developed world—is not the heart of the problem. Many issues beyond production need to be addressed and most of the effort needs to be directed to the developing world. Tackling issues like infrastructure, transport, storage, prices, and the role of women in an integrated way, as both the FAO and the Feed the Future initiatives do, is the only serious approach to the world hunger problem.
Implying that U.S. grain exports can alleviate hunger by feeding the world distracts from that key understanding.
U.S. export policy should be addressed on its own terms, primarily as an economic issue rather than a humanitarian enterprise. Hungry people should not be the poster-children for the interests of the well-fed.
People who care most about developing country agriculture don’t use the phrase “feeding the world.” Those interested in corn and soybean exports should drop it as well.
If we need a catchphrase for world hunger issues, we could consider “helping the world feed itself.” I know, it doesn’t exactly sing, but it will help us focus on genuine solutions to vital global problems.
But it turns out that an increasing number of farmers are doing just that—buying, planting and tending so-called cover crops. No, they can’t sell them, but they do reap benefits from them, including increased yields of their cash crops like corn and soybeans. Use of cover crops can also help farms survive the droughts expected to be more common in the era of climate change.
Cover crops, which can be many species of grains, grasses and legumes, are usually planted in the interval between the harvest and planting of cash crops. Sending their roots down into bare soil, cover crops can increase soil carbon, provide slow-release nitrogen, and prevent erosion. But a cover crop/cash crop system is complex. If improperly managed, cover crops can deprive cash crops of water or even reduce yields. Although they make sense in theory, many have wondered how cover crops would work in the real world.
A new survey
Now a new survey of commercial farmers has confirmed that cover crops increase yields in corn and soybeans, our most common crops. Moreover, cover crops were especially effective under drought conditions.
The survey of 759 commercial farmers was conducted last year (2012-13) by the North Central Sustainable Agriculture Research and Education (SARE) program and the Conservation Technology Information Center. The farmers who responded to the survey reported average increases of 11.1 bushels of corn per acre and 4.9 bushels of soybeans per acre. In percentage terms, the extra bushels represent an average 9.6 percent greater yield in corn planted after cover crops compared to corn not preceded by cover crops. The increase in soybeans was 11.6 percent. That’s pretty impressive.
The growers reported yield information from comparable fields that were similar in conditions and rotation except for the cover crops.
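For readers who want to check the arithmetic, the percentages follow directly from the reported bushel gains. The baseline (no-cover-crop) yields below are implied by the survey’s numbers, not separately reported figures:

```python
# Back-of-envelope check of the SARE survey arithmetic. The bushel gains
# and percentages are from the survey; the implied baseline yields are
# my inference from those two numbers, not reported survey results.

def implied_baseline(gain_bu, pct_gain):
    """Baseline yield (bu/acre) implied by an absolute gain and a percent gain."""
    return gain_bu / (pct_gain / 100.0)

corn_base = implied_baseline(11.1, 9.6)   # corn: 11.1 bu gain, 9.6% increase
soy_base = implied_baseline(4.9, 11.6)    # soybeans: 4.9 bu gain, 11.6% increase

print(f"implied corn baseline: {corn_base:.1f} bu/acre")     # ~115.6
print(f"implied soybean baseline: {soy_base:.1f} bu/acre")   # ~42.2
```

Both implied baselines are in the normal range for Midwest corn and soybean yields, which is a useful consistency check on the survey’s percentages.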
Star performers under drought conditions
The yield increases in cash crops planted after cover crops were even greater in states hit hard by drought.
The states most affected by the severe 2012 drought were Illinois, Indiana, Iowa, Kansas, Missouri, Nebraska and South Dakota. The 141 respondents from those states reported an average corn yield increase of 11.3 bushels per acre, an 11 percent improvement in crops grown after cover crops compared to those grown without them. Respondents from the drought-affected states reported even greater benefits in soybeans: an average increase of 5.7 bushels per acre, or 14.3 percent higher yields after cover crops.
The farmers responding to the survey planted an estimated 218,000 acres of cover crops in 36 states, mostly in the Mississippi River basin. Not surprisingly, drought-related impacts varied across the country. But the results were solid: farmers enjoyed better corn yields after cover crops in all but one of the states hardest hit by the drought.
Benefits worth paying for
Farmers expected to pay for the ecosystem services provided by cover crops. They were willing to pay a median of $25 an acre for seed and $15 an acre for establishment and termination of the cover crop (distributing the seed, often by air, and later killing the crop).
The challenges of growing cover crops
Farmers interested in cover crops need to decide which species to use, how and when to plant them, and whether to plant single-species or multispecies mixes. If the wrong decisions are made, cover crops might not deliver on their potential benefits or may even be detrimental. The survey respondents reported a long list of challenges, including cover crop seed availability, the potential for increased insect pests, and the possibility that cover crops might use too much soil moisture.
Despite the challenges, these farmers had steadily increased their use of cover crops over the last decade. Last winter they reported planting cover crops on an average of 42 percent of their acreage and planned to increase their cover crop acreage this coming winter.
The complexity of the system may explain the correlation of yield increases with experience using cover crops. Growers with more than three years of experience with cover crops saw a 9.6 percent increase in corn yields, while growers with one to three years reported a still respectable, but lower, 6.1 percent boost in corn.
Drought-resistant systems and drought-tolerant crops
A complete drought-tolerant package would include appropriate crop choices and specially bred varieties of crops as well as a drought-tolerant system. The crop-centered approach to drought was discussed by my colleague, Doug Gurian-Sherman, in his recent report High and Dry. In addition to highlighting the availability of crops like sorghum and alfalfa that are inherently more drought-tolerant and might be used more often in U.S. agriculture, Doug also discussed the success of conventional corn breeders who have increased drought tolerance at a steady pace of 1 percent a year over decades.
Genetic engineering has yet to play an important role in drought tolerance, only this year introducing its first drought-tolerant variety, Monsanto’s DroughtGard. According to the Monsanto website, the variety has produced a 5 bushel (or about 4 percent) yield advantage in field tests against competitor hybrids.
But, however successful crop genetics might be, the right choice of crops and varieties cannot compensate for the deficiencies in systems. The fundamental requirement for combating drought is to keep moisture in soil. Cover crops can do that–and so much more.
But food stamps and commodity programs are not all the debate should be about. The farm bill also contains numerous smaller, but vital and innovative programs supporting healthy food, research, and conservation programs.
Right now, those programs are in danger of being forgotten in the farm bill debate, when by rights they ought to be moving to a more central place in farm bill policy. These are the programs that will help us face the fundamental threats of climate pollution, degraded air and water quality, and impaired coastal fisheries and other ecosystems. These are the programs that will help us encourage healthy diets and a prosperous rural economy.
All Americans are interested in clean air, clean water, healthy food and the mitigation of climate change. The growing constituencies for a clean environment and healthy diets could help transcend the narrow versions of urban and rural America that currently dominate the debate. Maybe—just maybe—the current farm bill fracas will allow for the emergence of new, more broadly based coalitions to reorient agriculture policy to achieve these important goals.
UCS vision for the future of agriculture
If so, we will be poised to pursue a new vision of agriculture that both ensures high productivity and responds to environmental and human health challenges. UCS has recently offered such a vision: The Healthy Farm: A Vision for U.S. Agriculture.
Though profound in its impacts, the vision picks up on the positive features of today’s agriculture and reshapes them in ways that are both practical and feasible. It relies on four practices—crop rotation, cover crops, crop/livestock integration and landscape integration, underpinned by a properly oriented research agenda and policy incentives.
Although all are important and work together, crop rotation is probably the central practice of a genuinely sustainable agriculture.
Growing three or four crops in rotation on the same piece of land controls both weeds and insect pests from the get-go and drastically reduces the need for poisonous chemical inputs. But here’s the best part: once in place, the system keeps pests at bay year after year—no buildup of toxins in soil and water (whether the toxins were initially applied externally or engineered into crops)—and slows the emergence of resistant weeds and bugs. Crop rotation can also reduce the need for fertilizers by including nitrogen-fixing crops in the rotation.
Colorado State University scientists recently confirmed that in the High Plains, rotation of corn with other crops is “the best method” of avoiding the pest responsible for most pesticide use in Colorado, the corn rootworm. If farmers in Colorado had been rotating crops, the rootworms would not have proliferated, and neither genetically engineered crops nor chemical pesticides would have been needed to control them.
Scientists have recently documented that crop rotations also substantially increase yields.
U.S. farmers don’t rotate corn
Despite the demonstrated benefits and relative ease of adoption, crop rotation is not often practiced. Scientists at Iowa State estimated that in 2011 only 60 percent of the Iowa corn crop was rotated at all, and then mostly with soybeans. It is not hard to understand why. Farmers have limited land resources and, generally speaking, choose to grow crops commanding the highest prices. Right now corn is that crop. So, if farmers can grow corn, they will—year after year.
This choice makes short-term economic sense for farmers, and I don’t blame them for making it. I do blame our shortsighted agricultural policy that reinforces this choice by providing direct payments and subsidized crop insurance for corn crops.
We need agricultural policy that reverses direction and makes it possible for farmers to provide public goods like clean air, clean water and flourishing ecosystems. To be blunt, we need to make it economically attractive for farmers to rotate corn with two or three other crops.
Let’s make rotation possible
If we set crop rotation as a goal, there are lots of ways to encourage it. We could redeploy subsidies to provide incentives for three- or four-year crop rotations; invest in research on new uses for crops other than corn that will increase their value in the marketplace; balance corn ethanol incentives with incentives for cellulosic energy crops; or change animal agriculture to rely more on pasture and grains other than corn.
To have a chance of implementing these and other farsighted policies, we need people interested in healthy foods and healthy farms to weigh in and redirect subsidies now going to encourage corn production to new goals like helping farmers rotate corn with other crops and adopt other sustainable practices.
There will be a lot of commotion as the farm bill lurches ahead in this new, uncertain environment, and it may be hard to tell through the din whether we have moved toward sustainability in a meaningful way. But here’s a tip.
Look at what American farmers plant. If they are planting three- or four-year crop rotations, you can be pretty sure we are headed in the right direction. If they are still planting continuous corn, or even just corn and soybeans, we’re still stuck in the past.
This winter, U.S. commercial beekeepers reported devastating losses of 30 percent to as much as 50 percent of their hives. These losses are on top of annual losses in the 20 percent to 30 percent range since 2007, far exceeding the historical rate, which is approximately 10 to 15 percent. To see bee declines through the eye of a beekeeper, read this story about Steve Ellis in Elbow Lake, Minnesota.
The loss of bees and other pollinators should concern all of us. A third of food crops—among them fruits, vegetables, and nuts— depend on animal pollination. Since 2006, an estimated 10 million bee hives at an approximate value of $200 each have been lost.
Causes of honey bee decline
There are a number of potential causes of bee decline, including loss of flower-rich habitat, infestations with Varroa mites, applications of fungicides and insecticides, use of honey substitutes as bee food, and synergies among pathogens and chemical pesticides.
Much attention has focused on the neonicotinoid, or neonic, neurotoxins now among the world’s most popular insecticides. Startling graphics prepared by the U.S. Geological Survey show how rapidly the neonics imidacloprid and clothianidin have been adopted and how extensively they are now deployed in the United States. In agriculture, neonics are often delivered as seed coatings. The pesticide enters the seed as it germinates and eventually infuses the entire plant, including pollen and nectar.
Neonics are also toxic to aquatic invertebrates and, as discussed in an earlier post by my colleague Doug Gurian-Sherman, to birds.
Europeans have recently announced a two-year precautionary ban on the use of three neonics: clothianidin, imidacloprid and thiamethoxam. Despite the evident distress of the U.S. bee industry, the U.S. government has yet to take any action.
While it is unlikely that the neonics are solely responsible, it is reasonable to suspect that these widely used, highly toxic insecticides play a significant role in bee decline, and the government should seriously consider imposing restrictions on their use.
And are these chemicals ever toxic to insects! A study by Christian Krupke and colleagues at Purdue University estimated that the amount of clothianidin coating just one kernel of corn is enough to kill 80,000 honey bees! That same study demonstrated that honey bees can be exposed to neonics by multiple routes, including the clouds of waste talc, containing very high pesticide concentrations, thrown up by corn seed planters. (The talc is used to keep the treated seeds separate while planting.)
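The scale of that claim is easy to sanity-check with rough numbers. In the sketch below, both the per-kernel coating mass and the per-bee lethal dose are illustrative order-of-magnitude assumptions of mine, not figures taken from the Krupke study:

```python
# Order-of-magnitude check: bees potentially killed by the clothianidin
# coating a single corn kernel. Both inputs are illustrative assumptions,
# not values from Krupke et al.

mg_per_kernel = 0.5           # assumed clothianidin per seed-treated kernel, mg
lethal_dose_ng_per_bee = 4.0  # assumed lethal dose per honey bee, ng

bees_killed = (mg_per_kernel * 1e6) / lethal_dose_ng_per_bee  # convert mg to ng
print(f"roughly {bees_killed:,.0f} bees per kernel")
```

With these assumed inputs the estimate lands at roughly 125,000 bees, the same order of magnitude as the study’s 80,000 figure, which is all a back-of-envelope check can show.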
Low levels of neonic exposure
But a central question confronting scientists investigating the causes of bee decline is the impact of the low concentrations of neonics now widespread in the environment that honey bees are likely to encounter.
A new review paper by Henk A. Tennekes and Francisco Sanchez-Bayo in the journal Toxicology (Volume 309, July 5, 2013, pages 39–51; login required) suggests that very low concentrations of neonics can have devastating effects on bees and—here’s the most important part—that conventional risk assessment approaches can miss or underestimate those effects.
According to the paper, neonics are in a group of chemicals, called time-dependent chemicals, whose toxic effects build up during long exposure times. The paper suggests that time-dependent phenomena occur when an insecticide binds very tightly or irreversibly to critical receptors in the target organism. Given a long enough exposure, even very low levels of time-dependent chemicals can kill.
Standard toxicity tests, which focus on the concentration of toxins for relatively short time periods, do not pick up time-dependent effects because they fail to expose target organisms to very low concentrations of a toxin over long enough periods of time.
Tennekes and Sanchez-Bayo use imidacloprid as a test case to demonstrate how standard risk assessment protocols can miss the harmful effects low levels of the chemicals can have on honey bees.
The paper assessed the impact of imidacloprid on honey bees by determining the time it took for 50 percent of the bees to die (t50) when exposed for varying time intervals to low doses of the chemical. It then related the exposure data to the pesticide concentrations typically found as plant residues under field conditions and calculated that 50 percent of worker bees would die within seven to ten days if they fed in such a field. By contrast, the authors assert that standard risk assessments suggest field concentrations of imidacloprid pose no risks at all to honey bees.
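The logic of time-dependent toxicity can be sketched with a toy model. The Druckrey-Kupper relation the authors build on says, roughly, that concentration times time-to-effect (raised to a power) is constant; the constants below are illustrative, not values from the paper:

```python
# Toy model of why time-dependent toxicity evades short acute tests.
# For chemicals that bind their receptors essentially irreversibly,
# c * t50**n is roughly constant: lowering the concentration stretches
# the median time to death but never eliminates it. The constants k and n
# here are illustrative, not fitted values from Tennekes & Sanchez-Bayo.

def t50_days(conc_ppb, k=70.0, n=1.0):
    """Median time to death (days) at a given concentration (ppb)."""
    return (k / conc_ppb) ** (1.0 / n)

for c in (10.0, 1.0, 0.1):
    print(f"{c:>5} ppb -> t50 of about {t50_days(c):6.0f} days")

# With these assumed constants, a 4-day acute test at 1 ppb would see
# almost no mortality (t50 of ~70 days), yet bees foraging on such
# residues for a whole season would still die.
```

The point of the sketch is the shape of the relationship, not the numbers: any test shorter than the t50 at field concentrations will conclude, wrongly, that those concentrations are safe.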
Tennekes and Sanchez-Bayo propose a new risk assessment protocol based on t50s to evaluate the effects of time-dependent chemicals and recommend that going forward regulatory agencies employ such protocols to assess the harmful effects of neonics.
Regulators should consider these recommendations. Pollinators are too important to agriculture and other ecosystems, and neonics too widely used, for regulators to ignore the threats low levels of these pesticides pose.
The Simplot potatoes were produced through a new kind of GE—gene silencing. Simplot’s version of gene silencing, called Innate™ technology, adds genetic fragments derived from cultivated and wild potatoes, but no genetic material from unrelated organisms.
The industry hopes the potatoes will get a favorable reception in the marketplace because their benefits—reduced levels of acrylamide and reduced bruising—appeal to consumers as well as potato producers. It also hopes that consumers might be less wary of a GE technique that does not cross species lines but employs only potato genes.
In addition to reducing bruising, the Simplot potatoes also help manage so-called reducing sugars, a substantial benefit to potato growers, who can have as much as 20 percent of their crop rejected by food buyers for exceeding acceptable levels of such sugars. Although the potatoes are not initially intended for retail sale, reduced discoloration of potatoes after slicing could be a benefit to consumers.
Consumers might also benefit from reduced acrylamide concentrations. Acrylamide, discovered by Swedish scientists in 2002, is a neurotoxin produced when foods are cooked. Although acrylamide turns out to be present in a variety of heated foods like coffee and baked goods, according to Simplot’s petition for deregulation, French fries and potato chips are the highest per-serving sources, accounting for 35 percent of the acrylamide in the U.S. diet.
Scientists have still not fully characterized the ill effects of acrylamide in foods. Some studies find associations with cancer; others do not. But acrylamide is a nasty chemical, and reducing its levels in the diet seems like a good idea.
Many consumers, however, are not aware that French fries or other cooked foods contain acrylamide. So I’m not sure how consumers would become aware that Simplot potatoes provide this benefit. I can’t imagine McDonald’s touting a switch to French fries with reduced levels of a neurotoxin that many consumers had never before associated with their product.
My guess is that many health advocates would prefer eating fewer French fries, rather than genetic engineering, as the way to reduce exposure to acrylamide. In fact, nutritionists may worry about a technology that appears to justify eating French fries.
Gene silencing is a relatively new form of GE that turns off particular genes, typically by interfering with the protein-synthesis machinery of cells. Turning off genes can accomplish many tasks, among them disabling invading viruses, delaying fruit ripening, or altering flower colors.
As noted above, some versions of gene silencing can accomplish desired effects without adding genetic material from unrelated organisms.
But like other forms of genetic engineering, it involves elaborate snipping and rearranging of genetic material to construct cassettes of DNA that are then introduced into cells.
Organisms produced by gene silencing are GE, but not necessarily transgenic. Simplot’s Innate™ potato, which is engineered with gene fragments from wild and cultivated potatoes, is an example.
Will the fact that the genetic material used in the Simplot potato came only from potatoes make a difference in its reception in the marketplace?
Not an easy question. For those who object to crossing species lines from an ethical or philosophical point of view, the all-potato construction might make a big difference.
For those who view transgenesis not as an ethical issue, but as a surrogate for risk, it might not, depending on the other risks of the potato.
Many scientists are concerned about GE that crosses species lines because the resulting trait combinations are generally not possible in nature and so their potential downsides are difficult to predict. From this standpoint, gene silencing using all-potato gene cassettes will probably raise fewer flags than GE that crosses species boundaries.
On the other hand, gene silencing technology is the product of scientific research that shows the cellular regulation of genes to be an immensely complex process. Science has moved far beyond the days of the so-called central dogma in which stretches of DNA (genes) coded for pieces of RNA (messenger RNA) that travelled into the cytoplasm to direct the production of proteins—essentially a one-way flow of information.
The cast of players in protein production now is much, much larger and includes a new alphabet soup of RNA molecules with a plethora of interactions and roles. One of the new players is microRNA (miRNA), which can reduce the amount of protein in a cell by hooking up with special enzymes and destroying the messenger RNA responsible for that protein. But there are many others.
So while some risk-based concern is allayed by confining gene combinations within a species, residual concern based on the complexity and incompletely understood nature of gene silencing and related processes remains. In addition, it appears that gene silencing may pose a previously unrecognized risk of genetic engineering: the risk of turning off non-target genes.
This potential harm, which has been described in a recent paper by Jack Heinemann of the University of Canterbury and colleagues (Environment International 55: 43-55; requires login), depends on one of the many newly recognized kinds of RNA participating in protein synthesis—double-stranded RNA (dsRNA).
Double-stranded RNA is an important new topic in food biotechnology, and we will discuss it in a later post. But suffice it to say that it represents a potential harm of gene silencing and other forms of GE that merits examination.
So it is too early to say whether the Simplot potato will get a more favorable reception than earlier GE foods based on the all-potato origin of its new genetic material and consumer benefits. But it certainly will become an important test for how well the FDA handles the potential new risk of genetic engineering—dsRNAs.
The scientific data connecting antibiotic use in agriculture to the evolution of resistant human disease have been compelling for decades. Recent studies only strengthen the case. The FDA knows that ongoing antibiotic use leads to more resistant bacteria and that the crisis deepens every year we fail to address the issue.
The solution to the crisis is straightforward: avoid the unnecessary use of antibiotics in both human and veterinary medicine and animal agriculture. In agriculture, most antibiotic use compensates for crowded, stressful raising conditions that can be avoided with good animal husbandry, so meaningful reductions are possible and economically feasible. (Of course, when animals fall ill or are directly exposed to contagious disease, antibiotics are called for and should be used.)
The FDA has the legal tools to cancel approvals for unnecessary uses of the antibiotic classes that are important in human medicine. Using those legal tools would go a long way to preserve the efficacy of our precious, but ever dwindling, store of drugs.
Yet over a 30-year time frame, the FDA has only occasionally addressed this crisis. The agency’s foot-dragging has astounded even a federal court judge.
User Fees and Data Reporting Requirements
The way the Senate and FDA have handled the Animal Drug and Animal Generic Drug User Fee Reauthorization Act of 2013 (ADUFA) gives a strong hint. ADUFA is a law that allows the drug industry to pay fees to the FDA to expedite the review of animal drug applications. The fees, which are negotiated for a five-year term between the FDA and the industry, supplement the agency’s budget. The ADUFA reauthorization bill introduced in the Senate in March (S.622) would provide the FDA more than $21 million annually from 2014 through 2018 to support new animal drug reviews, and about $7 million a year over the same period to handle generic drug applications.
That is not chump change. FDA financial reports for fiscal year 2011 attributed 92 full-time positions to the ADUFA fees.
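As a rough check on those figures, the fee amounts can be tallied in a few lines. The numbers below are the approximate figures from the bill as described above, not exact fee schedules, which vary by year:

```python
# Back-of-the-envelope tally of ADUFA user fees in S.622 (approximate figures).
new_drug_fees_per_year = 21e6  # "more than $21 million annually"
generic_fees_per_year = 7e6    # "about $7 million a year"
years = 5                      # fiscal years 2014 through 2018

annual_total = new_drug_fees_per_year + generic_fees_per_year
five_year_total = annual_total * years

print(f"${annual_total / 1e6:.0f}M per year, ${five_year_total / 1e6:.0f}M over five years")
```

So the combined fees come to roughly $28 million a year, or about $140 million over the five-year authorization, which is what funds those 90-odd full-time positions.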
The convergent interests of the FDA and the industry it is supposed to regulate make ADUFA a “must-pass” piece of legislation that offers an opportunity to enact public health measures along with expedited drug reviews. The 2008 ADUFA included the first-ever requirements for collecting and publishing sales data for animal antibiotics.
This year the public interest community is urging very modest improvements to those 2008 requirements: more information on species, purpose, and how the animal drugs are administered. These data would be valuable for scientists to understand and respond to the antibiotic crisis. But the bill reported out of the Senate Health, Education, Labor and Pensions (HELP) Committee was “clean”—not a speck of public health reporting was added.
Modest as they were, the additional reporting requirements were too much for industry and the HELP Committee.
As former FDA Commissioner David Kessler said in a recent New York Times op-ed, both industry and the HELP Committee are being “aided and abetted” by the FDA, which has not uttered a peep in favor of stronger reporting requirements. Why? With $30 million a year and 90 or so positions on the line, the FDA doesn’t want to rock the boat, not even to achieve important public health goals.
Fortunately, two senators not on the HELP committee, Dianne Feinstein (D-Calif.) and Kirsten Gillibrand (D-N.Y.), are continuing to fight to strengthen the Senate version of ADUFA, while the House is considering more robust, but still very reasonable, reporting requirements that could be added to its version of the bill. But so far there is no indication that the FDA will throw its weight behind stronger reporting requirements.
User Fees and Agency Priorities
It turns out that the agriculture industry gets even more for its millions than help in quashing important data collection improvements. In exchange for the fees, the FDA has agreed to detailed performance deadlines to ensure the drug review process moves along quickly, and the agency rarely misses those deadlines. But when it comes to addressing the threat posed by the overuse of antibiotics, the agency moves at a snail’s pace, if at all. The industry’s milestones are sacrosanct, but the agency’s public health mission—whenever. The user fees set the agency’s priorities.
I also wonder what role industry played in the evolution of the FDA’s timid, voluntary program to remove injudicious antibiotic uses from the market. As I argued in an earlier post, that program envisions lopsided, drug company-friendly negotiations to determine which products go off the market. Could the agency’s dependence on the same companies for its budget have factored into the program’s design?
The public bears a big responsibility here. If taxpayers were willing to step up and fund the FDA, the public, not the regulated community, would be calling the shots.
That said, under the current situation, user fees are giving the animal drug industry too much say in the choices and priorities of the FDA. Like Nipper, the iconic dog sitting at attention next to the RCA Victrola, the FDA appears to be listening to its master’s voice.
From a public health point of view, new preventive uses are likely to be indistinguishable from—and just as troublesome as—the production uses they replace. Both involve large quantities of antibiotics used at low levels over long periods of time, the perfect recipe for encouraging the development of resistant bacteria.
Relabeling rather than reducing the massive uses of antibiotics would be a public health failure.
Draft Guidance #213 responds to the temptation of relabeling by imposing special criteria for new approvals for prevention claims. While at first blush these criteria appear to be reassuring, they are anything but. In effect, they offer a new path to drug approval that would circumvent and weaken current regulatory standards.
Guidance #152: A Public Health Victory for FDA
Current standards are embodied in Guidance for Industry #152 (Guidance #152), one of the genuinely bright spots in the FDA’s record of combating antibiotic resistance. Put in place in 2003, Guidance #152 established an elegant way of combining qualitative factors, for example how drugs are used in human medicine and the extent of their use in food animals, to reach an assessment of the potential for an animal drug to cause resistant disease that matters to humans.
The standards are based on the scientifically sound principle that the overall amount of drug use is a major driver of the evolution of antibiotic resistance. Thus, Guidance #152 treats flock-wide or herd-wide use of drugs as a key factor in assessing risk, regardless of whether the uses are for preventive or production purposes.
Typically, cattle, swine and poultry are raised in very large, confined animal feeding operations (CAFOs). Resistant bacteria fill the guts of all these animals and are also found on their skin and in their manure. Exposure to antibiotics enriches the populations of resistant bacteria in guts and manure lagoons, which then make their way to humans through food, other humans and the environment. Herd- and flock-wide drug use in CAFOs helps explain the continued sale of massive quantities of drugs in the U.S.
An important factor in Guidance #152 is whether the animal drugs are (or are in classes of drugs that are) used in human medicine. If so, resistance has implications for human (and animal) medicine. Such drugs are considered “medically important.”
The good news is that adherence to Guidance #152 has brought to a virtual halt the issuance of new approvals of medically important drugs for herd-wide and flock-wide uses in major food animal species—cattle, swine, or poultry.
Although Guidance #152 needs improvement—for example, to broaden its definition of drugs critically important to humans beyond those involved in foodborne illness—the fact that the FDA has not approved medically important drugs for large-scale, indiscriminate use for the past decade is a major triumph for public health.
Draft Guidance #213: A Way Around Guidance #152
Surprisingly, Draft Guidance #213 does not propose to apply Guidance #152 to its decisions for new approvals. Instead it offers a separate approval process “in lieu of Guidance #152” that speaks to the issues of duration and level of use and treatment of apparently healthy animals, but leaves lots of wiggle room.
For example, the new labels for prevention would have to specify a duration of use, but the guidance does not say how long that duration can be. Draft Guidance #213 encourages, but does not require, that doses for the prevention uses be higher than current production doses.
But the big loophole in Draft Guidance #213 is the FDA’s expectation that any new indications should “be available only to those animals that need the drug for the new indication, rather than the entire flock or herd when such use is not necessary.” This leaves open the possibility that routine flock- or herd-wide use can be considered when it is necessary, an option not available in Guidance #152.
In fact, considering necessity as a factor in risk assessment of animal drugs goes beyond the FDA’s statutory authority. Although it is not clear how necessity would play out in the new approval process, the concept is foreign to the Food, Drug and Cosmetic Act, which flatly requires that new animal drugs intended for use in food-producing animals be safe with regard to human health (21 C.F.R. 514.1(b)(8)). The statute does not allow for the approval of unsafe drugs because they are necessary for animal health or efficient meat production.
The conditions most likely to necessitate long-term preventive uses of antibiotics in flocks or herds are endemic diseases resulting from diet or crowded, stressful conditions in CAFOs.
A Sweet Deal for Industry
We cannot know whether the FDA will walk through the doors it opened in Guidance #213. It may not. But, as we discussed in the last post, the voluntary process puts the agency under great pressure to satisfy the industry’s demands with regard to new approvals.
New flock- and herd-wide approvals for prevention would lock in high drug use in CAFOs and gut the victories achieved under Guidance #152. It is possible that the weaker standard in Draft Guidance #213 could eventually displace the high standard of Guidance #152 altogether.
A sweet deal for industry, a travesty for public health.
The FDA’s policy
The FDA’s approach to this public health crisis is to eliminate unacceptable (“injudicious”) uses of antibiotics, and subject the rest to veterinary oversight through either prescriptions or veterinary feed directives (VFDs).
The agency has deemed so-called production uses of antibiotics, like feed efficiency and growth promotion, unacceptable, and all health-related uses (therapy, disease control and disease prevention) acceptable. The approach sounds reasonable but has hidden flaws.
Production uses are unacceptable because efficient production can be achieved through good husbandry without risking the efficacy of valuable antibiotic drugs. In addition, production uses typically involve the administration of drugs at low doses for months at a time, the perfect recipe for encouraging the emergence of resistant bacteria in livestock and poultry.
Health-related uses that are considered acceptable, on the other hand, are a mixed bag.
Therapy, the treatment of sick animals to cure illness and prevent suffering, is generally acceptable. Animals get sick only sporadically, and when they do they are usually treated at high doses for a relatively short period of time.
By contrast, most routine antibiotic uses for prevention involve the same long durations of administration at low doses as production uses, and good husbandry can prevent disease as effectively as drugs. So long-term, routine preventive uses of antibiotics are unacceptable.
There are special circumstances where preventive uses are acceptable: when antibiotics are necessary to prevent the imminent occurrence of disease. Such outbreaks occur relatively rarely and involve relatively short courses of treatment.
But the FDA policy does not separate the large-scale, routine preventive uses of antibiotics from therapy and from special preventive uses in cases of imminent danger. As will be discussed below, lumping all health-related uses together as acceptable sets the stage for a large-scale waste of agency resources with virtually no public health benefit.
The voluntary path
As noted above, the FDA’s policy is to eliminate all production approvals for feed efficiency and growth promotion. Over 20 companies currently hold such approvals, legally known as label claims. Typically, these claims appear on a drug label and indicate that the drug can be used in a particular way (by injection, by feed), in a particular animal species (swine, turkeys), at a particular dose, and for a specified purpose (to treat, prevent or control a specific disease or improve production).
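For illustration only, a label claim can be thought of as a small structured record. The field names and example values below are my assumptions for the sake of the sketch, not the FDA's actual data format or an actual approved label:

```python
from dataclasses import dataclass

@dataclass
class LabelClaim:
    """Illustrative model of an animal drug label claim (fields assumed)."""
    drug: str
    route: str    # e.g. "in feed", "by injection"
    species: str  # e.g. "swine", "turkeys"
    dose: str
    purpose: str  # "treat", "prevent", "control", or "production"

# A hypothetical production claim of the kind companies would be
# asked to surrender under the FDA's voluntary process.
claim = LabelClaim(
    drug="chlortetracycline",
    route="in feed",
    species="swine",
    dose="low dose, continuous",
    purpose="production",
)
print(claim.purpose)
```

The structure helps make the horse-trading concrete: a withdrawal removes one such record from a company's portfolio, and a new approval adds one with a different `purpose`.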
The Agency can legally withdraw label claim approvals if it can show that uses under the label conditions are no longer safe in terms of resistance.
A second way to eliminate production approvals is somehow to persuade all the drug companies to drop claims voluntarily.
The trick, of course, is to get the drug companies to go along. Asking drug companies to no longer sell products on which they are making millions of dollars seems almost ridiculous.
The FDA knows that drug companies will not give up production claims out of altruistic concern for the public health. So it is planning to give them something in exchange: new label claims for drugs for disease prevention or therapy. Sales of drugs under the new claims would bolster sales of antibiotic products and repair any damage to bottom lines from the loss of the production claims.
The negotiations for new approvals
Although the trade-off idea is clear enough in concept, the negotiations to implement it will be complicated and could prove messy. The rules for conducting these negotiations are laid out in Draft Guidance for Industry #213 (hereinafter Guidance #213).
The idea is that companies will come to the FDA with a list of production claims they are willing to abandon. The list might, but need not, involve all the production claims a company possesses. Of course, the companies won’t actually surrender claims until they see what the FDA has to offer in exchange.
Companies with numerous production claims may end up demanding a fair number of new approvals in exchange for voluntary withdrawals. This is going to be like the pit of the old New York Stock Exchange. Do I hear a new therapeutic approval for our penicillin feed additive in swine in exchange for two growth-promoting claims in poultry? What about a preventive-use claim for erythromycin in cattle?
Some changes in labels can occur at no cost to companies because they already hold approvals for the same drug at the same dose for both prevention and production purposes. But even in those cases, the companies can insist on an inducement to drop the claim.
If a company does not like what the FDA offers, it can simply refuse to go ahead with its “offer” to give up its production claims. The FDA may eventually decide to cancel the claim legally, but it’s a good bet that it will take the Agency years to act on that decision. Meanwhile the company continues to sell its drugs with no penalty.
These negotiations will take place behind closed doors over a three-year period to begin the day the final version of Guidance #213 is issued. Currently, the Agency is predicting final issuance by the end of March 2013. The companies will have three months from the day Guidance #213 is issued to submit their initial list of claims they are willing to give up.
So far drug companies are being understandably cagey. None of them has publicly committed to give up any production claims. And it is unlikely that any will until they have firm commitments from the FDA for new approvals or other inducements.
If the FDA cannot persuade the companies to give up their claims, much time and many resources will have been wasted, an outcome the FDA is anxious to avoid. This gives the drug companies strong leverage in the negotiations and means the agency will be willing to go far to induce their cooperation. That’s why the designation of routine disease prevention as acceptable is so important. Without it, the Agency would have only low-volume claims to bargain with.
The public health payoff?
By the end of the three years, the agency and those drug companies participating in the Guidance #213 process will have presumably come to a set of agreements. If all goes according to the FDA’s plan, some—or perhaps all—of the production claims will have been withdrawn, many new approvals will have been granted, and all uses of antibiotics will be subject to veterinary oversight through prescriptions or VFDs.
The public health test of the policy is whether, when all the trading is over, the Agency has achieved a substantial reduction in overall quantities of antibiotic used in animal agriculture.
Considering the companies’ incentives to maintain sales and the FDA’s need to induce industry cooperation, it is hard to imagine that the end result of the negotiations will be an overall reduction in antibiotic use. If it is not, the exercise will have been a colossal waste of public health resources.
But believe it or not, horse-trading for approvals may not be the end of what the FDA is willing to do to sweeten the pot for industry. We’ll talk about that in the next post.