Tomorrow’s agreement (full details yet to be released) between Canada, Mexico and the US shows the type of leadership we need to build on the momentum created by the Paris Agreement. The goal of getting half the continent’s electricity from “clean energy” by 2025 is well within reach—and eminently affordable given the sharply falling costs of solar and wind energy and projections for continued price declines. In addition, smart investments in transmission can help integrate the continent’s electricity markets more closely while helping to bring on line even more clean, reliable and affordable energy.
For the purposes of this announcement, “clean energy” sources include wind, solar, hydro and other types of renewable energy; nuclear power; fossil fuels with carbon capture and storage (CCS); and energy efficiency. Nuclear power and fossil fuels with CCS are more aptly called low-carbon or zero-carbon energy sources than clean ones: nuclear power comes with issues related to safety, radioactive waste disposal and uranium mining wastes, and there is pollution associated with the production, transportation and use of coal and natural gas even with CCS.
The US, Canada and Mexico are well-positioned to meet the 50 percent clean energy goal by 2025, but it will require robust implementation of current policies and some new ones.
According to EIA data, in 2015 renewable energy (including hydropower) accounted for 13 percent of US power generation, and carbon-free sources provided about 33 percent of US electricity generation. As the largest economy and biggest consumer of electricity among the three countries, progress on renewable energy in the US can go a long way toward meeting the continent-wide goals.
A recent UCS analysis shows that, with the Clean Power Plan and the recent extensions of the federal renewable energy tax credits, the US can get 25 percent of its electricity from renewable energy (hydro plus non-hydro) by 2025. Adding nuclear power would bring the US total for carbon-free electricity to nearly 46 percent by 2025. In our analysis of the Clean Power Plan, we assumed all states adopt the EPA’s mass-based targets with a “new source complement” and achieve compliance via a nationwide carbon trading program. We also assumed that all states, as part of their compliance strategy, invest in energy efficiency at a level that achieves a reduction in electricity sales of at least 1 percent per year from 2022 to 2030.
For comparison, EIA’s examination of the Clean Power Plan and federal tax credits (the AEO 2016 reference case) estimates that the US would reach 41 percent clean energy by 2025. Depending on how energy efficiency is credited under the proposal to be announced tomorrow, the “clean energy” share from the UCS and EIA analyses could be even higher.
Meanwhile, Canada already gets about 80 percent of its power from carbon-free sources, including approximately 60 percent from hydropower and 16 percent from nuclear power. The province of Ontario is the first jurisdiction in North America to completely eliminate coal-fired generation.
Mexico currently gets about a quarter of its electricity from clean generation. The 2015 Ley de Transición Energética (or Energy Transition Law) sets a goal of generating 35 percent of Mexico’s electricity from clean energy sources by 2024.
Together, the US, Canada and Mexico currently get approximately 37 percent of their generation from clean energy sources. With current policies in place, the three countries are projected to get 45 to 49 percent clean energy by 2025, depending on whether the UCS or EIA reference case projections are used for the US. Clearly, some additional policy action will be required—for example through new sub-national or national policies—but it is not a big stretch. With the costs of renewable energy falling rapidly, there’s no reason to shortchange the opportunity to go further.
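The combined figure can be sanity-checked with a generation-weighted average. The sketch below uses the clean-energy shares cited above (roughly 33 percent for the US, 80 percent for Canada, and 25 percent for Mexico) together with rough, illustrative annual generation totals; the generation numbers are approximations for illustration, not official statistics.

```python
# Back-of-the-envelope check of the combined North American clean-energy share.
# Generation totals (TWh/yr) are rough illustrative values, not official figures.
generation = {"US": 4100, "Canada": 650, "Mexico": 300}
clean_share = {"US": 0.33, "Canada": 0.80, "Mexico": 0.25}

total_gen = sum(generation.values())
clean_gen = sum(generation[c] * clean_share[c] for c in generation)
combined = clean_gen / total_gen

print(f"Combined clean-energy share: {combined:.0%}")
```

Because the US accounts for the large majority of the continent’s generation, the weighted average lands in the high 30s, close to the roughly 37 percent figure reported above.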
Canada, Mexico and the US are also members of Mission Innovation, an international initiative of countries that have all committed to doubling their public investments in clean energy research and development over five years. Together with business initiatives, such as the Breakthrough Energy Coalition, the landscape for clean energy R&D is only getting brighter.
Nevertheless, we can’t take this progress for granted. For example, delivering on the US commitments will require that the Clean Power Plan, which is currently under a legal stay, moves ahead in a robust way, prioritizing renewable energy and energy efficiency as compliance options. Low natural gas prices could also thwart progress on renewable energy. Any erosion of state and federal policy incentives could also deal a blow to meeting those targets.
Enacting new or additional federal and state clean energy policies could help us greatly exceed these targets, and do so cost effectively, but we need Congress and state governments to act!
The commitment to cut methane emissions from the oil and gas sector is also critical as these emissions are on the rise globally. Methane is a much more potent greenhouse gas than carbon dioxide, although it is shorter-lived in the atmosphere. In the US, these emissions are increasing as natural gas production reaches all-time highs (mainly due to increased reliance on natural gas for power generation) and extraction of tight oil increases. Mexico and Canada also have significant methane emissions from their oil and gas operations.
Research shows that cost-effective opportunities to cut these emissions exist in all three countries. By taking steps jointly, they can help ensure that standards and best practices are harmonized across the North American continent.
The next US administration and the new Congress will play a critical role in delivering on the clean energy goals announced today, as well as more ambitious commitments in the years to come. The world and our North American neighbors will be counting on our continued climate leadership.
And while we’re thinking big, here’s to hoping that one of the next breakthrough moments for climate action is a North American carbon pricing initiative, building on the success of existing state and provincial programs in the US and Canada and Mexico’s carbon tax.
Oil is the largest source of global warming pollution in the United States, and tailpipe pollution from gasoline-powered cars and diesel trucks is its most concrete manifestation. But even before these fuels are delivered to the gas station, a great deal of damage has already been done.
The carbon dioxide, methane, and other global warming pollution coming from oil wells and refineries in the oil supply chain is larger than the emissions from all the jet fuel used in the United States. Moreover, much of this pollution is unnecessary, and can be readily reduced using existing technology.
This is low-hanging fruit compared to replacing all the 737s with planes like the Solar Impulse, which just crossed the Atlantic on solar power. But before we can hold the oil industry accountable for cutting the hidden pollution on the other side of the gas pump, we need to illuminate the extent of the problem.
A recent report from the Center for American Progress catalogs The Who’s Who of Methane Pollution in the Onshore Oil and Gas Production Sector. The top 5 are ConocoPhillips, ExxonMobil, Chesapeake Energy, EOG Resources Inc., and BP America.
Some of these names are familiar as major oil companies, and others are better known as gas companies, but these days the separation between the oil and gas industries has all but vanished. With the rise of fracking, oil and gas come from the same companies, and often from the same wells. Indeed, the trade associations representing these two industries recently merged. But regardless of what share of their product mix goes into cars versus power plants, methane is a major climate pollutant, and the oil and gas industry has to be held accountable to curtail its wasteful and polluting practices.
EPA recently finalized regulations for new sources of methane in the oil and gas industry and is beginning to collect information on methane emissions from the existing operations, to develop appropriate regulations for this damaging global warming pollutant. Right now it’s clear we still have a lot to learn about exactly how large these emissions are. The CAP report is based on EPA reporting that captures just about half of the methane pollution from the industry. Improving accountability of polluters to measure and report their pollution is an essential foundation for policies to address climate change.
While cutting methane pollution from oil and gas industry is an essential near term action, it’s only the tip of the iceberg where oil industry pollution is concerned. An important new report “A Smart Tax: Pricing Oil for a Safe Climate” by Deborah Gordon and Jessica Tuchman Mathews at The Carnegie Endowment for International Peace takes a broader view of the complex and challenging oil supply chain. They explain that different types of oil have dramatically different emissions. Figure 2 provides a useful contrast of some different US sources of crude.
Some oils are extracted without unnecessary pollution and are relatively easy to refine, with the result that more than 90% of their pollution comes from using the final fuels like gasoline and diesel. Other heavier oils are more energy intensive to extract and refine, and can create 75% more pollution per barrel of oil than the lighter crude.
For these more challenging oils, tailpipe pollution accounts for less than two-thirds of the total, so pollution reduction policies that target only emissions from fuel use will miss more than a third of the problem. That is not good policy.
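To see why, it helps to work through the arithmetic with hypothetical numbers. The figures below are illustrative only: a light crude whose lifecycle emissions are 90 percent tailpipe, and a heavier crude producing 75 percent more total pollution per barrel, as described above.

```python
# Illustrative lifecycle accounting; all numbers are hypothetical, per barrel.
light_total = 500.0          # kg CO2e per barrel for a light crude (assumed)
light_tailpipe_share = 0.90  # >90% of light-crude emissions come from fuel use

# Combustion emissions are roughly fixed per barrel of finished fuel.
tailpipe = light_total * light_tailpipe_share
heavy_total = light_total * 1.75   # heavier crude: 75% more pollution per barrel

heavy_tailpipe_share = tailpipe / heavy_total
print(f"Heavy-crude tailpipe share: {heavy_tailpipe_share:.0%}")
print(f"Missed by a tailpipe-only policy: {1 - heavy_tailpipe_share:.0%}")
```

Under these assumptions the tailpipe share of the heavy crude falls to about half, so a policy that regulates only fuel-use emissions would leave nearly half of the lifecycle pollution unaddressed.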
The oil market is complex, with highly variable sources of crude, different extraction and refining techniques, and complex markets for a variety of products. Gordon and Mathews argue that a well-designed “Smart Tax” that is based on a comprehensive lifecycle assessment of the oil supply chain can align the interests of energy producers with the need to cut carbon pollution, stimulating innovation and avoiding perverse incentives. They walk through key implementation details including why refineries are the best point of regulation and the implications for global oil trade. But the first order of business is better information.
The starting point for smart policies that hold the oil industry to account is better information about pollution from the oil supply chain. Gordon and Mathews lay out a compelling case for a tax policy approach, but regardless of whether the policy instrument is the pollution tax economics textbooks favor, a broad performance standard approach like the California Low Carbon Fuel Standard, or a more narrowly focused performance standard like the methane rules being developed by EPA, good policy must be based on good information. The current information the government collects on pollution from the oil supply chain is scattered and incomplete.
The Energy Information Administration was founded in the 1970s with a charter focused on ensuring reliable access to the energy our economy needed. But where the future of clean transportation is concerned, producing enough gasoline to fuel our cars is no longer the most pressing problem. Instead we need to maintain and improve everyone’s access to mobility while also preserving climate stability. Achieving this goal will require collection, dissemination, and analysis of different information on our fuels by all the stakeholders inside and outside of government. These two reports point us in that direction.
If you are the owner of one of these cars with a 2.0L diesel engine, you face three choices:
1) You can sell your car back to Volkswagen at a premium above the bluebook value, which varies by model year and trim level (between $12,500 and $44,000).
2) You can have your vehicle repaired, even though no fix has yet been authorized, and receive some compensation ($5,100 to $10,000, again dependent upon the trim, model year, etc. of your vehicle).
3) You can choose to do nothing, in which case you receive no compensation.
In total, it’s estimated Volkswagen could pay as much as $10 billion in compensation to customers under this provision.
The biggest question surrounding this settlement lies in #2—if there is no approved fix, what does that mean for the settlement? And what if no fix is ever approved?
For the vehicles sold, it is unlikely that VW will be able to make the vehicles fully compliant.
However, in order to allow the option for consumers to keep their vehicles, the agencies are continuing to work with Volkswagen to certify a fix that will reduce between 80 and 90 percent of the excess pollution from the vehicle. The partial consent decree sets forth a number of criteria that any potential fix must meet, including threshold tailpipe emissions and durability as well as criteria regarding vehicle performance and fuel economy.
Unfortunately, it’s clear that these vehicles have polluted the environment beyond what is legal, and they will continue to do so moving forward. Therefore, additional funds are set aside to deal with this excess pollution.
Volkswagen is required to set aside $2.0 billion for investments in electric vehicles—this includes investments in infrastructure, education, and access. While this money remains under VW’s control, the company will be required to submit four 30-month plans, enacted over 10 years, each approved by the EPA and CARB. This money will be used to drive investment in cleaner-emitting technologies.
In addition to this, VW is required to fund $2.7 billion in mitigation, to be distributed to individual states for mitigation projects. The agreement specifies the types of projects on which this money can be spent, including electrification of buses and heavy-duty truck scrappage programs to accelerate getting older, higher-emitting trucks off the road.
Financial inducements to get the cars at the heart of this scandal fixed or off the road are a critical piece of the settlement. Volkswagen is required to fix or buy back at least 85 percent of these vehicles—if this threshold is not met, additional funds will be set aside for mitigation projects.
From Volkswagen’s perspective, the biggest question is probably the civil penalty. This settlement does not include any civil penalties under the Clean Air Act. Criminal prosecution is still underway into who knew what, and when, so any civil penalties are unlikely to be wrapped up for some time.
From the consumer’s perspective, there still remains significant uncertainty—there is no fix yet approved, and the process by which any owner of these vehicles can sell back their vehicle will not be set in motion until the fall. Even then, it will still likely happen in tiers, as different types of vehicles may have their fixes approved at different times; it’s not as though VW can immediately handle 480,000 vehicles being turned in during a single month. The cut-off for an 85 percent turn-in/fix is not until 2019, so consumers could be dealing with this for years to come.
It’s not just consumers who will be dealing with the aftermath—there are a lot of questions for the public as well, particularly in terms of health and environmental impacts. Will $4.7 billion be enough investment to mitigate the impacts of these vehicles? This depends on a number of factors, including how effectively the funds are allocated, how many vehicles are scrapped versus fixed, and how adequate the fix is. Given the many uncertainties, we will likely not know the full costs for years.
One thing is abundantly clear, however—there is no full compensation possible for the amount of damage done to the environment, consumers, the public writ large, or to Volkswagen themselves for this egregious act of deception. We’ll all be paying for this for years to come.
The book paints a picture of a better beef system, less damaging to the climate and the environment generally than the current system is. This is a vision I applaud, and one that my colleagues in the UCS Food and Environment program are researching. However, the book also raises scientific issues that I feel are worth exploring, since the dominant beef production system we have in place today, both globally and domestically, has some real problems.
As in the previous review, my focus will be on beef’s effect on the climate, which is the part of the subject that I know best. But I should mention that Hahn Niman’s book covers several other aspects—e.g. water, biodiversity, overgrazing, and especially health and nutrition. These are certainly concerns of mine and also aspects on which several of my UCS colleagues are working.
Chapter 1 of the book is titled “The Climate Change Case against Cattle: Sorting Fact from Fiction,” and it responds to a pattern that scientists have repeatedly found—that the climate footprint of beef is much larger than for nearly every other food, including other kinds of meat. This is not only the case globally, but for the United States as well. Thus, cattle are by far the largest source of U.S. emissions from agriculture, as shown by the graph below:
Hahn Niman doesn’t defend the current agricultural system that is producing these levels of emissions. Instead, she focuses her critique on the 2006 FAO report Livestock’s Long Shadow and its estimate that 18% of global emissions are due to livestock, a large majority of which is due to beef. She does so despite more recent scientific studies that have confirmed and reinforced its basic conclusions, with only small changes in the percentage. These changes have come about because newer data became available and, importantly, because fossil fuel emissions—the major component of the denominator of the percentage—have grown. Thus, a decade later, the overall story of Livestock’s Long Shadow has been confirmed and extended by more evidence.
Two of Hahn Niman’s most important points concern the relevance of deforestation caused by livestock to Americans, and failing to consider the “offsetting” of beef’s emissions by the sequestration of soil carbon. So let’s consider those in turn.
Hahn Niman says that when considering the climate impact of beef, “including deforestation from developing countries … is unfair and unreasonable” (page 45). She claims that “I’ve shown that American beef has virtually no connection to deforestation emissions” (page 23). Thus, the question for U.S. beef seems to be whether deforestation, an important source of global warming pollution (about 10% of the global total, by recent estimates), has anything to do with the U.S. While her claim may hold for beef produced in the U.S., it misses the important point that American beef is part of a global market, and U.S. beef consumption does play a role in deforestation:
Four relevant points:
1) While the majority of beef consumption in the U.S. is produced domestically, we most definitely import appreciable quantities from tropical forest countries.
2) That is because the beef market is now clearly a global one, in which increased consumption in any country, including the U.S., raises total demand and thus drives up world prices. And higher beef prices have been shown to lead to more deforestation. The U.S. is the world’s leading consumer of beef—24.1 billion pounds in 2014, according to the USDA.
3) U.S. companies are an important part of the global beef trade, as explained in UCS’ recently updated web pages on the drivers of deforestation today. This gives Americans an opportunity as well as a responsibility. We can let our corporations know that we want them to act to eliminate beef-driven deforestation from their supply chains—not just those in the U.S. but everywhere in the world—just as we have done with deforestation driven by palm oil.
4) Finally—and this is a criticism of our political leaders, not of Hahn Niman’s argument—we already have a long and sad experience of refusal to act on climate change, using the excuse that other countries are equally or more guilty than we are and therefore have to act first. This applies to all the causes of global warming—including deforestation.
A substantial part of Hahn Niman’s argument on the potential for carbon sequestration in pastures—11 pages—is based on the theories of Allan Savory, the former Zimbabwean rancher now famous for his TED talk. There have been detailed and extensive critiques of Savory’s arguments, both on the web and in scientific journals, so I won’t repeat all their points here. But I’ll add one that, to me, is quite telling: after many years of controversy, Savory still has not published his studies on carbon sequestration in peer-reviewed scientific journals or made his data publicly available so that other researchers can assess it. This is particularly important since he is claiming to have made such a striking discovery. This omission alone weakens his case—and thus Hahn Niman’s use of his theories—very significantly.
What about carbon sequestration more broadly? Hahn Niman argues that it “may be more than enough to completely offset the emissions from grazing animals.” (page 45). How much evidence is there for this?
Zhongmin Hu and colleagues, in a 2016 article in Global Change Biology, reviewed experiments excluding grazing animals from grasslands at 51 different sites in China. They found that grazing exclusion led to an increase in carbon, both in the soil and in the vegetation, at most of the sites. In other words, they found that the carbon stock was higher without the grazers. This is in the opposite direction of the kind of effect that Hahn Niman’s “offsetting” argument assumes.
The same result—an effect on ecosystem carbon, but in the wrong direction for the hypothesis—comes from a large review of biomass and carbon recovery at 45 sites, with about 1500 total plots, in the New World tropics. In this study by Lourens Poorter and colleagues, both pastures from which cattle had been removed and abandoned agricultural fields showed substantial increases in biomass and carbon stock, with the annual rate of increase in carbon averaging 3.05 tons per hectare (1.23 tons per acre). There was no significant difference between former fields and former pastures in the rate of recovery of carbon.
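As a quick unit check on the Poorter study’s figure, the per-hectare rate can be converted to a per-acre rate directly:

```python
# Unit check: converting the reported carbon accumulation rate.
HECTARES_PER_ACRE = 0.404686  # one acre is about 0.405 hectares

rate_per_hectare = 3.05       # tons of carbon per hectare per year (from the study)
rate_per_acre = rate_per_hectare * HECTARES_PER_ACRE

print(f"{rate_per_acre:.2f} tons per acre per year")  # prints "1.23 tons per acre per year"
```

The conversion reproduces the 1.23 tons per acre figure quoted in the study.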
So, the existing evidence doesn’t show that the difference in sequestration with and without cattle leads to a net carbon sink. Also, it remains unclear to what degree the total direct emissions from animals (ruminant methane, manure, etc.) and indirect ones (e.g. deforestation, fertilizer used to produce feed grains, etc.) could be offset through best management practices (e.g., by soil carbon sequestration in grasslands, avoided conversion to rangelands, avoiding chemical fertilizers, etc.)
This is not to say that we shouldn’t be working hard to increase soil sequestration in pastures, as well as under agricultural fields. And indeed, there have been some promising results in this kind of research. For example, Teague et al. recently proposed a set of scenarios for North American beef production, involving reduced soil erosion through conservation cropping and “adaptive multipaddock grazing” (AMP), under which net emissions could be decreased significantly. Likewise, I have colleagues at UCS modeling various agricultural scenarios that will add to our knowledge on this question. But for now, whether or not beef production could ever become carbon neutral is far from settled science.
While I state at the outset that Hahn Niman does not defend current beef production practices, it is instructive to look at the current situation. Beef’s much higher emissions are associated with much more use of water and much more need for land. The figure below, from a recent review by Ranganathan et al., published as a chapter in IFPRI’s annual report and also as a separate report from the World Resources Institute, shows the size of these differences.
Changing what we eat is one of the steps that we can take to confront this challenge, but it is not “the solution.” This is not only because emissions related to beef, although significant, are still considerably less than those from fossil fuels. It’s also because the necessary transformation of diets needs to recognize that the consumption of foods from high-emissions, ecologically inefficient production systems varies enormously between countries. It’s in the Americas—both North and South—and to a lesser extent in Russia and Europe that beef consumption rates are highest, and thus where emissions could be cut the most by diet changes.
A final point is that this is a matter of reducing emissions, not an all-or-nothing question of morality. Personally, I have tried to reduce my emissions over the past decade by making changes such as driving a hybrid car, using public transport whenever possible, and changing our home’s electricity supplier to one that provides 100% renewable energy. These reduce my carbon footprint, but they don’t make it zero. Similarly, I now eat beef less frequently and in smaller amounts, but I haven’t eliminated it from my diet entirely.
There’s a real irony in this, because Nicolette Hahn Niman doesn’t eat beef—in fact, she doesn’t eat meat at all. She explains (page 184) that having given up meat in earlier years when she became a vegetarian, “to date I simply have not had the urge to eat it. If I ever regain the desire to eat meat, I will.”
So, a defender of beef doesn’t eat it, while this critic of it does. I don’t see this as making either of us more ethical than the other. But I do admit that it very likely means my emissions from what I eat are larger than hers.
In my final review of this series on the book Cowed, I’ll consider how we can move toward reducing such emissions, but will also argue that beef consumption should continue, although at a lower level in many countries, including the U.S. Here I have looked at data showing the impact of removing grazing because it’s a key test of the offsetting hypothesis, not because that’s my policy recommendation. Testing a hypothesis is one thing, and science gives us a basic method for how to do it. But using that method—comparing “with” and “without”—is quite different from considering how to change beef production and consumption systems in the future.
What’s most important, though, is not just changing our individual carbon footprints, but doing things to change the overall emissions and sequestration of the whole planet. For example, if we could get American companies to insist that the beef and other products that they source from the tropics are deforestation-free, it would have much more impact than simply reducing our own consumption. These kinds of changes will help move our global society towards ways of eating, ways of farming and ranching, and ways of living, that will create a better future for those with whom we share the Earth.
Maybe if it had said something like—Greater Efforts by Congress to Protect the Public Interest. Or—Ensuring that Public Health, Safety and Environmental Protections Take Priority. Then I could be excited. But reading through this very lengthy policy document, I found a resurfacing of a set of bad ideas. These proposals would radically change the process of creating public health, safety, environment and consumer protections, apparently with one goal in mind—delay. Instead of a “Better Way,” it should have been titled “An Endless Delay.”
There are real reforms we could undertake to make science-based regulations more effective and more efficient. We should do regulatory review to streamline agency actions. There are important improvements to be made in the ability of the public to comment and influence policy, primarily through reducing the influence of high-powered industries. The process by which science informs agency decisions can be improved, through stronger scientific integrity policies, stronger requirements for independent peer review, disclosure of conflicts of interest, and greater public access to information. And yes, the APA could likely be strengthened as well. But the starting premise for such reforms should be the public interest and the need to provide stronger protections for public health, safety and the environment. I hope the Speaker reaches for these types of reforms, rather than the recipes for delay outlined in the current proposal.
Sometimes that is a good thing, particularly when we see water utilities meeting and exceeding Governor Brown’s call for 25 percent water conservation. In other cases, pursuing new, “drought-proof” water supplies can have unintended consequences. Drought-proof supplies, while helping respond to climate change, often require more energy than conventional drinking water sources (see Figure 1 from Clean Energy Opportunities in California’s Water Sector).
For example, the Carlsbad Desalination Plant, the largest of its kind in the U.S., was recently completed and provides the San Diego County Water Authority with additional drought-proof water supplies, but those supplies have a large energy footprint, requiring around 750 megawatt-hours of electricity per day.
The developers of the desalination plant committed to sourcing only 30 percent of the energy powering the plant from renewables; the remainder is therefore likely to come from fossil fuels. Burning more fossil fuels to adapt to a changing climate is an example of “maladaptation,” or actions taken to address climate risks that actually create, perpetuate, or exacerbate climate change.
The water sector is at a crossroads: it can be part of the climate problem or part of the climate solution.
A decade ago, the California Energy Commission concluded that nearly 20 percent of California’s electricity was used by California’s water sector. It is likely that during this prolonged drought, the water sector’s electricity consumption has risen due to increased groundwater pumping (more than half of water consumed in 2015 came from underground) and increased water treatment. Depending on where water utilities are getting electricity, they could be contributing to more global warming pollution.
The difficulty for decision-makers is that many water and wastewater utilities do not track or disclose electricity use, generation sources, and related global warming emissions. Water utilities that are also retail electricity providers must disclose this information because they are required by law to source 50 percent of their retail electricity from renewables by 2030. But the water utilities that do not also sell electricity have no such requirement. This missing data makes it difficult to identify clean energy opportunities.
Fortunately, a bill introduced by Senator Fran Pavley this year (SB 1425) addresses this challenge by creating a voluntary emissions tracking system for projects that reduce the carbon intensity of California’s water system. This new registry will allow for water agencies, large water consumers, businesses and others to voluntarily measure and track their heat-trapping emissions from water pumping, transport, delivery, and heating.
There is great potential within the water sector to reduce its electricity use and associated emissions. Because many water and wastewater utilities have significant electricity purchasing power and own assets and infrastructure that could host renewable generation facilities or provide flexibility for the electricity grid, they are in a unique position to help the state meet (and surpass) its clean energy goals.
The Sonoma County Water Agency began delivering “carbon-free” water last year and a number of other water utilities are close behind, finding ways to power their operations using clean, renewable sources of electricity, and hosting generation projects for other clean energy purchasers. This is good for the state as it can help meet our greenhouse gas reduction goals. It is also good for water customers since clean electricity locks in a consistent price, which protects against fossil-fuel price volatility.
As the state moves toward its new goal of sourcing 50 percent of its energy from renewable sources by 2030, investing in clean energy solutions has never made more sense.
More than 80 percent of biodiesel is made from vegetable oil (the rest is mostly animal fats). The soybean and canola oil that make up the majority of biodiesel are basically the same as the cooking oil you buy at the grocery store, while the corn and used cooking oils are inedible varieties generally used for animal feed and other purposes.
Using more oils and fats for fuel instead of food and animal feed has consequences for competing users of these products and for the global agricultural system. Of particular importance from a climate perspective is the relationship between rising biodiesel use in the United States and palm oil expansion in Southeast Asia, which is a major driver of deforestation and global warming pollution.
Figure 1 shows that palm oil itself is not a significant direct source of US biodiesel production. But there are important indirect links between how much biodiesel we use in the US and how quickly palm oil plantations expand in Indonesia or Malaysia. These connections can be understood by comparing the rise of biodiesel with ethanol, and by examining the sources of biodiesel one at a time.
Vegetable oils and animal fats are converted into biodiesel via a chemical process called transesterification, after which they’re blended with diesel and used in trucks. Transesterification sounds complicated, but it is a pretty simple chemical reaction (you can actually make biodiesel in your garage); compared with ethanol, the biodiesel production process takes less energy and has lower direct emissions.
The main source of emissions for biodiesel comes from the vegetable oils and fats it is made out of, and not the process of converting them to fuel.
Although ethanol production is much larger, biodiesel has grown more quickly since 2010, more than tripling between 2010 and 2015.
Biodiesel is most often sold as a blend of up to 5 percent biodiesel mixed with petroleum diesel; this blend is labeled as ordinary diesel fuel, consistent with the official specifications. Some trucks can use up to a 20 percent biodiesel blend, but the distribution challenges of marketing different blends for different vehicles have limited the adoption of these higher blends.
Today, biodiesel accounts for about 3 percent of the diesel fuel sold. For comparison, 10 percent ethanol is blended into most of the gasoline sold today.
While biodiesel is a relatively small share of diesel fuel, it has a large footprint in agricultural markets. The fact that ethanol consumes about 40 percent of U.S. corn is much publicized by ethanol critics, but less attention has been paid to the growing share of soybean oil being made into biodiesel, now about 25 percent.
One reason for the different levels of publicity is that expanded demand for corn to make ethanol raises input costs for meat producers, who have been among the loudest and most persistent ethanol critics. Counterintuitively, though, increased demand for soybean oil actually lowers input costs for the meat industry. To understand this mystery, read on!
Soybeans are an interesting crop, connected to their sister crop corn in complex ways in the agriculture, food and fuel system. While you may occasionally encounter soybeans in their immature form as edamame, the majority of soybeans are crushed to make soybean oil and a high protein meal that is mixed with corn in animal feed.
Soybean oil accounts for only 40 percent of the value of soybeans, so the economics of soybean production depend jointly on the oil and the meal. As you would expect, increased demand for soybean biodiesel raises demand and prices for soybean oil, but meal moves in the other direction: as more soybeans are crushed to supply oil, the price of soybean meal falls, because increased meal production meets unchanged demand.
Since soybean prices depend on the sum of oil and meal prices, the net result is that soybean prices are only weakly linked to soybean oil prices. In a specific example worked out and explained in detail here, a 10 percent increase in soybean oil prices led to a 4 percent decrease in soybean meal prices and less than a 2 percent increase in soybean prices. So the impact of soy biodiesel on food prices is mixed, increasing the cost of vegetable oil, but decreasing the cost of animal feed.
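That arithmetic is simple enough to check directly. Here is a minimal back-of-envelope sketch using only the illustrative figures quoted above (the 40/60 oil-to-meal value split and the example price changes); the exact elasticities in the linked analysis are more involved:

```python
# Back-of-envelope check of the soybean price example above.
# Value shares from the text: oil is roughly 40% of a soybean's value, meal roughly 60%.
oil_share, meal_share = 0.40, 0.60

oil_price_change = 0.10    # +10% soybean oil price (example figure from the text)
meal_price_change = -0.04  # -4% soybean meal price (example figure from the text)

# A soybean's price tracks the value-weighted sum of its two products,
# so the implied change in soybean prices is:
bean_price_change = oil_share * oil_price_change + meal_share * meal_price_change
print(f"Implied soybean price change: {bean_price_change:+.1%}")  # +1.6%, under 2%
```

The value weighting is what makes the linkage weak: a large swing in oil prices is diluted by the offsetting move in meal prices.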
But while soybean production is not very responsive to soybean oil prices, other vegetable oils are more responsive, particularly canola and palm oil, which have a higher share of their value derived from vegetable oil. For this reason, increased use of soybean oil to make biodiesel does not lead to much increased production of soybeans, but primarily leads to substitutions among vegetable oils and ultimately more vegetable oil imports.
The substitution of imports for soybean oil used as biodiesel is clearly illustrated in recent agricultural statistics. Starting in about 2003, there was a relatively sudden increase in the use of soybean oil for biodiesel. This increase did not result in an associated jump in soybean oil production, which pretty much stayed on its previous trend, driven by steady growth in demand for protein meal.
Instead, as US soybean biodiesel production grew, domestic consumption of soybean oil for food and other uses fell. Soybean oil use for food and other uses was replaced by imports of other oils, primarily canola and palm oil. This shows up quite clearly in the chart below, which compares the rising use of soybean oil for biodiesel to increased imports of palm, canola and other oils.
It is clear from the data that expanded use of soybean oil to make biodiesel was matched by growing volumes of imported vegetable oil, but the question of causality is a little trickier. That is because in the same timeframe that soy biodiesel consumption was growing, concern about the health impact of trans fats, mostly hydrogenated soybean oil, led to decreased consumption of trans fats, which were replaced in Oreos and many other prepared foods with other oils.
Some of the hydrogenated soybean oil was replaced with palm oil because of its similar properties. In this telling of the biodiesel story, biodiesel expansion is not the cause of increased imports. Rather, rising imports of palm and other oils were caused by changes in US food preferences attributable to health concerns; expanded production of soybean biodiesel was an outlet for the unwanted soybean oil, providing a substitute market while also displacing fossil fuel use and lowering the cost of soy meal for meat producers.
This optimistic interpretation is not implausible, but it is certainly incomplete. Vegetable oils are traded in a global marketplace, where demand for vegetable oil has been growing steadily. If the soybean oil no longer consumed as hydrogenated oil had been exported (either as vegetable oil or as whole soybeans) it would have found a market among the major vegetable oil importers. Vegetable oils are highly substitutable in many markets, and greater availability of soybean oil would have displaced some of the growing demand for palm oil.
Precisely quantifying these relationships is tricky, but given the link between palm oil expansion and deforestation, this alternative explanation paints a less optimistic picture of the climate impact of soybean biodiesel expansion in the last few years.
Regardless of whether you assign causality to falling demand for trans fats or rising demand for biodiesel, that chapter has come to a close. The shift away from hydrogenated soybean oil is now essentially complete; we should not expect a continued surplus of soybean oil.
In fact, the soybean industry is hard at work developing new technologies to regain lost market share in food markets. To the extent they succeed, it will further increase demand for soybean oil and lead to further substitution by palm and other oils.
The point is that increased use of soybean oil-based biodiesel in the US has a limited impact on soybean production, which is primarily determined by demand for protein meal. Instead, the main effect is to tilt the balance of demand in favor of vegetable oils versus protein meal, which favors sources like palm and canola. Canadian canola oil may supply some of this additional demand, but palm oil is the least expensive, fastest growing source of vegetable oil on the global market, and most likely to fill the void left by US soybean oil being used for fuel.
While the majority of biodiesel is made from the same vegetable oil used for cooking, about 40 percent is made from inedible and recycled oils and fats that are not used directly as human food. This share has stayed fairly constant even as biodiesel production has increased.
Every schoolchild knows that recycling is good for the environment, and so increased use of used cooking oil and other recycled sources to make biodiesel is a feel-good story and gets a lot of attention. But like many stories we tell children, the reality is a little more complex.
It turns out that recycled oils and fats used to make biodiesel are not a free lunch for the environment after all. That’s because for the most part, these oils and other fats are not being diverted from landfills like egg cartons used for art projects. There are existing uses for these resources, including livestock feed, pet food, and to make soaps and detergents. If used cooking oil that was feeding livestock is diverted to fuel, the livestock will have to eat something else instead.
There are certainly some efficiency gains to using a lower value feedstock instead of food grade vegetable oil to make fuel, so while these recycled fuels are not a free lunch, they are certainly a discounted lunch. Determining exactly how much of a discount is tricky, and requires lifecycle analysis to figure out the indirect impact by estimating the replacements in animal feed and other existing markets. But ignoring the need to replace these products leads to unrealistically optimistic environmental assessments.
Another fast growing source of biodiesel is inedible corn oil produced as a byproduct of corn ethanol. Corn oil has historically been more expensive than soybean oil, and thus not an attractive source of biodiesel. But over the last few years, a new source of corn oil emerged that was competitively priced.
The corn ethanol boom of 2005 to 2010 saw a huge increase in production of distillers’ grains, an animal feed co-product of ethanol production that is left behind once the corn starch is made into ethanol. Ethanol producers learned that they could extract corn oil from the distillers’ grains, reducing the fat content of the animal feed in the process.
This distillers’ corn oil smells like a brewery and is not sold for human consumption, but it works for biodiesel and animal feed, and it sells at a significant discount to edible corn oil. Removing a portion of the oil from the distillers’ grain animal feed reduces its caloric content, but it does not reduce its value significantly. So this approach is quite profitable, and most ethanol producers have adopted it.
Biodiesel produced from distillers’ corn oil grew by about ten times from 2010 to 2013, but leveled off thereafter. Corn oil associated with distillers’ grains is limited by corn ethanol production, and while some further shifting of oil from feed to fuel markets is possible, the increase associated with the ethanol boom is unlikely to be repeated and is not the basis for a sustainable trend into the future.
One well known source of environmentally-friendly biodiesel is used cooking oil, which allegedly makes your old diesel car exhaust smell like French fries. Together with distillers’ corn oil, used cooking oil (also called yellow grease) has accounted for most of the growth of biodiesel from recycled oils and fats.
But while higher prices for used cooking oil have increased collection somewhat, most of the large sources of used cooking oil were already being collected. Increased demand for waste oil does little to increase the supply of used cooking oil, since this is a waste product whose quantity is set by demand for corn chips and French fries.
For the last few years, overall production of used cooking oil has been basically steady, while biodiesel grew from a small share to consuming 60 percent of domestic used cooking oil in 2015. The increase came mostly from reducing exports rather than from diverting more oil from waste streams.
Even if 100 percent of our remaining exports were made into biodiesel, it would increase biodiesel production by only about 5 percent, and the current importers would have to look elsewhere to replace the lost oil. So there is not much more growth to be had from this source.
I’ve walked you through the major domestic sources of biodiesel, qualitatively highlighting their limitations. Last year we commissioned Professor Wade Brorsen at Oklahoma State University to do a quantitative projection, and he determined that 29 million gallons per year of growth would be a reasonable expectation for domestic sources.
Twenty-nine million gallons sounds like a lot, and indeed it is enough to fill an additional 44 Olympic-sized swimming pools each year. But it amounts to less than 2% growth a year in biodiesel production, which is itself a small share of diesel production.
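The scale comparison is easy to verify. A quick sketch, assuming an Olympic pool holds roughly 660,000 gallons (about 2,500 cubic meters; the pool volume is my assumption, the gallon figures are from the text):

```python
# Putting the 29-million-gallon growth estimate in perspective.
growth_gal = 29_000_000            # projected annual growth in domestic supply (from the text)
pool_gal = 660_000                 # assumed volume of an Olympic pool (~2,500 cubic meters)
us_biodiesel_gal = 2_000_000_000   # ~2 billion gallons/year of US biodiesel (from the text)

print(f"Olympic pools per year: {growth_gal / pool_gal:.0f}")             # 44
print(f"Annual growth rate: {growth_gal / us_biodiesel_gal * 100:.2f}%")  # 1.45%, under 2%
```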
If biodiesel production grows faster than this rate it is likely to be either imported, produced with imported sources of oil, or produced by bidding away existing sources of oil from other users, who will in turn be forced to switch to imports.
The potential for significant and sustainable growth in domestic biofuel production depends upon moving beyond food-based fuels made from vegetable oil or corn starch and turning instead to biomass resources. These resources have the potential for significant—but by no means limitless—expansion as the technology to convert them to cellulosic fuels scales up.
The potential and implications of making ethanol from biomass is discussed at length in Chapter 3 of our recent report, Fueling a Clean Transportation Future. And as cellulosic ethanol technology matures, different biological or chemical processes can make the same resources into cellulosic diesel, jet fuel or other fuels or products as well.
Talking about government regulations is a good way to put people to sleep (at least my wife), so I saved this little lullaby for the finale. Each year, the EPA must put forth specific regulations to implement the Renewable Fuel Standard (RFS), which Congress passed in 2005 and amended in 2007. In recent years this has gotten tricky, as tradeoffs and constraints in the fuel system make realizing Congress’ goals complicated.
Last year the EPA made a major overhaul of its approach to the RFS, which basically put the policy back on track. This year they are sticking quite close to that approach (see this summary for details), which will help build stability and predictability for a policy that has been short of both.
For biodiesel, the EPA has proposed an increase of 100 million gallons, from 2 billion gallons a year to 2.1 billion gallons, the same increase they proposed last year.
Not surprisingly, the biodiesel industry has a more bullish view, and argues that the EPA should expand mandates for bio-based diesel by five times as much, to 2.5 billion gallons.
This is 17 times more than Professor Brorsen found could be supported by domestic sources of oils and fats. Growth rates this far in excess of domestic resources will inevitably lead to much greater reliance on imports of either biodiesel or oils and fats to replace domestic sources bid away from existing users. The 500-million-gallon a year increase the industry seeks is unsustainable, and would set the industry up for a crash. It would also create a huge hole in the global vegetable oil market which would largely be filled by palm oil expansion.
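The multiple is straightforward to check against the figures above:

```python
# How the industry's requested increase compares with the domestic-supply estimate.
industry_increase_gal = 500_000_000  # requested annual increase in the mandate (from the text)
sustainable_gal = 29_000_000         # Prof. Brorsen's domestic growth estimate (from the text)

print(f"Requested growth is {industry_increase_gal / sustainable_gal:.0f}x "
      f"the sustainable domestic estimate")  # 17x
```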
To provide stable support for the biodiesel industry and to avoid unintended problems across the globe, it is important that policy support for biodiesel growth is consistent with the growth in the underlying sources of oils and fats. The EPA should scale back its proposal in light of these constraints.
As we’ve discussed previously on this blog and elsewhere, keeping these weapons on hair-trigger alert so they can be launched within minutes creates the risk of a mistaken launch in response to false warning of an incoming attack.
This practice dates to the Cold War, when US and Soviet military strategists feared a surprise first-strike nuclear attack that could destroy land-based missiles. By keeping missiles on hair-trigger alert, they could be launched before they could be destroyed on the ground. But as the letter notes, removing land-based missiles from hair-trigger alert “would still leave many hundreds of submarine-based warheads on alert—many more than necessary to maintain a reliable and credible deterrent.”
“Land-based nuclear missiles on high alert present the greatest risk of mistaken launch,” the letter states. “National leaders would have only a short amount of time—perhaps 10 minutes—to assess a warning and make a launch decision before these missiles could be destroyed by an incoming attack.”
Over the past few decades there have been numerous U.S. and Russian false alarms—due to technical failures, human errors and misinterpretations of data—that could have prompted a nuclear launch. The scientists’ letter points out that today’s heightened tension between the United States and Russia increases that risk.
The scientists’ letter reminds President Obama that he called for taking nuclear-armed missiles off hair-trigger alert after being elected president. During his 2008 presidential campaign, he also noted, “[K]eeping nuclear weapons ready to launch on a moment’s notice is a dangerous relic of the Cold War. Such policies increase the risk of catastrophic accidents or miscalculation.”
Other senior political and military officials have also called for an end to hair-trigger alert.
The scientists’ letter comes at an opportune time, since the White House is considering what steps the president could take in his remaining time in office to reduce the threat posed by nuclear weapons.
“There was good overnight humidity recovery in the fire area last night, which will delay the burn period today. However, as temperatures warm and vegetation dries out, pockets of heat may become more active in the afternoon.”
These excerpts from the incident reports of the Sherpa Fire in California and the Dog Head Fire in New Mexico, respectively, highlight the relationships between weather and wildfires. Over the past few decades, climate change has driven an inexorable trend of higher summer temperatures across the West—a trend that is expected to continue for years to come. This year, some states may see relief if the El Niño pattern shifts rapidly to La Niña—and if Congress passes legislation that provides sufficient resources to manage wildfire risk effectively.
Temperatures on Father’s Day tied the record high temperature of 103 degrees in Albuquerque, New Mexico, further contributing to the heat fueling the Dog Head Fire raging nearby, and making conditions miserable for firefighters.
With the fire exceeding 17,000 acres and only 9 percent contained, Governor Martinez declared a state of emergency and mobilized the National Guard. Fortunately, on Monday, weather conditions began to change, with higher humidity moving into the area, bringing peak temperatures down into the upper 90s (!) and allowing firefighters to improve their fire lines. But this was cold comfort to the owners of the two dozen homes that had already been lost to this fire.
Meanwhile, another fire was blazing near Santa Barbara, California—the Sherpa Fire. By Monday, it had burned nearly 8,000 acres and was more than 50 percent contained. More than 1,200 firefighters worked night and day, preparing to defend fire lines in the midst of an excessive heat warning, with maximum temperatures exceeding 100 degrees. “Sundowner” winds were expected to blow from the Santa Ynez Mountains at speeds up to 40 miles per hour in the evening, potentially fanning the flames overnight. Already several communities had been evacuated, and Highway 101 had been closed at times.
These were just two of the 16 active fires burning over the weekend, which had consumed a total of nearly 100,000 acres. The problem is not limited to the West – brushfires in Florida closed a major highway early in the week – but the Western states are where we’ve seen the most fires breaking out. And the wildfire season is just getting started, and might be worsened by the climate phenomena known as El Niño and La Niña.
I’m often asked how El Niño and La Niña affect wildfires in the West, and the answer is complicated.
On top of the rising temperature trend driven by climate change, we have other patterns that affect global temperatures. El Niño and La Niña (“the boy” and “the girl” in Spanish, respectively) describe a pattern of global climate conditions that tends to alternate every few years.
Among other features, El Niño is characterized by warmer-than-normal water in the eastern Pacific, which tends to bring more moisture and precipitation to the southwestern US and leaves the northern tier of states hot and dry. In contrast, La Niña exhibits cooler waters in the eastern Pacific, leading to drier conditions in the Southwest, while wetter conditions prevail in the Pacific Northwest and the northern Intermountain West.
Last year we were in the midst of a powerful El Niño—possibly the strongest measured since 1950. And, as we would expect, the Northwest was abnormally hot and dry, setting the stage for a record-breaking wildfire year, with over 10 million acres burned in the US.
The effects of El Niño lingered through the spring in most places, bringing record heat to the Southwest but failing to provide much-needed moisture to southern states, including California, Arizona, and New Mexico. As a result, southern California is still suffering through an extreme drought, estimated to be affecting more than 33 million people. It should come as no surprise, then, to see wildfires breaking out in the hottest, driest areas.
Climate scientists are now saying there is a strong likelihood that the El Niño conditions could rapidly flip to La Niña conditions by autumn. If La Niña follows its usual pattern, it may bring some relief to the Pacific Northwest, Alaska, and the northern Intermountain states.
Unfortunately, it could also exacerbate the dry conditions in the Southwest, prolonging the drought in California and potentially expanding it to other states. On top of the drought, if the record-breaking heat of June is a sign of a hot summer to come, we could see another extraordinary fire season—putting many communities at risk and stretching our capacity to cope.
Policy makers have taken notice of the situation, and some are working to help us become better prepared for wildfire risks. It’s a complex problem to solve, with several factors involved.
First, decades of fire suppression have led to a buildup of flammable materials—living and dead trees, litter, grasses, etc.—in many areas, and these materials act as tinder once a fire starts.
Second, the warmer and drier conditions have spurred a wave of tree mortality in Western forests, accelerated by climate change, leaving dead trees that are vulnerable to fire.
Third, more people have moved into forested areas, putting themselves and their property in harm’s way, and making fire suppression more costly because firefighters spend more time and effort protecting developed areas.
And fourth, we have climate change itself creating conditions that increase the likelihood and extent of wildfire.
The Forest Service, which is responsible for managing 193 million acres of public land, primarily in the West, has been grappling with this problem as it unfolds. Administrators have recognized that recent fire seasons are regularly breaking their budgets, perversely forcing them to take money budgeted for fire risk reduction activities and use it for fire suppression instead.
After a collective face palm, Congress began to put together legislation aimed at fixing this problem. The latest bipartisan draft legislation in the Senate is called the Wildfire Budgeting, Response, and Forest Management Act of 2016. A hearing on the draft legislation is expected in the Senate Committee on Energy and Natural Resources on Thursday, June 23. The hearing is organized around two panels of witnesses, representing a range of government, private sector, and civil society viewpoints. It should be an informative discussion.
Whether or not this legislation moves forward, the wildfire problem is not going away.
Year by year, the factors contributing to wildfire risk continue to increase—with climate change potentially becoming the most important factor of all.
Hopefully Congress will pass legislation that can help us get a better handle on fire-fighting and land management to control the risk of wildfires for the next few years. Maybe some of us will get a break from La Niña or some other short-lived phenomenon. But ultimately these fixes will be temporary, unless we can stop the inexorable rise in global temperatures by reducing our emissions of heat-trapping gases. And on that point, Congress has yet to take any serious action.
Because NHTSA’s authority allows the agency to finalize standards for no more than five years at a time, this policy is currently undergoing a “mid-term review.” This review will examine the regulations in light of the many factors that may have changed over the past four years, including unforeseen technology advancements, changes in consumer acceptance, and lower fuel prices. The process is already underway, but it will formally kick off with a draft technical assessment report, to be released shortly, on which the agencies will ask for public comment.
To help everyone make sense of these rules, UCS is beginning a fact sheet series on the mid-term review to discuss various issues of concern. The first three fact sheets are being released today, and we will continue to release new fact sheets over the coming months to help illustrate why these standards are important, what they mean for consumers, and why automaker efforts are better spent on continued innovation than on lobbying for less stringent regulation. Nothing that has happened in the past four years suggests that these standards should be weakened—in fact, given the technology development spurred to date, this is an opportunity to strengthen the standards for 2025.
Whether it’s because of low gas prices or more efficient SUV choices, it’s clear that consumers have been buying more SUVs than ever before. But despite automaker pleas, that isn’t a reason to weaken the standards.
These standards were designed to give consumers efficient vehicle choices across all sizes and types of vehicles, and that’s exactly what they’ve done. Automakers are selling more efficient cars, trucks, and SUVs alike, which is why they find themselves well ahead of the regulatory targets.
Without these standards, increasing sales of SUVs would have moved fuel economy backwards, just as it did more than a decade ago. However, we will need even stronger standards in order to move forward on our climate and oil use targets in 2030 and beyond.
By setting targets for 2025, regulators gave suppliers and automakers a clear long-term goal, helping to spur significant investment in technology development and deployment—and that investment is paying off. Automakers have come up with low-cost ways to get more out of vehicles powered by internal combustion engines that the regulators never saw coming.
The forthcoming draft technical assessment report will give new insight into what the technology landscape looks like, but our analysis shows that there are a number of developments that the federal agencies didn’t foresee back in 2012:
These technologies are just a small sampling of the innovation occurring across the automotive industry—innovation that was unanticipated and that helps build the case for why automakers can do even more than originally expected in 2025.
The fact sheet series is available at www.ucsusa.org/midtermreview. Stay tuned for future installments.