UCS Blog - The Equation (text only)

Vehicle Fuel Economy Standards—Under Fire?

Photo: Staff Sgt. Jason Colbert, US Air Force

Last year, transportation became the sector with the largest CO2 emissions in the United States. While the electricity industry has experienced a decline in CO2 emissions since 2008 because of a shift from coal to natural gas and renewables, an equivalent turnaround has not yet occurred in transportation. Reducing emissions in this sector is critical to avoiding the most severe effects of climate change, and the Corporate Average Fuel Economy (CAFE) and Greenhouse Gas (GHG) emissions standards are an important mechanism for doing so.

The most recent vehicle standards, which were issued in 2012, are currently undergoing a review. The Department of Transportation (DOT) is initiating a rulemaking process to set fuel economy standards for vehicle model years 2022-2025. At the same time, DOT is taking comments on its entire roster of regulations, including the CAFE standards, to evaluate their continued necessity.

A number of criticisms have been raised about fuel efficiency standards, some of which are based more in confusion and misinformation than fact. An intelligent debate about the policy depends on separating false criticisms from those that are uncertain and those that are justified.

In fact, as new research I did with Meredith Fowlie of UC Berkeley and Steven Skerlos of University of Michigan shows, the costs of the standards could actually be significantly lower than other policy analyses have found.

Costs and benefits of the regulations

What my co-authors and I have found is that automakers can respond to the standards in ways that lower the costs and increase the benefits.

Many policy analyses do not account for the tradeoffs that automakers can make between fuel economy and other aspects of vehicle performance, particularly acceleration. We studied the role that these tradeoffs play in automaker responses to the regulations and found that, once they are considered, the costs to consumers and producers are about 40% lower, and reductions in fuel use and GHG emissions are many times higher.

The study finds that automakers’ ability to trade off fuel economy and acceleration makes both consumers and producers better off. A large percentage of consumers care more about paying relatively lower prices for vehicles than about having faster acceleration. Selling relatively cheaper, more fuel-efficient vehicles with slightly lower acceleration to those consumers allows manufacturers to meet the standards with significantly smaller profit losses. Consumers who are willing to pay for better acceleration can still buy fast cars.

Debunking some common criticisms

One common criticism is that the regulations mandate fuel economy levels that far exceed those of any vehicles today. This misconception stems from a figure frequently quoted when the regulations were first issued: that they would require 54.5 mpg by 2025. But the regulations do not actually mandate any fixed level of fuel economy in any year. The fuel-economy standards depend on the types of vehicles that are produced each year. If demand for large vehicles is up, the standards become more lenient; if more small vehicles are sold, they become stricter. The 54.5 mpg number was originally estimated by EPA and DOT in 2012, when gas prices were high. EPA has since revised it to 51.4 mpg to reflect lower gas prices and higher sales of large vehicles. Taking into account flexibilities provided in the regulations and the fact that this number is based on EPA’s lab tests, which yield higher fuel economy than drivers experience on the road, the average target for 2025 is equivalent to approximately 36 mpg on the road. Fueleconomy.gov lists 20 different vehicle models that get at least this fuel economy today.
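To make the lab-versus-road arithmetic concrete, here is a minimal sketch. The 0.7 derating factor is an assumption chosen to reproduce the roughly 36 mpg on-road figure quoted above; the actual EPA adjustment varies by vehicle and test cycle.

```python
# Minimal sketch: converting a lab-test (CAFE) fuel economy value into a
# rough on-road estimate. The 0.7 derating factor is an assumed,
# illustrative value; actual EPA adjustments vary by vehicle and test cycle.

LAB_TO_ROAD_FACTOR = 0.7

def on_road_mpg(lab_test_mpg, factor=LAB_TO_ROAD_FACTOR):
    """Estimate real-world fuel economy from a lab-test value."""
    return lab_test_mpg * factor

print(round(on_road_mpg(54.5)))  # original 2012 estimate -> ~38 mpg on road
print(round(on_road_mpg(51.4)))  # revised estimate       -> ~36 mpg on road
```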

Another common but unjustified criticism of the standards is that they push consumers into small vehicles. The regulations were specifically designed to reduce any incentive for automakers to make vehicles smaller. The standards are set on a sliding scale of targets for fuel economy and GHG emissions that depend on the sizes of the vehicles. As a result, an automaker that sells larger vehicles has less stringent fuel economy and emissions targets than one that sells smaller vehicles. Research has shown that the policy likely creates an incentive for automakers to produce bigger vehicles, not smaller ones.

Two easy ways to strengthen the fuel economy standards

There are, of course, advantages and drawbacks to any policy, including today’s vehicle standards, which focus entirely on improving the efficiency of new vehicles.  Fortunately, there are improvements that can be made to the CAFE and GHG regulations to increase their effectiveness and lower costs.

The first is ensuring that automakers that violate the standards pay very high penalties. Companies that cheat steal market share from those that follow the standards, effectively raising the regulatory costs for the automakers that are playing fair.

The second improvement involves the way automakers are able to trade “credits” with each other. These credits were created to equalize regulatory costs across companies. So, if one automaker finds it relatively easy to reduce emissions, it can reduce more than its share and sell credits to another automaker having trouble reducing emissions. This trading is currently negotiated individually by each pair of automakers, which raises transaction costs. Creating a transparent market to trade these credits would help achieve the target emission reductions at lower cost.

The Department of Transportation (DOT), which implements the Corporate Average Fuel Economy (CAFE) standards, is currently soliciting comments on regulations “that are good candidates for repeal, replacement, suspension, or modification.” The comment period ends December 1.


Dr. Kate Whitefoot is an Assistant Professor of Mechanical Engineering and Engineering and Public Policy at Carnegie Mellon University. She is a member of the NextManufacturing Center for additive manufacturing research and a Faculty Affiliate at the Carnegie Mellon Scott Institute for Energy Innovation. Professor Whitefoot’s research bridges engineering design theory and analysis with that of economics to inform the design and manufacture of products and processes for improved adoption in the marketplace. Her research interests include sustainable transportation and manufacturing systems, the influence of innovation and technology policies on engineering design and production, product lifecycle systems optimization, and automation with human-machine teaming. Prior to her current position, she served as a Senior Program Officer and the Robert A. Pritzker fellow at the National Academy of Engineering where she directed the Academy’s Manufacturing, Design, and Innovation program.


Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Abnormal and Catastrophic 2017 Hurricane Season Finally Over

The official end of the 2017 North Atlantic hurricane season, November 30th, has finally arrived.  In the season’s wake many are mourning the loss of loved ones, repairing their homes, and still waiting for electricity to return.

Hurricane Tracks 2017

Figure 1. North Atlantic hurricane and tropical storm tracks of the 2017 season. Preliminary, as the November storm tracks are not yet updated.

The 2017 North Atlantic hurricane season was not normal

The first named storm of the 2017 Hurricane Season, tropical storm Arlene, began in early April.  Harvey, Irma, and Maria are the names communities will remember long after they became major hurricanes.

Six of the ten hurricanes were major (category 3 or higher).  Recalling the headlines and seeing the damages, the season was catastrophic (Figure 1).  Crunching the numbers on a measure of power, Accumulated Cyclone Energy (ACE), confirms that impression.  September 2017 ACE was more than three times the historical September ACE average over 1981-2000.  Scientists are piecing together the factors that contributed to such an intense hurricane season.  Attribution studies (studies that attribute the relative roles of human and natural factors in the occurrence of extreme weather) have already been published about a specific hurricane from 2017.
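For readers curious about the ACE arithmetic, a minimal sketch follows. The formula is standard: sum the squares of a storm’s 6-hourly maximum sustained winds (in knots) while it is at tropical-storm strength or greater, and scale by 10^-4. The wind history below is invented for illustration.

```python
# Accumulated Cyclone Energy (ACE): the sum of squared 6-hourly maximum
# sustained winds (knots) while the storm is at tropical-storm strength
# or greater (>= 34 kt), scaled by 1e-4.

def ace(six_hourly_winds_kt):
    """ACE contribution of one storm, given its 6-hourly max winds in knots."""
    return 1e-4 * sum(v ** 2 for v in six_hourly_winds_kt if v >= 34)

# Hypothetical wind history for a short-lived hurricane (knots):
example_track = [35, 45, 60, 80, 100, 95, 70, 50, 30]
print(round(ace(example_track), 1))  # ~4.0 for this single storm
```

A season’s (or month’s) ACE is simply this quantity summed over every storm.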

Some extraordinary conditions of this hurricane season:

Hurricane Ophelia SST Anomalies Oct 2017

Figure 2. Warmer than 1985-2012 average sea surface temperatures (SSTs) during the time when tropical storm Ophelia transitioned into a hurricane south of the Azores Islands.

Warmer Seas – A big factor contributing to the intensification of Harvey, Irma, and Maria was warmer than average sea surface temperature (SST) conditions.  Another surprising consequence of the warmer than average SSTs was that the region favorable for hurricanes extended beyond the typical hurricane regions of the North Atlantic Ocean.  This allowed Hurricane Ophelia to thrive at highly unusual latitudes and longitudes, making it the easternmost hurricane to date (see storm track number 15 in Figure 1).  The extratropical storm Ophelia made landfall in Ireland and brought waves that battered the UK coast, drenched northern Europe, and blew winds that fueled lethal wildfires in southern Europe.  Research suggests that heat-trapping emissions can extend the SST region favorable for hurricanes and increase the chances of these storms heading toward western Europe.

Figure 3. Record-breaking precipitation along the Texas and Louisiana coastal region.

Record Breaking Precipitation – Hurricane Harvey dropped a whopping 60 inches of rain on Nederland, Texas, east of Houston, breaking the 1950-2017 record for state maximum precipitation from tropical cyclones and their remnants. Hurricane Harvey’s average accumulated rainfall over Houston (840 mm, or 33 inches) was exceptional.  There was so much floodwater in Houston that it sank the landscape by 2 centimeters (~0.8 inch) in some places.  Assuming the precipitation area of individual hurricanes remains similar, Kerry Emanuel found that a greater than 500 mm (19.7 inches) average accumulated event precipitation was a once-in-100-years event over 1981-2000.  It becomes a once-in-16-years event by 2017 and a once-in-5.5-years occurrence by the end of this century under an unabated emissions scenario; a sketch converting these return periods into probabilities appears after this list of conditions.

Catastrophic Wind – A hurricane’s category is defined by its sustained winds, and the associated consequences are described by phrases such as “devastating damage” for category 3 and “catastrophic damage” for categories 4 and 5.  Hurricanes Irma and Maria had unusually high peak 1-minute sustained winds, placing them among the North Atlantic hurricanes with the strongest winds in the historical record (see table).  Those on the ground during landfall withstood ferocious winds.  Hurricane Maria was the first category 5 (157 miles per hour or higher sustained winds) hurricane to make landfall in Dominica, a small Caribbean island southeast of Puerto Rico. It made landfall yet again, this time as a category 4 (130-156 miles per hour sustained winds), in Puerto Rico.  Similarly, hurricanes Harvey and Irma made landfall as category 4 storms in Texas and Florida, respectively.
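Emanuel’s return periods are easier to grasp as probabilities over a fixed horizon. Here is a minimal sketch that converts them, assuming statistically independent years, which is a simplification.

```python
# Convert a return period into the chance of at least one such event
# occurring over a fixed horizon, assuming independent years.

def chance_within(years, return_period):
    annual_p = 1.0 / return_period
    return 1.0 - (1.0 - annual_p) ** years

# Emanuel's estimates for >500 mm storm-average rainfall over Houston:
for label, rp in [("1981-2000", 100), ("2017", 16), ("~2100", 5.5)]:
    print(f"{label}: {chance_within(30, rp):.0%} chance in any 30-year span")
# 1-in-100 years -> 26%; 1-in-16 -> 86%; 1-in-5.5 -> essentially certain
```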


How does an abnormal hurricane season become disastrous?

The Intergovernmental Panel on Climate Change has pointed to three major factors that combine to influence the risk of an extreme event disaster: the weather and climate event itself, the communities exposed to that event, and those communities’ social vulnerability.

Social vulnerability refers to the resilience of communities when confronted by external stresses.  A few examples follow of how exposure and social vulnerability intersected with hurricanes that are changing in a warming world.

Many Caribbean residents were among those exposed to these powerful hurricanes, which made repeated landfall on numerous islands this year.

For over three centuries people have lived on Barbuda, and for the first time the risk was so grave that the entire island population fled to avoid exposure to Hurricane Irma.  Residents now confront the task of rebuilding a community and civilization on Barbuda.

It is estimated that 3.7 million people, over a million households, and nearly 260 billion dollars in assets in Puerto Rico were exposed to wind impacts from Hurricane Maria. The fourth most populated U.S. city in 2017, Houston, was exposed to the precipitation deluge (see Figure 3) from Hurricane Harvey.

An entire metropolitan region, island or county might be exposed to an abnormal hurricane season, but not all of those in the path of a storm are equally vulnerable to its effects.

Differences in vulnerability have already emerged in the aftermath of the 2017 season that reflect in part the history of a place, people, and infrastructure.  Additional factors include communication about the hurricane risks, first response and long-term disaster management. For example, elderly people perished in a Florida nursing home after days of being stuck in sweltering heat following the power outage caused by Hurricane Irma.

The U.S. Health Assessment found that people with chronic medical conditions are more likely to have serious health problems during excessive heat than healthy people.  The elderly in this case depended on others for their care.  As the USA Today Editorial Board put it, “In such a climate, air conditioning is not a luxury for elderly people; it’s a necessity.”

The tragic loss of life from Hurricane Maria in Puerto Rico is estimated to be similar to that from Hurricane Katrina. This large toll is due in part to the vast numbers of U.S. citizens and residents in Puerto Rico who are still suffering from a lack of safe drinking water or access to power.

Families are piecing together their lives after the devastating loss of a family member, or coping with the absence of a child who had to evacuate to continue school in a safer place during a protracted recovery period.

2017 is the most expensive Atlantic hurricane season to date, with damages already exceeding $200 billion.  The epic disasters of the 2017 hurricane season hold important lessons, which should be taken into account when planning steps to better protect lives from hurricanes and their aftermath.  In turn, those recovering from this disastrous season can draw on lessons already learned from Hurricanes Sandy, Katrina, and Andrew.

These lessons can help communities rebuild toward climate resilience with principles that are scientifically sound, socially just, fiscally sensible, and adequately ambitious.

Sources: NOAA Climate.gov; NOAA National Weather Service; NOAA tweet (http://bit.ly/2AkUySt). Figures created by Brenda Ekwurzel with NASA imagery, a U.S. Air National Guard photo by Staff Sgt. D.J. Martinez, U.S. Air Force, and U.S. Dept of Homeland Security images.

Virginia’s Gerrymander Is Still Alive—and a Deadly Threat to Environmental Justice

This week, Virginia’s Board of Elections certified results from the November 7th elections, paving the way for three crucial recounts that will determine control of the Virginia House. The Democratic Party would need to take two of those seats for a majority, having already defeated more than a dozen incumbent Republicans and flipped three seats. If this wave is enough to push the Democratic Party over the 50-seat mark, many in the press will declare that the Virginia GOP’s gerrymandered districting plan is no more. But they will be wrong. The value of some Virginians’ votes is still diluted, as it was before the election. In turn, voting inequalities continue to bias the legislature’s responsiveness to environmental and health threats.

Virginia’s gerrymander has proven durable over the decade. Majorities of voters have supported the Democratic Party over the last four election cycles, only for the party to win about one third of legislative seats. This bulwark against majority rule was engineered after the 2010 Census, by an incumbent party with absolute control over redistricting the assembly. Despite earning a substantial (nine-point) majority of votes over incumbent Republicans this year, Democrats still have less than a 50/50 chance of gaining majority control, and if they do it will be by one seat. The fact that there is any uncertainty over whether a party with a near 10-point majority of the vote will control the chamber is proof of just how durable the gerrymander is. What happened on November 7th in Virginia was near historic, but it did not breach the gerrymander.

2017 Democratic district vote shares (blue), sorted by 2015 Republican vote shares (red). Democratic vote shares in 2015 uncontested GOP districts are sorted by 2017 Democratic vote share.

Democratic voters wasted far more votes in uncontested safe districts (26 of them) than Republican voters did in the 11 overwhelmingly Republican districts where Democrats did not field candidates. This is illustrated in the graphic, with full blue bars (left) indicating uncontested Democratic seats, and bars filled red with no blue indicating uncontested Republican seats.  While Democrats tend to reside in higher-density, urban regions, one of the most powerful gerrymandering tactics is to pack opposition voters into districts so that their surplus votes (over 50%) are wasted; a stylized example is sketched below. This year, extensive mobilization efforts, coupled with a gubernatorial campaign tainted with racist overtones, provided the bump that Democrats needed in the most competitive districts (around the 50% mark). The middle of the graph depicts the contests where Democrats reached 50% or higher, reaching into the competitive districts held by GOP incumbents (and several open seats).
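To see how packing wastes votes, consider a stylized three-district example; all numbers below are hypothetical, not Virginia data. A vote is “wasted” if it is cast for a losing candidate or exceeds the bare majority a winner needs.

```python
# Stylized illustration of vote "packing." Wasted votes: all votes for a
# losing candidate, plus a winner's votes beyond the 50%-plus-one
# threshold. (Ties are ignored for simplicity.)

def wasted_votes(dem, rep):
    threshold = (dem + rep) // 2 + 1
    if dem > rep:
        return dem - threshold, rep   # (Dem wasted, Rep wasted)
    return dem, rep - threshold

# One packed Democratic district plus two lean-Republican districts:
districts = [(85_000, 15_000), (45_000, 55_000), (44_000, 56_000)]
dem_wasted = sum(wasted_votes(d, r)[0] for d, r in districts)
rep_wasted = sum(wasted_votes(d, r)[1] for d, r in districts)
print(dem_wasted, rep_wasted)  # 123999 vs 25998

# Democrats cast 174,000 of 300,000 votes (58%) yet win 1 of 3 seats,
# wasting nearly five times as many votes as Republicans.
```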

In districts that were contested in both cycles, Democratic candidates gained an average of 9.6 points (with a 5-point standard deviation). Democrats also contested far more districts than in 2015 (the solid red area with blue bars), picking off several seats against incumbents where they had not previously fielded candidates. Had the wave reached into districts where Republicans typically win by 15-20 points, we would have seen the type of gerrymander backfiring that occurred in Congress in the late 1800s. In 1894, for example, a vote shift of less than 10 points against the Democratic Party cost the party more than 50% of its seats, the largest loss in Congressional history.

The Democratic wave was enough to sweep away the GOP’s supermajority, but not enough to reverse the tides. Unless the Democratic Party can repeat its impressive turnout effort in 2019, it will be impossible to hold on to those marginal seats. Of course, under a fair system, a party with a nine-point statewide lead would have a cushion of several seats for close legislative votes. Even if Democrats do gain control, that one-seat majority is vulnerable to being picked apart by the same powerful actors that helped engineer this electoral malpractice in the first place, at a great cost to Virginians.

Probably the single most powerful player is Dominion Energy. Consistently one of the largest donors to state election campaigns, Dominion greatly benefitted from a gerrymander engineered in large part by one of its biggest supporters, Appropriations Chair S. Chris Jones. Since 2011, Dominion has been remarkably successful at pushing through a rate freeze law that allowed it to hold on to over $100 million it would have paid back to customers, limiting the growth of clean energy technologies like solar power, and avoiding regulatory oversight of the toxic pollutants that it dumps into Virginia waterways. Remarkably enough, several of the successful Democratic challengers in this election made Dominion’s political influence central to their campaigns, refusing to accept its contributions.

The Dominion rate freeze passed the VA House on a 72-24 vote, so it’s not clear that even a fair districting plan would have stopped it, but it definitely would have changed the terms of negotiation. And because the gerrymander still insulates the legislature from an accurate representation of public support, it weakens voters’ ability to protect themselves against current and impending health threats. For example, measured by the amount of toxic chemicals discharged into them, Virginia’s waterways are among the worst in the nation. Hundreds of companies are allowed to legally discharge toxins into waters upstream from recreational places where people regularly swim and fish. Arsenic levels up to 400 times greater than what is safe for residential soil have been measured along the James River.

Dan River coal ash spill. Photo: Appalachian Voices

According to a University of Richmond study, eight coal ash disposal sites along major rivers are significant hazards to nearby communities. Yet Virginia’s legislative oversight and regulatory programs are “bare boned and fragmented”, with utilities failing to provide adequate information about the amount, condition and stability of toxic chemicals and containment.

Nor do Virginians bear this burden equally. Seventy-six percent of Virginia’s coal-fired plants are located in low-income communities or communities of color, including Possum Point, Spruance Genco and the Clover Power Station. Cumulative chemical exposure in such communities increases the risk of cancer, lung, and neurological diseases. The cancer rate in rural Appalachian Virginia is 15% higher than the national average, reflecting both environmental threats and lack of access to health care.  Earlier this year, an effort to expand Medicaid was killed on a party-line vote.

And as the impact of climate change becomes more pronounced, Virginia is on the front lines. A UCS analysis of the impact of tidal flooding showed that cities like Norfolk could see four times the frequency of flooding by 2030, while they already spend $6 million a year on road improvement, drainage and raising buildings. In places like Hampton Roads, sea level has already risen by more than a foot over the last 80 years. Yet members of the Virginia House, entrenched in power, continue to deny even the existence of sea level rise. Unfortunately, even a gerrymander as durable as Virginia’s cannot stop actual rising tides.

For their own safety, and the future of the Commonwealth, Virginians must continue the fight to have their full voting rights restored. Many are already suffering, and many more will pay a heavy price for policies that are unresponsive to public needs. Political equality and the integrity of the electoral process are prerequisites to evidence-based policy making that is in the public interest.

More Electric Vehicle Infrastructure Coming to Massachusetts

The Massachusetts Department of Public Utilities today approved a proposed $45 million investment in electric vehicle charging infrastructure.

The investments in electric vehicle infrastructure come as part of a complicated rate case that involves a number of important issues related to rate design, energy efficiency and solar energy. But at least on the electric vehicle part, the utilities and the DPU got it right.

Why do we need more investments in electric vehicle infrastructure?

Electric vehicles are a critical part of Massachusetts’ climate and transportation future. Under Massachusetts’ signature climate law, the Global Warming Solutions Act, the state is legally required to reduce our emissions of global warming pollution by 80 percent by 2050.

Transportation is the largest source of pollution in Massachusetts, and it’s the one area of our economy where emissions have actually grown since 1990. Achieving our climate limits will require the near-complete transition of our vehicle fleet to electric vehicles or other zero-emission vehicle technologies.

The good news is electric vehicles are here, they are fun to drive and cheap to charge, and when plugged into the relatively clean New England grid, they get the emissions equivalent of a 100 mpg conventional vehicle. EV drivers in the Boston area can save over $500 per year in reduced fuel costs. Electric vehicle technology has advanced to the point where mainstream automakers and countries like China and France are now openly talking about the end of the internal combustion engine.

But while the future for EVs is bright, electric vehicles are still a very small share of the overall vehicle fleet. Nationally, EVs represent less than half of one percent of new vehicle sales. In 2012, Massachusetts committed to a goal of putting 300,000 electric vehicles on the road by 2025. Five years later, we are still about 288,000 EV sales short of that goal.

What investments are coming?

One of the biggest challenges facing the growth of electric vehicles is limited infrastructure. People are not going to buy an EV if they don’t know where to plug it in. A survey of Northeast residents conducted last year found that limited access to charging infrastructure is one of the biggest obstacles to EV purchases.

We have had over a hundred years – and billions in public subsidies – to build the infrastructure of refineries, pipelines, and gas stations that service the internal combustion engine. New investments in charging infrastructure are critical to making EVs as convenient as filling up at a gas station.

Today’s decision will speed the transition to electric vehicles by making investments in charging infrastructure. These investments include more funding for charging infrastructure for people who live in apartment buildings, more fast-charging infrastructure along highways, more charging infrastructure in low-income communities, and greater access to workplace charging.

Overall, the proposal anticipates the construction of 72 fast-charging stations and 3,955 “Level-2” home and workplace charging ports over the next 5 years. Of those charging ports, 10 percent will be in low-income communities, where utilities will also provide consumers with a rebate for charging stations. These investments will provide thousands of Massachusetts residents with access to EV charging stations.

The DPU did deny Eversource the right to use ratepayer funds for education and outreach. This is unfortunate, as our survey also found that most Northeast residents are not aware of the many incentives available for EV customers, both here in the Northeast and at the federal level.

What more needs to be done?

One big question was left out of today’s decision: how do we best manage EV charging to maximize the potential benefits to the electric grid?

The key issue is when EV charging takes place. If most people charge their EVs at night, or during times of high production of renewable electricity, then the transition to electric vehicles can make our electric system more efficient and speed the transition to renewables. This would mean significant cost savings.

On the other hand, if EV charging mostly happens during “peak” hours (such as morning and early evening), then adding more EVs onto the grid could strain existing electricity infrastructure and require additional investments in pipelines and power plants. This would both raise emissions and cost ratepayers money.

There’s a simple way to address this issue: provide a financial incentive for EV drivers to charge their vehicles during periods of low demand, a policy known as time-of-use (TOU) rates. The DPU decision today punts on this issue, accepting the utilities’ position that it will take time and additional data to determine how best to implement TOU rates. While we agree with the DPU that the most important priority is to get the charging infrastructure installed, this is an issue that we and others in the clean transportation community will be watching closely over the next few years.
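As a rough illustration of the incentive a TOU rate creates, the sketch below compares annual charging costs under a flat rate and an off-peak TOU price. Every number here is hypothetical, chosen only to show the mechanics.

```python
# Hypothetical comparison of annual EV charging costs: flat rate vs.
# time-of-use (TOU) rate with charging shifted to off-peak hours.

FLAT_RATE = 0.20       # $/kWh, assumed flat residential rate
TOU_OFF_PEAK = 0.10    # $/kWh, assumed overnight price
TOU_PEAK = 0.28        # $/kWh, assumed peak price (avoided by off-peak charging)

ANNUAL_CHARGING_KWH = 3_000  # roughly a typical year of EV driving

flat_cost = ANNUAL_CHARGING_KWH * FLAT_RATE
tou_cost = ANNUAL_CHARGING_KWH * TOU_OFF_PEAK  # driver charges overnight

print(f"Flat: ${flat_cost:,.0f}/yr  TOU off-peak: ${tou_cost:,.0f}/yr")
# The ~$300/yr gap is the reward for charging off-peak; charging at
# TOU_PEAK instead would cost more than the flat rate, as intended.
```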

Photo: Steve Fecht/General Motors

Great Lakes’ Great Changes: Temperatures Soar as the Climate Changes

Grand Haven pier extends into Lake Michigan, where average summer surface temperatures have risen markedly over recent decades. Photo: Rachel Kramer/Flickr

Lake Michigan is not yet a hot tub, but the warming of this Great Lake gives you much to sweat about.

In his office at the University of Wisconsin Milwaukee, Paul Roebber, a Distinguished Professor in atmospheric sciences and a former editor of the journal Weather and Forecasting, showed me his most recent climate change lecture slides. The most arresting graphics compare current surface water temperatures of the Great Lakes with those three and a half decades ago. The average summer surface temperatures have risen 8 degrees Fahrenheit since 1980.

Particularly stark was a spot Roebber pointed out, where a monitoring buoy floats way out in the middle of 100-mile-wide Lake Michigan, at a latitude between Milwaukee and Chicago. Two decades ago, average mid-July to September surface water temperatures in southern Lake Michigan ranged between 61 and 71 degrees. In 2016, they ranged between 67 and 77 degrees. On three separate days in 2016, temperatures hit 80. Surface water temperature changes near Milwaukee and Chicago were just as remarkable. On August 1, 1992, surface water temperatures were 61 and 65 degrees, respectively. On August 1, 2010, both were in the mid-70s.

“We’re starting to talk bath tub water and that is saying something about the changes,” Roebber said.

The future is almost unthinkable

Roebber’s comments certainly say something to me as a native of Milwaukee. I have vivid memories of childhood winters a half-century ago. We first- and second-graders were so acclimated to consecutive subzero days that when the high was 5 above, we’d walk to school with our coats unzipped and flying open.


Today, scientists predict a future climate unthinkable for a region where Green Bay Packers fans romanticize their home-team advantage in a stadium nicknamed the Frozen Tundra.

Roebber said that the modern lake warming has occurred with a rise of only a single degree in the air temperature over the Great Lakes over the last 30 years. But air temperatures are about to soar in scenarios where little or nothing is done to fight climate change. Researchers all around the Great Lakes and analysts at the Union of Concerned Scientists predict that the average summer highs of Milwaukee, currently about 80 degrees, could rise as high as 92 over this century.

The UCS analysis predicted that by 2100, Milwaukee would have nearly two months’ worth of days 90 degrees or higher, including three weeks’ worth of 100-degree scorchers. There would be at least one heat wave a summer with the sustained oppressive temperatures that killed hundreds of people in Chicago in 1995. Overall air quality would deteriorate as well, exacerbating asthma and other respiratory conditions.

In fact, the Upper Midwest region—including Milwaukee, Chicago, and Minneapolis—could collectively experience regular deadly heat waves with temperatures on the same scale that killed an estimated 70,000 people across Europe in 2003. “Under the higher-emissions scenario a heat wave of this magnitude would occur at least every fifth year by mid-century and every other year toward the end of the century,” the UCS analysis concluded.

Under worst-case scenarios, northern Illinois will have the climate of Dallas and southern Illinois will have the temperatures of Houston by the end of this century. As for Illinois’ neighbor to the north, Roebber notes, “Our climate in Wisconsin will look like Arkansas.”

Change is underway in the world’s largest surface freshwater system

It’s scary to contemplate what Lake Michigan could be compared to a century from now. The five Great Lakes comprise the world’s largest surface freshwater system, in a basin serving 30 million people. While many long-range projections of climate change along America’s eastern seaboard focus on chronic inundation from rising ocean levels, the lakes offer a different set of perplexing dilemmas.

Perhaps most perplexing is the year-to-year unpredictability of conditions. The general scenario of recent decades has been less ice cover in winter, which has allowed more water to evaporate and resulted in unprecedentedly low lake levels. But there can also be years where that trend is punctuated by ice-choked Great Lakes as the warming Arctic ironically creates a wavier jet stream.

The overall long-term trends, according to the University of Wisconsin Sea Grant Institute, point to all the bodies of water in the state being at risk.

“Longer, hotter, drier summers and increasing evaporation will result in warmer and shallower rivers, shrinking wetlands, and dried-up streams, flowages and wild rice beds,” the institute says. “Algal blooms will create anoxic conditions for aquatic life in ponds and many lakes.”

“These conditions will reduce the amount of suitable habitat available for trout and other cold-water fishes, amphibians and waterfowl. A two-degree rise in temperature could wipe out half of Wisconsin’s 2,700 trout streams. Hot dry conditions, coupled with more frequent thunderstorms and lightning, will increase the chance of forest fires. Red pine, aspen and spruce trees will disappear from our northern forests.”

A joint report by the University of Wisconsin and the state’s Department of Natural Resources predicts more climate-change losers than winners among fauna. As populations of European starlings, Canada geese, and gray squirrels grow, those of the purple martin, black tern, American marten, common loon, and various species of salamanders, frogs, and prairie birds may decline or disappear.

“This will result in a net loss to the state’s biodiversity and a simplification of our ecological communities,” the report said.

As for commercial activities, Roebber said there may be more ice-free days to allow more winter shipping, but fluctuating lake levels may play havoc with lakeshore-dependent businesses during the rest of the year, from expensive marina dredging operations to beach erosion in resort communities. Water quality may be degraded if low lake levels expose harmful chemicals. An additional wild card is the prospect of Wisconsin facing more weather extremes with heavy rains and floods dancing with more frequent short-term droughts.

“It’s not clear how much lower the lake will go, but the levels will become more variable,” Roebber said.

Sitting on our hands

This month, 13 federal agencies released the government’s latest major assessment that human activities are “the dominant cause” of the warmest period “in the history of modern civilization.” That report predicts a 9.5-degree rise in average temperatures in the Midwest under continued high-emission scenarios, the greatest rise of any region in the contiguous United States.

But it is not clear how much researchers will be able to refine their predictions. The Trump administration, despite approving the release of the congressionally mandated report, is in the midst of an unprecedented attack on climate change research. Climate change experts in the Interior Department have been reassigned. The Environmental Protection Agency has banned some scientists from speaking at climate change conferences. The Trump administration has proposed hundreds of millions of dollars of cuts to NASA and NOAA planetary and weather research that relates to climate change.

The assault is also happening at the state level. Last year, Wisconsin governor Scott Walker ordered state agencies not to comply with President Obama’s Clean Power Plan, and his DNR removed references to human activities as the root cause of climate change from its website. Despite its prior partnering with university researchers, the DNR currently says, “The earth is going through a change. The reasons for this change at this particular time in the earth’s long history are being debated and researched by academic entities outside the Wisconsin Department of Natural Resources.”

In this environment, exacerbated by years of prior Congressional budget cuts that constrict the chances of winning federal research grants, Roebber fears for the further erosion of the nation’s ability to protect lives and livelihoods with science.

Destructive weather events are virtually certain to increase. A report this fall by the Universal Ecological Fund calculates that weather events that currently cost the US $240 billion a year will increase to $360 billion annually over the next decade, the latter cost being equal to 55 percent of the current growth of the US economy.

“Facts used to be something we used to solve difficult things and innovate,” Roebber said. “Why the political process is now so destructive to such an important function of society and why the (political) climate has almost become antagonistic toward education is troubling. We’re sitting on our hands instead of accelerating the things we need to do.”

Hyping US Missile Defense Capabilities Could Have Grave Consequences

In response to North Korea’s latest ballistic missile test, which flew higher and farther than any of its previous launches, President Trump told Americans not to worry. “We will take care of it,” he said. “It is a situation that we will handle.”

The big question is how. Unfortunately, Trump’s assertion may rest on his unwarranted confidence in the US missile defense system. During a recent interview with Fox News host Sean Hannity about the threat posed by a potential North Korean nuclear strike, he declared that the United States has “missiles that can knock out a missile in the air 97 percent of the time.”

The facts, however, tell a different story.

The reality is that the US Ground-based Midcourse Defense (GMD) system has succeeded in destroying a mock enemy missile in only 56 percent of its tests since 1999. And, as I’ll explain, none of the tests approached the complexity of a real-world nuclear launch.

What’s more, ever since the George W. Bush administration, the GMD program has been exempt from routine Pentagon oversight and accountability procedures. The result? Fifteen years later, all available evidence indicates that it is still not ready for prime time, and may never be.

Of course, Trump is prone to exaggeration. In fact, he has averaged more than five lies per day since taking office. But it is critical to understand the potential ramifications of this particular Trumparian boast: It could lull Americans into a false sense of security and, even more alarming, embolden Trump to start a war. As veteran military reporter Fred Kaplan pointed out, if the president truly believes the US missile defense system is infallible, “he might think that he could attack North Korea with impunity. After all, if the North Koreans retaliated by firing their nuclear missiles back at us or our allies, we could shoot them down.”

Such wishful thinking could clearly lead to a disastrous miscalculation. And what’s worse, Trump just may believe his preposterous claim because he’s not the only one making it.

If You Repeat a Lie Often Enough…

Missile defense advocates have a long history of hyperbole. A 2016 report by the Union of Concerned Scientists included an appendix with a selected list of some three dozen statements administration and military officials have made extolling the GMD system’s virtues. They are incredibly consistent, and given the facts, consistently incredible.

In March 2003 — before the GMD system was even deployed — then-Undersecretary of Defense Edward Aldridge assured the Senate Armed Services Committee that its “effectiveness is in the 90 percent success range” when asked if it would protect Americans from the nascent North Korean threat.

Seven years later, in December 2010, then-Missile Defense Agency Director Lt. Gen. Patrick O’Reilly told the House Armed Services Committee’s strategic forces subcommittee that “the probability will be well over in the high 90s today of the GMD system being able to intercept” an Iranian intercontinental ballistic missile (ICBM) targeting New York City.

Fast forward to April 2016, when Brian McKeon, principal deputy undersecretary of defense for policy, testified before the Senate Armed Services Committee’s strategic forces subcommittee. “The US homeland,” he maintained, “is currently protected against potential ICBM attacks from states like North Korea and Iran if it was to develop an ICBM in the future.”

Wrong, wrong, and yet again, wrong. As Washington Post “Fact Checker” columnist Glenn Kessler wrote in mid-October, the claim that the GMD system has a success rate in the “high-90s” is based on “overenthusiastic” math. The system has succeeded only 56 percent of the time over the last two decades, but the calculation is predicated on a hypothetical, never-been-tested launch of four GMD interceptors with a 60-percent success rate producing a 97-percent chance of destroying one incoming ICBM. If one interceptor missed because of a design flaw, however, the other three would likely fail as well. “The odds of success under the most ideal conditions are no better than 50-50,” Kessler concluded, “and likely worse, as documented in detailed government assessments.”
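Kessler’s arithmetic is easy to reproduce. The 97 percent figure assumes four independent shots, each with a 60 percent kill probability; the sketch below shows that the independence assumption is doing all the work.

```python
# The salvo math behind the "high-90s" claim: four interceptors, each
# with an assumed 60% single-shot kill probability. The salvo fails only
# if all four miss -- which requires the misses to be independent.

single_shot_p = 0.60
salvo_p = 1 - (1 - single_shot_p) ** 4
print(f"{salvo_p:.1%}")  # 97.4%, the oft-quoted figure

# If a shared design flaw causes one interceptor to miss, the rest
# likely miss too: with correlated failures the salvo is no better than
# a single shot, consistent with the ~56% historical test record.
```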

No surprise, defense contractors also wildly overstate the GMD system’s capabilities.

This September on CNBC’s Squawk Box, Leanne Caret, president and CEO of Boeing’s Defense, Space & Security division, stated unequivocally that the GMD system would “keep us safe” from a North Korean attack. The system is “doing exactly what is needed,” Caret said, but added that it will ultimately require even more rocket interceptors from her company, the prime GMD system contractor since 1996. There are currently 40 interceptors in underground silos at Fort Greely in Alaska and four at Vandenberg Air Force Base in Southern California, all made by Boeing.

Raytheon CEO Thomas Kennedy, whose company produces the “kill vehicle” that sits atop Boeing’s interceptor, was equally sanguine about the GMD system when he appeared on Squawk Box the following month. “I say relative to the North Korean threat, you shouldn’t be worried,” Kennedy said. “But you should ensure that you’ve talked to your congressman or congresswoman to make sure they support the defense budget to the point where it can continue to defend the United States and its allies.”

Given such glowing reviews, it’s no wonder President Trump asked Congress for $4 billion for the GMD system and other programs, such as the ship-based Aegis system, designed to intercept short- to intermediate-range missiles. In a November 6 letter to lawmakers, Trump wrote: “This request supports additional efforts to detect, defeat, and defend against any North Korean use of ballistic missiles against the United States, its deployed forces, allies, or partners.”

The House of Representatives apparently is even more enthused about the GMD system’s much-touted capabilities. It passed a $700-billion defense authorization bill on November 14 that includes $12.3 billion for the Missile Defense Agency — more than triple what Trump requested. Some of that money would cover the cost of as many as 28 additional GMD interceptors, but lawmakers asked Defense Secretary Jim Mattis to develop a plan to add 60, which would increase the overall number of interceptors to 104.

Unrealistic, Carefully Scripted Tests

If members of Congress bothered to take a closer look at the GMD system’s track record, they would hopefully realize that committing billions more is throwing good money after bad. Even the most recent test, which the Missile Defense Agency declared a success, would not inspire confidence.

That test, which took place on May 30, resulted in a GMD interceptor knocking a mock enemy warhead out of the sky. At a press conference afterward, then-Missile Defense Agency Director Vice Adm. James Syring claimed it was “exactly the scenario we would expect to occur during an operational engagement.”

Not exactly. Yes, the Pentagon did upgrade its assessment of the GMD system in light of the May exercise, but — like previous tests — it was not held under real-world conditions.

In its 2016 annual report, the Pentagon’s Operational Test and Evaluation office cautioned that the GMD system has only a “limited capability to defend the U.S. homeland from small numbers of simple intermediate range or intercontinental ballistic missile threats launched from North Korea or Iran.” The “reliability and availability of the operational [interceptors],” it added, “are low.” After the May test, however, the office issued a memo stating that “GMD has demonstrated capability to defend the US homeland from a small number of intermediate-range or intercontinental missile threats with simple countermeasures.”

Despite this rosier appraisal, Laura Grego, a Union of Concerned Scientists (UCS) physicist who has written extensively about the GMD system, is not convinced that the latest test represents a significant improvement. After analyzing an unclassified Missile Defense Agency video of the May 30 exercise, she concluded that it was clearly “scripted to succeed.”

As in previous tests, system operators knew approximately when and where the mock enemy missile would be launched, its expected trajectory, and what it would look like to sensors, she said. And, like the previous tests, the one in May pitted one GMD interceptor against a single missile that was slower than an ICBM that could reach the continental United States, without realistic decoys or other countermeasures that could foil US defenses.

The key takeaway? The GMD system has destroyed its target in only four of 10 tests since it was fielded in 2004, even though all of the tests were held under improbably ideal conditions. If the tests had been more realistic, the deployed GMD system likely would be zero for 10. Moreover, the system’s record has not improved over time. Indeed, it flunked three of the four tests preceding the one in May, and not because the Missile Defense Agency made the tests progressively more difficult.

According to the 2016 UCS report Grego co-authored, a primary reason for the GMD system’s reliability problems is not funding, but lack of oversight. In its rush to get the system up and running, the George W. Bush administration exempted the program from standard military procurement rules and testing protocols. That ill-advised decision has not only run up the system’s price tag, which to date amounts to more than $40 billion, it also has produced a system that is incapable of defending the United States from a limited nuclear attack.

“Regardless of what President Trump and other missile defense boosters want us to believe, the data show that we can’t count on the current system to protect us,” said Grego. “We need to reduce the risk of a crisis escalating out of control. Only diplomacy has a realistic chance of doing that.”

Photo: Department of Defense

You Might Be Wasting Food, Even If You’re Not Throwing It Away


When I was a child, I was often told not to waste food, with phrases like “Clean your plate or no dessert,” “Just cut out that little spot. It’s a perfectly good banana,” and “Don’t put that in the back of the fridge. It’ll spoil and then we’ll have to throw it out.”

Now, half a century later, food waste has grown from family stories into a worldwide policy issue. A common estimate is that 40% of food is wasted. Scientific papers analyze consumers’ feelings about the sensory and social qualities of meals, and reducing waste is becoming just as much a concern as buying local, organic, and community-supported food. This issue is critical. Yet an important part of the food waste problem remains unseen.

This additional waste involves not the food that is thrown out because no one eats it—but the food we do eat.

Recent studies by an international group of researchers led by Peter Alexander of the University of Edinburgh have shown just how important this additional kind of waste is. Alexander and his colleagues have published a series of papers that give detailed, quantitative analyses of the global flows of food, from field to fork and on into the garbage can. The results are striking. Only 25% of harvested food, by weight, is consumed by people. (Measuring food by its energy values in calories or by the amount of protein it contains, rather than by its dry weight, does increase the numbers but only a bit—to 32% and 29% respectively.)

But beyond these overall figures, Alexander and colleagues point to the importance of two kinds of waste in the ways in which we do eat our food, but in an extremely inefficient way. One is termed “over-consumption,” defined as food consumption in excess of nutritional requirements. (For the purposes of this discussion, I am referring to food consumption in excess of caloric requirements. However, it is critical to note that calories consumed only tells a small part of the story. A complete analysis would include the quality of the foods consumed and the many systemic reasons why we “over-consume”—including the structure of the food industry, the affordability of and access to processed foods relative to healthier foods, etc. But that is the subject for several books, not one blog post.)

Even using a generous definition of how much food humans require (e.g., 2342 kcal/person/day, compared to the 2100 kcal used in other studies), Alexander et al. find that over-consumption is at least comparable in size to the amount of food that consumers throw out (“consumer waste”). This is shown in the graphic below, in which, in each column, the uppermost part of each bar (dark purple) represents over-consumption and the second-to-top section (light purple) shows consumer waste.

Losses of harvested crops at different stages of the global food system. The four columns represent different ways to measure the amount of food: from left to right, by dry weight, calories, protein, and wet weight. Source: Figure 4 of Alexander et al., 2017, Agricultural Systems; DOI: 10.1016/j.agsy.2017.01.014.

So, it turns out that for many people, reducing consumption could improve health while also saving food, and with it the many resources that go into growing and distributing it.

But neither overconsumption nor consumer waste is the largest way we waste the resources that can be used to produce food. That turns out to be livestock production—the dark red sections in the graphic above. Livestock are an extremely inefficient way of transforming crops (which they use as feed) into food for humans, with loss rates ranging from 82% (in terms of protein) up to 94% (by dry weight) once all of the feed they consume during their lifespans is considered. It’s not food that goes into our garbage or landfills, but it represents an enormous loss to the potential global supply of food for people just the same.

The reasons have to do with ecology: when we eat one level higher on the food web we’re losing about 90% of the edible resources from the level below.
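A back-of-the-envelope version of that point, using the roughly 90% per-level loss cited above (the true rate varies by product, as the 82-94% range shows):

```python
# Crop calories required per calorie of food actually eaten, assuming a
# ~90% loss per step up the food web (an approximate, illustrative figure).

LOSS_PER_TROPHIC_LEVEL = 0.90

def crop_calories_needed(food_calories, levels_up=1):
    """Crop calories needed to deliver `food_calories` to a person."""
    efficiency = (1 - LOSS_PER_TROPHIC_LEVEL) ** levels_up
    return food_calories / efficiency

print(round(crop_calories_needed(500, levels_up=0)))  # 500 kcal eaten as crops
print(round(crop_calories_needed(500, levels_up=1)))  # ~5,000 kcal via livestock
```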

Achieving the ultimate goals of reducing food waste—for example, reducing environmental consequences and ensuring more people have access to foods that meet their nutritional requirements—will of course require additional and critical steps. For example, additional food doesn’t help if it isn’t nutritious or can’t be accessed by the people who need it. Also, spared land doesn’t help if that land isn’t managed in a way that contributes to a healthier environment. However, thinking more about all types of food waste can help us to find better ways to protect our natural resources while producing and distributing healthy food for all.

The results of these new analyses should expand what we think of when we hear the words “food waste.” Yes, it includes the food we buy but don’t eat—the vegetables we leave on our plates and the bananas we throw into the compost bin—and it’s very important to develop habits and policies to reduce this waste. But we also need to confront the wastefulness in what we do eat, by asking: how much and what kind of food should we be buying in the first place?

Climate Summit Makes Progress Despite Trump, But Much More Urgency Is Needed

The Fijian COP23 presidency placed this sea-faring canoe outside of the main plenary hall in Bonn, symbolizing that when it comes to climate change, we are all in the same boat. Photo: By the author.

As the 23rd meeting of the Conference of the Parties (COP23) to the United Nations Framework Convention on Climate Change—or the annual UN climate talks—opened in Bonn, Germany on November 6, the urgency for much greater action on climate change could not have been more clear.  Just two days earlier, Typhoon Damrey barreled into Vietnam, resulting in 69 deaths and nearly $1 billion in damages.  The storm was the worst to hit the southern coastal region of Vietnam in decades, and came on the heels of Hurricanes Harvey, Irma, and Maria, which devastated communities in Texas, Florida, Puerto Rico, and several Caribbean islands; as well as raging forest fires in western North America and Brazil; heatwaves in Europe; and floods in Bangladesh, India, and Nepal.

The week before COP23 started, the United Nations Environment Program released its annual Emissions Gap Report, which found that the global warming emission reduction commitments put forward by countries under the Paris Agreement “cover only approximately one-third of the emissions reductions needed to be on a least cost pathway for the goal of staying well below 2°C.”

The report said that current commitments make a temperature increase of at least 3°C above pre-industrial levels by 2100 very likely, and that if this emissions gap is not closed by 2030, it is extremely unlikely that the goal of holding global warming to well below 2°C can still be reached.  The report’s warning was reinforced by an analysis released by the Global Carbon Project during the talks, projecting that after three years in which global CO2 emissions have remained flat, they are likely to increase by 2% in 2017.

The UNEP report contains good news as well, outlining practical ways to slash emissions in the agriculture, buildings, energy, forestry, industry and transport sectors, along with actions to control hydrofluorocarbons and other high-potency greenhouse gases.  The report finds that nominal investments in these sectors could help to avoid up to 36 GtCO2e per year by 2030.  Almost two-thirds of this potential is from investment in solar and wind energy, efficient appliances, efficient passenger cars, afforestation and stopping deforestation — actions which have modest or net-negative costs; these savings alone would put the world well on track to hitting the 2°C target.

In the context of these risks and opportunities, the progress made at COP23 was far too modest compared to what is needed.  But negotiators did succeed in laying the groundwork for more substantial achievements down the road, and the fact that countries pushed ahead despite President Trump’s announced intention to withdraw the United States from the Paris Agreement is in itself a welcome accomplishment.

Getting the rules right

A major focus of the negotiations in Bonn was on hammering out the detailed rules (or “implementation guidelines”) for the Paris Agreement, on a range of issues including transparency and reporting, accounting standards for both emissions and finance, the new market mechanisms created in the agreement that would allow reductions achieved in one country to be credited against another country’s emissions reduction commitments, how to raise the ambition of national actions over time, and actions needed to cope with the mounting impacts of climate change.

Countries had set a goal in Paris of resolving these and other implementation issues at the 2018 climate summit in Poland next December, so there was no expectation of final agreements on any of these issues at COP23.  Rather, the objective at COP23 was to narrow the differences amongst countries and to clearly frame the options on the key issues involved, so as to facilitate their resolution next year.

Progress was made across the range of rulebook topics, but it was uneven.  A bright spot was on the sensitive issue of transparency and reporting, where differences were narrowed and a fairly clear set of options was laid out.

By contrast, the negotiations on “features” of the “nationally-determined contributions” that countries are required to put forward under the Paris Agreement, as well as accounting standards for these NDCs and the up-front information requirements to ensure their “clarity, transparency, and understanding,” were much more polarized, and the end result was an unwieldy 179-page list of issues and options.

The most charged discussions were around finance, specifically the requirement in Article 9.5 of the Paris Agreement that every two years developed countries must provide “indicative quantitative and qualitative information” on their future support for developing countries, including, “as available, projected levels of public financial resources to be provided.” The African Group of countries pushed for more clarity and detail on this projected financial support by developed countries for developing country actions, a move that was strongly opposed by the U.S. and other developed countries.

Developing countries want greater certainty about the financial resources available to them going forward, so they can plan projects accordingly; but developed countries are loath to make multi-year commitments that they can be held accountable for. This issue will be revisited at the intersessional meeting in Bonn next spring, and then brought to ministers at COP24 in Poland in December 2018.

We left Bonn not with the draft negotiating text on the Paris rules that some had hoped for, but instead with a set of “informal notes” produced by the co-facilitators of each of the working groups, which capture and organize the proposals put forward by countries.  Much work lies ahead to meet the goal of adopting the full Paris rulebook at COP24, and while negotiators can work out some of the technical details in advance, it will clearly be up to ministers to resolve the political differences on the major crunch issues.

Catalyzing higher ambition

The decision adopted in Paris explicitly acknowledged the substantial shortfall in collective ambition that could keep the world from meeting the aggressive temperature limitation goals embodied in the Paris Agreement, and called for a “facilitative dialogue” at COP24 next year to address ways to close this gap.  Working with last year’s Moroccan COP22 presidency, Fiji put forward its vision of how this process should be conducted, renaming it the “Talanoa dialogue.” As Fiji explains, “Talanoa is a traditional approach used in Fiji and the Pacific to engage in an inclusive, participatory and transparent dialogue; the purpose of Talanoa is to share stories, build empathy and trust.”

This will be a year-long process consisting of a preparatory phase starting in early 2018 and a political phase involving ministers at next year’s climate summit in Poland. The dialogue will be structured around three key questions: “Where are we? Where do we want to go? and How do we get there?”  One major input will be the Special Report of the Intergovernmental Panel on Climate Change examining the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways, scheduled for completion next October.  Additional analytical and policy-relevant inputs will be welcomed in the preparatory phase, not just from countries but from NGOs, businesses, research institutions, and other stakeholders as well.

To succeed, this process must do more than reaffirm the ambition gap; it must spur concrete steps to close it. A central focus will be on the need for countries to signal, by 2020, their intention to raise the ambition of their existing commitments between now and 2030. But the dialogue should also examine how states, cities, businesses and other “non-state actors” can contribute to closing the ambition gap, and encourage a range of sectoral initiatives on renewable energy, energy efficiency, forestry and agriculture solutions, carbon pricing, and other areas.

The Talanoa dialogue process will be jointly led by Fiji and Poland, as the current and incoming COP presidencies. Given Poland’s heavy dependence on coal-generated electricity, there are legitimate concerns about that government’s interest in generating the specific outcomes from the dialogue needed to enhance ambition.  It is clearly up to all countries to ensure the dialogue stays on track and produces meaningful results.

Dealing with climate impacts

Even if we are able to close the emissions gap and hold temperature increases well below 2 degrees Celsius, as leaders committed to in Paris, the world is going to suffer increasing climate impacts over the next several decades as a result of the emissions we have already put into the atmosphere. Developing countries, together with environmental and development NGOs, pushed in Bonn for faster progress on helping vulnerable countries and affected communities cope with these impacts, both through enhanced measures to adapt to current and future impacts and through strategies to deal with the now-unavoidable “loss and damage” they are facing, from “slow-onset” impacts such as sea level rise and desertification as well as from typhoons, hurricanes, floods, and other extreme events. At COP19 in Poland in 2013, countries established the Warsaw International Mechanism for Loss and Damage (or “WIM”), and explicit provisions on loss and damage were included in the Paris Agreement.

Sadly, not enough was accomplished in Bonn on this front. Five European countries did pledge a total of $185 million in renewed support for the Adaptation Fund and the Least Developed Countries Fund. But developed countries blocked a push by vulnerable countries to make mobilizing the much greater level of financial resources needed to deal with loss and damage a standing agenda item at future negotiating sessions. All they would agree to was an “expert dialogue” on the issue at next spring’s subsidiary body meetings in Bonn, which in turn will inform technical analysis on financial resource mobilization for loss and damage activities already being undertaken by the WIM.

Expect this issue to continue to be a major topic of debate in the negotiations going forward, including at COP25 in late 2019, where countries have agreed to conduct a full-blown review of the WIM.

The elephant in the room

When President Trump announced in June of this year his intention to withdraw the United States from the Paris Agreement, there was widespread condemnation from other countries, as well as from business and civil society both in the United States and around the world.  Not one country indicated that they intended to follow President Trump out the door; in fact, during the first week of the Bonn climate summit, the only other Paris Agreement holdout, Syria, announced that it intended to join all the other countries of the world in the agreement, rather than be lumped in with the United States as a climate scofflaw.

The U.S. negotiating team in Bonn kept a low profile, hewing largely to past positions on issues like transparency and reporting for developing countries and robust accounting standards.  They were quite tough in the negotiations on climate finance and loss and damage, though, perhaps out of concern that any sign of flexibility risked an unhelpful response from the Tweeter-in-Chief.

White House staff organized a side event on the role of coal, nuclear, and gas technologies as climate solutions, which generated a well-organized and creative protest led by U.S. youth groups.  It was also overshadowed by the launch of the Powering Past Coal Alliance, a coalition of 20 countries led by Canada and the United Kingdom that is committed to phasing out use of coal no later than 2030.

California Governor Jerry Brown, former New York Mayor Michael Bloomberg, and other officials at the Nov. 11th launch of America’s Pledge at the U.S. Climate Action Center in Bonn. Photo: By the author.

But the real energy at the Bonn climate summit came from the We Are Still In initiative of university presidents, mayors, governors, business leaders, and NGOs who showcased their steps to reduce climate pollution and pledged their intention to meet America’s emissions reduction commitments under Paris, regardless of President Trump’s efforts to dismantle federal leadership on climate policy.

Through an intensive schedule of side events, press briefings, and bilateral meetings with ministers and business leaders from other countries, this U.S. subnational delegation went a long way to assuring the rest of the world that President Trump represents a short-term deviation in U.S. policy, not a long-term trend.  Of course, until there is a clear demonstration of bipartisan political support at the federal level for climate action, other countries will understandably continue to harbor concerns about the reliability of the United States as a partner in this endeavor.

What lies ahead

Negotiators will reconvene in Bonn on April 30 for a two-week session of the UNFCCC’s subsidiary bodies, working to make progress across the range of issues to be decided at COP24 in Katowice, Poland next December, and Fiji and Poland will convene several informal ministerial discussions over the course of 2018 focusing on the key political decisions that must be reached at COP24.

There are a number of other events where ministers and even heads of state will be discussing ways to enhance climate action over the next year, including:

  • The One Planet Summit being convened by French President Emmanuel Macron in Paris, with a focus on mobilizing increased public and private sector climate finance.
  • Two more sessions of the Ministerial Meeting on Climate Action (MOCA), a dialogue launched by Canada, China, and the European Union in Montreal in September; the next meeting will be hosted by the EU next spring, followed by a meeting hosted by China next fall.
  • The ninth meeting of the Petersberg Climate Dialogue, a ministerial-level discussion to be co-hosted in mid-2018 by Germany and Poland, as the incoming presidency of the Conference of the Parties.
  • The G7 leaders’ summit, to be hosted by Canada on June 8th and 9th.
  • The Global Climate Action Summit being hosted in San Francisco next September by Gov. Jerry Brown, which will bring together national, state and local political leaders, businesses, scientists, non-profits and others to “showcase the surge of climate action around the world – and make the case that even more needs to be done.”
  • The G20 leaders’ summit, hosted by Argentina and starting just two days before COP24, on November 30th.  Leaders should build on the Climate and Energy Action Plan adopted at the G20 summit last July under the German presidency, which was agreed to by all G20 countries except for the United States.

All of these events can – and must – contribute to accelerated progress at COP24 in Katowice and beyond in implementing and strengthening the Paris Agreement. As the UNEP report and other analyses clearly show, we have the solutions we need to address the crisis we face; what we need now is a much greater level of political will.

Which States are Most Energy-Efficient? Here are the Latest Results

Adding insulation to your attic is an effective step to improve the efficiency of your home, save money, and cut carbon emissions.

Autumn makes me think of leaves colored orange and amber and red, of the smell of cinnamon and nutmeg wafting from a range of desserts… and of states vying for top honors in the annual state ranking of energy efficiency policies and progress.

The leaves are mostly done, and the desserts are in my belly. But the latest ranking from the American Council for an Energy-Efficient Economy is out and available, and ready for sampling. It’s always a beautiful sight and a tasty treat.

Energy efficiency – Why and how?

Energy efficiency is already one of the main tools we use for meeting new energy demand. Why it makes sense as a tool is clear, as the new report says:

[Energy efficiency] creates jobs, not only directly for manufacturers and service providers, but also indirectly in other sectors by saving energy and freeing up funds to support the local economy. Efficiency also reduces pollution, strengthens community and grid resilience, promotes equity, and improves health.

The annual scorecard “ranks states on their efficiency policies and programs, not only assessing performance but also documenting best practices and recognizing leadership.” ACEEE does that by looking at a range of metrics shaped by each state’s efforts (a toy sketch of the scoring mechanics follows the list):

  • Utility and public benefits programs and policies
  • Transportation policies
  • Building energy codes and compliance
  • Combined heat and power (CHP) policies
  • State government–led initiatives around energy efficiency
  • Appliance and equipment standards
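
To make the mechanics concrete, here is a minimal sketch of how a weighted scorecard like this turns category scores into a ranking. The category weights and state scores below are invented for illustration; they are not ACEEE’s actual point allocations or data.

    # Toy weighted scorecard: each state earns a fraction of the points
    # available in each category; states are ranked by total points.
    # Weights and scores are hypothetical, not ACEEE's.
    weights = {
        "utility programs": 20.0,
        "transportation": 10.0,
        "building codes": 8.0,
        "CHP": 4.0,
        "state initiatives": 5.0,
        "appliance standards": 3.0,
    }

    scores = {
        "State A": {"utility programs": 0.9, "transportation": 0.8,
                    "building codes": 0.7, "CHP": 0.5,
                    "state initiatives": 0.8, "appliance standards": 0.6},
        "State B": {"utility programs": 0.6, "transportation": 0.5,
                    "building codes": 0.9, "CHP": 0.4,
                    "state initiatives": 0.5, "appliance standards": 0.2},
    }

    # Total points = sum over categories of (weight x fraction earned).
    totals = {state: sum(weights[c] * frac for c, frac in cats.items())
              for state, cats in scores.items()}
    for rank, (state, pts) in enumerate(
            sorted(totals.items(), key=lambda kv: -kv[1]), start=1):
        print(f"{rank}. {state}: {pts:.1f} points")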


ACEEE state energy efficiency scorecard rankings, 2017

Who’s on top?

The highlighted states include some familiar faces plus a few new ones. The top states were the same in 2017 as in 2016, reflecting the strong focus on efficiency in certain parts of the country:

  • Massachusetts took the top spot for the seventh straight year, and stood alone at the top (after tying with California for 2016 honors). Northeast states also took third (Rhode Island), fourth (Vermont), sixth (Connecticut), and seventh (New York).
  • The West Coast states garnered high marks, too, taking second (California), fifth (Oregon), and seventh (Washington).
  • The Midwest also made a good showing, at ninth (Minnesota) and eleventh (Illinois and Michigan, tied).

ACEEE makes a point of calling out some “most improved” states, too, and this year that brought in states from other parts of the country:

  • Idaho was the most improved, jumping up seven spots and landing in the middle of the pack—its best performance, says ACEEE, since 2012—due to investments in “demand-side management”, increased adoption of electric vehicles, and building energy code improvements.
  • Florida gained three spots in part due to its work on energy efficiency for the state’s farmers.
  • Virginia moved up four notches, helped in part by its work to strengthen building energy codes in the state.

The savings add up. (Source: ACEEE state energy efficiency scorecard)

How do states take it to the next level?

No state got a perfect score, ACEEE points out, so every state has room for improvement. Fortunately, the report offers a few tips on how to make that happen:

  • Establish and adequately fund an energy efficiency resource standard (EERS) or similar energy savings target.
  • Adopt policies to encourage and strengthen utility programs designed for low-income customers, and work with utilities and regulators to recognize the nonenergy benefits of such programs.
  • Adopt updated, more stringent building energy codes, improve code compliance, and involve efficiency program administrators in code support.
  • Adopt California tailpipe emission standards and set quantitative targets for reducing VMT [vehicle miles travelled].
  • Treat cost-effective and efficient CHP [combined heat and power] as an energy efficiency resource equivalent to other forms of energy efficiency.
  • Expand state-led efforts—and make them visible.
  • Explore and promote innovative financing mechanisms to leverage private capital and lower the up-front costs of energy efficiency measures.

But we’re making progress, and leading states are demonstrating what a powerful resource energy efficiency is.

And with a federal administration that seems determined to move backward on clean air and water by propping up coal, and backward on climate action, state action on clean energy is more important now than ever.

So congrats to the efficiency leaders among our states, and thanks.


Lessons from the Land and Water Songs to Heal

Photo: Samantha Chisholm Hatfield

Recently, I was fortunate to be selected as an HJ Andrews Visiting Scholar, and was able to complete an HJ Andrews Scholar Writing residency, where I had the incredible opportunity to view the forest area through a Traditional Ecological Knowledge lens.

I had scheduled the residency specifically so that I could take my child along, teaching Traditional Knowledge as it has been taught to me, passing along generations of information and skills in areas that had been historically traversed by ancestors. There were times when I doubted my decision, as complaints of spotty wifi access began. That quickly subsided as complaints turned to questions, and I knew I had made the correct decision. Spiritually my child felt it; there was connection again, as I’d hoped.

Photo: Samantha Chisholm Hatfield

My child and I sat at the river’s edge, watching the water roll by. We discussed the water, and the tall trees and the bushes that walked alongside the water’s path. We discussed the tiny bugs skimming around on the water, and the spiders, and the rocks. We joked about how Sasquatch must love this area because of the incredible beauty. Time stopped, and the symphony of wind and water rose around us as we watched branches and flowers dance and sway.

At one point my child broke out in traditional song. To most, this would not seem unusual, but to those who live traditionally, this is spectacular. It was song that came to him, gifted through, and from the waters, about the water and the beauty he found. The water ran clean, and the birds sang freely.

This is who we ARE. As Native People, we are living WITH the land, rather than simply ON it. We engage with the tiniest of tiny, as well as with the largest of large. This is a concept that many cannot fathom. Reciprocity with the land is at the core of where we come from, and has been a basis for our survival as well as our identity. It has been essential that we as Native people continue to nurture the land as it nurtures us. Reciprocity is in traditional information, and is an everyday integrated expectation, that fosters well-being of ourselves and our identification as Natives.

Reciprocity with the land

Photo: Samantha Chisholm Hatfield

Our identity is connected with every tiny droplet. Every tiny speck of dust. Every rock, every tree, every winged, every insect, and four-legged. We are one among many, we do not have dominion over, but rather have congruence with.

It is not vital that we share the same communication language, it is not vital that we appear in the same form. The tiny fly deserves as much respect as the bison, or the person standing next to me. Those of us who work to protect have been given orders to do so, often by our Elders, who are at the forefront of holding our wisdom. Oral histories and Traditional Knowledges hold information and instructions that direct and guide us. There is a belief that we are entrusted to care for the earth, and for the seventh generation to come, so that life, and the earth, will remain just as it is currently, if not better for our future generations.

We are borrowing the resources that we live with, caring for the investment of life that we are blessed with. We are taught to have forward-thinking vision in our actions. We work for all, even for those who are antagonists. We do so because we have been gifted visions by our ancestors of what Seven Generations means, and what it takes to get there. Vision of how to care for a world that is quickly losing its grip on the reality of situations that dominate, destroy, and devalue knowledge. Vision of what needs to be repaired, who needs to be helped, and what path needs to be walked.

Respecting how much Traditional Knowledges can teach us

Many question the validity of TEK and are not able to ‘connect the dots’. It is difficult to view a system from an alternative perspective if you have not grown up in it or been enculturated to it. It can seem foreign and be discounted as baseless. Western mainstream thought promotes the “dominion over” ideology: controlling and manipulating that which would challenge or hinder human desires. Reciprocity and gentleness, by contrast, are values taught and held in high esteem in many Native communities.

There is no separation between the environment and ourselves; it is a knowing that what befalls the land befalls The People.

There are no escape diversions, no malls to buy excuses from, no spas to run to for the weekend.

Our escapes come in the form of clear streams, and old growth towering majestically, in the form of waves crashing on shores and dirt under our feet. We are guided alongside teachings of congregations of the finned, and the winged, the hooved, and the crawlers. Our songs, our prayers, our way of life depends on these aspects, but only when they are connected, and healthy.

Half a book, half a lesson, half a river, half a tree, half a story cannot teach. It cannot sustain culture, it cannot sustain life. Anyone’s.

The integration of knowledge is often viewed as an interloper, incongruent and irrelevant to the daily lives of westernized systems of thought. This could not be further from the truth.


Dr. Samantha Chisholm Hatfield is an enrolled member of the Confederated Tribes of Siletz Indians, from the Tututni Band, and is also Cherokee. She earned a doctorate in Environmental Sciences from Oregon State University, focusing on Traditional Ecological Knowledge (TEK) of Siletz Tribal Members. Dr. Chisholm Hatfield’s specializations include Indigenous TEK, tribal adaptations due to climate change, and Native culture issues. She’s worked with the Oregon Climate Change Research Institute and successfully completed a post-doctoral research position with the Northwest Climate Science Center. She’s spoken at the national level, at venues such as the First Stewards Symposium, the National Congress of American Indians, the Northwest Climate Conference, and webinars. She’s helped coordinate tribal participation for the Northwest Climate Science Center and Oregon State’s Climate Boot Camp workshops. Her dissertation has been heralded nationally by scholars as a template for TEK research, and remains a staple conversation item for academics and at workshops. She is a Native American Longhouse Advisory Board member at Oregon State University, was selected as an H.J. Andrews Forest Visiting Scholar, is actively learning Tolowa and Korean, and continues her traditional cultural practices. In her spare time she dances traditionally at pow wows, spends time with family, and is the owner of a non-profit organization that teaches the game of lacrosse to disadvantaged youth.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.


How Much Does it Cost to Charge an Electric Car in Your City?

Everyone can see what gasoline costs, but how much does electricity cost for recharging an electric car? (Photo: Tewy, CC BY 2.5, Wikimedia)

Most drivers know how much it costs to fill the tank with gasoline. It’s hard to miss the glowing numbers at the corner station. But how much does it cost to recharge an electric car? And how much money do EVs save drivers compared to gasoline-powered cars? To help answer these questions, our new report, “Going From Pump to Plug,” looks at the price of recharging an EV at home in the fifty largest cities in the US, as well as at public charging stations.

Charging an EV at home can be much cheaper than gasoline

The findings from large cities across the US make the answer clear: for every electricity provider we looked at, charging an EV is cheaper than refueling the average new gasoline vehicle.

Compared to using the average new gasoline car, driving on electricity would save on average almost $800 per year in fuel costs.

Find EV savings in your city:

http://www.ucsusa.org/ev-savings

However, where you live and which electric rate plan you choose can change your savings. For almost all EV drivers, choosing a time-of-use (TOU) electric rate plan is the key to seeing the largest savings.

A TOU plan gives cheaper electric rates during off-peak periods (often late at night), with higher rates for using electricity during high-demand times. Because most EVs are parked at home overnight, TOU rates are a good fit for most EV drivers.

In some cities, especially in California, TOU rates are essential for saving money on fuel costs. For example, at my home in Oakland, CA, recharging on the standard electricity plan is equivalent to buying gasoline at $3.34/gallon, while the TOU plan costs the equivalent of only $1.03/gallon.
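
If you want to reproduce this kind of comparison yourself, the arithmetic is simple: multiply the electricity rate by the EV’s energy use per mile to get a cost per mile, then scale by the fuel economy of the comparison gasoline car. Here is a minimal Python sketch; the EV efficiency, comparison-car fuel economy, and electricity rates are illustrative assumptions chosen to roughly reproduce the Oakland example, not figures from the report.

    # Convert an electricity rate into an equivalent gasoline price.
    # All constants below are illustrative assumptions, not report inputs.
    EV_KWH_PER_MILE = 0.30   # assumed EV efficiency
    COMPARISON_MPG = 28.0    # assumed fuel economy of the comparison car

    def gas_equivalent_price(rate_usd_per_kwh: float) -> float:
        """Gasoline price ($/gal) at which the comparison car's per-mile
        fuel cost equals the EV's per-mile electricity cost."""
        cost_per_mile = rate_usd_per_kwh * EV_KWH_PER_MILE
        return cost_per_mile * COMPARISON_MPG

    print(f"${gas_equivalent_price(0.40):.2f}/gal")  # ~$3.36 at a 40-cent rate
    print(f"${gas_equivalent_price(0.12):.2f}/gal")  # ~$1.01 at a 12-cent off-peak rate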

Public EV charging costs are variable

Costs to charge at public charging stations vary considerably. Some stations are free, while others can cost over twice as much as home charging. However, the impact of public charging costs is often muted because most charging happens at home. For example, a San Francisco driver who uses higher-cost DC fast charging for 20 percent of their charging would see their average fuel cost rise only from $0.78/gallon equivalent to $1.35/gallon.
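
The blended figure is just a usage-weighted average of home and public charging costs. A sketch of that calculation follows; the DC fast-charging price is backed out from the San Francisco example above (an inference, not a reported value).

    # Usage-weighted average of home and DC fast-charging costs,
    # expressed in gasoline-equivalent dollars per gallon.
    home_equiv = 0.78   # $/gal equivalent at home (from the example above)
    fast_equiv = 3.63   # $/gal equivalent at DC fast chargers (inferred)
    fast_share = 0.20   # fraction of charging done at fast chargers

    blended = (1 - fast_share) * home_equiv + fast_share * fast_equiv
    print(f"${blended:.2f}/gal equivalent")  # ~$1.35, matching the example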

Savings on maintenance, too

Drivers of battery electric vehicles can also have significantly lower maintenance costs. These EVs have no engine, so there are no oil changes, spark plugs, or engine air filters to replace. Instead, the electric motors and batteries require little to no attention. This means less time and money spent on routine car maintenance. Comparing the Chevy Bolt EV to the gasoline-powered Chevy Sonic, the Bolt owner will spend over $1,500 less on scheduled maintenance over the first 150,000 miles.

Policies needed to ensure all can access these EV benefits

Electric vehicles can save drivers money on fuel and maintenance while also helping to reduce global warming emissions and air pollution. However, good policies are needed to make sure that everyone can access the benefits of EVs.

  • Buyers need to be able to afford EVs. Currently, EVs cost more to manufacture than similar-sized gasoline cars. These manufacturing costs are coming down as EV production volumes increase and technology advances, but federal, state, and local purchase incentives are vital to accelerate the transition from gasoline to electricity.
  • Policies are needed to ensure that everyone can recharge an EV at a price lower than the cost of gasoline. Regulators and electricity providers should ensure that EV customers can access lower-cost electricity rate plans, which are key to making EVs a reliable and affordable alternative to gasoline vehicles. Solutions are also needed for those who cannot charge at home and those who must drive long distances, so access to reliable and affordable public charging, especially fast-charging stations, is essential. Public policies that improve charging options at apartments and other multi-unit dwellings will also broaden the base of drivers who can choose an EV.
  • Public policies should require manufacturers to produce higher volumes of EVs and encourage a greater diversity of electric-drive models and sizes. There are many more models of EVs available now than just a few years ago, but some types of vehicles, such as pickup trucks, still lack electric options. Also, not all manufacturers offer EVs nationwide, making it more difficult for buyers to find and test drive an EV.

Policies like these can help ensure that everyone has access to EVs and can make personal transportation choices that both save them money and reduce their carbon footprint.

The Senate Tax Bill: Just Say No

By the end of this week, the Senate is expected to vote on the tax cut bill reported out of the Senate Finance Committee earlier this month.  Changes in the bill will likely be made right up to the end, as Republican leaders struggle to secure the 50 votes needed to approve the bill under budget “reconciliation” rules (normally, it takes 60 votes to move major bills through the Senate).

At least six Republican Senators are reported to have serious concerns about the bill, either because they fear it would add too much to the deficit or because it favors large corporations more than small business owners. If three or more of those Senators end up opposing the bill (and no Democrats break ranks and support it), the bill will die.  For the reasons outlined below, it should.

Equity is in the eye of the campaign contributor

As was the case with the tax bill passed by the House on November 16th, there’s been a fierce debate over the distributional impacts of the Senate bill.

The nonpartisan Tax Policy Center finds that if the bill becomes law, most taxpayers would see a reduction in their tax bills in the years out to 2025 – although the cuts would be heavily skewed towards the top 1 percent of the income distribution (households with more than $750,000 in annual income).

But this changes dramatically in 2026 and beyond, because of Senate Republicans’ decision to make the corporate tax cuts permanent while sunsetting the individual tax cut provisions after 2025 (they did this to comply with the prohibition on increasing the deficit after ten years when using the reconciliation process).  As a result, by 2027, the TPC projects that some 50 percent of taxpayers would see an increase in their tax bills, while only 28 percent would still be getting a tax cut.  And once again, the impacts would be skewed: for taxpayers with incomes in the top 0.1 percent of all Americans, less than 2 percent would see an increase in their taxes, while 98 percent would enjoy a tax cut averaging nearly $224,000 in 2027 alone.

The Senate bill also eliminates the tax penalty that individuals who choose not to purchase health insurance must pay under the Affordable Care Act, in order to achieve deficit reductions that can be used to offset the cost of the permanent reductions in corporate tax rates.

The Congressional Budget Office estimates this will reduce the deficit by $338 billion over the next ten years, as the number of Americans with health insurance would decrease by 13 million by 2027, reducing government outlays both for Medicaid and for subsidies for individuals purchasing health insurance in the ACA’s marketplace.  Meanwhile, health care premiums would increase by 10 percent for individuals in the non-group marketplace, compared to the baseline.

This is Robin Hood in reverse – robbing the poor to pay the rich – and represents yet another effort to dismantle the Affordable Care Act without putting anything credible in its place to deal with the health care needs of millions of Americans.

Deficit, schmeficit

Some Republicans have claimed that the Senate’s tax cuts will largely pay for themselves as a result of higher economic growth rates. But analysis using a highly respected economic model estimates the Senate bill would increase the deficit by some $1.4–$1.6 trillion over the next ten years; this closely tracks the $1.4 trillion deficit increase estimate from the official Congressional scorekeeper, the Joint Committee on Taxation. And of course, these estimates assume that Congress allows the individual tax cuts to expire after ten years, allows the generous business deduction for investments in factories and equipment to expire after five years, and allows other tax increases scheduled to take effect in 2026 to stand. If (as is more than likely) those provisions were to be reversed by a future Congress and President, the resulting deficit would swell further, creating even greater pressure for cuts in Medicaid, Medicare, food assistance, and other programs that benefit low- and middle-income families, along with reduced investments in scientific and medical research, education and job training, infrastructure, and other public goods.

As I’ve noted previously, federal government investments in science research and innovation have led to discoveries that have produced major benefits for our health, safety, economic competitiveness, and quality of life.  This includes MRI technology, vaccines and new medical treatments, the internet and GPS, earth-monitoring satellites that allow us to predict the path of major hurricanes, clean energy technologies such as LED lighting, advanced wind turbines and photovoltaic cells, and so much more.

The work of numerous federal agencies to develop and implement public and worker health and safety protections against exposure to toxic chemicals, air and water pollution, workplace injuries, and many other dangers has also produced real benefits. All of these programs (along with veterans’ care, homeland security, transportation and other infrastructure, law enforcement, education, and many other core government programs) fall within the non-defense discretionary (or NDD) portion of federal spending, which has been disproportionately targeted for spending cuts over the last decade. As an analysis by Paul Van de Water of the Center for Budget and Policy Priorities points out, “NDD spending in 2017 will be about 13 percent below the comparable 2010 level after adjusting for inflation (nearly $100 billion lower in 2017 dollars).”

The aging of the American population, continued increases in health care costs, the need to replace crumbling infrastructure and pay billions to help communities devastated by hurricanes and wildfires, and other factors will drive a substantial increase in federal spending over the next few decades.

One estimate is that federal spending will need to grow from 20.9 percent of gross domestic product (GDP) to 23.5 percent of GDP by 2035, largely as a result of increased costs for Social Security, Medicare, and Medicaid. In order to keep the national debt from growing faster than the overall economy, federal revenues will need to increase from some 17.8 percent of GDP in 2016 to at least 20.5 percent in 2035.

The need to increase spending on entitlement programs such as Social Security and Medicare, along with pressure to maintain (or increase) defense spending, will continue to squeeze NDD expenditures in the years ahead, even without the higher deficits created by the Senate Republican tax cut bill.

The game plan is clear as can be: pass massive tax cuts that add hundreds of billions of dollars each year to the deficit, then starting next year, use those higher deficits as an excuse for slashing programs that benefit middle- and lower-income Americans.

There’s a better way

The outcome of this week’s Senate action on the tax bill will not only determine whose tax bills will go down (or up) and by how much, important as that is; it will also impact America’s ability to maintain our global leadership on scientific and medical research and technology innovation, improve our air and water quality, avert the worst impacts of climate change (and cope with the impacts we can’t avoid), upgrade our transportation, energy, and communications infrastructure, and make investments in other critical areas.

Senators face a momentous choice.  They must refrain from handing out trillions of dollars in tax breaks to profitable corporations and the wealthiest Americans, while eroding health care coverage and laying the groundwork for deep cuts in a broad range of important federal programs down the road.  Instead, they should start over, and work across the aisle to craft a real tax reform plan that clears away the dense thicket of special interest loopholes and simplifies the tax code in a way that’s equitable to all Americans, without exploding the deficit and endangering the ability of the federal government to meet America’s current and future needs.

We know it’s possible to legislate in such a responsible, bipartisan manner; after all, it’s happened before.


I’m About to Testify at the EPA. Here’s What I Have to Say….

Photo credit: Sanjay Suchak.

After a restful and enjoyable time with my family over the Thanksgiving holiday, I’ve extended my stay here in Charleston, West Virginia, to testify at the Environmental Protection Agency’s hearing on its proposed repeal of the Clean Power Plan. I’ll be speaking tomorrow morning. Below are my prepared remarks.

Testimony of Dr. Jeremy Richardson at EPA’s Public Hearing on Repealing the Clean Power Plan, on behalf of the Union of Concerned Scientists

Remarks as Prepared

I stand before you today as the brother, son, and grandson of West Virginia coal miners. And at the same time, I am also a senior energy analyst at the Union of Concerned Scientists, where I focus on the US power sector and how the clean energy transition already underway can help us address the urgent threat of climate change. As you might imagine, we have interesting discussions at our house over Thanksgiving!

Like so many others here today, my family has helped keep the lights on in this country for generations—and also like many of you, I’m deeply proud of that history. And yet, things are changing—fast. My research confirms something you probably already know: coal has become increasingly uneconomic compared with cheaper, cleaner forms of energy like natural gas and renewable energy—and this market trend is going to continue.

But these days it feels like facts don’t matter—and that’s very disturbing to a scientist like me. So, just for the record, allow me to state some things that are true and obvious, but seem to have been forgotten in the rhetoric around these issues.

First, coal miners and coal communities are suffering. The job losses experienced—especially over the last five to ten years—have been devastating for families and communities. But—the primary driver of the decline of coal is economics. Coal can no longer compete with cleaner and cheaper ways to generate electricity—largely natural gas, with renewables increasingly beating coal in some parts of the country. And coal mining jobs have been declining since the middle of the last century because of mechanization, the shift to cheaper, large-scale surface mining operations out West, and geologic realities that have led to declining productivity in Appalachian coal mines. It is easy to blame the policies of the last president for all of coal’s problems, but it simply isn’t true.

Second, it is the job of the Environmental Protection Agency to protect human health and the environment. It is not the job of the EPA to protect the coal industry. In fact, the EPA is bound by law to address air and water pollutants from producing and using coal. Many of these pollutants are hurting the health of communities right here in Appalachia, where acid mine drainage and coal ash contaminate our waterways, and are also causing harm around the country where people live downwind from coal-fired power plants. The EPA is also legally required by the Clean Air Act to curtail global warming emissions from power plants because science shows that climate change poses risks to our health and the health of future generations.

This brings me to my third point, that climate change is real, period. It is primarily caused by human activities—including the burning of fossil fuels like coal, natural gas, and oil. Despite what you may have heard or read, this is not disputed by any expert on the issue. The recently released National Climate Assessment special report confirms what we already knew—we are observing the impacts of climate change now, and left unchecked it will likely get much worse. And importantly, we can still avoid some of the worst consequences—if we act fast.

The Clean Power Plan was an important step toward reducing emissions from one of the largest sources of US carbon emissions. Nationally, it also would have provided significant economic and public health benefits by lowering other pollutants and encouraging growth in the renewable energy industry. That is why I am here today to voice UCS’ opposition to the repeal of the Clean Power Plan.

My dad, who is a retired longwall maintenance foreman, believes that climate change is real. He also understands that coal represents good-paying jobs for our state. So do I.

When I left behind my previous research in astronomy more than 10 years ago, I did so because I was deeply passionate about addressing the threat of climate change. The truth is, the often-vilified environmental activists are worried about climate change because of its impacts on people. For me, I don’t really care about what happens to the polar bears—but the reality of melting ice is truly a canary in the coal mine, and the potential impacts on humans and human civilization are deeply frightening.

According to the latest scientific assessment, sea levels are expected to continue to rise by at least a few more inches in just the next 15 years, and from 1 to 4 feet or more by 2100. Tidal flooding in communities along the US East and Gulf Coasts has increased in recent decades, and is expected to get much worse in the coming decades. An analysis by Climate Central finds that depending on emissions level, between 147 and 216 million people worldwide are at risk of living on land that is below sea level in 2100. And that may be a conservative estimate, based on current population estimates and data limitations, and the authors suggest the number may be much higher—around 300 to 650 million people.

Heavy rainfall is increasing in both intensity and frequency across the United States, with the largest increases observed in the Northeast region, which includes West Virginia. Changes in extreme precipitation can lead to catastrophic flooding, like the state experienced during the historic floods of June 2016.

Even as I changed careers, I recognized that we must reduce emissions to address climate change—and that means changing how we produce energy. But I have been wrestling with a nagging question—what does a low carbon future mean for a place like West Virginia, a place I still call home?

The challenge before us is that we must figure out how to solve both problems—bringing down carbon emissions so that we protect people all around the world who are facing the impacts of climate change, and simultaneously investing in new economic opportunities in the very places where people depend on coal for their livelihoods.

As a start, we must increase federal spending targeted at economic development and economic diversification in coal country. If the current administration really cared about coal communities, it would be doubling down on those investments, not cutting federal programs, like the Appalachian Regional Commission and the Economic Development Administration, that support communities here and around the region.

I am here to tell you that it’s time we tone down the rhetoric on this issue. It’s not as if there was a “war on the horse and buggy” a hundred years ago. No, something better came along: the automobile.

Today we are seeing solar panels go up on homes and businesses right here in West Virginia, no thanks to state policies, but rather due to some intrepid business leaders who see the future and want our state to be a part of it. We need to collectively support those efforts, not because we’re anti-coal, but because we deserve to be a part of the clean energy economy that is emerging all around us.

This hearing, and this entire process to derail action to address climate change, are distracting us from the real work at hand.

We must not only work to protect the planet’s climate through strong carbon standards, but also ensure that we invest in workers and communities to spur new economic opportunities right here in the heart of Coal Country.

I do not accept that this is an “either-or” proposition.

The Union of Concerned Scientists stands ready to do its part.

Thank you.

Dr. Jeremy Richardson

Senior Energy Analyst, Union of Concerned Scientists

Will Automakers Walk the Talk on EVs? Four Things to Look for at the 2017 Los Angeles Auto Show

Chevy Bolt featured at the 2016 LA Auto Show. Photo: Dave Reichmuth

I’ll be attending this year’s Los Angeles Auto Show to check out the latest and greatest in vehicle technology. While the flashy presentations of the automakers will certainly grab attention, here are four things that I’ll really be paying attention to:

Are there more electric vehicle (EV) options?

The future of transportation is electric drive, but we are a long way from replacing all gasoline and diesel cars with EVs (both plug-in and fuel cell EVs). One barrier to the transition to electric cars is the limited availability of EV models. In California, EV sales have been increasing over the last few years, with plug-in sales reaching 4.5 percent of all cars and trucks sold in the state this year. This is a great start, but we’ll have to go a lot further to meet our air quality and climate pollution reduction goals. To get to higher levels of EV sales, we’ll need to see more EV models and a larger selection of sizes and styles. So, I’ll be looking for what new options are coming, especially in larger vehicle segments like SUVs.

Will automakers showcase the available technologies powering cleaner, more efficient cars?

While the future is electric, many of the cars sold over the next 5 to 10 years will still have a combustion engine. Making those conventionally powered cars and trucks as clean as possible will be important to reduce air pollution and climate-changing emissions. The good news is that the technology needed to meet clean car standards is available and starting to be used by many automakers. This means I expect to see smaller, turbocharged four- and six-cylinder engines replacing larger and thirstier naturally aspirated ones.

Last year, Nissan showed off an innovative variable compression engine that promises both higher power and better efficiency, but hadn’t released a vehicle using it. Will this year see this engine go into production?

Many automakers are talking EVs. Who’s actually following through?

When I visited the show last year, I heard automakers detail plans to electrify their cars and saw a number of new EVs promised for 2017. But how much was talk, and who actually followed through? Some companies did bring out successful EVs. A year ago, the Chevy Bolt EV was about to go on sale, and just last month it became the sales leader for EVs. Toyota’s Prius Prime was also new to the market last November and is now a top-selling EV. On the other hand, cars like Hyundai’s Ioniq EV had an impressive press showing, but the car has since been virtually nonexistent in the US market, with fewer than 400 sales this year to date.

In California, the division between EV market leaders and laggards is stark: For the first 9 months of 2017, 11 percent of BMW-branded vehicles were plug-ins and Chevrolet had over 14 percent plug-in sales! Over the same period, Honda had less than 0.3 percent electric drive sales, Hyundai sold just over 1 percent EVs, and Subaru sold more than 55,000 cars in the state without a single plug-in option available.

Both the Chevy Bolt EV (left) and Hyundai Ioniq BEV (right) were featured at last year’s LA Auto Show. However, General Motors has sold over 17,000 Bolts in 2017 so far, compared to fewer than 400 sales for Hyundai’s Ioniq.

There were also several concept and prototype EVs at the show during the last couple of years. Will any of them show up this year as production models? Our research into the EV market last year showed that a number of automakers are lagging their peers in making EVs available, despite claims of progress. Our report shows that even though most companies now offer electric vehicles, many are not truly available (especially outside California). The first step in catching up is to start making EVs in volume and marketing them the way they market their gasoline cars.

What models are emphasized by the manufacturers?

The LA Auto Show starts with a preview for media, with press conferences and displays of the automakers’ latest offerings. Then, after the press and auto industry executives are gone, the show opens to the public, becoming a showroom for virtually every car, truck, and SUV on the market in the US.

It’s interesting to see which models the manufacturers emphasize for each audience. For example, in 2015, Audi featured a prototype of a full-size all-electric SUV on its stage for the press days, but it was gone by the public days. Last year, Nissan didn’t even show its electric car, the LEAF, on the press days. Other brands, like Chevrolet and BMW, grouped their electric offerings and called attention to them for both the press and public days.

This inconsistent effort at an auto show is indicative of the larger struggle playing out within the major automakers. On one hand, the car companies acknowledge that EVs are the future of transportation and will be needed to meet the emissions and EV standards being set by countries around the globe. On the other, they have decades of expertise in designing and making gasoline-powered cars and trucks, which provides a powerful incentive to resist the inevitable switch from oil to electricity as the primary fuel for our personal vehicles. That’s why it’s important to have regulations and incentives in place that ensure gasoline vehicles are as clean as possible while also pushing automakers to move as quickly as possible away from combustion altogether.


New Tax on Graduate Students Would Harm the US

A graduate student demonstrates how her tax burden would increase by nearly $10,000 if the House version of the Tax Cuts and Jobs Act became law. Photo: Amanda Rose

On November 16, the House of Representatives and the Senate Finance Committee voted to advance tax reform legislation. These bills, both named the “Tax Cuts and Jobs Act,” would disproportionately and negatively impact the middle class, threaten to leave millions of Americans without health coverage, add as much as $1.5 trillion to the deficit, and could burden graduate students with a giant tax hike.

Many graduate students have taken to social media to demonstrate how their tax burden would change if the House version of the Tax Cuts and Jobs Act became law. This picture and calculation were made publicly available via the Facebook page of Amanda Rose, a graduate student at Columbia University in New York City, NY.

The version passed by the House of Representatives includes a new tax provision that would require students to pay tax on the value of the tuition that is waived for graduate student research and teaching assistants. Given the low pay of such positions, this would make it nearly impossible to cover cost-of-living expenses while attending graduate school. As a former graduate student myself, and having been a teaching and research assistant, I understand how critical every dollar of a stipend is for buying groceries, paying rent, and maybe even taking care of your own health (if you can afford it).

The Tax Cuts and Jobs Act is an attack on higher education in more ways than one. It also proposes to repeal the student loan interest deduction, graduate student tuition waivers, the Hope Scholarship Credit, the Lifetime Learning Credit, and other educational assistance programs. But it isn’t just graduate students who will feel the consequences; such moves stand to affect us all.

Science is linked to economic prosperity

Investment in science is investment in our nation. Many international comparisons still place the US as a leader in applying research and innovation to improve the country’s economic performance. A prior review by the Organization for Economic Co-operation and Development (OECD) concluded that since World War II, United States leadership in science and engineering has driven its dominant strategic position, economic advantages, and quality of life. Indeed, researchers have long understood that there is a link between economic prosperity and investment in science and technology.

The leadership of the United States in science explains, in part, why the country is ranked as one of the most economically competitive nations in the world. Across a number of metrics, the United States is still the undisputed leader in basic and applied research.

Researchers in the United States lead the world in the volume of research articles published, as well as the number of times these articles are cited by others. The United States is not just producing a lot of raw science; it is also applying this research and innovation, as other metrics show.

The United States has a substantial and sustainable research program, as evidenced by the number of Ph.D. students trained; it invests heavily in research, as shown by the country’s gross domestic expenditure on research and development; and it is a leader at turning science into technology, as evidenced by the high number of patents issued.

Graduate students are critical to US science and innovation

If the production of science has helped the United States economy remain competitive, graduate students are largely to thank. They are pivotal to the production of novel science and innovation in the US, and they are also the professors, inventors, and innovators of the future that our economy depends on.

The Tax Cuts and Jobs Act would make it difficult, if not impossible, for many of the brightest minds in America to enter science, technology, engineering, and mathematics (STEM) fields, ultimately decreasing America’s international competitiveness in science and technology.

A provision in the Tax Cuts and Jobs Act passed by the House of Representatives would tax graduate students on their tuition costs. This would change section 117(d) of the Internal Revenue Service tax code, which allows universities to waive, tax-free, the tuition of graduate students who conduct research or teach undergraduate classes at approved universities.

An estimated 145,000 graduate students benefit from this provision, with 60 percent of them in STEM fields. Thank goodness such provisions exist for tuition waivers and scholarships; even some of our senators likely wouldn’t be where they are today without this benefit in our tax code.

If graduate students were taxed on waived tuition, many who serve as research or teaching assistants would find it more difficult to cover basic living expenses with the stipend they receive. For example, a graduate student at Columbia University might receive a $38,000 stipend and a tuition waiver worth $51,000. Currently, they pay $3,726 in taxes, but that could go up to $13,413 under the House’s proposed legislation, reducing their monthly take-home pay for food, rent, and health from $2,885 to $2,078.
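
To see where numbers like these come from, here is a minimal sketch of the arithmetic using the figures quoted above. The tax amounts are the estimates cited in this post, not an official calculation, and the small gap between the computed and quoted monthly figures presumably reflects assumptions (deductions, pay periods) not spelled out here.

    # Illustrative arithmetic from the figures quoted above;
    # this is not a tax calculator.
    stipend = 38_000        # annual stipend
    tax_current = 3_726     # tax when only the stipend is taxable
    tax_proposed = 13_413   # estimated tax if the $51,000 waiver were taxed

    monthly_now = (stipend - tax_current) / 12        # ~$2,856/month
    monthly_proposed = (stipend - tax_proposed) / 12  # ~$2,049/month
    print(f"${monthly_now:,.0f} -> ${monthly_proposed:,.0f} per month")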

Some students have reported that they would see their stipends cut from $27,000 to $19,000, or from $13,000 to $8,000 for the year if the House’s tax reform bill became law. While some students may be able to depend on their families to defray the costs of these taxes, many graduate students who come from poor and middle class backgrounds could not. As the majority of Americans who come from poorer backgrounds are also minorities, this would deter diversity in higher education, where we already know it is sorely needed.

Some universities could cover tuition and the tax on that tuition for some students, but they wouldn’t be able to do it for all. Taxation of tuition waivers also would likely make the US less attractive to international students, many of whom are graduate students in STEM. Ultimately, this regressive tax legislation means fewer graduate students at universities and, therefore, decreased research in the United States.

An anti-science message is in the air

If you are surprised that graduate students are being targeted, you are not alone. Many organizations that support the higher education community have signed on to letters and published statements expressing concern for graduate students, including the American Council on Education, the Association of Public and Land-grant Universities, the Association of American Universities, the American Association for the Advancement of Science, and the Union of Concerned Scientists.

It is unclear if a final version of a tax reform bill will include provisions that burden graduate students with enormous tax hikes. While the Senate’s version of a tax reform bill would retain many of the tax benefits for undergraduate and graduate students (including a non-taxable tuition waiver), it still includes many provisions opposed by organizations supporting higher education.

Regardless of what tax reform bill is pushed through, there is still the question of why the House targeted graduate students in the first place. Is it because they are an easy target, with little representation on the Hill? Is it because this would be one way to dismantle the pipeline of those pesky academics?

These are Americans who work hard to teach and produce transformative research that greatly benefits the United States economy, and they already do this for very little pay. Furthermore, the revenue the government would gain from these taxes has been called “minuscule” compared to trillions of dollars in national debt. It is absurd that graduate students are being targeted.

Speak up for all scientists now and in the future!

I’m a former graduate student, and I could not have afforded graduate school if I had been taxed on my tuition waiver; I certainly wouldn’t be where I am today without this benefit in our tax code. That’s why I’m speaking up for all early career scientists, now and in the future: everybody deserves the same opportunities that I had, and the United States deserves the continued prosperity that science affords it.

Call your senators today at 1-833-216-1727 and urge them to vote ‘no’ to the Tax Cut and Jobs Act.

The full Senate will vote on this bill after Thanksgiving. Learn more about the current tax reform legislation and how you can push back.

Always in “Hot Water”

My wife likes to joke that I am always in “hot water.” It’s a play on words that reflects my career from college, at two National Laboratories and now in retirement.

America’s National Laboratories are hotbeds of scientific research directed at meeting national needs. In my case, working at two national labs let me help resolve the growing environmental impacts of energy technologies—in particular, the effects of thermal electric generating stations on the aquatic life of rivers, lakes, and coastal waters.

After receiving my PhD in 1965, I was recruited by the Atomic Energy Commission’s (AEC’s) Hanford Laboratory (now the Pacific Northwest National Laboratory of the US Department of Energy) to conduct research on thermal discharges to the Columbia River from the nine plutonium-producing nuclear reactors at Hanford, Washington. The reactors were part of cold-war nuclear weapons production, but their thermal discharges were not unlike those from a power plant, just larger.

With a pretty good understanding of how water temperature can affect aquatic organisms, our team of researchers studied the effects of elevated temperatures on the river’s salmon populations and its other aquatic life. We had two main objectives: (1) to identify the effects of the Hanford reactors on the river’s life, and (2) to translate our findings into criteria for safely managing thermal discharges (like the 90-degree damage limit I found for Delaware River invertebrates).

Our Hanford research caught the attention of AEC headquarters and its Oak Ridge National Laboratory in Tennessee. There was interest in countering public fears of thermal pollution by doing research that could be applied to minimizing ecological impacts everywhere. Thus, in the fall of 1969, I was asked to leave Hanford, which I had greatly enjoyed (for a Northeasterner, the Pacific Northwest was like a paid vacation!), and I moved to Oak Ridge in the spring of 1970.

At Oak Ridge, I put together a team to develop criteria for minimizing the ecological effects of thermal effluents nationwide. Oak Ridge had no power plants of its own; nearby Tennessee Valley Authority (TVA) power stations served as research sites, but our focus was on developing general criteria. We built a new Aquatic Ecology Laboratory with computer-controlled tank temperatures, constructed a set of outside ponds to rear fish for experiments, hired biologists and engineers, and assembled a “navy” of boats for field work. We set to work at a fever pitch.

But then… Congress passed the National Environmental Policy Act (NEPA), and the AEC was handed the Calvert Cliffs decision, which mandated that the AEC conduct complete reviews of the environmental impacts of the nuclear power stations it licensed. In 1972, our research staff was “reprogrammed” to prepare Environmental Impact Statements on operating and planned nuclear power plants. This turned out to be a tremendous opportunity to carefully evaluate not only thermal discharges but other impacts of using cooling water. By evaluating facilities across the country, we gained the nationwide perspective we needed for our research. And because the National Lab could assign staff from many scientific and engineering fields to the assessments, we gained a hugely valuable multi-disciplinary perspective that helped us advance beyond just biology, fish, and bugs.

Many years of productive thermal-effects work followed, with the satisfaction that our contributions were often followed and our data used. We saw many of our efforts resolve issues in permitting power plant thermal discharges. The National Academies used our framework for water quality criteria for temperature; EPA used it to define “Balanced Indigenous Communities” in thermally affected waters and to set temperature limits. As “thermal pollution” became more settled, the Department of Energy and our National Laboratory gave our scientists the mission and capacity to work on other issues, most notably the aquatic ecological effects of hydropower—work that is helping with future innovation as technologies shift.

Throughout our research and analysis, we fostered “technology transfer” to the public through educational seminars and information aid to electricity generators. ORNL sanctioned some outside, site-specific consulting. I have been fortunate in retirement (since 2005) to continue to do this, and have assisted more than 50 companies and regulatory agencies (both domestic and foreign) with thermal effects issues. I feel good that the problem-solving research and analysis and application of this knowledge outside the labs (my “hot water”) have benefited society.

Through my time at the Hanford/Pacific Northwest and Oak Ridge national labs, I’ve worked with world-class researchers and scientists in many disciplines on projects that have advanced our understanding of the ecological impacts of various energy sources. We need to continue to invest in our scientists at the federal laboratories of the Department of Energy. This Thanksgiving, I would like to thank my fellow scientists at government labs for the problem solving they’ve done and the innovative solutions they’ve found for the public as well as the private sector.

Dr. Charles Coutant retired as a distinguished research ecologist in the Environmental Sciences Division of Oak Ridge National Laboratory in 2005. Dr. Coutant received his B.A., M.S., and Ph.D. in biology (ecology) from Lehigh University. Since retirement he has served part time as an ecological consultant to regulatory agencies and industry.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Thanksgiving Dinner Is Cheapest in Years, But Are Family Farms Paying the Price?

Last week, the Farm Bureau released the results of its annual price survey on the cost of a typical Thanksgiving dinner. The grand total for a “feast” for 10 people, according to this year’s shoppers? About 50 dollars. ($49.87, if you want to be exact.) That includes a 16-pound turkey at $1.40 per pound, and a good number of your favorite sides: stuffing, sweet potatoes, rolls with butter, peas, cranberries, a veggie tray, pumpkin pie with whipped cream, and coffee and milk.

After adjusting for inflation, the Farm Bureau concluded that the cost of Thanksgiving dinner was at its lowest level since 2013. Let’s talk about what that means for farmers, and for all of us.

We can debate whether the Farm Bureau’s survey captures the true cost of a holiday meal for most Americans. This isn’t the world’s most technical survey—it was based on 141 volunteer shoppers at 39 grocery stores across the country purchasing these items at the best prices they could find.

But according to the USDA’s Economic Research Service, Americans do spend less than 10 percent of their disposable personal incomes on food. ERS data also shows that farmers receive just 16 cents for every dollar of food consumers purchase. (Speaking of historic lows, that’s the lowest farmer share of the food dollar in over a decade.) The rest of it is distributed throughout the food supply chain, which includes the companies that process, package, transport, and sell these foods at any number of retail outlets.

For our hypothetical holiday dinner for 10 (including leftovers), this means that in total, the farms that produced the raw foods, from potatoes to pumpkins, made about eight dollars. That’s eight dollars total across all farms, which then must pay workers’ wages and cover operating costs. These margins can work for large-scale industrial farming operations, due in part to heavy reliance on and exploitation of undocumented agricultural workers, but the math doesn’t add up for most family farms and farm workers.
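The back-of-envelope arithmetic behind that estimate, using the survey total and the ERS farm-share figure:

# Back-of-envelope: the farm share of the Farm Bureau's Thanksgiving dinner,
# using the ERS estimate that farmers receive about 16 cents per food dollar.
dinner_cost = 49.87  # Farm Bureau survey total for a dinner for 10
farm_share = 0.16    # ERS farm share of the food dollar
print(f"${dinner_cost * farm_share:.2f}")  # -> $7.98 across all farms involved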

And despite the savings we enjoy as consumers, the reality is that the prevailing model of food production isn’t good for any of us—least of all rural farming communities.

Midsize farms and missed opportunities

Midsize family farms, generally defined by the USDA as those with a gross cash farm income between $350,000 and $1 million, have long been key drivers of rural economies. But since 2007, more than 56,000 midsize farms have disappeared from the American landscape—a trend that has had serious consequences for rural communities across the country.

These farms employ more people per acre than large industrial farms, and when they disappear, they take both farming and community jobs with them. Midsize farms are also more likely to purchase their inputs locally, keeping more money in the local economy. Research has shown that areas with more midsize farms have lower unemployment rates, higher average household incomes, and greater socioeconomic stability than areas with larger farms.

Beyond their impact on local economies, midsize family-owned farms are more likely than large industrial farms to use more environmentally sustainable practices such as crop rotation and integrated livestock management, resulting in greater crop diversity. This, too, may have health implications: in a country in which about half of all available vegetables and legumes are either tomatoes or potatoes, with lettuce bringing home the bronze, it stands to reason that greater diversity in our food supply can only be a good thing.

So if midsize farms are so great… why are they disappearing, and what can we do to reverse the trend and revitalize rural farming communities?

Photo: US Department of Agriculture/Public domain (CC0)

The Local Food and Regional Market Supply (Local FARMS) Act

Representatives Chellie Pingree (D-ME), Jeff Fortenberry (R-NE), and Sean Maloney (D-NY) and Senator Sherrod Brown (D-OH) recently offered their answer with a set of proposed policies and programs they want included in the 2018 farm bill. The Local Food and Regional Market Supply (Local FARMS) Act of 2017 would make new investments in local and regional food systems, helping small and midsize farmers connect with more consumers. It would ease the way for institutions like schools to purchase locally produced food, and would make fresh, healthy foods more accessible and affordable for low-income families.

In short, the Local FARMS Act is a win-win for farmers and eaters.

Leveraging consumer demand for local and regional foods and the substantial economic opportunity provided to midsize farmers by institutional food purchasers, this bill shortens the distance between producer and consumer. That ensures that a greater share of the food dollar ends up in farmers’ pockets—and that more fresh, healthy foods get to the people that need them.

Some of the key programs and provisions include:

  • The new Agricultural Market Development Program, which streamlines and consolidates local food programs to provide a coordinated approach to strengthening regional food supply chains.
  • A Food Safety Certification Cost-Share Program that helps farmers cover the cost of obtaining food safety certifications, which are required by many institutional purchasers but often prove cost-prohibitive for small and midsize producers—many of whom already have good food safety practices in place.
  • An amendment to the Richard B. Russell National School Lunch Act that allows schools to use locale as a product specification when soliciting bids, making it easier to procure local foods.
  • A Harvesting Health Pilot authorizing a produce prescription pilot program that would enable healthcare providers to offer nutrition education and fresh fruit and vegetable coupons to low-income patients.

By providing the infrastructure and support needed to bridge critical gaps between local producers and consumers, the proposed policies and programs contained in the Local FARMS Act lay the groundwork for stronger regional food systems, more vibrant local economies, and a healthier food supply.

Let’s give thanks and get to it

Whatever table you might gather around this Thursday, in whoever’s company you might enjoy, save some gratitude for the folks who put the food on your plate. And when you’re done enjoying your meal, let’s get to work (kidding: take a nap). And when you’re done taking a nap, let’s get to work. If we want a financially viable alternative to industrial food production systems, it’s up to all of us to use our voices, our votes, and our dollars to start investing in one.

Stay tuned for action alerts from UCS on how you can help strengthen our regional food systems and support our local farmers through the 2018 farm bill. For updates, urgent actions, and breaking news at your fingertips, use your cell phone to text “food justice” to 662266.

Climate change is here. Can California’s infrastructure handle it?

Wildfires across the West threaten critical infrastructure. Photo: Tim Williams. CC-BY-2.0 (Wikimedia)

This has been a year of extremes in California. We’ve experienced all-time temperature highs (statewide and regionally), a deadly heat wave, the most destructive and lethal wildfires in the state’s history, and the second wettest winter on record following a historic five-year drought. The impacts have been staggering: many lives lost, thousands of properties destroyed, and costly infrastructure damage.

We know that extreme weather events will become more common and intense as a result of climate change. Such events multiply threats to infrastructure across the state, endangering community well-being, public health and safety, and the economy. A new white paper released by UCS today – Built to Last: Challenges and Opportunities for Climate-Smart Infrastructure in California – makes the case for investing limited public resources in infrastructure that can withstand climate change impacts and keep Californians safe.

A better path forward

Extreme weather-related infrastructure disruptions in recent years – from power losses and train derailments to bridge and spillway failures, road closures, and low water supplies – provide us with a sobering preview of the future challenges facing California’s infrastructure systems. (See this map for other recent examples.) The type, frequency, and severity of these climate-related hazards will vary by location, but no region of California or infrastructure type will be left untouched.

While the state of our dams, pipes, levees, bridges, and roads is mediocre at best (they received a combined C- on ASCE’s 2012 report card), the need to upgrade or replace our water, power, and transportation systems presents a golden opportunity to plan, design, and build these systems with climate resilience in mind. The UCS white paper describes a set of principles for ‘climate-smart’ infrastructure and then highlights barriers and opportunities for improving and accelerating their integration into public infrastructure decisions.

What is climate-smart infrastructure?

Climate-smart infrastructure is designed and built with future climate projections in mind, rather than relying on historic data that are no longer a good predictor of our climate future. It bolsters the resilience of the Golden State’s communities and economy to the impacts of extreme weather and climate change instead of leaving communities high and dry, overheated, or underwater.

A microgrid is providing efficient, reliable, cleaner power for Blue Lake Rancheria government offices, buildings, and other critical infrastructure, such as an American Red Cross disaster shelter. It will also create local jobs and bring energy cost savings. Photo: Blue Lake Rancheria

Climate-smart infrastructure can also reduce heat-trapping emissions, spend limited public funds wisely, and prioritize equitable infrastructure decisions. This last point is important because some communities in California are more vulnerable to both climate impacts and infrastructure failure, due in part to decades of underinvestment and disinvestment, especially in many low-income communities, communities of color, and tribal communities.

When done right, the results can be innovative infrastructure solutions, like the Blue Lake Rancheria microgrid, that bring social, economic, health, and environmental benefits to Californians AND protect us from the weather extremes we are inevitably facing. More examples of climate-smart principles in action are described in the white paper, and some are shown in the accompanying StoryMap.

We’re just getting started

The Golden State is beginning to integrate climate change into its plans and investments and recently released high-level guidance for state agencies. These and other efforts underway at the state level must be accelerated and implemented in a consistent and analytically rigorous, climate-smart manner.

This is especially important in light of the billions of taxpayer dollars the state plans to spend on new long-lived infrastructure projects. Many more billions will be spent on maintaining and retrofitting existing infrastructure over the next few years. These projects must be able to function reliably and safely despite worsening climate impacts over the coming decades. Otherwise, we risk building costly systems that fail well before the end of their intended lifespans.

Barriers can be overcome

There are still many reasons why public infrastructure is not being upgraded or built today in a consistently climate-smart way. They generally fall into three categories: (1) inadequate data, tools, and standards; (2) insufficient financial and economic assessments and investments; and (3) gaps in institutional capacity and governance.

For example, many engineers, planners, and other practitioners still don’t have enough readily usable information to easily insert climate impacts into their existing decision-making processes and economic analyses. In addition, there has not been enough attention focused on the unique risks and infrastructure vulnerabilities faced by low-income communities, communities of color, and other underserved communities.

The UCS white paper includes several recommendations for overcoming the barriers we identified, all focused on improving and accelerating the integration of our climate-smart principles into public-sector infrastructure decisions. They range from increasing the technical capacity of state and local government staff and updating standards and codes, to better incorporating climate-related costs and criteria, as well as climate resilience benefits, into project evaluations and funding decisions. Others include planning in advance for climate-smart disaster recovery efforts, ensuring better interjurisdictional coordination at the local and state government levels, and addressing the funding gap. Additional recommendations and specifics can be found in the paper. All infrastructure solutions should help advance more equitable outcomes, so equity is integrated throughout these recommendations.

Building to last? There’s reason for optimism

Progress is being made, as evidenced by the recent state actions mentioned above and a growing number of climate-smart projects and local solutions. For example, Los Angeles has begun a process to update its building codes, policies, and procedures, called Building Forward L.A. San Francisco is incorporating sea level rise into its capital planning. Plus, there’s an ever-expanding list of novel funding mechanisms for these types of infrastructure investment. But we need more, and soon, to help inform the tough decisions ahead as we adapt to climate change and invest in long-lived infrastructure projects. Thoughtful implementation of our recommendations can help clear the way.

California governments should grab hold of the opportunities before them to spend limited resources in climate-smart ways that increase our infrastructure’s ability to provide California’s communities and businesses with the services they need to thrive, now and in a changing climate.

Coal-burning Dynegy Wants a Handout. Will Illinois Give It to Them?

Photo: justice.gov

Last week marked the end of the Illinois General Assembly’s 2017 veto session. Fortunately, Dynegy failed in its latest attempt to have the legislature bail out several of its coal plants in central and southern Illinois at the expense of local ratepayers.

But the fight isn’t over. Dynegy has been relentless in their efforts to force the public to pay for keeping their aging, polluting, and uneconomic coal power facilities open. Here are some pathways they are pursuing and why it’s important to stop them.

Avenue 1: the legislature

Dynegy, a Texas-based company that owns eight coal plants in central and southern Illinois, introduced legislation (SB 2250/HB 4141) that would grant them a bailout for their uneconomic Illinois plants, while ratepayers foot the bill. These plants were built several decades ago: the bill would allow Dynegy to continue to emit harmful pollutants for years to come.

Last year alone, Dynegy’s Illinois plants emitted more than 32 million tons of heat-trapping carbon dioxide.

Dynegy claims that their Illinois coal plants are not being treated fairly in the current wholesale power market and that, if forced to close, they would take hundreds of jobs with them. The proposed legislation would create a capacity-pricing system for central and southern Illinois, run by the Illinois Power Agency. Such a system would be expected to produce higher capacity prices, like those in northern Illinois, and put more money into Dynegy’s coffers. Meanwhile, the higher capacity prices would be passed on to ratepayers.

Yet Dynegy’s argument that immediate action is needed is unjustified. Ameren Illinois—the local power provider that purchases and delivers generation from Dynegy’s coal plants to customers—does not believe there is a resource adequacy issue in the short term. And we agree. In 2016 the Illinois Clean Jobs Coalition (of which UCS is a member) worked tirelessly to enact a long-term vision for the state’s energy future, the Future Energy Jobs Act, which increases energy efficiency and renewable energy development in the state.

Prolonging the life of uneconomic and dirty coal plants would derail this clean energy future.

This bill got plenty of pushback at last week’s hearing. Opponents testified that there is no immediate threat to grid reliability and that passing the legislation would put a financial burden on Ameren Illinois ratepayers. It’s estimated that the proposal could raise Ameren Illinois customers’ electric bills by upwards of $115 a year.

Avenue 2: the Pollution Control Board

In addition to its legislative efforts, Dynegy has been working with the Illinois EPA to rewrite the Illinois Multi-Pollutant Standard, which is a 2006 clean air standard for coal plants. The proposed changes to the rule would create annual caps on tons of sulfur dioxide and nitrogen oxide emitted by the entire coal fleet rather than on individual power plants. If approved, the new limit on sulfur dioxide would be nearly double what Dynegy emitted last year and the cap on nitrogen oxide emissions would be 79 percent higher than in 2016.

This proposal would allow Dynegy to close newer plants and run older, dirtier plants harder. Meanwhile, Illinois communities would get increased air pollution, and some would still face job losses.

Not just an Illinois issue

While some blame environmental regulations for the ailing coal industry, a recent report from the Trump administration’s Department of Energy confirms that the primary reasons coal plants nationwide face economic woes are low natural gas prices and flat electricity demand. Struggling coal plants aren’t just an Illinois issue. The role of coal in the electricity sector is declining nationwide, while the growth of wind and solar presents opportunities for communities, businesses, and policymakers.

Our recent report A Dwindling Role for Coal: Tracking the Electricity Sector Transition and What It Means for the Nation examines the historic transition of the US electricity sector away from coal and towards cheaper, cleaner sources of energy. Since 2008, more than one-fifth of US coal generation has either retired or converted to different fuels, with significant benefits to public health and the climate. This transition has reshaped the power sector and will continue to do so.

What’s next

It’s expected Dynegy will be back in 2018 with similar legislation. And the Illinois Pollution Control Board hearings will be held on January 17 in Peoria and March 6 in Edwardsville.

Recently, a third pathway for Dynegy has surfaced: a stakeholder process that will kick off at the end of the month to discuss potential policy opportunities laid out in a report requested by Governor Rauner and written by the Illinois Commerce Commission. That white paper addresses current questions about resource adequacy in central and southern Illinois.

Speak up!

Tell Governor Rauner, and your state legislators, to oppose a Dynegy bailout that would prolong the life of uneconomic coal plants in the state, and would have negative public health impacts for Illinois residents. Illinois needs to transition away from old, dirty, and costly fossil fuels, and continue to increase development of renewable energy and energy efficiency in the state.


Hey Congress! Here’s Why You Can’t Scrap The Electric Vehicle Tax Credit

The fate of the federal tax credit for electric vehicles hangs in the balance. The House version of the GOP-led tax plan removes it entirely while the Senate version (as of Friday, November 17th) keeps it on the books. As lawmakers work to combine the House-passed bill with the Senate version, let’s examine why the EV tax credit shouldn’t be eliminated.

What is the federal tax credit for electric vehicles?

Section 30D of the tax code gives electric vehicle buyers up to $7,500 off their tax bill – or allows leasing companies to receive the credit and lease EVs at lower rates.

The credit is scheduled to phase out for each automaker that surpasses 200,000 cumulative EV sales. Some of the early entrants to the EV scene, like Tesla, General Motors, and Nissan, are forecast to hit the 200,000 limit in 2018, while others, like BMW, Volkswagen, and Ford, are relying on the federal tax credit to offset the price of EVs set to hit dealerships in the next couple of years.
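For readers curious about the mechanics, here is a minimal sketch of that phase-out, assuming the statutory schedule under Section 30D: the full credit runs through the calendar quarter in which the 200,000th qualifying vehicle is sold plus one more quarter, then drops to 50% for two quarters and 25% for two more before disappearing.

# Sketch of the Section 30D phase-out. Quarters are counted from the
# calendar quarter in which an automaker sells its 200,000th qualifying
# EV (quarter 0); the schedule follows the statute as summarized above.
def credit_available(quarters_since_threshold: int, full_credit: float = 7_500.0) -> float:
    if quarters_since_threshold <= 1:   # threshold quarter plus one more: full credit
        return full_credit
    if quarters_since_threshold <= 3:   # next two quarters: 50%
        return 0.50 * full_credit
    if quarters_since_threshold <= 5:   # two quarters after that: 25%
        return 0.25 * full_credit
    return 0.0                          # fully phased out

for q in range(7):
    print(q, credit_available(q))       # 7500, 7500, 3750, 3750, 1875, 1875, 0

So even after an automaker crosses the threshold, buyers keep a shrinking credit for roughly five more quarters.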

What has America gotten for investing in EVs?

The EV tax credit has stimulated a market for vehicles that are cheaper to drive, pollute half as much, and offer a simply better driving experience than gas-powered vehicles. If you think automakers would have produced EVs without the prompting of state and federal policy, may I remind you that automakers fought tooth and nail against seatbelts, air bags, improved fuel efficiency, and pretty much every other vehicle-related regulation that has ultimately benefitted public health and safety. Consumers deserve the opportunity to choose clean vehicles, and the federal tax credit has made that choice easier by offsetting the upfront cost of EVs, which is often higher than that of comparable gas vehicles.

The tax credit has also spurred domestic automakers to get in on the EV game. American companies like General Motors and Tesla sell EVs in all 50 states, and are competing with foreign auto giants to become the global leader in EV sales. At a time when EV demand is poised to skyrocket in other countries, eliminating the federal credit will hamper domestic automaker efforts to both sell EVs on their own turf and maintain their global competitiveness.

Federal support for EVs won’t be needed forever

As I’ve previously discussed, the federal tax credit is the most important federal policy supporting the EV market, but it won’t be needed forever. Battery costs are forecast to continue their decline, with some projections showing EVs becoming price competitive with gasoline-fueled vehicles in the mid-2020s. By making EVs cost competitive today, the federal tax credit has helped EVs gain a toehold in a market monopolized by gasoline-powered vehicles that have had over a century to mature. Removing the credit now would be premature, and would cause EV sales to suffer at a time when the market is just beginning to gain traction.

What will happen to the EV market without the credit?

Even if the federal tax credit is eliminated, the California Zero Emission Vehicle Program will still require automakers to sell EVs in California and the nine other states that have adopted the ZEV program. The program requires EV sales in states that comprise about a quarter of the U.S. vehicle market, so EVs will certainly remain available for sale. Other state support for EVs, like a $5,000 tax credit in Colorado, will survive too. For state-level EV incentives in your area, check out this handy guide. EVs will also remain cheaper to drive, and a smart choice for millions of Americans who have a strong demand for the technology. That’s the good news.

The bad news is that one of the primary hurdles to wider EV adoption is price (along with access to charging in multi-unit dwellings and the lack of a cheap electric SUV; see the Tesla Model Y). Taking away a policy that directly addresses this barrier will make it harder to own an EV, and it will hurt sales. When Georgia removed its state tax credit for electric vehicles, sales dropped an estimated 90% in the following months. I’m not expecting as dramatic a drop if the federal credit is removed, but EV sales will fall because EVs will become more expensive and automakers will have less incentive to make them available in the U.S.

So, join UCS in telling Congress that you deserve more clean vehicle options, and that the EV tax credit is a key federal policy that makes it easier to own an EV. Keep an eye on the UCS website for additional ways to get involved, and if you are considering an EV, getting one now might be a good option: you could take up to $7,500 off its sticker price.
