Combined UCS Blogs

The Hidden Dangers of Hurricane Florence: Catastrophic Storm Surge and Inland Flooding Threaten Rural and Low-Income Communities

UCS Blog - The Equation (text only) -

The North Carolina National Guard prepares for Hurricane Florence

Over the last few days, we have watched with deepening dismay as the forecast for Hurricane Florence has turned increasingly grim. This rapidly intensifying hurricane is now on a trajectory to come ashore somewhere along the southeast coast, likely in North Carolina, potentially as a Category 4 storm. What heightens the risks of this storm is the forecast of days of lingering heavy rain, threatening not just coastal but also inland areas.

A coastal emergency compounded by inland flooding

This was projected to be a below-normal or near-normal hurricane season—but any complacency that forecast may have engendered has vanished very quickly. Yesterday, the National Hurricane Center (NHC) was tracking no fewer than three storms in the Atlantic, and there are additional advisories in the Pacific. And it just takes one major landfalling hurricane to make it a terrible season.

(On the other side of the world, super-typhoon Mangkhut is threatening the Philippines, Taiwan and China, after passing over Guam.)

Coastal states in the Southeast and Mid-Atlantic are clearly taking Hurricane Florence very seriously. As of now, there are emergency evacuation orders for well over a million people across North and South Carolina, Virginia and Maryland.

The Navy has moved ships from Naval Station Norfolk out to sea to ride out the storm more safely.

Duke Energy is gearing up for major impacts to the power system in North and South Carolina and getting emergency crews in place to restore power after the storm passes. In a news release it warned of widespread outages in North and South Carolina, potentially lasting days or weeks. The company said that impacts could exceed those of Hurricane Matthew, which caused 1.5 million Duke customers to lose power and cost $125 million in repairs.

What makes this storm especially scary is the huge storm surge and major rainfall that are predicted to accompany it. Forecasts show that the storm might stall, creating a multi-day extreme precipitation event, similar to what residents of Houston experienced in the wake of Hurricane Harvey last year and are still struggling to recover from.

The latest advisory from the National Hurricane Center indicates that if the peak of the storm surge coincides with high tide, areas from Cape Fear to Cape Lookout, including the Neuse and Pamlico rivers, could see surge as high as 6 to 12 feet! Other parts of coastal North Carolina and Virginia, including low-lying coastal areas, could see 2 to 8 feet of storm surge.

Alarmingly, the forecast also indicates the potential for 15 to 20 inches of rain from the storm, with some areas of North Carolina, South Carolina and Virginia expected to experience as much as 30 inches through Saturday! Depending on the track of the storm, places as far away as West Virginia could also see heavy rain and flash flooding in the days to come.

Unfortunately, much of the Southeast and Mid-Atlantic, including North Carolina, Virginia, and the Washington, DC area, has experienced above-normal rainfall over the past weeks, so the ground is already saturated. With more rainfall coming, catastrophic flooding—including in inland areas—is very likely.

Potential impacts from the storm

A storm of this magnitude will undoubtedly cause great harm. Hopefully, with the advance warning and preparation underway, loss of life will be avoided.

Early analysis from CoreLogic shows that nearly 759,000 homes across North and South Carolina and Virginia, with a reconstruction value of over $170 billion, lie in the path of the storm surge from Hurricane Florence, were it to come ashore as a Category 4 storm.

In rural communities in North Carolina, experience from previous storms shows that flooding could cause waste lagoons from hog farms to overflow, contaminating rivers and streams. Coal ash ponds can (and do) also leak toxic contaminants. And wastewater treatment facilities could be overwhelmed by floodwaters.

Sewage and waste can also contaminate groundwater, affecting the well water that many rural communities depend on for their drinking water supplies.

Loss of power, especially for long periods of time, can be life-threatening for patients in hospitals and others with medical conditions, if they are not quickly moved to safety, as we witnessed so tragically after hurricanes Irma and Maria hit last year.

Those incarcerated in prisons also must be evacuated for their safety—it is troubling to see news reports that, as of now, South Carolina has chosen not to evacuate a prison in Jasper County.

Disaster preparedness requires advance planning

The emergency response to Hurricane Florence hasn’t just been conjured up in the last few days; emergency managers, planners and utility managers are using hard-won experience from previous disasters to prepare for this storm. Hurricanes Floyd and Matthew taught some bitter lessons.

Getting people out of harm’s way is job #1. Hence the mandatory evacuation orders for some of the highest-risk areas, issued well in advance of landfall. These are warnings that people should take seriously and obey.

The storm is also going to test the resilience of critical infrastructure like roads, bridges, power lines and substations, sewage treatment plants, storm water drainage, hospitals, airports and more. Smart investments made well ahead of time will pay off in the days to come. And fatal weaknesses will be exposed where those investments fall short.

Responsible officials are not waiting for the storm to hit or for its exact path to be clear: they are acting out of an abundance of caution and making sure people and critical services are protected as best they can. (Incidentally, that’s a lesson well worth extending to how we think about preparing for the growing risks of climate change.)

How will rural, island and low-income communities fare?

The true test of our disaster response doesn’t just lie in how quickly the lights come back on or flights are restored in major economic hubs, but in how well isolated or marginalized communities fare in the aftermath of storms.

Disasters lay bare the socioeconomic inequities in our society. For some, fleeing to safety is prohibitively expensive—they may not have money for hotels or gas or even a car. Taking time off from work because of impassable roads or closed schools could mean losing a job. For those who can barely make ends meet, buying flood insurance to protect their homes or belongings can seem a luxury.

Low-income communities and communities of color are more likely to live near toxic waste sites, like coal ash ponds and landfills. Rural communities are more likely to depend on well water.

Island communities, including those along the Outer Banks of North Carolina and coastal South Carolina, are on the front lines of this storm. Hopefully their residents are heeding evacuation orders. In some cases, they may have to travel long distances to really get out of harm’s way, given the wide swath of destruction this storm is likely to cut—which creates an additional burden for those who may not have the resources. Rural and island communities could be cut off for days if their bridges are washed out or their few access roads are flooded.

So as this storm bears down, let’s remember the people of Princeville and Roxboro, the residents of the Gullah/Geechee Nation, those from Nags Head and Kitty Hawk, and from Tybee Island and Kiawah Island, and many other small communities like them that may not make the headlines.

A long road ahead to recovery

Looking ahead, given the terrifying forecast, unfortunately we can expect to see major impacts from this storm. Hopefully, communities in its path will be able to ride out this storm without loss of life.

But experience shows that recovery will take a long time, well after Hurricane Florence drops out of the headlines and is no longer trending on Twitter. Communities in Houston and Puerto Rico are still struggling to recover from last year’s catastrophic hurricane season.

And then there are a whole set of additional questions regarding our nation’s response to these types of disasters:

  • Will the nation use this as an opportunity to build back in a more resilient way that takes into account the impacts of climate-driven sea level rise; as well as the increasing intensity of powerful Atlantic storms and increase in heavy rainfall events fueled in part by climate change?
  • Will Congress and the administration adequately fund not just the immediate recovery efforts, but long-term resilient rebuilding, as well as voluntary home buyouts and relocation from high-risk areas?
  • Will the Federal Emergency Management Agency (FEMA) and the Department of Housing and Urban Development (HUD) budgets, so necessary for disaster preparedness and recovery, be protected as the September 30 deadline for the federal budget approaches?
  • Will Congress protect NOAA’s funding and mandate so it can continue to provide the science we need to anticipate and prepare for these storms?
  • Will the Environmental Protection Agency (EPA), so compromised under the Trump administration, do its job to quickly identify and remediate toxic pollution in the aftermath of this storm—or will it put communities at risk as we saw after Harvey?
  • Will Congress and states make targeted resources available for low-income and otherwise marginalized communities, both ahead of and in the aftermath of the storm?

The answers to those questions will provide an important indication of whether we have the resolve to truly take on the long-term challenge of dealing with growing risks of extreme weather and climate disasters in a robust and equitable way—or whether we will just default to responding to these as one-off catastrophes whose burden falls disproportionately on those who can least bear it.

For now, our thoughts are with the many millions of people in the path of this storm and the first responders who are working so hard to protect them. May they all be safe.

For more information on the local hazards from #Florence, follow the @NWS offices on Twitter: @NWSCharlestonSC @NWSMoreheadCity @NWSRaleigh @NWSWilmingtonNC @NWSColumbia @NWSGSP @NWSWakefieldVA @NWSBlacksburg @NWS_BaltWash @NWS @NWSCharlestonWV

— National Hurricane Center (@NHC_Atlantic) September 11, 2018

Global Climate Summit: Thank Goodness for Clean Energy State Champs


Photo: Andy Dingle/Wikimedia Commons

This week, California is hosting a Global Climate Action Summit. The summit is intended to “bring leaders and people together from around the world to take ambition to the next level” and “celebrate the extraordinary achievements of states, regions, cities, companies, investors and citizens with respect to climate action.”

It couldn’t happen at a better time or a better place. The Trump administration is busy swinging a wrecking ball at the pillars of climate progress in the United States, including the Clean Power Plan, our nation’s first ever limits on carbon dioxide emissions from power plants, and the fuel economy/tailpipe emissions standards that cut carbon pollution from cars and trucks. And his administration is hatching a scheme to bail out aging coal plants that increasingly can’t compete against renewables or natural gas.

Given these actions at the federal level, the world community, which joined us in signing the historic Paris Agreement, can reasonably question our national commitment to combating climate change. And that’s why it is so important that the United States is hosting this summit, and can present many success stories to show that Donald Trump does not speak for this country when it comes to addressing climate change. There is so much to be proud of in the private sector, cities and towns, universities, and elsewhere, but I will focus on the particularly encouraging success at the state level. Here are some of the major state success stories that should be highlighted at the summit, along with areas where the summit should focus to build on and expand that success.

California—the gold standard

Today, Governor Brown signed an extremely ambitious and inspiring new law: a mandate of 100% carbon-free energy by 2045. This is a breathtaking standard for any state to adopt, and it is particularly transformative given that California would be the world’s fifth-largest economy were it a nation. This goal, if achieved, would put California on track for net-zero emissions by mid-century, the level of reduction that scientists across the globe have indicated is necessary for us to meet the goals of the Paris Agreement and prevent runaway climate change impacts. Moreover, California has solid policies in place that give it a major head start toward meeting this goal, including renewable energy standards, a low-carbon fuel standard, a cap-and-invest program, and many others.

The wind miracle in the Texas panhandle

Texas is ranked number one in wind energy generation in the United States; it generates more wind energy than all but five countries. Wind energy supplies about 15% of Texas’s electricity, enough to power over 6 million homes, with thousands more megawatts under construction. Texas’ success is derived from its strong, steady winds and large open spaces, and the foresight of state leaders to invest over $7 billion in transmission infrastructure to connect these open areas to population centers. Wind energy is so plentiful and cheap in Texas that some customers even get their electricity for free at night.

Sunny solar in North Carolina

In 2017, North Carolina was ranked second in the US for solar power by the Solar Energy Industries Association. It currently powers over 500,000 homes with solar, about 5% of its electricity generation, and that amount is projected to double in the next five years. North Carolina has made this progress with state incentives and renewable portfolio requirements, investments by utilities, and solar energy purchases from major in-state firms such as Apple and Ikea.

Offshore wind in Massachusetts

In the 19th century, Massachusetts was the world’s maritime leader in the whaling industry. In the 21st century, it is on its way to becoming the national leader in a new maritime industry: offshore wind. To take advantage of the strong, steady winds over the Atlantic Ocean, Massachusetts enacted a far-sighted law authorizing utility companies to purchase, after competitive bidding, approximately 1,600 megawatts of offshore wind, enough to power about a third of the homes in the state and supply about ten percent of its power. Massachusetts has followed this up by signing a twenty-year contract for the first 800-MW project, at a surprisingly low cost (a levelized 6.5 cents per kWh), with more projects to follow.

What more should states do on clean energy?

The astonishing success of renewable energy in these states and many others (e.g., New York, South Dakota, Washington, and Iowa), coupled with gains in energy efficiency and switching from coal to gas, is helping to dramatically drive down emissions from power plants, to the point where nationally we are about 28% below 2005 levels. This is solid progress, but much more needs to be done. Many states have renewable energy standards that require utilities to purchase certain percentages of renewable power; many of these targets can easily be ratcheted upward as the falling costs of wind, solar, and energy storage make much higher levels of renewables cost-effective. In addition, states can do much more to modernize their electric grids and build out transmission lines to make sure renewable power is used whenever the sun is shining and the wind is blowing, and invest in energy storage to capture that energy for when they are not.

Transportation is the Next Frontier

In contrast to the electric sector, the picture is very different for emissions from the transportation sector, which is now the largest source of emissions in the US. While there have been some increases in the efficiency of cars and trucks, in large part due to rules issued by President Obama, these gains have been mostly offset by increased vehicle miles traveled and shifting consumer preferences for SUVs and trucks, owing to sustained low gasoline prices.

It is here that states particularly need to step up. Twelve states have adopted California’s greenhouse gas emission standards for gas-powered cars, and nine have adopted “zero emission vehicle standards” that require higher sales of electric vehicles through 2025; other states should join these groups, as Colorado has indicated it intends to do.

And states can do much more to incentivize electric cars, buses, and trucks and enjoy dramatically cleaner air and less carbon pollution. One of the biggest barriers now is the higher up-front cost of electric vehicles. While falling battery costs are expected to bring electric vehicles to cost parity with gasoline-powered vehicles by the mid-2020s, we are not there yet. In the interim, states should help offset this higher cost with rebates, focusing particularly on EV customers of low and moderate income. States can also build out charging station networks, focusing particularly on making EVs convenient for those who don’t have a garage and can’t easily charge overnight at home. States can also direct electric utilities, which they regulate extensively, to offer EV-related services to customers, such as installing charging stations in homes and apartments, or offering discounted rates for charging at off-peak hours. Finally, states can lead by example by purchasing electric cars, buses, and trucks for their own needs.

A funding source will be needed to pay for the transition from gasoline-powered transportation to electricity. Here additional experimentation is needed. One promising example is a “cap and invest” program that is in place in California and is being considered by Northeast and Mid-Atlantic states. A cap-and-invest system would establish an overall cap on transportation emissions, require fuel distributors to purchase “allowances” for the right to sell transportation fuels, and use the funds to invest in cleaner forms of transportation.


We have a lot to be proud of when it comes to clean energy advances, the Trump administration notwithstanding. However, the federal rollback is so extensive, and time is so swiftly running out, that we will need states, and other key stakeholders such as cities, universities, businesses and others to step up the pace. The Global Summit will be a good way to mark progress, but its more important role is to stimulate ambition and jump start the next round of policies.


Why We Need to Humanize Chemists, and All Scientists


Silver Microscope Photo: Alexandra Gelle

Manifesto of a passionate chemistry PhD student, tired of having to fight prejudices when introducing herself.

Why humanizing scientists and their research is essential

Science has shaped our society and everyday life, and yet the public and many policymakers neglect, discredit, and underfund research and scientists due to negative perceptions of the field. Over the last few years, public trust in scientists has been challenged. According to recent studies by Fiske and Dupree, the public rates scientists as competent but does not regard them as warmly as it does doctors or nurses. Yet scientists need to be able to effectively communicate their research and engage with the public and policymakers to ensure that the decisions that impact all of us are based on evidence.

Graphics and tables are not enough to establish a relationship between scientists and society. The public needs emotional connections with scientists and scientists need the public’s trust to be able to disseminate reliable and pertinent research. In addition, although technology now provides wide access, fake and sensational news are more accessible and can damage scientists’ image. This is why restoring the public’s trust towards scientists and science is crucial.

What chemists can do for you

Have you ever wondered what medicine would be like without the molecules that have been carefully designed by chemists? How would engineers conceive of laptops and cellphones without the development of batteries and electrochemistry?

When introducing myself as a PhD student in chemistry, I often see fear, rejection, or incomprehension in people’s eyes. I have always thought chemistry was fascinating, entertaining, and useful. Unfortunately, in my experience, some members of the public seem reluctant and suspicious when the conversation turns to chemistry. Chemists are commonly pictured as environmental destroyers, eager for explosions, disconnected from the impacts of their laboratories and experiments. However, the reality is quite the opposite.

It would be a lie to say that fire and explosions are not part of every chemist’s life; however, chemists are pursuing a more noble goal: helping people by improving their health and quality of life, and preserving the environment. Chemists’ ultimate objective is to better understand the behavior of molecules and to use elements available on Earth to develop high-performance materials, new drugs, and more sustainable processes. Yet one of the most widely shared portrayals of chemistry in media outlets is the environmental and health damage caused by the misuse of scientific knowledge, such as chemical weapons.

While the public’s frustration and confusion are understandable, chemists should not be blamed for their discoveries but should instead work diligently for their ethical and just application. Chemistry, and science generally, are key to our lives, and the public often neglects their importance. However, the work of scientists is meaningless if not shared.

Why I decided to study chemistry


Chemists study reactions intending to develop new molecules or to enhance the efficiency of chemical processes. My PhD projects focus on the latter, in the field of catalysis. Building new molecules requires breaking and eventually forming bonds between atoms. Therefore, chemical reactions are often energy-intensive and generate large amounts of waste. In catalysis, chemical reactions can be sped up upon the addition of a substance, called a catalyst, which increases the efficiency of a chemical transformation. Moreover, catalysts can often be recycled and reused in other reactions.

My PhD focuses on the use of sunlight as an energy source and silver as a catalyst to promote widely used reactions. Catalysts that can be activated by sunlight are called photocatalysts and fall within the field of Green Chemistry, which aims to reduce the ecological footprint of chemical industries by developing more environmentally friendly reaction conditions and reducing chemical waste.

I always appreciate sharing my research and can do that more effectively when scientists and the public respect each other and work to ensure science is used for evidence-based policymaking, for knowledge-sharing, and for justice. Next time you see a chemist, or any other scientist, let’s talk about how we can learn from one another and be stronger together. How about we chat over a cup of caffeine (C8H10N4O2) extracted by dihydrogen monoxide (H2O) or a glass of ethanol (C2H6O)?


Originally from France, Alexandra Gellé moved to Montréal, QC, Canada to start her undergraduate degree in Chemistry in 2013. She is now a PhD student and is passionate about science communication and outreach. Alexandra is also the president of Pint of Science Canada, an international festival promoting science through speaker series in bars.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.


Hurricane Florence: Four Things You Should Know That Your Meteorologist is Truly Too Busy to Tell You


Hurricane Florence is currently making its way as a Category 4 storm toward the southeast coast and is expected to make landfall sometime on Thursday, most likely in North Carolina. Our hearts are with those who are looking at the storm’s predicted path and wondering what this means for their homes, families, and communities. As millions of residents in the storm’s path make preparations to stay safe, our hearts are also with the thousands of people who have faced similar risks in Texas, Florida, and Puerto Rico in the past year. If you are in the Carolinas, please do take care to heed local warnings and evacuation orders–and know that we are all hoping for your safety.

Florence, like any hurricane, is a fearsome storm. But the direction and northward extent of Florence’s path make it unusual, and the atmospheric and oceanic conditions in which Florence is brewing are contributing to the storm’s outsized strength for its location. With that in mind, let’s take a look at some of the climate dynamics that make Florence stand out amid our historical knowledge of Atlantic hurricanes.

ONE: Florence’s path is unusual—in a way that’s similar to Sandy’s

Atlantic hurricanes tend to develop off the coast of Africa, then move in a north/northwest direction. By the time they reach the position Florence was in a couple of days ago, they tend to take a hard right turn toward the north/northeast, staying well away from the US. In fact, as Brian McNoldy reported in the Washington Post, of the nearly 80 recorded storms that passed within 200 nautical miles of Florence’s position on Friday, none made landfall on the US coast.

Florence’s path, however, has been blocked by a ridge of high pressure in the atmosphere, which is essentially blocking the storm from moving northward and keeping it on a westward trajectory toward the coast instead.

A ridge of high pressure along the northeast coast of North America, shown here in orange, has prevented Hurricane Florence from making the typical northward turn of most hurricanes.

Six years ago, when Sandy slammed into the coast of New Jersey, a “blocking ridge” over the eastern half of northern North America prevented Sandy from moving north. Never before had we seen a hurricane take such a perpendicular path toward the Mid-Atlantic coastline. One important difference between the paths of Sandy and Florence, however, is that during Sandy, the blocking ridge also prevented a low-pressure storm system coming from the west from moving north, so the two storms collided (hence the “Superstorm Sandy” moniker).

TWO: Major hurricanes this far north are rare

The Southeast US is no stranger to hurricanes. The Carolinas have experienced dozens of hurricanes since modern record-keeping began in 1851. The vast majority of these hurricanes have been Category 1 storms; together the Carolinas have only been hit three times by a Category 4 storm or above. The last time North Carolina was hit by a Category 4 storm was over 60 years ago.

Why is this? Hurricanes require a supply of fuel in the form of warm sea surface temperatures. Historically, as storms moved northward they did so closer to the central Atlantic and they encountered progressively cooler temperatures and weakened. Not so with Florence. While temperatures off the coast of Africa, where most Atlantic hurricanes develop, are running cooler than average right now, Florence’s path, determined largely by the blocking ridge, has taken it westward into a wide swath of the Atlantic where temperatures are running 2-3 °C above normal. Because of that ridge, even as Florence’s latitude increases, it’s projected to stay within a zone of warm temperatures that will allow Florence to stay strong and indeed strengthen as it churns its way toward the coast.

Over the next few days, Hurricane Florence will encounter abnormally warm sea surface temperatures, which will enable it to remain strong as it churns toward the coast.

We have seen the effects of warmer than average temperatures on hurricanes in the recent past. Last summer, for example, Hurricane Harvey passed over Gulf of Mexico waters that were 2.7–7.2°F above average before slamming into the coast and dropping unprecedented amounts of rain on the Houston area. We know that warmer temperatures help to fuel hurricanes and that such conditions are more likely to occur in a warming world.

THREE: The expected storm surge will be amplified by higher average sea levels

The National Hurricane Center is expected to issue storm surge warnings tomorrow, but residents are already being cautioned that Florence’s storm surge could be life-threatening.

Storm surge is driven by several factors, but its primary driver is wind. As Florence makes its way over more than 1,000 miles of ocean, its winds push surface water toward the coast. That water piles up and creates a surge. The stronger the winds and the farther the storm travels, the bigger the surge. While some storms, like Harvey, cause most of their flooding through intense rainfall, for others, like Sandy, storm surge is the primary cause of flooding.

The last time a Category 4 hurricane made landfall in North Carolina was in 1954. Since then, sea level along the coast of the Carolinas has risen roughly 8 inches. That rise is already playing out in the form of increasingly frequent high tide flooding in the region. Charleston, for example, has seen the number of high tide flooding events more than quadruple just since the 1970s. And when it comes to storm surge, higher sea levels make for larger, farther-reaching surges.

Given that Florence is moving relatively slowly and is predicted to stall over the Southeast or Mid-Atlantic, the storm will likely remain along the coast during at least one high tide cycle. The timing of landfall relative to high tide remains to be seen, but the current state of sea level along the Carolina coast has the potential to add height to the storm surge, allowing it to reach farther inland than a storm like Florence would have reached historically.

FOUR: In a warmer world, the atmosphere can hold more moisture and there is increased potential for extreme rainfall during storms

Like the rest of the US, since the late 1950s, the Southeast has experienced a dramatic increase in the percentage of rain that falls during the heaviest events. This trend has been linked to human-caused climate change because warmer air holds more moisture. And in the case of Hurricane Harvey, human-caused warming was found to have made the storm’s record-breaking rainfall three times more likely and 15% more intense.

With Florence’s path through very warm waters, we can expect a lot of moisture with this storm. Current forecasts are predicting 10 to 15 inches of rain for much of North Carolina and Virginia in the next few days.

Parts of North Carolina and Virginia could see 10 to 15 inches of rain in the coming days.

Just as when Hurricane Matthew hit the region in 2016, the extreme precipitation expected during Hurricane Florence will be falling on already saturated ground. Stream gauges in inland areas of North Carolina and Virginia, where Florence could stall, are recording streamflow–or the flow of water in streams, which is primarily driven by rainfall amounts–“much above normal,” which calls into question just how much more rainfall they’ll be able to accommodate.

The combination of saturated soil and rivers, heavy rainfall, and elevated sea levels due to long-term sea level rise and storm surge could make it very difficult for floodwaters to drain after the storm has passed.

As I wrote this, Governor McMaster of South Carolina was telling residents of his state that they should expect more wind than with Hurricane Hugo and more rain than with Hurricane Matthew. As South Carolinian listeners would know, these storms each caused grave damage through their respective mechanisms. In that press conference, the Governor also ordered the mandatory evacuation of the state’s entire coastline. This means that over 1 million people will be fleeing the coast in that state, and more in North Carolina, Virginia and elsewhere. The threat to the coast is the obvious priority as this week gets underway. Later, the challenge of managing the impacts to North Carolina’s interior regions may need to take center stage, as the stalled storm deluges large areas.

People talk about the "calm before the storm" and, if we do things well, there will indeed be an eerie quiet along much of the southeastern United States later this week. In the meantime, as we scramble, hunker down, and prepare to ride out the latest in a terrible spate of hurricanes, we also hope that, unlike with Katrina, Sandy, Harvey, and Maria, we don't surface to find our communities fundamentally scarred.

We’ll be updating this blog post as conditions continue to evolve.

Sources: NASA; Climate Reanalyzer; National Hurricane Center

Puerto Rican Scientists and the Communities They Serve: “Resistance is Resilience”

UCS Blog - The Equation (text only) -

Photo: Juan Declet-Barreto

We are coming up on the one-year anniversary of the devastation caused by Hurricane María in Puerto Rico. As part of the Puerto Rican diaspora in the United States and like thousands more of my compatriots abroad, I spent a frustrating, depressing, and maddening year viewing the fiscal and climatic catastrophe unfold from afar, and collaborating with others in the diaspora and other sectors of American society to send emergency aid, advocate for immediate federal action, and make myself useful any way I could for Puerto Rico and the US Virgin Islands.

So it was especially rewarding for me to return last week to Puerto Rico for the first time since the hurricane. In my homeland, I was able to witness not only the incredible resilience of our people, but also their refusal to sit idly by and wallow in the misery left behind by María. Here’s what I saw.

As I drove through the north and northeastern towns of San Juan, Naguabo, and my beloved Luquillo, I talked to people who told me stories of how, in the absence of a coherent or timely federal and local government response, neighbors banded together to care for and feed each other, to remove debris from roadways, and to make treacherous trips to the nearby El Yunque rainforest to open up municipal water supply valves.

I was particularly impressed by the Coalición Pro Corredor Ecológico del Noreste, a local coalition of residents and scientists protecting coastal beaches and wetlands that serve as egg-laying grounds for the beautiful and endangered tinglar (leatherback turtle). The corredor provides several valuable ecological services, as it is an effective barrier against storm surge and coastal erosion, and its wetlands, beaches, coral reefs, bioluminescent lagoon, and forests attest to its great biodiversity. As Cristóbal Jiménez, the president of the coalition, told me, they considered not holding the annual Festival del Tinglar in 2018 due to the devastation caused the year before, thinking at first that people would be too overwhelmed to attend. But as soon as they started planning for it, the community turned out in record numbers to hold their festival and continue the defense of local flora and fauna and the valuable ecological and cultural services the corredor provides.

This, to me, is testament to the potential for communities to build resilience by banding together.

The coalition’s story is a great example of scientist-community collaborations built on decades of experience. And it’s a great example of the type of partnerships to advocate for a climate-resilient future that have developed in the post-María period.

In my time on the island, I was able to get a broader look at how scientist-community partnerships are organizing to construct and demand a climate-resilient and equitable reconstruction of the island’s infrastructure. UCS joined the leadership of Ciencia Puerto Rico (CienciaPR) and the American Association for the Advancement of Science – Caribbean Division (AAAS-CD) in a conference titled “Ciencia en Acción: Política Pública Puertorriqueña Apoyada por Evidencia” (Science in Action: Puerto Rican Public Policy Supported by Evidence). CienciaPR and AAAS-CD are scientific societies front and center in making scientists’ voices heard in decision-making around public policy in Puerto Rico, and the event was the kick-off for the Puerto Rico Science Policy Action Network (PR-SPAN).

While the UCS Science Network and I were invited to add our own experiences in science policy advocacy, I was humbled to learn more of the long-standing and deep commitment of boricua* experts to elevating their communities’ needs, but also saddened at how most of their expert recommendations have been sidelined or otherwise ignored for decades.

For example, renewable energy experts from the National Institute for Island Energy and Sustainability (INESI, in Spanish) have long been strong proponents of developing local energy sources like solar and wind to facilitate the transition from expensive, global warming-producing, and climate-vulnerable fossil fuel-burning electric infrastructure.

Dr. Elvira Cuevas, a terrestrial ecosystems ecologist at the University of Puerto Rico, reminded the audience of the urgency of taking action, and that building climate resilience is both our obligation and right: “If we want a Puerto Rico that is truly resilient, we cannot leave it in the hands of the universities and [other] organizations. Each and every one of us is responsible for demanding our rights.”

Marine scientist Dr. Aurelio Mercado from the University of Puerto Rico told us of the long history of the Puerto Rican government ignoring scientists' warnings about climate change and dismissing the need for hurricane preparedness. He recalled how science was sidelined in the decades leading up to Hurricanes Hugo (1989) and Irma and María (2017), with local government officials, decades before Hugo, dismissing the need to prepare for what they called "hypothetical hurricanes" of categories 3, 4, or 5. We are at risk of repeating that history as scientists warn that the San Juan international airport could be underwater by the next decade or so, warnings that so far remain unheeded.

But perhaps it was Dr. Braulio Quintero, urban ecologist and co-founder of the scientific non-profit ISER Caribe, who best described the Puerto Rican population's response to the government's so-called recovery plan: "Resilience requires resistance; resistance is resilience". Dr. Quintero is referring to the community- and science-driven mobilization of large swaths of Puerto Rican society against the anti-democratic impositions of fossil-fuel interests and the fiscal control board appointed by President Obama through the Puerto Rico Oversight, Management, and Economic Stability Act of 2016 (PROMESA).

PROMESA's fiscal austerity measures, together with the Trump and Rosselló administrations' commitment to fossil fuel interests, will keep Puerto Rico on the path of fiscal and climatic vulnerability that hit rock bottom so catastrophically after María.

So it's not hard to see two divergent visions for the future. The first, largely imposed on the Puerto Rican population by the Trump and Rosselló administrations without taking into account the climatic, fiscal, social, and economic challenges facing Puerto Rico and the Caribbean, insists on continuing the reliance on climate-changing fossil fuels for electricity production and fails to plan for climate impacts like increased temperatures, sea level rise, and more frequent and destructive hurricanes. The second, actively proposed and sought by Puerto Rican civil society, grassroots organizations and collectives, and scientific societies and advocates, demands a diversified and decarbonized power sector and a climate-resilient, equitable recovery that prioritizes the needs of the Puerto Rican population.

In light of the crossroads that Puerto Rico finds itself in, I go back to the question I have asked before: Are Puerto Ricans willing to allow a repeat of the errors of the past that put us on the path to fiscal and climate ruin, or will Puerto Rican society actively demand and work together towards developing an energy, housing, and economic infrastructure that responds to our present and future needs under a changing climate?

For Puerto Rican scientists and the communities they serve, the answer is clear: they are using community-driven science to demonstrate impacts and propose resilient solutions for the benefit of all, not just a few narrow—ahem, ahem, fossil-fuel—interests. UCS is proud to stand together with Puerto Rico and all climate-vulnerable communities to turn resistance into resilience.


*Boricua is the ancestral demonym for Puerto Ricans, from Borikén or Borinquen, given by the Taíno native peoples to the island later baptized “Puerto Rico” by the Spanish colonizers.

Photo: Juan Declet-Barreto

Community Choice Aggregation Puts Communities in Control of Their Electricity

UCS Blog - The Equation (text only) -

Rebecca Behrens, 2018 UCS Schneider Fellow

Keep your eyes and ears open for Community Choice Aggregation, already a major player for consumer energy choice in California and spreading rapidly. In the post below, 2018 UCS Schneider Fellow Rebecca Behrens explains how CCAs work, where CCAs are forming, and what you should be on the look-out for as more communities get involved.

It’s late summer, which means ice cream season is coming to an end. A coworker and I have made it a habit of exploring the (many) ice cream shops around our office each week, and for something as simple as ice cream, it’s amazing how many choices we have. I can choose what ice cream I want based on price, proximity, flavor, or even the company’s business practices.

This got me thinking: if I have so many choices for something as simple as ice cream, what about bigger choices in my life—like where my electricity comes from? Like most of the US, I’m served by one utility. If I don’t like the way they’re sourcing electricity or setting rates, I have limited options.

But that story has been changing, in part due to the growth in Community Choice Aggregation (“CCA”). CCAs offer an alternative to traditional utilities and are designed to give communities a voice in where their electricity comes from. In California, many CCAs are striving to provide their customers with more renewable energy at lower costs than traditional utilities. Let’s break down the what, when, where, how and why of this new body.

What are CCAs?

Community Choice Aggregation allows local governments to purchase electricity on behalf of their residents, aggregating the electricity needs of everyone in the community to increase purchasing power.

The investor-owned utility (“utility” or “IOU”) that used to supply and deliver electricity is still there, but it plays a different role. Now, the utility is just in charge of delivering the electricity through its transmission and distribution lines (the utility still owns and maintains the “poles and wires”) and billing customers. This partnership distinguishes a CCA from a municipally-owned utility, which takes over both electricity procurement and electricity delivery (aka the poles and wires).

CCAs are in charge of procuring electricity while the utilities are in charge of delivering the electricity to you. (Source: Cal-CCA)

When and where have CCAs formed?

So far, CCAs are allowed in seven states: Massachusetts, Rhode Island, New Jersey, New York, Ohio, Illinois and California. Within a state, the decision to form a CCA is up to the community and local government. California has seen the most recent growth in CCAs, so I’ll be using it as an example here, but know that CCA formation and growth looks a bit different in each state.

Most of the seven states that allow Community Choice Aggregation passed bills legalizing CCAs in the early 2000s: California passed AB 117 in 2002. However, it wasn’t until years later, in 2010, that the first CCA in California launched in Marin County.

Since 2010, the number of CCAs in California has grown significantly. In 2016, there were five CCAs serving 915,000 customers. In 2017, there were nine CCAs. By the end of 2018, there will be 20 CCAs, serving over 2.5 million customers. And more local governments are considering the option.

The regions CCAs serve in California as of September 2018. Because CCAs are growing quickly in California, this map changes quickly, too. (Source: Cal-CCA)

Even if no more CCAs launch after 2018, CCAs are expected to serve 16% of the electrical load in California in 2020. But, it’s highly likely more CCAs will launch in the coming years, which could put this number at over 50% in 2020.

How do CCAs work?

In California, once the local government votes to form a CCA, a nonprofit agency is formed to carry out its duties. The agency goes through a rigorous planning process, and once the CCA is ready to launch, it lines up its customers.

And who are those customers? Anybody who wants to be. CCAs are “opt-out” in California, and in most other states, meaning that the default is for customers to be automatically served by the CCA. Customers have 60 days to opt-out for free and are notified about the change four times before this deadline. After 60 days, customers can opt-out for a fee to account for the power the CCA had bought in advance for them.
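The opt-out rule described above can be sketched as a simple function. The 60-day window and four notifications come from the text; the function name and the fee amount are hypothetical placeholders (the actual fee depends on the power the CCA bought in advance):

```python
def opt_out_cost(days_since_enrollment: int, late_exit_fee: float = 25.0) -> float:
    """Illustrative sketch of the California CCA opt-out rule:
    free within the first 60 days, a fee afterward to account for
    power the CCA already purchased. The fee amount is hypothetical."""
    if days_since_enrollment <= 60:
        return 0.0
    return late_exit_fee

print(opt_out_cost(30))   # 0.0 -- still inside the free window
print(opt_out_cost(90))   # 25.0 -- past the window, the (hypothetical) fee applies
```

This is just a sketch of the rule's shape, not any CCA's actual billing logic.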

And that’s it! Customers are now served by the CCA. In California, if customers were receiving discounts because of particular circumstances, they will automatically continue receiving those discounts. This includes California Alternative Rates for Energy (“CARE”), Family Electric Rate Assistance Program (“FERA”) and Medical Baseline customers. Customers with rooftop solar systems who are on a net energy metering program are automatically enrolled to continue.

In terms of electricity service, as a CCA customer, nothing else changes. Your lights stay on, your TV still works, and your freezer stays cold.

The biggest difference is that the existence of CCAs allows customers more choice in the type of electricity they receive. Not only can customers choose between being served by the utility or the CCA, but if customers are unhappy with the electricity options or rates offered by their CCA, they can provide feedback to the CCA at its board meetings, which allow for public participation in California.

CCA communities can also benefit from the reinvestment of CCA profits, given that CCAs are nonprofits. CCAs can offer additional programs beyond what the utility offers. These could look like free energy efficiency audits, rebates for electric car charging stations, incentives for low-income customers to install solar, or really any program that helps customers better manage their electricity usage.

In some cases, customers could lose access to programs run by their utility by joining a CCA, although in California, most utility programming is still available to CCA customers. In any case, it’s smart to reach out to your local CCA and ask if you’ll still be eligible for programs you rely on.

Why do CCAs matter?

In California, every CCA (so far) has chosen to provide customers with more renewable energy than the competing utility and has done so at lower rates. However, how much new renewable energy CCAs are contributing to the grid varies a lot from community to community.

The devil is in the details here: A CCA that uses mostly short-term contracts to buy renewable energy or renewable energy credits (“RECs”) is likely buying from projects that already exist. Electricity purchases from existing renewable energy projects do not increase the supply of clean electricity on the grid, and customers that used to consume electricity from those renewable projects may now be consuming electricity from a dirtier source. This is called resource shuffling. On the other hand, a CCA that uses long-term contracts is helping new renewable projects develop, which means that more clean power is being added to the grid.

If you live in an area served by a CCA, it’s up to you to make sure your CCA is sourcing electricity in a way you support and providing programming you can use. Here are some questions you can ask to see how well a CCA is doing:

  1. Is the CCA providing more renewable energy than the competing utility, and are they sourcing their renewable energy from long-term contracts for energy and RECs? By buying “bundled” renewable energy through long-term contracts, CCAs can more directly support the development of additional renewable energy projects and add more clean electricity to the grid.
  2. Is the CCA making use of local resources and supporting the local community? Having a sustainable workforce policy and hiring locally and from unions can help bring the broader benefits of renewable energy to a community.
  3. Is the CCA leveraging grants and their revenue to provide programs designed to help customers reduce or better control their energy use? More renewable energy is just one piece of the puzzle; we need a host of solutions for a clean energy transition. Programs that invest in electric vehicle infrastructure and energy efficiency are equally important.
  4. Is the CCA proactively reaching out to its community? Programming needs to be accessible, useful and reach all members of the community—especially those that historically have not received the full benefits of energy programming and renewable energy.

CCAs have the potential to empower (and quite literally power) communities. But it’s up to residents to hold their CCAs accountable and ask them to provide equitable and fair climate solutions. By staying engaged and informed, you can make sure your CCA is providing your community with the best options.

CCAs are a growing movement in California, but they aren't the only way consumers are making choices about their electricity. While not every utility or state offers choices in electricity sourcing, it is worth seeing if yours does. You may even be surprised at what your options are: at home in Vermont, I can choose to buy Cow Power through my utility! What sets CCAs apart from other choices is their ability to localize decision making and let communities invest in what is best for themselves, which has made them a powerful new player at the table.

Photo: Zbynek Burival

Clinton Power Station: Even More Power Problems

UCS Blog - All Things Nuclear (text only) -

The Clinton Power Station is located 23 miles southeast of Bloomington, Illinois and has one General Electric boiling water reactor with a Mark III containment that began operating in 1987.

In December 2017, the Nuclear Regulatory Commission (NRC) dispatched a Special Inspection Team to the plant to investigate a transformer failure that prompted the operators to manually scram the reactor. That event nearly duplicated a transformer failure/manual scram event that happened at Clinton in December 2013.

The ink had scarcely dried on the NRC's special inspection report when Clinton experienced yet another electrical power problem. Some progress has been made: this time it did not involve a transformer failure causing the reactor to be shut down. This time, the reactor was already shut down when the power problem began. This time, the failures involved several workers over several days failing to follow several procedures, leaving an emergency power supply disabled. This time, as in the past, the NRC dispatched a special inspection team to figure out what went wrong.

Entering a Refueling Outage

The operators shut down Clinton on April 30, 2018, to enter an outage during which the reactor would be refueled. When the reactor is running, nearly the entire array of emergency equipment must be operable except for brief periods of time. During refueling, the list of emergency equipment required to remain operable is shortened, providing opportunities for components to be tested, inspected, and repaired as necessary.

The operators tripped the main generator on April 30 as part of the reactor shutdown process. When the generator was online, the electricity it produced went through the main transformers to the 345-kilovolt switchyard, where transmission lines carried it to the offsite power grid. The generator's output also flowed through the Unit Auxiliary Transformers to supply in-plant electrical needs. As shown in Figure 1, this supply to in-plant loads was unavailable with the main generator offline.

Fig. 1 (Source: NRC, color annotations by UCS)

On May 5, workers de-energized the Emergency Reserve Auxiliary Transformer (ERAT) shown on the left side of Figure 1 to support planned maintenance. Power for in-plant loads came from the 345-kilovolt switchyard through the Reserve Auxiliary Transformer (RAT).

At 9:36 pm on May 9, workers closed an electrical breaker to restore power from the RAT to 4.16-kilovolt Bus 1B1. Bus 1B1 had been removed from service for maintenance on it and the equipment powered from it. Emergency diesel generator 1B (EDG 1B) provided the backup power to Bus 1B1 in the event power from the main generator and offsite grid was lost. During the planned outage of Bus 1B1, EDG 1B had been intentionally disabled to prevent it from starting. This measure protects workers from contacting energized equipment if EDG 1B were to start unexpectedly.

Bus 1A1 remained in service during the time Bus 1B1 was unavailable. Bus 1A1 was also supplied with offsite power from the RAT, with EDG 1A in standby to provide backup power if needed. Safety equipment powered from Bus 1A1 cooled the reactor core and could provide makeup water if necessary.

Entering an Unsafe Condition

When power to Bus 1B1 was restored, procedures called for its backup power supply, EDG 1B, to be returned to service. A worker was sent out to place EDG 1B back in service. The emergency diesel generators (EDGs) are normally maintained in standby. Should power from the offsite grid be lost or an accident occur, the EDGs are designed to start up, reach speed, and begin supplying electrical power to their respective buses within a little more than ten seconds. To enable the large diesel engines to perform such rapid feats, the EDGs are equipped with support systems. One support system keeps the lubricating oil warm. The start air system supplies compressed air to help the engine shaft begin spinning. Another support system supplies cooling water to protect a running diesel engine from damage caused by overheating.
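The dependency just described, that an EDG is only truly operable when every one of its support systems is lined up, can be sketched as a simple check. The support system names follow the text; the data structure and function are illustrative, not plant software:

```python
# Illustrative operability check for an emergency diesel generator (EDG).
# The three support systems mirror those described in the text.
REQUIRED_SUPPORT = ("lube_oil_heating", "start_air", "cooling_water")

def edg_operable(support_status: dict) -> bool:
    """An EDG can perform its fast-start function only if every
    required support system is in service."""
    return all(support_status.get(system, False) for system in REQUIRED_SUPPORT)

# EDG 1B as it was left: declared in service, but start air valves closed.
edg_1b = {"lube_oil_heating": True, "start_air": False, "cooling_water": True}
print(edg_operable(edg_1b))  # False: closed start air valves make it inoperable
```

The point of the sketch is that "declared in service" and "operable" are not the same thing: one closed valve defeats the whole function.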

Because the cooling water system for EDG 1B was not yet returned to service, a supervisor directed the worker to keep the start air valves closed. The restoration procedure called for these valves to be opened and later checked to ensure they were open. But the supervisor was concerned that an inadvertent start of EDG 1B might damage it from overheating. EDG 1B was partially restored to service on May 9.

Late in the evening of May 10, a second supervisor directed a second worker to conduct another partial restoration of EDG 1B. The fuses for the lubricating oil system had been pulled. The worker reinserted the fuses to return the lubricating oil system for EDG 1B to service.

The second supervisor turned over duties to a third supervisor before the second worker completed the assigned partial restoration. Due to miscommunication, the third supervisor thought that all the EDG 1B restoration tasks had been completed. EDG 1B was declared back in service at 2:30 am on May 11.

EDG 1B may have been declared in service, but it was incapable of running because both its start air valves were closed. At that moment, it did not compromise safety because EDG 1A and the safety equipment it supplied were still available and that’s all that was required per regulations.

Safety was compromised at 11:28 pm on May 13 when the reactor core cooling pump supplied from Bus 1A1 was removed from service and the reactor core cooling pump supplied from Bus 1B1 was placed in operation. Bus 1B1 was supplied with offsite power through the RAT. But if the transformer had failed or the offsite power grid had been lost, the disabled EDG 1B would not have stepped in to save the day.

Safety was further compromised at 12:30 am on May 14 when Bus 1A1 was de-energized and all the safety equipment it supplied rendered useless.

Had the offsite power grid been lost or the RAT failed, Bus 1B1 and all the equipment it supplied would have been de-energized. Bus 1A1 and all the equipment it supplied had been intentionally de-energized. And while Bus 1C1, backed by EDG 1C, was energized, its primary safety component, the High Pressure Core Spray system, was unavailable due to ongoing maintenance. The plant was in a vulnerable situation expressly forbidden by its operating license requirements.

Fig. 2 (Source: NRC, color annotations by UCS)

Restoring a Safe Condition

At 3:03 pm on May 17, a worker conducting routine shift rounds found the start air valves for EDG 1B closed and notified the control room operators. The EDG restoration procedure was performed—in its entirety—to really and truly restore EDG 1B to service and achieve compliance with regulatory requirements.

NRC Findings and Sanctions

The NRC special inspection team determined that EDG 1B had been inoperable for over six days without the owner’s awareness. The NRC team additionally determined that for more than three days—from May 14 through May 17—a loss of the offsite power grid would have plunged the plant into a station blackout.

While a station blackout doomed three reactors at Fukushima Daiichi to meltdowns, the NRC team identified three ways workers could have responded to a station blackout at Clinton to avert such an outcome. First, they could have discovered the closed start air valves and opened them to recover EDG 1B. Second, they could have started EDG 1C and cross-connected it to re-energize Bus 1B1. While EDG 1C has a smaller capacity than EDG 1B, it had sufficient capacity to handle the loads needed during refueling. Third, they could have deployed the FLEX equipment added after Fukushima to cool the reactor core.

The NRC team calculated that had a station blackout occurred, it would have taken about five hours for the loss of cooling to heat up the water in the reactor vessel to the boiling point and that it would have taken about another twelve hours for water to boil away to uncover the reactor core and cause damage. Approximating this timeline helps the NRC assess how likely it would have been for workers to successfully intervene and avert disaster.
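A timeline like the NRC's can be approximated with a simple energy balance: decay heat first warms the water to boiling, then boils it away. The input values below are round figures we assumed for illustration, not the NRC's actual analysis inputs:

```python
# Back-of-envelope station-blackout heat-up timeline.
# All input values are assumed round numbers for illustration only.
decay_heat_w  = 5e6      # assumed decay heat ~2 weeks after shutdown, watts
water_mass_kg = 3e5      # assumed water inventory over the core, kg
delta_t_k     = 60.0     # assumed heat-up from ~40 C to boiling, kelvin
c_p           = 4186.0   # specific heat of water, J/(kg*K)
latent_heat   = 2.26e6   # latent heat of vaporization of water, J/kg
boiloff_kg    = 1e5      # assumed water that must boil off to uncover the core, kg

# Phase 1: sensible heating to the boiling point.
hours_to_boil = water_mass_kg * c_p * delta_t_k / decay_heat_w / 3600
# Phase 2: boiling away water until the core is uncovered.
hours_to_uncover = boiloff_kg * latent_heat / decay_heat_w / 3600

print(f"~{hours_to_boil:.0f} h to reach boiling")      # roughly 4 h
print(f"~{hours_to_uncover:.0f} h more to boil off")   # roughly 13 h
```

With these assumed inputs the sketch lands near the NRC's roughly five-plus-twelve-hour estimate, which is the point: even a crude energy balance shows operators would have had hours, not minutes, to intervene.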

The NRC team also identified factors lessening confidence that workers would have successfully intervened. The NRC team reported that five different workers entered the room housing EDG 1B a total of twelve times during the period it was disabled, for the express purpose of ensuring things were okay. The NRC team observed that the start air valves were located at about knee level and had been secured in the closed position with long black plastic straps. The NRC team also noted that there were two air pressure gauges, both reading zero: a clear indication that there was no start air pressure available for EDG 1B. The NRC team interviewed workers, but never learned why so many workers tasked with looking for signs of trouble overlooked so many signs of trouble so many times.

The NRC issued one Green finding for failing to notice that the EDG 1B start air valves were closed.

The NRC also issued a finding with a significance yet to be determined for the multiple failures to follow procedures that led to the start air valves for EDG 1B remaining closed.

UCS Perspective

The failures by the supervisors and workers can be explained, but not excused.

Like most U.S. nuclear power reactors, Clinton typically shuts down for refueling every 18 or 24 months. The refueling outages last about a month. Thus, Clinton runs about 95 percent of the time and refuels only about 5 percent of the time.
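The roughly 95/5 split follows directly from the cycle lengths given above (a rough illustration; actual outage lengths vary from cycle to cycle):

```python
# One month of refueling per 18- or 24-month operating run.
outage_fraction = {
    run_months: 1 / (run_months + 1)   # 1 outage month per (run + outage) months
    for run_months in (18, 24)
}
for run_months, frac in outage_fraction.items():
    print(f"{run_months}-month run: refueling ~{frac:.0%} of the time")
# Both cases land near 5 percent, matching the 95/5 split in the text.
```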

When the reactor is running, safety equipment like the EDGs is routinely removed from service, tested and/or repaired, and returned to service. Similarly, workers conduct rounds (walkdowns of plant areas looking for off-normal conditions) every shift of every day.

During refueling, the same restoration and rounds procedures are used for the same purposes, but under significantly different conditions. When the reactor is running, most safety systems are in service making it easier to concentrate on the tiny subset taken out of service. And it’s easier to spot when something is off-normal.

Many safety systems are removed from service concurrently during refueling. Restoring safety systems to service during refueling is complicated when support systems have not yet been restored to service. Performing rounds is complicated by so many systems and components being out of their normal condition that distinguishing acceptable off-normal from improper off-normal becomes challenging. So, it can be understood how trained and dedicated workers with good intentions can fail to rise to the challenge periodically.

This event illustrates two important safety truths: (1) despite best efforts, things can go wrong, and (2) the way to make best efforts better is to extract lessons learnable from near misses and implement effective fixes.

This event did not involve any actual loss of power to safety equipment or loss of reactor core cooling. This event did involve an increased potential for these losses.

The plant owner and the NRC took this increased potential seriously and examined why it had happened. Those examinations will identify barriers that failed and suggest upgrades to existing barriers or additional barriers to lessen the chances that a potential, or actual, event occurs.

On one hand, Clinton can be said to have dodged a bullet this time. On the other hand, the owner and NRC examining this near miss will help make Clinton—and other reactors—more bulletproof.

Department of Energy Walks Into a Fight About Subsidies

UCS Blog - The Equation (text only) -

Offshore wind gets started where policy supports it. Photo: M. Jacobs

There is a fight over power plant costs that could threaten grid reliability, and it's not as simple as the fight you have been hearing about. It wraps together three issues, each of which could cost billions of dollars. By throwing them together, policymakers are jeopardizing the electric grid reliability they say they are trying to protect. The three subjects in this fight are:

  1. Long-standing state policies for utility-owned generation in Kentucky, Ohio, Virginia, and West Virginia have been challenged as uneconomic;
  2. Renewable energy supports enacted by states are under attack;
  3. The federal government is pushing contradictory treatment for old coal plants.
A mess this big takes time

Presently, new political appointees in key agencies have tossed their respective agencies into a manufactured crisis that casts doubt on the basic means for paying power plants to keep the lights on. This uncertainty is a train wreck of unacknowledged and uncoordinated policies, verging on playing chicken with grid investments. In a hasty decision that invalidated the existing rules for reliability payments, a three-person majority at the Federal Energy Regulatory Commission, all appointed by President Trump, has made the continued operation of coal and nuclear plants less certain and new investment riskier. Meanwhile, DOE proposals to override the market and overpay coal plant owners threaten market investments.

Taxes on CO2 are a good idea for sorting out subsidies.

The owners of coal and nuclear plants opened this battle in 2013-2014 by arguing that the markets were paying too little and that, despite all evidence that cheap natural gas had lowered prices across all U.S. energy markets, the fault lay in state policies that supported the gradual adoption of renewable energy. Soon, states began to rescue nuclear plants with additional payments and the fighting widened. Economists predicted that subsidies would lead to more subsidies, though this is already what we call U.S. energy policy. The Trump administration soon proposed subsidies for coal plants, and a national debate broke out.

No one expects markets to function when subsidies keep uneconomic plants online and force supply to exceed demand. While the arguments to straighten this out will continue at the federal agencies and in the courts, here’s an explanation that should get you up to speed on how the economics and regulation that are meant to provide grid reliability are complicated by old policies colliding with market prices driven down by innovation.

The focus is on FERC. What is FERC?

The Federal Energy Regulatory Commission (FERC) is center stage for this drama. For over 20 years, FERC has championed competition between power plants as the best way to determine how much should be paid to plant owners. The fundamental role for FERC is to ensure that rates for buyers and sellers of energy are just and reasonable. FERC was created in the 1930s, after financial manipulation by an interstate electric company demonstrated the need for a federal system to regulate in conjunction with the long-standing state authority over power plant construction and electric company service to consumers.

FERC’s role in electricity markets addresses the interstate commerce of power plants once they are built. With considerable reliance on competition to sort out winners and losers, as well as set prices, FERC looks to ensure open access to the transmission system and the administration of fair markets. This assignment has been accepted in much of the U.S. by independent system operators, with names like ISO-New England, Southwest Power Pool, California ISO and PJM. In addition to markets, these organizations are key to maintaining the reliability of the power system.

How grid operators get into politics

PJM and the other grid operators are utilities regulated by FERC. Unlike most utilities, the grid operators own no power plants or wires. Instead, they run rule-making and stakeholder processes in which the policies that shape competition are made. These stakeholder and governance processes are not perfect. Where a grid operator covers multiple states, tensions can escalate: the grid operators in New England and PJM have entered a dramatic policy battle pitting state policies against the grid operators’ perception of economic subsidies for certain power plants. It was in these rule-making and stakeholder processes that PJM accepted the idea that state policies are subsidies.

PJM’s reliability and resource-adequacy functions involve consumers and utilities in 13 states and the District of Columbia. All grid operators create a demand forecast and a projection of needed future electricity supply. This is key to signaling the need for new investment in power plants or alternatives, which helps ensure reliability. PJM’s approach to ensuring adequate supply also addresses the challenges related to power plant utilization and revenues from energy sales that vary by hour and season. PJM calls this the Reliability Pricing Model, or RPM. It operates through a series of auctions that are expected to determine which existing plants remain operating in future years or close, and which new plants will be built.

Take a deep breath: we are diving in deep

There is so much investment in our electricity supply that it is unrealistic to think any fuel or power plant is entirely free of subsidies. PJM got into trouble by trying to pick sides while pretending it wasn’t doing so. In practice, the folks with subsidies from “the old days” are unhappy that there are new subsidies. What might have been a principled stand by PJM about the new subsidies and their impacts on a market must instead contend with many layers of subsidies and protections. We debated specific fuel subsidies and tax breaks, only to discover that the very basics of the old utility monopolies would be put on the table by FERC.

Since RPM pays for capacity that can produce energy (or reduce demand) separately from how many days or hours it actually runs, the debates over retiring coal plants, maintaining nuclear plants, and how to recognize subsidies all focus on the RPM market. (In the midst of these debates, many observers say all the tweaking and adjustments PJM makes prove the RPM is not actually a market… but that is another debate.) PJM started the debate over state actions in 2016, when legislatures in Illinois and New Jersey took steps to provide nuclear plants with additional revenues. This, along with earlier action by the Ohio Public Utilities Commission and the West Virginia Public Service Commission to protect coal plant owners from losing money in the energy market, led PJM to the position that state policies supporting existing plants could be suppressing RPM auction prices. At this stage, PJM is saying it has a problem with every state it serves (except Kentucky, but that may not last), as each has either a renewable portfolio standard in its laws, nuclear support in its laws, or a recent regulatory decision bailing out a coal plant.

The auction clearing price is applied to all generators in the auction, so PJM says it is keenly interested in preventing out-of-market revenues from supplementing the auction prices bid by generators, which would hide the generators’ true costs and suppress auction prices. However, there is a spectacular hidden exception to this pursuit of accuracy and fairness in auction bids and results. (Sorry, Kentucky.)
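To see why out-of-market revenues matter, here is a toy uniform-price auction sketched in Python. The plants, megawatts, and prices are hypothetical, and real RPM auctions are far more elaborate, but the mechanism is the same: a subsidized resource that bids near zero still clears, and it drags down the single clearing price paid to everyone.

```python
# Toy uniform-price capacity auction (hypothetical plants and prices).
# Offers are sorted cheapest-first; resources clear until demand is met,
# and the last (marginal) offer sets the single clearing price paid to all.

def clear_auction(offers, demand_mw):
    """offers: list of (name, mw, price_per_mw_day). Returns (cleared, price)."""
    cleared, total, clearing_price = [], 0, 0.0
    for name, mw, price in sorted(offers, key=lambda o: o[2]):
        if total >= demand_mw:
            break
        cleared.append(name)
        total += mw
        clearing_price = price  # marginal offer sets the price
    return cleared, clearing_price

offers = [("coal_A", 500, 120.0), ("gas_B", 500, 90.0), ("nuclear_C", 500, 140.0)]
print(clear_auction(offers, 1400))  # nuclear_C is marginal: price 140.0

# With out-of-market revenue, nuclear_C can afford to offer near zero. It
# still clears, but now coal_A is marginal and everyone is paid only 120.0.
subsidized = [("coal_A", 500, 120.0), ("gas_B", 500, 90.0), ("nuclear_C", 500, 0.0)]
print(clear_auction(subsidized, 1400))
```

The point of the sketch is that the subsidized bid does not just help its owner: it re-prices every megawatt in the auction.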

“Guaranteed revenues” sounds like a subsidy

PJM has long accepted the presence in its markets of a category of old plants (mostly coal) that receive state support through consumer bills and are protected from competition. These old plants are a legacy of monopoly utilities that have their costs repaid through state-approved rates paid in consumers’ electric bills. This remnant allows old generation that is owned by utilities and still paid through cost-recovery rules to automatically succeed in the capacity auction (i.e., PJM rules call this “a mechanism to guarantee that the resource will clear in the Base Residual Auction”). The effect of this provision in PJM’s rules is that it allows state-supported generation to bid low in the auction while receiving out-of-market revenues from state-sponsored payments made by consumers.

A rough estimate of the old plants protected in Kentucky, Ohio, Virginia and West Virginia is approximately 40,000 MW. Another measure is that over 100 power plants in PJM bid zero in the most recent capacity auction.

In its stakeholder process, PJM pushed to decide what kinds of subsidies it would tolerate and which it needed to “correct” or “adjust” so that the RPM auction would have correct prices. This all got out of hand when PJM requested permission from FERC to adjust bid prices for nuclear and renewable generation that receive out-of-market payments. The industry was not prepared for what happened next. The proposed “minimum offer price rules” were rejected in a split decision declaring that PJM did not go far enough to root out all out-of-market payments to generators of all kinds. The three Trump-appointed commissioners voting to reject PJM’s proposal also found that PJM could not rely on its existing rules, as those would result in rates that are not just and reasonable.

FERC makes a big splash

FERC instructed PJM to make several changes not proposed by any party to the case, and to do so quickly. In effect, FERC ordered PJM to reshape a market that distributes $6 billion to $10 billion a year, maintains reliability, and determines which coal plants close. Acknowledging this would be difficult, FERC nonetheless ordered that it be done in 90 days. (FERC has since granted a six-week extension.)

Does anyone know what happens next?

As of late August, PJM’s discussions with stakeholders have not been promising. There is no clarity on what counts as a subsidy. Is a municipally or cooperatively owned electric plant “subsidized”? Consumer-owned utilities see no overlap between their business model and the issues in this debate. If the U.S. Department of Energy orders payments to uneconomic old coal plants to keep them open, is that a subsidy that should be “corrected”? PJM has said yes: it intends to include any DOE-directed support to coal or nuclear plants in the bid-price re-setting, to protect the PJM auction from interference by subsidized bids. PJM has said things like “if the reason is national defense, then the payments should be made from a nation-wide fund.” It also told stakeholders on August 15 that “Out-of-market payments from any federal program adopted [after 3/21/16] will be subject to [adjustment through] Minimum Offer Price Rules, unless there is a clear statement of congressional intent indicating otherwise in the law creating the subsidy.”

In other words, PJM still believes that, as the regulated utility responsible for the process, it should decide which subsidies are OK and which are not. PJM wants to stick with its plan:

  • State payments from state laws establishing renewable portfolio standards are bad,
  • State payments for old coal plants that are paid for in rates are OK, because that’s the way we have always done it, and
  • New federal subsidies are bad, but old federal subsidies are OK.

FERC, the regulator, has said all the subsidies are bad. And of course DOE has said once, and will soon say again, that a new subsidy is good.

Now you are up to speed.

Photo: EarthCareNM

Amazon Deforestation in Brazil: What Does it Mean When There’s no Change?

UCS Blog - The Equation (text only) -

Photo: Brazilian things/Wikimedia Commons

I was recently invited by the editors of the journal Tropical Conservation Science to write an update of a 2013 article on deforestation in the Brazilian Amazon that I had published with Sarah Roquemore and Estrellita Fitzhugh. They asked me to review how deforestation has changed over the past five years. The most notable result, as you can see from the graph in the just-published article (open-access), is that overall it hasn’t changed. And that’s actually quite surprising.

During the late 90s and early 2000s the deforestation rate in the Brazilian Amazon averaged about 20,000 square kilometers per year, driven by the rapid expansion of cattle pasture and the commercial soybean industry. Then, starting around 2005, it began to drop rapidly, falling by 70% in just half a dozen years. This dramatic drop cut Brazil’s national global warming emissions very substantially, in addition to having important benefits for biodiversity and for the people of the Amazon basin.

Since then – essentially no net change. There have been small fluctuations up and down in the annual measurements of deforestation (up in three years and down in three years, to be specific) but it remains at basically the same level. In 2017 the annual loss of Amazon forest was 6,947 km2; that compares to 6,418 km2 in 2011.

Why is this surprising? Because in the same period, Brazilian politics has been incredibly chaotic. To cite the most striking developments during this turbulent period: one President has been impeached and removed from office; an ex-President (during whose administration the decrease in deforestation was achieved) has been jailed and prevented from running again; and politicians across the political spectrum have been implicated in the corruption scandal known as “Lava Jato” – or Car Wash. Not to mention a major economic depression, the passage of legislation weakening Brazil’s Forest Code, and the indictment of the world’s largest meatpacking company, JBS S.A., on charges relating both to deforestation and to selling tainted meat.

Why then, did deforestation remain essentially the same?

While there are many factors involved, the lack of change does seem to reflect the institutionalization of the reasons that caused deforestation to drop in the earlier period. These include regulations (and prosecutions) limiting the sale of beef and soy from deforested areas; increased transparency concerning who is deforesting and to whom they’re selling their beef and soy; improvements in efficiency which allowed farmers and ranchers to raise output without clearing more land; and underlying these, the development of a political movement, led by Brazilian NGOs, that made deforestation an important issue in national politics.

If the lack of change in deforestation is interesting, so is the way that the international media have covered it. My co-author Dora Chi and I reviewed news stories on Amazon deforestation (using Lexis-Nexis; our search found 134 print articles from 2013 through 2017) and discovered a common theme: the idea that although deforestation had fallen in earlier years, now it had gone back up. As our review showed, even though this interpretation isn’t borne out by the data, it was nonetheless quite frequently used in the media narratives about deforestation.

Perhaps this mis-interpretation simply reflects a common journalistic tendency to write “on the one hand… but on the other hand…” stories. Or maybe it’s that you can’t get a story into print if it says that there’s nothing new. It may also reflect our tendency to present data such as deforestation rates as percentages, without realizing how misleading successive percentage changes can be, because each one uses a different denominator. A quick example – if my income dropped by 50% last year, then turned around and increased by 50% this year – am I now back to where I was two years ago? No – I’m actually still 25% below that level.
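The arithmetic in that income example takes only a couple of lines to verify:

```python
# A 50% drop followed by a 50% rise does not return to the start,
# because the second percentage applies to a smaller denominator.
income = 100.0
after_drop = income * (1 - 0.50)      # 50.0
after_rise = after_drop * (1 + 0.50)  # 75.0: still 25% below the start
print(after_rise)  # 75.0
```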

So, both the lack of change in the data, and the mis-communication of its stability in the media, are notable phenomena. But there’s a third (non-)event worth noting, and that’s the fact that deforestation hasn’t dropped to zero, as it would have if the earlier trend had continued. This is a major failure in terms of its effect on climate change and efforts to rein in global emissions. It shows that Brazil’s political turbulence has had important consequences for the global environment.


One Year after Maria, Puerto Rico Deserves a Solid, Resilient, and Healthy Power System

UCS Blog - The Equation (text only) -

Lights in San Juan. Puerto Rico Photo: Paula García

Hurricane Maria, one of the most extreme climate events ever to devastate the island of Puerto Rico (PR), left tragic statistics in its wake: thousands of people killed; material damages of more than $90 billion, from which many people are still struggling to recover; hundreds of animals (abandoned, lost, and hurt) that are still looking for a home; and the largest power outage in US history, one that for a large swath of the population lasted for months.

While the lights have come back on for the majority of Puerto Ricans, the hurricane and the destruction it caused shone a spotlight on an electric power system that was on the edge of collapse and that today demands urgent investment. Today’s decisions about investment and management will define whether the system can survive, recover, and be resilient for the long term.

Earlier this month, Ciencia PR, the Caribbean division of the American Association for the Advancement of Science (AAAS-CD), and the Union of Concerned Scientists organized a “Science in Action” symposium, in which one of the questions explored focused on that very issue: What can we do to make sure that the island’s power system emerges solid, resilient, and healthy?

Here I share some of the key issues that emerged in a panel discussion between Lionel Orama of the National Institute of Island Energy and Sustainability (INESI, in Spanish), Agustín Carbó Lugo of ClimaTHINK, former commissioner of the Puerto Rican Energy Commission (CEPR), and me.

Science, innovation and energy panel. From left to right: Lionel Orama, Agustín Carbó and Paula García

A critical moment for the power sector
  • Maria was the straw that broke the camel’s back for an electricity sector that was already on the verge of collapse. The Puerto Rico Electric Power Authority (PREPA, or AEE in Spanish) was already in an extreme fiscal crisis, with $9 billion in debt. This contributed to PREPA underinvesting in infrastructure and in the maintenance of facilities and equipment. When it arrived, the hurricane knocked down 80% of the power poles and all of the transmission lines, leaving the island’s 3.4 million inhabitants in the dark.
  • For years, PREPA has clung to the use of fossil fuels, forcing Puerto Ricans to depend on fuel imports, exposing them to swings in fuel prices, and subjecting them to the financial stress associated with operating power plants dependent on oil, coal and natural gas. The lack of an energy mix that was diversified, decentralized, and free of the dependence on imported fossil fuels has prolonged even more the recovery of—and confidence in—the energy services provided by PREPA.
  • The privatization of PREPA is adding to the fiscal and operational uncertainty. At the beginning of the year, Governor Ricardo Rosselló signed into law the privatization of PREPA. This privatization will define who generates the electricity, from what sources, and at what prices; so far there’s total uncertainty about the answers to these questions and the impact they’ll have on island residents.
  • Likewise, for years the island lacked an oversight body to ensure transparency and the optimal functioning of PREPA, until the CEPR was created in 2014. Despite its importance, the commission’s work is now threatened by a new law recently signed by Gov. Rosselló.
The transformation that Puerto Ricans deserve
  • The voices of the scientific community and civil society need to be reflected in the development of the utility’s “integrated resource plan” (IRP). Having them at the table is key for making sure that decisions made are informed by solid technical analyses that respond to the needs of the communities. INESI is one of the organizations contributing to this effort.
  • A solid system needs to consider diversification and resilience. It’s crucial to reduce dependence on fossil fuel imports, diversify generation to incorporate local sources of energy (like solar and wind), upgrade electrical distribution systems, and integrate microgrids and energy storage systems to increase the grid’s reliability and meet critical needs (at health centers, in emergency shelters, and for water pumping systems, for example). Reducing energy consumption through energy efficiency programs is also crucial. All of this should be guided by principles of transparency and affordability.
  • A healthy system should benefit us all. Emissions of heat-trapping gases (like carbon dioxide and methane) from power plants based on fossil fuels (like oil, coal and natural gas) just worsen the effects of climate change, like hurricanes, floods, and droughts that become ever more devastating. Burning fossil fuels also emits a number of air pollutants (like sulfur dioxide, nitrogen oxides and particulate matter) that can have big impacts on our health. It’s vital that we make the transition to clean energy as quickly as possible.
Energy, climate, and health: An equation that affects us all

My visit to the Isla del Encanto affected me deeply. Interacting with some of the island’s experts on energy, environment, and health reconfirmed for me that these variables are intrinsically linked at the local level. I return to Boston inspired by all of the work led by the symposium’s organizers and participants, and motivated to collaborate with them on these themes, which have an impact not just locally but globally.

While climate change affects us all, some communities are more vulnerable to the bad decisions that others have taken for them. Hopefully the power that comes from working together can help us to have an increasingly strong voice for urgent action to address climate change, for the sake of our fellow humans, for our fellow living beings, and for our one planet, Earth.


Grace. Brought from a shelter in PR to one in MA for adoption.

*NOTE: As the beginning of this post mentions, the hurricane left hundreds of animals (abandoned, lost, and hurt) in need of homes. The island’s shelters have limited capacity (both physical and financial) and need volunteers that can take animals to shelters in different parts of the US. For those who travel to Puerto Rico and are interested in helping out, All Sato Rescue can fill you in. I brought home Grace, an adorable puppy that will soon be up for adoption via Buddy Dog.

This blog is available in Spanish here.


Paula García

Will Happer, a Climate Science Denier, Joins the White House

UCS Blog - The Equation (text only) -

Photo: Gage Skidmore/Flickr

News broke Tuesday that Dr. William Happer, a skeptic of climate science and professor emeritus of physics at Princeton University, has joined the National Security Council to direct an emerging-technologies portfolio. The scope of his responsibilities and the power he will wield remain unclear, as the position appears to be newly created. However, Dr. Happer’s public condemnation of the scientific consensus around climate change (a field in which he is not an expert) is cause for serious concern, especially given the National Security Council’s role in setting high-level foreign policy and the growing threat climate change poses to our nation’s security. Yet again, the White House has elevated an individual who denies the science around climate change to a position of power; individuals such as Happer are not the exception in this administration but the rule.

Dr. Happer is known for his important contributions to the field of modern atomic physics. He has been on the faculty of Princeton University since 1980, during which time he has also been active in government service, having, for example, served as the Department of Energy’s director of energy research during the George H.W. Bush administration.

However, Dr. Happer is also known for his dismissal of the scientific consensus around climate change, making scientifically unfounded statements that more carbon dioxide will be a net benefit to society. He has also questioned the science underpinning the Paris Agreement (a worldwide commitment to reduce global warming emissions and limit the increase in global temperature to well below 2 degrees Celsius) and recommended withdrawing from it. Such statements caught the attention of the Trump administration, which considered Dr. Happer for the post of director of the White House Office of Science and Technology Policy (OSTP), a post Dr. Kelvin Droegemeier was ultimately nominated to fill. Instead, the administration named him to his new role directing the emerging-technologies portfolio.

In his new position, Dr. Happer is now within earshot of the President and his closest climate and energy advisors. While it is unclear what his specific responsibilities will be, we will be watching to see whether he advocates for technologies that help reduce carbon emissions in line with climate goals and that help make the world a safer place. The White House must soon specify exactly what Dr. Happer will be working on. We will also be watching to ensure that if his role drifts into climate-related issues, such as policies around our climate, the ocean, the Arctic and Antarctic, and Earth observations, he relies on and accurately conveys the latest science, drawing on the expertise of the many excellent agency scientists and scientific resources available to him.


A un año de María, Puerto Rico merece un sistema eléctrico sólido, resiliente y saludable

UCS Blog - The Equation (text only) -

Luces en San Juan desde la playa. Puerto Rico

El huracán María, uno de los eventos climáticos más extremos que ha devastado la isla de Puerto Rico (PR), dejó trágicas cifras: miles de personas muertas, daños materiales por más de 90.000 millones de dólares de los que muchas personas aún luchan por recuperarse, cientos de animales (abandonados, perdidos y heridos) que aún buscan un hogar, así como el mayor apagón estatal en la historia de los EE.UU. que para un amplio sector de la población duró incluso por meses.

Aunque la luz ha vuelto para la mayoría de los puertorriqueños, el huracán y la destrucción que dejó a su paso pusieron en evidencia un sistema eléctrico que se encontraba al borde del colapso y que hoy en día requiere una inversión urgente en infraestructura. Estas decisiones de inversión y manejo definirán si el sistema podrá sobrevivir, recuperarse y ser resiliente en el largo plazo.

Ciencia PR, la División del Caribe de la Asociación Americana para el Avance de la Ciencia (AAAS-CD, por sus siglas en inglés), y la Union of Concerned Scientists organizaron el primer  fin de semana de septiembre un simposio llamado “Ciencia en Acción” en el que una de las preguntas exploradas giró en torno a ¿qué podemos hacer para que el sistema energético emerja sólido, resiliente y saludable?

Acá les comparto algunos de los temas claves que discutimos con Lionel Orama del Instituto Nacional de Energía y Sostenibilidad Isleña (INESI) y Agustín Carbó Lugo de ClimaTHINK y ex comisionado de la Comisión de Energía de PR (CEPR), en el panel donde abordamos esta pregunta.

Panel sobre ciencia, innovación y energía. De izquierda a derecha: Lionel Orama, Agustín Carbó y Paula García

Un momento crítico para el sector eléctrico
  • La llegada de María fue la gota que derramó la copa de un sector eléctrico que estaba ya al borde del colapso. La Autoridad de Energía Eléctrica (AEE) de Puerto Rico se encontraba ya en una crisis fiscal extrema con una deuda en bonos por $9.000 millones de dólares. Esto contribuyó a que la AEE realizara una precaria inversión en infraestructura y un escaso mantenimiento de instalaciones y equipos. A su llegada, el huracán derribó el 80% de los postes de energía y todas las líneas de transmisión dejando a oscuras a los 3,4 millones de habitantes de la isla.
  • Por años, la AEE se ha adherido de forma sorda al uso de combustibles fósiles, sometiendo a los puertorriqueños a depender de la importación de estos combustibles, de los vaivenes en sus precios, y al estrés financiero asociado con la operación de sus plantas que funcionan con petróleo, carbón y gas natural. La falta de una matriz energética diversificada, descentralizada y que no dependa de la importación de combustibles fósiles sólo ha dilatado aún más la recuperación y confiabilidad del servicio energético prestado por la AEE (también conocida como PREPA por sus siglas en inglés).
  • La privatización de la AEE solo contribuye a generar más incertidumbre fiscal y operacional. Al inicio de este año, el Gobernador Ricardo Rosselló firmó una ley mediante la cual privatizará la AEE. Esta privatización conlleva preguntas claves en cuanto a ¿quién genera la electricidad?, ¿de qué fuentes? y ¿a qué precio?; a la fecha se tiene total incertidumbre sobre las respuestas a estas preguntas y el impacto que tendrán en los residentes de la isla.
  • Así mismo, la isla careció por años de un ente de control que regulara la transparencia y el óptimo funcionamiento de la AEE hasta que se creó la CEPR en el 2014. A pesar de su importancia, su función se está viendo amenazada con la nueva ley recientemente firmada por el Gobernador Rosselló.
La transformación que los puertorriqueños merecen
  • Las voces de la comunidad científica y de la sociedad civil deben verse reflejadas en la actualización del Plan Integrado de Recursos (IRP, por sus siglas en inglés). Su puesto en la mesa es fundamental para que las decisiones que se tomen estén informadas por análisis técnicos sólidos que respondan a las necesidades de las comunidades. El INESI es una de las organizaciones que está contribuyendo con esta tarea.
  • Un sistema sólido debe contemplar diversificación y resiliencia. Es necesario reducir la dependencia en importaciones de combustibles fósiles, diversificar la generación a fuentes locales de energía (por ejemplo, solar y eólica), actualizar las redes de distribución e integrar micro redes y sistemas de almacenamiento energético para incrementar la confiabilidad de la red y brindar servicios críticos (por ejemplo, para centros de salud, refugios y sistemas de bombeo de agua), así como reducir el consumo energético por medio de programas de eficiencia energética. Todo esto debe estar guiado por principios de transparencia y asequibilidad.
  • Un sistema saludable debe beneficiarnos a todos. Las emisiones de gases de efecto invernadero (como el dióxido de carbono y el gas metano) provenientes de plantas termoeléctricas que funcionan con combustibles fósiles (como el petróleo, carbón y el gas natural) sólo harán que los efectos del cambio climático como la intensificación de huracanes, inundaciones y sequías sean cada vez más devastadores. Adicionalmente, la quema de combustibles fósiles emite un número de contaminantes del aire (como dióxido de azufre, óxidos de nitrógeno y material particulado) que son altamente nocivos para nuestra salud. Es vital que la transición a fuentes de energía limpia se haga a la mayor celeridad posible.
Energía, clima y salud. Una ecuación que nos afecta a todos

Esta visita a la Isla del Encanto me impactó profundamente. Interactuar con algunos de sus expertos en temas energéticos, ambientales y de salud reconfirmó cómo estas variables están intrínsecamente ligadas a nivel local. Regreso a Boston inspirada por todo el trabajo liderado por los organizadores y asistentes del simposio, y motivada a colaborar con ellos en estos temas que tienen un impacto tanto local como global. A pesar de que el cambio climático nos afecta a todos, hay comunidades más vulnerables a las malas decisiones que otros han tomado antes que ellos. Ojalá el poder de la unión nos ayude a tener una voz cada vez más fuerte para tomar acción urgente frente al cambio climático, por nuestros hermanos humanos, por nuestros otros hermanos vivientes, por nuestro único planeta Tierra.


Grace. Traída de un refugio en PR a uno en MA para adopción

*NOTA. Como mencioné al inicio del blog, el huracán dejó cientos de animales (abandonados, perdidos y heridos) que aún buscan un hogar. Los refugios de la isla se encuentran con limitada capacidad (de espacio y económica) y necesitan voluntarios que puedan llevar consigo animalitos a refugios en diferentes partes de EE.UU. Para quienes viajen a Puerto Rico y estén interesados en colaborar, All Sato Rescue puede brindarles más información. Yo traje conmigo a Grace, una perrita adorable que pronto podrá ser adoptada en Buddy Dog.

Paula García

California’s Clean Fuel Policies Clear Roadblocks to Electric Vehicles

UCS Blog - The Equation (text only) -

Photo: wellphoto/iStockphoto

The fight against climate change will be won or lost depending on how successful we are at decarbonizing the transportation sector. Transportation is the largest source of the carbon dioxide emissions responsible for climate change in both the United States and California, and while emissions from electricity generation have been falling, emissions from transportation have been rising. Getting these emissions in check requires steadily more efficient conventional vehicles, a rapid transition to electric vehicles, and cleaner fuels that cut the carbon emissions of the fuel used by all our vehicles.

California’s low carbon fuel standard (LCFS) is a critically important policy for making cleaner fuels available to all drivers. But the LCFS is doing more than just offering incentives for fuel producers to blend low-carbon biofuels into gasoline and diesel. The policy is also accelerating the availability of electricity as a transportation fuel. With the California Air Resources Board (CARB) considering a package of amendments in September to strengthen and extend the LCFS, the policy’s ability to support the transition to electric vehicles is finally coming into focus.

More EVs mean more progress cleaning up fuels

The central element of this year’s amendments to the LCFS is a new, ambitious target that will double the required reduction in carbon intensity, from 10 percent by 2020 to 20 percent by 2030. This target is only feasible because electricity is becoming a more common clean fuel in California as more drivers opt to buy electric vehicles. The chart below comes from a study we recently commissioned, showing the large share of emissions reductions from different types of electric vehicles. The yellow wedge illustrates the growing importance of passenger vehicles fueled with electricity—battery and plug-in hybrid electric vehicles—while the hashed yellow shows medium- and heavy-duty electric vehicles, like transit buses and delivery vehicles, and the brown shows hydrogen fuel cells.

This chart shows which fuels accounted for emissions reductions in the “Steady Progress” scenario of a study UCS, NextGen, and Ceres commissioned (see the full report or two-page summary for more details).

EVs and LCFS: a mutually beneficial relationship

While more EVs make higher LCFS targets achievable, the LCFS in turn is accelerating the transition to EVs. Under the LCFS, fuels cleaner than the standard generate credits and more polluting fuels generate deficits. Major fuel suppliers such as oil refineries comply with the standard by accumulating enough credits from clean fuels to cover the deficits generated by the gasoline and diesel they sell. They can generate credits by blending low carbon sources of ethanol into gasoline or biodiesel into diesel fuel, or they can buy credits generated by other transportation fuel producers. Electricity is one of the cleaner fuels, and since EVs are far less polluting than gasoline and diesel vehicles, they can generate a large number of credits, which translates into a lot of money.
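As a rough sketch of the credit-and-deficit mechanics described above (all numbers here are hypothetical; actual CARB accounting uses fuel-specific carbon-intensity pathways and energy economy ratios):

```python
# Simplified LCFS credit accounting: fuels below the carbon-intensity (CI)
# standard earn credits, fuels above it generate deficits.
# All values are hypothetical, for illustration only.
CI_STANDARD = 90.0  # gCO2e per MJ of fuel energy


def lcfs_credits(ci_fuel_g_per_mj, energy_mj):
    """Return tonnes of CO2e: positive = credits, negative = deficits."""
    return (CI_STANDARD - ci_fuel_g_per_mj) * energy_mj / 1e6


# A low-carbon fuel such as grid electricity earns credits...
assert lcfs_credits(30.0, 1_000_000) > 0
# ...while gasoline dirtier than the standard generates deficits
# that must be covered by purchased or banked credits.
assert lcfs_credits(100.0, 1_000_000) < 0
```

The key design point is that the credit total scales with both how clean the fuel is relative to the standard and how much of it is dispensed, which is why growing EV electricity sales translate directly into growing credit revenue.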

In 2016 the LCFS generated $92 million that supported transportation electrification in a variety of ways, from funding consumer EV rebates to making electric buses more cost-competitive. The total value of LCFS EV credits will grow as more EVs hit the road. As we describe in our recent fact sheet, the cumulative total is expected to add up to $4 billion of support for electrification between 2017 and 2030.

However, electricity is a different kind of fuel than ethanol or biodiesel. You can’t blend electricity into gasoline or diesel, and when you charge an EV at home, the bill doesn’t itemize the electricity used to charge your EV versus powering your refrigerator. CARB has therefore developed rules for credit generation tailored to the circumstances of each type of electric vehicle. For example, transit agencies using electric buses generate LCFS credits, which is helping transit agencies lead the way on medium- and heavy-duty electrification; the credits help make electric transit buses cost-effective, earning agencies about $9,000 per year for each electric bus in their fleets. But while a transit agency can register with CARB to generate credits, it’s not practical for every individual EV owner to do so, so credits for residential charging are managed by CARB and the utilities. CARB is considering amendments that make important changes in how these residential charging credits are handled.

LCFS credits can make electric cars more affordable via new point-of-purchase rebates

Renewable energy maximizes the benefits of EVs. The emissions associated with driving an electric vehicle depend on the source of the electricity. California’s grid is cleaner than average and has been getting cleaner in recent years, which is why an EV is so much cleaner than a gasoline-powered car. But powering an EV with renewable power will reduce emissions further, as will smart charging, which schedules an EV’s charging to take advantage of low-cost and/or low-carbon electricity. In the 2018 amendments, CARB is proposing changes that allow the use of renewable power from remote sources and establish rules recognizing the benefits of smart charging for EVs. Together these changes allow more people to use low-carbon sources for charging and will deliver even greater climate benefits than EVs charged on average electricity.

LCFS will start supporting hydrogen and DC fast charging infrastructure. One of the most surprising changes in the 2018 LCFS amendments is a proposal to grant LCFS credits based on infrastructure capacity in addition to delivered fuel for hydrogen and DC fast charging. This is a significant change, responding to a recent executive order requiring that “all State entities work with the private sector and all appropriate levels of government to spur the construction and installation of 200 hydrogen fueling stations and 250,000 zero-emission vehicle chargers, including 10,000 direct current fast chargers, by 2025.”

Hydrogen fuel cell vehicles address some of the limitations of battery electric vehicles, particularly for larger long-range vehicles. But the longer range and quicker refueling of hydrogen vehicles will be of little value without an adequate network of hydrogen stations. And as long as there are very few hydrogen vehicles on the road, hydrogen stations will have very few customers, making a difficult business case for companies that have the expertise to build and operate such stations. To address this challenge, CARB is proposing a program to run from 2019 to 2025 that would allow hydrogen stations to claim LCFS credits based on a station’s fueling capacity for its first 15 years of operation. This should substantially improve the economics of building and operating a hydrogen fueling station and help meet the goal of getting 200 hydrogen fueling stations up and running by 2025. The program is capped at 2.5 percent of overall LCFS demand for clean fuel credits, to ensure it does not substantially erode demand for other clean fuels, and will be reviewed at the end of 2025.

Similar treatment is being extended to DC fast charging stations. While most battery electric vehicles are charged at home, some people can’t do this, for example if they live in an apartment building or a house without a designated parking space where they can install a charger. DC fast charging makes it possible to quickly recharge an EV, which will help people without home charging and help all EV drivers on longer trips, making EVs an attractive choice for even more people. Like hydrogen fueling stations, DC fast charging infrastructure will see limited utilization in its early years because EVs are still a small share of cars on the road. As with the hydrogen provision, CARB is proposing a program that would allow DC fast charging equipment operators to claim LCFS credits based on infrastructure for their first five years of operation. The program is capped at 2.5 percent of overall LCFS demand for clean fuel credits, and total infrastructure-based credits received by DC fast charging equipment would be limited to the installation cost of the station, less any grants received. This program should substantially improve the economics of building DC fast charging equipment and support the goal of having 10,000 DC fast chargers deployed by 2025.

The LCFS amendments modernize and improve the program. A lot has changed since the LCFS was first adopted in 2010. UCS has been actively involved in the rulemaking process throughout the program’s history and has worked with CARB and other stakeholders on the amendments. We are confident the proposed amendments will strengthen the program, building on what worked, addressing challenges that have arisen, and adding new provisions to meet new challenges. We urge CARB to finalize these amendments when it meets in September.  California needs not just cleaner vehicles, but also cleaner fuels. The LCFS achieves this goal.


Zinke Attends Pacific Islands Forum, Ignores Their Biggest Concern

UCS Blog - The Equation (text only) -

Photo: Christopher Michel/CC BY 2.0 (Wikimedia)

This week, Interior Secretary Ryan Zinke heads the United States delegation to the Pacific Islands Forum Leaders’ Session on the island of Nauru on September 4, 2018, an annual gathering of dozens of Pacific Island leaders and partners. In the Interior Department press release, Zinke noted that the Pacific Islands are strategically important and that he wants to discuss trade and the rule of law. He did not indicate any interest in discussing the impacts of climate change in the Pacific Islands region – dramatic impacts that his own agency described in a publication earlier this year.

The US Geological Survey (USGS), the lead science agency within the US Department of the Interior, published a report in April with an ominous headline: “Many Low-Lying Atoll Islands Will Be Uninhabitable by Mid-21st Century.” The study concluded that due to rising sea levels, flooding, and salt-water intrusion into aquifers there will be no potable water on thousands of islands in the Pacific “no later than the middle of the 21st century,” rendering those islands uninhabitable. Thirty to forty years from today, entire island cultures may have to leave their homelands for good because we have burned too much fossil fuel.

In the meantime, they urgently need assistance to adapt to rising seas, build resilience, and develop the infrastructure that could save lives and livelihoods before and during relocation. Ironically, the Interior Department, the agency that released this study, stands in the way of the assistance these islands so desperately need.

The USGS study, a collaboration with the National Oceanic and Atmospheric Administration, the University of Hawaii, and other partners, was funded by the Department of Defense, an agency that has become increasingly concerned about the impacts of climate change upon national security. The research took place on an atoll in the Republic of the Marshall Islands, a country that was tragically irradiated by American nuclear weapons testing in the 1940s and 50s. A series of international agreements since those years have attempted to provide adequate restitution for the people of the Marshall Islands, with little success. The current agreement is a “Compact of Free Association” that was enacted into law by Congress in 1986 and has been periodically updated.

Under this Compact, the United States agreed to provide the Marshall Islands with security and defense as well as financial assistance. The Interior Department is the agency charged with overseeing the grant assistance and aid to meet a variety of needs, including health, education, infrastructure, and disaster preparedness.

This recent study from the Interior Department has dramatically raised the stakes for those badly-needed resources, particularly for addressing resilience and disaster preparedness. Until now, the Marshallese thought they had until the end of this century before their homeland could be uninhabitable. This report shows that they may have far less time to adapt, and they urgently need the American assistance that has been promised.

Unfortunately, Secretary Ryan Zinke’s agency has eliminated all climate change language from Interior’s strategic plan, censored press releases and deleted references to climate change on agency websites, promoted amateurish climate denial literature, and advanced an aggressive fossil fuel agenda despite its role in accelerating climate change. To make matters worse, the Assistant Secretary for Insular Affairs, the person responsible for overseeing the Interior Department’s assistance to the Marshall Islands, is a protégé of Charles and David Koch, the oil billionaires best known for funding misinformation campaigns to undercut efforts to address climate change.

The political leaders at the agency have staked out a position contrary to established scientific evidence and in denial of the problem the Marshall Islanders face with every high tide.

The implications of this study go well beyond the Marshall Islands. While there are over a thousand low-lying islands in the Republic of the Marshall Islands, there are thousands more elsewhere in the Pacific and Indian oceans. As the report states, “These findings have relevance not only to populated atoll islands in the Marshall Islands, but also to those in the Caroline Islands, Cook Islands, Gilbert Islands, Line Islands, Society Islands, Spratly Islands, Maldives, Seychelles, and Northwestern Hawaiian Islands.”

Climate change poses the very same threat to Alaska Native villages as permafrost thaws beneath their feet and waves devour meters of land each year. For this reason, Alaska Natives and Marshall Islanders have frequently participated together on climate change panels—including one that I organized during the climate negotiations in Paris in 2015. In my view, bearing witness to their stories and the risks they face in these places is essential to understanding the true threat of climate change.

Unfortunately, I’ve seen firsthand that Secretary Zinke and the Trump Administration have chosen to turn a blind eye to these dangers. How much suffering and loss of life and property will snap these ideologues out of their fossil fuel fever dream?  At what point along the arc of disaster will they wake up to their responsibilities as public servants?

At least the scientists at Interior are on the job. The USGS paper released in April was an important addition to the scientific literature describing the real risks of climate change. “Such information is key to assess multiple hazards and prioritize efforts to reduce risk and increase the resiliency of atoll islands’ communities around the globe,” said lead author Curt Storlazzi of the USGS.

This is exactly what Interior must now do—assess risk and prioritize urgent resilience investments. To do so Secretary Ryan Zinke and his team must acknowledge the overwhelming scientific evidence that climate change is real, it’s dangerous, and it’s human-caused.  As the head of the U.S. delegation at the Pacific Islands Forum, Zinke must respect the lives and livelihoods of Pacific Islanders by promising to address climate change and its consequences head-on.


California Gets one Step Closer to Zero-Emission Transit Buses

UCS Blog - The Equation (text only) -

Photo: Jimmy O'Dea

The California Air Resources Board (CARB) recently released a draft standard for transitioning the state’s transit buses to zero-emission battery or fuel cell technologies by 2040. This is great news for bus riders, bus drivers, local air quality, and tackling global warming emissions from the transportation sector.

The proposal is the result of more than three years of stakeholder engagement and public comment. In the process, CARB has generated a wealth of knowledge, including a sophisticated total cost of ownership analysis, a charging cost calculator, and a thorough understanding of the on-the-ground challenges to deploying a new technology on a large scale.

As a key step in the official regulatory process, the standard will be discussed and public comment heard at the September 27-28 CARB Board Meeting. A final vote will occur at a subsequent Board Meeting (date to be determined).

What’s being proposed?


For large transit agencies (100 or more buses), 25 percent of bus purchases must be battery or fuel cell electric vehicles beginning in 2023. This increases to 50 percent in 2026 and 100 percent in 2029.

For small agencies, the proposed purchase standard doesn’t begin until 2026 (at 25 percent) and increases to 100 percent in 2029. Thirty of the state’s 214 transit agencies fall into the definition of a “large” agency and represent 75 percent of buses in the state.

When CARB began hosting workshops in 2015, the purchase standard was scheduled to take effect in 2018. So, the current proposal represents a five-year delay from CARB’s original plan.

To encourage early adoption, the 2023 purchase standard will be waived if 1,000 zero-emission buses have been purchased across the state by the end of 2020. If an additional 150 zero-emission buses are purchased by the end of 2021, the purchase standard will remain waived until 2025.*

With more than 130 zero-emission transit buses already operating in California, several hundred more on order, and significant amounts of incentive funding allocated for buses, transit agencies are already on track to exceed the early-adoption thresholds.

Finally, the standard also requires agencies to develop and submit plans to CARB for how they will reach a 100 percent zero-emission fleet by 2040. These plans will be critical to transit agencies’ successful incorporation of zero-emission vehicles in their fleets.

Which buses are included in the standard?

“Buses” in the context of this standard include standard 30- to 40-foot buses, shuttle buses (cutaway buses), articulated buses, coach buses, and double-decker buses operated by transit agencies. There are 14,600 transit buses falling under this definition in California. For reference, the city of Shenzhen in China (population of 12 million, compared to California’s 40 million) already has 16,000 electric buses on the road.

The chart below shows a breakdown of California’s transit bus population by type (not shown are double-decker buses, of which there were only six in the most recent survey).

The standard’s percentages apply to purchases, not the total makeup of a fleet

Given that transit buses are typically on the road for 14 years, fleet turnover is roughly 7 percent each year. So, a 25 percent purchase standard in 2023 works out to roughly 2 percent of the total buses on the road across all agencies.
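The turnover arithmetic can be checked directly (assuming, as the text does, a uniform 14-year replacement cycle across agencies):

```python
# Rough fleet-turnover arithmetic from the text above.
service_life_years = 14
annual_turnover = 1 / service_life_years   # ~7% of the fleet replaced yearly
zev_purchase_share = 0.25                  # the 2023 purchase standard

# 25% of ~7% annual purchases is roughly 2% of all buses on the road.
zev_share_of_fleet = zev_purchase_share * annual_turnover
assert 0.070 < annual_turnover < 0.072
assert 0.017 < zev_share_of_fleet < 0.019
```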


Looking at bus purchases statewide over the last five years, the 25 percent purchase standard in 2023 corresponds to about 150 zero-emission buses. The 50 percent purchase standard in 2026 corresponds to about 550 zero-emission buses.**

Individual transit agencies don’t necessarily turn over 7 percent of their fleets every year; instead, they make larger purchases every few years, as shown in these two charts. Transit agencies’ differing purchase schedules point to the need for individual rollout plans in addition to purchase standards.

For a large agency like San Diego MTS, the 25 percent purchase standard corresponds to about 12 buses based on MTS’ purchase history. For a small agency like Sonoma County Transit, a 25 percent purchase standard corresponds to about 2 buses.

The chart above shows bus population by age in California (zero years old corresponds to 2016). More than half the buses on the road are from 2009 or earlier, which has significant implications for air quality as these vehicles were not subject to the latest engine standards. A combustion bus from before 2010 can have up to 30 times higher NOx tailpipe emissions compared to its newer combustion counterpart.

Three ways CARB can improve the proposed standard

1. The standard should clearly state that all buses must be zero-emission by 2040. Since CARB began workshops in May 2015, the goal of this standard has been achieving a full transition to zero-emission buses by 2040, yet the actual language of the standard doesn’t explicitly say this. In fact, it could be several years past 2040 when the full transition is achieved based on how the standard is currently written.

The rule’s proposed standard of 100 percent zero-emission buses purchases beginning in 2029 would guarantee a transition by the end of 2040 only for buses on the road for 12 years. But many buses in California are on the road for 14 years or longer and there is up to a two-year lag between when a bus is purchased and when it hits the road, so a 2029 purchase standard would likely not achieve the goal of all zero-emission buses by 2040. Anything past 2040 ignores the state of technology and how quickly other jurisdictions are making this transition, namely in China.
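A back-of-the-envelope check of that timeline, using the figures cited above (a 14-year service life and up to a two-year lag between purchase and deployment):

```python
# Last year a combustion bus could still be purchased under the proposal.
last_combustion_purchase = 2028   # 100% zero-emission purchases begin in 2029
deployment_lag_years = 2          # purchase-to-road lag cited in the text
service_life_years = 14

retirement_year = (last_combustion_purchase
                   + deployment_lag_years
                   + service_life_years)
# A combustion bus bought in 2028 could still be running in 2044,
# well past the 2040 goal.
assert retirement_year > 2040
```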

2. The standard should apply to shuttle, articulated, coach, and double-decker buses sooner. Under the proposed rule, these buses are not subject to the purchase standard for eight years despite comprising one-third of transit buses.

Waiting until 2026 would miss an opportunity to reduce emissions from these buses. Several models of these buses are on the road today and becoming increasingly available across manufacturers. We recommend these buses fall under the purchase standard two years after at least two models of a given type of bus have completed testing by the Federal Transit Administration.

If you haven’t been following the electric bus industry, there are currently 14 companies that make over 30 different models of buses ranging from standard transit buses to shuttle buses, coach buses, double-decker buses, and long, articulated buses.***

3. Small transit agencies should submit transition plans by 2021 to take advantage of current incentive funding. Under the draft plan, transit agencies with fewer than 100 buses have until 2023 to submit plans for transitioning their fleets to zero-emission buses by 2040. If these transit agencies wait five years to come up with a plan, they could miss out on the significant amount of incentive funding currently available across the state for buses as well as electric vehicle charging infrastructure. And because of the gaps between agencies’ purchases, a delay in planning could result in a several-year delay in deploying zero-emission buses.

Why a standard is needed

In the three years CARB’s standard has been under development, there has been a significant increase in the number of transit agencies deploying zero-emission buses. Twelve agencies (see below) have made voluntary commitments to 100 percent zero-emission fleets. These agencies represent both small and large fleets and operate 37 percent of the state’s total buses.

Antelope Valley Transit Authority is working to transition its 85-bus fleet by the end of this year. LA Metro, which operates the second largest bus fleet in the country, has committed to transitioning its fleet by 2030, a full 10 years ahead of what the state standard will achieve.

Even with the leadership shown by these agencies, a statewide standard is critical to realizing the benefits of zero-emission buses across the state. AC Transit, in its plan to roll out 144 electric buses by 2032, directly references CARB’s proposed standard as a motivating factor in creating the agency’s plan.

If you look at the actual language of the proposed standard, you’ll notice it is a revision to an existing standard, first adopted 18 years ago. California’s early demonstration of zero-emission bus technology, such as fuel cell buses operated at Sunline Transit and AC Transit, can be traced to the original standard.

The proposed standard is a reasonable next step. It is achievable, and without it zero-emission buses would see slow deployment. The technology is here, and the public health and climate benefits are significant. The thoughtful conversations have taken place and the detailed analyses have been done. The standard should be approved, and California should continue to show that we are a state that embraces solutions to air pollution and global warming.

* CARB’s draft standard also awards credits to agencies with zero-emission buses already on the road that can be used to offset future purchase requirements. Current credits correspond to roughly 150 buses.

** The purchase estimate in 2023 is based only on standard bus purchases made by large transit agencies. The purchase estimate in 2026 includes purchases made by small agencies and accounts for shuttle, articulated, coach, and double-decker buses.

*** The list of bus manufacturers and models includes those available for purchase if not already on the road.

Check out our Got Science? podcast for more on transit buses, the people’s electric vehicle:

Electric Vehicle Sales Are Taking Off in 2018

UCS Blog - The Equation (text only) -

New models will help continue the growth in EV sales, like the longer-range battery electric Jaguar I-PACE SUV that is scheduled to arrive for sale later this year.

The sales numbers are in for the first half of 2018, and more new car buyers than ever are choosing an electric vehicle (EV). Through June, over 123,000 new EVs were registered in the US, compared to 91,000 in the first half of 2017, an impressive increase of 35 percent. And it’s more than double the sales from just three years ago.
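The year-over-year growth figure quoted above is easy to check:

```python
# Year-over-year growth in first-half US EV registrations.
h1_2017 = 91_000
h1_2018 = 123_000

growth = (h1_2018 - h1_2017) / h1_2017
assert 0.34 < growth < 0.36   # roughly a 35 percent increase
```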

What’s driving increasing EV sales?

Much of the increase in sales was due to new models becoming available. First, let’s look at battery electric vehicles (BEVs, all-electric vehicles without a gasoline engine): Tesla led the way with the more affordable Model 3, notching over 22,000 registrations for the first half of the year. That’s more than the battery electric vehicle sales of all traditional automakers combined.

New models also helped grow sales of plug-in hybrid electric vehicles (PHEVs). Unlike the battery electric market, no single manufacturer dominates PHEV sales; instead, the addition of new models like the Honda Clarity and Chrysler Pacifica has helped boost them. Meanwhile, the introduction of Tesla’s Model 3 pushed battery electric vehicle sales to new highs in the second quarter of 2018, even as BEV sales from traditional automakers fell, giving Tesla the majority of the BEV market in 2018.

Honda in particular played a role in boosting overall sales by finally selling an EV in some quantity.

From 2010 through 2017, Honda had sold fewer than 4,500 EVs in the US, but it moved over 6,500 Honda Clarity plug-in hybrids in just the first six months of 2018. It’s a bit of encouraging news from an automaker that has previously been a laggard in selling EVs, though Honda still has a long way to go before claiming a leadership position in electric vehicles.

California still leads the way

While EV sales have increased across the US, California is still far ahead on EV sales. About half of all EVs are sold in California, a fraction that has stayed constant for the last 3 years. In total, over 6 percent of all new cars sold in the state in 2018 were EVs (plug-in or fuel cell powered). This is a significant and growing fraction of the new car market and would be even larger if all car brands had EVs for sale. About 8 percent of cars were EVs when looking only at brands that have an EV for sale.

EV sales were also boosted by Honda finally starting to sell a plug-in vehicle in volume. However, they are still far behind leaders like General Motors for cumulative EV sales in the US.

Some brands excelled at selling EVs in CA: over 16 percent of BMW-badged vehicles were plug-ins and over 10 percent of Chevrolets were EVs. On the other hand, Toyota and Honda, who sell the largest number of cars in California, had less than 4 percent EV sales (even less if you include their luxury brands Lexus and Acura, neither offering an EV).

Sales likely to accelerate with more models on the way


The next six months will bring important new competitors in the EV market, with several new long-range battery electrics slated to arrive. The new EVs include both luxury cars and more affordable models. Hyundai will launch its Kona battery electric with over 250 miles of range later this year. Jaguar, Porsche, and Audi will all debut luxury EVs to compete with the higher-end Tesla models. The number of plug-in hybrid models is also expected to grow, including the first EV offering from Subaru. If the past is any guide, adding more EV options (both more brands and more types of vehicle) will help grow the sales share of EVs, even if not all are big sellers.

Speed bumps ahead?

Some manufacturers are moving ahead with new EV models, while others seem to be squandering early leads in the move to electric vehicles. For example, Ford’s EV sales are down significantly this year, with the end of the C-MAX plug-in hybrid and no new electric products announced for this year or next. Nissan has also stumbled a bit. It delivered the first mass-market all-electric car from a traditional automaker with the LEAF in 2010, but sales of the LEAF have slowed, and Nissan hasn’t expanded its electric lineup beyond that one model. Lastly, Tesla is of course fully committed to EVs, but is embroiled in controversy regarding its CEO Elon Musk.

Beyond individual automaker efforts, there is another potential hindrance to the growth of EVs: the disconnect between state policies that are pushing EVs forward and federal efforts to roll back clean car standards and remove vital authority from California and the nine other states that have adopted the Zero Emission Vehicle regulations. It’s unlikely that lack of leadership at the federal level will stop EVs. Automakers realize that the transition to EVs is inevitable, and policies around the globe will continue to push in that direction. However, irresponsible decisions by the current administration could delay this transition here, harming both US drivers and the environment. The good news is that so far we are seeing continued development of EVs and growth in sales.

Data Source: IHS Markit

We Need Better Data about What Is Killing American Prisoners. It’s Probably the Heat.

UCS Blog - The Equation (text only) -

American Climate Prospectus

DC is in the middle of a swampy heat wave right now, with temperatures regularly exceeding 90°F. My peers and I can joke about getting drenched in sweat on the walk from the metro to school because we have an air-conditioned building to look forward to. Any heat-related discomfort is temporary for us. Prisoners in our country don’t have this luxury, and it may be killing more of them than we realize.

If you go to the Bureau of Justice Statistics (BJS) website, you can download datasets showing the reasons inmates died over the last few years. As part of my studies, I accessed this data and found a shocking lack of resolution.


Photo credit: Bureau of Justice Statistics

Every death that isn’t due to an inmate killing themselves or another inmate is written off as “natural causes.” Further exploration of the BJS site or Centers for Disease Control and Prevention sites yields a little more information about the burden of certain diseases like diabetes and heart disease, but overall there is not much public information or raw data about what is actually killing prisoners in America.

Recent studies into the effects of extreme heat exposure (which we can loosely define as constant exposure to heat exceeding 86°F, based on the National Oceanic and Atmospheric Administration heat index) suggest there might be a sizable burden of heat-related illnesses. To dive deeper into this issue, I started looking at Texas specifically, due in part to the excellent journalism of groups like the Texas Tribune and the Marshall Project. The writers on these teams have been tracking policy changes as well as the risk factors for susceptibility to heat, and they bring up some excellent points.
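For readers who want to explore the threshold mentioned above, NOAA's heat index can be approximated with the published Rothfusz regression. This is a sketch: the full National Weather Service algorithm applies additional adjustments at low humidity and near the lower boundary of the index's validity.

```python
def heat_index_f(temp_f, rel_humidity_pct):
    """Approximate the NOAA heat index (deg F) via the Rothfusz regression.

    Intended for conditions where the index exceeds roughly 80 deg F;
    outside that range the regression is a poor fit.
    """
    t, rh = temp_f, rel_humidity_pct
    return (-42.379 + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh - 6.83783e-3 * t * t
            - 5.481717e-2 * rh * rh + 1.22874e-3 * t * t * rh
            + 8.5282e-4 * t * rh * rh - 1.99e-6 * t * t * rh * rh)


# 90 deg F air at 70% relative humidity "feels like" roughly 105 deg F,
# well past the ~86 deg F extreme-heat threshold discussed above.
assert 103 < heat_index_f(90, 70) < 109
```

Even modest combinations of heat and humidity push the apparent temperature far above the air temperature, which is why indoor humidity data matters as much as thermometer readings.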

First, there is some evidence that prisons make the incarcerated “age faster,” which is to say they have the health issues commonly associated with populations a decade or more older than them. This includes dampening the nervous system’s ability to regulate heat, which decreases the body’s ability to combat effects of extreme heat exposure. This issue, and the fact that there is an increasing population of prisoners older than 50, means that prisoners may be more susceptible to heat exposure than the general population.

Second, about a third of American prisoners also experience mental health issues, and another sizable chunk experience chronic illnesses like diabetes and hypertension. The medications for these conditions include psychotropics which further dampen the nervous system response to heat, as well as diuretics and anticholinergics, which tamper with bodily functions like sweat and urination. As a result, many prisoners may not even be able to sweat properly, and retain urine to the point of danger for kidney disease and hypertension.

But of course, the most important part of heat-related illness is the heat itself. And in Texas, the state with one of the hottest summers in the United States, this is the major killer. Almost 75% of Texas prisons don’t have air-conditioned residential areas. This is unacceptable now, and is even more concerning when you look at climate projections for the next 80 years.

As you can see, Texas is hot, and it’s only going to get hotter. Prisoners in Texas, and the rest of the country, are already feeling the impacts of heat. Last summer, a particularly disturbing video was shared on Facebook, where prisoners’ screams for help could be heard from outside. They repeated “Help Us, Help Us, It’s Too Hot, We Can’t Breathe.” The viral nature of this video pushed the St. Louis prison to implement better air-conditioning, but that was just a small start to addressing the larger issue.

Anecdotal reports alone are rarely taken seriously by institutional bodies, and they cannot inspire political will on the level that is needed to protect prisoner health. This brings me back to data resolution. Researchers need access to more information about prisoner health in order to better understand this issue and make a compelling case for better heat management. For starters, we need more data on the actual temperatures and humidity levels inside our prisons. As part of my graduate coursework, I wrote a petition that calls for a public data collection schedule including urinalysis and blood work data along with temperature data.

You have the ability to send a similar petition to the governmental bodies that control what data is collected and made public. You can use this template to petition the Office of Justice Programs, or your state’s correctional body. Finally, as further reading, I encourage you to look at the 2015 report out of Columbia Law School that examines the challenges climate change will pose for correctional institutions in the coming years.

Anyun Chatterjee is finishing his master’s in environmental health at George Washington University this fall. He is a researcher with BRAIN, a Cleveland-based network of psychiatrists and mental health providers. During his time in DC he started the research group, based on the principles of informational equality and purpose-based research.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Vogtle and Hatch: Have Cost Over-Runs Undermined Safety Performance?

UCS Blog - All Things Nuclear (text only) -

In August 2018, Georgia Power announced that it had raised its estimate of the construction costs for its 45.7% share of the two new reactors being constructed at the Vogtle nuclear plant by $1.1 billion, from $7.3 billion to $8.4 billion. Assuming the company lacked warehouses stuffed with money, the cost over-run raised an important question: has the hemorrhaging budget for constructing Vogtle Units 3 and 4 taken funding or distracted management attention away from the company’s operating reactors—Vogtle Units 1 and 2 and Hatch Units 1 and 2—and undermined their nuclear safety performance?

If asked, Georgia Power would certainly say “nope.” But because the company cannot forecast the cost of building reactors within a billion dollars or so, its skill at forecasting the cost of operating reactors is questionable, at best. In other words, I didn’t ask Georgia Power.

Instead, I examined two data sets that provide more reliable insights on whether cost over-runs on Vogtle Units 3 and 4 have undermined safety performance of the company’s operating reactors. One data set was the quarterly performance ratings issued by the Nuclear Regulatory Commission (NRC) for every operating reactor in the country. The other data set was the reactor power levels reported each day by reactor owners to the NRC.

NRC Performance Ratings

In 2000, the NRC began assessing the performance of every operating reactor every quarter, using a combination of violations of regulatory requirements identified by NRC inspectors and about two dozen performance indicators. When performance meets expectations, the NRC’s findings (if any) are green and the performance indicators are green. The further performance drops below expectations, the more the colors shift from green to white to yellow to red.

Each quarter, the NRC uses the findings and indicators to place each operating reactor into one of five columns of its Action Matrix. When all expectations are met, reactors are placed in Column 1. As performance drops, reactors are moved into Columns 2, 3, 4, and 5. More than 80 percent of the time, the NRC has placed reactors in Column 1; performance warranting a move out of Column 1 does occur, but most reactors avoid it most of the time.

The NRC’s quarterly performance ratings between 2012 and the first half of 2018 for the operating reactors at Hatch and Vogtle are shown in Figure 1. Both the Hatch reactors remained in Column 1 the entire time. The two operating reactors at Vogtle dropped into Column 2 for a total of 8 of the 26 quarters. The good news is that Georgia Power was able to remedy the performance shortcomings to return the Vogtle reactors to Column 1. The bad news is that the Vogtle reactors are underperforming the U.S. nuclear fleet. The typical U.S. reactor received Column 1 performance ratings over 80 percent of the time. The Vogtle reactors were in Column 1 less than 70 percent of the time from 2012 onward.
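The comparison above boils down to a simple statistic: the fraction of quarters a reactor spent in Column 1 versus the fleet norm. A sketch of that calculation, using a hypothetical ratings sequence (the 18/26 split below is illustrative and is not the actual quarter-by-quarter NRC record for Vogtle):

```python
# Quarterly Action Matrix column placements, 2012 Q1 through 2018 Q2
# (26 quarters). Hypothetical example: 18 quarters in Column 1,
# 8 quarters in Column 2.
vogtle_example = [1] * 18 + [2] * 8

def column1_share(ratings):
    """Fraction of quarters spent in Action Matrix Column 1."""
    return ratings.count(1) / len(ratings)

share = column1_share(vogtle_example)
print(f"Column 1 share: {share:.0%}")  # ~69%, below the ~80% fleet norm
```

This is the sense in which the Vogtle units "under-performed the fleet, but not by a troubling extent": their Column 1 share sat roughly ten points below the typical reactor's.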

Fig. 1 (Source: Union of Concerned Scientists)

Daily Reactor Power Levels

Each day, plant owners report the power levels at which their reactors are operating. The NRC archives the reports and posts the daily reactor power levels over the past 365 days on its website. I used this data to plot the daily power levels reported for the Hatch Unit 1 and 2 reactors between 2014 and 2018 in Figure 2. The refueling outages conducted over this period are easy to spot—they are the wider white gaps preceded by a few days of gradually decreasing reactor power levels. Refueling outages commonly last three to four weeks. Figure 2 also shows a few other shorter outages and power reductions, especially on Unit 1.
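Spotting the outages described above in the NRC's daily power data can be automated. A minimal sketch (the data here is synthetic; the real series comes from the NRC's daily power reactor status reports):

```python
def find_outages(daily_power, min_days=3):
    """Find outages in a series of daily power levels (percent).

    An outage is a run of consecutive days at 0% power lasting at
    least min_days; refueling outages typically show up as runs of
    roughly 21-28 days. Returns (start_index, length) pairs.
    """
    outages, start = [], None
    for i, p in enumerate(daily_power):
        if p == 0 and start is None:
            start = i            # outage begins
        elif p > 0 and start is not None:
            if i - start >= min_days:
                outages.append((start, i - start))
            start = None         # outage ends
    if start is not None and len(daily_power) - start >= min_days:
        outages.append((start, len(daily_power) - start))
    return outages

# Synthetic year fragment: full power, a 25-day refueling outage,
# full power again, then a 2-day dip too short to count as an outage.
power = [100] * 10 + [0] * 25 + [100] * 30 + [0] * 2 + [100] * 5
print(find_outages(power))  # [(10, 25)]
```

Run lengths in the output distinguish routine refueling outages from the unplanned shutdowns and power reductions that signal under-performance.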

Fig. 2 (Source: Union of Concerned Scientists)

Figure 3 shows the daily power levels for the Vogtle Unit 1 and 2 reactors between 2014 and 2018. Again, refueling outages, non-refueling outages, and power reductions are evident in the plots.

Fig. 3 (Source: Union of Concerned Scientists)

The plots of daily reactor power levels may appear about as insightful as the squiggles and blips on an EKG screen. To help put the plots for the Hatch and Vogtle reactors in context, the daily power levels for the Pilgrim reactor over the same time period are plotted in Figure 4. During most of this time, Pilgrim resided in Column 4. No reactor in the United States received lower performance ratings from the NRC during this period than Pilgrim.

Fig. 4 (Source: Union of Concerned Scientists)

What’s the difference between good performing reactors and Pilgrim? Pilgrim has fewer of the big blue rectangular blocks that indicate sustained operation at full power. Ideally, a reactor runs at 100 percent power from refueling outage to refueling outage, with only short-duration power reductions each quarter for testing. The more the solid blue rectangles between refueling outages are splintered by unplanned shutdowns and unwanted power reductions, the further a reactor is from that ideal.

UCS Perspective

The NRC’s quarterly performance ratings suggest the financial and management resources poured into the cost over-runs on Vogtle Units 3 and 4 have not undermined safety performance at Hatch Units 1 and 2.

The NRC’s quarterly performance ratings for Vogtle Units 1 and 2 paint a slightly different picture. Whereas the average U.S. reactor received Column 1 ratings from the NRC over 80 percent of the time, the Vogtle reactors got Column 1 ratings less than 70 percent of the time in recent years. But this situation is tempered by both reactors currently receiving Column 1 ratings. The Vogtle reactors under-performed the U.S. fleet, but not by a troubling extent.

The daily reactor power levels for the Hatch and Vogtle reactors also suggest that performance has not been appreciably undermined. The data do not suggest that the Hatch and Vogtle reactors have the performance shortcomings reflected by the daily reactor power levels for the Pilgrim reactor—the worst performing reactor per the NRC’s ratings—over the same period.

The NRC’s quarterly performance ratings are the public’s safety net. Insufficient budgets, inadequate management attention, aging equipment, and other causes can lead to lowered performance ratings. Lower performance ratings increase NRC oversight. The early detection and correction of performance shortcomings prevents problems from growing to epidemic proportions that invite disaster.

Unfortunately, the NRC is contemplating changes to its quarterly performance ratings and mandated responses that could cut holes in the public’s safety net. As nuclear plants age and their maintenance budgets shrink, the NRC needs to strengthen rather than weaken the most reliable tool it uses to protect public health and safety—timely, reliable and accurate performance ratings.

Why the Farm Bill Should Invest in Agroecology Research: An Interview with Dr. Selena Ahmed

UCS Blog - The Equation (text only) -

Recently, the 2018 farm bill—the massive federal legislative package that shapes our country’s food and agriculture system—cleared a major hurdle, as both the House and Senate voted to begin negotiations toward a compromise bill. This process is important for many reasons, including how it will impact the US Department of Agriculture’s $3 billion annual investment in research to help the nation’s farmers and eaters alike.

In case your Schoolhouse Rock memories are fuzzy, a quick civics lesson: A bill becomes a law after it is passed by both the House and the Senate and signed by the president. When the House and Senate versions of a bill are not the same, a conference committee—made up of negotiators from both chambers—must meet to hash out their differences. The resulting bill then returns to both chambers of Congress for a final vote before heading to the president’s desk for signature. In the case of the farm bill, negotiators have some serious work to do to bridge the yawning gap between major components of the two bills, including nutrition and conservation provisions. Yet while these differences have generated bigger headlines, agriculture research is quietly one of the most important parts of the farm bill.

That’s because the USDA’s investment in science-based research—again, nearly $3 billion every year—helps to keep farmers and ranchers viable and profitable amidst a whole host of challenges, from changing patterns of pests to extreme weather to the economic uncertainty created by the president’s volatile trade policy. Publicly-funded agricultural research is crucial to advancing the sort of farming systems that can benefit both growers and the public, for example by improving soil health, diversifying our food supply, and reducing water pollution while maintaining farmers’ profits. Research suggests that a field of science known as agroecology can be particularly effective at uncovering such solutions. Yet our investment in public agricultural research overall has been declining—both in comparison to funding from private industry and to other global powers like China—and investment in agroecology research is particularly insufficient.

Since the farm bill is the major legislative vehicle for supporting public agricultural research—and for transforming our food system more broadly—the agricultural research community has been outspoken in demanding a substantive increase in research funding. Last October, more than 60 organizations, including UCS, called on Congress to double total USDA food and agricultural research, education, and extension funding by the time the next farm bill comes up for reauthorization in 2023. And earlier this summer, a group of researchers from across the country traveled to Washington, D.C. to make a case for agroecology and interdisciplinary food systems science.

One of those researchers was Selena Ahmed, Assistant Professor of Sustainable Food and Bioenergy Systems at Montana State University. While in D.C., Dr. Ahmed met with seven senators and representatives who serve on the House and Senate Agriculture Committees, which are responsible for drafting the next farm bill and funding USDA research programs. I recently caught up with Dr. Ahmed to hear more about why the U.S. should invest in interdisciplinary, systems-based food and farm research, and how her message was received by Congress.


What are some of the objectives of your research program?

The overall goal of the research program that I lead through the Food and Health Lab at Montana State University is to strengthen sustainability and design innovations in the food system towards supporting local, national, and global food security for all. In my research program, I approach food security as equitable access to healthy, affordable, and desirable food that strengthens the capacity of individuals and communities to serve the challenges and needs of our nation and our world. My research has three key objectives. First, on the agriculture side of food systems, is to identify and design innovations that strengthen the resilience of farms and farmers to support environmental and human wellbeing. Second, on the consumption side of food systems, is to identify and design innovations that enhance access to high-quality and affordable food for healthy communities. Lastly, as a faculty member of Sustainable Food Systems at one of our nation’s land-grant institutions, I seek to build the capacity of future food system leaders to effectively address complex food system challenges towards supporting long-term local, national, and global food security.

How does your research and outreach work impact communities—in Montana, and across the country (and world)?

It is my hope that my work impacts communities in Montana, nationally, and globally through generating evidence to identify food system innovations, developing plans, and informing policies that support food security as well as environmental and human wellbeing. I lead and collaborate on multiple federally-funded projects as part of achieving this goal. This work is providing research evidence towards developing plans and policies to support farms, farmers, and communities.

For example, I have two funded projects through the National Science Foundation that are examining the effects of environmental and management factors on crop quality and farmer livelihoods as well as identifying agricultural innovations to build the resilience of farms and farmers to climate and market risk. This work is being conducted locally to support farmers and communities in Montana and regionally in the Upper Missouri River Basin, as well as in countries globally where many of our food supply chains start and those countries that have agricultural innovations that we can learn from in the United States. This research has generated evidence on multiple agroecological innovations that can be applied to reduce vulnerability to droughts and extreme weather events in order to more effectively feed our communities. These innovations include multiple agricultural solutions including diversified agriculture such as agroforestry, precision agriculture, tree planting, and management of soil organic matter and soil carbon sequestration through organic agriculture, manure management, mulching, and cover crops. I have generated data that agricultural diversification at the landscape, species, and genetic levels not only supports the environment, it can also result in crops with higher quality based on phytochemical and sensory profiles that are associated with higher price premiums and livelihoods for farmers as well as higher health attributes for human consumers.

On the consumption side of the food system, I have been engaging in a series of community-based projects in rural and tribal communities of Montana to generate research evidence to identity and design food system innovations that can enhance access to high-quality food. The goal of this work is to mitigate food insecurity, diet-related chronic disease, and health disparities through projects with community partners on the Flathead Reservation of the Confederated Salish and Kootenai Tribes. Through a funding mechanism of the National Institutes of Health, our team has been taking a food systems approach to strengthen access of participants of nutrition assistance programs such as the Food Distribution Program on Indian Reservations (also known as the Commodities Program) to fresh, healthy, local food. We see this food systems approach being ‘win-win’ for local communities by enhancing food security and human health while supporting local farms and strengthening the local economy.

Why were you motivated to come to DC to talk about agroecology research?

I thought it was a critical time to visit DC to discuss agroecology and food systems research with Congress in session working on the Farm Bill. I wanted to provide whatever input I could while also wanting to learn more about food policy in the United States and the varied perspectives of the Congressional offices regarding the Farm Bill. As a scientist, I believe it is our duty to share findings of our research to a broad audience including to policy makers in order for our research findings to be operationalizable and have far-reaching impacts. I do what I do to positively transition food systems to sustainability and a critical part of this work is engagement with community partners, industry stakeholders, policy makers, and advocacy groups.  I am extremely grateful for the federal funding that has supported my research program and I was honored for the opportunity to share the relevance of this work with Congressional offices.

What was your experience talking with Congressional offices? 

I very much benefited from the opportunity to share the relevance of my research program with Congressional offices while learning more about the varied priorities of these offices in respect to the Farm Bill. Overall, I found the members of the Congressional offices receptive in hearing about agroecology and food systems research as well as its relevance for communities and the nation. Multiple representatives from these offices noted that they appreciated learning about the work that is supported by federal funds. I also found the experience valuable in better understanding policy in the United States. Some of the Congressional representatives noted some of the policy opportunities and challenges in response to the recommendations offered based on our experiences. These perspectives were extremely insightful and I found this experience to be something I would like to continue to be involved in.

Anything else you’d like to add?

Ensuring food security is critical for a strong nation and a healthy planet. An agroecology and local food systems approach is crucial for ensuring food security while strengthening local economies and their capacity to serve the challenges and needs of our nation and our world. There are key attributes of specific federal research funding mechanisms that I believe result in the most successful outcomes for communities and the nation. One key attribute of funding mechanisms is their long-term nature. Agricultural and environmental processes are long-term and I have found that grant mechanisms that are also relatively long-term (4-5 years in length) allow for greater monitoring of long-term processes as well as greater impact. In addition, successful community-based work is dependent on developing relationships that can also be a long-term process; thus, grant mechanisms that are relatively long-term also allow for greater development of relationships with communities towards greater positive impact. Lastly, I wanted to highlight the importance of interdisciplinary and international research. Solutions for food system challenges we face in the United States may be found in the agricultural fields and communities of other nations. It is my hope such Congressional visits can serve to increase federal funding for agroecology and food systems research.


As luck would have it, Dr. Ahmed’s meetings with Congress were perfectly timed to make an impact. Just as she was heading back to Montana, the Senate Agriculture Committee released its draft farm bill, which was eventually approved by the full Senate on June 28. The Senate version of the bill makes important strides to protect and ramp up investment in agricultural research that supports agroecology. The Organic Agriculture Research and Extension Initiative got a boost from $20 million to $50 million annually. A matching requirement for federal funding was eliminated, leveling the playing field for research institutions with fewer financial resources—including historically black and tribal colleges and universities. And the USDA’s Office of the Chief Scientist (which plays an influential role, as evidenced by last year’s fight to defeat the nomination of Sam Clovis) was empowered with increased funding to improve staffing, increase coordination between federal research agencies, and expand oversight and scientific integrity.

The Senate bill was a welcome contrast to the House bill, passed earlier, which does not substantively improve the landscape for public agriculture research.

As negotiators in the House and Senate reconcile their two opposing bills, researchers like Dr. Ahmed will have to keep up the pressure, urging their members of Congress to adopt the Senate’s research title to prioritize strong investment in food and agriculture research. In the meantime, you can add your name to the petition calling on Congress to prioritize proven, science-based policies and programs in the farm bill. It’s what the agricultural research community wants, and it’s what we all need to build a world-class food system that makes affordable, healthy, and sustainably grown food available to everyone.

Climate-Safe Infrastructure for All: California Working Group Report Provides Comprehensive Recommendations

UCS Blog - The Equation (text only) -

Nearly two years ago, the Climate-Safe Infrastructure bill (AB 2800, Quirk, 2016) became law and established the Climate-Safe Infrastructure Working Group (CSIWG) to develop recommendations to the California legislature on how to design and build our infrastructure to be safer for Californians in the face of growing climate extremes. Since then, unprecedented wildfires and mudslides and record-breaking temperatures and precipitation have added an exclamation point to the importance of this group’s work in preparing our infrastructure to keep us safe: we have now experienced the risks and seen what’s at stake. Today, the CSIWG released its report, Paying it Forward: The Path Toward Climate-Safe Infrastructure in California, which recommends an ambitious and attainable path forward.

The problem

Traditionally, engineers designed infrastructure assuming that past climate trends were good predictors of the future. With climate change, this is no longer a reasonable assumption, and is even a downright dangerous one, as recent infrastructure failures have shown us. Yet many engineering codes, standards, and practices still plan for the future by looking at the past.

Engineers and planners also face the additional challenge of designing long-lived infrastructure for a wide range of potential futures and addressing inherent uncertainties in climate projections. At the same time, the state is deciding over the next few years how to spend more than $60 billion of taxpayer dollars on infrastructure – infrastructure that we need to perform well for many decades. A recent study found that for every $1 invested in preparing for natural disasters, society can save $6. We need to get this right, and soon.

UCS sponsored AB 2800 to address this disconnect between what climate change projections tell us about the future and how they are inadequately integrated into state infrastructure plans and design, engineering, investment, and construction decisions.

The Climate-Safe Path for All


Fourteen prominent engineers, scientists, and architects came together for a first-of-its-kind, multisectoral, state-level initiative focused on how to design state infrastructure for climate change and address critical needs and barriers to doing so. They define climate-safe infrastructure as sustainable, adaptable, equitable, and low- (or no-) carbon infrastructure that’s resilient when faced with climate-related stressors and shocks, so that it can keep Californians safe.

Science is key to climate-safe infrastructure decisions, but alone it’s not enough. Their insightful recommendations therefore touch upon other key steps in the infrastructure lifecycle while also providing a list of information needs and climate-sensitive codes. One of the far-reaching recommendations is adoption by the Legislature of “The Climate-Safe Path for All” as official state policy.

(This mammoth report – at 150+ pages – is wide-ranging and very detailed. In future blogs, I will analyze other findings and recommendations in greater depth.)

Building safer

The “climate-safe” portion of the Climate-Safe Path for All refers to a two-pronged approach to protecting our infrastructure, and the communities and economy that depend on it. State agencies would continue pursuing aggressive reductions in global warming pollution, in order to have the best chance of avoiding some of the worst impacts of climate change. At the same time, they would plan and eventually build long-lasting infrastructure (like bridges, roads, dams, etc.) to withstand the climate impacts from a high-emissions, “business as usual” pathway.

The Climate-Safe Path for All consists of global warming emissions reductions in line with the Paris Agreement, and adapting long-lived infrastructure for climate impacts under a business-as-usual scenario. The Working Group considers it the safest approach given our current high-emissions trajectory and the long-lasting nature of global warming pollution. Social equity is another key component of the Path.

Rather than requiring everything be built today for the high-emissions scenario decades from now, the Working Group recommends taking an adaptive, phased approach over time. This would preserve flexibility to respond as new information becomes available while also helping save lives and money.

A very simple but illustrative example of this type of adaptive approach would be designing a waterfront roadway with future sea level rise in mind. The road would be built (or modified if already existing) to be protective over the near term and allow updates, like increasing its height, expanding nearby green space to absorb more water, and moving the road or identifying alternative transportation routes. Pre-determined triggers or thresholds set design updates and/or policy actions in motion (e.g., a specific amount of sea level rise or number of flooding events). And there would be plans in place to minimize disruptions that do occur and hasten recovery.
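The trigger-and-threshold logic in the roadway example can be made concrete with a short sketch. The thresholds and actions below are purely illustrative placeholders, not values from the CSIWG report:

```python
# Hypothetical adaptive-design triggers for a waterfront road.
# Each entry pairs an observed sea level rise threshold (feet)
# with the adaptation action it sets in motion.
TRIGGERS = [
    (0.5, "raise roadway elevation"),
    (1.0, "expand adjacent green space to absorb more water"),
    (2.0, "evaluate relocation or alternate transportation routes"),
]

def actions_due(observed_rise_ft):
    """Return the adaptation actions whose thresholds have been met."""
    return [action for threshold, action in TRIGGERS
            if observed_rise_ft >= threshold]

# With 1.2 ft of observed rise, the first two actions are triggered.
print(actions_due(1.2))
```

The design choice worth noting is that the decision rule is fixed in advance while the timing of each action floats with observed conditions; this is what preserves flexibility without leaving the response to an ad hoc future political moment.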

It’s a relatively new concept for infrastructure decisions that makes sense in theory, though I do have some questions about its application. How are triggers selected, and in an equitable way? How does this apply to less easily adapted, large-scale projects, like bridges or dams? Money must be set aside and policies put in place before these updates become necessary, so that they do not depend on a conducive economic and political environment at the moment changes are needed. In addition, the thresholds should be monitored and reviewed regularly so they are observed well in advance of disaster.

Building for all

In addition to the “climate-safe” part of the recommendation, the other important piece of the Working Group’s recommendation is the priority focus on how to deliver “for all.” The report rightly acknowledges that there is no overarching set of criteria for informing and prioritizing infrastructure decisions that’s consistent across sectors, and that’s a problem. How do we spend limited public funds wisely and fairly without such common criteria?

The Working Group recommends reducing social inequities as one of three key criteria for prioritizing infrastructure projects.

The Working Group proposes prioritizing projects that “most reduce inequality and increase opportunity,” in addition to also addressing the greatest climate risks and biggest infrastructure investment needs. Many low-income communities and communities of color face both a “climate gap” and an “infrastructure gap.” This reality must change in order for our infrastructure to truly benefit and protect all Californians. It will be a significant undertaking, but hugely important to get right as well. The Working Group’s recommendations also highlight specific ways to address equity throughout the infrastructure decision-making process, from planning to design to spending decisions.

Looking ahead

With the release of its report, the Climate-Safe Infrastructure Working Group has delivered strategic and sound recommendations at a time when repeated extreme weather events remind us of the need for action. We encourage the Legislature and Strategic Growth Council to strongly consider their recommendations and keep the momentum going. For example, an immediately helpful next step would be to prioritize the climate-sensitive codes and standards to begin updating now.

With billions of dollars slated to be spent on infrastructure over the next few years, we must seize this rare opportunity to spend it wisely and equitably. It is time to ensure that our infrastructure can truly keep all Californians safe in the face of more frequent and severe climate extremes.

CSIWG, Paying It Forward

