Combined UCS Blogs

How Many Rides Do Lyft and Uber Give Per Day? New Data Help Cities Plan for the Future

UCS Blog - The Equation (text only) -

In the span of about 7 years, app-based ride-hailing (i.e. Lyft and Uber) has gone from non-existent to ubiquitous in major metro areas. But how are these services affecting important aspects of our transportation system like congestion, public transit, and vehicle emissions?

The San Francisco County Transportation Authority (SFCTA) took a big first step last week toward answering these questions. The agency released data showing when, where, and how many rides start and end within San Francisco.

These statistics are important because passenger vehicles are the largest source of climate emissions in California, a major source of air pollution, and play a central role in our transportation system, which greatly affects social equity. If ride-hailing continues to grow, it has the potential to positively or negatively impact many aspects of transportation, including the reliability of public transit; costs of travel; extent of air pollution and climate change; safety of pedestrian and vehicular travel; and accessibility, type, and quality of jobs.

Lyft’s recent commitment to provide 1 billion miles of travel in autonomous electric vehicles powered by 100 percent renewable energy by 2025 is an encouraging step towards a positive future of app-based travel.

Some of the report’s findings are what you’d expect

Not surprisingly, the number of rides within San Francisco peaks in the heart of downtown on Friday and Saturday nights. During the week, ride requests are at their highest during the morning and evening commutes. More rides are requested after work than before work. Interestingly, more rides are also requested as the work week progresses. #fatigue?

SFCTA developed a website to visualize when and where rides are starting and ending in San Francisco. It’s pretty cool, especially if you’re familiar with the city.

Switching from pick-up to drop-off locations (see gifs) gives a rough sense of where people are traveling to and from, i.e., commuting into downtown in the morning and out of downtown in the evening. SFCTA’s data don’t link the pick-up and drop-off locations of individual rides, but the aggregate data still suggest these trends.

Other findings are less expected

The most surprising numbers from SFCTA’s report are the sheer volume of rides being given by Uber and Lyft: more than 150,000 intra-San Francisco trips per day, which is roughly 15 percent of all vehicle trips taken within the city and more than ten times the number of taxi trips.
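For a rough sense of what those shares imply, here is the back-of-the-envelope arithmetic using only the figures quoted above (a sketch for illustration; the SFCTA report is the authoritative source):

```python
# Back-of-the-envelope arithmetic implied by the SFCTA figures quoted above.
tnc_trips_per_day = 150_000        # reported intra-SF Uber/Lyft trips per day
tnc_share = 0.15                   # reported share of all vehicle trips
taxi_ratio = 10                    # TNC trips are more than 10x taxi trips

total_vehicle_trips = tnc_trips_per_day / tnc_share      # ~1,000,000 trips/day
taxi_trips_upper_bound = tnc_trips_per_day / taxi_ratio  # ~15,000 trips/day

print(f"Implied total vehicle trips within SF: ~{total_vehicle_trips:,.0f} per day")
print(f"Implied taxi trips (at most):          ~{taxi_trips_upper_bound:,.0f} per day")
```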

The SFCTA study only considered trips originating and ending within San Francisco. So, there are actually many more Uber and Lyft trips being taken to or from the city.

Another interesting finding: approximately 20 percent of the miles traveled by Uber and Lyft drivers in San Francisco are without a passenger. These out-of-service miles (also known as “deadheading”) are actually lower for Uber and Lyft than taxis, which drive 40 percent of their miles without a customer. More Ubers and Lyfts on the road compared to taxis mean less distance is traveled between drop-offs and pickups.

What’s the big deal?

If you asked, “Don’t Uber and Lyft already have this data?” you’d be right. They do. So does the California Public Utilities Commission (PUC), which oversees transportation network companies (TNCs) – the policy term given to Uber and Lyft.

But the TNCs and the PUC denied requests for the data, so SFCTA partnered with Northeastern University to measure it indirectly themselves. Uber and Lyft oppose sharing data that could reveal aspects of their market share, such as where they dispatch drivers and pick up riders. And because there are only two main ride-hailing companies, either company could simply subtract its own numbers from an aggregate data set to get a sense of what the other company is doing.
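To see why aggregate statistics offer little cover in a two-company market, consider this minimal sketch (all numbers are hypothetical):

```python
# Hypothetical illustration: with only two major ride-hailing companies, either
# one can estimate its competitor's activity by subtracting its own records
# from any published aggregate.
aggregate_pickups = {"downtown": 4_200, "mission": 1_800}  # published totals (hypothetical)
own_pickups       = {"downtown": 2_500, "mission": 1_100}  # one company's own data (hypothetical)

competitor_pickups = {zone: aggregate_pickups[zone] - own_pickups[zone]
                      for zone in aggregate_pickups}
print(competitor_pickups)  # {'downtown': 1700, 'mission': 700}
```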

The companies have a competitive history, but the need for this type of data will only increase as they provide larger fractions of vehicle trips, especially if projections materialize for ride-hailing with self-driving cars. Without data, it will be difficult to justify the potential safety, mobility, and emissions benefits (or consequences) of self-driving vehicles.

It’s fair to ask whether Uber and Lyft should share data that isn’t necessarily required of other fleets. A notable exception is the New York City Taxi and Limousine Commission, which approved standards earlier this year requiring TNCs to report trip information that taxis were already required to share.

Even simple metrics such as the types of vehicles in a fleet (electric, hybrid, conventional), as reported by taxis in San Francisco, are important pieces of information for local governments to address the climate and air quality aspects of transportation. As the saying goes, you can’t improve something that you don’t measure.

What’s next?

SFCTA’s findings raise many questions about what types of trips TNCs are replacing. Are they getting people out of personal cars or turning pedestrians into ride-hailers? Are they eroding public transportation or making it easier for people to get to the bus, MUNI, or BART? Are people taking solo rides or sharing trips via uberPOOL or Lyft Line?

Previous studies and those underway are attempting to answer these questions. But ultimately, data like those from SFCTA are critical for transportation planners and researchers to understand the impact of ride-hailing services today and how they can be used to improve, and not hinder, how we get around in the future. Decisions like expanding roads vs. setting aside land for public spaces or how to better serve a community with public transportation all depend on knowing when, where, and how many trips we’re taking, whether by foot, bike, car, bus, or train.

Note to the Department of Energy: The Grid Has Changed

UCS Blog - The Equation (text only) -

The electric grid is steadily evolving to incorporate growing levels of renewable energy, and it’s saving consumers money and maintaining reliability. However, an April memo from US Energy Secretary Rick Perry casts doubt on this progress and seeks to protect old plants from closing due to competition.

Grid reliability depends on continued investment, innovation, and modernized practices. New renewable solar and wind power plants, and new practices that replace the old, are meeting the public’s electricity needs. Delaying the retirement of old plants costs consumers money. And even as the grid changes, the industry’s vigilant attention to reliability has not.

When the DOE releases the study and policy recommendations requested by Secretary Perry, it should show that the US electric system has not been diminished by change. The study should show that reliability is defined across many criteria and time windows, and that a growing diversity of resources can provide reliability services. If done well, the report will reflect the industry practice of relying on a mix of generation, and acknowledge that no single metric describes the reliability of the power supply.

Renewable energy from wind and solar has grown to supply as much as 50% or more of electricity on particular days over large areas of the United States. Individual companies and cities have set and met goals of procuring 100% of their electricity from renewable energy. These early indications of change in the electric grid demonstrate that the technology, investment, and coordinated operations now in place can keep the grid functioning successfully on new terms.

The challenge for the Department of Energy is to keep up with the changes that make the grid more reliable, despite the decline in coal. Utility engineers have taken up the challenge to use existing tools, such as power system forecasting and coordinated economic scheduling of power plants, and modernize them.

In addition, rapid advances in renewable generator technology provide surprising capabilities. A recent report from California ISO staff to its board, describing the accuracy and speed with which solar farms can provide reliability services, was filled with positive exclamations. This is just the latest technical assessment to demonstrate that making more use of the services available from renewable energy for reliability improves both the economics and the reliability of the grid.

How this happened and how the energy system stays reliable

Dynamic innovation and capital investments push the evolution of energy sources, especially in generating electricity. Competition between coal and hydropower factored into the famous rivalry between inventors Thomas Edison and Nikola Tesla at the start of the electricity era (circa 1890). Today, market competition from lower-priced gas, efficiency, wind, and solar is driving the decline of coal in the US and the corresponding increased use of these supplies. Texas leads the US in wind installations, through a combination of competition and cost-effective infrastructure investment.

Growth of wind & solar in the US in recent years is greater than other technologies or fuels.

Lower prices for these energy sources, combined with technical innovations, ensure the continued reliable operation of the electric system through this growth and change. As falling costs from new competing sources of energy attract investors, the construction of new power plants using new technology has been dramatic: the majority of generating capacity added in the US in each of 2014, 2015, and 2016 used either wind or solar. See the chart to the right for the rise in renewable energy installations using wind and solar.

The organizations responsible for reliability are fully engaged in this evolution

Grid operators, also known as “power pools,” are responsible for reliably managing this growth across multi-state regions, with some seeing these changes faster than others. The graphs below show the growth of wind in the major electricity markets from 2008 (when nationwide wind power, or wind and solar combined, surpassed 20,000 MW) to 2016.

Independent system operators that run these markets, as well as plan and operate the grid, are informed of these changes, and of the retirements of old plants, through various study obligations that supplement fundamental reliability standards. The Midcontinent Independent System Operator (MISO) estimates over $350 million in annual savings from planning for wind. The benefits from $3.4 billion in transmission construction in the Southwest Power Pool (SPP) from 2012 to 2014 are $240 million per year, and are expected to exceed $10 billion over 40 years.

Power pools serve two-thirds of US consumers as system operators independent of the local utilities, fostering reliability, innovation and competition. The agreement to form the PJM power pool to save money and increase reliability was front page news in 1927. Energy needs for wartime aluminum production drove the formation of SPP eight days after the United States entered World War II. Today MISO estimates annual savings of $1.8 billion from reducing needed reserves.

Reliability oversight and the sharing of best practices come from national and continental-scale organizations. US reliability and interconnection standards are developed by the Federal Energy Regulatory Commission (FERC) and the North American Electric Reliability Corporation (NERC). FERC first adopted a requirement for wind contributions to grid reliability in 2003, setting a standard for “riding through” (i.e., staying connected during) disturbances that was more stringent than that applied to nuclear plants. Since then, NERC has provided a series of reports and recommendations to guide the industry in the safe and reliable integration of renewable energy. With this oversight on reliability, the growth of renewable energy has reached some impressive records.

Records for supply from renewable energy

Regional record for use of renewable energy in a single hour. Chart UCS.

Wind farms, and all renewable generation in California, are setting new records for serving regional electricity needs. Adaptations by grid operators over the years allow a steady increase in renewable energy on the grid. The numbers shown here illustrate how wind and solar technology, combined with the grid operators’ tools, are running the electric supply at times with 50% wind in the Great Plains and at 80% with the combination of renewable sources (including wind, solar, hydro, biopower and geothermal) in California.

 

These records are during hours when renewable production is high and demand is relatively low, but they provide experience for routine operations with ever-higher levels of renewables.

Grid practices keeping pace make these records possible

The industry continues to expand the innovations and tools for reliable operations with higher levels of renewable energy and lower levels of fossil fuel.

When wind farms and solar generation are distributed across large areas, the energy produced is both more predictable and steadier. This geographic diversity allows regional power pools to integrate the supply of renewable energy into the larger supply mix as weather patterns move across their region. The effect of pooling wind across a large area was noted by ERCOT’s official market monitor, who observed that wind production in June 2016 never fell below 3,500 MW.
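A quick statistical sketch illustrates the geographic-diversity effect: pooling output from many partly independent wind sites shrinks the relative variability of the combined supply. This is a toy simulation that assumes fully independent sites (real sites are partially correlated), not a model of any actual power pool:

```python
import random

# Toy simulation: pooling output from many independent wind sites reduces the
# *relative* variability of the combined supply.
random.seed(0)
HOURS, SITE_CAPACITY_MW = 1_000, 100.0  # hypothetical values

def site_output():
    # crude stand-in for one site's hourly output (0-80% of capacity)
    return SITE_CAPACITY_MW * random.uniform(0.0, 0.8)

def relative_spread(n_sites):
    totals = [sum(site_output() for _ in range(n_sites)) for _ in range(HOURS)]
    mean = sum(totals) / HOURS
    variance = sum((t - mean) ** 2 for t in totals) / HOURS
    return (variance ** 0.5) / mean  # standard deviation as a fraction of the mean

for n in (1, 10, 100):
    print(f"{n:3d} sites: relative variability ~ {relative_spread(n):.2f}")
# Variability falls roughly with the square root of the number of independent
# sites, so the aggregate is steadier and easier to forecast.
```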

Because they’re so large, power pools also create cost savings by reducing the need for generators held in reserve (used to balance supply and demand). Smoothing out these changes, adjusting for weather forecasts, and now incorporating centralized wind forecasts for the region are all best practices. The California ISO implemented wind forecasting in 2004. The regional grid operators of Texas, New York ISO (NYISO), and the Midcontinent ISO (MISO) implemented wind forecasting in 2008 and PJM did so in 2009.

State-of-the-art wind forecasting predicts the output of individual wind farms and allows grid operators to include wind farm operations in their day-ahead preparations and real-time generator dispatch systems. Grid operators continue to improve their technical forecast tools, including visualization of wind conditions to improve system operators’ situational awareness, and increasingly forecast system needs and capabilities based on expected wind output.

Once forecasting demonstrated significant cost savings and reliability benefits, grid operators and the wind industry adopted the best practice of expanding market system control of dispatch and implementing wind dispatch (i.e., wind farms respond economically to dispatch instructions). In 2009, NYISO was the first to include wind offer prices and dispatch in the market system. By 2011 the markets run by MISO and PJM included similar price-based wind integration. This innovation allows ISOs to determine the most cost-effective way to address reliability issues, ensuring better utilization of wind plant output while maintaining a secure, reliable system.
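The underlying mechanism is ordinary merit-order (least-cost) dispatch, with forecasted wind offered into the market at a low price. The sketch below illustrates the idea; it is a conceptual toy, not any ISO’s actual market software, and all names and numbers are hypothetical:

```python
# Conceptual sketch of merit-order dispatch with a wind offer. Each offer is
# (name, available MW, offer price in $/MWh); wind availability comes from a
# forecast, and its low offer price means it clears first.

def dispatch(offers, demand_mw):
    """Schedule the cheapest offers first until demand is met."""
    schedule, remaining = {}, demand_mw
    for name, available_mw, price in sorted(offers, key=lambda o: o[2]):
        take = min(available_mw, remaining)
        if take > 0:
            schedule[name] = take
            remaining -= take
    if remaining > 0:
        raise RuntimeError("insufficient offers to meet demand")
    return schedule

offers = [
    ("wind_forecast", 1_200, 5.0),   # forecasted wind, very low offer price
    ("coal",          1_500, 30.0),
    ("gas_cc",        2_000, 35.0),
    ("gas_peaker",      800, 80.0),
]

print(dispatch(offers, demand_mw=3_000))
# -> wind is scheduled in full; the rest of demand is met by the next-cheapest
#    units. A lower wind forecast simply shifts more of the schedule to them.
```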

In addition to new power plants and grid practices, other investment categories contribute to greater reliability and more use of renewable energy. Increased transmission allows greater sharing of energy resources within and among power pools. Utilities, independent developers, and wind farm companies continue the expansion of this most fundamental electricity infrastructure. Coordination of demand response, electric vehicle charging, and simple upgrades such as thermostats and efficient lighting reduce the stress on the grid, directly and immediately improving reliability.

The utility industry has great potential to improve this sort of interaction with consumers, as well as the game-changing possibilities of battery energy storage.

Methods need to continue to evolve and be adopted

The evolution of modern electric grids has reached the point where old coal plants are retiring and grid management proceeds without any coal generation. Both New England and Britain (old England) have reached this point. Continued progress with new technologies requires support for research, demonstration, and deployment. New methods for understanding requirements, and the solutions to meet them, come from open and honest dialogue.

That competition for the future cannot succeed when regulatory agencies champion a backward-looking approach. The recommendations anticipated shortly from the DOE should recognize the realities of changes already made in electric grid operations, as well as the capacity to make greater use of new technologies that have already been demonstrated.

Heat Waves and Wildfires Signal Warnings about Climate Change (and Budget Cuts)

UCS Blog - The Equation (text only) -

Southern California and the Southwest US are experiencing a significant heat wave this week. More than 29 million people in California alone are under an excessive heat warning or heat advisory.

If you live in areas affected by this heat wave, please follow health advisories to stay cool, stay hydrated, and stay safe. And watch for wildfire advisories while you’re at it.

Heat waves are dangerous

Extreme heat can cause heat exhaustion, heat stroke, or even death. Symptoms to watch for include dizziness, headaches, nausea, muscle cramps, and loss of consciousness. Be especially vigilant for children, the elderly, those with pre-existing health conditions, those who work or play outdoors, and your pets. (For more on how to stay safe in extreme heat, refer to guidance from the CDC.)

Unfortunately, climate change is increasing the frequency and severity of heat waves. According to the EPA:

“Nationwide, unusually hot summer days (highs) have become more common over the last few decades. The occurrence of unusually hot summer nights (lows) has increased at an even faster rate. This trend indicates less “cooling off” at night.”

Furthermore, heat waves that arrive earlier in the summer can have worse health impacts because people’s bodies have had less time to adjust to the warm weather. And the longer a heat wave lasts, the more severe the cumulative effects can be.

On the other side of the world, India has already experienced a serious early heat wave in April, and recent research shows that even a small increase in global average temperature (which is very likely with climate change) is projected to cause a huge increase in heat-related deaths there.

Hotter, drier conditions also raise risks of wildfires

The wildfire season in the Southwest is also underway. Many of the same areas experiencing this week’s heat wave—including parts of Arizona, New Mexico, and California—are also forecast to have an above-normal wildfire risk this month (see map).

That’s no coincidence: in many parts of the world hotter, drier conditions are also contributing to growing risks of wildfires.

Arizona currently has more than 12 active wildfires and the state has already seen dozens of fires this year. California has also seen a number of wildfires over the past month; officials warn that the risk continues to be high. Ironically, winter precipitation in these states has helped provide more fuel for fires, stimulating the growth of brush and other vegetation that is now drying out in the hot temperatures.

Halfway across the world, Portugal is experiencing terrible wildfires, in which more than 60 people tragically lost their lives this past weekend after getting trapped by raging fires. The country is, of course, focused on the emergency response and is in a state of mourning. Unfortunately, Portugal has been experiencing bad wildfire seasons year after year. Earlier this year, Chile also experienced devastating wildfires.

Drought and extreme heat are important contributing factors in all these cases, and faulty forest and land management policies are frequently implicated as well.

Managing the risks of wildfires

Wildfires are inevitably a consequence of several factors, including the weather, winds, and the condition of forests and underbrush, plus proximate causes such as lightning or human activities. Here in the US and in many parts of the world, climate change is making hotter, drier conditions more likely and worsening the risks of wildfire.

Development in wildfire-prone areas also exposes more people and property to harmful impacts. The smoke from wildfires can also harm the health of people living hundreds of miles away; recent research shows that the air pollution from wildfires is significantly higher than previously understood.

To manage wildfire risks and impacts, we will have to work on solutions on all these fronts.

Cutting the Forest Service budget is a bad idea

Given what we know about these growing wildfire risks and the need to take robust action to protect people and healthy forests, the Trump administration’s proposed cuts to the US Forest Service budget are a particularly bad idea. For instance, the president’s FY 2018 budget proposes to cut funding for forest health management by about $9 million relative to FY2017 (more specifically, relative to the FY 2017 annualized Continuing Resolution level), which would reduce the resources available to cope with disease and pest outbreaks that kill trees. The hazardous fuels management budget would take a hit of $20 million—meaning that there would be less money to manage or thin forests to reduce wildfire risks near where people live. The budget also proposes to cut funding for volunteer fire departments.

Last week Tom Tidwell, Chief of the USDA Forest Service, testified about the budget before the Senate Committee on Energy and Natural Resources. At the hearing, there was bipartisan push-back to these cuts. Senator Murkowski (R-Alaska) said:

“While some of the agency’s recommended budget cuts are worth considering, others, like the proposed cuts to recreation programs, are concerning. Some could impact critical forest management activities, like firefighting and hazardous fuels reduction. And some appear to contradict other proposals in the budget, so we will need to review all of these very carefully, as we work on our budget for the next fiscal year.”

And Senator Cantwell (D-WA) said:

“President Trump’s proposal reduces funding for fighting wildfires. This budget proposes a decrease of almost $300 million for fighting wildfires and another decrease of $50 million for preventing wildfires.”

A way forward on wildfire and climate policy?

Senators Murkowski and Cantwell have a long history of working together to find solutions for improving forest management and fixing wildfire budgeting.

I hope Congress will reject the harmful budget cuts proposed by the Trump administration, and step up and pass legislation to address these critical issues as soon as possible. People who live in wildfire-prone areas—whether in California, Arizona, Alaska, or Georgia—cannot afford further delays or back-sliding.

We also have to continue to work with the global community to limit the heat-trapping emissions that are driving climate change and worsening the risks of deadly heat waves and wildfires worldwide—despite the Trump administration’s stance on the Paris Climate Agreement.

 

A Priority List for Trump’s New FEMA Chief

UCS Blog - The Equation (text only) -

This evening the Senate is expected to confirm the new Federal Emergency Management Agency (FEMA) Administrator, Brock Long, who will serve as the principal advisor on emergency management under President Trump and John F. Kelly, the newly appointed Secretary of the Department of Homeland Security. In my previous blog, I mentioned 5 reasons why this was welcome news.

Once Mr. Long is in place, here’s a priority list to ensure he hits the ground running.

#1 Commit to Incorporating the Latest Climate Change and Future Conditions into All of FEMA’s Actions

Losses from natural disasters (hurricanes, wildfires, flooding, earthquakes, etc.) are on the rise due to growing populations and urban development in high-hazard areas, as well as climate change, which is increasing the frequency and intensity of extreme weather events. Thanks to NOAA, we know that in the first quarter of 2017 the U.S. had 5 weather and climate disasters that each caused over $1 billion in damages, while 2016 had 15 “billion-dollar” natural disaster events (each event either reached or exceeded $1 billion in losses) and was a record year for inland flooding disasters.

While President Trump has signaled to the world his disregard for science, climate change, and making our planet a cleaner and safer place by pulling out of the Paris Agreement on June 1, there is still a lot of work that our federal agencies can do to reduce the impacts of natural disasters, including those worsened by climate change.

  • At the national level, Mr. Long should implement the Technical Mapping Advisory Committee’s (TMAC) Future Conditions Risk Assessment and Modeling recommendations to FEMA on how to utilize and incorporate the best available climate science and methodology to assess possible future flood risk. Mr. Long must also defend FEMA’s budget to Congress to make this possible.
  • At the state level, Mr. Long ought to play a leadership role to ensure that states comply with FEMA’s policy to update their State Hazard Mitigation Plans (SHMPs) to consider future conditions, including the projected effects of climate change, in addition to substantially improving these plans. States must update their hazard mitigation plans every 5 years to be eligible to receive federal funding for pre-disaster mitigation (PDM). Recent reviews of the quality of these plans (here and here) find that most states need to substantially improve their plans, particularly land-locked states. FEMA is responsible for providing both “technical assistance and reviewing state activities, plans and programs to ensure mitigation commitments are fulfilled” and does so on an annual basis (for more information see the State Mitigation Plan Review Guide). Mr. Long should work with FEMA staff to encourage states to substantially improve their SHMPs, which will help to better coordinate state-level hazard mitigation actions and safeguard communities.
  • At the community level, Mr. Long can continue FEMA’s leadership on promoting community resilience through building on and expanding upon FEMA’s smart climate adaptation efforts like using flood-resilient design building codes, maintaining natural and beneficial functions of floodplains, investing in more resilient infrastructure, engaging in mitigation planning and strategies to increase community resilience, and including environmental analysis in the Benefit-Cost Analysis (BCA) Tool when considering mitigation activities.
#2 Defend FEMA’s budget

Jamestown, CO, October 9, 2013 – Deane Criswell, FEMA Deputy FCO, and Dan Alexander, FEMA Region 8 Deputy Regional Coordinator, look at a map showing the course of the flood waters in and around the Jamestown, CO area. FEMA is working with local, state, volunteer and other federal agencies to provide assistance to residents affected by flooding. Photo by Patsy Lynch/FEMA

Mr. Long should defend FEMA’s budget and champion the need for robust funding for FEMA’s PDM grant program as well as flood risk mapping. The Trump administration’s FY18 budget proposal would obliterate both programs. As my colleague said in her blog, these cuts would seriously undermine our nation’s ability to prepare for and recover from disasters, and put the safety of Americans at risk.

Whether Mr. Long will adequately defend the FEMA budget remains a concern.

During the June 7 Senate Homeland Security & Governmental Affairs Committee hearing, Senator McCaskill (MO) spoke to the impacts of the recent flooding on many counties in Missouri and then asked Mr. Long whether he “was concerned about the $600 million cut in FEMA’s budget?” Long replied that he supports the President’s budget. However, he also said he would work with FEMA to make sure the agency can meet its needs.

While Mr. Long’s remarks may leave the impression that he will have an open dialog about what those needs may be, families, communities, and a broad range of sectors will want a more definitive answer that he will defend FEMA’s budget, particularly the mapping and pre-disaster mitigation programs. Funds in the pre-disaster mitigation program can, for example, go toward helping communities implement buyout programs for houses in high-risk areas that have been repeatedly flooded, leaving the land as permanent open space and reducing future costs.

The Pre-disaster Mitigation (PDM) Grant Program, authorized under the Stafford Act, is vital to communities and to state and local governments, as it provides them with funding to implement measures before a disaster strikes instead of afterwards. A Congressional Budget Office (CBO) study found that the majority of grants and funds go toward flood-risk mitigation measures (vs. planning or other risks such as earthquakes, wind, etc.) and that investing in mitigation before a disaster event helps to reduce future losses. Under President Trump’s budget proposal, this program would be funded at $39,016,000, a 61% cut compared to the FY17 continuing resolution levels.

What is particularly disheartening about this budget cut is that the PDM grant program is already underfunded: more communities apply for funding than can be served under the allocated amounts. The Government Accountability Office (GAO) found that post-disaster assistance can be a reactionary and fragmented approach, and that from fiscal years 2011-2014 FEMA allocated a whopping $3.2 billion for post-disaster hazard mitigation under the Hazard Mitigation Grant Program (HMGP) and approximately $222 million for the PDM grant program. A more recent review of pre- and post-disaster funding by Kousky and Shabman finds that almost 90% of FEMA funding on flood risk reduction comes in the aftermath of a big flood, and recommends an increase in funding for pre-flood risk reduction program budgets.

Mr. Long should also make the case for investing in the flood hazard mapping and risk analysis program, which the Trump administration’s FY18 budget proposal zeros out. Accurate flood risk maps are vital for communities to assess their risks and take action to reduce them. For FY 2017, Congress appropriated $178 million for flood mapping, compared to the $400 million per year that was authorized in 2012 by the Biggert-Waters legislation. The Association of State Floodplain Managers (ASFPM), in their Flood Mapping for the Nation report, estimated the cost of remaining unmet mapping needs to range between $4.5 billion and $7.5 billion; at a spending level of $400 million per year, FEMA could update the maps in 10-11 years.

#3 Champion existing pre-disaster mitigation policies and robust coordination across federal agencies and state, local, and tribal governments: FEMA’s coordination role is a critical one.

Belle Harbor, N.Y., Aug. 17, 2015 – John Covell, NY Sandy Recovery Office Director (3rd from left), and NYC DOT Commissioner Polly Trottenberg (2nd from right), inspect the sand dune and baffle walls built to protect the neighborhood from future storms. They were there with local politicians and neighborhood groups for the groundbreaking ceremony of the FEMA-funded Street Reconstruction project that will address damage caused by Hurricane Sandy. K.C. Wilsey/FEMA

Mr. Long should champion commonsense pre-disaster mitigation strategies and collaborations that are already underway by:

  • Ensuring that the commonsense Federal Flood Risk Management Standard (FFRMS), which requires all federally funded infrastructure (such as hospitals, roads, and transit systems) in flood-prone areas to be constructed to better withstand the impacts of flooding, moves forward. A Pew poll found that a whopping 82% of Americans polled support such a requirement for the construction of new infrastructure, as well as for repairing and rebuilding structures damaged by flooding. This support shouldn’t really be such a surprise, as it aligns well with what states and local governments are already doing; here’s a list of participating communities by state.
  • Moving rulemaking on the Public Assistance (PA) Deductible swiftly forward to give states incentives to increase their investments in resilience to natural disasters and to reduce the burden on federal taxpayers after a natural disaster. The PA program provides funding for local, state, and tribal governments to help communities recover from major disasters. We provided extensive comments in support of, and to strengthen, FEMA’s proposal for such a deductible. The reason this policy is innovative is that it gives states the option to reduce their deductible through credits earned for qualifying statewide mitigation measures that increase resilience and ultimately lower the costs of future disasters.
  • Furthering FEMA’s progress, in concert with the federal interagency Mitigation Framework Leadership Group (MitFLG), on a National Mitigation Investment Strategy (NMIS). After Hurricane Sandy, the GAO in its 2015 report identified a need for a coordinated, federal government-wide investment strategy for resilience and mitigation that reduces the nation’s exposure to future losses from disasters. The NMIS is a great opportunity to help the federal family plan and justify budgets and resources that invest in mitigation measures before a disaster happens.
#4 Advocate for substantial NFIP reform to increase climate resilience and safeguard communities

This is a busy time for Congress, as each side of the aisle is working on how best to reauthorize the National Flood Insurance Program (NFIP) before its expiration in September 2017. Congressional leaders are working in both the House (see the House Financial Services Committee memorandum, which includes 7 different draft bills and would reauthorize the NFIP for 5 years) and the Senate (see the “SAFE NFIP Act” of 2017, which would reauthorize the NFIP for 6 years, and the Cassidy-Gillibrand FIAS Act of 2017, which would reauthorize the NFIP for 10 years).

Previous NFIP reforms occurred in 2014, 2012, and 2004 and yet many fixes and improvements are very much needed.

Mr. Long must play a leadership role in helping Congressional leaders draft an NFIP reauthorization bill that is substantially improved to increase climate resilience and safeguard communities.

Time to hit the ground running…

We are hopeful that Mr. Long will commit to strong leadership on these funding and policy initiatives and that he will demonstrate results. As hurricane season is in full swing and the costs of more frequent and intense natural disaster events grow, we look forward to working with Mr. Long to move this priority list forward, checking each of the “to do” boxes and ultimately creating a more climate-ready, resilient nation.

 


Some Tough Questions for Rick Perry at DOE Budget Hearing

UCS Blog - The Equation (text only) -

Department of Energy (DOE) Secretary Rick Perry is set to appear before the House Energy and Water Appropriations Subcommittee on Tuesday to talk about the administration’s fiscal year 2018 budget request.

Once the hearing begins, perhaps Chairman Simpson (R-ID) should start by asking the secretary whether, given the extremity of the administration’s proposed cuts, he now feels he has accomplished his mission of eliminating the DOE. While that exchange promises a few laughs, the secretary will also hear some harsh words from members of both parties for whom the administration’s DOE budget proposal is a non-starter.

A deeper look into the Trump DOE budget reveals a war on renewable energy, cuts to our national labs, and reduced capacity for federal R&D, science, and innovation.

Here’s what the appropriators should focus on…

The administration is going after clean energy

The administration is trying to gut our nation’s federal work on renewable energy, energy efficiency, and sustainable transportation, proposing almost a 70% cut to the Office of Energy Efficiency and Renewable Energy (EERE). They also propose to cut between 64% and 80% of the clean energy infrastructure work at the Office of Electricity Delivery and Energy Reliability (OE). They propose to eliminate ARPA-E, our nation’s early-stage clean energy research and development program. And they propose eliminating DOE’s clean energy loan guarantee program as well.

This DOE budget essentially pluses up the National Nuclear Security Administration (NNSA), takes a scalpel to the Office of Science, and takes a machete to DOE energy programs, most specifically clean energy programs.

The cuts to EERE in particular were so extreme that all seven former assistant secretaries of EERE (from 1989 to 2017) sent a letter to appropriators and Secretary Perry earlier this month, warning that the cuts would cripple the office’s work and undermine America’s competitive advantage in clean energy research and development.

Unlike at his confirmation hearing back in January, where the Secretary touted his record of support for wind energy as Texas Governor, he can’t hide from these budget numbers:

Sources: https://rules.house.gov/bill/115/hr-244; https://www.whitehouse.gov/sites/whitehouse.gov/files/omb/budget/fy2018/doe.pdf

On top of these numbers, some of the actions taken early in Secretary Perry’s tenure appear to undermine clean energy momentum, like the recently announced grid study that many people, including prominent Republicans, say is biased against renewables, or the recent announcement that DOE is dismantling its Office of International Climate and Technology, which was doing important work on international clean energy technology exchange and policy.

So there’s clearly a troubling pattern on clean energy early in the administration, and the appropriators should drill down on that with the secretary.

Please don’t hurt our national labs

At his January senate confirmation hearing, Secretary Perry called our national labs “the repository of some of the extraordinary brilliance in the world,” and he pledged to defend and support our national labs and our capacity for science and technological innovation. But the administration’s budget reflects just the opposite.

Flow of US/SE Program Funding to National Laboratories (FY 2014 enacted). Source: DOE

One third of EERE’s budget goes to the national labs, so a 70% cut not only hurts those folks in Golden, CO, at the National Renewable Energy Laboratory; it also hurts folks at other labs, like Oak Ridge National Laboratory (ORNL), which received $117 million from EERE in FY 2015 alone.

DOE’s Office of Science, the nation’s largest sponsor of the physical sciences (it manages 10 of the 17 national laboratories), would see its budget cut by 17% under the Trump proposal.

The administration’s proposed 31% cut to the Office of Nuclear Energy is really going to hurt folks at the Idaho National Laboratory, and people are already sounding alarm bells. And the proposed 58% cut to the Office of Fossil Energy would cripple West Virginia’s National Energy Technology Laboratory. Most of the national labs are involved in the important grid and infrastructure work being done at OE and the innovative clean energy R&D work at ARPA-E as well.

Appropriators know that no matter what DOE office or program you cut, critical work and thousands of jobs are on the line at the national labs.

Also on the line? Our nation’s scientific and technological priorities, and credibility.

Do you or don’t you …support an efficient, secure, reliable grid?

The Office of Electricity Delivery and Energy Reliability (OE) works to strengthen, transform, and improve energy infrastructure. It works with industry, the national labs, and government partners to help modernize the electric grid and increase the resilience of electric infrastructure.

At his confirmation hearing in January, Ranking Member Cantwell (D-WA) asked Mr. Perry about the work of the Office of Electricity, and whether he understood “that office’s capabilities on storage, on cyber, on transforming the grid, on all of those things and are committed to that office.” Mr. Perry replied that these are among the “most important aspects of the agency,” and said that “those functions that are under that agency, there is great support in general for that.”

And yet the administration’s FY18 budget request cuts the office by almost half; more specifically, it seems to target transmission and reliability, smart grid R&D, and energy storage for the deepest cuts, from 64% to 80%, essentially decimating the office’s work.

The appropriators need to ask the secretary to explain the contradiction here. Why would we be cutting support for work that makes our grid more efficient, affordable, reliable, and less vulnerable at a time when cyber threats and threats from extreme weather are increasing?

5 Reasons Why Congress Should Immediately Greenlight the Nation’s Natural Disaster Responder-in-Chief

UCS Blog - The Equation (text only) -

One thing this Congress seems to agree on is who the new Federal Emergency Management Agency (FEMA) Administrator will be.  The Senate is expected to take an evening confirmation vote on the nomination of Brock Long to be FEMA’s Administrator this coming Monday.

If Congress confirms Mr. Brock Long as the FEMA Administrator, he will serve as the principal advisor on emergency management under President Trump and John F. Kelly, the newly appointed Secretary of the Department of Homeland Security.

Congress should move swiftly to confirm Mr. Long so he can be at the helm of FEMA as soon as possible. Here are five reasons why. 

#1: It’s Hurricane Season!

June 1 was the official start of hurricane season, and NOAA’s National Hurricane Center forecasts a 45 percent chance of an above-normal season, with a 70% likelihood of 11 to 17 named storms, 2 to 4 of which could become major hurricanes. And the data show that hurricanes in the North Atlantic region have been intensifying over the past 40 years. Currently, eyes are on the Gulf of Mexico because of the potential threat of a serious storm brewing.

#2: Climate change is increasing the frequency and intensity of extreme weather events and pushing us into “Truly Uncharted Territory.”

In her blog, Astrid Caldas highlighted some of the sobering facts that the World Meteorological Organization (WMO) published in their State of the Global Climate for 2016 report. Most striking of all is the fact that we are truly in uncharted territory because of the unknown possible consequences of the combined record-breaking temperatures, new highs in carbon dioxide emissions, and new records for global sea level rise (to name a few).

But we do know that losses from natural disasters (hurricanes, wildfires, flooding, earthquakes, etc.) are on the rise due to growing populations and urban development in high-hazard areas, and to climate change, which is increasing the frequency and intensity of extreme weather events. In fact, the data show:

During the first quarter of 2017, NOAA’s National Centers for Environmental Information (NCEI) found that the U.S. had 5 weather and climate disasters with over $1 billion in damages each, including 1 flooding event, 1 freeze event, and 3 severe storm events, which combined caused 37 deaths. This follows on the back of 2016, which had 15 “billion-dollar” natural disaster events (each event either reached or exceeded $1 billion in losses) and was a record year for inland flooding disasters, with 4 inland flood events (since 1980, no more than 2 inland flood events had occurred in a single year). Just in the month of June (to date), there have already been 3 disaster declarations, in New Hampshire, Missouri, and Arkansas.

#3: We’re over a month past the 100-day mark of the Administration, and yet a large number of key executive branch positions remain unfilled.

The Washington Post, in collaboration with the Partnership for Public Service, is tracking positions that need Senate confirmation. According to their tally, of the 558 key positions that require Senate confirmation, only 42 have been confirmed. While it is a relief to have Brock Long in line to take over as FEMA Administrator, he will be lacking the support team he needs as long as positions remain unfilled, including two key FEMA posts – the Deputy administrator and the Deputy administrator for protection and national preparedness – and colleagues in sister agencies, including the NOAA and NASA administrators.

#4: Unlike President Trump’s nominations for other key posts, Mr. Brock Long actually has years of relevant experience and leadership. 

In his words, Mr. Long has two decades of service “helping communities and organizations prepare for, respond to, and recover from disasters.” He comes most recently from the private sector, working for Hagerty Consulting, and was previously the Director of Alabama Emergency Management for four years under Governor Bob Riley. During Mr. Long’s stint in Alabama, he responded to 14 disasters (8 of which were presidential declarations). He has also served as the FEMA Region IV (AL, FL, GA, KY, MS, NC, SC, and TN) Hurricane Program Manager, as Georgia’s statewide Hurricane Program Manager, and as School Safety Coordinator with the Georgia Emergency Management Agency.

#5: Mr. Long understands the need for investing in pre-disaster mitigation to safeguard communities now, reducing the costs of future disasters and lessening the burden on federal taxpayers.

Highlands, N.J., Feb 7, 2013 – This home was elevated prior to Sandy and received only minor damage, but the homeowner has opted to utilize FEMA’s new Advisory Base Flood Elevation (ABFE) guidelines to mitigate potential future damage. Photo by Rosanna Arias/FEMA.

During the June 7 Senate Homeland Security & Governmental Affairs Committee hearing, Senator Hassan (D-NH) spoke to the importance of rethinking how the federal government provides funding to mitigate against disasters, and of the need for increased pre-disaster mitigation (PDM) funding (noting that the Trump Administration cut the PDM grant program in half in the President’s proposed Fiscal Year 2018 Budget).

Mr. Long agreed with Senator Hassan on the importance of pre-disaster mitigation and responded to say that mitigation is the “cornerstone of emergency management. If we ultimately want to reduce costs in the future for disasters, we have to do more mitigation.” He followed up to say “If confirmed, I would like to work with the committee to evaluate all of the mitigation funding, not just pre-disaster mitigation, but how do we possibly budgetize all of it up front to do more work to reduce disaster costs, rather than basically having to get hit to be accessing the mitigation funding that’s there.”

Given Brock Long’s experience with responding to disasters, it is not surprising that he values the benefits of putting resources toward mitigating disaster risk on the front end.

Time is of the essence for Congress to vote “Yea” on our next FEMA chief

When things in Washington, DC, are so divisive and partisan, let’s have Congress take pleasure in doing a light but important lift and confirm Mr. Brock Long as our next FEMA chief. I know that if I were in Congress, I wouldn’t want to have to explain to my constituents why, with these odds of a volatile hurricane season, I didn’t do all I could to get a new and qualified FEMA Administrator at the helm of our nation’s disaster readiness.

NOAA National Centers for Environmental Information (NCEI), U.S. Billion-Dollar Weather and Climate Disasters (2017). https://www.ncdc.noaa.gov/billions/

As Washington Flounders, Midwest Businesses Forge Ahead on Clean Energy

UCS Blog - The Equation (text only) -

If you’ve been watching the news lately, you know it can be depressing. With President Trump announcing that the United States will back out of the Paris Climate Agreement, his executive order to dismantle many climate policies, and the proposed EPA budget cuts, things look bleak for US leadership on climate.

But there’s hope! Despite all the negative attacks on clean energy at the federal level, clean energy momentum is happening, including in the Midwest. Businesses are leading the charge through an effort known as RE100.

What is RE100?

RE100 is a collaborative, global initiative of influential businesses committed to achieving 100 percent renewable energy, and working to greatly increase demand for renewable energy. There are currently 96 RE100 companies that have made a commitment to power their businesses with 100% renewable energy.

This is important because the industrial sector uses more energy than any other end-use sector, approximately 54% of the world’s total delivered energy. Businesses from all over the world, and from an array of sectors, have joined this commitment to support the transition to a clean energy economy. While there are some businesses on the list that may not surprise you like Google, Apple, and IKEA, there are some businesses, particularly in the Midwest, that may.

Midwest businesses are on board

You may not expect to see Detroit-based car giant General Motors (GM) on the list, but they have committed to using 100% renewable electricity across their global operations by 2050. This commitment, combined with their pursuit of developing affordable electric vehicles, positions GM to be a clean energy leader in their industry. With a global annual consumption of nine terawatt-hours of electricity, equivalent to more than the total yearly energy consumption of Nicaragua, Senegal, and Uganda combined, moving forward with a 100 percent renewable energy goal is no small feat. Their roadmap to achieving this goal includes adding 30 megawatts of solar arrays at two facilities in China, completing an installation of 466 kilowatts of solar at an operations plant in upstate New York, and their largest-ever renewable energy purchase in 2016, a commitment to buy 50 megawatts of wind power.

Salesforce, the Chicago-based CRM software solutions and cloud computing company, has committed to increasing the percentage of renewable energy powering its global operations and reaching its goal of 100% renewable electricity. To work towards this goal, Salesforce signed a 12-year wind energy agreement for 40 megawatts (MW) from a West Virginia wind farm through a virtual power purchase agreement. Last year they announced their second renewable energy agreement, a 12-year contract for 24 MW from a Texas wind farm.

Headquartered in Grand Rapids, Michigan, Steelcase is an office furniture manufacturer. Their long-term commitment to 100% renewable energy reflects the company’s larger energy strategy, which has resulted in a 60% reduction in energy use since 2001. Steelcase has chosen to meet its commitment through the purchase of renewable energy credits (RECs) from a portfolio that includes non-emitting sources like wind and hydroelectric energy. Steelcase has also created a program to encourage the company’s suppliers to purchase RECs from new wind energy facilities. Partners that choose to participate will get Steelcase’s volume discount pricing.

Procter & Gamble (P&G), headquartered in Cincinnati, has also joined RE100, instituting an intermediate goal of 30% renewable energy by 2020. This is in addition to adopting a goal to reduce absolute greenhouse gas (GHG) emissions by 30% by 2020. Clean energy investments make economic sense: P&G’s energy efficiency improvements alone have resulted in a cost savings of more than $350 million since 2010.

We push forward

President Trump may have announced he is going to withdraw the US from the Paris Climate Agreement, but hundreds of businesses and investors have vowed to continue to support climate action toward meeting the Paris climate targets. And more and more businesses continue to commit to 100% renewable energy.

Significant action on climate change is happening around the world, it’s gaining momentum, and it’s increasingly driven by the private sector.

Renewable energy prices are falling, and investing in renewable energy just makes economic sense for many large businesses. Advanced Energy Economy (AEE) found that 71 of the Fortune 100 companies currently have renewable energy or sustainability targets.

What can you do?

As an individual, you can follow the business community’s lead: commit to supporting the Paris Agreement by reducing your own carbon emissions, support US states, cities, businesses, investors, universities, and other entities taking strong climate action, and urge President Trump to protect federal safeguards for our health and environment.

Beliefs Won’t Save Tangier Island, Virginia, From Sea Level Rise—Informed Preparedness Will.

UCS Blog - The Equation (text only) -

On Tuesday, President Trump called James Eskridge, the mayor of Tangier Island, Virginia, and told him that sea level rise isn’t an issue for Tangier, one of the most threatened communities in America.

My heart sank as I read it, and I was reminded of how our core beliefs are so central to our worldviews–and how we all struggle to accept evidence that challenges them.

How it feels when your core beliefs are under threat

Case in point: me. The last time I took my kids to the doctor for their checkup, the doctor gave me this advice: Stop giving them apple slices as a bedtime snack; give them Häagen-Dazs instead. Feed them as much ice cream as they can eat. My heart nearly stopped, and resistance coursed through my veins as my kids cheered.

I had to set my own biases aside and listen to an expert: though my kids are healthy, for their bodies to keep up with the rapid growth and changes of the ‘tween years, they need to bulk up. But the idea of feeding my kids unlimited ice cream (the sugar!) flies in the face of my core beliefs about how a parent should feed her children.

In the Post’s account of Eskridge’s call with President Trump, I saw myself.

“He said we shouldn’t worry about rising sea levels,” Eskridge said. “He said that ‘your island has been there for hundreds of years, and I believe your island will be there for hundreds more.’”

Eskridge wasn’t offended. In fact, he agreed that rising sea levels aren’t a problem for Tangier.

“Like the president, I’m not concerned about sea level rise,” he said. “I’m on the water daily, and I just don’t see it.”

In most US counties, less than half of residents think climate change will affect them personally. In Accomack County, home to Tangier Island, only 40% of respondents said that climate change would affect them personally.

Like Eskridge, I too had built an opinion based on my daily observations. I watched my kids eat every day. Sure, I noted that they were skinnier than some of their friends. But I wanted to think that following all the advice that’s out there (low sugar, plenty of healthy fats, lots of fruits and veggies) was enough.

Likewise, I want so desperately for sea level rise not to be an issue for Tangier Island, because the idea that the residents of Tangier–or any other coastal town, big or small–could lose their land and their homes is deeply upsetting.

We’ve seen people displaced after hurricanes: residents of New Orleans, a city whose population is still down 20% from its pre-Katrina level, can speak to the loss of deep family ties in the city since its people became dispersed.

But those are rare disasters, right? When it comes to the steadily expanding reach of the sea, we don’t want to believe that it could fundamentally change our towns and communities. As a nation, we tend to think that climate change won’t affect us personally. Rather, it will affect people somewhere else, far from home.

But over and over, the science tells us that rising seas are a growing problem for Tangier Island and hundreds of other U.S. communities.

As sea level rises over the next 30 years, the number of tidal flooding events is projected to rise dramatically.

Examining our core beliefs and acknowledging the need for change

Erosion is absolutely a problem for Tangier Island. But it is not the only problem. Underlying that erosion is the fact that Tangier Island is located in a hotspot of sea level rise caused by a combination of factors. By expanding the reach of waves and surge during storms, sea level rise can exacerbate erosion.

But when we see erosion as the only problem, and design solutions to only that problem, we put ourselves in danger. In order to fully address the issue of erosion, the residents of Tangier must also be able to plan for future erosion, which requires an acknowledgement that sea level rise is an important part of the equation.

What this means is that we need leaders who value our core beliefs and cultural identities while being aware of the realities on the ground.

President Trump, in his conversation with Mayor Eskridge, did just the opposite. He neither recognized the rich history of Tangier Island nor committed to policies that would truly help to maintain the island’s community and culture. Instead, he echoed Eskridge’s belief that sea level rise isn’t a problem for Tangier, even while making motions to end funding of the Chesapeake Bay Program. (Note that funding has since been restored, at least for now.)

A critical media can remind us of where the balance of evidence lies

When a national news outlet like The Washington Post publishes a piece like this, it’s bound to get attention. So every such article is an opportunity to push back against an administration that consistently devalues science.

The current article in the Post incorporates some science:

“The small island…shrinks by 15 feet each year, according to the Army Corps of Engineers, which points to coastal erosion and rising sea levels as the cause.

…Scientists predict they will have to abandon the island in 50 years if nothing is done.”

But one could imagine this piece structured in a different way that highlighted how very out of alignment Trump and Eskridge’s views are with mainstream climate science. The US Climate Change Science Program issued a 300+ page report with over 20 authors on the sensitivity of this region to sea level rise. NOAA monitors sea level rise rates around the country, and the observations clearly show high rates of sea level rise in the mid-Atlantic region.

Our country needs a coherent approach to rising seas

Tangier Island is just one example of a community asking for federal assistance to cope with a growing environmental problem.

We all like simple solutions, such as the one that Eskridge proposes:

“Currently, the Army Corps of Engineers is scheduled to begin building a jetty on the west channel of the island some time this year…But Eskridge said they need a jetty, or perhaps even a sea wall, around the entire island.

He believes Trump will cut through red tape and get them that wall.”

As highlighted in a recent piece about Tangier Island in The New York Times, the decisions about where and how much to invest to protect a given community are rarely simple because they force us to evaluate how our core beliefs may need to change in the face of growing climate challenges.

As an increasing number of coastal communities grapple with frequent flooding, our country needs a coherent approach to providing research and resources that build community-level resilience. In the absence of this, resources for adaptation could easily be distributed unfairly or allocated to projects that protect one community while leaving other, neighboring communities to fend for themselves.

Growing stronger together

My core values about nutrition have shifted to be less prescriptive because, after all, what I want is for my kids to grow up to be strong and healthy.

Recognizing the reality of sea level rise and building a coherent, nationwide approach to the challenges it presents are the best chance we have to preserve the safety and prosperity of our coastal communities.

Data sources: Yale Climate Opinion Maps 2016; Dahl et al. 2017

The Case of the Missing Numbers

UCS Blog - All Things Nuclear (text only) -

Good performance requires good long-term planning. For a federal agency like the National Nuclear Security Administration (NNSA), one important function is preparing its part of the federal government’s annual budget request, which normally includes information on projected budget requirements for future years. This year, not so much.

This is important because the Congress, which has final say on what the government funds, needs to know which programs will require increased funding in the following years. Those numbers give Congress and the public a sense of priorities and long-term planning that informs the annual federal budget process.

For the NNSA, those long-term budget numbers are called the Future-Years Nuclear Security Program, or FYNSP (commonly pronounced  “fin-sip”), and they are so important that they are, in fact, required by Congress.  In a typical budget request, the budget numbers are simply listed as “Outyears” and they are provided both by location—each NNSA facility, including the three nuclear weapons labs—and for each program area and project.

I assume this isn’t why the budget numbers are missing . . .

However, for almost the entire FY 2018 request, the NNSA budget does not provide future year numbers. In particular, for the Weapons Activities programs (which, as we discussed in The Bad, received FY 2018 requests substantially higher than the Obama administration projected in its FYNSP), there are no such projections at all in this budget. For example, we don’t know how much the NNSA thinks the B61 life extension program will cost in FY 2019-FY 2022. That is information that the Congress should have.

(To be fair to the NNSA, the Department of Defense, where the budgets are far, far larger, also did not include outyear budget projections.)

The NNSA FY2018 budget offers an explanation for why there are no outyear budget figures:

Estimates for the FY 2019 – FY 2023 base budget topline for the National Nuclear Security Administration reflect FY 2018 levels inflated by 2.1 percent annually. This outyear topline does not reflect a policy judgement. Instead, the Administration will make a policy judgement on amounts for the National Nuclear Security Administrations’ FY 2019 – FY 2023 topline in the FY 2019 Budget, in accordance with the National Security Strategy and Nuclear Posture Review that are currently under development.

So, the budget doesn’t have projections because the NNSA is awaiting the results of the Pentagon-led Nuclear Posture Review and the Congressionally-mandated National Security Strategy that the Trump administration is conducting.

Frankly, that explanation is not satisfactory. There is almost no chance that the Nuclear Posture Review will decide to abandon most of the programs designed to maintain and improve the weapons in the US nuclear arsenal. And significant changes to the programs that are already underway (updates to the B61, W88, and W76) are highly unlikely because such modifications would inevitably lead to delays that the Pentagon and the NNSA would not support. For example, as mentioned in “The Bad,” NNSA officials have said any delays would affect certification requirements for the B61.

The only exception is the life extension program for the W80, which is intended for use on the proposed new nuclear-armed cruise missile, the Long-Range Standoff weapon, or LRSO. Secretary of Defense Mattis has testified that he is not yet convinced of the case for the LRSO, so there is a possibility that the program could be cancelled. (And it should be.) But even so, the NNSA should plan as if it will not be cancelled: the cost of doing the required budget work on a weapon that is later cancelled is significantly less than the cost of failing to plan for one that goes forward.

Obama’s First NNSA Budget

For comparison, the Obama administration faced a similar situation when it came to office in 2009. Like the Trump administration, the first budget request, for FY2010, was delivered to Congress later than normal, in May rather than February. The Obama administration was also, like the Trump administration, doing a Nuclear Posture Review and a National Security Strategy. There was also a change in the political party of the President, so one might expect more substantive changes in nuclear weapons policy than if there was continuity in the White House.

Despite those similarities, the Obama administration delivered a FY2010 budget request that included projections for future years. To be fair, the Obama budget also stated that the projections for Weapons Activities were “only a continuation of current capabilities, pending upcoming strategic nuclear policy decisions.” But the budget actually included additional money for a study of the B61 life extension program, along with further increases in later years.

Moreover, the status of Weapons Activities was dramatically different in 2010 than it is now. In 2010, the W76 was the only active life extension program, and it was already in full production. The B61 was still in study phase, and there was no other active work being done on weapons in the stockpile.

Now, in 2017, the NNSA is involved in four major warhead projects simultaneously, three of which are ramping up substantially. The idea that the NNSA is putting the planning efforts for future work on these programs essentially on hold for a year is troubling.

I suspect one important factor leading to the missing future year budgets is the lack of people in place to do the planning. The man in charge of the NNSA is Lt. Gen. Frank Klotz (Air Force, retired), who by all accounts has done an able job running the agency. He is a holdover from the Obama era, and the Trump team did not ask him to stay on until the very last day of the Obama administration (he dutifully agreed). But no other officials have been nominated for any slots, leaving key positions like the deputy administrator empty while other slots have officials serving only in an acting capacity.

Playing with numbers

One small thing flagged but not described in The Good is the level of increases the Trump administration claims for its NNSA budgets compared to the Obama team’s budgets. The Trump budget claims an 11% increase for the NNSA overall, and an even higher increase of around 15% in Weapons Activities, where the work on nuclear weapons is funded.

But those increases are in comparison to the final FY2016 budget, not the FY2017 budget. Notably, the FY2018 request only lists the FY2017 numbers that were in place under the Continuing Resolution (CR) that operated for a good portion of the year.

But in fact Congress did pass a final appropriations bill, albeit very far into the 2017 fiscal year, and for the NNSA those numbers were significantly higher than under the CR. If you compare the Trump budget to those figures, the NNSA budget receives an increase of 7%, not 11%, and the budget increase for Weapons Activities is 11%, not 15%.
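To make the baseline effect concrete, here is a minimal Python sketch of the percentage arithmetic. The dollar figures are placeholders chosen only for illustration, not the actual NNSA appropriations:

    # Illustrative only: how the claimed increase depends on the baseline chosen.
    # The dollar figures below are hypothetical placeholders, not real NNSA numbers.
    def percent_increase(new, old):
        """Percentage increase of `new` over `old`."""
        return 100.0 * (new - old) / old

    fy2018_request = 13.9   # hypothetical request, in billions of dollars
    fy2016_enacted = 12.5   # hypothetical FY2016 enacted level
    fy2017_omnibus = 13.0   # hypothetical FY2017 final appropriation

    print(f"vs. FY2016 enacted: {percent_increase(fy2018_request, fy2016_enacted):.1f}%")
    print(f"vs. FY2017 omnibus: {percent_increase(fy2018_request, fy2017_omnibus):.1f}%")

With these placeholder inputs, the same request looks like roughly an 11 percent increase against the lower baseline and about 7 percent against the higher one, mirroring the framing choice described above.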

Make no mistake, those are still substantial increases (though as mentioned in The Good they are not dramatically more than increases the Obama administration requested and got Congress to support).

But it’s troubling that the Trump budget was presented in a way that makes it look like it has increased NNSA funding more than it actually has. Who is the audience for this charade?

 

How Do We Get to 100% Renewable Energy? Could be Storage, Storage, Storage

UCS Blog - The Equation (text only) -

As communities, companies, and even entire Midwestern utility companies move to supply 100% of electricity needs from renewable energy, the question presents itself: is this even possible? The answer, it turns out, is yes—and it’s made possible by the technical capabilities of advanced energy technologies (and especially storage).

This is UCS, so let’s talk about how to get the hard stuff done. To replace conventional generation with renewables, eventually all the services from fossil-fuel power plants have to be supplied by adding wind, solar, smart consumer appliances and electric vehicles, and storage.

As renewable energy is added by businesses and utilities, here are 5 great building blocks for a future that is 100% renewable energy.

Hot water heated with solar energy is energy storage. credit: DIY Home Sweet Home

1. Solar is capable of so much more than energy

The utility industry has begun to recognize that newly available technologies can provide the reliability functions it needs. The adoption of digital controls on solar and wind systems, particularly the inverters, makes reliability functions available and useful already, without storage.

The fastest-changing parts of the power grid, and thus of the toolbox of solutions, are solar (with present deployments averaging over 10,000 MW per year) and the sudden return of interest and capital to utility-scale energy storage.

The California grid operator has taken up the challenge posed by UCS to demonstrate that the solar farms being built today are capable of providing a range of “essential services.” See this summary of a field test where a solar farm demonstrated faster and more accurate performance than other generator types. Wind turbines have also demonstrated these capabilities.

2. Storage is becoming part of the power plant

New energy storage deployments demonstrate just how quickly we can overcome the limit that the sunset creates for solar. The trajectory of energy storage substituting for conventional generation can be traced from actual practices. Beyond the early and largest examples of non-battery energy storage recently illustrated in the New York Times, there is widespread and dramatic potential going forward for the use of battery storage by businesses and in hybrid power plants.

Battery storage added at a power plant, both conventional and renewable, can take on duties that were met by old generators. First seen in remote locations, battery storage paired with generation began in isolated grids in places like Hawaii and Chile, where ancillary services from very small generator fleets were unavailable or were constraining grid operations. This helped establish the technical and commercial foundation for expansion to larger grids in the United States.

Recent energy storage deployments now demonstrate a turning point. Present state-of-the-art technology adoption includes manufacturer General Electric (GE) adding energy storage to improve the performance of its line of peaking plants.

With short duration storage now understood as providing ancillary and essential services, GE is delivering hybrid plants with storage and a gas turbine integrated in a system with a single set of controls. The GE hybrid system uses the storage to provide the reliability capabilities of the gas generator with instantaneous response, regardless of whether the unit is started and burning fuel when response is needed.
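As a rough illustration of that control concept (not GE’s actual algorithm; all names and numbers here are invented for the example), a hybrid plant controller might look something like this in Python:

    # Minimal sketch: meet a sudden reserve call with the battery while the
    # gas turbine starts. All parameters are hypothetical.
    def dispatch(reserve_call_mw, turbine_online, battery_energy_mwh,
                 turbine_start_minutes=10.0):
        """Return (battery_mw, turbine_mw) for the first instant of a reserve call."""
        if turbine_online:
            # Turbine is already running and can pick up the load directly.
            return 0.0, reserve_call_mw
        # Turbine is off: the battery responds instantly, sized to carry the call
        # at least until the turbine can be started and loaded.
        sustainable_mw = battery_energy_mwh * 60.0 / turbine_start_minutes
        return min(reserve_call_mw, sustainable_mw), 0.0

    # Example: a 25 MW call arrives while the turbine is shut down.
    print(dispatch(reserve_call_mw=25.0, turbine_online=False, battery_energy_mwh=10.0))

The point of the single set of controls is exactly this hand-off: the fast resource covers the first seconds or minutes, and the fuel-burning unit takes over once it is up to speed.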

Switching to electric can include energy storage. credit: NRECA

3. Solar plus storage is getting cheap

Well before planning for large-scale grids that run on 100% renewable energy, long-duration storage is already being paired with variable renewable generation (solar now; look for wind soon), making the combination able to satisfy market, reliability, and regulatory requirements.

Last month, Tucson Electric Power announced a contract with major power plant company NextEra that should extend production from a large solar array by 4 hours, and operate with the technical capabilities of a conventional plant. The amazing power purchase agreement price, under $0.045/kWh over 20 years, should put everyone on notice that solar plus storage is a very serious competitor based on innovations and cost-reductions.

4. Storage can pay for itself

Batteries used at the customer’s home or business to supplement the grid can lower the need for utility plants. For example, customers with a demand charge can use energy storage (combined with solar or not) to reduce that charge.
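A minimal sketch of the arithmetic, using a hypothetical tariff and load profile rather than any utility’s actual rates:

    # Hypothetical example: battery peak shaving to reduce a monthly demand charge.
    demand_charge_per_kw = 15.0                        # $/kW of monthly peak (made up)
    hourly_load_kw = [80, 90, 150, 200, 180, 120, 95]  # illustrative peak-day profile
    battery_power_kw = 60                              # battery discharge limit
    shave_target_kw = 140                              # try to hold demand at this level

    peak_before = max(hourly_load_kw)
    peak_after = max(load - min(battery_power_kw, max(0, load - shave_target_kw))
                     for load in hourly_load_kw)

    savings = demand_charge_per_kw * (peak_before - peak_after)
    print(f"Peak reduced from {peak_before} kW to {peak_after} kW, "
          f"saving ${savings:.0f} on this month's demand charge")

(A real analysis would also check that the battery stores enough energy to sustain the shave across the entire peak period.)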

This business model addresses the utility premise that a consumer demand charge reflects the need for utility plants. Regulators should note that this charge has not been aligned with the utility’s system peaks, and more accurate matching of demand charges and system peaks will create a better regulatory outcome (i.e. lower costs for the utility and consumers) than current practice. Consulting firm McKinsey & Company advises that battery storage changes the industry, and in more than a few ways.

5. The future is now

With the introduction of advanced inverter controls and better energy storage, decision-makers are, for the first time, facing the reality that renewables and storage may be able to replace what’s currently used.

At present, grid operators are showing they can maintain reliability when renewable energy reaches 40-60% of electricity demand in particular hours. Competitive prices are driving more wind and solar every day. An energy future of 100% renewables can be seen coming soon. With multiple business models for storage replacing conventional utility plants, we can see where this is headed.

Photo: Chris Hunkeler/CC BY-SA (Flickr)

This Summer’s Gulf “Dead Zone” Could Be Bigger Than Connecticut—and Trump’s Budget Cuts Would Make It Worse

UCS Blog - The Equation (text only) -

Summer is almost here, and you know what that means. Sun, sand, and…a watery wasteland devoid of all life? Yep, this is the time each year when a team of federal and university scientists predicts the size of the so-called dead zone that will develop in the Gulf of Mexico later in the summer. We’re waiting for that official prediction, but based on federal nitrate flux data and Midwest weather patterns this spring, it seems likely that it will be bigger than usual.

That means a swath of marine habitat considerably larger than the state of Connecticut could be lifeless by summer’s end—a haunting prospect for coastal ecosystems, fisheries, and the men and women who earn their livelihoods from them. And the Trump administration’s budget proposal and general antagonism toward science and environmental protection are likely to make the problem worse in the future.

Dead zones don’t talk

Marine and coastal dead zones are the result of a phenomenon called hypoxia—a state of low dissolved oxygen that occurs when excess nutrients, such as nitrogen and phosphorus, accumulate in bodies of water. These nutrients feed blooms of algae that, when they die and decompose, deplete the oxygen in the surrounding water. Hypoxia is a silent killer, suffocating organisms that can’t escape the low-oxygen zone quickly enough, and causing others to flee.

As we wrote a year ago when the National Oceanic and Atmospheric Administration (NOAA) predicted an “average” (roughly Connecticut-sized) Gulf dead zone, even average is not the same as normal. Nitrogen and phosphorus can come from many sources, but the largest are due to human activity, including sewage discharges and fertilizers from farm fields running off into rivers and streams. In 2010, researchers at the University of Illinois showed that the problem of runoff from industrialized, corn-and-soybean intensive agriculture, with its system of underground drainage channels, dwarfs the impact of cities and other nutrient sources in the Midwest. Essentially, each year the Mississippi River and its many tributaries meandering through the Corn Belt quietly funnel a vast amount of agricultural pollution into the Gulf.

Image courtesy National Oceanic and Atmospheric Administration

April showers bring May flowers, but what do May downpours bring?

The size of the dead zone in any given year is dependent not just on how much fertilizer was applied to fields in the drainage area, but also on the amount of rainfall available to carry it from the land into the rivers and on to the Gulf. This spring has been a wet one in the Midwest, and the drenching rains and widespread flooding that hit parts of the region in late April and continued into May have created ideal conditions for a large flush of nutrients downstream.

The two graphs below from the US Geological Survey (USGS)—which monitors stream flow and nitrate levels in rivers and streams—show how the nitrate “flux” in the Mississippi River basin in May 2017 compares with previous years, and how the actual size of the dead zone tracks those fluxes each year. It’s easy to see the contrast between a wet year like this one and, say, the devastating drought year of 2012.

These are preliminary data that the scientific team led by NOAA will use in making its dead zone prediction this month.
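The basic idea behind those graphs can be sketched as a simple regression of dead-zone area on spring nitrate flux. The arrays below are made-up placeholder values, not USGS or NOAA data, and the real forecast models are considerably more sophisticated:

    # Purely illustrative: relate May nitrate flux to summer dead-zone area.
    import numpy as np

    may_flux_kt_n = np.array([80.0, 120.0, 60.0, 150.0, 110.0])   # placeholder values
    zone_area_1000km2 = np.array([13.0, 18.0, 9.0, 22.0, 16.0])   # placeholder values

    slope, intercept = np.polyfit(may_flux_kt_n, zone_area_1000km2, 1)
    this_years_flux = 140.0   # a hypothetical wet-spring flux
    prediction = slope * this_years_flux + intercept
    print(f"Illustrative prediction: {prediction:.1f} thousand square kilometers")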

 

 

 

The solution to dead zone pollution…

Nope, it’s not dilution. Even very large bodies of water like the Gulf of Mexico aren’t safe from this annual, preventable destruction. And recurring dead zones and toxic algae blooms also plague other large water bodies including the Chesapeake Bay and Lake Erie. Just today, NOAA predicted a larger-than-average dead zone in the Chesapeake Bay this summer.

And there’s more bad news—climate change is likely making these problems worse. A study published earlier this year examined runoff data from the drought year 2012 and the following, wetter year, to show how “weather whiplash” can increase the flow of nitrate into the Gulf. So we can expect more of the same as the cycle of Midwestern floods and droughts becomes more intense and erratic in the future.

In addition, as the climate heats up, shallow waters like the western end of Lake Erie that abuts Toledo, Ohio, will be warmer and thus will likely suffer more toxic algae blooms that taint the city’s drinking water, causing recurring health risks and economic pain.

It’s clear that decreasing the size and severity of algae blooms and dead zones will require significant reductions to current rates of fertilizer runoff in the Midwest. And fortunately, there is bountiful evidence about how to do that on the region’s farms. As we’ve documented in two recent reports, for example, innovative farming practices such as extended crop rotations can cut fertilizer use significantly, and planting perennial prairie strips in and around cropland can dramatically reduce the amount of nitrogen that escapes from those lands into waterways. Even better, we’ve shown that such practices and systems are also good for farmers’ bottom lines.

Budget cuts could grow the dead zone (and shrink opportunities for farmers)

But just when pollution-cutting practices are showing such promise and are needed more than ever, the Trump administration’s proposed Department of Agriculture (USDA) budget could hamstring the department’s efforts to help farmers implement them by cutting programs that deliver financial and technical support.

Moreover, proposed major cuts at NOAA, the USGS, and the Environmental Protection Agency would hamper the ability of scientists at those agencies to study and remediate the Gulf dead zone and other water bodies that suffer from hypoxia and toxic algal blooms due to fertilizer pollution.

Water pollution from agriculture has real impacts on farmers, coastal and lakeshore communities across the country, and millions of Americans. Even as we wait to see if this year’s problem in the Gulf will be as bad as we think, UCS is advocating for policies and budget investments that could truly tackle the problem in future years.

Join us by calling on Congress to reject the Trump administration’s unacceptable budget cuts at the USDA, and instead vote to fully fund proven programs that keep our water clean, improve farmers’ livelihoods, and help hungry families.

Signed, Sealed, Delayed? The New Fate of the Added Sugar Rule and Other Safeguards

UCS Blog - The Equation (text only) -

The FDA announced this week that it “intends to extend compliance dates” for the nutrition facts label final rules, which will include the separate line for added sugars. We celebrated the finalization of this rule last May as science-based advocacy prevailing to give consumers key information on the foods they consume. While the FDA has not yet announced exactly how long that extension will push back implementation, the food industry has asked HHS Secretary Tom Price to delay the rule’s enforcement three years, until May 2021.

But do food manufacturers really need even more time to help them, as FDA puts it, “to complete and print updated nutrition facts labels for their products,” or are they using this delay tactic to keep consumers from knowing how much sugar is in their food as long as possible? FDA first began its work to revise the nutrition facts label in 2004, and the proposed rule which included the added sugar line was issued in 2014. Industry has had ten years to think about how it could give consumers the information they want to make informed decisions and ten years to come to terms with the mounting evidence that excessive sugar consumption can lead to adverse health consequences including heart disease, obesity, diabetes, and hypertension.

The longer we delay giving consumers the knowledge and power to make informed decisions about the foods they buy and eat, the longer we are missing out on an opportunity to improve Americans’ overall health. Since this rule was first proposed in 2014, representatives from food industry trade organizations, including the Sugar Association, the Grocery Manufacturers Association, and the American Frozen Food Institute, have made inaccurate claims about the science linking sugar consumption to adverse health impacts, the ability of labeling to positively impact consumer health, and the burden of technical challenges and excessive record-keeping in measuring added sugar for food companies.

If instead of spending time and resources coming up with reasons the FDA shouldn’t have issued its final rule, the food industry had accepted established science on added sugars and proactively worked with the FDA to roll out the label, most foods would be bearing these labels today. In fact, some companies have already updated their labels well before the compliance date, showing how very possible it is!

All of the protest and delay has only made consumers skeptical of food companies that appear to be actively working to undermine a rule that would empower them with knowledge of how much sugar has been added to their foods. This move from Scott Gottlieb’s FDA is a disappointing step backward from progress made in food label transparency during the Obama administration. And sadly, the added sugar rule is just one of many that has recently been thwarted by agency delay under the Trump Administration.

The latest policy targets of industry’s stalling tactics

Just this week, EPA administrator Scott Pruitt issued a final rule that would delay implementation of the Risk Management Plan (RMP) amendments for 20 months, until February 19, 2019. This move came after several petitions from the American Chemistry Council and a handful of other chemical manufacturing corporations, oil and gas companies, and trade organizations asked the agency to reconsider the rule. Even after receiving thousands of public comments urging the EPA to enforce the rule as planned, including comments from individuals in low-income communities and communities of color that face the greatest risks from RMP facilities, the EPA sided with industry and went forward with its decision to delay.

Then there’s the ozone rule. My colleague, Gretchen Goldman, wrote a letter to Scott Pruitt after the EPA administrator announced that he would extend the deadline for promulgating the rule by one year due to “insufficient information.” Perhaps Pruitt has been spending too much time with oil and gas industry lobbyists, because the science is actually more than sufficient on the need for a stronger ozone standard, including a 1,251-page Integrated Science Assessment that found several “causal” and “likely causal” relationships between ozone pollution and health effects, confirmed by a slew of independent advisory committees and independent scientists since the Clean Air Scientific Advisory Committee (CASAC) recommended tightening the standard a decade ago.

Speaking of rules that are long overdue but are being delayed anyway, the silica rule, beryllium rule, and formaldehyde rule, which would have tightened standards that have been too weak for decades, have been targeted by this administration for further delay so that the construction industry, manufacturing industry, and oil and gas industry have more time to educate employees and change internal practices. You read that right. Even though the science on the health impacts of silica has been known since the 1970s, the industry still needs time to inform its own employees about it.

The Bureau of Land Management’s (BLM) Methane and Waste Prevention Rule, which was spared by senators when it was voted on in the final days of the Congressional Review Act window, is now being delayed by the agency, both because the rule’s fate is uncertain as it is challenged in court by industry organizations and three states, and because the American Petroleum Institute’s CEO, Jack Gerard, asked the agency to do so. This rule would reduce some of the most dangerous impacts of fracking for natural gas extraction, including leaks, venting, and flaring, thereby cutting methane pollution that can lead to elevated levels of ground-level ozone and other hazardous air pollutants like benzene, formaldehyde, and hydrogen sulfide, triggering asthma and even cancer.

Public health progress demands action today, without delay

Whether we’ve waited four, ten, or even forty years for a particular standard, we must remember that every day of delay means:

  • one more day that a pregnant mother trying to cut down on her sugar intake will have to guess whether the sugar in her food is natural or added;
  • one more day that a person of color living near a chemical plant in Texas will have to fear the consequences of a toxic leak or explosion;
  • one more day that a first-time homeowner could be exposed to formaldehyde from household items in amounts high enough to lead to an asthma attack or a nasopharyngeal cancer diagnosis;
  • one more day that a worker in a metal foundry could be exposed to beryllium at levels high enough to one day lead to chronic beryllium disease.

Each and every one of these scenarios represents one day too long. When each of these rules was finalized, the scientific and cost-benefit analyses supported the date of implementation. The only reason for delay now is political. Often, the reaction of industry trade associations to final rules designed to protect our public health makes it seem like government is blindsiding them. But in reality, the process that goes into crafting new rules is meticulously and thoughtfully executed by government, which takes years to gather input from stakeholders, including industry, to inform several iterations of rules that are eventually finalized. However, it seems not to matter whether industry has one year or ten years to prepare for a change; the end result is the same. The fact is that science-based policies threaten an industry’s status quo, and trade organizations with business interests in mind will work tirelessly to stop or slow any perceived disruptions in business as usual, regardless of what that means for public health.

 

Nuclear Leaks: The Back Story the NRC Doesn’t Want You to Know about Palo Verde

UCS Blog - All Things Nuclear (text only) -

As described in a recent All Things Nuclear commentary, one of two emergency diesel generators (EDGs) for the Unit 3 reactor at the Palo Verde Nuclear Generating Station in Arizona was severely damaged during a test run on December 15, 2016. The operating license issued by the Nuclear Regulatory Commission (NRC) allowed the reactor to continue running for up to 10 days with one EDG out of service. Because the extensive damage required far longer than 10 days to repair, the owner asked the NRC for permission to continue operating Unit 3 for up to 62 days with only one EDG available. The NRC approved that request.

Around May 18, 2017, I received an envelope in the mail containing internal NRC documents with the back story for this EDG saga. I submitted a request under the Freedom of Information Act (FOIA) for these materials, but the NRC informed me that they could not release the documents because the matter was still under review by the agency. I asked the NRC’s Office of Public Affairs for a rough estimate of when the agency would conclude its review and release the documents. I was told that their review of the safety issues raised in the documents wasn’t a priority for the NRC and they’d get to it when they got to it.

Well, nuclear safety is a priority for me at UCS. And since I already have the documents, I don’t need to wait for the NRC to get around to concluding its stonewalling—I mean “review”—of the issues. Here is the back story the NRC does not want you to know about the busted EDG at Palo Verde.

Emergency Diesel Generator Safety Role

The NRC issued the operating license for Palo Verde Unit 3 on November 25, 1987. That initial operating license allowed Unit 3 to continue running for up to 72 hours with one of its two EDGs out of service. Called the “allowable outage time,” the 72 hours balanced the safety need to have a reliable backup power supply with the need to periodically test the EDGs and perform routine maintenance.

The EDGs are among the most important safety equipment at nuclear power plants like Palo Verde. The March 2011 accident at Fukushima Daiichi tragically demonstrated this vital role. A large earthquake knocked out the electrical power grid to which Fukushima Daiichi’s operating reactors were connected. Power was lost to the pumps providing cooling water to the reactor vessels, but the EDGs automatically started and took over this role. About 45 minutes later, a tsunami wave spawned by the earthquake inundated the site and flooded the rooms housing the EDGs. With both the normal and backup power supplies unavailable, workers could only supply makeup cooling water using battery-powered systems and portable generators. They fought a heroic but futile battle and all three reactors operating at the time suffered meltdowns.

More EDG Allowable Outage Time

On December 23, 2005, the owner of Palo Verde submitted a request to the NRC seeking to extend the allowable outage time for an EDG to be out of service from 72 hours to 10 days. Longer EDG allowable outage times were being sought by nuclear plant owners. Originally, nuclear power reactors shut down every year for refueling. The refueling outages provided ample time to conduct the routine testing and inspection tasks required for the EDGs. To boost electrical output (and hence revenue), owners transitioned to refueling reactors only every 18 or 24 months and to shortening the duration of the refueling outages. To facilitate the transitions, more and more testing and inspections previously performed during refueling outages were being conducted with the reactors operating. The argument supporting online maintenance was that while it adversely affected availability (i.e., an EDG was deliberately removed from service for testing and inspecting), the increased reliability (i.e., tests to confirm EDGs were operable were conducted every few weeks instead of spot checks every 18 to 24 months) more than compensated for the lost availability. The NRC approved the amendment to the operating licenses extending the EDG allowable outage times to 10 days on December 5, 2006.
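A back-of-the-envelope sketch of that availability-versus-reliability tradeoff, with all inputs invented for illustration rather than taken from actual EDG statistics:

    # Hypothetical comparison: planned unavailability from online maintenance versus
    # the higher failure-on-demand probability of an EDG that is tested less often.
    HOURS_PER_YEAR = 8760.0

    def effective_unavailability(planned_outage_hours, failure_on_demand):
        """Crude measure: planned downtime fraction plus chance of failing when called."""
        return planned_outage_hours / HOURS_PER_YEAR + failure_on_demand

    # Online maintenance: a few short outages per year; frequent testing keeps
    # failure-on-demand low (values are made up).
    online = effective_unavailability(planned_outage_hours=4 * 24.0, failure_on_demand=0.01)

    # Refueling-outage-only testing: no planned downtime at power, but a higher
    # (hypothetical) chance the EDG fails when actually demanded.
    outage_only = effective_unavailability(planned_outage_hours=0.0, failure_on_demand=0.05)

    print(f"online maintenance:  {online:.3f}")
    print(f"outage-only testing: {outage_only:.3f}")

Under assumptions like these, the reliability gained from frequent testing can outweigh the availability lost to planned outages, which is the argument owners made.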

More NRC/Industry Efforts on Allowable Outage Times

While the EDGs have important safety roles to play, they are not the only safety role players. The operating license for a nuclear power reactor covers dozens of components, each with its own allowable outage time. Around the time that longer EDG allowable outage times were sought and obtained at Palo Verde, the nuclear industry and the NRC were working on protocols to make proper decisions about allowable outage times for various safety components. On behalf of the nuclear industry, the Nuclear Energy Institute submitted guidance document NEI 06-09 to the NRC. On May 17, 2007, the NRC issued its safety evaluation report documenting its endorsement of NEI-06-09 along with its qualifications for that endorsement.

To create yet another acronym for no apparent reason, the nuclear industry and NRC conjured up Risk Informed Completion Time (RICT) to use in place of allowable outage time (AOT). The NRC explicitly endorsed a 30-day limit on RICTs (AOTs):

“The RICT is further limited to a deterministic maximum of 30 days (referred to as the backstop CT [completion time]) from the time the TS [technical specification or operating license requirement] was first entered.”

The NRC explained why the 30-day maximum limit was necessary:

“The 30-day backstop CT assures that the TS equipment is not out of service for extended periods, and is a reasonable upper limit to permit repairs and restoration of equipment to an operable status.”

NEI 06-09 and the NRC’s safety evaluation applied to all components within a nuclear power reactor’s operating license. The 30-day backstop limit was the longest AOT (RICT) permitted. Shorter RICTs (AOTs) might apply for components with especially vital safety roles.

For example, the NRC established more limiting AOTs (RICTs) for the EDGs. In February 2002, the NRC issued Branch Technical Position 8-8, “Onsite (Emergency Diesel Generators) and Offsite Power Sources Allowed Outage Time Extensions.” This Branch Technical Position is part of the NRC’s Standard Review Plan for operating reactors. The Standard Review Plan helps plant owners meet NRC’s expectations and NRC reviewers and inspectors verify that expectations have been met. The Branch Technical Position is quite clear about the EDG allowable outage time limit:

“An EDG or offsite power AOT license amendment of more than 14 days should not be considered by the staff for review.” [underlining in original]

Exceptions and Precedent

Consistent with the “every rule has its exception” cliché, neither the 14-day EDG AOT in NRC Branch Technical Position 8-8 nor the 30-day backstop limit in the NRC’s safety evaluation for NEI 06-09 is considered a hard and fast limit. Owners can, and do, request NRC’s permission for longer times under special circumstances.

The owner of the DC Cook nuclear plant in Michigan asked the NRC on May 28, 2015, for permission to operate the Unit 1 reactor for up to 65 days with one of its two EDGs out of service. The operating license for Unit 1 already allowed one EDG to be out of service for up to 14 days. During testing of an EDG on May 21, 2015, inadequate lubrication caused one of the bearings to be severely damaged. Repairs were estimated to require 56 days.

The NRC emailed the owner questions about the 65-day EDG AOT on May 28 and May 29. Among the questions asked by the NRC was how Unit 1 would respond to a design basis loss of coolant accident (LOCA) concurrent with a loss of offsite power (LOOP) and a single failure of the only EDG in service. The EDGs are designed to automatically start from the standby mode and deliver electricity to safety components within seconds. This rapid response is needed to ensure the reactor core is cooled if a broken pipe (i.e., LOCA) drains cooling water while electrical power to the makeup pumps is unavailable (i.e., LOOP). The single failure provision is an inherent element of the redundancy and defense-in-depth approach to nuclear safety.

The NRC did not approve the request for a 65-day EDG AOT for Cook Unit 1.

The NRC did not deny the request either.

On June 1, 2015, the owner formally withdrew its request for the 65-day EDG AOT and shut down the Unit 1 reactor. The Unit 1 reactor was restarted on July 29, 2015.

More on the Back Story

About 18 months after one of two EDGs for the Unit 1 reactor at DC Cook was severely damaged during a test run, one of two EDGs for the Unit 3 reactor at Palo Verde was severely damaged during a test run.

About 18 months after DC Cook’s owner requested permission from the NRC to continue running Unit 1 for up to 65 days with only one EDG in service, Palo Verde’s owner requested permission to continue running Unit 3 for up to 62 days.

About 18 months after the NRC staff asked DC Cook’s owner how Unit 1 would respond to a loss of coolant accident concurrent with a loss of offsite power and failure of the remaining EDG, the NRC staff merely assumed that a loss of coolant accident would not happen during the 62 days that Palo Verde Unit 3 ran with only one EDG in service. Enter the back story as reported by the Arizona Republic.

On December 23, 2016, and January 9, 2017, Differing Professional Opinions (DPOs) were initiated by member(s) of the NRC staff registering formal disagreement with NRC senior management’s plan to allow the 62-day EDG AOT for Palo Verde Unit 3. The initiator(s) checked a box on the DPO form to have the DPO case file be made publicly available (Fig. 1).

Fig. 1 (Source: United States Postal Service)

The DPO initiator(s) allege that the 62-day EDG AOT was approved by the NRC because the agency assumed that a loss of coolant accident simply would not happen. The DPO stated:

“The NRC and licensee ignored the loss of coolant accident (LOCA) consequence element. Longer outage times increase the vulnerability to a design basis accident involving a LOCA with the loss of offsite power (LOOP) event with a failure of Train A equipment.”

Palo Verde has two fully redundant sets of safety equipment, Trains A and B. The broken EDG provided electrical power (when unbroken) to Train B equipment. The 62-day EDG AOT was approved based on workers scurrying about to manually start combustion gas turbines and portable generators to provide electrical power that would otherwise be supplied by EDG 3B. The DPO stated:

“The Train B EDG auto starts and loads all safety equipment in 40 seconds. The manual actions take at least 20 minutes, if not significantly longer.”

Again, the rapid response is required to mitigate a loss of coolant accident that drains water from the reactor vessel. When water does not drain away, it takes time for the reactor core’s decay heat to warm up and boil away the reactor vessel’s water, justifying a slower response time.

The NRC staff considered a loss of coolant accident for the broken EDG at Cook but allegedly dismissed it at Palo Verde. Curious.

The DPO also disparaged the non-routine measures undertaken by the NRC to hide their deliberations from the public:

“The pre-submittal call occurred on a “non-recorded” [telephone] line. The NRC staff debated the merits of the call in a headquarters staff only discussion. Note that the Notice of Enforcement Discretion calls are done on recorded [telephone] lines.”

President Richard Nixon’s downfall occurred when it became known that tape recordings of his impeachable offenses existed. The NRC avoided this trap by deliberately not following their routine practice of recording the telephone discussions. Peachy!

Cognitive Dissonance or Unnatural Selection?

The NRC’s approval of the 62-day EDG AOT for Palo Verde Unit 3 is perplexing, at best.

In the amendment it issued January 4, 2017, approving the extension, the NRC wrote:

“Offsite power sources and one train of onsite power source would continue to be available for the scenario of a loss-of-coolant accident” while EDG 3B was out of service.

In other words, the NRC assumed that loss of offsite power (LOOP) and loss of coolant accident (LOCA) are separate events. The NRC assumed that if a LOCA occurred, electrical power from the offsite grid would enable safety equipment to refill the reactor vessel and prevent meltdown. And the NRC assumed that if a LOOP occurred, a LOCA would not drain water from the reactor vessel, giving workers time to find, deploy, and start up the portable equipment and prevent core overheating.

But in the amendment it issued December 5, 2006, establishing the 10-day EDG AOT, the NRC wrote:

“During plant operation with both EDGs operable, if a LOOP occurs, the ESF [engineered safeguards] electrical loads are automatically and sequentially loaded to the EDGs in sufficient time to provide for safe reactor shutdown or to mitigate the consequences of a design-basis accident (DBA) such as a loss-of-coolant accident (LOCA).”

In those words, the NRC assumed that LOOP and LOCA could occur concurrently in design basis space.

More importantly, page B 3.8.1-2 of the bases document dated May 12, 2016, for the Palo Verde operating licenses is quite explicit about the LOOP/LOCA relationship:

“In the event of a loss of preferred power, the ESF electrical loads are automatically connected to the DGs in sufficient time to provide for safe reactor shutdown and to mitigate the consequences of a Design Basis Accident (DBA) such as a loss of coolant accident (LOCA).”

In those words, the operating licenses issued by the NRC assumed that LOOP and LOCA could occur concurrently in design basis space.

So, the NRC either experienced cognitive dissonance in having two opposing viewpoints on the same issue or made the unnatural selection of LOCA without LOOP.

Actions May Speak Louder Than Words, But Inaction Shouts Loudest

Check out this chronology:

  • December 15, 2016: EDG 3B for Palo Verde Unit 3 failed catastrophically during a test run
  • December 21, 2016: Owner requested 21-day EDG AOT
  • December 23, 2016: NRC approved 21-day EDG AOT
  • December 23, 2016: DPO submitted opposing 21-day EDG AOT
  • December 30, 2016: Owner requested 62-day EDG AOT
  • January 4, 2017: NRC approved 62-day EDG AOT
  • January 9, 2017: DPO submitted opposing 62-day EDG AOT
  • February 6, 2017: NRC special inspection team arrived at Palo Verde to examine the cause of the EDG failure
  • February 10, 2017: NRC special inspection team concluded its onsite examinations
  • April 10, 2017: NRC issued special inspection team report

The NRC jumped through hoops during the Christmas and New Year’s holidays to expeditiously approve a request to allow Unit 3 to continue generating revenue.

The NRC has not yet responded to two DPOs questioning the safety rationale behind the NRC’s approval.

If the NRC really and truly had a solid basis for letting Palo Verde Unit 3 run for so long with only one EDG, they have had plenty of time to address the issues raised in the DPOs. Way more than 62 days, in fact.

William Shakespeare wrote about something rotten in Denmark.

The bard never traveled to Rockville to visit the NRC’s headquarters. Had he done so, he might have discovered that rottenness is not confined to Denmark.

The EPA Budget Cuts Are A Direct Attack on our Air, Water, and Health

UCS Blog - The Equation (text only) -

Congress will have a chance this week to question why Scott Pruitt proposes to eviscerate the Environmental Protection Agency budget by 30%. Let that sink in—nearly a third of the agency’s activities could vanish.

That is unless leaders in Congress stop such a hemorrhage and restore safeguards put in place to protect the air that Americans breathe and the water we drink. What’s more, science should continue to play its rightful role in informing policy at the EPA.

Perhaps some are taking clean air and the associated thriving economy and public health benefits for granted.

I do not take this for granted today because I grew up leafing through a coffee-table book of photos of smog in my hometown of Pittsburgh, with hazy street lights turned on at noon. It wasn’t just hard to see during the day: the smog could become lethal. In 1948, residents lost their lives in a smog event in Donora, Pennsylvania, a town in the Monongahela River valley 24 miles southeast of Pittsburgh. The smog was not restricted to one region; it also occurred in Tulsa, OK, Los Angeles, and beyond.

A recent dramatization can be found in episode four of The Crown, which depicts the ill-fated decision to place fossil fuel burning power plants near an urban area; the tiny pollution particles they emitted were traced to an estimated 12,000 deaths in the 1952 London fog disaster. Such disasters led to the UK Clean Air Act of 1956 and, in America, to efforts to research and control the problem through a series of acts in 1955, 1963, 1967, and 1970, with amendments in 1977 and 1990.

To help ensure high quality research and enforcement of safeguards, the Environmental Protection Agency was formed in 1970. Among the projected benefits of the 1990 Clean Air Act amendments is that by 2020 we’ll have avoided around 230,000 early deaths from particulate pollution, 200,000 early deaths from heart disease, and 17 million lost work days.

Despite the immense progress for protecting Americans since the formation of the EPA, signs already point to troubled times ahead. As my colleague Gretchen Goldman noted, vast science supports a ground-level ozone standard below 70 ppb to protect public health. Yet Administrator Pruitt has delayed the ozone rule another year due to “insufficient information.”

Asking for more time to study a problem is one of the oldest delay tactics in the playbook.

The science has established the combustion of fossil fuels to be the root cause of toxic smog. It’s also the primary cause of climate change. Delay would be an abdication of the EPA Administrator’s legal obligation to reduce CO2 pollution after a decision by the Supreme Court of the United States.

Despite the mountain of scientific evidence, the EPA Administrator has called for a “true, legitimate, peer-reviewed, objective, transparent discussion about CO2,” indicating support for a “red team-blue team” approach for such a discussion. But there’s no need for further delay on the fundamental risks of atmospheric CO2. Since the late nineteenth century, science supports the basic fact that climate change is real, is primarily due to human activities, and that we have a choice about the future trajectory of global temperature and sea level as a result.

Leaders in the past ignored the science and ignored the risks, and lives were lost to pollution-induced toxic smog. The atmospheric carbon dioxide caused by burning fossil fuels also poses a risk to human life, especially if allowed to continue unabated.

It is time to move away from merely surviving—and move to thriving—in America. The EPA can continue to play a key role. That is unless the budget is slashed, deep experience and practical know-how walk out the door, and historical inequities become exacerbated.

We have already lost great leaders at the EPA who could no longer make progress with eviscerated budgets in their programs.

This is not just another routine set of budget hearings. Lives may depend on leaders in Congress asking Administrator Pruitt the hard questions based on solid science and latest economic calculations, with the true costs of burning fossil fuels factored in.

Members of Congress should demand that Administrator Pruitt has the budget necessary to uphold and implement the core mission of the EPA to protect the health and well-being of all Americans.

Oyster Creek Reactor: Bad Nuclear Vibrations

UCS Blog - All Things Nuclear (text only) -

The Oyster Creek Nuclear Generating Station near Forked River, New Jersey, is the oldest nuclear power plant operating in the United States. It began operating in 1969, around the time Neil Armstrong and Buzz Aldrin were hiking the lunar landscape.

Oyster Creek has a boiling water reactor (BWR) with a Mark I containment design, similar to the Unit 1 reactor at Fukushima Daiichi. Water entering the reactor vessel is heated to the boiling point by the energy released by the nuclear chain reaction within the core (see Figure 1). The steam flows through pipes from the reactor vessel to the turbines. The steam spins the turbines connected to the generator that produces electricity distributed by the offsite power grid. Steam discharged from the turbines flows into the condenser, where it is cooled by water drawn from Barnegat Bay, just off the Atlantic Ocean. The steam vapor is converted back into liquid form. Condensate and feedwater pumps supply the water collected in the condenser to the reactor vessel to repeat the cycle.

Fig. 1 (Source: Tennessee Valley Authority)

The turbine is actually a set of four turbines—one high pressure turbine (HPT) and three low pressure turbines (LPTs). The steam passes through the high pressure turbine and then enters the moisture separators. The moisture separators remove any water droplets that may have formed during the steam’s passage through the high pressure turbine. The steam leaving the moisture separators then flows in parallel through the three low pressure turbines.

The control system for the turbine uses the speed of the turbine shaft (normally 1,800 revolutions per minute) and the pressure of the steam entering the turbine (typically around 940 pounds per square inch) to regulate the position of control valves (CVs) in the steam pipes to the high pressure turbine. If the turbine speed drops or the inlet pressure rises, the control system opens the control valves a bit to bring these parameters back to their desired values. Conversely, if the turbine speed increases or the inlet pressure drops, the control system signals the control valves to close a tad to restore the proper conditions. It has been said that the turbine is slave to the reactor—if the reactor power level increases or decreases, the turbine control system automatically repositions the control valves to correspond to the changed steam flow rate.

The inlet pressure is monitored by Pressure Transmitters (PT) that send signals to the Electro-Hydraulic Control (EHC) system. The EHC system derives its name from the fact that it uses electrical inputs (e.g., inlet pressure, turbine speed, desired speed, desired inlet pressure, etc.) to regulate the oil pressure in the hydraulic system that positions the valves.

Fig. 2 (Source: Nuclear Regulatory Commission)
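A toy sketch of that feedback logic (not the actual EHC algorithm; the gains are invented for illustration):

    # Toy proportional controller: open the control valves a bit when turbine speed
    # drops or inlet pressure rises, and close them a bit in the opposite cases.
    SPEED_SETPOINT_RPM = 1800.0
    PRESSURE_SETPOINT_PSI = 940.0

    def valve_adjustment(speed_rpm, inlet_pressure_psi, k_speed=0.001, k_pressure=0.002):
        """Return a small change in valve position (positive = open further)."""
        speed_error = SPEED_SETPOINT_RPM - speed_rpm                   # positive when speed sags
        pressure_error = inlet_pressure_psi - PRESSURE_SETPOINT_PSI    # positive when pressure rises
        return k_speed * speed_error + k_pressure * pressure_error

    # Speed slightly low and pressure slightly high: nudge the valves open.
    print(valve_adjustment(speed_rpm=1795.0, inlet_pressure_psi=945.0))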

Bad Vibrations

In the early morning hours of November 20, 2016, the operators at Oyster Creek were conducting the quarterly test of the turbine control system. With the reactor at 95 percent power, the operator depressed a test pushbutton at 3:26 am per the procedure. The plant’s response was unexpected. The control valves and bypass valves began opening and closing by small amounts, causing the reactor pressure to fluctuate. Workers in the turbine building notified the control room operators that the linkages to the valves were vibrating. The operators began reducing the reactor power level in an attempt to stop the vibrations and pressure fluctuations.

The reactor automatically shut down at 3:42 am from 92 percent power on high neutron flux in the reactor. Workers later found the linkage for control valve #2 had broken due to the vibrations and the linkage for control valve #4 had vibrated loose. The linkages are “mechanical arms” that enable the turbine control system to reposition the valves. The broken and loosened linkages impaired the ability of the control system to properly reposition the valves.

These mechanical malfunctions prevented the EHC system from properly controlling reactor pressure during the test and subsequent power reduction. The pressure inside the reactor vessel increased. In a BWR, reactor pressure increases collapse and shrink steam bubbles. Displacing steam void spaces with water increases the reactor power level. When atoms split to release energy, they also release neutrons. The neutrons can interact with other atoms, causing them to split. Water is much better than steam bubbles at slowing down the neutrons to the range where the neutrons best interact with atoms. Put another way, the steam bubbles permit high energy neutrons to speed away from the fuel and get captured by non-fuel parts within the reactor vessel, while the water better confines the neutrons to the fuel region.

The EHC system’s problem allowed the pressure inside the reactor vessel to increase. The higher pressure collapsed steam bubbles, increasing the reactor power level. As the reactor power level increased, more neutrons scurried about as more and more atoms split. The neutron monitoring system detected the increasing inventory of neutrons and initiated the automatic shut down of the reactor to avoid excessive power and fuel damage.
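The feedback can be written down very simply. The coefficient below is a made-up placeholder, not Oyster Creek’s actual value, but it captures the sign of the effect:

    # Illustrative void feedback in a BWR: collapsing steam voids adds reactivity.
    VOID_COEFFICIENT = -1.0e-3   # hypothetical reactivity change per percent void

    def reactivity_change(delta_void_percent):
        """A pressure rise makes delta_void_percent negative (voids collapse),
        so the product with the negative coefficient is positive reactivity."""
        return VOID_COEFFICIENT * delta_void_percent

    print(reactivity_change(delta_void_percent=-5.0))   # +0.005: power rises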

Workers attributed the vibrations to a design flaw. A component in the EHC system is specifically designed to dampen vibrations in the tubing providing hydraulic fluid to the linkages governing valve positions. But under certain conditions, depressing the test pushbutton creates a pressure pulse on that component. Instead of dampening the pressure pulses, the component reacts in a way that causes the hydraulic system pressure to oscillate, creating the vibrations that damaged the linkages.

The component and damaged linkages were replaced. In addition, the test procedure was revised to avoid performing that specific portion of the test when the reactor is operating. In the future, that part of the turbine valve test will be performed during an outage.

Vibrations Re-Visited

It was not the first time that Oyster Creek was shut down due to problems performing this test. It wasn’t even the first time this decade.

On December 14, 2013, operators conducted the quarterly test of the turbine control system at 95 percent power. They encountered unanticipated valve responses and reactor pressure changes during the test. The operators manually shut down the reactor as reactor pressure rose towards the automatic shut down setpoint.

Improper assembly of components in the EHC system, and vibrations that caused them to come apart, resulted in control valves #2 and #3 closing. Their closure increased the pressure within the reactor vessel, leading the operators to manually shut down the reactor before it automatically scrammed.

The faulty parts were replaced.

Bad Vibrations at a Good Time

If every test were always successful, there would be little value derived from the testing program.

Similarly, if tests were seldom successful, there would be little value from the testing program.

Tests that occasionally are unsuccessful have value.

First, they reveal things that need to be fixed.

Second, they provide insights on the reliability of the items being tested. (I suppose tests that always fail also yield insights about reliability, so I should qualify this statement to say they provide useful and meaningful insights about reliability.)

Third, they occur during a test rather than when needed to prevent or mitigate an accident. Accidents may reveal more insights than those revealed by test failures. But the cost per insight is a better deal with test failures.

Five Questions Scott Pruitt Should Answer at This Week’s EPA Budget Hearing

UCS Blog - The Equation (text only) -

On Thursday, EPA Administrator Scott Pruitt will appear before the US House of Representatives Appropriations Subcommittee on Interior, Environment, and Related Agencies to discuss the Trump administration’s budget proposal for his agency.

This is the first time Mr. Pruitt will appear on Capitol Hill since his confirmation hearing, and the first time members of Congress will be able to publicly question him about the proposal to slash the EPA budget by 30%. THIRTY PERCENT! The proposal also calls for EPA staffing cuts of the same magnitude.

Make no mistake. These proposed cuts are devastating. They would erode our environmental health infrastructure, gravely diminish our nation’s scientific expertise, threaten our ability to prepare for and deal with emerging threats, lock us into the status quo, and make it virtually impossible for the EPA to do its job.

What’s the job?

Pure and simple—it’s safeguarding public health. Because polluted air, water, food, land, homes, neighborhoods, and workplaces make people sick.

The EPA performs core public health functions that help safeguard our health and safety, like assessing and analyzing health needs and risks; assuring action to manage and communicate those risks; and developing, advocating, and implementing policies and programs that address them.

Indeed, this mission is still reflected on the EPA website, which says that “EPA’s purpose is to ensure that:

  • all Americans are protected from significant risks to human health and the environment where they live, learn and work;
  • national efforts to reduce environmental risk are based on the best available scientific information….”

EPA science

Science is the bedrock, the backbone, that makes these safeguards possible. The website still describes the agency as “…one of the world’s leading environmental and human health research organizations. Science provides the foundation for Agency policies, actions, and decisions made on behalf of the American people. Our research incorporates science and engineering that meets the highest standards for integrity, peer review, transparency, and ethics.”

That’s why a focus on the proposed cuts to EPA’s science and technology budget is key. The proposed cuts are crippling. (Short-sighted and stupid are also apt descriptors.)

The administration (and Mr. Pruitt) proposes to reduce the EPA’s science and technology budget from $733 million to $450 million, along with deep personnel cuts in the agency’s science programs. That includes the science supporting clean air programs (reduced by $30 million), Homeland Security support (reduced by $14 million), and research on chemical safety (reduced by $42 million) and healthy communities (reduced by $85 million), to pick a few. The proposal also cuts the budget of the EPA’s Office of Research and Development (ORD) in half; ORD conducts the bulk of the research that underpins all EPA policies.

Slashing the science at EPA is like robbing a clinician of a critical diagnostic tool. The American public expects the EPA not only to assess, understand, communicate, and manage existing problems and risks, but also to be there and ready when things go bang in the night and new threats emerge. The agency can’t do that without robust science and research, and without scientists who work on the front lines and are valued for the work they do.

Five questions for Thursday’s budget hearing

So far, I have heard Mr. Pruitt talk a lot about the “impact on regulated industry.” He has said that the purpose of regulations is “to make things regular” for the regulated community. I’m waiting to hear him talk about the agency’s public health and science-based mission. Thursday’s budget hearing is an opportunity to explore these issues.

So here are the top five questions I think we deserve answers to and that I hope the budget subcommittee will ask:

  1. Do you see the EPA as a public health agency, and do you agree that agency science is a critical component of good decisions? If so, how and why can you justify reducing spending on science and technology by 30% across the board?
  2. The National Laboratories across the country under the EPA Office of Research and Development are critical world-class resources not only for the agency, but also for states, tribes, municipalities, and other federal agencies including the Department of Homeland Security. They have been deeply involved in solving problems ranging from the anthrax attack on Congress, to the air quality issues following the 9/11 attacks, to responding to water crises in Flint and elsewhere. Have you ever visited the EPA laboratories and talked to their scientists? How and why do you think that these key public health science resources can continue to function with the large cuts you are proposing? Or do you think these labs are no longer needed?
  3. All of the EPA’s statutory mandates require that the agency rely on solid science to protect public health and the environment. How do you propose meeting those mandates if you drastically reduce the science capability of the agency? With more than a third fewer scientists, and a third less funding, as well as less funding for grants to the states, tribes, universities, municipalities, and regional collaborations, how can you assure the public that EPA actions and decisions will be based on science and not on politics?
  4. On the one hand you have repeatedly said that states should take a greater role in environmental protection. On the other hand you have slashed support for the states for everything from grants to regional partnerships. At the same time, you are dramatically reducing the resources in dollars and people in the science programs at EPA. Those science programs directly support collaborative state and federal action.  How do you expect the states to take on this supposed greater role without the science resources to do so?
  5. Your budget proposal, with its huge reductions in support for science, means that the EPA’s ability to respond to new and emerging issues and threats, from toxic contaminants to homeland security issues, will essentially be lost. As one senior scientist put it, this budget at best “locks us into where we are now in environmental and public health protections as if no new challenges will confront us.” Are you suggesting there are no new challenges to public health and the environment that the EPA should anticipate and be prepared to address? Are you suggesting that the current state of knowledge about existing problems—like water and air pollution—is sufficient? Can you tell us how reducing science capacity at the EPA will help us answer these questions and address these needs?

Perhaps the overarching and most important question of all is whether, with this budget proposal, Mr. Pruitt can assure the American people that we will have the science to keep our drinking water safe, our air clean, our food free from harmful contaminants, and our neighborhoods free from toxic chemicals; whether he can assure us that the EPA will be ready and able to respond to acute emergencies and threats that may be on the horizon; and whether the budget will support and advance the science and technology needed to bring us into a complex and challenging future.

I have my doubts, though I expect he will try to offer this assurance. It will be up to all of us—we, the people—to hold him accountable.  Let’s get him on record on Thursday. Get your own questions out on social media. Here is the info about the hearing.

And be hopeful—we are a long way from this draconian EPA budget becoming reality. There will be many ways, means, and opportunities to engage and push back in the weeks and months ahead. We’ll keep you informed.

Protect the Science, Protect the Species

UCS Blog - The Equation (text only) -

As we face irreversible destruction of species and their habitats due to threats from habitat loss and fragmentation, overharvesting, pollution, climate change, and invasive species, lawmakers indicate they intend to attack the Endangered Species Act again. Under the current administration, we’ve already witnessed the introduction of several pieces of legislation intended to weaken the Endangered Species Act or specific species protections. Most recently, Senator Barrasso (R-WY), chair of the Senate Committee on Environment and Public Works, announced interest in introducing legislation sometime this summer to overhaul the Act (here and here), despite the ESA’s history of overwhelming support from voters. These potential modifications would mean shifting the authority to implement the Endangered Species Act from scientists and wildlife managers to politicians.

Science is a constitutive element of the Endangered Species Act, the emergency care program for wildlife. It is the foundation for listing and delisting threatened and endangered species, developing recovery plans for the continued survival of listed species, and taking preventative conservation efforts. This is both a boon and a curse. Since the Endangered Species Act relies on the best available science to make conservation decisions, it is highly successful—over 99% of the species protected under the Act have dodged extinction—yet this reliance on science also makes the law highly susceptible to outside interference from political interests.

Here I am on a nesting beach in Barbuda, monitoring critically endangered hawksbill sea turtles (see above), one of over a thousand species currently listed under the ESA.

The Endangered Species Act has withstood a barrage of politically motivated attacks over the years, from hidden policy riders to blatant editing of scientific content in federal documents. The notoriety of the sage grouse, for example, stems more from its status as one of the most politically contentious species considered for protection under the ESA than from its ostentatious courting rituals. The sage grouse issue illustrates what can happen when decisions to protect a species prioritize politics.

The implications of attacks on the science-based Endangered Species Act reflect broader attacks on science in general. Science should have priority influence on our policy decisions; otherwise regulated industries and politics will decide critical aspects of our everyday lives—like the safety and quality of our food, air, and water, and whether or not our nation’s biodiversity is protected. As scientists, we must continue to advance the role of science in public policy as a whole, and ensure that public health, worker safety, and environmental protections rely on the best independent scientific and technical information available.

My generation has been accused of ruining everything from napkins to handshakes. But we should recognize that we have a responsibility to protect imperiled species from permanent extinction so that future generations can experience animals like the bald eagle in the wild. Ensuring that this responsibility is informed by the best available science provided by biologists and other conservation experts is critical. That’s why as a scientific community, we need to make certain the decisions to protect wildlife at risk of extinction are grounded in science. Scientists, not Congress, should be informing decisions about which species deserve protection under the Endangered Species Act. We don’t need to “fix” something that already works. Please join me in urging Congress not to support any legislation to rewrite or modify the Endangered Species Act—our most successful conservation law.

PS If you need additional motivation to sign the letter, just look at this pair of gray wolf pups! Why would someone be against protecting endangered species?

Drowning in a Sea of Sufficient Ozone Research: An Open Letter to EPA Administrator Scott Pruitt

UCS Blog - The Equation (text only) -

Dear Administrator Pruitt,

When you decided this week to delay the 2015 ozone rule by one year, citing “insufficient information,” did you think about the science of ground-level ozone? Did you look at the data showing that ozone pollution is widespread across the country? And importantly, did you look at the detrimental health impacts that ozone pollution has for Americans?

As the law requires, the ozone standard must provide an adequate margin of safety for the most vulnerable populations—including the elderly, children, and those with lung diseases. I know that you are familiar with—actually hostile to—the ozone rule and its basis in the Clean Air Act. In fact, you’ve spent years fighting the (strong) legal and scientific basis for ozone protections and other environmental safeguards.

I’m sure you remember suing the EPA—alongside fossil fuel industry co-parties who gave to your political action committees—over the ozone rule, a challenge you have said was “based, in part, on concerns that EPA has not adequately assessed the available science.” You might even remember vowing, back when the rule was recently proposed, to “challenge the EPA’s misguided and unlawful overreach” in part because the “EPA has not yet articulated how the rule will further improve public health.”

So Administrator Pruitt, I have to ask: what are your definitions of “insufficient” and “adequate” and “articulation”? Because I have looked at the science, and I can tell you that we have more than sufficient information to act on it, contrary to what you claim.

As an air quality scientist, I’ve studied the data on ozone and health. I submitted my own opinion on the ozone rule during its official comment period. I can assure you we are standing on solid ground when it comes to the ozone rule. For your quick reference, here’s a rundown of just how incredibly sufficient the science is on the public health threat of ground-level ozone pollution.

1,251 pages of scientific assessment

As part of the update to the ozone standard, EPA conducts the Integrated Science Assessment (ISA). The 1,251-page document is produced by EPA scientists and surveys the current scientific literature on ozone (including one of my own papers). The peer-reviewed document finds several “causal” and “likely causal” relationships between ozone pollution and health effects. Of note, the report identifies “a very large amount of evidence spanning several decades [that] supports a relationship between exposure to O3 and a broad range of respiratory effects.” In addition, the report finds associations between ozone and short-term cardiovascular effects and total mortality, along with long-term respiratory effects.

Science advisers agree

As I’ve written before, the Clean Air Scientific Advisory Committee (CASAC), or the group of external independent subject-matter experts that EPA uses to provide scientific recommendations for the standard, came to the conclusion that the standard should be tightened. In its letter to the EPA administrator, the science advisers recommended a range of 60-70 ppb for the standard. In addition, the committee concluded that although 70 ppb was included in its recommended range, such a standard would not provide an “adequate margin of safety,” as the Clean Air Act mandates. The committee went on to note that with a 70-ppb standard there is “substantial scientific evidence of adverse effects … including decrease in lung function, increase in respiratory symptoms, and increase in airway inflammation.”

More scientists agree

The Ozone Review Panel was an additional set of external independent experts that worked with CASAC to discuss the state of the science and review the ISA. These experts were brought in to provide additional expertise specific to ozone. This panel largely concurred with lowering the standard to something in the 60 to 70 ppb range as well, noting that a standard below 70 ppb would be more protective of public health.

None of this is new

It’s worth reiterating that the above voices recommending a lower standard are joining those from many years previous. In fact, CASAC first proposed that the ozone standard be in the 60 to 70 ppb range back in 2007. States have known—and have been preparing for a tighter ozone standard—for a very long time. Despite your suggestion otherwise, states have had ample opportunity to prepare for this standard that was finalized nearly two years ago.

The bottom line is that the law requires setting the ozone standard based on science and science alone. The administration must set a standard that is protective of public health with an adequate margin of safety and cannot legally consider economic arguments.

Do you feel up to this task, Administrator Pruitt? Are you able to do your job of protecting the public from ozone threats? And importantly, can you carry out the mission of the EPA of protecting public health and the environment? My colleague Andrew Rosenberg raised concerns before you were even appointed, and this decision (among others) proves those concerns were well placed.

If you’d like more information, you can read more of my posts on the EPA update to the ozone standard here, here, and here. And I know many air quality scientists who would be happy to tell you more about what they know. I assure you, Administrator Pruitt, we are drowning in a sea of sufficient science on ozone, if only you’ll listen to the scientists.

Sincerely,

Gretchen Goldman

 

Wind Keeps Creating Jobs, Even as We Pull Out of Paris

UCS Blog - The Equation (text only) -

President Trump announced last week that he was pulling the United States out of the Paris Climate Agreement because, he said, it would impose “draconian financial and economic burdens” on the US. This classic fossil fuel industry rhetoric of pitting the economy against the environment (in this case the climate and future of our planet) has been proven time and time again to be a false choice. The latest, impressive US wind industry results show that more clearly than ever.

Numerous cost-effective climate solutions are available that can create jobs and reduce emissions at the same time to help meet the Paris Agreement. In fact, solutions like improving the energy efficiency of our homes, offices, factories and cars, and investing in solar and wind power can take us most of the way there and actually save consumers money.

When you include the public health and environmental benefits of clean energy, the savings and economic benefits are even larger.

Wind power is working for America

For wind power in particular, recent data from the American Wind Energy Association’s (AWEA) 2016 Annual Market Report show how wind is creating high-quality jobs and delivering important economic benefits to rural areas, while reducing emissions at the same time.

US wind capacity has more than doubled since 2010, accounting for nearly one-third of all new electric generating capacity since 2007. Wind power surpassed hydropower in 2016 to become the number one source of renewable electric generating capacity in the country. The wind industry installed more than 8,200 megawatts (MW) of new capacity in 2016, bringing the total US installed capacity to 82,000 MW. Wind supplied 5.5 percent of total US electricity generation in 2016, the equivalent of meeting the entire electricity needs of 24 million average American homes.
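
For readers curious how a “homes served” equivalence like this is typically derived, here is a minimal back-of-envelope sketch in Python. The installed capacity comes from the paragraph above; the fleet-average capacity factor and the average household consumption are assumptions plugged in for illustration (they are not figures from the AWEA report), so the result only roughly reproduces the ~24 million number.

```python
# Back-of-envelope check of the "24 million average homes" equivalence.
# Capacity is from the AWEA figures quoted above; the capacity factor and
# average household consumption are illustrative assumptions, not report values.

installed_capacity_mw = 82_000        # total US wind capacity at end of 2016 (from the report)
assumed_capacity_factor = 0.35        # assumed fleet-average capacity factor
hours_per_year = 8_760

annual_generation_mwh = installed_capacity_mw * assumed_capacity_factor * hours_per_year

assumed_household_kwh_per_year = 10_800   # assumed average US household electricity use
homes_served = annual_generation_mwh * 1_000 / assumed_household_kwh_per_year

print(f"Estimated annual wind generation: {annual_generation_mwh / 1e6:.0f} TWh")
print(f"Equivalent average homes served: {homes_served / 1e6:.1f} million")
```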

Wind industry jobs are growing fast. The US wind industry added nearly 15,000 new jobs in 2016, reaching a total of 102,500 full-time equivalent jobs in all 50 states, up from 50,500 jobs in 2013. Wind turbine technician is the fastest-growing job in the US, according to the Bureau of Labor Statistics. Texas, the national leader in installed wind capacity, also has the most wind-related jobs with more than 22,000, followed by Iowa, Oklahoma, Colorado, and Kansas, each having 5,000 to 9,000 wind jobs (see map).

Source: AWEA annual market report, year-ending 2016.

Domestic wind manufacturing is expanding. Wind power supports 25,000 US manufacturing jobs at more than 500 facilities located in 43 states. US wind manufacturing increased 17 percent in 2016, with 3 new factories opening and 5 existing factories expanding production. Ohio is the leading state for wind manufacturing with more than 60 facilities, followed by Texas (40), Illinois (35), North Carolina (27), Michigan, Pennsylvania and Wisconsin (26 each).

While manufacturing jobs are concentrated in the Rust Belt, Colorado, Iowa, and California are also national leaders in manufacturing major wind turbine components, and the Southeast is a major wind manufacturing hub with more than 100 factories. US facilities produced 50-85 percent of the major wind turbine components installed in the United States in 2015, up from 20 percent in 2007, according to Lawrence Berkeley National Lab (LBNL).

Investing in rural communities. The wind industry invested $14.1 billion in the US economy in 2016, and $143 billion over the past decade, with most of this flowing to rural areas where the wind projects are located. Wind energy also provided an estimated $245 million annually in lease payments to farmers, ranchers and other landowners in 2016, with more than $175 million occurring in low-income counties. AWEA estimates that 71 percent of all wind projects installed through 2016 are located in low-income rural counties.

And now for the kicker…

Wind power is providing major economic benefits to President Trump’s base. AWEA estimates that 88 percent of the wind power added in 2016 was built in states that voted for President Trump. In addition, 86 percent of total installed wind capacity in the US and 60 percent of wind-related manufacturing facilities are located in Republican districts.

Source: AWEA annual market report, year-ending 2016.

Wind power is affordable for consumers. The cost of wind power has fallen 66 percent since 2009, making renewable energy more affordable to utilities and consumers. A 2016 NREL and LBNL analysis quantifying the benefits of increasing renewable energy use to meet existing state renewable standards found that the health and environmental benefits from reducing carbon emissions and other air pollutants were about three times higher than the cost of the production tax credit (PTC).

Wind power is reducing emissions. AWEA estimates that existing wind projects avoided nearly 159 million metric tons of carbon dioxide (CO2) emissions in 2016, equivalent to 9 percent of total power sector emissions, as well as 393 million pounds of SO2 and 243 million pounds of NOx emissions.

More wind development, jobs, and emission reductions are on the way

And there’s lots more to come. Wind development will continue over the next few years due to the recent 5-year extension of the federal tax credits, state renewable electricity standards, and continued cost reductions. Studies by NREL, EIA, and UCS project that the tax credit extensions will drive 29,000 to 59,000 MW of additional wind capacity in the US by 2020.

Similarly, a study by Navigant Consulting projected 35,000 MW of new wind capacity will be installed in the US between 2017 and 2020, increasing total wind-related jobs to 248,000 by 2020 and injecting $85 billion into the US economy. They also found that each wind turbine creates 44 years of full-time employment over its lifetime.

When combined with additional deployment of solar, NREL found that the federal tax credit extension would result in a cumulative net reduction of 540 to 1,420 million metric tons (MMT) of CO2 emissions between 2016 and 2030, depending on projected natural gas prices.

Studies by EPA and UCS also show that the Clean Power Plan (CPP)—a key policy for achieving the US Paris commitments–would continue to drive wind and solar development and emission reductions through 2030, with the public health and environmental benefits greatly exceeding the costs.

Backing away from Paris and the CPP could actually hurt the US economy

All these amazing facts show that President Trump is wrong to ignore the economic benefits of wind and other clean energy options for the US, and that’s a real shame.

Market forces and continued cost reductions will drive more clean energy development in the US in the near-term. However, countries like China and India are also making significant investments in renewable energy as a key strategy for reducing emissions under the Paris Agreement.

For America to maintain its leadership position in the global clean energy race, we need strong long-term climate and clean energy policies like the Paris Agreement and the Clean Power Plan. Our country will be stronger for it, not weaker.

Increase in Cancer Risk for Japanese Workers Accidentally Exposed to Plutonium

UCS Blog - All Things Nuclear (text only) -

According to news reports, five workers were accidentally exposed to high levels of radiation at the Oarai nuclear research and development center in Ibaraki Prefecture, Japan on June 6th. The Japan Atomic Energy Agency, the operator of the facility, reported that five workers inhaled plutonium and americium that was released from a storage container that the workers had opened. The radioactive materials were contained in two plastic bags, but they had apparently ripped.

We wish to express our sympathy for the victims of this accident.

This incident is a reminder of the extremely hazardous nature of these materials, especially when they are inhaled, and illustrates why they require such stringent procedures when they are stored and processed.

According to the earliest reports, it was estimated that one worker had inhaled 22,000 becquerels (Bq) of plutonium-239, and 220 Bq of americium-241. (One becquerel corresponds to one radioactive decay per second.) The others inhaled between 2,200 and 14,000 Bq of plutonium-239 and quantities of americium-241 similar to that of the first worker.

More recent reports have stated that the amount of plutonium inhaled by the most highly exposed worker is now estimated to be 360,000 Bq, and that the 22,000 Bq measurement in the lungs was made 10 hours after the event occurred. Apparently, the plutonium that remains in the body decreases rapidly during the first hours after exposure, as a fraction of the quantity initially inhaled is expelled through respiration. But there are large uncertainties.

The mass equivalent of 360,000 Bq of Pu-239 is about 150 micrograms. It is commonly heard that plutonium is so radiotoxic that inhaling only one microgram will cause cancer with essentially one hundred percent certainty. This is not far off the mark for certain isotopes of plutonium, like Pu-238, but Pu-239 decays more slowly, so it is less toxic per gram.  The actual level of harm also depends on a number of other factors. Estimating the health impacts of these exposures in the absence of more information is tricky, because those impacts depend on the exact composition of the radioactive materials, their chemical forms, and the sizes of the particles that were inhaled. Smaller particles become more deeply lodged in the lungs and are harder to clear by coughing. And more soluble compounds will dissolve more readily in the bloodstream and be transported from the lungs to other organs, resulting in exposure of more of the body to radiation. However, it is possible to make a rough estimate.

Using Department of Energy data, the inhalation of 360,000 Bq of Pu-239 would result in a whole-body radiation dose to an average adult over a 50-year period between 580 rem and nearly 4300 rem, depending on the solubility of the compounds inhaled. The material was most likely an oxide, which is relatively insoluble, corresponding to the lower bound of the estimate. But without further information on the material form, the best estimate would be around 1800 rem.
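
Here is a rough sketch, in Python, of the arithmetic behind these estimates. The 360,000 Bq intake is from the reports discussed above, and the Pu-239 half-life and Avogadro’s number are standard constants; the two dose coefficients, however, are simply backed out of the 580 rem and roughly 4,300 rem bounds quoted in this paragraph rather than copied from a specific DOE table, so treat this as an illustration of the method, not as dosimetry guidance.

```python
import math

# Rough reconstruction of the estimates above. The dose coefficients are the
# values implied by the 580-4300 rem range quoted in the text (1 Sv = 100 rem),
# not numbers taken from an authoritative table.

activity_bq = 360_000                  # estimated Pu-239 intake of the most exposed worker

# Mass equivalent via specific activity = ln(2) * N_A / (half-life * molar mass)
half_life_s = 24_110 * 365.25 * 24 * 3_600   # Pu-239 half-life (~24,110 years) in seconds
avogadro = 6.022e23                          # atoms per mole
molar_mass_g_per_mol = 239.0
specific_activity_bq_per_g = math.log(2) * avogadro / (half_life_s * molar_mass_g_per_mol)

mass_micrograms = activity_bq / specific_activity_bq_per_g * 1e6
print(f"Specific activity of Pu-239: {specific_activity_bq_per_g:.2e} Bq/g")
print(f"Mass inhaled: roughly {mass_micrograms:.0f} micrograms")   # on the order of 150 micrograms

# 50-year committed dose: dose (Sv) = activity (Bq) x dose coefficient (Sv/Bq)
implied_dose_coefficients_sv_per_bq = {
    "relatively insoluble form (lower bound)": 1.6e-5,
    "more soluble form (upper bound)": 1.2e-4,
}
for form, coefficient in implied_dose_coefficients_sv_per_bq.items():
    dose_rem = activity_bq * coefficient * 100.0   # convert Sv to rem
    print(f"{form}: about {dose_rem:,.0f} rem over 50 years")
```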

What is the health impact of such a dose? For isotopes such as plutonium-239 or americium-241, which emit relatively large, heavy charged particles known as alpha particles, there is a high likelihood that a dose of around 1000 rem will cause a fatal cancer. This is well below the radiation dose that the most highly exposed worker will receive over a 50-year period. This shows how costly a mistake can be when working with plutonium.

The workers are receiving chelation therapy to try to remove some plutonium from their bloodstream. However, the effectiveness of this therapy is limited at best, especially for insoluble forms, like oxides, that tend to be retained in the lungs.

The workers were exposed when they opened up an old storage can that held materials related to production of fuel for fast reactors. The plutonium facilities at Tokai-mura have been used to produce plutonium-uranium mixed-oxide (MOX) fuel for experimental test reactors, including the Joyo fast reactor, as well as the now-shutdown Monju fast reactor. Americium-241 was present as the result of the decay of the isotope plutonium-241.

I had the opportunity to tour some of these facilities about twenty years ago. MOX fuel fabrication at these facilities was primarily done in gloveboxes through manual means, and we were able to stand next to gloveboxes containing MOX pellets. The gloveboxes represented the only barrier between us and the plutonium they contained. In light of the incident this week, that is a sobering memory.
