Combined UCS Blogs

Role of Regulation in Nuclear Plant Safety: A New Series of Posts

UCS Blog - All Things Nuclear

President Trump seeks to lessen the economic burden from excessive regulation. The Nuclear Regulatory Commission (NRC) initiated Project AIM before the 2016 elections seeking to right-size the agency and position it to become more adaptive to shifting needs in the future. And the nuclear industry launched its Delivering the Nuclear Promise campaign seeking productivity and efficiency gains to enable nuclear power to compete better against natural gas and other sources of electricity.

In light of these concurrent efforts, we will be reviewing momentous events in nuclear history and posting a series of commentaries on the role of regulation in nuclear plant safety. The objective is to understand under-regulation and over-regulation well enough to define “Goldilocks” regulation—regulation that is neither too lax nor too onerous, but just right. That understanding will enable us to engage the NRC more effectively as the agency pursues Project AIM and the industry tries to deliver on its promise.

Searching for Goldilocks

We will be reviewing “momentous events” with the expectation of examining times when regulation played too small a role as well as times when it played too large a role. If we are lucky, we will examine events from all three bins—regulation too lax, regulation just right, and regulation overly stringent. Lessons from all three bins will yield the best understanding of which traps to avoid and which practices to emulate so that the “just right” bin becomes more and more popular in the future.

We have a working list of events that will hopefully populate all three bins. While we will not draft the commentaries or bin an event until after reviewing the relevant records, the events likely to fall into the “too lax” bin include the 1979 accident at Three Mile Island, the mid-1990s problems at Millstone, Salem, and Cooper, and the 2011 accident at Fukushima.

Events likely to fall into the “undue burden” bin include the August 1991 Site Area Emergency declared at Nine Mile Point following a transformer failure, the 1998 Towers Perrin report, and the semi-annual reports by the NRC’s Office of the Inspector General.

And events likely to fall into the “just right” bin include the March 1990 station blackout at Vogtle, the September 1997 discovery of and recovery from containment problems at DC Cook, and the flood protection deficiencies identified at Fort Calhoun in 2010, whose remedies sure came in handy during the flood the plant experienced in June 2011.

While we may have reported on or blogged about some of these events already, the perspective is slightly different now. Before, we may have explained how event A resulted in regulatory requirements x, y, and z. Now, we will strive to determine whether there was sufficient awareness prior to the event for these requirements to already have been put in place (i.e., lax regulation), a knee-jerk reaction imposing more regulatory requirements than necessary (i.e., over-regulation), or a prudent reaction to a reasonably unavoidable event (i.e., just right regulation).

The list of potential events for this series contains nearly four dozen candidates, and other candidates may emerge during the reviews. We do not plan to keep posting commentaries until every candidate is crossed off the list; instead, we will continue the series until all three bins are populated with enough events to shed meaningful insight on the proper role of regulation in nuclear plant safety. Upon reaching that point, we will discontinue the series and share the findings and observations from our reviews in a post and/or report.

Trump Wants a New Low-Yield Nuclear Weapon. But the US Has Plenty Already.

UCS Blog - All Things Nuclear

The Trump administration’s Nuclear Posture Review (NPR), released in February of this year, calls attention to the composition of the US nuclear arsenal and its adequacy as a deterrent. The NPR calls for a new lower-yield submarine-launched nuclear warhead, arguing that it is needed to “counter any mistaken perception of an exploitable ‘gap’ in U.S. regional deterrence capabilities.” We decided to put together the chart in Fig. 1 to illustrate the range of nuclear weapons already available in the US arsenal.

One thing that this visual immediately makes clear is that it would be difficult to perceive any real gap in US capabilities—the existing arsenal certainly does not lack for nuclear options for any occasion.

Fig. 1 (Source: UCS)

The blue bars in the image above each represent the explosive power, in kilotons, of an existing weapon or weapons in the US arsenal. As you can see, the US already has weapons with yields ranging from 0.3 to 1,200 kilotons—from 1/50 to 80 times the yield of the roughly 15 kiloton bomb that the United States dropped on Hiroshima in WWII (represented by the dotted bar in the chart). On the lower-yield side, the US currently deploys around 450 weapons, sometimes called “tactical nuclear weapons,” that have options for yields ranging from 0.3 to 10 kilotons. These include both an air-launched cruise missile and gravity bombs. It also deploys another roughly 2,000 weapons with higher—in some cases much higher—yields, ranging from 45 to 1,200 kilotons.

Still, the administration is proposing to fill this non-existent gap with a new lower-yield submarine-launched warhead, called the W76-2—the orange bar in Fig. 1. This new warhead would reportedly have a yield of 6.5 kilotons—right in the middle of the range of existing US low-yield nuclear options. “Low”-yield in the case of 6.5 kilotons, however, is a pretty questionable description. As noted above, the nuclear bomb that the US dropped on Hiroshima in WWII had a yield of roughly 15 kilotons—a little more than twice that of the proposed new weapon. It killed 100,000 people and reduced the city to rubble.
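If you want to check those comparisons yourself, the arithmetic is simple. Here is a minimal sketch in Python (the yield figures are the ones cited above; everything else is just division):

```python
# Explosive yields in kilotons, as cited in the text above.
HIROSHIMA_KT = 15.0     # approximate yield of the Hiroshima bomb
LOWEST_US_KT = 0.3      # low end of existing US low-yield options
HIGHEST_US_KT = 1200.0  # high end of the existing US arsenal
W76_2_KT = 6.5          # reported yield of the proposed W76-2

print(f"Lowest US yield vs. Hiroshima:  1/{HIROSHIMA_KT / LOWEST_US_KT:.0f}")  # 1/50
print(f"Highest US yield vs. Hiroshima: {HIGHEST_US_KT / HIROSHIMA_KT:.0f}x")  # 80x
print(f"Hiroshima vs. W76-2:            {HIROSHIMA_KT / W76_2_KT:.1f}x")       # 2.3x
```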

Labeling such deadly and destructive weapons “low-yield” may give leaders the dangerous impression that using them is not as serious as using a nuclear weapon with a larger yield, and that their use would not lead to full-scale nuclear war. But in reality, no one knows what would happen if a nuclear weapon—of any size—were once again used in war. As Defense Secretary James Mattis has said,

“I don’t think there’s any such thing as a tactical nuclear weapon. Any nuclear weapon used at any time is a strategic game changer.”

The administration’s choice of language in the NPR rationale for the new warhead is also interesting. It does not argue that such a gap actually exists, but that it is concerned an adversary might mistakenly perceive one. While perceptions are always an important consideration in deterrence, it’s useful to keep in mind that 1) we don’t actually know what our adversaries are thinking, and we’ve been dangerously wrong in past guesses; and 2) trying to ensure that no country could ever possibly perceive that it might have any type of military advantage is how arms races happen. Most relevant in the current situation, it is how the US and Soviet Union ended the Cold War with arsenals of tens of thousands of nuclear weapons each. This type of thinking is not about deterrence, but about “escalation dominance” and “nuclear warfighting,” both of which are even more unstable and dangerous.

Recognition of the particular dangers of low-yield nuclear weapons has, until recently, been widespread and bipartisan among US military and political leaders. Over the past several decades, the United States has eliminated much of its arsenal of low-yield nuclear weapons for this and other reasons. The Trump administration’s new move to develop more of these weapons is a step in the wrong direction that is both unnecessary and dangerous.

Peter Wright’s 50+ Chemical Facility Conflicts: A Disaster Waiting to Happen

UCS Blog - The Equation

Peter Wright, President Donald Trump’s nominee to lead the EPA’s Office of Land and Emergency Management, will face the Senate Environment and Public Works committee at his nomination hearing this Wednesday. Mr. Wright has spent the majority of his career working as an attorney for Dow Chemical Company (now DowDuPont). Would he make a smooth transition from defender of polluters to defender of the public? Under Pruitt’s lead, it seems unlikely that public safety would be at the top of his agenda.

As I wrote back in April, Mr. Wright is on tap to lead the EPA’s chemical hazards arm, including the Superfund program and the Risk Management Program (RMP). The agency is currently “reconsidering” the Chemical Disaster Rule, which would have improved the RMP by helping to make the communities surrounding the 12,500 facilities regulated under the program safer and better informed.

DowDuPont itself, or one of its subsidiaries, owns over 50 RMP facilities. An analysis of EPA data on accidents at those facilities shows that Dow, DuPont, and their subsidiaries averaged 7 chemical disaster incidents per year, for a total of 99 fires, explosions, spills or gas releases from 2004 to 2016. These accidents resulted in the deaths of 6 workers and caused over 200 people to be hospitalized or seek medical treatment for injuries.

Here are just a few accounts of accidents from those DowDuPont facilities:

  • At Dow’s Texas Operations plant in Freeport, TX, there was a series of reported accidents from 2004 to 2016. One July 2014 fire resulted in $200,000 in property damage, and a liquid spill later that month injured two employees. Then, in July 2015, a gas release and liquid spill shut down a major highway for several hours. Neither the city of Freeport nor its residents were informed by Dow Chemical; instead they learned of the release from the local news network. Freeport’s fire chief, who should have been notified by Dow through the community awareness and emergency response line, told reporters, “We can’t be left in the dark while we are trying to protect our community.”
  • In Hahnville, LA, Dow’s St. Charles operations had a chemical leak of ethyl acrylate in July 2009. Parish residents reported respiratory impacts that sent almost 30 people to the hospital with eye and nose irritation. Dow notified the St. Charles Parish Emergency Operations Center (EOC) about the leak, but it was unclear whether the EOC had adequate information from the company about the chemical’s risks. This facility has continued to have issues, including a 2014 liquid spill that injured one worker and a 2015 gas release that injured three plant employees.
  • A particularly infamous DuPont facility accident occurred at its insecticide plant in La Porte, TX in November 2014. Two workers died when exposed to methyl mercaptan as they were attempting to fix what they thought was a routine problem, and two other workers died after responding to the others’ distress call. After an investigation into the disaster, Chemical Safety Board chairperson Vanessa Allen Sutherland said that “this investigation has uncovered weaknesses or failures in DuPont’s safety planning and procedures.” These weaknesses included inadequate gas detectors, nonfunctional ventilation fans, outdated alarms, no system in place to measure the quantity of toxins leaked beyond property lines, an inadequate process for assembling an internal response team, and nonfunctional emergency vehicles. This plant had been fined in the past for spills and gas releases that had injured workers. In 2015, the Occupational Safety and Health Administration issued penalties totaling $273,000 to DuPont for a variety of violations at this plant, and the company finally announced that it would be closing the plant in 2016.

Note that all three of these facilities are at increased risk of flooding due to hurricanes, like the disaster at the Crosby, TX Arkema plant as a result of Hurricane Harvey last summer. Improvements to the chemical safety rule would have helped facilities plan for future floods, including plans crucial for mitigating incidents and exposures. As EPA prepares to rescind such improvements, Administrator Scott Pruitt is allowing business as usual at these facilities, which face the increased threat of flooding, spills, and releases as a result of natural disasters.

Accidents at chemical facilities that may injure and sometimes kill plant workers and residents of nearby communities are not to be taken lightly. There have been 46 such incidents already this year. The future head of OLEM should be advocating for changes at plants that help prevent disasters like these from ever happening. The Chemical Disaster Rule would have helped ensure that adequate measures were in place so that emergency responders have rapid access to information about chemical risks before entering buildings after reports of a spill or release.

Mr. Wright has spent years defending Dow Chemical Co, a member of the American Chemistry Council, which has lobbied long and hard against stronger safety precautions for chemical plants to save its member companies the money it would cost to make critical preparedness updates. At his hearing on Wednesday, Senators should ask Wright for one reason to trust that he would be looking after the public interest in his role at OLEM, because advocates in favor of the RMP amendments who spoke at last week’s public comment hearing certainly don’t see a 20-year run at Dow Chemical as supporting evidence. Perhaps Savannah, Georgia, community organizer Mildred McClain characterized the current situation best when she told the EPA, “If industries were authentic in their pursuit of justice for the communities, they would listen to the voices of the residents…The companies will just keep saying ‘I’m meeting the EPA standard’ while the community members are saying, ‘but we’re sick, we still smell stuff and we still don’t have a concrete plan as to what we’d do if there was a major disaster.’”

While we wait to see how Mr. Wright responds to Senate questioning this week, there is still plenty of time to comment on the EPA’s proposed rule; the comment period is open until June 29th. You can join us in urging Pruitt to consider worker and community health over industry costs by submitting a comment today. For assistance developing a strong comment, check out this RMP public comment guide.

Photo: Roy Luck/Flickr

Massachusetts Senate Unanimously Endorses a Bold Vision for Clean Transportation

UCS Blog - The Equation

Photo: Eric Kilby/Flickr

The Massachusetts Senate yesterday unanimously passed an energy bill that promises to dramatically reshape the vehicles and fuels that power our transportation system.

If enacted, this legislation would make Massachusetts a national and even global leader in the deployment of electric vehicle technology. It would dramatically reduce our consumption of oil, and the pollution that comes from petroleum. It would save lives by significantly improving air quality, especially in urban areas. It would produce a stronger and more resilient modern grid that will provide ratepayers with greater efficiency and reliability. And it would produce long-term cost savings for Massachusetts drivers and transit agencies.

And I’m just talking about the provisions related to transportation. (See here for more about the provisions related to renewable energy and storage; in Spanish here.) But even a focus on just the transportation sector is meaty enough. Here’s why this bill takes on transportation emissions, and what it would do.

Transportation and climate goals

Let me start with a little background on transportation and Massachusetts climate policy.

Under Massachusetts climate law (the Global Warming Solutions Act or “GWSA”), the state is required to achieve significant reductions in economy-wide global warming emissions. When it comes to emissions from electricity, we’re making remarkable progress: over the past decade Massachusetts has cut our emissions from electricity impressively, thanks in part to a strong set of policies, including our investments in energy efficiency and our participation in the Regional Greenhouse Gas Initiative (or “RGGI”).

But when it comes to transportation, things have been more difficult. Stronger vehicle efficiency standards have helped reduce emissions somewhat since 2008, but the gains in efficiency have been partially offset by increases in total driving and in purchases of SUVs and light trucks. Electric vehicles are a technology with extraordinary promise but still represent only about 2 percent of new vehicle sales. The state has committed to putting 300,000 electric vehicles on the road by 2025, but we have a ways to go to achieve that goal.

Overall, transportation emissions are about the same as they were in 1990, and transportation now represents the largest source of pollution in Massachusetts, including over 40% of global warming emissions.

What would this bill do?

The Senate bill would accelerate the rapid electrification of our transportation system, while taking steps to ensure that all residents of Massachusetts benefit from electric vehicle technology.

To start with, the Senate bill envisions the end of diesel fuel in our public transportation system. Heavy duty diesel engines are some of the dirtiest vehicles on the road, contributing significantly to urban air pollution that causes asthma and other respiratory problems. Existing technologies such as electric buses have zero tail pipe emissions and can reduce global warming emissions from diesel equivalents nearly 80 percent on today’s grid. The bill would require the Department of Transportation to replace all diesel engines with zero-emission vehicles in its bus, commuter rail, and marine fleet. This would build on recent announcements in California and New York City to electrify their vehicle fleets.

The bill would also require the Department of Transportation to enact policies that would ensure that 25% of all vehicles on the road are electric by 2026. This would represent a major increase from current levels, as electric vehicles are currently about 2 percent of new vehicles sold in Massachusetts. Achieving sales at those rates will require policies that are strong enough to make electric vehicles affordable for low- and moderate-income residents, in addition to building the infrastructure necessary to keep EVs charged.

As we build out our electric vehicle infrastructure, it’s going to be important for us to also think about the impact of EV charging on our electric grid.  Another important provision of this legislation would require utilities to offer rates that reward electric vehicle drivers for charging their vehicles at night, when electricity use is low. These “time of use” rates can lead to big savings not only for electric vehicle drivers, but for all ratepayers.

The critical role of market-based programs

The Senate bill not only sets out big goals, it also identifies a way to pay for the investments that we need in clean transportation: by requiring the state to establish a market-based program to reduce emissions from transportation fuels (as well as heating fuels).

Under RGGI, Massachusetts and the other states of the Northeast have placed a limit on emissions from power plants. This limit is enforced by a requirement that power plant operators purchase allowances that are sold in regional auctions. By limiting the number of allowances available, RGGI ensures overall emission reductions. Meanwhile, the sale of allowances raises money that is invested in efficiency and renewable energy.
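To make the mechanics concrete, here is a toy sketch of how a cap-and-invest program links the emissions limit to investment revenue. All numbers below are hypothetical, chosen only to illustrate the structure; they are not actual RGGI figures:

```python
# Toy cap-and-invest model: the cap limits total emissions, and
# auction revenue funds clean energy investment.
# All numbers are hypothetical illustrations, not actual RGGI data.
cap_tons = 50_000_000   # hypothetical annual emissions cap (tons of CO2)
clearing_price = 5.00   # hypothetical auction clearing price ($ per ton)

# Emitters must hold one allowance per ton emitted, so total emissions
# cannot exceed the cap no matter how allowances are distributed.
auction_revenue = cap_tons * clearing_price
print(f"Revenue available for investment: ${auction_revenue:,.0f}")
# -> Revenue available for investment: $250,000,000

# Tightening the cap in later years lowers allowed emissions directly,
# while the auctions keep raising money for efficiency and renewables.
```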

This “cap and invest” model has proven effective in reducing emissions from electricity—and beyond. While RGGI only applies to electricity, other jurisdictions, including California, Ontario and Quebec, have expanded this model into heating and transportation fuels and the result has been billions in new investments in clean transportation. California alone is projected to spend over $2 billion on clean transportation investments this year – money that is going to expanded electric vehicle incentives, electric buses, improved transit services, and more affordable housing near transit.

If Massachusetts adopted a program similar to the California-Ontario-Quebec model, it could raise over $450 million per year in investments in clean transportation. That would be enough to not only make a major new investment in electric vehicles, but also to address critical issues facing our Commonwealth such as public transportation and affordable housing.

The path ahead

With the Senate promising bold action to accelerate the electrification of transportation, the action now turns towards the House of Representatives. The House has shown interest in promoting electric vehicles this session, including a good bill from Rep. Tom Golden, chair of the energy committee, that would encourage car dealers to sell electric vehicles.

One big question is whether legislators in both chambers agree on a sustainable and dedicated source of funding for investments in clean transportation. Too often, the policies we use to promote electric vehicles are based on one-time infusions of funds, such as the $12 million that Gov. Charlie Baker committed to the state’s main electric vehicle incentive in December 2016. The problem with one-time cash infusions is that they expire: the state is now running out of funds and may have to cut back on its rebate program.

The Senate’s proposals in that regard—and the many other good transportation provisions in last night’s bill—are welcome indeed.


Hey California, Let’s Spare the Air and Turn Down the Gas

UCS Blog - The Equation

The Glenarm natural gas plant in Pasadena, California. Source: Wikimedia

On March 4, California set a new record by supplying nearly half of the state’s electricity needs from renewables. That’s just the latest payoff of the state’s admirable clean energy investments, thanks to plentiful solar power and strong policies like the Renewables Portfolio Standard (RPS).

But California still relies on fossil fuels, via natural gas power plants, for nearly 40 percent of annual electricity needs. In fact, in-state natural gas generation accounts for about 10 percent of greenhouse gas emissions statewide. Reaching our long-term energy and climate goals means ramping up renewables and at the same time turning down our gas.

In many cases, gas plants will be turned off during the day, when renewable generation is most abundant. However, as the sun sets, solar generation decreases and natural gas plants must be turned on—or, if they’re already operating, they must ramp up generation to meet the evening demand spike.

The solution to this evening ramp problem is to:

  • Build cleaner alternatives to gas that can produce power in the evening,
  • Build more energy storage, and
  • Use load shifting and increased energy efficiency to reduce evening electricity demand.

Many of the state’s natural gas power plants were constructed to provide baseload power, meaning they were designed to stay on all day, nearly every day. Most of California’s natural gas plants were not designed to be turned on and off daily, nor was such frequent cycling anticipated in their original air quality permits. A natural gas plant starting up can produce as much as 30 times more nitrogen oxide (NOx) emissions than it will after it has been running for a few hours.

Nitrogen oxides are a key ingredient of smog. They irritate lung tissue, exacerbate asthma, and make people more susceptible to respiratory illnesses like pneumonia and influenza. Starting up gas plants more often could increase air pollution concentrations and should be considered in their air permits.

To make sure California’s clean energy transition also reduces criteria air pollution from natural gas plants, UCS is proudly co-sponsoring legislation—Senate Bill 64 by Senator Bob Wieckowski—with the California Environmental Justice Alliance and the Clean Power Campaign. The legislation aims to do three things:

  • Require generators to provide data on the hourly change in emissions, startups, shutdowns, and cycling. Many plants are required to report hourly emissions data to the US EPA, but the data is not in a user-friendly format, and it is difficult to ascertain how power plant operations are changing over time without complex analysis. More accessible information about how power plants are actually operating, as opposed to how they were predicted to operate when they were first permitted, is an essential first step toward better decisions about how the dispatch of natural gas power plants affects local air quality.
  • Require local air districts to analyze, using this data, how power plants are currently operating and likely to operate in the future, to ensure air quality protections are included in applicable permits. SB 64 would also require air districts to limit generation from the dirtiest power plants on days with poor air quality as long as the needs of the grid can be met with other resources. Since the worst air quality days are often the hottest days with the highest electricity need, limits on power plant dispatch must not jeopardize grid reliability.
  • Require the state to plan for how it will reduce natural gas generation and accelerate the eventual retirement of gas plants, placing a priority on reducing natural gas generation in communities most impacted by air pollution.

Because natural gas–fired power plants supply a substantial portion of California’s current electricity demand and support grid reliability, natural gas generation will continue to play a role on California’s electricity grid for some time.

California is charting new territory for other states and countries in terms of the level of renewables on the grid, and making a dramatic shift away from natural gas generation will not happen overnight. But Californians are already starting to feel the impacts of climate change, and communities in California breathe some of the unhealthiest air in the country.

For these reasons, it’s critical that the state shift to cleaner sources for all of its energy needs including electricity. The state needs better tools to understand how changing natural gas plant operations may impact air quality, and an explicit mechanism in law for air districts to coordinate with grid operators to reduce the dispatch of natural gas power plants on the worst air quality days. SB 64 would ensure that California’s ramp-up in clean generation does not lead to the unintended consequence of frequently cycling natural gas power plants in a way that leads to increased air pollution.

Contents Under Pressure: Speak Out Against EPA Proposed Chemical Facility Safety Rollbacks That Put Communities at Risk

UCS Blog - The Equation

The Chevron Richmond refinery fire, August 6, 2012. Photo: Greg Kunit/CC BY-NC-SA 2.0 (Flickr)

Over the last year, we have written extensively on the actions that Scott Pruitt’s Environmental Protection Agency (EPA) has taken to eliminate or weaken critical science-based protections, particularly on chemical facility safety. From the outset, Pruitt was determined to delay the implementation of updates to the Risk Management Plan (RMP) that called for the assessment of safer technologies, more accessible and quality information for communities near facilities, and improved emergency response coordination. Now with a new proposed rule, the saga continues as the EPA under Pruitt moves one step closer to eliminating hard-fought improvements to the RMP.

Tomorrow, the EPA will hold a public hearing in Washington, D.C. for comments on the proposed changes to the RMP amendments. This is the only hearing being held on the proposed rule, which means that, just like last year, the frontline communities affected by these decisions will likely not have an opportunity to speak out against it. I will be testifying on behalf of UCS, in hopes of amplifying the concerns of the communities that are unable to be present. You can see my prepared comments below.

The agency is currently taking written comments on the proposed rule, open until June 29, 2018. UCS has created an RMP public comment guide for tips on writing a strong comment for this particular rule, which can be found here. Please join us in telling Administrator Pruitt that the health and safety of frontline communities, workers, and first responders will not be best served by favoring industry.

***

Thank you for this opportunity to speak on the proposed amendments to the Risk Management Plan.

My name is Charise Johnson. I am here on behalf of the Union of Concerned Scientists. With more than 500,000 members and supporters across the country, we are a nonpartisan, nonprofit group dedicated to improving public policy through rigorous and independent science. This proposed rule rolls back many of the critical public safeguards implemented in the 2017 Chemical Disaster Rule. Just last year, I was in this building along with many other partners and fence-line community groups asking EPA to end its dangerous delay of the 2017 Chemical Disaster Rule. Those updates to the original RMP were hard fought, were deliberated by various stakeholders including multiple agencies, and took several years to finalize. Now I am here today to ask the EPA to rescind these dangerous rollbacks.

This rule is particularly important to the health and safety of fence-line communities, first responders, and workers in the facilities. The Husky Energy Oil Refinery explosion in Wisconsin, the Valero Refinery explosion and fire in Texas, and the Chevron Richmond Refinery flaring of at least 500 pounds of sulfur dioxide in California are a few examples from just the past two months of why chemical facilities need to coordinate better with first responders, offer communities more direct access to the information they need to plan evacuations, and assess safer practices that could make workers and surrounding communities safer in case of an accident. And as severe weather events strengthen, such as the intense hurricane seasons in the Gulf region, chemical disasters like the Arkema explosion will become more common for neighboring communities.

The modest, commonsense requirements that the EPA is aiming to roll back include:

  • A requirement that industrial facilities presenting the highest risks undertake a safer technology alternatives assessment (STAA). Safer technology alternatives assessment is a business best practice; industry should be looking at ways to make its practices and technology safer for its facilities, workers, and the surrounding communities.
  • A requirement that an “incident analysis” include determining the “root cause” of the incident to avoid such incidents in the future. Root cause analyses are necessary to determine the cause of an incident or near miss so the facility can fix the problem and prevent a future disaster.
  • A requirement that qualified, independent third-party audits be conducted when a facility has an incident to ensure the cause of the incident is addressed. In the case of the highest-risk facilities and extreme incidents, a third-party audit is necessary to gain an objective view and assessment of the safety of the facility.
  • A requirement that facilities provide the public with information critical to the surrounding communities’ understanding of the potential risks from these facilities, including how to protect themselves should a release occur and what potential health risks they might face from a recent release incident. Information sharing should be a basic tenet of this rule. The EPA requires individuals to travel to their state’s federal reading room to acquire information on facilities, yet not every state has a reading room, and some people must travel great distances. Communities and first responders deserve better access to basic information about the facilities in their community, such as five-year accident histories, safety data sheets, planned emergency exercises, and evacuation information. The current restrictions limit access to information the public has a right to know and hamper the ability of affected communities to prepare for chemical risks.
  • A requirement that facilities provide emergency planners and first responders with additional information needed for responding to a chemical release. The proposal would return to the status quo, where companies have more leeway to refuse to share relevant safety information with first responders.

EPA’s own rulemaking states that the proposed changes to this rule would impact low-income communities and communities of color the hardest. We are here in solidarity with our environmental justice community partners, including the Environmental Justice Health Alliance (EJHA) and Texas Environmental Justice Advocacy Services (TEJAS), among countless others who are among the few community voices able to make it all the way to DC to make sure the EPA considers vulnerable communities over industry profits.

Since the delay of the 2017 Chemical Disaster Rule, there have been at least 45 known incidents at chemical facilities. That is at least 45 incidents too many. The 2017 finalized amendments are commonsense protections that could have helped prevent and mitigate the harm of those chemical disasters and protect us from future ones. EPA needs to put the health and safety of the public first, and not move forward with this proposed rule.

 

New Evidence Shows Just How Bad the Trump Administration is at Governing

UCS Blog - The Equation

Photo: Gage Skidmore/Flickr

President Trump likes to brag about how many regulations his administration is removing. The President’s “2 for 1” order requires federal agencies to revoke two regulations for every new rule they want to issue. This order is aimed at getting “rid of the redundancy and duplication that wastes your time and money.” Call me crazy, but I’d like to keep the regulations on the books that protect consumers, safeguard clean air and water, and keep our kids safe. Forcing agencies to dump two rules for every new one requires agencies to take a sledgehammer to any imperfect rule (and its friend) when a scalpel would suffice.

Regardless of what the Trump Administration boasts, a dive into the data shows just how ineffective the President has been at both enacting and removing regulations. The Office of Management and Budget (OMB) reviews every rule that a federal agency wants to enact and counts how many rules come through its doors. I took a look at the OMB database to find out how many rules – including repeals of existing rules and new rules – the Trump Administration sent to OMB during roughly its first year and a half, compared to how many the Obama Administration sent during the same stretch of its first term.

The results are striking.

The Trump Administration has sent OMB about half as many economically significant* rules for review, and about a third as many rules overall, as the Obama Administration did over the same timeframe.

 

Rules sent to OMB for review        OBAMA (1/20/09 – 06/01/10)    TRUMP (1/20/17 – 06/01/18)          % DIFFERENCE
Economically significant* rules     173                           90 (75 concluded + 15 pending)      -48%
All rules                           889                           338 (273 concluded + 65 pending)    -62%

*As defined in Executive Order 12866, a rulemaking action that will have an annual effect on the economy of $100 million or more or will adversely affect in a material way the economy, a sector of the economy, productivity, competition, jobs, the environment, public health, or safety, or State, local or tribal governments or communities.
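The percent differences follow directly from the counts in the table. Here is a quick check (a minimal sketch; the Trump totals include pending rules):

```python
# Rule counts sent to OMB during each administration's first ~16.5 months,
# taken from the table above.
obama = {"economically significant": 173, "all rules": 889}
trump = {"economically significant": 90, "all rules": 338}

for category in obama:
    pct = (trump[category] - obama[category]) / obama[category] * 100
    print(f"{category}: {pct:.0f}%")
# economically significant: -48%
# all rules: -62%
```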

 

This evidence further supports claims that the White House and our federal agencies have been hollowed out by the Trump Administration, which has ostracized longstanding public servants, failed to fill key roles, and created a working environment no one wants to join. In 2017 alone, nearly 400 workers left the EPA, and the agency’s staffing has reached its lowest point in almost 30 years. If every EPA employee eligible to retire by 2021 does so, the EPA would have fewer than 8,000 employees by the end of President Trump’s term – a cut of nearly half.

Overall, this weakening of the federal government’s ability to enact and update rules is having a negative effect on public health and our environment. EPA has fewer staff to push back against an attempt by EPA Administrator Pruitt to delay rules designed to prevent chemical disasters. The Department of the Interior is accepting early retirements and has fewer staff to manage our public lands and push back against industry petitions for oil and gas drilling permits. And the White House itself seems unable to present a coordinated approach to policy, both foreign and domestic.

But fear not! The good news is that UCS is helping mobilize people across the country to #StandUpForScience, and is helping organize events to promote the use of science in sound rulemaking. UCS is also working to stop the rollback of rules designed to protect public health and fighting the Trump Administration’s efforts to stymie the effectiveness of the federal government. You can join these efforts too! All it takes is signing up for some emails and devoting some time to help make a difference. See you there!


Equitable Access to Solar Energy in Massachusetts Is in the Senate’s Hands

UCS Blog - The Equation

Community shared solar installations have become an alternative for providing solar access to everyone while taking advantage of economies of scale. (Source: Wikimedia)

UPDATE: The Senate is voting on a clean energy bill today. If you live in Massachusetts, contact your senator and make your voice heard. You can send an email (in English here) or, even better, call (phone numbers here).

Solar energy in Massachusetts has helped families with access to it lower their electricity bills, made the air we breathe in the state healthier, generated more than 11,000 jobs, and given those who want to contribute to the fight against climate change an ally in solar panels for generating electricity.

Unfortunately, because nearly 40% of the Massachusetts population rents, more than 2 million people cannot install solar panels on the roofs of their homes, since they do not own them. The number is even higher if we count those who live in buildings with multiple owners, or in homes with small roofs, poor orientation, or otherwise less-than-optimal conditions for installing solar panels.

Community shared solar in Massachusetts

One alternative for those who cannot install solar panels on their own roofs is community shared solar. A shared solar installation is a solar project developed by an organization or company that installs a larger number of panels in a suitable location. Subscribers invest in the project, buy its electricity, or receive other specific benefits, such as credits that lower their electric bills.

Despite the great access opportunity these installations represent, Massachusetts solar legislation changed in 2016, seriously hurting shared installations. Today, those with solar panels installed on the roofs of their homes can be compensated in full for the electricity their panels produce that is not used in their households. By contrast, those with community shared solar installations are credited only 60% of the value of the electricity their panels produce.

This clearly hurts the finances of those who want to benefit from solar energy but lack a suitable roof for accessing it. It is almost as if the 2016 legislation penalized not owning one’s own roof, by varying the terms under which residents are compensated according to consumers’ purchasing power. This change therefore also disproportionately affects low-income people and members of racial and ethnic minority groups, who are the most likely to rent or to live in multifamily housing.

Solar’s shine: in the Senate’s hands

Flowers are blooming. Now it’s time for new laws to do the same. (Credit: J. Rogers)

Fortunately, a number of proposals led by senators such as Sonia Chang-Diaz and Jamie Eldridge, as well as organizations such as Boston Community Capital, are being considered to ensure that the Senate helps correct the equity problems in the 2016 legislation. For example, the proposals would require that at least a minimum percentage of solar incentives go to low-income households and to urban and environmental justice communities. The idea is to guarantee that the economic and environmental benefits of these incentives are distributed equitably. They would also compensate shared solar systems for 100% of the electricity they produce, and create programs to provide solar access to communities for whom English is not their native language.

The Massachusetts Senate thus holds in its hands the ability to pass the legislation needed so that those who do not own a roof (or own one without suitable conditions) can access the same solar benefits as those who do. This is a golden opportunity (given its value for society, the economy, and the environment), and we hope it does not slip through the Senate’s fingers.

Massachusetts’s Clean Energy Economy: What the Legislature Needs to Do Now

UCS Blog - The Equation

The Massachusetts legislature is in the final weeks of its two-year legislative session, and there’s finally some movement in both houses on our clean energy future. Here are four things that need to happen to get to the legislative finish line, and why.

But first: some context.

Where the state is coming from

Flowers are blooming on Beacon Hill. Now we need legislation to blossom. (Credit: J. Rogers)

Many of the issues we talked about on this platform a few months ago—the key issues for next steps on clean energy in Massachusetts—are still the issues of the day. What’s different now is that the state and our utilities have selected a project to supply sizeable amounts of low-carbon power (Canadian hydro, in this case). And we’ve just selected the first tranche of Massachusetts offshore wind power.

Both of those advances were courtesy of the legislature’s strong action in 2016, when it passed the Energy Diversity Act. What we need now is to fill in the gaps left in that package.

What got left undone

In the lead-up to what became the 2016 law, UCS published an analysis about the risks of natural gas overreliance and the possibility of bringing new and stronger strategies to bear on the problem. That analysis focused on both natural gas risks and carbon pollution, and the prospects for cutting both.

The analysis showed that combining the large-scale renewables procurement, a hefty offshore wind commitment, and a strengthening of the state’s renewable portfolio standard (RPS) could do the trick nicely, and at low cost.

The RPS has been a major driver of renewable energy in and for Massachusetts, and keeping it out ahead of the market—think of it as a carrot in front of the (non-partisan) donkey of energy progress—is what keeps things humming and growing.

Alas, strengthening the RPS was one big piece that wasn’t included in the final version of the Energy Diversity Act. But leaders in both the house and senate pledged to rectify that this session, given that the coming offshore wind will eat up basically all of the new renewable energy demand that growth in the RPS over the next dozen years would otherwise create. And we need the RPS to drive more than that.

Also needed, but lacking, was a long-term solution to let solar—rooftop, large-scale, or other—keep doing what we need it to do: diversify our electricity mix, increase our energy resilience, and create jobs, jobs, and more jobs.

So, what needs to happen?

1. The house needs to pass strong energy bills

Last month the really important energy committee of the House of Representatives “reported out” a suite of bills that can serve as the basis for strong action by the whole chamber:

  • RPS – Increases our requirement for new renewable energy to 35% by 2030—a nice round(ish) number, though not as strong as what other states have done (50×30; see below), or even our neighbor Connecticut (40×30)
  • Solar – Raises the “net-metering” caps that keep many solar projects from connecting and keep disrupting the growth of our solar economy (though our strong solar economy and interest in investing in solar mean that we’ll hit even those new caps soon)
  • Energy storage – Aims to strengthen our power sector’s resilience with targeted deployments of batteries or other technologies
  • Electric vehicles – Drives EVs with rebates and other policies

There’s also an important bill to take our nation-leading energy efficiency efforts to the next level.

Several of the bills are now in front of the House Ways & Means Committee, so it’s up to that committee to strengthen them and get them to the full house, and quickly. And then it’s up to the full membership to strengthen and pass the lot of them.

2. The senate needs to pass its strong, multi-part bill.

Meanwhile, elsewhere in the state house… The Massachusetts Senate Ways & Means Committee passed a solid bill last week. The “Act to Promote a Clean Energy Future” (Senate Bill 2545) includes a range of important provisions to drive progress in the electricity sector and more. A taste:

  • RPS – Increases our requirement for new renewable energy to 50% by 2030—on par with requirements in New York, New Jersey, California, and other states
  • Solar – Removes the net-metering caps, and addresses some other barriers that have been thrown up in utility proceedings
  • Climate progress – Lays out a process for setting targets for cutting our carbon pollution by 2030 and 2040, and makes sure the administration has the tools to address carbon emissions from our transportation sector, and in the commercial and industrial sectors
  • Energy storage – Has the state get a lot more serious about driving innovation in energy storage
  • Plus – More offshore wind, more clean energy procurements, more…

The bill deliberations will also present some opportunities for getting it even more right. On solar, for example, a key issue is making the technology more accessible to a broader swath of our society, in particular with programs aimed at low-income households.

All in all, and particularly with strengthening amendments, a solid foundation for clean energy progress. And the full senate will be taking up the bill any day now. (Watch this space for more info.)

3. The two chambers need to agree on a final version.

Unless one chamber passes verbatim the bill(s) that the other one has approved, whatever comes out of these processes will need to go to a conference committee—basically, a small group appointed by each chamber to hammer out something that will satisfy members of both bodies.

That means the house and senate need to get close enough so that they have something that qualifies for conference. What we can’t have happen is ending up with a suite of bills on one hand and a multi-part bill on the other, and no way to connect the two. That should be an important consideration for the leadership of each chamber.

Once the bills get into conference, and the conference committee does its thing, then the membership of each needs to say, “Aye.”

4. The governor needs to sign.

One last step: The governor. Gov. Charlie Baker has contributed a few pieces to the energy-future discussion, so whatever results from the legislative process is likely to reflect some of his priorities, and likely to earn his signature.

And then?

And then… we make it happen. There will be regulatory proceedings to go through—basically, the process of filling in the blanks, putting meat on the bones of the new statutes, so that people know the rules of the game for implementation.

None of this procedural stuff, on the legislative or regulatory fronts, is entirely easy. But it’s not particularly hard, either. And we know that the technologies are there and ready, and that so are the entrepreneurs, the businesses, and the customers.

So what we need from our legislature, and now, is the framework for progress. Not further reflection. Not studies. Action. That’s what leadership is about.

And that’s what we’re looking for over the next few weeks, as citizens and constituents, residents and ratepayers.

We Ranked All 50 States from Farm to Fork. Why We Bothered—and a Taste of Our Takeaways

UCS Blog - The Equation

Photo: Preston Keres, USDA

Recently, some fellow data geeks and I spent (quite a lot of) time ranking all 50 states on the health and sustainability of their food systems, from soil to spoon.

We went through the trouble for a few reasons. First, as you may have heard in bits and pieces, the state of our farms, our food supply, and our dietary health is not good—globally, nationally, regionally, and likely even in your neighborhood. As all these things are interrelated, we wanted to dig into the data to better understand what’s going on. Second, when it comes to food systems, we believe that the United States can do better. And, since innovative solutions are already popping up across the country, highlighting these as models may be key to building a healthy, sustainable, and just world. Finally—call us crazy—but we just love data and (yes) food systems.

What’s the fuss about the food system?

Before explaining what we did, let me refresh your memory about some of the most worrisome food system trends. Globally, you likely know that with population growth, climate change, and 11 percent of the world facing hunger, pressures on food supplies and natural resources are intense. And although there’s growing dialogue around transformative solutions to these intertwined challenges, the United States isn’t exactly leading the way.

In the past year, we as a country fell squarely in the “also-ran” category in a Food Sustainability Index; the US Department of Agriculture (USDA) reported that our public agricultural R&D funding has been losing ground; and we withdrew from the Paris Agreement, which addresses the growing threat of climate change (with serious implications for agriculture, and maybe also the nutritional quality of our food).

But you don’t have to look beyond our borders to see signs of trouble. US farms are disappearing, rural communities are struggling, policy debates are putting farmers and eaters under stress, the food system includes some of the worst employers in the country, the Gulf of Mexico dead zone continues to be huge, and so on. Clearly, we need to seek solutions, but where to begin?

The not-so-secret ingredients in our scorecard

With an eye toward opportunities, we set off to capture and crunch the numbers to provide a snapshot of the US food system. To this end, we delved into data dealing with different pieces of the problem, including farming practices, labor conditions, water quality, public health, and more. We explored data sources such as the USDA, the Bureau of Labor Statistics, the Environmental Protection Agency, the Centers for Disease Control and Prevention, and the Census Bureau.

While we can’t possibly claim to have uncovered everything, we searched until we felt we had a critical mass of information representing food systems from coast to coast. With data for 68 indicators, we looked for patterns and potential (read more about our methods). We aimed to standardize data to compare states with both similarities and differences (natural resources, geographies, histories, cultures, populations, etc.). Finally, we grouped data into categories representing core aspects of the food system, and we synthesized these to get a sense of which states are leading the way.
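For readers curious what “standardize, group, and synthesize” can look like in practice, here is a minimal sketch of one common approach (z-score standardization followed by category averaging). The indicator names and values below are made up for illustration; the actual scorecard methods are described at the link above:

```python
import statistics

# Made-up indicator values for three states (illustrative only,
# not the actual scorecard data).
raw = {
    "cover_crop_acres_pct":     {"IA": 4.0, "MD": 28.0, "CA": 6.0},
    "farmers_markets_per_100k": {"IA": 7.0, "MD": 5.5,  "CA": 2.2},
}

def standardize(values):
    """Convert raw values to z-scores so indicators measured on
    different scales can be averaged together."""
    mean = statistics.mean(values.values())
    sd = statistics.stdev(values.values())
    return {state: (v - mean) / sd for state, v in values.items()}

# Standardize each indicator, average z-scores per state into a
# category score, then rank states by that score.
z = {name: standardize(vals) for name, vals in raw.items()}
states = sorted({s for vals in raw.values() for s in vals})
scores = {s: statistics.mean(z[name][s] for name in z) for s in states}
print(sorted(scores, key=scores.get, reverse=True))
```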

The report? A mixed bag 

All in all, our analysis revealed both strengths and weaknesses of US food systems, distributed all across the country. To learn more and see where your state falls in the rankings—with maps, charts, and stories—you should check out our interactive scorecard. Here, I’ll just offer a flavor of our findings:

  • Action abounds: On the plus side, we found that different states rank better on different aspects of food systems, meaning that all states have a role to play in leading the way to a better future. From Alaska (with a smaller ecosystem footprint from its farms) to Wyoming (with farm production supporting relatively healthy diets), and California (boasting stronger farmer-to-eater infrastructure) to Maryland (a role model for conservation agriculture), states from sea to sea show strengths.
  • Bright spots: In more good news, we discovered brilliant bright spots, even in states ranking lower in some aspects of our food system scorecard. For example, the Chillinois Young Farmers Coalition is devoted to improving the outlook of farming in Illinois, and Practical Farmers of Iowa has had a big hand in the recent surge of cover crop adoption—and associated conservation benefits—throughout that state.
  • Costly consequences: While our focus was on opportunities, our analysis also exposed some of the dangerous consequences of our current conditions, from climate change contributions to water quality challenges to health outcomes and inequities. It’s also important to note that we ranked states against one another, not against some hypothetical ideal, so even top-ranking states have lots of room for improvement.
  • Data limitations: In several cases, the ideal data we were seeking wasn’t available, because it either simply didn’t exist, or was difficult to access at the scales we needed. To really get a holistic understanding of the food system—one that measures needs and progress—we need more public, accessible, and transparent data.

Fighting for food systems that fare better

If we want a food system that we can all be proud of—one that is healthy and equitable for farmers, laborers, eaters, and the environment—we have a ways to go. Fortunately, however, our new analysis revealed a lot of bright spots worth building on.

With farm bill season in full force, there’s no better time to protect and build up the programs and investments that help make positive change possible. The draft House farm bill, which failed to pass last month, likely would have had a negative impact on food systems across the country due to its utter failure to invest in healthy food access. However, just last week, Senate leaders released their proposal for a bipartisan farm bill, which defends and even boosts many critical initiatives, such as those that support nutrition, regional economies, beginning farmers, and sustainable agriculture research. While it’s clear there’s a lot of work ahead, investments like these can give us confidence that we’re heading in the right direction—so raise your voice and urge your senators to pass a farm bill that brings us one step closer to a food system, from farm to fork, that we can be proud of.

ExxonMobil Refuses to Give Scientists the Floor: Reflections from a Corporate Shareholders’ Meeting

UCS Blog - The Equation

Photo: Mike Mozart/Flickr

It was with great anticipation that I attended the ExxonMobil Shareholders Meeting last month at the invitation of the Union of Concerned Scientists (UCS). My attendance was facilitated via proxy from Mercy Investment Services. In doing so, I joined a multitude of interested parties—some of whom had traveled great distances—to engage ExxonMobil’s CEO Darren Woods in discussions concerning a wide array of topics including, but certainly not limited to, climate change. Alas, none of us (representatives of the Union of Concerned Scientists or others who had come prepared with questions about climate change or environmental issues) were called upon. We were, in fact, studiously avoided.

Thus, sadly, I must admit that when all was said and done, I walked away from the experience with my skepticism of the petrochemical industry giant’s sincerity in addressing climate change in any meaningful way intact. Simply stated, what I heard from Mr. Woods—though masterfully cloaked in symbol-laden rhetoric—came down to one very clear point: ExxonMobil is committed to business as usual.

Yes, Mr. Woods did indeed address the company’s efforts in advancing lubricants for expanding wind facilities; yes, he addressed efforts to advance cutting-edge technologies in carbon sequestration; yes, he addressed lowering emissions from natural gas production; and finally yes, he also addressed furthering the company’s commitment to developing algae-based biofuels of the future. Total projected investment capital for these projects amounts to roughly $8 billion.

Nevertheless, all of this spending was offset (and not in the good way) with a promise of low-cost/high-return investment in oil and gas mega-projects: (1) Offshore oil reserves of Guyana and Brazil, (2) Liquefied natural gas reserves in Mozambique and Papua New Guinea, and (3) Unconventional shale reserves in the Permian Basin of Texas. Total projected investment capital for these projects is roughly $30 billion.

This “dilemma of rhetorical disconnect” was further exemplified throughout the course of Mr. Woods’ remarks to company shareholders and other interested parties.

In specific terms, Mr. Woods began his remarks by stating, “We’re… committed to be a part of the solution in addressing the risk of climate change and other pressing societal challenges.” He continued, “Society has aspirations for economic growth, reliable and affordable energy and environmental protection. We see our role as helping close the gap between what people want and what can be responsibly done. This is what I believe sustainability is all about and frankly, is what we’re all about.”

Mr. Woods then expressed confidence in projections of steady markets for oil and gas through 2040, and the belief that “… meeting the world’s energy need will require trillions of dollars in new investments, even in a two-degree scenario.” At the close of his remarks, Mr. Woods restated ExxonMobil’s commitment, “…to help close the gap between what society wants and what is economically available, using advantaged investments and promising technology. As society’s need [sic] continue to evolve, we’ll continue to respond.”

All of which leads me to the question I would’ve asked Mr. Woods had I been given the opportunity:

In a recently published article with my colleague, Dr. Katharine Hayhoe, we note that should the known fossil fuel-based energy reserves within and around the Arctic Circle be developed, achieving the Paris Climate Agreement’s stated goal of limiting global temperature increase to 2°C or below will be highly unlikely, if not impossible.

As a scientist and a citizen, I worry that ExxonMobil’s conclusion, as stated in the company’s recent “Energy & Carbon Summary Report,” that its upstream recoverable reserves pose “little risk” is false. The science points to heightened risks to human society, including our health, our natural resources, and even our national security, should those Arctic reserves be developed and then used as a primary source for generating energy.

To resolve this problematic finding, ExxonMobil should seriously consider shifting the lion’s share of its capital investment away from exceedingly difficult and expensive endeavors, such as developing Arctic-based fossil fuel resources, and toward meeting the public’s increasing demand for alternative and renewable energy, a shift that by all free-market indicators is taking hold across the globe.

So, given that scenario, as well as ExxonMobil’s existing investment portfolio, my question is this:

(1) Why is ExxonMobil so averse to shifting away from an upstream fossil fuel-based investment paradigm and toward an upstream alternative/renewable-based investment paradigm?

(1a) Why not redirect those monies into the clean energy of the future to better sustain ExxonMobil’s business model as a global energy leader for future generations of stockholders?

(1b) Wouldn’t that resolve the problematic finding in your “Energy & Carbon Summary Report” that the emissions trajectory of ExxonMobil’s “Outlook for Energy” will far exceed the Paris Climate Agreement goal of keeping global temperature increase well below 2°C?

The questions I raised are based on two fundamentally important findings from research on shifts in energy policy. First, three conditions are required for energy policy to shift: (1) energy markets, (2) energy technology, and (3) political willpower. Second, there is an inverse relationship between the advancement of the energy production technology used to develop exceedingly difficult upstream fossil fuel resources and the environmental quality of our collective natural resources: as the former advances, the latter rapidly deteriorates.

Any closing of Mr. Woods’ so-called “gap(s),” then, requires that energy markets, technology, and political willpower align themselves in such a manner that the well below two degrees Celsius (2°C) objective of the Paris Climate Agreement is attainable.

If ExxonMobil is serious about addressing climate change, then I would suggest that Mr. Woods abandon the rhetoric of seeking change and provide greater detail to his vision of a sustainable future. (Note to Mr. Woods: your definition of “… what sustainability is all about” is not even close to being correct). I would also suggest to Mr. Woods, or any other member of ExxonMobil’s Board of Directors, that the only condition left unmet for fully realizing a historic shift in how America powers its economy is that of political willpower.

Or perhaps, what’s missing is dynamic leadership from the private sector, from an industry leader in innovation, that’s willing to take the risk to sustain its business model and the environment for future generations. Clearly, while Mr. Woods suggests that ExxonMobil is articulating some notion of what it views as being responsive to the public’s demand for action on climate change, its actions remain cloaked in rhetorical subterfuge.

Taking risk to address the pressing societal problems of climate change requires bold and dynamic leadership. What are you waiting for, Mr. Woods?


Dr. Robert E. Forbis Jr. is an Assistant Professor of Political Science and Research Associate with the Climate Science Center at Texas Tech University (CSC-TTU). He is a former Research Affiliate with the Center for Advanced Energy Studies at the Idaho National Laboratory (CAES-INL). His research interests primarily concern the policy nexus of environmental protection and energy development. He teaches courses on Public Lands and Resource Management, Climate and Sustainability, Energy and Environmental Policy, Environmental Theory, and Environmental Justice. Dr. Forbis is a recipient of the “Professing Excellence” Award (2014) and “Phi Beta Kappa Honored Professor” (2018) from Texas Tech University.

Supreme Court Ignores Science, Enables Voter Purging, But Data May Have Final Say

UCS Blog - The Equation (text only) -

The Supreme Court, in a narrow 5-4 decision, has upheld a restrictive Ohio election law that initiates a process to purge eligible voters from its voter list if they fail to vote in a single election. A number of other states and localities have also implemented voter list purging tactics, and it is expected that this decision will result in additional states adopting more restrictive voter list purges.

The central question is whether or not the law, which relies on failure to vote as a trigger to remove voters from the voter registration list, violates the 1993 National Voter Registration Act (NVRA), which prohibits removing voters from a list based solely on a failure to vote.

Writing for the majority, Justice Alito interpreted the law’s request to confirm residence as an additional criterion, enough to make this “supplemental process” comply with the basic requirements of the NVRA. In dissent, Justice Breyer held that the law violates the NVRA, which includes a “broad prohibition on removing registrants because of their failure to vote.” He continues:

Ohio’s system adds to its non-voting based identification system a factor that has no tendency to reveal accurately whether the registered voter has changed residences. Nothing plus one is still one.

The factor Breyer refers to is the notification requiring targeted voters to prove their residency. Indeed, voter data analysis revealed that the Ohio system would have disenfranchised over 7,000 eligible voters in 2016. Plaintiffs found that most people throw the verification requests in the trash, such that the method is not a reliable indicator of voter eligibility.

Moreover, I have previously noted that there are far more effective methods for verifying voter eligibility through database matching algorithms. Alito’s opinion effectively ignored all of the science available on the topic, as well as the possibility of implementing voter list maintenance procedures that are less invasive of voting rights.
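
To make concrete the kind of database matching that research points to, here is a minimal, hypothetical sketch in Python: a registrant is flagged as a possible mover only when an independent record, such as postal change-of-address data, matches on identifying fields and shows a new address, rather than being flagged simply for not voting. The record fields and the exact-match rule are simplifying assumptions on my part; real list-maintenance systems rely on richer data and probabilistic matching.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Record:
        first: str
        last: str
        dob: str      # "YYYY-MM-DD"
        address: str

    def normalize(text: str) -> str:
        # Collapse case and whitespace so trivial differences don't block a match.
        return " ".join(text.strip().lower().split())

    def identity_key(record: Record) -> tuple:
        return (normalize(record.first), normalize(record.last), record.dob)

    def likely_movers(voter_roll, change_of_address):
        # Flag a registrant only when an authoritative record matches on
        # name and date of birth AND shows a different address.
        moved = {identity_key(m): m for m in change_of_address}
        flagged = []
        for voter in voter_roll:
            match = moved.get(identity_key(voter))
            if match and normalize(match.address) != normalize(voter.address):
                flagged.append((voter, match))
        return flagged

Under this approach, non-voting never enters the decision; a registrant is touched only when affirmative evidence of a move exists.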

However, the data may still get its day in court, as suggested in Justice Sotomayor’s individual dissent. Sotomayor criticized the majority for “distorting the statutory text to arrive at a conclusion that…contradicts the essential purposes of the statute, ultimately sanctioning the very purging that Congress expressly sought to protect against.”

As Richard Hasen notes at Slate, Justice Sotomayor also laid out a path to directly challenge these discriminatory laws under the Voting Rights Act, and through the ballot box, as more conservative states inevitably adopt similar laws.

Justice Sotomayor focused her dissent on the requirement that any removal program “be uniform, nondiscriminatory, and in compliance with the Voting Rights Act” and illustrated the disparate impact of voter purging on minority, disabled, veteran, and low-income voters. Specifically, she described the inequalities in removal rates between downtown, African-American communities in Cincinnati (10%) and the more affluent, white suburbs of Hamilton County (4%).

Empirical demonstrations of the discriminatory impact of voter purging laws can be used in future litigation to have them thrown out as violations of the Voting Rights Act, even though the impact of such laws was barely considered in the Court’s ruling.

In addition, voters themselves can fight back at the ballot box. By supporting reforms such as Automatic Voter Registration (AVR), which provides greater security over voter lists and improves political participation, voters can ensure that they are protected. Over a dozen states have already adopted AVR, and many more are following. The crucial difference between standard voter registration procedures and AVR is that eligible citizens are automatically registered through interaction with a state agency (typically the department of motor vehicles), and must “opt-out” rather than being required to “opt-in.”

Finally, voters have the opportunity, this November, to support meaningful, effective reforms that protect and strengthen the right to vote. That will require that we help to make sure that people are registered, that they have adequate ballot access, and that they are mobilized. Effective participation in the 2018 elections could very well determine the impact of this divisive, divided decision that the Supreme Court has given us.

The Versatile Test Reactor Debate: Round 2

UCS Blog - All Things Nuclear (text only) -

In mid-February, the House of Representatives passed the “Nuclear Energy Research Infrastructure Act of 2017” (H.R. 4378). It authorizes the secretary of energy to spend nearly $2 billion to build and begin operating a facility called a “versatile, reactor-based fast neutron source” by the end of 2025 “to the maximum extent practicable.” The purpose of the facility would be to provide an intense source of fast neutrons that could be used by startup companies developing fast reactors for power production. Current US power and test reactors do not generate large quantities of fast neutrons.

However, the facility itself would be a fairly large, experimental fast neutron reactor, likely fueled with weapon-usable plutonium, and would pose significant security and safety risks. H.R. 4378 authorizes the Department of Energy (DOE) to construct this facility, now known as the “Versatile Test Reactor” (VTR), without really knowing how much it would cost or how long it would take, let alone whether there was a significant need for it in the first place. In fact, at the time of the bill’s passage in the House, the DOE had not even begun to conduct such an analysis. This is bad public policy.

Even though H.R. 4378 has not yet become law, Congress is already funding the VTR program. The DOE requested $10 million in its Fiscal Year 2018 budget request for preliminary studies of the VTR and $15 million in its Fiscal Year 2019 budget request to begin “preconceptual design development.” But the FY 2018 omnibus budget bill that President Trump signed into law in March provided the DOE with $35 million—$25 million more than the DOE requested. In FY 19, while Senate appropriators would only provide the DOE with its $15 million request, the House has voted to give the VTR $65 million. And Congressman Randy Weber (R-TX), a co-sponsor of H.R. 4378, is pushing to increase the final FY 19 appropriation to $100 million.

It is unclear what DOE could even do with all that money at this early stage of the project.

According to DOE official John Herczeg, the agency only began in April to determine the VTR’s “true cost and schedule” and will not decide whether to build it until after the study is completed in 2021. Congress should stop pressuring the DOE to move forward on construction of the VTR before this analysis is done. And the DOE should use some of the extra money to conduct a nonproliferation and nuclear terrorism impact assessment of the VTR.

In response to my February blog post criticizing the VTR project, the Idaho National Laboratory (INL), the DOE facility where the VTR would likely be housed, issued a “technical rebuttal” containing numerous inaccuracies and misleading assertions.

Since I posted my critique, more information has come out about the VTR project that confirms many of my points. I have issued a reply to INL’s rebuttal. Hopefully, it will shed more light on the substantial risks and questionable benefits of the VTR project.

Will Chevron Show Leadership in Climate Solutions? Notes From the 2018 Shareholders’ Meeting

UCS Blog - The Equation (text only) -

Photo: ArtBrom/Flickr

Last week, I joined the Union of Concerned Scientists at the Chevron shareholders’ meeting in San Ramon, CA. We were there to ask why Chevron leadership, and shareholders, have not pushed for more meaningful action to meet global emissions targets that would keep climate warming well below 2 degrees Celsius.

The security to get into Chevron headquarters in San Ramon was tight – more stringent than typical airport security. In addition to multiple checks of our passes and walks through metal detectors, we were only able to bring in paper and pen, and our papers were shuffled through and inspected on the way in. Once seated, we listened to the presentations by the company’s Chair and CEO and by shareholders advocating proposals on environmental, social, and governance issues. During this time, shareholders followed the Board’s recommendation to reject proposals to “transition to a low carbon business model” and improve lobbying disclosures, among other things.

During much of the meeting, I was scribbling down notes and adapting my prepared statement based upon what I was hearing. I also spent some time staring at this infographic provided in the Chevron Climate Change Resilience report (data from the IEA 2015 World Balance and Final Consumption report):

This diagram highlights the flow of energy through our current fossil fuel-focused energy system; the width of each bar reflects the relative size of that production/consumption budget. The diagram lets you trace the flow of energy to the different areas of our economy that use each source. One remarkable aspect of this data, pointed out in the Climate Change Resilience report, is that “about 25% of global oil consumption is used in personal vehicles” (to see this, follow the bar from “oil” to “transport” and then to “passenger”). This means that every day we drive our personal vehicles, we are making choices about fossil fuel emissions that add up to something very significant. I was struck by this statistic because it underscores something I frequently address in my public talks about climate change: personal, individual action is one piece of the puzzle in solving the climate problem. But there are other pieces of the puzzle, government leadership and corporate accountability, which I address again below.
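
For readers curious about this kind of figure, the short sketch below draws a comparable flow (Sankey) diagram with Python’s plotly library. The flow values are invented for illustration, except that passenger transport is drawn as roughly 25% of total oil consumption, the share quoted above; none of these numbers come from the IEA data behind Chevron’s infographic.

    import plotly.graph_objects as go

    labels = ["Oil", "Transport", "Industry", "Passenger", "Freight"]
    fig = go.Figure(go.Sankey(
        node=dict(label=labels, pad=20, thickness=15),
        link=dict(
            # Each link runs from a source index to a target index in `labels`:
            # Oil->Transport, Oil->Industry, Transport->Passenger, Transport->Freight.
            source=[0, 0, 1, 1],
            target=[1, 2, 3, 4],
            value=[55, 45, 25, 30],  # arbitrary units; Passenger = 25 of 100 total Oil
        ),
    ))
    fig.update_layout(title_text="Illustrative energy flows (not real data)")
    fig.show()

The width of each link is proportional to its value, which is exactly what makes the 25% oil-to-passenger share so easy to spot in the original infographic.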

At the end of the scheduled shareholder proposals, it was time for the lottery of Q&A. Each of us who had a question or statement had to get a numbered ticket; tickets were pulled randomly and there was no guarantee that all questions would be heard. In total, about a dozen people asked questions or made statements to the Chairman. Of these, almost all of them were on three topics: climate change, human rights, and an ongoing lawsuit with the people of Ecuador due to a decades old environmental disaster.

Here was my statement and question when my number was called:

Good morning Mr. Chairman, members of the Board, and Stakeholders. Your recent Climate Change Resilience report was a step toward responding to investor demands that you disclose your plans for operating in a world where global temperature increase is kept well below two degrees Celsius. However, your company emphasizes potential conflicts rather than synergies between climate solutions and other societal goals and dismisses a rapid transformation of our energy system as “unlikely.”

I am a scientist here in Northern California. One of the areas of my research focuses on the impact of rising carbon dioxide concentrations on the changing chemistry of the ocean. I collaborate with businesses along the coast that are deeply concerned about the impacts of rising carbon dioxide on their financial future. Specifically, rising carbon dioxide concentrations threaten a key part of our history, culture and economy of California – sustainable harvests of food from the sea. As a scientist, I understand the grave risks we are facing without deep reductions in emissions and know that swift action is precisely what is needed to avoid the worst effects of climate change.

You stated this morning, and you describe in the Climate Resilience Report, that a first principle that guides your views on climate change is that “reducing greenhouse gas emissions is a global issue that requires global engagement and action”. Yet, in this report you bet against our ability to tackle meaningful energy transformation. When will Chevron show greater ambition to keep global warming below 2 degrees C?

In his answer, Chair and CEO Michael Wirth was respectful, and thanked me for my work in the scientific community. He explained that the company simply “meets the demands of energy used by people around the world,” and that it does “look at low carbon scenarios” as part of its business plan. However, Mr. Wirth argued that global policies are needed – ones that would require government intervention – and that it isn’t the role of individual companies to make decisions on this matter. This was an interesting answer because it spelled out something that Chevron doesn’t say directly in its public report: the company isn’t planning to take leadership on climate change until governments lead the way. That is hard to imagine, since fossil fuel companies spend millions every year lobbying our government to support policies that promote the use of oil and gas.

Why does this matter – and why would a climate scientist attend a Chevron shareholders’ meeting? I pondered this quite a bit when I was asked to join the UCS team for the meeting that day. For me, the decision came down to three things. First, because I am asking Chevron to use the best available science to make decisions for our future. Was I being an “advocate”? Yes – I am advocating for the use of science in decision making. Second, because I have made a commitment to not just communicate with those who already agree with me. We need to be able to put ourselves in situations where we work to find common ground and similar values with people in many different communities. Finally, as I’ve discussed above, I think individual responsibility is one aspect of the problem: people need to feel emboldened to make their own decisions that place our planet on a better path. But individuals can’t solve this problem alone; corporate accountability is important here. We need to be asking more of corporations that contribute significantly to our greenhouse gas burden. If they contribute significantly to the problem, they should be contributing significantly to the solution.


Dr. Tessa Hill is a Professor and Chancellor’s Fellow at University of California, Davis, in the Department of Earth & Planetary Sciences. She is resident at UC Davis Bodega Marine Laboratory, a research station on the Northern California Coast. She is part of the Bodega Ocean Acidification Research (BOAR) group at Bodega Marine Laboratory, which aims to understand the impact of ocean acidification on marine species. Tessa leads an industry-academic partnership to understand the consequences of ocean acidification on shellfish farmers. Tessa is a Fellow of the California Academy of Sciences, a AAAS Leshner Public Engagement Fellow, and a recipient of the Presidential Early Career Award for Scientists & Engineers (PECASE).

Our Latest Automaker Rankings: What The Industry Needs to do to Keep Moving Forward

UCS Blog - The Equation (text only) -

Every few years, UCS takes a look at the auto industry’s emission reduction progress as part of our Automaker Rankings series of reports. This year’s analysis, based on model year (MY) 2017 vehicles, shows that the industry has once again reached its lowest levels yet of both smog-forming and global warming emissions from new vehicles, despite the fact that many off-the-shelf technologies are deployed in less than one-third of all new vehicles. Unfortunately, this record-setting progress also shows signs of slowing, with Ford and Hyundai-Kia making no progress toward reducing global warming emissions, and Toyota actually moving backwards.

At the same time, the industry spearheaded an effort to re-litigate fuel economy and emissions standards set through 2025, and this report comes out while an administration proposal responding to that request (one that would completely halt the industry’s progress at 2020 levels) sits awaiting public release. Therefore, while this year’s Automaker Rankings highlights some of the progress made by leaders in the industry on the technology front, it’s also critical that on the political front these companies stand up to the administration to ensure the rest of the industry continues to move forward on reducing emissions.

The technology to meet future standards is out there

For me, one of the key takeaways from this report is that while standards have in many cases accelerated the deployment of new technologies, some of the most cost-effective strategies to reduce emissions are still sitting on the shelf. The industry’s progress to date is barely a glimpse of where gasoline-powered vehicles could be in the future, as shown in the figure below.

While vehicle standards have led to significant growth in a number of technologies, even many of the most cost-effective technologies to lower emissions have been deployed in only a small fraction of the fleet, leaving plenty of room for further reductions.

On top of this, many of the deployed technologies, like advanced transmissions, still have significant incremental progress to be made. We’re also seeing novel developments in other technologies like start-stop, where in 2018 we are beginning to see the deployment of higher-voltage (48V) systems that enable complementary technology such as electric boost, again pushing out the horizon for combustion engine improvements. For this and many other reasons, it’s baffling to see the industry assert that meeting 2025 vehicle standards requires widespread vehicle electrification.

No more Greenest Automaker

Of course, electric vehicles are one of the reasons for a key difference in this year’s report: we are now including the results of all automakers, not just the largest companies that sell vehicles of all sizes and types. A lot of the development of technologies that could pave the way to a lower-emissions future is coming from some of the smallest manufacturers, whether that’s Tesla’s all-electric fleet or Mazda’s SkyActiv-X spark-assisted compression ignition engine, which looks to bring diesel-like operation to a gasoline engine. Ignoring this leadership from smaller automakers would be ignoring some of the most forward-looking technology deployment in the industry.

Additionally, it’s important to recognize that this report is limited to the emissions of the vehicles sold by manufacturers—it does not consider other aspects of operations which also affect the sustainability and “greenness” of a company, whether that’s related to water use at its facilities, renewable power sourcing, or other aspects of the manufacture and distribution of a manufacturer’s fleet.

Considering these two central limitations, we have decided to no longer award a “Greenest Automaker.” It’s important to recognize the wide gap between the emissions of Honda’s fleet (Honda has again asserted its leadership by providing the lowest-emission fleet among full-line automakers) and that of Fiat Chrysler, which finds itself producing a fleet better only than those of McLaren, Ferrari, and Aston Martin, automakers that produce only exotic sports cars meant more for a track than a highway. But that is only part of the story.

The gap between leaders and laggards is huge and pervades all vehicle classes

One of the reasons we have previously ignored small manufacturers is that they provide a narrow spectrum of vehicles—and it’s been a historic complaint from companies like Ford that they should get a pass because people want big trucks. But one of the key findings from this year is that the Detroit Three fall to the bottom of the pack not because they sell big trucks, but because in virtually all classes of vehicles they sell, their cars and trucks emit more than the rest of the industry.  And the reverse is true for a company like Honda.

Honda is the manufacturer with the lowest emissions because it invests broadly in improvements across its entire fleet. Similarly, the Detroit Three don’t perform poorly because they sell a lot of trucks—they perform poorly because their vehicles emit more than the industry average in most classes of vehicle.

The only company whose ranking is significantly affected by the mix of vehicles they sell is Toyota—but that was an intentional decision on their part.  They chose to boost production of their least efficient vehicles, trucks and SUVs, while at the same time bypassing investment in improving those vehicles.  If they want to catapult back to the top of the pack, they’ll need more than the Prius to make them look like a leader—it’s about providing consumers lower emission choices across the entire spectrum of vehicles sold.

A path forward

With every Automaker Rankings, we try to provide the industry with a path forward. And the truth is, the engineers at these companies have been working their butts off to provide a bright future for the industry…should they choose to embrace it.

Manufacturers have made a number of pronouncements about the vehicles planned over the next five years which could easily end up keeping emissions levels on the path envisioned under the 2025 standards now on the books. And we have tried to highlight the role these vehicles can play in creating a more sustainable transportation future.

But too many within the industry have been looking to ignore their role in getting to this low-emissions future, so the question remains:  Will the industry accelerate toward a cleaner future by following their engineers, or continue to deploy their lobbyists to slam on the brakes?

It’s Time to Implement Stronger Autonomous Vehicle Testing Standards

UCS Blog - The Equation (text only) -

Photo: Grendelkhan/Wikimedia Commons

The widespread introduction of autonomous vehicles could potentially bring about many benefits – advocates argue they will reduce traffic, the burden of driving, and, should the cars be electrified, emissions. They could also improve access for children, the elderly, or people with disabilities – but the most important benefit is improved safety.

U.S. road fatalities increased 5.6 percent from 2015 to 2016 – a disturbing trend, as this is the largest increase in the last decade. Proponents of self-driving technology will tell you that the cars will help to slash these numbers significantly because the human driver is taken out of the equation. According to the National Highway Traffic Safety Administration, there were 5,987 pedestrian fatalities in 2016 – the highest number since 1990 – and 847 bicyclist fatalities, the highest since 1991. Although fatalities from distractions and drowsiness went down 2.2 and 3.5 percent, respectively, they were offset by increases in other reckless behaviors: speeding-related fatalities rose 4 percent, alcohol-impairment fatalities rose 1.7 percent, and unbelted fatalities rose 4.6 percent.

Autonomous vehicles are being tested in several states and provinces, such as California, Pennsylvania, and Ontario. The graphic below shows the status of autonomous vehicle testing laws across the country – 25 of 50 states have passed laws overseeing testing. Uber and Waymo have taken the lead in testing – Waymo has logged over 5 million miles, and Uber, although far behind, has logged a significant 2 million miles itself. California has been working with testing companies under a regulatory framework, while states like Arizona have given companies free rein to test the vehicles on public roads, with a backup human in the driver’s seat to compensate for any failures in the software. However, what happens if the driver gets distracted and loses focus? Or when the autonomous system doesn’t have a sufficient way of warning the driver that they need to take over?

Current Status of State Laws on Self-Driving Cars
Source: Brookings Institution and the National Conference of State Legislatures.

The NTSB presents its findings

According to a preliminary report released by the National Transportation Safety Board (NTSB), that is exactly what happened when an Uber self-driving system controlling a Volvo XC90 killed a pedestrian walking a bicycle across the street in Tempe, Arizona on March 18. The initial reaction of the chief of the Tempe police on March 19 was that Uber was likely ‘not at fault’ for the incident after viewing the vehicle’s own video of the event. After a more thorough investigation, however, the NTSB report states that the Uber system “registered…observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path.” The Volvo XC90 had its own emergency braking system, but that system was disabled whenever the Uber self-driving system was controlling the vehicle, to “reduce the potential for erratic behavior.” Emergency braking could have prevented the crash or reduced its severity: per the NTSB, the self-driving system determined 1.3 seconds before impact that an emergency braking maneuver was needed, yet the vehicle was not configured to brake on its own or to alert the operator. The driver appeared to have been distracted by the computer interface and did not see the pedestrian step into the street. By the time the driver looked up, saw the pedestrian, and pressed the brake, it was too late. The rough arithmetic below the figure puts these timeline numbers in perspective.

View of the self-driving system data playback at about 1.3 seconds before impact.
Source: National Transportation Safety Board.
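
Some rough arithmetic, using only the speed and times quoted above, shows why those seconds matter. The 0.7 g deceleration below is my assumption for a hard emergency stop on dry pavement, not a figure from the NTSB report; the speed (43 mph) and the 6.0- and 1.3-second marks come from the report.

    MPH_TO_MPS = 0.44704
    G = 9.81

    speed = 43 * MPH_TO_MPS          # ~19.2 m/s, from the NTSB report
    d_first_detection = speed * 6.0  # distance covered in the 6 s after first detection
    d_braking_window = speed * 1.3   # distance covered in the final 1.3 s

    decel = 0.7 * G                  # assumed emergency deceleration (not from the report)
    d_full_stop = speed ** 2 / (2 * decel)

    print(f"speed: {speed:.1f} m/s")
    print(f"traveled in the last 6.0 s: {d_first_detection:.0f} m")     # ~115 m
    print(f"traveled in the last 1.3 s: {d_braking_window:.0f} m")      # ~25 m
    print(f"braking distance to a stop at 0.7 g: {d_full_stop:.0f} m")  # ~27 m

Under these assumptions, the system first registered the pedestrian roughly 115 meters out, ample room to stop, while braking at the 1.3-second mark (about 25 meters out, against a roughly 27-meter stopping distance) could not quite have avoided impact but would have shed most of the vehicle’s speed, consistent with the report’s framing of a prevented or less severe crash.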

Safety advocates across the spectrum have cautioned lawmakers about the rapid pace of testing, saying that it is too soon to have these vehicles on public roadways, interacting with pedestrians and bicyclists. Moreover, reports suggest that Uber’s self-driving system was struggling to navigate public streets, with drivers needing to intervene and take control from the automated system once every 13 miles, compared to more than 5,000 miles between interventions for the Waymo systems being tested in California. Real-world testing on public roads is clearly needed to test and improve self-driving technology, but it must only be done once public safety can be assured.

Congress is pushing federal legislation too quickly

This fatal crash is a stark reminder of the risks involved in racing to bring automated driving technology to market without adequate oversight. Senator John Thune, the Republican Chairman of the Senate Committee on Commerce, Science, and Transportation, remarked that “the [tragedy underscores the need for Congress to] update rules, direct manufacturers to address safety requirements, and enhance technical expertise of regulators.” Senator Gary Peters also chimed in, saying that “Congress must move quickly to enhance oversight of self-driving vehicles by updating federal safety rules and ensuring regulators have the right tools and resources to oversee the safe testing and deployment of these emerging technologies.”

Yet while state and local governments grapple with responses to this tragedy, the architects of the Senate self-driving bill are renewing their push to get it passed through Congress. The Detroit News reported that Peters and Thune are still attempting to win support from reluctant senators. The bipartisan duo is also considering attaching the measure to another bill with better prospects for a full vote, or passing it as a standalone bill.

This push concerns us as we question whether the AV START Act is the right vehicle to meet those aims. The bill would allow hundreds of thousands more autonomous vehicles on our roads, with lax oversight, and would pre-empt the great work that state and local governments are doing to regulate AV testing in their jurisdictions.

Safety of all users of the road must be the top priority

In our policy brief “Maximizing the Benefits of Self-Driving Vehicles,” UCS advocates that “rigorous testing and regulatory oversight of vehicle programming are essential to ensure that self-driving vehicles protect both their occupants and those outside the vehicle.” In October 2017, UCS expressed its concerns about the lack of scientifically based safeguards in the Senate’s AV START bill. Already, cities and states are having discussions on how to regulate AVs more strictly. Pittsburgh Mayor Bill Peduto planned to ask representatives from the AV industry to agree to a 25-mph limit on city roads. “Pittsburgh should have a very strong voice in whatever Pennsylvania should decide to do,” Peduto told reporters Tuesday. “These are our streets. They belong to the people of the city of Pittsburgh and the people of the city of Pittsburgh should be able to have certain criteria that shows them that safety is being taken first.” However, the city has limited authority to regulate vehicles on its streets. California is taking a different tack: its Public Utilities Commission recently released guidelines that will allow AVs to pick up passengers – as long as the company has held an autonomous vehicle testing permit from the DMV for at least 90 days, agrees not to charge for the ride, and files regular reports including the number of miles its self-driving vehicles travel, the rides they complete, and the disabled passengers they serve.

Uber and other companies will have to reassess their procedures for AV road testing and states will have to re-evaluate how freely they allow self-driving cars to be tested on their roads. Furthermore, municipal governments need to be at the table working with companies to develop robust safety standards. We need to ensure at all levels of government that adequate, sound safeguards are implemented, so that autonomous vehicles can truly achieve the safety benefits they are expected to have.


Update on the Low-Yield Trident Warhead: Time for the Senate to Step Up

UCS Blog - All Things Nuclear (text only) -

A couple of weeks ago, we noted that the Senate Armed Services Committee was about to get its chance to consider the National Defense Authorization Act (NDAA), which in its current form includes $88 million in funding for a new, lower-yield warhead for the Trident D5 submarine-launched ballistic missile (SLBM), designated the W76-2. At the time, the House Armed Services Committee had voted, along party lines, to reject an amendment that would have eliminated funding for the new warhead.

An unarmed Trident II D5 missile launches from the Ohio-class ballistic missile submarine USS Nebraska. These missiles currently carry W76 and W88 warheads with yields of 100 kilotons and 455 kilotons, respectively. The proposed W76-2 warhead would reportedly have a yield in the range of 6.5 kilotons.

Plans for the new lower-yield warhead have drawn criticism from many quarters, including a number of prominent former officials and military leaders, who collectively sent a letter to Congress asking that it not fund the program. The letter, signed by former defense secretary William Perry, former secretary of state George Shultz, former vice chairman of the Joint Chiefs of Staff Gen. James Cartwright (Ret.), and former head of the National Defense University Lt. Gen. Robert Gard (Ret.), among others, calls the new warhead “dangerous, unjustified, and redundant.” The signers say that “the greatest concern…is that the president might feel less restrained about using [the new warhead] in a crisis.” They call on Congress to exercise its authority to reject the administration’s request for funding and thereby “stop the administration from starting down this slippery slope to nuclear war.”

Since our previous post, the full House passed its version of the NDAA on May 24, in the process rejecting, in a surprisingly close but mostly partisan vote, an amendment sponsored by Rep. John Garamendi (D-CA) and Rep. Earl Blumenauer (D-OR) that proposed fencing half of the funding for the low-yield warhead. This funding would have been withheld until the secretary of defense submitted a report to Congress assessing the effect of the new warhead on strategic stability, ways to reduce the risk of miscalculation associated with it, and how to preserve the survivability of the subs that would carry it, should it ever be launched. The House rejected the amendment by a vote of 226 to 188, largely along party lines, with seven Democrats voting against it and five Republicans voting for it.

The Senate version of the NDAA is expected to be released any time now, and could go to the floor as early as this week. One point of debate is likely to be over a House provision that eliminates a 2004 requirement that the Secretary of Energy must get Congressional approval before developing a new low-yield nuclear weapon.

Success, however, will require Republican support as well. The 2004 limitation on low-yield weapons was approved on a bipartisan basis. And in 2005 Congress did find bipartisan backing to reject a Bush administration proposal for a new bunker-busting nuclear bomb. Senators should take a lesson from these and other past decisions that looked beyond knee-jerk party allegiance. They should listen to the experts—from both parties—who argue based on their long involvement with these questions that low-yield nuclear weapons are both dangerous and unnecessary, and they should base their votes on what is best for the security of the country as a whole. That security does not require a new lower-yield Trident warhead; in fact, just the opposite: it requires that the Senate oppose it.


Why NASA’s Carbon Monitoring System Matters

UCS Blog - The Equation (text only) -

Future funding for NASA’s Carbon Monitoring System (CMS) was recently cancelled, leaving the important program in jeopardy unless Congress takes action soon.

Why is the Carbon Monitoring System important?

The CMS complements other NASA carbon monitoring work by taking Earth science observations and doing research with them that improves understanding of how carbon flows through our land, biosphere, waterbodies, ocean, and atmosphere. As a result, the CMS funds work that allows for better management of these natural resources and better understanding of how they will respond to global changes, such as climate and land use change.

The research that comes out of the CMS has real world benefits for communities across the United States, as one of its objectives is to meet stakeholder needs across a range of scales. For example, NASA’s Carbon Monitoring System recently funded work that provided fine-resolution information on a vast swath of forest stocks in interior Alaska. This project made use of NASA airborne LiDAR technology to provide information on forest resources within the state that were previously understudied due to their remote location. The region covered in the analysis accounts for 15% of US forested land.

Other CMS projects include research to:

  • Improve measurements of carbon storage in the U.S. Corn Belt, which stands to enhance measurements of agricultural productivity and management practices.
  • Advance understanding of the Great Lakes, and how carbon is stored in them.
  • Improve understanding of how land use decisions and changes in climate such as extreme weather events affect nutrient cycling and water quality in the Gulf of Mexico.

Through these projects and others, CMS is helping stitch together observations of carbon sources and sinks to produce a high resolution representation of how carbon flows through our planet.

What Congress can do to ensure the survival of the Carbon Monitoring System 

In May, the House Commerce, Justice, and Science subcommittee approved language that would restore funding for the CMS in a bill that provides funding for NASA in 2019. The amendment provides $10 million to the program, now under a slightly different name (the Climate Monitoring System), specifying that the funds would “…help develop the capabilities necessary for monitoring, reporting, and verification of biogeochemical processes to better understand the major factors driving short and long-term climate change.” Given that this is virtually identical to the mission of the current CMS, it seems clear the intent of this amendment is to continue funding the existing program.

As the Senate now turns to mark-up the FY-19 Commerce, Justice, and Science appropriations bill, they will have the opportunity to ensure that funding for this important initiative continues. An investment in the CMS would ensure that NASA has the funds available to study its carbon measurements and make them usable for decision makers, resource managers, and communities across the country.


Debriefing the EPA’s Science Advisory Board Meeting

UCS Blog - The Equation (text only) -

I spent most of Thursday and Friday this week at the EPA’s Science Advisory Board meeting in Washington, DC, as the 44 members gathered to discuss EPA’s regulatory agenda and hear updates from EPA programs on lead, per- and polyfluoroalkyl substances (PFAS), and the Integrated Risk Information System (IRIS). As I explained earlier this week, it was the first meeting for 18 of the members who had been appointed after Administrator Pruitt issued his directive barring EPA-funded scientists from serving on the committee. Much of the meeting turned out to be an exercise in reaching consensus in a group of over 40 people on a select few decisions (you can follow my twitter thread for more detail here), but there were some important things that came out of the two half-days.

Overwhelming public support for more review of scientific underpinning of EPA regulations

My colleague, Dr. Dave Cooke, during the public comment period.

During the public comment period, 21 speakers provided comments on the EPA SAB workgroup’s memos on the EPA’s regulatory agenda. Public comment periods at these meetings don’t usually last a full hour, but scientists and experts, in person and on the phone, clearly wanted to express their support for the SAB’s review of several rulemakings in process that would effectively roll back science-based regulations on vehicle and power plant emissions. Many of the oral comments echoed the concerns that the SAB workgroup raised in its assessment of the glider vehicles proposal, namely that EPA had not undertaken an assessment of the rule’s emissions impacts (and should have), and that the technical information relied upon in the proposal was both at odds with EPA’s own tests and had since been withdrawn by the body that conducted the research. Additionally, and notably, nine of the oral commenters (including myself) from different fields were in strong support of SAB review of the “Strengthening Transparency in Regulatory Science” proposed rule.

Consensus from SAB on general lack of analysis supporting EPA’s regulatory decisions

After the comments finished up, the SAB moved on to discuss the recommendations of the workgroup (here and here) and to come to a consensus on the advice that would be contained in a letter to the administrator coming out of this meeting. At first, some of the newer members advocated that the SAB postpone review of several of the regulatory actions the workgroup had flagged as meriting review until more information was provided by the EPA. In fact, SAB member Dr. Christopher Frey told Politico on Thursday: “Basically they just didn’t provide us with any answers. That kind of put us in a position where all we can really do is say EPA has not identified the science or any plan to review it, and clearly there are science issues that are in the proposed rule.” Luckily, after much conversation, there was an acknowledgement, even from the newest members, that it is better to agree to review and find out later that the scope of the review can be narrowed than to simply kick the can down the road and hope for better information from EPA. Thus, the full SAB was able to vote in favor of recommending to Administrator Pruitt that they review all five deregulatory actions identified by the workgroup as requiring scientific review.

Agreement that Pruitt’s restricted science proposal warrants review

Then the committee moved on to the question of whether to review the EPA’s April proposed rulemaking on transparency in regulatory science. From the outset, all members seemed to agree with the workgroup’s recommendation that it merited review. Dr. Honeycutt even justified the need for SAB review by pointing to the sheer number of questions (27) that the EPA posed in such a short proposed rule. Stanley Young was the only member to show support for the rule, arguing that “mischief has been done” in the past with “studies hiding behind data,” calling out the Six Cities study as an example. This is a common talking point used by Administrator Pruitt and others when discussing so-called transparency; however, it’s easily countered: after all of the controversy around the study, the Health Effects Institute reanalyzed the data and confirmed its findings. The majority of members were supportive of SAB review of this policy and ultimately voted unanimously in favor of recommending that Pruitt charge them with that duty.

Calls for delay of SAB review come from Pruitt-appointed members

The sentence proposed by committee member Dr. Steven Hamburg on the restricted science proposal. It will likely be edited during the drafting of the SAB’s letter to Administrator Pruitt.

The perhaps more contentious piece of this conversation happened on Friday, when the SAB had time to consider exactly what it would ask of Administrator Pruitt regarding this rule. The question became: would they just ask to review the rule, or would they also request that the agency defer all action on the rule (that is, refrain from moving any further in the rulemaking process) until SAB’s review was complete? Interestingly, only new members disagreed with asking for agency deferral, and Dr. Kimberly White of the American Chemistry Council commented that she thought SAB review shouldn’t begin until the rule reaches the final rule stage. It’s important to note that the American Chemistry Council has lobbied on similar legislation in Congress (the HONEST Act). Not only is it supportive of the rule, but its member companies stand to gain financially from a rule that would limit the agency’s ability to use independent science to implement strong standards on chemicals that are environmental contaminants. Thus, Dr. White’s interest in delaying SAB review until it’s too late is right in line with her employer’s agenda of getting this rule finalized and implemented as soon as possible.

Another reason for delaying SAB review of regulatory actions is to wait until the makeup of the committee changes again this fall. Some 15 committee members’ terms will end at the end of September 2018, and while 8 of them have served only one term and could be reappointed, it is likely that Pruitt will instead appoint all new members. The final 11 members who were appointed under the Obama administration have terms expiring in September 2019, 8 of whom have served only one term. By the end of 2019, it is possible that Pruitt could have a hand-selected SAB, and so far, Pruitt appointees appear more interested in delaying SAB review and allowing EPA actions to get farther into the rulemaking process before the SAB weighs in. But the SAB’s role is to be involved in EPA’s deliberative process and to provide advice early enough in the rulemaking process that it can actually influence the science considered by the agency. Advice received after a rule is already finalized is useless.

The ball will soon be in Pruitt’s court

At the very end of the meeting, the SAB agreed that Dr. Honeycutt and the Designated Federal Officer would draft a letter to Administrator Pruitt that would be sent to members for comment. This letter will then be sent to the Administrator’s office, and once there, there is no requirement that he respond within any window of time and no mandate that he follow the SAB’s advice. He could agree with the SAB and charge them with reviewing EPA actions within a matter of months; he could do the same but have them work over the course of a year (by which time several of the rules under review might already be finalized); he could disagree with their recommendations and give them no charge, or a different charge altogether; or he could ignore them completely. It’s hard to foresee what he will do: while he seems uninterested in scientific backing for his deregulatory agenda and never responded to the SAB letter sent in September 2017, he took quite an interest in EPA’s federal advisory committees when he issued a directive in October that changed the composition of many of them.

It would be in the public’s best interest for the SAB to have the opportunity to review these EPA regulatory actions so that there is at least some public record of scientific input and peer review feeding into the rulemaking process that has been entirely lacking at the EPA under the Trump administration. Administrator Pruitt should listen to his advisors on these issues, charging them with immediate review of these potentially destructive policies. Otherwise, the message he’ll be sending is that he can’t handle the truth: best available science will not support his deregulatory agenda.


Who’s Interested in the Trump Coal Bailout?

UCS Blog - The Equation (text only) -

Old coal plant. Photo: San Juan Citizens Alliance/EcoFlight

Changing technology, from low cost wind and solar, to pollution control added to coal plants, to fracking for natural gas, has created a new debate about where we should get our electricity. The debate has reached new levels with an order today from the White House to protect coal and nuclear plants from competition, even where the plants voluntarily agreed to participate in competitive markets.

This isn’t the first time the debate over grid reliability has included the Department of Defense (DOD), and despite what the White House says, the regional grid operators are still maintaining reliability.

The DOD is the largest single buyer of energy on the planet. They know a bit about fuel security, and they spend a lot of money on day-to-day energy needs. In today’s announcements and justifications claiming that national defense depends on old coal and nuclear plants, there isn’t any recognition of what the DOD has been saying and doing.

Grid support from offshore wind is real in New England. Photo: M. Jacobs

James Mattis, Trump’s Secretary of Defense, is credited with the energy strategy for war fighters to “unleash us from the tether of fuel.” In this regard, there is an ongoing effort to build renewable energy and energy-storage facilities on US military bases. Because almost every power outage in the US is caused by problems with the wires, the DOD is interested in power supplies located on base and not dependent on fuel. Wind and solar have been welcomed by the DOD.

The DOD has also warned against “not appropriately valuing the continuing contribution of renewable resources in meeting requirements.” This was in the context of a debate at PJM (the grid operator for 13 states in the Mid-Atlantic and Ohio Valley) over the need for more fuel assurance in winter weather.

Meanwhile, the operators of grids in the US are not interested in the Trump Administration’s interference with energy markets. Comments filed at FERC by the federally regulated independent grid system operators were uniformly hostile when Secretary Perry ordered a rulemaking last year to support plants with a 90-day fuel supply. This is true for grid operators that have few or no coal plants running on their systems. It is also true of PJM, which put out a statement that it is uninterested in today’s recycling of the Administration’s coal plant proposal.

Earlier today PJM, the grid operator with one of the least supportive environments for wind and solar, said that it is doing just fine without the current proposal:

“Our analysis of the recently announced planned deactivations of certain nuclear plants has determined that there is no immediate threat to system reliability,” the operator stated. “Markets have helped to establish a reliable grid with historically low prices. Any federal intervention in the market to order customers to buy electricity from specific power plants would be damaging to the markets and therefore costly to consumers.”

(Credit Julia Pyper, Jeff St. John at Greentech Media.)


Given that the organizations most relevant to this issue do not see it as helping, the question has to be, “Why are we going through this again?” There isn’t even a claim of some long-term benefit to justify raising costs for all consumers, trashing markets and the decisions made to invest in them, and polluting our air, land, and water with coal ash, waste heat, and carbon emissions. This proposal does not make America safer, does not advance the technologies we need, and does not provide a path to cost reductions, cleaner environments, or even a better-supplied military.

Once we get the Administration’s ideas for implementation (yes, they’re not ready with that), we can compare them to last year’s proposed rule, which would have had disastrous impacts on human health, cost consumers billions, and hurt competitive energy markets, all while doing nothing to improve, and possibly even impeding, grid reliability and resiliency.

Stay tuned.
