Combined UCS Blogs

Our Science for Public Good Project: Hosting a Holiday Air and Water Quality Party

UCS Blog - The Equation (text only) -

Photo: Anna Scott

Nothing says ‘happy holidays’ like environmental justice, so the three of us co-hosted a holiday party in West Baltimore to talk about a recent lead water testing campaign and an upcoming air quality monitoring campaign called Baltimore Open Air. Anna is a graduate student studying climate science. Jennifer is an organizer with Clean Water Action, a grassroots environmental organization focused on water and air quality, climate change, and environmental justice. And Nabeehah works for a grassroots community organization called Communities United in West Baltimore, which addresses trauma and builds resiliency. We know each other from Baltimore’s People’s Climate Movement table, and were excited about receiving a grant from the Science for Public Good fund.

We decided to highlight key environmental justice challenges that Baltimore neighborhoods face. Rates of lead poisoning are high, especially among children. Much of the risk is from lead paint, still present in many homes throughout the city. Water is a concern too: more than ten years ago, water fountains in all Baltimore Public Schools were shut off after water repeatedly failed to meet safe lead standards. They still haven’t been turned back on. Air pollution is likewise a major health threat: in 2013, the asthma hospitalization rate in Baltimore City was 2.3 times higher than the average rate for Maryland, driven by nearby coal plants, trash incinerators, and highways. We’re each involved in monitoring and advocacy campaigns to clean up Baltimore’s water and air, and wanted to share information and ways for people to get involved.

Coalition partners in West Baltimore were invited to attend, and to share the event with their members. Nabeehah went door-to-door in the surrounding community to tell residents about water testing and air quality monitoring, and invited residents to come to our event to learn more. Anna researched answers to questions about the health impacts of lead, water contaminants, and air pollution, and prepared information on her study of local air quality using citizen science and affordable monitors. Jennifer found a local caterer to serve food, and shared information about local campaigns against big polluters and her organization’s study of lead drinking water pipes in Baltimore. (You can see the presentation we put together here.) And we all worked together to write questions and answers for a fun game of Environmental Justice Jeopardy. About 50 people from West Baltimore attended the party and learned more about what local organizations are doing to fight for clean air and water in the community.

Does this sound like something you’re interested in doing, but don’t know where to start?

First off, it’s critical to partner with a local group working in the community. What community members in West Baltimore tell Nabeehah and her colleagues is that they have been “surveyed to death.” They have been offered help that never came. Residents see that their community is receiving grants and funding, but they can’t account for what it was spent on. These experiences have led people to be wary of even well-intentioned organizers, psychologists, scientists, and others who start working in their community—particularly when it hits the news due to a traumatic event—without building relationships first.

Seeing this happen over and over makes communities feel used and taken advantage of. The best way to bring science to communities is to start with building relationships and trust by finding organizations that are already working there.

To find those organizations, start being present in the community. Is there a community association meeting coming up? See if you can attend just to listen and learn about what’s happening in the neighborhood. Have you heard about a campaign to address problems that residents face? Follow the news, see who is leading those efforts, and get in touch. Finally, if you are connected with any fellow scientists working on Community-Based Participatory Research or other community efforts, ask them how they got started.

This collaboration was an excellent experience because it helped us develop an understanding of how these core principles directly correlate to science: just as scientists must maintain an open mind, exhaust every possibility, and follow data where it leads, organizers and others pursuing social change must work to invite and involve everyone in a community, practice the skills of listening before leaping to conclusions, attack all angles of injustice, and commit to continuous self-transformation as we change both our society and ourselves.

Anna Scott is a graduate student studying climate science. Jennifer Kunze is an organizer with Clean Water Action, a grassroots environmental organization focused on water and air quality, climate change, and environmental justice. Nabeehah Azeez works for a grassroots community organization called Communities United in West Baltimore, which addresses trauma and builds resiliency.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.


The Truth about Coal, in Under Three Minutes


Photo: CSIRO/Wikimedia Commons

Coal’s been on the way out for a while now. Why is that? For a quick and accessible look at the state of the coal industry—where it’s been and where it’s going—check out the new video from the Union of Concerned Scientists.

We’ve been writing about coal’s decline and its implications—and setting the record straight on misinformation about coal—for quite some time: responding to the administration’s cheerleading of coal, assessing and understanding the shift away from coal-fired electricity, explaining why the transition away from coal is hard on workers and communities, and showing how intrepid business leaders in Coal Country are leading the way to new economic opportunities.

We kicked off a blog series recently with an explanation of why a slight increase in coal jobs in 2017 is no indication of a long-term trend. In this second Coal in Context blog, I’d like to highlight a short video on the reasons behind coal’s decline.


The need for a just transition

Here at UCS, we value facts and evidence—and we’re doing our part to set the record straight in a time of great uncertainty. It’s important to emphasize that the coal industry is not returning to its heyday and will instead continue to decline, despite what you may hear from administration officials and the president himself. That propaganda is dangerous because it leads to false hope—leading some to refuse training opportunities in other industries, hoping that coal mining jobs materialize.

Let’s take the longer-term view and understand that coal communities will need to develop new economic sectors to support good-paying jobs in the future—and that it is our collective responsibility to invest in those communities—through proposals like the RECLAIM Act and the POWER Initiative—so they can succeed.

We also need to take the longer-term view on the power sector as a whole—to address the urgent threat of climate change. Yet the administration continues its efforts to rescind the Clean Power Plan, something I testified against back in November in West Virginia. Please join our efforts to push back on these misguided actions—and share the video with your friends to help spread the truth.

ExxonMobil’s Climate Disinformation Campaign is Still Alive and Well


An ExxonMobil-funded senator from Oklahoma, James Inhofe, cited a debunked ExxonMobil-funded study at a recent Senate hearing. C-SPAN

In a recent blog post, ExxonMobil executive Suzanne McCarron reiterated her company’s claim that it fully accepts the reality of climate change and that it wants to do something about it.

“I want to use this opportunity to be 100 percent clear about where we stand on climate change,” she wrote. “We believe the risk of climate change is real and we are committed to being part of the solution.”

So why is the company still a part of—in fact, a major part of—the problem?

An exchange between Oklahoma Sen. James Inhofe and Environmental Protection Agency (EPA) Administrator Scott Pruitt during a recent oversight hearing is a case in point, providing a window into how ExxonMobil’s undue influence continues to block climate action.

During the January 30 hearing, which was held by the Senate Environment and Public Works (EPW) Committee, Sen. Cory Booker inadvertently provoked Inhofe by raising the issue of environmental justice. The New Jersey Democrat cited the threat climate change-induced flooding poses to three dozen Superfund sites in his state and asked Pruitt if he was “taking into account the environmental burdens disproportionately impacting communities of color, indigenous communities and low-income communities.”

Inhofe seized the opportunity to contradict Booker, claiming that minority and low-income communities are disproportionately harmed by environmental protections, specifically citing the Obama administration’s Clean Power Plan, which would have dramatically reduced carbon emissions from coal-fired power plants if Pruitt hadn’t repealed it.

Booker, Inhofe said, was implying that Pruitt was trying to “punish” vulnerable Americans. “I wanted to just remind you,” Inhofe told the committee, “that we had a guy I remember so well, Harry Alford. He was the president of the National Black Chamber of Commerce. He provided some of the most powerful testimony that I have ever heard when it comes to the effects of the Clean Power Plan and some of the other regulations … on black and Hispanic poverty, including job losses and increased energy costs.

“[Alford] was very emphatic as to who was paying the price on these,” the Oklahoma Republican continued, “and I think sometimes that the previous administration forgot that there are already people out there who are paying all they can pay just to try to eat and keep their house warm.”

ExxonMobil’s Echo Chamber

Inhofe’s source for his assertion? A discredited, ExxonMobil-funded study by an ExxonMobil-funded advocacy group that was based on discredited studies by other ExxonMobil-funded organizations.

Inhofe rested his argument on previous congressional testimony by Harry Alford, president of the National Black Chamber of Commerce (NBCC), a shoestring, mom-and-pop operation that is unapologetic about taking fossil fuel industry money. “Of course we do and it is only natural,” Alford wrote on NBCC’s website. “The legacy of Blacks in this nation has been tied to the miraculous history of fossil fuel…. [F]ossil fuels have been our economic friend.”

One of NBCC’s closest economic friends is ExxonMobil, which has donated more than $1.14 million to the group since 2001.

What did the company get for that money?

In 2015, NBCC commissioned a report that claimed the Clean Power Plan would “inflict severe and disproportionate economic burdens on poor families, especially minorities.”

In fact, unchecked climate change would more than likely hurt those communities most, and investments in energy efficiency under the plan would ultimately lower electricity bills across the country.

How did NBCC arrive at its upside-down assessment? The Union of Concerned Scientists took a close look at the report and found it was based on several flawed fossil fuel industry-friendly studies. Two of those bogus studies were produced by ExxonMobil grantees: the Heritage Foundation, which received $340,000 from ExxonMobil between 2007 and 2013, and the U.S. Chamber of Commerce, which received $3 million between 2014 and 2016.

The Chamber study, which came out just days before the EPA released a draft of the Clean Power Plan, was debunked not only by the EPA, but also by PolitiFact.com and The Washington Post. Among its many faults, the Chamber study—which was co-sponsored by the American Petroleum Institute—wildly inflated the cost of the plan and failed to consider the benefits of cutting carbon emissions.

ExxonMobil Spreads its Money Around

But there’s more than just the fact that ExxonMobil financed deliberately flawed studies to try to derail the Clean Power Plan. The company also is a major supporter of a number of Senate EPW Committee members, including Inhofe, who are adamant climate science deniers.

Inhofe has deep ties to the oil and gas industry, which has donated $1.85 million to his campaign war chest over his long career in Congress, more than twice as much as any other industry. Three oil and gas companies are among the senator’s top 10 corporate contributors: Koch Industries, Devon Energy and … ExxonMobil.

Six of the other 10 Republicans on the EPW Committee also are on ExxonMobil’s donation list, including Wyoming Sen. John Barrasso, the current committee chairman. Roughly half of the $119,500 ExxonMobil contributed to the seven senators over the last decade went to Barrasso and Inhofe, the committee’s previous chairman.

Then there’s ExxonMobil’s link to Pruitt, Oklahoma’s attorney general before President Trump tapped him to run the EPA. From 2012 through 2013, he chaired the Republican Attorneys General Association (RAGA) and afterward served on the organization’s executive committee. From 2014 through 2016, ExxonMobil gave RAGA $160,000 in three annual installments.

Just a few weeks after the company made its 2016 donation of $50,000, Pruitt and then-RAGA Chairman Luther Strange, at the time Alabama’s attorney general, co-authored a National Review column attacking a coalition of state attorneys general investigating ExxonMobil and other fossil fuel companies for misleading investors and the general public about climate change. Parroting ExxonMobil’s argument, Pruitt and Strange charged that the coalition was violating the company’s First Amendment right to free speech.

“The debate” over climate change, they wrote “is far from settled. Scientists continue to disagree about the degree and extent of global warming and its connection to the actions of mankind. That debate should be encouraged — in classrooms, public forums, and the halls of Congress. It should not be silenced with threats of prosecution. Dissent is not a crime.”

Inhofe Upstaged

In this case, Inhofe’s counterfactual comment didn’t make it into the ensuing media coverage. Along with nearly everything else that was said during the two-and-a-half hour marathon, it was eclipsed by a bombshell dropped by Sen. Sheldon Whitehouse. The Rhode Island Democrat revealed that during a February 2016 radio interview, Pruitt said Trump would be “more abusive to the Constitution than Barack Obama, and that’s saying a lot.” Whitehouse read Pruitt’s remark aloud and asked him if he recalled making it.

“I don’t,” Pruitt responded, “and I don’t echo that today at all.”

“I guess not,” Whitehouse replied.

Not surprisingly, that embarrassing nugget was the story. There was no way that Inhofe’s rambling, ExxonMobil-sponsored falsehood could compete in the media with red meat like that. But overshadowed or not, Inhofe provided yet another incontrovertible piece of evidence that—despite ExxonMobil’s statements to the contrary—the company is still very much engaged behind the scenes in trying to stymie any government attempt to seriously address climate change.

Science Alert to EPA Chief Pruitt: Pollution Kills People


An X-ray showing the affected lungs from acute exacerbation of chronic obstructive pulmonary disease, which can be triggered by air pollution. Photo: Wikimeda

President Trump’s chief of the Environmental Protection Agency (EPA), Scott Pruitt, has been hard at work undermining some of our nation’s most important public health safeguards in the guise of “reform.” His talking points omit the fact that these policies, guidelines, and programs have a strong record of protecting us from toxic chemicals and harmful air pollutants. And he’s leveraging the fact that many have obscure sounding names, hoping the public won’t notice that he’s stripping away safeguards at a time when the science on air pollution and health signals the need to strengthen protections for our families and communities.

Let’s dig into the details.

Less MACT means more HAPS

Breaking with more than 20 years of precedent in implementing the Clean Air Act, late last month the EPA reversed long-standing guidance that limits hazardous air pollutants (HAPS) such as the neurotoxins mercury and lead from major sources like power plants and large industrial facilities.

The agency’s new guidance means that major sources of HAPS may no longer be required to employ Maximum Achievable Control Technology (MACT), reversing a requirement that has been singularly successful in reducing toxic air pollution. Giving polluting facilities that have been employing available MACT technologies for years a way out of that requirement may line pockets of the polluting parties, but it is certainly not in the public interest. This is especially true for environmental justice and low-income communities and for people of color; they are already bearing a disproportionate burden of toxic pollution. Read more about all this here and here.

All eyes on IRIS

The EPA’s chemical risk assessment program—the Integrated Risk Information System (IRIS)—is the gold standard for chemical toxicity reviews at the federal, state, and local level, and even internationally. Because IRIS provides a scientific basis for regulating chemicals, it has been a target of criticism by the chemical industry, trade groups, and their friends in Congress and the White House. IRIS is at serious risk. Read more here.

And then there’s TSCA

After years of effort and with the bipartisan support that now seems a distant memory, Congress passed legislation in 2016 to update and reform the Toxic Substances Control Act (TSCA). Enacted 42 years ago, TSCA is a fundamental safeguard meant to protect our nation’s children, families, and communities from the health effects of dangerous chemicals—health effects like cancer, birth defects, and reproductive disorders.

Two years ago, many public health and environmental leaders cheered this much-needed reform (read more here). But then the Trump administration gave the task of writing new rules to Dr. Nancy Beck, formerly director of regulatory science policy at the American Chemistry Council, the chemical industry’s leading lobbying group. Not surprisingly, under her leadership, the agency has rolled out rules that reflect industry-favored positions, despite objections from the EPA’s own scientists and staff, who warned that the new changes could seriously underestimate health risks and make it harder to track and thus regulate dangerous chemicals. Read how and why the agency shifted here.

Pollution control is good for the economy and public health

Contrary to what Mr. Pruitt and the Trump administration say, pollution control is healthy economically. Air quality improvements in high-income countries have not only reduced deaths from cardiovascular and respiratory disease, but have also yielded substantial economic gains. In the US, an estimated $30 in benefits (with a range of $4–$88) has been returned to the economy for every dollar invested in air pollution control since 1970. Read more here and here.

What the science says—a global look

Given these and other threats to clean air and to the science-based protections that have been established to safeguard public health, it seemed like a good time to take a look at what the latest science says about pollution and health—both globally and here at home.

The Lancet Commission on Pollution and Health provides the most recent, overarching, and in-depth analysis of the health and economic impacts of pollution. Its report covers air pollution, water pollution, and soil pollution, as well as occupational pollutants and the emerging threats of developmental neurotoxicants, endocrine disrupters, and pesticides. Its focus is global, but the report includes some country-specific data and information. The report shows that no country is unaffected. It also notes that while many effects of chemical pollutants are yet to be determined, much is already known.

Quoting directly from the report, here are some of the findings and key takeaways:

  • Pollution is the largest environmental cause of disease and premature death in the world today—responsible for an estimated 9 million premature deaths in 2015 alone. That’s three times more deaths than from AIDS, TB, and malaria combined and 15 times more than from all wars and other forms of violence.
  • Pollution disproportionately kills the poor and the vulnerable. In countries at every income level, disease caused by pollution is most prevalent among minorities and the marginalized.
  • Children are at high risk of pollution-related disease; even extremely low-dose exposures to pollutants during windows of vulnerability in utero and in early infancy can result in disease, disability, and death in childhood and across their lifespan.
  • Pollution endangers planetary health, destroys ecosystems, and is intimately linked to global climate change. Fuel combustion—fossil fuel combustion in high-income and middle-income countries and burning biomass in low-income countries—accounts for 85 percent of airborne particulate pollution and for almost all pollution by oxides of sulphur and nitrogen. These pollutants cause some serious health effects, like asthma, shortness of breath, wheezing, and other respiratory problems.
  • More than 140,000 new chemicals and pesticides have been synthesized since 1950. The 5,000 produced in greatest volume have become widely dispersed in the environment and are responsible for nearly universal human exposure.
  • [There is] increasing movement of chemical production to low-income and middle-income countries where public health and environment protections are often scant.
What the science says—a US look

A recent nationwide study of seniors in the US sounds similar alarm bells. Researchers at the Harvard School of Public Health studied 60 million Americans—nearly 97 percent of people 65 years of age and older—and found that long-term exposure to fine airborne particulates (PM2.5) and ozone increases the risk of premature death, even at levels below current EPA national ambient air quality standards (NAAQS), and that the “effect was most pronounced among self-identified racial minorities and people with low income.”

Francesca Dominici, principal investigator of the study, commented on the study’s unprecedented statistical power given the massive size of the study population, noting that the “findings suggested that lowering the NAAQS for fine particulate matter will produce important public health benefits, especially among self-identified racial minorities and people of low income.” These findings build on past work that has long shown the effect of long-term exposure to particulates on mortality.

A second Harvard study examined short-term exposure to the same pollutants (PM2.5 and ozone) and also found a link to higher premature death among US elders—again with low income, female, and black elders at higher risk. The study found that “Day-to-day changes in fine particulate matter and ozone exposures were significantly associated with higher risk of all-cause mortality at levels below current air quality standards, suggesting that those standards may need to be reevaluated.”

Lead author Qian Di noted that “No matter where you live—in cities, in the suburbs, or in rural areas—as long as you breathe air pollution, you are at risk.” [Can’t resist a shout out to Qian Di, a doctoral student in environmental health—and to other early career scientists who are out there bringing their science to bear on critical matters of public policy, public health, environmental protection, and environmental justice.]

Photo: RuslanDashinsky/iStock

What the science says—a look at our kids

There is robust scientific evidence on the adverse impacts of air pollution on children’s health—from health impacts of fossil fuel combustion (nice summary of research here) to new research on children’s exposure to neurotoxins. Researchers at the University of Utah studied air pollution exposure in nearly 90,000 public schools across the US using EPA and census data. They found that ambient neurotoxins like lead, mercury, and cyanide compounds pose serious risks to children at our public schools.

Chicago, Pittsburgh, New York, Jersey City, and Camden were among the 10 worst polluted areas. They also found racial disparities, with students attending high-risk schools nationwide significantly more likely to be Hispanic, black, or Asian/Pacific Islander. In a lengthy Guardian piece on the research, study co-author Dr. Sara Grineski noted that “We’re only now realizing how toxins don’t just affect the lungs but influence things like emotional development, autism, ADHD, and mental health…. Socially marginalized populations are getting the worst exposure…. This could well be impacting an entire generation of our society.” Other recent studies and reports of air pollution impacts on children’s health can be found here, here, here, and here.

EPA, please follow the science

This new research pretty much puts the kibosh on arguments that our nation’s air is clean enough and that it’s time to “reform” (read weaken) current policies, guidelines, and programs.

The science is clear—this is not the time to roll back efforts to control pollution, to squelch reviews and assessments of environmental chemicals, and to otherwise defund and de-staff critical public health protection programs at the agency. The EPA’s mission is to protect human health and the environment; this means putting the public interest first. In his goal to remake the EPA, we can’t let Mr. Pruitt sideline science and put our public health at risk.

Here at UCS we are doing our best to monitor and fight back against attacks on science and on the science-based policies that protect our public health and safety. Many partner organizations—from large national organizations to small grassroots groups and organizers fighting for environmental justice—are doing the same.

You can help by speaking out to elected officials in Congress, to your regional EPA office, and directly to Mr. Pruitt. He needs to understand that his priority is our public health, not easing industry’s path to higher profits. Join us as a science champion; we provide information and other resources to help you engage in this effort.

UCS is Surveying Federal Scientists Working Under the Trump Administration


Beginning on Monday, February 12, UCS is administering another survey that will assess the status of scientific integrity across 16 federal science agencies. More than 63,000 government scientists will have the opportunity to anonymously share their perspectives on scientific integrity in the government.

UCS is partnering with Iowa State University’s Center for Survey Statistics and Methodology (CSSM) because of their deep expertise in the technical and operational aspects of sample surveys. CSSM has taken the technical steps needed to fully assure the anonymity of federal scientists who choose to take the survey on their personal time and equipment. Scientists will have the option of completing the survey online, on paper, or by phone. The survey will close the morning of March 26, providing scientists with a large window of time to complete the survey.

The results will be used to get a quantitative measure of the status of scientific integrity across the government. Such a study is especially important now, as we live in a world of quick news cycles and anecdotal evidence. When it comes to science under the Trump administration, the public knows about individual cases that have made the news, but we have a less clear sense of the extent and pervasiveness of these problems. Are scientists being inhibited from conducting and communicating their work? How common are incidents of political interference? Are some agencies faring better than others? What can be done to better ensure scientists are able to carry out the missions of their science-based agencies? The survey will shed light on these critical questions.

Surveying scientists—a history

In 2015, under the Obama administration, UCS administered a survey that assessed the status of scientific integrity at federal agencies. In this survey, we asked the following open-ended question: How do you think the mission of your agency and the integrity of the scientific work produced by your agency could best be improved? The responses to this question run the gamut, from scientists stating that more transparency about scientific integrity policies at agencies is needed, to scientists saying that science needs to be less driven by politically motivated policies. Answers to questions like these provide crucial feedback on what’s happening at federal agencies, especially now, as concerns have been raised about the Trump administration’s treatment of science.

UCS has been conducting surveys to assess federal scientific integrity since 2005. The information provided through these surveys has been incredibly useful and in some cases has led federal agencies to update their policies to create a better work environment for federal scientists and allow science to inform government decisionmaking. For example, in 2011, the National Science Foundation developed a media policy in response to survey responses and policy analysis developed by the Union of Concerned Scientists. In 2013, the US Geological Survey improved its social media policy to better ensure scientifically accurate agency communications.

The Trump administration’s attacks on science and scientists make it more important than ever to conduct a survey now. Scientists have blown the whistle on the Trump administration for reassigning them to do tasks for which they do not have expertise. Right out of the gate, the Trump administration gagged federal agency scientists from speaking to the press. Additionally, scientists have been barred from attending professional meetings and presenting their work. It also has been reported that scientists may be told, or may themselves be choosing, not to use politically contentious language such as “climate change” or “evidence-based.”

These stories, and others, suggest that science in the federal government is currently being conducted in an environment that discourages the use of scientists’ knowledge in decision-making; yet little is known about the environment in which most federal scientists conduct their day-to-day work and how this affects their ability to meet the goals of their science-based agencies’ mission. Are these isolated examples of the worst-case scenarios? We don’t have a measure of the extent of the problems.

Giving scientists a voice

In a time when federal scientists and their work are likely under attack, surveys such as these are important to fully understand what kind of conditions America’s federal employees are working under. Federal scientists provide important knowledge that guides US science-based policies to be most effective to protect public health and safety. Thus, we need to ensure that these workers are taken care of and that science reaches the decisionmakers, journalists, and members of the public who need it. If we don’t, who will?

For further information about UCS and CSSM’s 2018 survey on scientific integrity, see UCS’s web page and a FAQ on this project at www.ucsusa.org/2018survey, or visit CSSM’s survey web page.

If you are a federal scientist who was not identified for participation in the survey but would like to share your thoughts and views with UCS, learn how to connect with us with the level of confidentiality and anonymity that is most appropriate to your situation here: https://www.ucsusa.org/scienceprotection.

An Unseasonably “Hot” February for California’s Clean Energy Landscape

UCS Blog - The Equation (text only) -

By and large, major policy action for California’s electricity sector mimics the seasons: winter is a relatively quiet, reflective time and major policy developments start to bud in the spring. As the air heats up, so do policy debates in Sacramento, which ultimately bloom fully or die on the vine in September, when the Legislature wraps up its session.

But lately, the weather in California and electric sector policy developments seem unseasonably hot. For example, it’s currently 75 degrees outside my office in Oakland. And below are some of the things happening in the policy world that also seem particularly “hot”:

CPUC approves a 2030 clean energy blueprint.

Late last week, the California Public Utilities Commission (CPUC) approved a blueprint laying out the electricity sector investments through 2030 that will be necessary to reach greenhouse gas reduction goals consistent with the statewide requirement to reduce emissions 40% below 1990 levels by 2030.

This system-level blueprint is the first phase of what’s called the Integrated Resource Plan (IRP); the next step is for all investor-owned utilities (IOUs) and community choice aggregators (CCAs) to submit their individual plans, which are due in August. More information about the IRP and individual IOU and CCA progress can be found here. The publicly-owned utilities (POUs) in the state will submit their plans to the California Energy Commission (CEC), and their progress can be tracked here.

UCS conducted analysis in the IRP proceeding to underscore a key blind spot in the CPUC’s own work: the fact that all of the gas generation capacity that exists today was assumed to still be around in 2030 to provide energy and grid services.

It’s well known that California has an excess of natural gas generation capacity on the grid, and that capacity remains a significant source of global warming pollution in the state. California built a lot of natural gas plants in the 90s and early 2000s, and we don’t need all of them now. Our own analysis showed that a significant portion of the natural gas peaker capacity may not have much value to the grid in 2030. But we also know that some gas will be important for reliability through 2030.

The question is, which plants stay and which plants go? The IRP decision underscores the need to understand the role of gas in California’s clean energy future: to make sure that the inevitable downsizing of the fleet does not jeopardize grid reliability, and that it benefits the people most impacted by gas plant pollution, especially the “fenceline” communities that bear the brunt of it. UCS is planning additional analysis on this issue, so stay tuned.

Big bills are being discussed in Sacramento.

Senate Bill (SB) 100, a bill that would set a bold and achievable target of getting 100% of California’s electricity from carbon-free resources by 2045, is still alive and waiting to be taken up for a vote in the Assembly. Although there is a lot of public support for SB 100, the bill is getting hung up by potential amendments dealing with the treatment of distributed energy resources. UCS is trying to do what it can to break that logjam and, in the meantime, to communicate to the Assembly that we’d like to see SB 100 move forward without additional amendments.

Assembly Bill (AB) 813 is a bill that would make it possible for the California Independent System Operator (CAISO)—which operates the grid that serves about three quarters of California’s electricity needs—to expand and include other western states. Pivoting California’s energy market into one that’s west-wide is ambitious and complicated, but worth the effort. Expanding the pool of resources that a grid operator has to manage the system is one of the most cost-effective ways to incorporate more wind and solar generation onto the electricity system.

Energy storage and small-scale renewables are giving natural gas a run for its money.

In early January, the CPUC issued a resolution that authorizes PG&E to hold competitive solicitations for energy storage or “preferred resources” (e.g., demand response and distributed solar) to meet local reliability requirements that have previously been met with gas power plants. This decision, combined with the CEC’s recent decision to reject NRG’s request to build a natural gas peaker plant in Oxnard, is evidence of what will hopefully become a very significant shift away from the assumption that gas plants are the best and most cost-effective way to provide grid reliability services in the future.

These are just three examples of major clean energy advancements that have unfolded in the last six months. And, many decisions are still developing about whether the state will pass SB 100 and nearer-term plans we’ll need in order to move towards a cleaner grid. Clearly, there is more work to do. But there’s no doubt in my mind that we are making meaningful progress on these “hot topics,” and UCS will be working to make sure California continues its clean energy momentum and climate leadership to “cool down” global warming.

Pruitt Squirming away from the Weight of Climate Evidence

UCS Blog - The Equation (text only) -

Photo: Gage Skidmore

Since taking office, EPA Administrator Scott Pruitt has shifted how he talks about climate change. You may have heard that he recently suggested that global warming might actually be a good thing. If the consequences of global warming weren’t so serious for Americans, his determination to take down one of the most studied scientific topics of our time would be silly in a Wile E. Coyote and the Road Runner kind of way. However, the shifts in his tactics may signal just how difficult it is to refute such an enormous body of evidence.

The beginning

Of course, we know more than enough about how our climate is changing and the degree to which humans are causing these changes for decision makers to take action. In fact, the U.S. government’s latest assessment of the state of our climate told us that human activity likely contributed to at least 92% of the change in Earth’s average temperature observed since 1951.

Out of step with the science, at his confirmation hearing, Administrator Pruitt remarked that, “The climate is changing and human activity impacts our changing climate in some manner. The ability to measure with precision the degree and extent of that impact, and what to do about it, are subject to continuing debate and dialogue.”

The middle

By mid-summer 2017, about six months into his tenure, Administrator Pruitt expanded on his question of the extent to which scientists can measure the effect of human activity on climate (answered above). This time, in an interview with Reuters, Administrator Pruitt shifted toward questioning the harm that climate change will cause: “It is not a question about whether the climate is warming. It is not a question about whether human activity contributes to it. It is a question about how much we contribute to it? How do we measure that with precision? And by the way, are we on an unsustainable path? And what harm… is it causing an existential threat?”

This is an interesting shift, because Administrator Pruitt seems to be trying to move the conversation away from whether humans are the primary cause of climate change (which again, science is very clear on), to a conversation about whether we are headed down a harmful path or not. Fortunately or unfortunately, the science is also very clear about this – we know that in the US, we are likely to see more frequent large wildfires in the West, category 4 and 5 hurricanes, coastal flooding, and intense heat waves in a warming world (among other impacts). All of these have real, harmful effects on people’s lives.

Where he is now

Administrator Pruitt continued his shift away from questioning whether or not humans are the primary cause of climate change, to whether or not climate change will harm Americans. Again, perhaps that is because the science on this is so difficult to refute.

Now, Administrator Pruitt is trying to shift conversations toward an idea that climate change may not be harmful, but beneficial, and is questioning what the ideal temperature of the Earth should be.

As we’ve already covered, global warming is projected to cause significant damage to American infrastructure, health, and wellbeing. We are already seeing these effects – we know, for example, that global warming increased the chances – and damaging impacts – of Hurricane Harvey.

And governments and scientists have already come together through the Paris Agreement to decide what the limit of warming should be to avoid dangerous climate change – between 1.5 and 2 degrees Celsius since pre-industrial times. Now, scientists are refining our understanding of the difference in how bad the impacts will be at these two levels, and what it will take to avoid the worst damages of climate change. Given the billion-dollar disasters associated with extreme weather, global emissions reductions have human health, economic, and societal benefits for the United States.

Why question climate science?

All of this raises the question – why is Administrator Pruitt (and others in this administration) so adamant about refuting climate science findings? (While considering this, it is important to take into account his deep connections to the fossil fuel sector and how this shapes his perspective and priorities with respect to climate action.) The logic behind questioning the science seems two-fold for Administrator Pruitt:

  • Each time he questions the scientific consensus that climate is changing and humans are the primary cause, he contributes to confusion that helps stall significant action on climate change.
  • If he can find his Loch Ness Monster amidst the ocean of climate science (the Loch Ness Monster in this instance being fundamental errors in the science of climate change), Administrator Pruitt could open the door to taking down the EPA’s endangerment finding, which underpins the regulation of carbon dioxide as a pollutant under the Clean Air Act.

In the end, enacting policies that accept the scientific consensus on climate change might be a lose-lose for Administrator Pruitt’s agenda, but it would be a win-win for Americans.

Making the Leap from Coal to What Could Be: New Mexico’s Energy Future

UCS Blog - The Equation (text only) -

Photo: San Juan Citizens Alliance/EcoFlight

After decades and decades of commitment to coal, New Mexico is rapidly heading toward a future that’s coal free.

But a commitment to transition away from coal is just one part of the story; equally important is a commitment to where that transition will lead. What’s more, how this transition plays out—like who has access, and what happens to the communities and industries otherwise left behind—can have ramifications that last long into the future.

Now, legislators in the state are wrestling with one policy tool for transitioning to a truly clean and low-carbon future, and are considering several more that could help speed the journey along.

Legislation that asks not whether, but how

New Mexico’s lawmakers are in the midst of racing their way through this year’s 30-day legislative session, doing their best to deploy limited hours against a towering to-do list.

During these short sessions, which alternate years with regular 60-day sessions, lawmakers tend to prioritize budgetary issues above all else. But as a testament to the growing recognition of just how economically pivotal the state’s energy transition is, this year legislators are also devoting serious hours to SB 47/HB 80, or the Energy Redevelopment Bond Act.

Legislators have the opportunity to deploy policies that will help propel the state to a better, cleaner future.

This bill is a historic piece of legislation, and has the potential to catalyze the clean energy transition that New Mexico’s economy so desperately needs, and that New Mexicans so fully deserve. Achieving that potential, though, is key, and it turns on the central issue flagged above—a commitment away from something just isn’t enough; there must also be a commitment to what comes next.

At its core, SB 47/HB 80 is about enabling the state’s largest power provider—PNM—to issue securitized bonds to recover costs from the early retirement of San Juan Generating Station. Because the terms of such bonds result in much lower interest rates, customers can actually save money by paying PNM to, effectively, move on from coal.

But the bulk of the debate surrounding the bill is not about the proven securitization tool itself. Instead, it has to do with everything that could—and should—happen as a result.

Securitization presents an incredible opportunity to intentionally shape the state’s energy future, and squandering that opportunity (or worse, using it to point the state down a bad path) is too serious to let slide. That’s why stakeholders have been so deeply engaged in trying to make the initial proposal much stronger, including:

  • Securing meaningful support to ensure Farmington and San Juan County are provided a viable opportunity to diversify, develop, and ultimately transition their economy away from the dwindling coal sector.
  • Committing PNM to a future energy mix that would supply customers with 40 percent renewable energy by 2025 and 50 percent renewable energy by 2030—strong and critical waypoints that would keep the utility moving toward an ever-cleaner (and more cost-effective) energy future.
  • Requiring more market competition for the renewable energy resources that replace coal.
  • Ensuring that the Public Regulation Commission (PRC) has full authority to review PNM’s proposed closure costs.

Stakeholder negotiations to develop a consensus bill have improved the text from its original form, but it’s still not yet where it needs to be to warrant passage. A few central issues that demand prudent resolution include:

  • Making sure that a coal plant retired under this legislation cannot reopen again as a coal plant further down the road.
  • Limiting the amount of new resources PNM is guaranteed to own, as more market competition can drive down prices for ratepayers and create more development opportunities for the state’s growing clean energy industry.
  • Ensuring the PRC is sufficiently empowered to do its job as regulator, including by being able to adequately vet any securitization proposals that cross its desk.
  • Making sure the bill works for the communities it’s directly affecting by ensuring all stakeholders have a say.

The negotiating process for this bill has been a winding one, with stakeholders on both sides being brought together to try to come to a workable, consensus agreement. And in a sign that there’s still hope that a sufficiently improved bill will emerge this session, the legislation was tabled—not killed—in a vote by the Senate Conservation Committee last week. If designed well, the tool has incredible potential. We’re looking for legislation that ensures all that potential is met.

But transitions take a lot more than one bill, and a lot more than just bills

The remaking of New Mexico’s power sector cannot hope to be resolved in a single bill. There are many, many policies and regulations that can be brought to bear to best facilitate and accelerate New Mexico’s transition to a vibrant clean energy economy—one that’s open to participation and innovation from people all across the state.

Just this session alone there are multiple additional energy bills under consideration that could help pull the state forward, including:

  • A proposal to reinstate the solar tax credit: Especially in light of the Trump Administration’s recent enactment of solar import tariffs, the state can play a critical role in supporting its nascent-but-growing solar industry by making sure solar stays affordable—and available—for all New Mexicans as the costs of solar continue to fall.
  • A proposal to re-fund the Renewable Energy Transmission Authority (RETA): Cost-effectively shifting to a high-renewables future means taking advantage of all the state’s incredible renewable resource potential, which will require the buildout of new transmission lines to ensure the best opportunities can be brought to market. Refunding RETA would allow for the development of a strategic and centralized transmission planning approach.

And then, of course, there are all the many and varied ways that progress is being facilitated outside the legislature, from local community commitments to go green, all the way up to the PRC investigating the feasibility of PNM joining a broader energy market and considering the development of a Clean Energy Standard, and so much else in between.

New Mexico’s transition away from coal should lead directly to a clean energy future—the least-cost, healthiest, and most economically favorable future for the state. This transition must be open to all, and supportive of those who could otherwise get left behind.

State legislators have the chance now to make a leap toward this vibrant clean energy future—they should do everything they can to make the best of it.

Mr.TinDC/Creative Commons (Flickr)

Top Clean Cars and Trucks of 2018

UCS Blog - The Equation (text only) -

Some of the cleanest cars you can buy today are powered by electricity, though the emissions of an electric vehicle (EV) vary depending on where it is plugged in. Even though parts of the U.S. still partially rely on coal-fired power, the average EV sold in the U.S. produces the emissions equivalent of a gas vehicle that gets 73 MPG, and over 70 percent of Americans live in an area where driving an EV results in fewer emissions than a 50 MPG gas-powered vehicle. Check out how EVs fare in your neck of the woods with this interactive tool, which calculates an EV’s emissions by zip code.

Looking for the most fuel-efficient SUV or pickup truck? Read on as I’ll detail the most fuel efficient vehicles in each of these classes below.

1. Toyota Prius Prime (Plug-in Hybrid Electric Vehicle) – 133 MPGe running on electricity + 54 MPG running on gas

Image via Toyota

This isn’t necessarily the most exciting vehicle on the planet, but the 2018 Toyota Prius Prime offers serious fuel economy and a modest electric range at a reasonable price (from $27,100 before any federal or state credits or rebates). The 2018 Prime is equipped with both a fuel-sipping 1.8 liter four-cylinder engine and an electric motor that runs off an 8.8 kWh battery pack, which can be recharged in just 5 hours from any regular outlet, or in around 2 hours from a 220V outlet (the type of outlet used by home appliances like a washer/dryer; for more info on different types of EV charging, head here). But even when out of juice, the Prime will still achieve a city/highway combined 54 mpg running on gasoline alone. Overall, the EPA gave the Prime an estimated fuel economy rating of 133 MPGe, making it one of the most fuel-efficient vehicles that still uses gasoline today.
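Those charge times follow from simple energy arithmetic: pack capacity divided by charger power. A minimal back-of-the-envelope sketch, where the charger power levels are rough assumptions (not official Toyota figures), ignoring charging losses and taper:

```python
# Back-of-the-envelope charge-time estimate for a plug-in hybrid pack.
# The charger power levels are illustrative assumptions: roughly
# 1.8 kW for a standard 120V outlet and roughly 3.6 kW for a 220V
# (Level 2) connection.

def charge_time_hours(pack_kwh, charger_kw):
    """Estimate hours to recharge a battery pack from empty,
    ignoring charging losses and taper."""
    return pack_kwh / charger_kw

PRIME_PACK_KWH = 8.8  # Prius Prime pack size cited above

standard_outlet = charge_time_hours(PRIME_PACK_KWH, 1.8)  # ~4.9 h
level2_outlet = charge_time_hours(PRIME_PACK_KWH, 3.6)    # ~2.4 h

print(f"120V: ~{standard_outlet:.1f} h, 220V: ~{level2_outlet:.1f} h")
```

Under those assumed power levels, the estimates land close to the quoted 5-hour and 2-hour figures; real-world times vary with charger hardware and losses.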

 

2. Nissan Leaf (Battery Electric Vehicle) – 112 MPGe

Image via Nissan

If you’re ready to ditch gasoline for good, you may want to check out the 2018 Nissan Leaf. The all-new Leaf not only got a style upgrade, but also a 40-kWh battery that provides an EPA-estimated 151 miles of all-electric range and a fuel economy rating of 112 MPGe. This is a big improvement over the original Leaf’s 84-mile range, and enough of a boost to make the Leaf work for even more people’s driving needs. Charging at home or on the go should be easy for Leaf owners as well. The Leaf can be fully charged in as little as 40 minutes with DC fast charging, or in around 8 hours via Level 2 (220V) charging. Starting at $29,900, the Tennessee-built Leaf is cheaper than many of its all-electric competitors, though it has slightly less range than the Chevy Bolt (238 miles) or Tesla Model 3 (220 miles).

 

3. Honda Clarity PHEV (Plug-In Hybrid Electric Vehicle) – 110 MPGe running on electricity + 42 MPG running on gas

Image via Honda

Like the Prius Prime, the 2018 Honda Clarity Plug-In Hybrid (PHEV) includes both a gasoline engine and an electric motor powered by a 17 kWh battery pack, which is good for an EPA-estimated 48 miles of all-electric range. When the electric range is exhausted, the Clarity PHEV relies on an efficient 1.5 liter 4-cylinder engine that delivers 42 mpg, which is very good for a big sedan. The Clarity PHEV’s base price is $33,400, but don’t forget about the $7,500 federal tax credit, which can knock the effective price down to $25,900. Overall, the Clarity PHEV offers the best all-electric range of any plug-in hybrid sedan and should be able to compete with other PHEVs like the Toyota Prius Prime, Hyundai Ioniq PHEV, and Chevy Volt.

 

4. Chevy Bolt (Battery Electric Vehicle) – 119 MPGe

Image via Chevy

The Bolt was MotorTrend’s Car of the Year in 2017, will go 0-60 in just 6.5 seconds, and has an estimated all-electric range of 238 miles. The 2018 Bolt EV remains largely the same, and starts at $37,495. Of course, this price can be lowered by qualifying for the $7,500 federal tax credit and any other state EV credits or rebates. Interested in what EV incentives apply in your neck of the woods? Head over here for a handy guide. The Bolt’s battery pack can gain 90 miles of charge in just 30 minutes from DC fast charging, and a full charge from empty will take about 9 hours via Level 2 (220V) charging. The Bolt’s charge time shouldn’t be a deal breaker, considering the vast majority of EV charging is done at home – and mostly overnight. It’s also important to note that EV drivers typically don’t need a full charge every time they plug in. If you drive 50 miles in a day, for example, then you only need to replace those 50 miles of lost range, which can happen in a matter of minutes from a DC fast charger or a few hours from a Level 2 charger.
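The “replace only what you used” point lends itself to a quick estimate. A sketch under stated assumptions: the DC fast rate is derived from the 90-miles-in-30-minutes figure above, and the Level 2 rate is implied by a 238-mile pack charging in about 9 hours; real-world rates vary with temperature and state of charge.

```python
# Estimate how long it takes to restore a day's worth of driving range.
# Rates derived from the figures above: DC fast adds ~90 miles in
# 30 minutes (~180 mi/h of charging); Level 2 fills a 238-mile pack
# in ~9 hours (~26 mi/h of charging).

def hours_to_restore(miles_needed, miles_per_hour_of_charging):
    """Hours of charging needed to replace a given range deficit."""
    return miles_needed / miles_per_hour_of_charging

DAILY_MILES = 50          # the example daily drive above
DC_FAST_RATE = 90 / 0.5   # ~180 miles of range per hour
LEVEL2_RATE = 238 / 9     # ~26 miles of range per hour

minutes_dc = hours_to_restore(DAILY_MILES, DC_FAST_RATE) * 60
hours_l2 = hours_to_restore(DAILY_MILES, LEVEL2_RATE)

print(f"DC fast: ~{minutes_dc:.0f} min, Level 2: ~{hours_l2:.1f} h")
```

Roughly a quarter hour on a DC fast charger, or under two hours on Level 2 – well within a typical overnight window.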

 

5. Tesla Model 3 (Battery Electric Vehicle) – 130 MPGe

Image via Tesla

There’s not too much to say about the Model 3 that hasn’t already been said. The Model 3 remains one of the most exciting clean vehicles on the market. Just how clean depends on where you plug in, but UCS analysis has found that for over 70 percent of Americans, driving the average EV results in fewer emissions than a 50 MPG gasoline vehicle. The Model 3 comes with either a 50 kWh or a 75 kWh battery pack, giving the sedan a range of 220 miles or 310 miles, respectively; it can be fully charged in around 12 hours from Level 2 (220V) charging, or brought up to a 50 percent charge in 20 minutes via Tesla’s network of Supercharger stations. Of course, Tesla has had some trouble meeting the 400,000 Model 3 pre-orders, but the company is still taking reservations if you want to get in line and wait an estimated 12-18 months for one of the most hyped electric vehicles of all time.

 

6. Hyundai Ioniq PHEV – 119 MPGe running on electricity + 52 MPG running on gas

Image via Hyundai

The Ioniq is Hyundai’s first foray into the electric vehicle market and offers a great alternative to the Toyota Prius Prime at a comparable price – the 2018 Ioniq PHEV starts at $25,835 and the Prime starts at $27,100. The Ioniq also marks the first time American car buyers will be able to choose between a conventional hybrid, a plug-in hybrid, or a battery electric version of the same model. Giving consumers a family of clean options in the same vehicle is a clever move by Hyundai, and one that other automakers may seek to duplicate in their efforts to make electric drive more mainstream.

The 2018 plug-in hybrid version of the Ioniq includes an 8.9 kWh rechargeable battery pack that provides more than 25 miles of all-electric range and can be fully charged in two hours and 18 minutes from a Level 2 charger. Given its inoffensive styling and tech inclusions like Apple CarPlay, Android Auto, and wireless smartphone charging, the Ioniq may challenge the Prius for hybrid sedan market share—a welcome sight for clean car enthusiasts everywhere. Also, the gas-only version of the Ioniq gets a best-in-class 58 combined MPG!

 

7. Ford F-150 Diesel – 30 MPG (estimated)

Image via Ford

Truck sales continue to outpace passenger vehicle sales. Ford, for example, sold more than 820,000 F-series trucks in 2016, more than double the sales of the Toyota Camry, the top-selling passenger car. So it’s critical that manufacturers improve the fuel economy of pickups to meet both consumer demand for more fuel-efficient vehicles and the requirements of federal fuel economy standards. That makes it exciting to see the first F-150 with a diesel engine and 10-speed transmission heading to showrooms this spring: it is expected to be the first full-size pickup to crack the 30 MPG barrier. This fuel economy doesn’t come at the expense of towing power, either. The 2018 F-150 diesel is expected to deliver a maximum tow rating of 11,400 pounds, which beats its closest rival, the Ram 1500 EcoDiesel, by over a ton and puts it in the upper echelon of all light-duty pickups. In addition to the diesel, Ford recently announced plans for an F-150 hybrid, set to hit the market in 2020.

 

8. Lexus RX 450h – 30 MPG

Image via Lexus

Just because you may need an SUV doesn’t mean you have to sacrifice fuel economy. The 2018 Lexus RX 450h gets a respectable 30 combined MPG and offers a three-row configuration that can fit 7 or 8 passengers along with a decent amount of cargo space. This hybrid model comes standard with all-wheel drive, is powered by a 308-horsepower V6 hybrid system, and includes the luxury amenities Lexus is known for. While not exactly a bargain, this model can transport a whole lot of people and stuff while achieving far better fuel economy than the similarly sized Land Rover Discovery (22 MPG) or BMW X5 (16 MPG). If this model is out of your price range, you may want to check out the Toyota Highlander I highlighted in this post.

NRC’s Project Aim: Off-target?

UCS Blog - All Things Nuclear (text only) -

A handful of years ago, there was talk about nearly three dozen new reactors being ordered and built in the United States. During oversight hearings, Members of Congress queried the Members of the Nuclear Regulatory Commission on efforts underway and planned to ensure the agency would be ready to handle this anticipated flood of new reactor applications without impeding progress. Those efforts included creating the Office of New Reactors and hiring new staffers to review the applications and inspect the reactors under construction.

Receding Tide

The anticipated three dozen applications for new reactors morphed into four actual applications, two of which have since been cancelled. The tsunami of new reactor applications turned out to be a little ripple, at best.

The tide also turned for the existing fleet of reactors. Unfavorable economics led to the closures of several reactors and the announced closures of several other reactors in the near future.

The majority of the NRC’s annual budget is funded through fees collected from its licensees. For example, in fiscal year 2017 the owner of an operating reactor paid $4,308,000 for the NRC’s basic oversight efforts. For extra NRC attention (such as supplemental inspections when reactor performance dropped below par and for reviews of license renewal applications), the NRC charged $263 per hour.

Still, the lack of upsizing from new reactors and abundance of downsizing from existing reactors meant that NRC would have fewer licensees from whom to collect funds.

Enter Project Aim

The NRC launched Project Aim in June 2014 with the intention of “right-sizing” the agency while retaining the skill sets necessary to perform its vital mission. Project Aim identified 150 items that could be eliminated or performed more cost-effectively. Collectively, these measures were estimated to save over $40 million.

Fig. 1 (Source: Nuclear Regulatory Commission)

Project Aim Targets

Item 59 was among the highest cost-saving measures identified by Project Aim. It terminated research activities on risk assessments of fire hazards for an estimated savings of $935,000. The NRC adopted risk-informed fire protection regulations in 2004 to complement the fire protection regulations adopted by the NRC in 1980 in response to the disastrous fire at the Browns Ferry Nuclear Plant in Alabama. The fire research supported risk assessment improvements to better manage the fire hazards—or would have done so had it not been stopped.

Item 61 was also a high dollar cost-saving measure. It eliminated the development of new methods, models and tools needed to incorporate digital instrumentation and control (I&C) systems into probabilistic risk assessments (PRAs) with an estimated savings of $735,000. Nuclear power reactors were originally equipped with analog I&C systems (which significantly lessened the impact of the Y2K rollover problem). As analog I&C systems become more obsolete, plant owners are replacing them with new-fangled digital I&C systems. Digital I&C systems fail in different ways and at different rates than analog I&C systems and the research was intended to enable the PRAs to better model the emerging reality.

Item 62 eliminated development of methods, models, tools, and data needed to evaluate the transport of radioactive materials released during severe accidents into aquatic environments. For example, the 2011 severe accident at Fukushima involved radioactive releases to the Pacific Ocean via means not clearly understood. This cost-saving measure seems to preserve that secret.

Fig. 2 (Source: Nuclear Regulatory Commission)

Project Aim Off Target?

The need to reduce costs is genuine. Where oh where could savings of $935,000 come from, if not from killing the fire research efforts? Perhaps the Office of Management and Budget (OMB) has the answer. On May 11, 2012, OMB issued Memorandum M-12-12, which capped the amount federal agencies could spend on conferences at $500,000. This OMB action pre-dated Project Aim, but seems consistent with the project’s fiscal responsibility objectives.

But the NRC opts not to abide by the OMB directive. Instead, the NRC Chairman signs a waiver allowing the NRC to spend far more than the OMB limit on its annual Regulatory Information Conferences (RICs). How much does the RIC cost? In 2017, the RIC cost the NRC $932,315.39—nearly double the OMB limit and almost exactly equal to the amount fire research would have cost.

987 persons outside the NRC attended the RIC in 2017. So, the NRC spent roughly $944.60 per outsider at the RIC last year. But don’t fixate on that amount. Whether the NRC had spent $1,000,000 per person or $1 per person, the RIC did not make a single American safer or more secure. (It did not make married Americans safer or more secure, either.)
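The per-outsider figure is simple division of the two numbers above; a quick arithmetic check:

```python
# Verify the per-outsider cost of the 2017 RIC quoted above.
ric_cost_2017 = 932_315.39  # total 2017 RIC cost, from the article
outside_attendees = 987     # non-NRC attendees, from the article

per_attendee = ric_cost_2017 / outside_attendees
print(f"${per_attendee:,.2f} per outside attendee")  # ~$944.60
```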

Eliminating the RIC would save the NRC nearly a million dollars each year. That savings could fund the fire research this year, which really does make single and married Americans safer. And next year savings could fund the development of digital I&C risk assessment methods to better manage the deployment of these systems throughout the nuclear fleet. And the savings the following year could fund research into transport of radioactive materials during severe accidents.

Fig. 3 (Source: Nuclear Regulatory Commission)

If the cliché “knowledge is power” holds any weight, then stopping fire research, development of digital I&C risk assessment methods, and many other activities leaves the NRC powerless to properly manage the associated risks.

RIC and risk? Nope, non-RIC and lower risk.

Trump Administration Raids Workers’ Tip Jars, Buries Data Showing That’s a Horrible Idea

UCS Blog - The Equation (text only) -

In December the Trump administration proposed a new rule that specifically allows employers to control and distribute tip income as they see fit, taking away control of tip money from the employees—food servers, baristas, and many other hardworking people—who earned it.

Now it has come to light that the Department of Labor, which proposed the rule, has provided yet another example of the Trump administration following its science motto: “If the data don’t tell you what you want to hear, bury the data.” In this case, the department suppressed data and analysis from its own experts showing the economic impacts of the rule.

People are always hurt by such an approach, and the casualties this time are the people who serve restaurant meals for minimum wage or less.

“Tip pooling”—banned for good reason

Last fall, the department halted a rule put in place in 2011 that banned the practice of “tip pooling” in restaurants and other businesses. The idea of the original rule was to ensure that when patrons add tips to their bills, that money is controlled by employees, not restaurant owners or managers. This is important because in many states, servers are exempt from the legal minimum wage—making as little as $2.13/hour in some states—so their incomes rely heavily on tips. But the Trump administration not only rescinded the pooling ban, it went further in December, proposing a new rule specifically allowing employers to control and distribute tip income as they see fit.

The proposal has created a lot of controversy, to say the least. Hundreds of thousands of tipped workers, the vast majority of them women, may lose as much as $5.8 billion in income if the rule is finalized. Our allies at the Restaurant Opportunities Center United have warned that opening the door to tip theft would also exacerbate sexual harassment in an industry already rife with it, as it would give employers extraordinary power over their workers’ tips.

Suppressing the evidence

As a matter of human rights and fairness this is an important issue, but is it a science issue too? Unfortunately, yes. That’s because, in the course of overturning this worker protection, the Trump administration suppressed data and analysis from its own staff showing the economic impact of the reversal. That impact comes from transferring income away from waitstaff to a broader set of workers, or to employers who directly retain some of the money.

Public policy decisions like these are supposed to be based on the best data and analyses available and the decision itself must be weighed and justified based on that evidence. And the public must have access to that information when the department takes comments on the proposed rule.

In this case, Labor Department analysts compiled data on the economic effects of the rule change, but according to Bloomberg, political appointees at the department later ordered the staff to change the analysis to appear more supportive of the proposal to allow tip pooling. When changes to the data and analysis were still not sufficient to show the proposal favorably, the administration simply buried the report altogether. Not only does that misinform the public and industry, it also manipulates the decision process itself and “tips” it in favor of the administration (pun intended).

Unfortunately, this action to sideline science in order to allow politics and influence to rule the day is not unique to the Department of Labor, but is rife throughout this administration. In this case, the attorneys general of 17 states are warning that the administration’s failure to release its data on tip pooling is illegal under federal rulemaking law.

We need to push back with our elected representatives and demand the rule be withdrawn. This administration, like any other, may have policies it wants to advance, but it must do the scientific analyses straight up and allow the public to have a voice. Misinforming the public shouldn’t be an acceptable tactic for pushing ahead with public policies that hurt everyday hard-working people.

Demolishing Public Protections to Build Trump’s Wall

UCS Blog - The Equation (text only) -

Photo: Denver Gingerich/CC BY-SA 2.0 (Wikimedia)

From Hadrian’s Wall to the Great Wall of China, the Wall of the North, and many more, political walls have been built to discourage or control immigration, White Walkers, and ideologies at state borders since ancient times (though this is certainly not an endorsement). They have been built from stone, earth, wood, and steel, but one thing they all have in common—they are an ineffective strategy for border control, with other damaging side effects.

Still, the Department of Homeland Security (DHS) recently announced that it has waived 25 federal laws in order to expedite construction of the US-Mexico border wall. Aside from being a humanitarian nightmare, this move threatens the environment, wildlife, and culturally and historically significant public lands—and the people whose lives and heritage depend upon these resources. The waiver applies to the 20-mile stretch east of the Santa Teresa, New Mexico port of entry, where existing vehicle barriers will be converted into a steel and concrete bollard wall.

Among the many statutes being set aside for construction are federal heavyweights like the National Environmental Policy Act (NEPA), the Magna Carta of federal environmental law, and the Endangered Species Act (ESA). The waiving of NEPA, ESA, and other laws by the Secretary of Homeland Security, allowed under the REAL ID Act of 2005, is a huge power grab, though not an entirely unprecedented one: DHS has exercised this waiver authority seven times before, five times under President Bush from 2005-2008 and twice already in 2017. Waiving protective statutes, however, should be considered with caution and not wielded as a weapon for political gain. These laws were designed to ensure public health and environmental protection using science-based rules. Together, these safeguards have kept people safe and kept our nation beautiful. By exploiting waivers like this, the administration is throwing away the progress we’ve made in developing science-based laws for the public’s benefit.

For example, Santa Teresa, NM lies in the cultural and natural resource-rich Chihuahuan Desert near the Rio Grande. Disruptions from wall construction will put pressure on this delicate ecosystem, which is already threatened by climate change, agricultural expansion, and population growth. Waiving these acts shows a blatant disregard for public health and culturally important, irreplaceable sites and resources, and impinges on the rights and freedoms of tribal communities to preserve their heritage.

Here are all the different (evidence-based) laws that the wall will trample over:

Environment
  • National Environmental Policy Act—This will allow the administration to forgo an environmental impact assessment on the project.
  • Clean Water Act—This will allow the administration to escape regulation of pollutants discharged into US waters, deflecting requirements for water quality standards.
  • Resource Conservation and Recovery Act—The monitoring and enforcement of solid and hazardous waste generation, transportation, treatment, storage, and disposal standards will be overlooked, meaning this safeguard will be laid to waste (pun intended).
  • Comprehensive Environmental Response, Compensation, and Liability Act (Superfund)—A complement to RCRA (above), CERCLA “authorizes the President to respond to releases of hazardous substances into the environment”. Waiving this statute takes the administration off the hook for reporting and cleaning up releases of hazardous substances during construction activities.
  • Safe Drinking Water Act—This will remove requirements for monitoring contaminants in public water systems, as well as waiving the responsibility for notifying the public of contamination to the water systems.
  • Clean Air Act—Enforcement of air emissions standards from stationary and mobile sources will not be required for the construction project, potentially allowing for unmitigated emissions of hazardous air pollutants.
  • Noise Control Act—This will allow the administration to construct their wall without reducing excess noise pollution from transportation vehicles, equipment, and machinery.
Wildlife
  • Endangered Species Act—This waiver will remove the requirement for the administration to consider special protections for imperiled species and ecosystems in the proposed construction site.
  • Migratory Bird Treaty Act—This will allow for the harming, possession, purchase, sale, bartering, transport, export, and import of any migratory bird, nest, or eggs of migratory birds without the requirement of a federally issued permit.
  • Migratory Bird Conservation Act—This will allow for the purchase or rental of land (or water) in migratory bird habitat without consultation and approval by the Migratory Bird Conservation Commission (MBCC).
  • National Fish and Wildlife Act of 1956—Protections of fisheries and wildlife resources would be waived for the border wall construction. For example, collection of data on the availability and abundance of all wildlife (not just endangered) in the construction area would no longer be required.
  • Fish and Wildlife Coordination Act—This would relieve the requirements for federal agencies to conduct assessments on effects of sewage, waste, and other pollutants on wildlife.
  • Eagle Protection Act—This will allow for the harming, possession, purchase, sale, bartering, transport, export, and import of any bald (or golden) eagles, nest, or eggs of bald (or golden) eagles without the requirement of a federally issued permit.
Land rights
  • Farmland Protection Policy Act—This would allow the administration to construct their wall without requiring them to minimize impact to farmland (and forest land, pastureland, cropland, or other non-urban built-up land).
  • Federal Land Policy and Management Act—This Act establishes the procedures whereby the Bureau of Land Management manages public lands to accommodate multiple uses (for example: livestock grazing, mineral extraction, logging, fishing, hunting, conservation of historical and cultural resources), which won’t need to be adhered to for border wall construction purposes.
Cultural preservation and national heritage
  • National Historic Preservation Act—This will allow the administration to ignore the effects of federally funded projects on historic properties and artifacts.
  • Archaeological Resources Protection Act—This will allow the administration to excavate or remove archeological resources on federal or tribal lands without a permit.
  • Paleontological Resources Preservation Act—This will allow the administration to dismiss protections for non-renewable paleontological resources on federal lands.
  • Federal Cave Resources Protection Act of 1988—The waiver will allow for a lack of specific statutory protections for significant caves that are “an invaluable and irreplaceable part of the nation’s natural heritage.”
  • Archaeological and Historic Preservation Act—Also waived away are requirements for preservation of historical and archaeological data that might be lost or destroyed by federal projects or activities.
  • Native American Graves Protection and Repatriation Act—This will allow for the administration to construct their wall without regard for Native American burial sites. Moreover, consultations with tribes when archaeological investigations encounter or discover Native American cultural items on federal or tribal lands will not be required.
  • American Indian Religious Freedom Act—This will allow the administration to disregard protections of the religious cultural rights and practices of American Indian peoples, particularly by removing access to sacred sites and objects.
  • Antiquities Act—This Act provides protection for any cultural or natural resource, generally. The waiver, much like that of the Archaeological Resources Protection Act, will remove the permitting requirement for excavation activities.
  • Historic Sites, Buildings, and Antiquities Act—This will allow the administration to forego a survey of any historic and archaeologic sites, buildings, and objects existing at the site of construction.
Transparency in rulemaking
  • Administrative Procedure Act—This will allow the administration to skip publishing notices of proposed and final rulemaking in the Federal Register, providing opportunities for public comment on proposed rulemaking, and observing the 30-day delayed effective date.

Actions like this will not only strain US relations with Mexico, but also open the door to further dismissal of important science-based protections, particularly in the name of immigration reform. In fact, in his State of the Union address last week, President Trump announced his plan to update US infrastructure—which will surely include gutting environmental protections in favor of corporate polluters, at the expense of resources vital to nearby communities and wildlife.


How One Utility Is Using Tax Reform to Hide a Billion-Dollar Climate Problem

UCS Blog - The Equation (text only) -

Florida National Guard responds to Hurricane Irma in Flagler Estates, FL. Photo: Ching Oettel, Florida National Guard

Picture this: You’ve just completed a decade of investing about $3 billion of your customers’ dollars into keeping the lights on when severe weather strikes. Now Hurricane Irma has blasted through, 90 percent of your customers have been left in the dark, and the restoration and repair costs you intend to bill them are estimated at $1.3 billion.

That’s right. A storm you’ve spent a decade preparing for is looking like it’ll end up costing nearly half as much as the preparations themselves. Worse, there’s no reason to think it won’t keep happening again, and again, and again, as climate change drives the intensity of these storms ever higher.

Bit of a thorny customer relations problem, that one.

Which is why the usual way of recovering storm restoration costs—as an additional line item on customers’ monthly bills—isn’t exactly “convenient.” It serves as a repeated reminder, and an ongoing invitation to ratepayers and regulators to question why all those billions in investments still result in such high costs.

Enter the late 2017 tax cuts and Florida Power & Light’s (FPL) sweet, sweet sigh of relief.

Magical Money Magically Solves Climate Problem?

Over the past month or so, utilities across the country have been hauled in front of their regulatory commissions and asked about how they intend to funnel tax cut windfalls back to the ratepayer. In turn, plans to reduce customer rates are beginning to emerge.

But FPL? With tax cuts, it saw an open lane and it drove straight to the basket.

Instead of charging ratepayers monthly storm costs, the utility realized it could deploy those freed-up funds to, in effect, wipe away the storm. No monthly charges, no monthly reminders, no more problem.

On its surface, there doesn’t seem to be anything outright wrong with this accounting approach. In fact, it could end up saving ratepayers some amount of money through economic efficiencies—I leave that part to the regulators.

Here’s the problem I do have.

FPL’s experience with Hurricane Irma should have been an eye-opener to every utility, every regulator, every policymaker, every planner—every person—in this country. Here’s a utility that has worked hard on doing better, made big-time ratepayer investments to back those efforts up, and was still left with 90 percent of its customers in the dark, including 12 senior citizens who died as the outages—and ensuing lack of air conditioning—dragged on.

Now overall, many of FPL’s outages from Hurricane Irma were resolved fairly quickly, which is a primary goal of boosting grid resilience. Indeed, it’s not just about minimizing the initial scale and scope of the outage, but also minimizing the time an outage lasts should one come to pass.

The thing is, that relatively quick recovery cost FPL—and thus ratepayers—a staggering $1.3 billion. So even if we did accept the magnitude and duration of the outage given the size of the storm, we’re left with the glaring realization that we may not keep being able to afford the fix. The unsustainability of the plan is even more apparent when we consider that we have every reason to believe these storms won’t be getting any more docile in the years to come, especially in a place like FPL’s territory, where sea level rise is making storm surge worse, rapidly placing ever more critical infrastructure at risk.

According to NOAA’s tracking, billion-dollar weather events in the United States show no sign of slowing down. In 2017 alone, these disasters totaled more than $300 billion in costs.

This storm should result in people asking hard questions. Raising red flags. Demanding dialogue. Saying hey, we might just need to make a change.

But instead what FPL’s proposing to do is kick the can down the road. It’s putting a damper on ratepayer reactions, minimizing run-ins with regulators asking pesky follow-ups, and buying itself a bit more time before the (inevitable) question of if not this, what.

Because let’s face it: there won’t always be a magical tax reform to solve the climate problem. These costs are real, and these questions will come. Ratepayers deserve to have them asked after this $1.3 billion instead of the next.

But for now, it’ll be largely left to just five commissioners (and at least one hurricane-response docket) instead of the ongoing and motivated voices of nearly 5 million constantly reminded customers. Again, that’s not necessarily wrong, just incredibly unhelpful.

The Conversations We Should Be Having

Here’s one thought on a place for the conversation to start: recognizing that if today’s “good enough” isn’t really good enough, then perhaps it’s the approach itself that must change. We see a viable path to the future through the concurrent pursuit of a two-pronged approach:

  1. Steadily increase grid resilience:
    • There’s a lot that can be done to improve the performance of the grid, from replacing wooden poles with reinforced concrete, to burying select power lines, to elevating flood-exposed substations, to installing smart devices across the system—all of which FPL has pursued, at least in part. But there’s also a broader need to shift the whole of the system from one that’s oriented around central generating stations to one that’s powered by ever more decentralized and distributed clean energy resources—and here, FPL has seriously lagged (and dragged much of the rest of the state down with it by not using its heft to push for broader climate and clean energy action).
    • It’s also critically important for everyone to have on hand a shared vision of the future, and a plan for how to get there. That way every time there’s an upgrade or emergency repair, the grid gets smarter, stronger, and one step closer to that future state. Yet we can’t expect utilities and regulators to go it alone—we need the federal government to help in establishing guiding resilience principles and disseminating best practices along the way.
  2. Ensure continuous power for those who can’t afford to go without:
    • It will take time to achieve wholesale grid modernization, and even when we get there, we should assume that the power will still, sometimes, go out. As a result, we must identify those populations and critical services for whom even a brief disruption is too much, and arm them with an electricity Plan B. This must be equitable and just, foremost through engagement with the communities themselves.
    • One top option for keeping the lights on? Ensuring that utilities have allowed a clear path for the individual installation and use of solar panels alongside storage, which can provide users with benefits year-round—not just when the electricity blinks out.

As much as we may not want to face it, preparing for climate impacts will cost real, significant, mind-bending amounts of money. And that means trade-offs. And that means dialogue. Everywhere. Not just with FPL—after all, other Florida utilities quickly jumped to follow FPL’s accounting lead—and not just in Florida (or Texas, or Puerto Rico, or California, or…). Everywhere we need to be asking these hard questions and having these hard conversations, all so that we can begin to set a course to a better tomorrow.

Using accounting maneuvers to mask the true costs of climate preparedness? Not the best conversation starter.

But though we may have lost this prompt, it doesn’t mean the conversation can’t go on.


Conflicts of Interest in the Trump Administration: The Cases of Alex Azar and Brenda Fitzgerald

UCS Blog - The Equation (text only) -

Alex Azar, secretary of Health and Human Services. Photo: Wikimedia

It is slim relief that Brenda Fitzgerald was forced to resign last week as director of the Centers for Disease Control and Prevention. Her final offense in her very short and immoral tenure was investing in tobacco stocks after being appointed in July, according to a Politico report. Before being appointed by the Trump administration, the former Georgia health commissioner had long invested in cigarette companies whose products kill 480,000 Americans a year and 6 million worldwide, according to her own agency.

She was further compromised by investments in drug, insurance, and health diagnostic firms that posed conflicts in dealing with cancer, opioids, and dissemination of health information. She told the New York Times that she was considering renewing CDC ties with Coca-Cola. In Georgia, she was a cheerleader for Coca-Cola’s physical fitness programs. The world’s largest soda company, based in Atlanta, was exposed in 2015 for funding scientists who said America’s obesity crisis was all about exercise, not the excess empty calories from sugary drinks. As documented by the Union of Concerned Scientists, Coca-Cola was following an all-too-common tactic of the corporate disinformation playbook—hiring scientists to produce results that obscure a product’s harm.

But her departure hardly guarantees that we can count on the CDC to protect the nation’s health. For the moment, the acting director is Anne Schuchat, a respected infectious disease expert, known for leading domestic and global response teams against flu viruses in the US and infectious diseases in Africa and China, including Ebola and SARS. A member of the National Academy of Medicine and a rear admiral in the United States Public Health Service, her disease detective work was the model for a lead character Kate Winslet played in the movie “Contagion.”

The ethical conundrum of Alex Azar

It is rare for acting directors, even if immortalized by actresses, to win a full appointment. So who comes next bears serious watching, especially since Alex Azar is the new secretary of Health and Human Services (HHS), the department that oversees the CDC, and he himself is an ethical conundrum.

Azar, a lawyer who clerked for the late conservative Supreme Court Justice Antonin Scalia, and was on George W. Bush’s legal team for the 2000 Florida recount, became general counsel at HHS and ultimately deputy secretary of the department. He left in 2007 to become the top lobbyist for the Eli Lilly pharmaceutical giant and worked his way to the presidency of the company in 2012.

When President Trump nominated him in November to replace Tom Price, who abused taxpayer dollars by traveling by private jet, the president tweeted Azar “will be a star for better healthcare and lower drug prices.” When Azar was sworn in on January 29, Trump, who has repeatedly said that drug companies get away with “murder,” said prices would now “come rocketing down.”

From drug company CEO to people’s champion on drug prices? Unlikely.

There is no evidence to remotely suggest that Azar, the first pharmaceutical executive ever to head HHS, according to the Washington Post, will miraculously transform from drug company CEO into the people’s champion on drug prices. In July, the Indianapolis Business Journal reported that in the last 20 years, while the price of milk went up 23 percent, the cost of a Dodge minivan rose 21 percent and general inflation was 32 percent, the price of Lilly’s insulin drugs Humalog and Humulin skyrocketed by 1,157 percent and nearly 800 percent respectively. A vial of Humalog that cost $21 in 1996 cost $274.70 last summer.

Lilly is a defendant along with global diabetes drug titans Novo Nordisk and Sanofi in a class action price-fixing lawsuit filed last year in federal court in Massachusetts. According to the lawsuit, Lilly’s Humulin shot up 325 percent from just 2010 to 2015, a period covering Azar’s first three years at the helm. Several news stories and guest columns last year featured the difficulty many American diabetics have in affording insulin. One out of every eight American adults has diabetes, and the lower the socioeconomic status, the higher the incidence of the disease.

According to the CDC, diabetes was listed as a cause of death on a quarter million US death certificates in 2015, and the annual direct and indirect cost of diabetes to the nation is roughly a quarter trillion dollars. Several small studies over the last two decades have shown that high percentages of patients admitted to hospitals with life-threatening diabetic ketoacidosis became sick after discontinuing insulin therapy because it was unaffordable.

Lilly blamed the rise in drug prices on other parts of the health care system, but refused to disclose its net prices to the Indianapolis Business Journal. Lilly ranks 132nd in the Fortune 500, with profits last year of $2.7 billion.

Keeping a watchful eye

Pressed in his Senate confirmation hearings on drug pricing, Azar acknowledged that prices were high but offered no major solutions, having opposed the Affordable Care Act and arguing that the government should not have a heavy hand in negotiating drug prices. As average Americans struggle with diabetes drug costs, Azar made $3.6 million in his last year at Lilly in salary and severance. He also sold off $3.4 million in Lilly stock, according to the Associated Press.

Azar’s light hand on out-of-control drug prices merits a very watchful eye over both HHS and the CDC. We need to make sure that our government officials, especially those making decisions about access to health care, advocacy against diseases, and scientific research, aren’t beholden to profitable companies producing drugs, cigarettes, soda, or other health-related products or services. Before her departure, Fitzgerald came under fire when the Washington Post reported that certain words, such as “evidence based” and “diversity,” were being banned from budget requests.

Trump says that under Azar, drug prices will come rocketing down. In actuality, the watch is on to see if Azar instead is another incoming missile from Trump against federal protection of the nation’s health.

Clinton Power Station: Déjà vu Transformer Problems

UCS Blog - All Things Nuclear (text only) -

The Clinton Power Station located 23 miles southeast of Bloomington, Illinois has one General Electric boiling water reactor with a Mark III containment that began operating in 1987.

On December 8, 2013, an electrical fault on a power transformer stopped the flow of electricity to some equipment with the reactor operating near full power. The de-energized equipment caused conditions within the plant to degrade. A few minutes later, the control room operators manually scrammed the reactor per procedures in response to the deteriorating conditions. The NRC dispatched a special inspection team to investigate the cause and its corrective actions.

On December 9, 2017, an electrical fault on a power transformer stopped the flow of electricity to some equipment with the reactor operating near full power. The de-energized equipment caused conditions within the plant to degrade. A few minutes later, the control room operators manually scrammed the reactor per procedures in response to the deteriorating conditions. The NRC dispatched a special inspection team to investigate the cause and its corrective actions. The NRC’s special inspection team issued its report on January 29, 2018.

Same reactor. Same month. Nearly the same day. Same transformer. Same problem. Same outcome. Same NRC response.

Coincidence? Nope. When one does nothing to solve a problem, one invites the problem back. And problems accept the invitations too often.

Setting the Stage(s)

The Clinton reactor was operating near full power on December 8, 2013, and on December 9, 2017. The electricity produced by the main generator (red circle labeled MAIN GEN in Figure 1) at 22 kilovolts (KV) flowed through the main transformers that upped the voltage to 345 KV (345,000 volts) for the transmission lines emanating from the switchyard to carry to residential and industrial customers. Some of the electricity also flowed through the Unit Auxiliary Transformers 1A and 1B that reduced the voltage to 6.9 and 4.16 KV (4,160 volts) for use by plant equipment.

The emergency equipment installed at Clinton to mitigate accidents is subdivided into three divisions, and all of it was in standby mode before each event. The Division 1 emergency equipment is supplied electrical power from 4,160-volt bus 1A1 (shown in red in Figure 1). This safety bus can be powered from the main generator when the unit is online, from the offsite power grid when the unit is offline, or from emergency diesel generator 1A (shown in green) if neither of the other supplies is available. The Division 2 and 3 emergency equipment is similarly supplied power from 4,160-volt buses 1B1 and 1C1 respectively, each with the same three sources of power.

Fig.1 (Source: Clinton Individual Plant Examination Report (1992))

The three buses also provided power to transformers that reduced the voltage down to 480 volts for distribution via the 480-volt buses. For example, 4,160-volt bus 1A1 supplied 480-volt buses A and 1A.

Stage Struck (Twice)

On December 8, 2013, and again on December 9, 2017, an electrical fault on one of the 480-volt auxiliary transformers caused the supply breaker (shown in purple in Figure 2) from 4,160-volt bus 1A1 to open per design. This breaker is normally manually opened and closed by workers to control in-plant power distribution. But this breaker will automatically open to prevent an electrical transient from rippling through the lines to corrupt other equipment.

When the breaker opened, the flow of electricity to 480-volt buses A and 1A stopped, as did the supply of electricity from these 480-volt buses to emergency equipment. It didn’t matter whether electricity from the offsite power grid, the main generator, or emergency diesel generator 1A was supplied to 4,160-volt bus 1A1; no electricity flowed to the 480-volt buses with this electrical breaker open.

Fig. 2 (Source: Clinton Individual Plant Examination Report (1992))

The loss of 480-volt buses A and 1A interrupted the flow of electricity to emergency equipment but did not affect power to non-safety equipment. Consequently, the reactor continued operating near full power.

The emergency equipment powered from 480-volt buses A and 1A included the containment isolation valve on the pipe supplying compressed air to equipment inside the containment building. This valve is designed to fail-safe in the closed position; thus, in response to the loss of power, it closed.

Among the equipment inside containment needing compressed air were the hydraulic control units for the control rod drive (CRD) system (shown in orange in Figure 3). The control rods are positioned using water pistons. Supplying water to one side of a piston while venting water from the other side creates a differential pressure that moves the control rod; reversing which side is supplied and which is vented moves the rod in the opposite direction. Compressed air keeps two scram valves for each control rod closed against coiled springs. Without compressed air pressure, the springs force the scram valves open. When the scram valves open, high-pressure water is supplied below the pistons while water from above the pistons is vented. As a result, the control rods fully insert into the reactor core within a handful of seconds to stop the nuclear chain reaction.

Fig. 3 (Source: Nuclear Regulatory Commission)
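The fail-safe behavior described above can be sketched in a few lines of logic. The 70 psi holding-pressure threshold below is an assumed placeholder, not an actual plant setpoint:

```python
# Sketch of the fail-safe scram valve behavior described above.
# Compressed air holds the scram valves closed against coiled springs,
# so LOSING air pressure opens the valves and inserts the rods.
# The 70 psi threshold is a made-up placeholder, not a plant setpoint.

def scram_valves_open(air_pressure_psi, holding_pressure_psi=70.0):
    """Below the holding pressure, the springs win and the valves open."""
    return air_pressure_psi < holding_pressure_psi

def rods_insert(air_pressure_psi):
    # Open scram valves pressurize the underside of the pistons and vent
    # the top, driving every control rod fully into the core.
    return scram_valves_open(air_pressure_psi)

assert rods_insert(0.0)        # total loss of air: rods insert
assert not rods_insert(100.0)  # normal air pressure: valves stay closed
```

Note that the "safe" state requires no power and no air at all; that is what makes the design fail-safe, and why isolating the air line eventually forced a scram in both events.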

Ten minutes after the electrical breaker opened on December 8, 2013, an alarm in the control room sounded to alert the operators about low pressure in the compressed air system. The operators followed procedures and responded to the alarm by manually scramming the reactor.

Four minutes after the electrical breaker opened on December 9, 2017, an alarm in the control room sounded to alert the operators about low pressure in the compressed air system. Two minutes later, other alarms sounded to inform the operators that some of the control rods were moving into the reactor core. They manually scrammed the reactor. (The timing difference between the two events is explained by the amounts of air leaking from piping inside containment and by the operation of pneumatically controlled components inside containment that depleted air from the isolated piping.)

The event had additional complications. The loss of power disabled (1) the low pressure core spray system, (2) one of the two residual heat removal trains, (3) the reactor core isolation cooling system, and (4) the normal ventilation system for the fuel handling building (the structure on the left side of Figure 3). These losses were to be expected: when the emergency equipment is subdivided into three divisions, losing all power to one division de-energizes about one-third of the emergency equipment.

Fortunately, the loss of some emergency equipment in this case was tolerable because there was no emergency for the equipment to mitigate. The operators used non-safety equipment powered from the offsite grid and some of the emergency equipment from Divisions 2 and 3 to safely shut down the reactor. The operators anticipated that the loss of compressed air to equipment inside containment would eventually cause the main steam isolation valves to close, taking away the normal means of removing decay heat from the reactor core. The operators opened other valves before the main steam isolation valves closed to provide an alternate means of sustaining this heat removal path. About 30 hours after the event began, the operators placed the reactor into a cold shutdown mode, within the time frame established by the plant’s safety studies.

Staging a Repeat Performance

Workers replaced the failed Division 1 transformer following the December 2013 event. Clinton has five safety-related and 24 non-safety-related 4,160-volt to 480-volt transformers, including the one that failed in 2013. After that failure, a plan was developed to install windows in the transformer cabinets so that the temperature of the windings inside could be monitored using infrared detectors. Rising temperatures would indicate winding degradation that could lead to failure of the transformer.

But the planned installation of the infrared detection systems was canceled because the transformers were already equipped with thermocouples that could be used to detect degradation. Then the owner stopped monitoring the transformer thermocouples in 2015.

Plan B (or C?) involved developing a procedure for Doble testing of these 29 transformers that would trend performance and detect degradation. The Doble testing was identified in October 2016 as a Corrective Action to Prevent Recurrence (CAPR) from the 2013 transformer failure event. The Doble testing procedure was issued on November 18, 2016.

Clinton was shut down on May 8, 2017, for a refueling outage. The activities scheduled during the refueling outage included performing the Doble testing on the Division 2 4,160-volt to 480-volt transformers. But that work was canceled because it was estimated to extend the length of the refueling outage by three whole days. So, Clinton restarted on May 29, 2017, without the Doble testing being conducted. As noted by the NRC special inspection team dispatched to Clinton following the repeat event in 2017: “…the inspectors determined that revising the model work orders [i.e., the Doble test procedure] alone was not a CAPR. In order for the CAPR to be considered implemented, the licensee needed to complete actual Doble testing of the transformers.”

The NRC’s special inspection team also identified a glitch with how some of the non-safety-related transformers were handled within the preventative maintenance program. A company procedure required components whose failure would result in a reactor scram to be included in the preventative maintenance program to lessen the likelihood of failures (and more importantly, costly scrams). In response to NRC’s questions, workers stated that three of the non-safety-related transformers could fail and cause a reactor scram, but that these transformers were not covered by the preventative maintenance program.

Plan C (or D?) now calls for replacing all five safety-related transformers: the two Division 2 transformers in 2018 and the single Division 3 transformer in 2021. The two Division 1 transformers have already been replaced following their failures. A decision whether to replace the 24 non-safety-related transformers awaits a determination about seeking a 20-year extension to the reactor’s operating license.

NRC Sanctions

The NRC’s special inspection team identified two findings, both characterized as Green in the agency’s green, white, yellow, and red classification system.

One finding was the violation of 10 CFR Part 50, Appendix B, Criterion XVI, “Corrective Actions,” for failing to implement measures to preclude repetition of a significant condition adverse to quality. Specifically, the fixes identified by the owner following the December 2013 transformer failure were not implemented, allowing the December 2017 transformer failure to occur.

The other finding was the failure to follow procedures for placing equipment within the preventative maintenance program. Per procedure, three of the non-safety-related transformers should have been covered by the preventative maintenance program but were not.

UCS Perspective

Glass half-full: Clinton started operating in 1987 and didn’t experience a 4,160-volt to 480-volt transformer failure until late 2013. Apparently, transformer failures are exceedingly rare events such that lightning won’t strike twice.

Glass half-empty: The transformers at Clinton were all over 25 years old and heading toward, if not already in, the wear-out region of the bathtub curve. Lightning may not strike twice, but an aging jackhammer strikes lots of times (until it breaks).
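The wear-out region of the bathtub curve is the regime where the instantaneous failure rate climbs with age. One common way to illustrate it is a Weibull hazard rate with shape parameter greater than one; the parameters below are made up for illustration, not fitted to any transformer data:

```python
# Illustrative only: a Weibull hazard rate h(t) = (k/lam) * (t/lam)**(k-1)
# with shape k > 1 rises with age, which is the "wear-out" region of the
# bathtub curve. The k and lam values below are invented for the sketch,
# not derived from transformer reliability data.

def weibull_hazard(t_years, k=3.0, lam=40.0):
    return (k / lam) * (t_years / lam) ** (k - 1)

h_new = weibull_hazard(5)    # early in life
h_old = weibull_hazard(30)   # past 25 years of service
assert h_old > h_new         # the failure rate climbs steadily with age
print(f"hazard at 5 y: {h_new:.4f}/y, at 30 y: {h_old:.4f}/y")
```

With these assumed parameters the hazard at 30 years is dozens of times the hazard at 5 years, which is the quantitative version of "an aging jackhammer strikes lots of times."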

Could another untested, unreplaced aging transformer fail at Clinton? You bet your glass.

Fig. 4 (Source: Nuclear Regulatory Commission)

Rigor and Transparency as an Antidote to Politicization at EPA’s Integrated Risk Information System

UCS Blog - The Equation (text only) -

Tina Bahadori, Director of EPA’s National Center for Environmental Assessment (NCEA), and Kristina Thayer, Director of EPA’s Integrated Risk Information System (IRIS), speak about recent improvements to the IRIS program at Thursday’s workshop.

A National Academy of Sciences (NAS) study committee charged with reviewing advances made to the EPA’s National Center for Environment Assessment and its Integrated Risk Information System (IRIS) program met at the NAS headquarters in DC this week. Over a day and a half, IRIS presented the full slate of activities that the program has been engaging in to modernize and improve the ways that the program is completing its hazard assessment and dose response evaluations.

IRIS assessments on environmental contaminants represent the gold standard for chemical toxicity reviews at the federal, state, and local level, and even internationally. These reviews provide a scientific basis for many of the standards set under U.S. environmental statutes, including the Clean Air Act, the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), also known as Superfund, the Clean Water Act, the Resource Conservation and Recovery Act (RCRA), the Toxic Substances Control Act (TSCA), and the Safe Drinking Water Act. IRIS is crucial in helping the agency meet its mission to protect human health and the environment. However, because of this program’s critical role in standard setting for federal and state policy, it is often targeted by industry for criticism and even calls to alter its mission or strike the program altogether.

During Thursday’s meeting, IRIS staff expressed concerns about the program’s ability to complete its work on the expected timelines, given staff attrition (now down to just 30 staffers) and a lack of funding for external contractors to help with its workload, all while trying to meet Administrator Pruitt’s priority for increased efficiency. Over the course of four information sessions and a poster session, IRIS staff systematically addressed the ways in which the program has made targeted improvements to its processes, as NAS recommended in its last review of the program in 2014. Layers upon layers of internal and external peer review and public engagement have been built into the review process using new tools and state-of-the-art methods.

The NAS meeting offered opportunities for public comment at several points, and a series of commenters communicated the value that this program offers, from a nonprofit organization that relies on IRIS assessments as it works to remediate Superfund sites in New Jersey, to community members of LaPlace, Louisiana, who live near a facility emitting chloroprene, a likely carcinogen as determined by IRIS. We cannot afford to have the work of this program diminished or politicized in any way.

My comment in support of the independence and integrity of the IRIS office is below.

Good afternoon, I would like to thank this National Academies study committee for the opportunity to provide this comment today. My name is Genna Reed. I am the science and policy analyst at the Center for Science and Democracy at the Union of Concerned Scientists. The Center for Science and Democracy at UCS advocates for improved transparency and integrity in our democratic institutions, especially those making science-based public policy decisions.

The EPA IRIS program provides a critical scientific service to the public, offering a public searchable database with scientific analyses that inform the decisions that protect us from environmental contaminants.[1] This office is not just important for federal policymaking, but IRIS assessments and associated toxicity values are used by state environmental and public health agencies, as well as community groups, to assess local risks from facilities producing chemicals across the country. This incredibly valuable program must be preserved and protected to conduct its scientific work without political interference. The EPA’s authority to determine the risks posed by hazardous chemicals should not be compromised by interference from other federal agencies or industry stakeholders with vested interests in decision outcomes.

This office has been targeted for political interference in the past. A 2008 U.S. Government Accountability Office (GAO) report found several examples of interference from EPA political appointees, the Office of Management and Budget (OMB), or other agencies to delay or weaken IRIS assessments, including decade-long review processes for naphthalene, formaldehyde, and RDX.[2] A fall 2017 hearing held by the House Committee on Science, Space, and Technology about the integrity of IRIS failed to invite any IRIS staff to talk about the progress of the office.[3] This year, there have even been attempts to defund the program through the appropriations process.[4] Time and time again, the chemical industry has targeted IRIS because new assessments may lead to more stringent standards based on the best available science on a chemical. Now there is the potential for the IRIS program to move under the jurisdiction of the TSCA program, which would limit the ability of the office to develop risk assessments for a range of industrial chemicals, instead forcing it to focus solely on those under TSCA’s authority.[5]

IRIS assessments and their staff provide institutional knowledge and assistance not only to risk assessors within the EPA and its regional offices, but also to public health practitioners in state and local governments. It is critical that the career staff scientists who make up the IRIS office are supported so that they can continue to be a resource for individuals making regulatory decisions about these chemicals. This will allow federal, state, and local decisions to be based on the best available science, using the best methods for systematically evaluating that science. IRIS must continue to be housed in the Office of Research and Development, as opposed to the policy office at the Office of Chemical Safety and Pollution Prevention, because IRIS represents a scientific database that should be prepared by scientific experts. There is no room for political considerations in the work that IRIS staff do. The EPA’s scientific integrity policy explicitly protects the agency’s scientists and their work from political interference or personal motivations,[6] thus NAS should consider what a potential restructuring of the program would mean for its ability to conduct scientific work free from interference.

NAS has acknowledged some of IRIS’ challenges in the past, including the need to improve transparency in communicating risks and decision points to the public, standardize assessments, update methodologies, and regularly train employees. Its most recent report, in 2014, found that IRIS had made impressive strides toward implementing its previous recommendations.[7] The EPA Science Advisory Board found similar results after reviewing IRIS’ progress: in a September 2017 letter, the chair of the EPA’s Science Advisory Board (SAB) commended the agency for its swift improvements to the IRIS program.[8]

At a time when the agency’s staff is shrinking[9] and science advisors are being underutilized,[10] the EPA needs its robust scientific staff to continue the work that has sustained stringent standards at the federal level and beyond. The healthy functioning of the IRIS program will ensure that we continue to have the data to set health-protective limits for hazardous chemicals and ensure public trust that the EPA has our best interests in mind.

Thank you.


[1] Integrated Risk Information System (IRIS). Online at https://cfpub.epa.gov/ncea/iris_drafts/simple_list.cfm, accessed January 30, 2018.

[2] United States Government Accountability Office. Low Productivity and New Interagency Review Process Limit the Usefulness and Credibility of EPA’s Integrated Risk Information System. Report to the Chairman, Committee on Environment and Public Works, U.S. Senate. Report No. GAO-08-440; March 2008. Online at http://www.gao.gov/new.items/d08440.pdf, accessed January 30, 2018.

[3] United States House of Representatives Committee on Science, Space, and Technology, Joint Subcommittee on Environment and Subcommittee on Oversight. 2017. Examining the Scientific and Operational Integrity of EPA’s IRIS Program, Hearing, September 7. Online at https://science.house.gov/legislation/hearings/joint-subcommittee-environment-and-subcommittee-oversight-hearing-examining, accessed January 30, 2018.

[4] Erickson, B.E., C. Hogue, J. Morrison. 2017. Trump EPA to shed chemical programs, grants. Chemical and Engineering News. Online at https://cen.acs.org/articles/95/i17/Trump-EPA-shed-chemical-programs.html, accessed January 30, 2018.

[5] United States Senate Committee on Appropriations. 2017. Summary: FY2018 Interior, Environment Appropriations Chairman’s Mark Released, November 20. Online at www.appropriations.senate.gov/news/minority/summary-fy2018-interior-environment-appropriations-chairmans-mark-released, accessed January 30, 2018.

[6] U.S. Environmental Protection Agency (EPA). 2014. U.S. Environmental Protection Agency Scientific Integrity Policy. Online at www.epa.gov/sites/production/files/2014-02/documents/scientific_integrity_policy_2012.pdf, accessed January 30, 2018.

[7] National Research Council. 2014. Review of EPA’s Integrated Risk Information System (IRIS) Process. Washington, DC: The National Academies Press. doi:10.17226/18764.

[8] Thorne, P.S. 2017. Letter to EPA Administrator E. Scott Pruitt, September 1. Online at https://yosemite.epa.gov/sab/sabproduct.nsf/0/A9A9ACCE42B6AA0E8525818E004CC597/$File/EPA-SAB-17-008.pdf, accessed January 30, 2018.

[9] Friedman, L., M. Affo, and D. Kravitz. 2017. E.P.A. Officials, Disheartened by Agency’s Direction, Are Leaving in Droves. New York Times, December 22. Online at www.nytimes.com/2017/12/22/climate/epa-buyouts-pruitt.html, accessed January 31, 2018.

[10] Reed, G., S. Shulman, P. Hansel, and G. Goldman. 2018. Abandoning Science Advice: One Year In, the Trump Administration is Sidelining Science Advisory Committees. Cambridge, MA: Union of Concerned Scientists. Online at www.ucsusa.org/sites/default/files/attach/2018/01/abandoning-science-advice-full-report.pdf, accessed January 31, 2018.

Your Home vs. Winter: Let these “Game of Thrones” Quotes Be Your Guide

UCS Blog - The Equation (text only) -

Photo: Norbert Stoop

For fans of HBO’s Game of Thrones, the phrase “Winter is Coming” may evoke a myriad of feelings: anticipation, dread, déjà vu…

For those of us in many parts of the US, winter can mean cold days, colder nights, and the higher utility bills to go along with them. So how do we prepare… or deal with the fact that winter is here?

Fortunately, a few choice Game of Thrones quotes—and our book Cooler Smarter: Practical Steps for Low-Carbon Living—can get us where we need to be.

Winter is here. Let’s deal with it.

 

“Nothing burns like the cold.”

As described in Cooler Smarter, for the average American, heating and cooling are second only to transportation in terms of carbon pollution. They can also represent a sizeable chunk of the household budget, accounting for half of energy use in typical U.S. homes, according to the US Department of Energy.

All that means is that anything we can do to make our houses tighter will help with comfort, carbon emissions, and money, in the cold of winter or the heat of summer. The key, says Cooler Smarter,

…is that it’s not just a furnace or an air-conditioning system that keeps you and your family at a comfortable temperature; it’s the whole house. In cold weather, a house functions as a building-sized blanket, offering insulation from the freezing temperatures outside. In hot weather, a home shields you from the worst of the heat and humidity outside.

You can look for opportunities all over, that is—not just where the furnace or boiler is located.

“Knowledge is a weapon, Jon. Arm yourself well before you ride forth to battle.” (Maester Aemon)

As with so many things in life, the first step is knowing where you stand. What do you heat with? How much do you use and spend? How does that compare with others’ energy habits, so that you can get a sense for what sort of opportunities there might be, efficiency-wise?

A few weapons you might turn to for help with that:

Gold and smiley faces. When you get energy efficiency right, the rewards are endless. (Credit: J. Rogers)

  • Your monthly bill – While you can’t necessarily do much about the cost of your energy per unit (kilowatt-hours, therms, gallons), you can see how many units you’re using and how your usage compares with previous winters, to get a sense of whether anything has changed (like a window open in the basement, or the dungeon…).
  • Online calculators – While you’re looking at your data, you can also see how your usage compares with what you might expect to be using, based on rules of thumb. Tools like this one can help with that.
  • Your comps – Even better, since it takes into account whether it’s been colder or warmer than usual, is if your utility shows you how you’re doing against others in your neighborhood with similar conditions (house size, heating fuel). It’s not perfect—maybe you’ve got four kids and a needy puppy dog, and they’ve got none. But it can help you orient yourself, particularly if you can look for changes in your relative standing over time; if you used to perform consistently better than neighbors, and now don’t, that might be a clue that something’s amiss.
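One simple way to make those year-over-year and neighbor comparisons fair is to normalize fuel use by heating degree days (HDD), so a colder winter doesn’t masquerade as a leakier house. A sketch with entirely hypothetical numbers:

```python
# Normalizing heating fuel use by heating degree days (HDD) lets you
# compare winters of different severity. All numbers below are
# hypothetical, purely to show the comparison.

def therms_per_hdd(therms_used, heating_degree_days):
    return therms_used / heating_degree_days

last_winter = therms_per_hdd(therms_used=600, heating_degree_days=5000)
this_winter = therms_per_hdd(therms_used=650, heating_degree_days=6000)

# Raw usage went UP (600 -> 650 therms), but per degree day it went
# DOWN: the house isn't leakier; the winter was just colder.
assert this_winter < last_winter
```

Many utilities publish local degree-day totals, so this check takes only a utility bill and a quick division.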
“Once you’ve accepted your flaws, no one can use them against you.” (Tyrion Lannister)

When you’ve armed yourself as surely as the knights of Westeros (or Dothraki bloodriders across the sea) by seeing what opportunities there might be, it’s time to resolutely dive in.

You can view those opportunities in a few basic buckets: adapting, buttoning, and upgrading.

Adapting (changing how you operate). The easiest, lowest-cost (or no-cost) thing to do is likely to make better use of what you’ve already got.

Part of that is being more conscious about which parts of the house you’re heating (or cooling), and when. If you have the option of heating or cooling only the part of the house you’re using, that can be a fine way of staying comfortable and cutting utility bills.

That goes for the whole house, too, when you’re out for the day, or when you’re nestled all snug in your bed. The easiest way to do that is to not have to think about it: a programmable thermostat can do it automatically, dialing the heat back during the day and after bedtime, and bringing the heat back up in time for dinner or breakfast. (Just be sure to program it!)

Buttoning up your home. Another level of winterizing is helping your house keep you as warm and comfortable as possible with your existing heating system. Per Cooler Smarter:

Depending on how your home is constructed, you may be able to quickly reduce your carbon emissions and save money simply by caulking, sealing, and weatherstripping all seams, cracks, and openings to the outside. In fact, dollar for dollar, plugging these leaks is likely to be one of the most cost-effective energy-saving measures you can take.

Every house is different (and castles are a whole ‘nother kettle of fish), but here’s what leaks look like for the average house:

Source: Cooler Smarter

Replacing windows is a bigger commitment, but what these data suggest is that some caulk, some more insulation, and a few hours some weekend might do you a world of good.

And one thing I’ve learned in my own personal efficiency journey is that the most cost-effective amount of insulation might be much more than you’d think. In northern climes, you might do well to have something like 24 inches of insulation in your attic, if you can swing it.

Upgrading your heating system. The next level. Given that new HVAC equipment can run into the thousands of dollars, this isn’t necessarily something you do lightly. And it’s probably not something you can do unless you own the place.

But if you’re trying to ward off winter’s chill with something that was new when Dwight Eisenhower was in the White House—or even Jimmy Carter or Ronald Reagan—it might be well worth your while to at least look into options. Furnaces have gotten a whole lot better in recent decades, efficiency-wise.

You’ll want to weigh what a new system will cost vs. what it’ll save you in lower utility bills (not to mention added comfort). That involves making some assumptions about where fuel costs are headed, but so does sticking with your old clunker.
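A rough way to frame that weighing is a simple-payback calculation: annual savings scale with how much your new system’s efficiency beats the old one’s. The numbers below (costs, bills, efficiencies) are hypothetical placeholders, assuming the same heat demand and fuel price before and after:

```python
# A rough simple-payback sketch for a furnace upgrade. All numbers are
# hypothetical. Assuming the same heat demand and fuel price,
# annual savings ~= current fuel bill * (1 - old_eff / new_eff).

def simple_payback_years(install_cost, annual_fuel_bill, old_eff, new_eff):
    annual_savings = annual_fuel_bill * (1 - old_eff / new_eff)
    return install_cost / annual_savings

# Example: a decades-old furnace at ~65% efficiency replaced by a ~95%
# efficient condensing unit, with a $1,500/year fuel bill:
years = simple_payback_years(install_cost=5000, annual_fuel_bill=1500,
                             old_eff=0.65, new_eff=0.95)
assert 10 < years < 11   # roughly a 10-to-11-year simple payback
```

Simple payback ignores fuel-price changes and comfort gains, so treat it as a first screen, not a final answer.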

One pretty new option that I’ve gotten excited about is air-source heat pumps that work even in really cold weather, which is a recent innovation (and important where I live).

White Non-Walker (Credit: J. Rogers)

“There’s no shame in fear, my father told me; what matters is how we face it.” (Jon Snow)

These ideas won’t make polar vortexes go away, and they won’t drop utility bills to nothing. But they can help you seize your utility-bill destiny, to save money, increase your comfort, and cut your carbon pollution.

As for games of thrones: I’ve got to admit that I read the first book a while back, but found its don’t-put-me-down-or-else insistence more of a drag on my brain than I could handle. So that’s as far as I’ve gotten (for now).

Besides, I really need to spend less time fighting White Walkers (even vicariously) and more time protecting my family from the weather (or getting outside to enjoy it).

Let winter do what it will. There are heating bills to cut, and snowforts to build.

= = = = =

Some handy resources—because, of course, “One voice may speak you false, but in many there is always truth to be found” (Daenerys Targaryen):

Hat tip to Time and Goodreads for the quote help.

Trump’s Nuclear Posture Review: Top Take-Aways

UCS Blog - All Things Nuclear (text only) -

The Trump administration’s Nuclear Posture Review (NPR), to be released today, lays out a policy that will make the use of nuclear weapons more likely and undercut US security. (The final version is reportedly little changed from the draft version that was leaked two weeks ago.)

It includes a wide range of changes to US nuclear weapons policy and calls for deploying additional types of nuclear weapons. Some of these changes can take place relatively quickly—within the time remaining in President Trump’s term—and others will take years to realize. In the latter case, however, political repercussions could occur well before completion of the effort.

This post looks at some of the near-term changes and consequences. In a future blog, I’ll talk about some of the longer-term implications of the NPR.

  1. Preparing for nuclear war-fighting

One of the most significant changes to US policy outlined in the NPR is the tighter integration of US nuclear and conventional forces, including training and exercising with these integrated forces, so US forces can operate “in the face of adversary nuclear threats and attacks [emphasis added].” The NPR states (line numbers refer to the draft NPR text):

US forces will strengthen their ability to integrate nuclear and non-nuclear military planning and operations. Combatant Commands and Service components will be organized and resourced for this mission, and will plan, train, and exercise to integrate U.S. nuclear and non-nuclear forces and operate in the face of adversary nuclear threats and attacks. (lines 906-910)

The document asserts the new US policy “is not intended to enable, nor does it enable, ‘nuclear war-fighting.’” For a regional conflict, “nuclear war-fighting” refers to using nuclear weapons in an ongoing way once a conventional conflict has expanded to include nuclear weapons.

And if training to use nuclear and conventional forces in an integrated way isn’t preparing for nuclear war-fighting, what is? Russia and China will certainly view it that way, and the exercises themselves will be provocative. The new policy deliberately blurs the line between nuclear and conventional forces and eliminates a clear nuclear fire break. Doing so is not in US security interests.

Low-yield, accurate nuclear weapons are often described as “suited for war-fighting,” and would be an important component of the integrated nuclear and conventional force that the administration is planning for. As discussed below, the administration plans to deploy a new lower yield weapon on submarines. But the United States already has two types of low-yield weapons that it could use as part of an integrated force.

The United States currently deploys 100 B61 bombs in the United States for delivery by long-range bombers, and 150 B61 bombs at US airbases in five NATO countries (Belgium, Germany, Italy, the Netherlands, and Turkey) that would be delivered by pilots from those countries using their short-range aircraft. (Hundreds more are in storage.) These bombs allow the user to choose the yield of the weapon; depending on the variant, the yield ranges from 0.3 to 170 kilotons. The lowest yield of 0.3 kilotons is roughly 50 times smaller than that of the bomb that destroyed Hiroshima, which certainly qualifies it as a war-fighting weapon.

The United States also deploys 200 nuclear air-launched cruise missiles in the United States for delivery by long range bombers. These have variable yields ranging from 5 to 150 kilotons.

With these weapons the US military can begin planning, training and exercising with an integrated force of conventional and nuclear weapons—including low-yield weapons—within a year or two.

  2. Broadening scenarios for using nuclear weapons first

The new policy described in the NPR broadens the scenarios under which the United States would use nuclear weapons first, thus lowering the threshold for first use. The document explicitly lists a wide array of non-nuclear attacks that could constitute grounds for a US nuclear response. These “include, but are not limited to, attacks on the U.S. allied or partner civilian population or infrastructure, and attacks on U.S. or allied nuclear forces, their command and control, or warning and attack assessment capabilities.” (918-920)

Ironically, the Trump NPR makes a very strong case for a no-first-use policy. It states:

Russia must … understand that nuclear first-use, however limited, will fail to achieve its objectives, fundamentally alter the nature of a conflict, and trigger incalculable and intolerable costs for Moscow. Our strategy will ensure Russia understands that any use of nuclear weapons, however limited, is unacceptable. (1055-1059)

Surely, the same is true for the first use of nuclear weapons by the United States. However limited, US nuclear first-use will “fundamentally alter the nature of a conflict, and trigger incalculable and intolerable costs.” Any such use is “unacceptable.”

  3. Deploying new lower-yield submarine-launched weapons

The NPR states that the United States will replace some of the warheads on its submarine-launched Trident ballistic missiles with “low-yield” versions. These warheads would have a yield of roughly five kilotons; for comparison, the W76 and W88 warheads currently deployed on submarines have yields of 100 and 455 kilotons, respectively. Such a low-yield warhead can be produced by modifying an existing two-stage W76 or W88 warhead so that just the first stage explodes, which can be done relatively quickly. These weapons can—and likely will—be deployed during this presidential term.

As noted above, the United States already deploys low-yield bombs and air-launched cruise missiles with yield options that range from 0.3 to 150 kilotons. But the NPR argues that the new weapon will offer several advantages: it will not require “host nation support,” it will provide additional diversity, and it will be able to penetrate defenses. These arguments are spurious. The United States can deliver its bombs and air-launched cruise missiles using long-range bombers based in the United States—these require no host nation support. It is a truism that adding new types of weapons increases diversity, but it is irrelevant. It is also true that a ballistic missile will be able to penetrate defenses (especially since none exist), but this does not give it an advantage over the existing systems. The B-2 stealth bomber is designed to evade sophisticated air defenses, and the air-launched cruise missile can penetrate air defenses.

But the ultimate rationale the NPR gives for the low-yield Trident warhead is that it “will help counter any mistaken perception of an exploitable ‘gap’ in U.S. regional deterrence capabilities.” (392-393) Regardless of what the military thinks US nuclear weapons are deterring other countries from doing, to argue that the current arsenal is inadequate but will become adequate if we throw in a few low-yield Trident warheads is just silly.

  1. Undermining the Nuclear Non-Proliferation Treaty

The NPR describes the Nuclear Non-Proliferation Treaty (NPT) as the cornerstone of the nuclear non-proliferation regime. However, the new US policy undercuts the treaty in several ways:

Ignores NPT obligation to take measures toward nuclear disarmament

While claiming that the United States “continues to abide by its obligations” under the NPT, the NPR ignores the US obligation to take effective measures toward nuclear disarmament. Since the end of the Cold War, the United States has made progress—albeit slow progress—in reducing the number, types, and role of US nuclear weapons. The new policy reverses that progress. The non-nuclear weapon states are already fed up with the slow progress of the United States and Russia, and in response last year they negotiated a treaty banning nuclear weapons. The Trump NPR is a giant slap in their face.

Walks back negative security guarantees

Negative security guarantees—in which the nuclear weapon states assure countries without nuclear weapons that they will not be subject to a nuclear attack—are vital to the NPT.  Such guarantees reduce the incentive for countries to acquire their own nuclear weapons to counter threats from the nuclear weapon states. They were also key to the 1995 decision by the non-nuclear weapon states to extend the treaty indefinitely.

Current US policy is:

The United States will not use or threaten to use nuclear weapons against non-nuclear weapons states that are party to the NPT and in compliance with their nuclear nonproliferation obligations.

The Trump NPR reiterates this policy but follows it with a disclaimer:

Given the potential of significant non-nuclear strategic attacks, the United States reserves the right to make any adjustments in the assurance that may be warranted by the evolution and proliferation of non-nuclear strategic attack technologies and U.S. capabilities to counter that threat. (924-927)

In other words, don’t count on it.

Rejects CTBT Ratification

For 50 years now, the NPT non-nuclear weapon states have made it clear that they place high importance on achieving a treaty prohibiting nuclear explosive testing. The preamble to the 1968 NPT discusses the imperative of negotiating such a treaty, and when the non-nuclear weapon states agreed to indefinitely extend the NPT in 1995, it was predicated on their understanding that the Comprehensive Test Ban Treaty (CTBT) was near completion. The CTBT was opened for signature in 1996. The United States has signed, but not ratified, the treaty. More than 20 years later, the treaty has still not entered into force, in part because the United States has not ratified it.

In another slap in the face of the non-nuclear weapon states, the NPR explicitly states, “the United States does not support ratification of the Comprehensive Nuclear Test Ban Treaty.” (529)

Stay tuned for future blogs on the new US policy!

Pruitt’s EPA Attempts to Undermine California’s Leadership on Vehicle Standards

UCS Blog - The Equation (text only) -

The current EPA administration has repeatedly mischaracterized California’s authority and role when it comes to vehicle emissions standards—here is what that really means for California and the country writ large.

For the current Administration, “One National Program” means “One WEAKER Program”

On Tuesday of this week, Scott Pruitt responded to questions from Senators Tom Carper (D-DE) and Ed Markey (D-MA) regarding vehicle emissions standards, declaring to a Senate Committee Hearing on EPA Oversight that “a national program is essential.” Yet in December, he declared that part of the ongoing midterm review of those standards could be revoking California’s waiver to maintain the current standards through 2025.

Similarly, last Thursday Assistant EPA Administrator Bill Wehrum, who leads the Office of Air and Radiation in charge of setting vehicle emissions standards, maintained that “[t]he overarching goal of [ongoing conversations with California] is to maintain or retain one national program,” yet noted in the same line of questioning that the talks were held “with the intention and the goal of trying to achieve agreement as to whether changes should be made to the current (federal) standards.”

Just a heads-up to the EPA:  California already determined that the current standards remain appropriate.  And for that matter, the previous administration did so as well.  States that follow California’s program agree, which is why many have already intervened against any weakening.  So, if the current Administration wishes to make any changes to the program, it is they who are tampering with “One National Program.”  So much for that “essential” element, I guess!

Why does California get to set its own standards?

California was the first area of the country to encounter the problem of automotive pollution, and it was also the first to take action.  Yet in doing so it encountered not just resistance from the auto industry to regulation of those emissions, but denial by the industry that such emissions were even a problem, and collusion among manufacturers to curtail the invention and adoption of emissions control devices, which I detailed in our report Time for a U-turn.


Fighting California’s standards is just one event in the timeline of automakers’ resistance to regulations.

When Congress finally acted to regulate emissions from vehicles, it carved out an exemption for California to set its own, stronger emissions standards, recognizing the state’s past leadership on the issue.  The auto industry, again, fought this exemption.  The Clean Air Act maintained it, and later amendments expanded the provision: not only could California request a waiver to set standards stronger than those set by the federal government, but any other state could adopt California’s stronger standards.

Today, 12 other states and the District of Columbia have adopted California’s passenger vehicle emissions standards; altogether, one-third of the market for passenger cars and trucks has committed to California’s standards.

The need for this leadership is critical.  Despite continued progress on mobile source emissions, California continues to be home to significant air quality problems, and the state is on the front line of the fight against climate change, having already been affected by an extended wildfire season and severe drought amplified by global warming.

This map from the BlueGreen Alliance depicts facilities around the country that manufacture technologies to improve vehicle efficiency, showing the breadth of investment in strong standards.  These facilities employ over 288,000 workers and are spread across 48 states.

Walking back from strong standards cedes U.S. leadership

Administrator Pruitt mistakenly characterized California’s leadership as some sort of authoritarian rule (“Federalism doesn’t mean that one state can dictate to the rest of the country.”), when the state is simply protecting its inhabitants.  This characterization also misses the point of setting a high bar for the country as a whole.

When the current EPA administration talks about changing the standards already finalized by California and the previous administration, it does so at the peril of investment in innovation in the United States.  The European Union is moving forward with stronger standards, regardless of what happens in the U.S.  Some countries are even looking past internal combustion engines altogether.  China, too, is setting both strong emissions targets and its own zero emission vehicles goals.

When other countries set these strong targets, the U.S. is ceding its leadership to those regions, and with it, a greener economic future.  Ford, for example, is betting heavily on an electrified future…in China.  Those are investments in next-generation vehicles that are being built abroad instead of in North America.  Volkswagen, General Motors, and others are all following suit.  Automotive suppliers will move to those more advanced markets, too.

Manufacturers can meet the 2025 standards that were affirmed last year—California knows it, and frankly so do the auto companies.  If the administration is serious about a “data-driven” mid-term review, 1) we already had one of those and 2) we know it comes down on the side of strong standards.  If instead the administration undermines the data with political hullabaloo, California and the states that have already finalized the adoption of the current 2025 standards may end up being the backstop the rest of the country needs to ensure we don’t lose out on the jobs, fuel savings, and emissions reductions that strong standards provide.

We have One National Program now—if the EPA chooses to undo that by weakening the federal standards, it is Administrator Pruitt who will be responsible for unraveling the cost-effective, unified program that protects consumers and the public today, a program in place in large part due to strong state leadership.

BlueGreen Alliance and NRDC

China and Trump’s Nuclear Posture Review

UCS Blog - All Things Nuclear (text only) -

Chinese Vice Premier and Foreign Minister Qian Qichen signs the Comprehensive Nuclear Test Ban Treaty (CTBT) on September 24, 1996.

The Trump administration’s Nuclear Posture Review (NPR) repeats one of the most pervasive misconceptions about the current state of the US nuclear arsenal: that it does not compare well with the nuclear arsenals of Russia and China, which are supposedly engaged in nuclear modernization efforts the United States is neglecting.

China is making steady incremental improvements to its nuclear arsenal. But the gap between China and the United States is too wide to argue the United States is lagging behind in any meaningful way. We’ve laid out the details in a new white paper.

A Quick Comparison

China’s nuclear force is much smaller and far less capable than the nuclear force of the United States. Consider the following:

  • China’s nuclear arsenal is smaller than the US nuclear arsenal was in 1950.
  • China has a few hundred nuclear warheads and enough weapons-grade plutonium to make only several hundred more. The United States has 4,480 nuclear warheads (active and reserve) and enough weapons-grade plutonium to make approximately 5,000 more.
  • China conducted 45 nuclear weapons tests to develop and certify the nuclear warheads it has in its arsenal today. The United States conducted 1,056 nuclear weapons tests.
  • China can deliver 75 to 100 nuclear warheads to targets in the United States via ground-based intercontinental ballistic missiles (ICBMs).  The United States currently deploys 400 ICBMs and has another 400 nuclear warheads it could put on those ICBMs.
  • China does not currently deploy any nuclear weapons aboard ballistic missile submarines, although it could possibly deliver 60 nuclear warheads to targets in the United States aboard the five submarines it will have when the fifth one, currently under construction, is completed. The United States currently deploys about 900 nuclear warheads on ballistic missile submarines and its 248 missiles could carry as many as 2,976.

A Limited Force for a Limited Purpose

Despite the enormous disparity between Chinese and US nuclear forces, the leaked NPR about to be released by the Trump administration claims the United States needs new nuclear weapons because “China is expanding and modernizing its considerable nuclear forces” and because China “pursues entirely new nuclear capabilities tailored to achieve particular national security objectives.” The new NPR also expresses concern about the “increasing prominence” of nuclear weapons in Chinese defense policy, including possible Chinese first use of nuclear weapons.

There is little evidence China is pursuing “entirely new” nuclear capabilities.

The NPR implies that China’s ability to put multiple warheads on its silo-based ICBMs, its ability to deploy ballistic missile submarines, and its ability to deliver nuclear weapons by aircraft are new. That needs to be considered in context.

China has had the ability to put multiple warheads on its largest silo-based ICBM for decades. It only recently did so with some of those ICBMs, adding a total of 20 warheads. Adding warheads to the rest of these ICBMs would add only another 20. So the decision to utilize this capability does allow a modest increase in the number of warheads China can deliver to the United States, but it is a small increase, and it is misleading to characterize it as an “entirely new” capability. For comparison, the United States deployed its first ICBM with multiple warheads in 1970.

The same is true for China’s ballistic missile submarines and bombers. China has had the capability to put nuclear-armed ballistic missiles on submarines for quite a while. It commissioned its first ballistic missile submarine in 1981. It began conducting sea trials of the submarine class it is building today in 2006. It has still not actively deployed them.

China does have a new nuclear-capable air-launched cruise missile, but US intelligence sources state that it does not currently have a nuclear mission.

There is little compelling evidence that nuclear weapons are more prominent in China’s military strategy or that China intends to use nuclear weapons first.

Authoritative Chinese military sources state that the only national security objective China aims to achieve with its small nuclear force is to maintain an ability to retaliate if another state launches a nuclear attack against China first. Those same sources also confirm China remains committed to its longstanding policy of not using nuclear weapons first.

The limited size and capabilities of China’s nuclear force lend credibility to Chinese statements about the limited role of nuclear weapons in its military strategy.

Of course, China has been incrementally improving the quality and increasing the quantity of its nuclear forces since its first test of a nuclear-armed missile in 1966. The pace of these improvements has been steady but slow, especially when compared with the growth of China’s economy. As noted above, after a half-century of continuous incremental “modernization,” China’s nuclear arsenal remains smaller than the US nuclear arsenal was in 1950.

How to Keep China’s Nuclear Force Small and Limited

President Trump and many members of Congress from both parties seem to believe the United States is in a new nuclear arms race with China. There is no evidence China is engaged in a substantive build-up of its nuclear forces. But even so, for those who are concerned, the best thing the United States can do to win this hypothetical nuclear arms race with China is to limit China’s ability to build new warheads.

China cannot dramatically enlarge its nuclear force without producing more weapons-grade plutonium. And China cannot develop new lighter, variable-yield or low-yield nuclear warheads—like the United States already possesses—without resuming nuclear testing.  It stands to reason, therefore, that US and allied officials concerned about the future size and capabilities of China’s nuclear arsenal should take every measure possible to prevent China from producing more fissile material for nuclear weapons and from testing new nuclear warheads.

For the moment, China says it is still willing to negotiate a fissile material cutoff treaty (FMCT) that would verifiably ban new production of fissile material for nuclear weapons.

In addition, China stopped nuclear testing in 1996 and signed the Comprehensive Test Ban Treaty (CTBT). Chinese nuclear arms control experts say their government is still willing to permanently end nuclear testing and ratify the CTBT as soon as the United States does. Entry into force of the CTBT would verifiably ban China from testing new nuclear warheads.

The Trump administration’s plan to develop and deploy new nuclear weapons does nothing to prevent China from expanding its nuclear forces. However, ratifying the CTBT and beginning negotiations on the FMCT would cap the size of China’s nuclear arsenal at its current level. Working towards the entry into force of these two arms control treaties, then, should be the top two priorities for anyone genuinely concerned about the future size and capability of China’s nuclear forces.
