Combined UCS Blogs

Hearing from the Scientists Who Rely on Sea Grant

UCS Blog - The Equation (text only) -

I can pinpoint my passion for marine conservation to a childhood full of opportunities to experience the wonders of nature and grounded in a deep appreciation for the ocean and fishing culture. This is why I have chosen to devote my life to ensuring these natural resources are around to inspire future generations.

However, the budget proposal released by the White House this week has made it clear that supporting scientists like me is not a priority. Governmental agencies that employ my respected colleagues, fellowships that helped me get through graduate school, and research programs that I rely on to do my job are all on the chopping block.

Among the worst of the proposed budget cuts is the complete elimination of Sea Grant. Sea Grant excels as a conduit between the scientists and the stakeholders in coastal areas who have real problems to solve. Integral to Sea Grant’s mission to promote integrated and applicable research is its commitment to the next generation of scientists. Sea Grant is a major source of fellowships for coastal science graduate students. While I personally was not funded through Sea Grant (I had EPA funding, which is also eliminated under the proposed budget), I have many colleagues and friends who benefitted from Sea Grant support as they began their careers. I interviewed a few for this post about the value of Sea Grant to their careers, to the environment, and to science in general.

Training the next generation of scientists

Tidal pools in Newport, OR.

For many young scientists, opportunities through Sea Grant are a path to a career in science that can really make a difference. Theresa Davenport, a marine scientist and a recent graduate of the Virginia Institute of Marine Science, was part of Sea Grant’s incredibly successful Knauss Marine Policy Fellowship program.

“The Knauss Fellowship’s hallmark is to take subject matter experts and provide them with experience and training to become globally engaged knowledge experts and leaders working at the intersection of academia, private citizens, industry and government,” said Theresa.

Knauss fellows are placed in federal legislative and executive offices in Washington, D.C. In many cases, these fellows are the only source of science expertise in their offices, and the value of these young scientists to the American public is incalculable. For example, Theresa helped develop a restoration monitoring and adaptive management plan for the Deepwater Horizon oil spill recovery. In fact, she mentioned that her team on this crucial project was made up mostly of Sea Grant fellows or folks who had previously been involved in the Knauss fellowship program. She said this is not out of the ordinary.

“It would be interesting to compile the number of Sea Grant fellows involved in the two largest US environmental disaster responses in the last 10 years.” She is referring to the Deepwater Horizon oil spill and Hurricane Sandy, and she expects Sea Grant fellows played a large role in both cases.

Science informing policy

The benefits of funding early career scientists continue long after the fellowship ends. Introducing scientists directly to problems that can benefit from their unique gifts and knowledge ensures that they will be problem solvers. For Dr. Allison Colden, another graduate of the Virginia Institute of Marine Science, a Sea Grant fellowship was an important step to a career in conservation.

“As a former Sea Grant Knauss Marine Policy Fellow, I gained valuable experience in interpreting cutting-edge science into public policy, a skill that I now use daily at a leading environmental non-profit,” she said.

She sees Sea Grant playing an important role in solving many of the problems facing the world today.

“Sea Grant is vital to ensuring the continued prosperity and resilience of our nation’s coastal communities by connecting managers and stakeholders with innovative science to create viable solutions for the future,” said Allison. “Cuts to Sea Grant sever a critical link in the science-policy chain, undermining the social, economic, and ecological resilience of coastal communities in a time when it is needed most.”

Scientists are increasingly facing the burden of making the connection between research and impacts, and Sea Grant has been making that connection for nearly 50 years. We should be expanding, not gutting, programs that bring together academia, private citizens, industry, and government, and programs that inspire young scientists to build solutions to the challenges we face. This is the best way for society to achieve a healthier, safer, more sustainable future for all people.

 

Dr. Cassandra Glaspie is a postdoctoral scholar at Oregon State University in the Fisheries and Wildlife Department. Originally from Waterford, Michigan, Cassandra received her B.S. in Zoology from Michigan State University and her PhD in Marine Science from the Virginia Institute of Marine Science. Cassandra is passionate about the environment and the ocean, and her research involves marine food webs and predator-prey interactions, especially as they relate to changes in the environment. In Oregon, she studies climate-related changes in ocean habitat quality for ecologically and economically important fish such as Chinook salmon and albacore tuna. A resident of Corvallis, Cassandra is an advocate for local climate action and works with the Corvallis chapter of the Sierra Club to educate the community on issues related to climate change and sustainability initiatives.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

As the White House Fixates on Coal, Renewable Energy Goes Local

UCS Blog - The Equation (text only) -

The Trump Administration’s energy communications sound increasingly tone deaf these days.

The Department of Energy released a graphic last week that highlights six facts we may not know about coal, as if cheerleading the coal industry will minimize the fact that coal-fired electricity is the largest source of global greenhouse gas emissions and a significant contributor to air pollution that makes us sick.

Compared to where many states and cities across this country are headed, the focus on coal is at best nostalgic and misguided, and at worst desperate and dangerous. There is no question: coal is on the decline.

Contrast that silly graphic with a new report, also released last week, indicating that the transition to clean energy is picking up speed across the country. The 2017 Clean Tech Leadership Index report by Clean Edge ranks activities and investments in the clean-tech space (think electric vehicle adoption and investments in energy efficiency and renewables). Wind and solar added almost 17 gigawatts of new generating capacity in 2016, making up more than half (61 percent, to be exact) of all new electricity generation capacity installed in the US for the second year in a row.

Three states (Iowa, South Dakota, and Kansas) generate at least 30 percent of their electricity from renewables, and three others, including Oklahoma and California, get at least 20 percent.

Perhaps most surprising and exciting is the number of cities across the country that are investing in clean electricity and transportation, and benefiting from the jobs and capital that come with it. The California cities of San Francisco, San Jose, San Diego, and Los Angeles were standouts in the report’s top-ten metro rankings. Others making it into the overall top ten were Washington, D.C., Portland, Boston, Seattle, Salt Lake City, and Austin.

But a quick look at the map below shows that clean energy leadership is confined neither to the coasts nor to blue states.

Source: Clean Edge, Inc.

Evidence of California’s clean energy leadership was on full display last week when the state broke two new renewable energy generation records. On Tuesday, May 16, renewable energy supplied an all-time high of 41 percent of total electricity demand for the day, and on Saturday, May 13, more than two-thirds of demand was satisfied by renewables during the 2pm hour.

The graph below from the California Independent System Operator (CAISO) shows how much of the state’s electricity came from renewables for each hour of the day on the 16th.

Hourly production in CAISO footprint for May 16, 2017. Source: CAISO Renewables Watch

The Clean Edge report is just the latest proof that cities and states around the country are setting their sights on clean energy, despite the Trump Administration’s misguided affection for coal. Our recent analysis, Clean Energy Momentum: Ranking State Progress, offers further confirmation of that (encouraging) trend. California continues to blaze ahead and break new records, but many other areas of the country are picking up speed.

Photo: Black Rock Solar/CC BY 2.0, Flickr

Lamar Smith and Selective Transparency: Why I’ll Be Livetweeting the EPA Scientific Integrity Stakeholder Meeting

UCS Blog - The Equation (text only) -

For the past few years, the Environmental Protection Agency has held a meeting with outside groups to discuss its annual scientific integrity report. All kinds of organizations have attended in the past, from the American Chemistry Council (which represents chemical companies) to the American Association for the Advancement of Science (which represents scientists) to the American Lung Association (which represents people who breathe). They’re all invited again to this year’s meeting on June 14.

The EPA-produced report describes the actions the agency has taken under the EPA scientific integrity policy over the previous year. The meeting is an opportunity for organizations to ask questions about the report, to give feedback to the agency, and to identify new or emerging challenges. It’s not a perfect process, and the agency gets criticism from all sides (including UCS). But it’s an impressive attempt to reach out to the agency’s stakeholders. To my knowledge, no other agency or department does this.

Transparency is important to holding government accountable. It just shouldn’t be selective.

Yesterday, House Science Committee Chairman Smith sent a letter to EPA Administrator Scott Pruitt expressing concern about this meeting. As rat smellers go, he doesn’t exactly have the best nose, but he smells a rat. Chairman Smith seems to be trying to drum up controversy about the meeting, as he explicitly objects to some of the invitees (including me), and is calling for the agency to make it open to the public.

I wholeheartedly agree. So on June 14th at 3:00pm, I’ll begin livetweeting the EPA scientific integrity meeting. You can follow along at @halpsci. It’s usually a fairly humdrum affair, so I can’t promise everything will be interesting (although these days, let’s face it, everything at the EPA has some fireworks). But I can promise it will be transparent, and I will make at least a couple of attempts to be funny.

Better yet, I’d encourage the agency to put it on Periscope, or livestream it on Facebook, so the Chairman and his staff and anyone with an Internet connection can hear every question posed about an agency under siege by an administrator who is hostile to the science that it creates and communicates. That is, if the EPA still has the budget to pay for enough Internet bandwidth by the time the meeting happens.

Real talk about transparency

Chairman Smith claims to care about transparency. So let’s talk about that. Here is the type of transparency we deserve:

I’d like to know whether anyone from the Obama administration or the oil and gas industry influenced language in the EPA’s press release and executive summary about the impact of fracking on drinking water. Unfortunately, our FOIAs to help answer that question came back heavily redacted.

I’d like to know who from the chemical industry met or communicated with the EPA in advance of Administrator Pruitt’s decision to reject scientific advice and keep a dangerous pesticide on the market that has been shown to hurt endangered species and harm human brain development. Perhaps any such meetings or phone calls, too, should have been open to the public. They should at least have transcripts and recordings. While we’re at it, maybe these meetings should include representatives from groups like the American Public Health Association. Maybe they should even include some independent scientists with expertise in the impact of the chemicals on developing brains!

I’ll be livetweeting the EPA’s scientific integrity stakeholder meeting on June 14 at 3pm. Follow me @halpsci.

I’d like many more EPA meetings to be made open to the public. Every time they meet with lobbyists from an industry trade group or the U.S. Chamber of Commerce, I want it broadcast, live. I want to see Snapchat stories about industry input. I’d like to know, for example, exactly what was discussed at two consecutive days of meetings in North Carolina between EPA officials and representatives from the American Petroleum Institute on April 19 and 20. It could be completely innocuous, but I want to know.

So where the deuce are the chairman’s letters for any of that? Selective transparency does not suggest good faith.

Of course Chairman Smith’s letter to Pruitt is absurd, even by the ever-downward-spiraling standards of the House Science Committee. It is a clear attempt to cast doubt on the work of the agency’s scientific integrity office and thereby weaken its credibility and investigative authority. This attempt should be roundly rejected.

I know Dr. Francesca Grifo, the EPA’s Scientific Integrity Official, quite well. And I know that she travels all around the country talking about scientific integrity with anyone who will listen. She has given scores of presentations about the agency’s scientific integrity policy. She meets with environmental groups. She meets with industry organizations. She will fully investigate scientific integrity complaints from anyone who files them.

So I’m looking forward to June 14th, and can’t wait to share all of the details. It could possibly be the most interesting meeting yet.

If It Ain’t Broke, Defund It: Trump’s Budget Writes Off SNAP—and With It His Supporters

UCS Blog - The Equation (text only) -

Today, the Trump administration released a budget proposal for FY 2018 that would drastically reduce funding for SNAP—the largest nutrition assistance program in the federal safety net—by a full 28 percent over the course of the next ten years. This amounts to a $193 billion cut from a program with a yearly budget of less than $75 billion.

Despite statements just last week from Secretary of Agriculture Sonny Perdue, who claimed no knowledge of proposed changes to SNAP, the budget proposal states that the drastic reductions in SNAP funding will be achieved through measures that expand work requirements, narrow eligibility, and establish a matching component for states to cover a portion of benefits—a potential first step toward block granting the program.

The preliminary “skinny budget” released by the administration in March, which called for a 21 percent reduction in USDA funding for discretionary programs, hinted at the direction, but not the magnitude of cuts to come for means-tested programs like SNAP. Now, with dollar signs and decimal points to demarcate the damage, the final budget proposal confirms a caustic indifference to the needs of millions of rural and urban families served by the agency’s cornerstone programs—and with it, a callous betrayal of a key segment of President Trump’s own voter base. The consequences of slashing funding for SNAP, compounded by equally significant cuts for programs like Medicaid to the tune of over $800 billion over 10 years, pose a very real and significant threat not only to the core function of our federal safety net, but to the backbone of our nation itself.

We can’t afford cuts to SNAP funding

SNAP provides support to 21 million American households in both urban and rural areas, lifting families out of poverty, reducing food insecurity, and improving long-term health outcomes. Put simply, SNAP works. It is one of the most effective federal assistance programs we have, and it operates with one of the lowest fraud rates. In 2014, the benefits provided by SNAP lifted an estimated 4.7 million people out of poverty—including 2.1 million children. In fact, about four in ten SNAP recipients are children.

Research is clear about the devastating consequences facing kids who don’t get enough to eat: they experience poorer health, incur higher medical expenses, and achieve less in the classroom and beyond. Reducing the amount of funding available to SNAP recipients by 25 percent is equivalent to removing a critical source of support for the growth and development of over half a million kids. How does the master plan to make America great (there is one… right?) compensate for the lost potential of a half million of its youth?

And kids aren’t the only ones who benefit from SNAP. Data shows that the program reduces food insecurity rates by 30 percent among participating households, which means fewer serious health complications and hospitalizations for adults living with diabetes. SNAP-Ed also plays an important role in promoting health and helping low-income families achieve healthy diets: evidence-based nutrition education programs funded through SNAP have yielded increases in fruit and vegetable consumption and greater physical activity levels among adults, with estimates of $10 saved in overall long-term health care costs for every dollar invested.

Research also suggests that SNAP expenditures act as economic stimuli, with every five dollars in new benefits generating as much as nine dollars in economic activity. This function is particularly important during times of economic downturn, as benefits redeemed contribute to both the economic stability of participating households and their broader communities.

Cultural elitism, institutional racism, and a dash of alternative facts

If you remember only two things from this post, let them be these brief and breathtakingly true facts: SNAP already has work requirements in place. And most SNAP participants who can work, do work. To be eligible for SNAP benefits, program regulations require that able-bodied adults without dependents either work or participate in a work program for at least twenty hours per week. SNAP users may also be required to attend state-assigned employment and training programs. If they don’t meet work requirements within three months of enrolling, benefits are terminated and can’t be reinstated for a 36-month period.

Which brings us back to extraordinarily accurate fact number two: Most SNAP participants who can work, do work. USDA data shows that approximately 64 percent of SNAP participants are children, elderly, or disabled; 22 percent work full time, are caretakers, or participate in a training program; and only 14 percent are working less than 30 hours per week, are unemployed, or are registered for work. Moreover, among households with adults who are able to work, over three quarters of adults held a job in the year before or after receiving SNAP—meaning the program is effectively helping families fill temporary gaps in employment.

So if work requirements already exist, and most able-bodied adults are working…why are we still talking about stronger work requirements? We can attribute this in part to a dangerous and deeply rooted political narrative that has for decades cast a light of suspicion and mistrust on welfare recipients—particularly those of color—by painting them as lazy and deceitful in the public eye. And if you believe that we no longer suffer the aftershocks of the mythical Reagan-era welfare queens, recall that just three short years ago, current House Speaker Paul Ryan delivered a radio interview in which he raised concerns about a perceived “tailspin of culture, in our inner cities in particular, of men not working and just generations of men not even thinking about working or learning the value and the culture of work.” Ryan’s own constituents were quick to point out that his comments amounted to thinly veiled code language for “black men.”

Is there good news? Tell me there’s good news

There’s good news. Here it is: Trump doesn’t have the final say on the budget. Congress does, and there are strong indications that bipartisan opposition to proposed agency and program funding cuts has only gained momentum since the release of the preliminary budget. House Agriculture Committee chair Mike Conaway has called the proposed budget wrongheaded, while ranking member Collin Peterson confidently stated that the preliminary budget would be ignored, “as it should be.”

But it is critical not to mistake broad dissatisfaction with the president’s budget priorities for a commitment to protecting public assistance programs—particularly SNAP. As the farm bill program with the largest price tag, SNAP is, and will remain, a glaring target for those seeking areas to cut federal spending. At UCS, we will continue our work to provide sound scientific evidence demonstrating the long-term health impacts and cost savings generated by investments in SNAP, while countering political narratives that propagate harmful stereotypes about program participants and diminish public support for critical federal assistance programs. We can’t allow politics and ideology to seal the fate of the federal safety net—the stakes are simply too high.

Photo: wisley/CC BY SA (Flickr)

Department of Interior Censors USGS Press Release on Climate Change, Flooding, and Sea Level Rise

UCS Blog - The Equation (text only) -

Late yesterday, the Washington Post reported that the United States Geological Survey deleted a sentence acknowledging the link between climate change and sea level rise from an official agency press release. The USGS describes itself as the sole science agency for the Department of Interior. UCS will today formally ask the department to investigate the deletion as a violation of departmental policy.

It’s not the first time that an administration has removed scientific information it doesn’t like from agency communications. What’s different now is that scientists are not taking it quietly. “It’s a crime against the American people,” study co-author Neil Frazer told the Post.

The press release announced a new scientific paper in Nature, co-authored by two USGS scientists, which finds that coastal flooding frequency could double in the tropics by mid-century because of global warming. The paper’s abstract leads with the link between climate change and sea level rise. “Global climate change drives sea-level rise, increasing the frequency of coastal flooding,” reads its first sentence. The science was already clear: global warming is the primary cause of current sea level rise.

The press release, however, intentionally omitted this fact; the deletion reportedly came at the request of the Department of Interior, in which USGS is housed. “I disagree with the decision from the upper administration to delete it, not with the scientists who deleted it at the administration’s request,” said study co-author Chip Fletcher.

Censorship violates multiple policies

The changes may violate multiple government policies (emphasis added):

  1. The USGS Scientific Integrity Policy states that the agency “will not tolerate loss of integrity in the performance, use, or communication of scientific activities and their results.”
  2. The DOI Communications Policy requires the office of communications to ensure the accuracy of press releases and other public communications by providing the materials “for review prior to release by scientists, scholars, engineers and other subject matter experts.” Public affairs officials may not “alter the substance of scientific, scholarly and technical information.”
  3. The DOI Scientific Integrity Policy prohibits agency decision makers from “engag[ing] in dishonesty, fraud, misrepresentation, coercive manipulation, censorship, or other misconduct that alters the content, veracity, or meaning or that may affect the planning, conduct, reporting, or use of scientific activities.”

These policies were developed and strengthened by the Department of Interior after repeated scandals where science was censored or rewritten by political appointees. USGS is not a regulatory agency, however, and has traditionally enjoyed much more independence than other agencies within Interior.

Holding the Trump administration accountable

It has become clear that, when they can, scientists will speak up about political interference in science. But it’s a lot easier for a university researcher like Dr. Fletcher or Dr. Frazer to do so than for a government scientist.

Government scientists can file complaints under their agency scientific integrity policies when they witness political interference in the conduct or communication of science. Those who don’t feel comfortable doing so can securely share information with UCS, and we will work with reporters, Congress, inspectors general, and others to investigate and expose any wrongdoing. Together, we can raise the political price for the manipulation or suppression of science.

TVA’s Nuclear Allegators

UCS Blog - All Things Nuclear (text only) -

The Nuclear Regulatory Commission (NRC) receives reports about potential safety problems from plant workers, the public, members of the news media, and elected officials. The NRC calls these potential safety problems allegations, making the sources allegators. In the five years between 2012 and 2016, the NRC received 450 to 600 allegations each year. The majority of the allegations involve the nuclear power reactors licensed by the NRC.

Fig. 1 (Source: Nuclear Regulatory Commission)

While the allegations received by the NRC about nuclear power reactors cover a wide range of issues, nearly half involve either chilled work environments, where workers don’t feel free to raise concerns, or discrimination by management against workers who have raised them.

Fig. 2 (Source: Nuclear Regulatory Commission)

In 2016, the NRC received more allegations about conditions at the Watts Bar nuclear plant in Tennessee than about any other facility in America. Watts Bar’s 31 allegations exceeded the allegations from the second highest site (the Sequoyah nuclear plant, also in Tennessee, at 17) and third highest site (the Palo Verde nuclear plant in Arizona, at 12) combined. The Browns Ferry nuclear plant in Alabama and the Pilgrim nuclear plant in Massachusetts tied for fourth place with 10 allegations each. In other words, Watts Bar tops the list by a very comfortable margin.

Fig. 3 (Source: Nuclear Regulatory Commission)

In 2016, the NRC received double-digit numbers of allegations about five nuclear plants. Watts Bar, Sequoyah and Browns Ferry are owned and operated by the Tennessee Valley Authority (TVA). Why did three TVA nuclear plants place among the top five sources of allegations to the NRC?

Because TVA only operates three nuclear plants.

The NRC received zero allegations about ten nuclear plants during 2016. In the five year period between 2012 and 2016, the NRC only received a total of three allegations each about the Clinton nuclear plant in Illinois and the Three Mile Island Unit 1 reactor in Pennsylvania (the unit that didn’t melt down). By comparison, the NRC received 110 allegations about Watts Bar, 55 allegations about Sequoyah, and 58 allegations about Browns Ferry.

TVA President Bill Johnson told Chattanooga Times Free Press Business Editor Dave Flessner that TVA is working on its safety culture problems and “there should be no public concern about the safety of our nuclear plants.” The NRC received 30 of the 31 allegations last year from workers at Watts Bar, all 17 allegations last year from workers at Sequoyah, and all 10 allegations last year from workers at Browns Ferry.

So President Johnson is somewhat right—the public has no concerns about the safety of TVA’s nuclear plants. But when so many TVA nuclear plant workers have so many nuclear safety concerns, the public has every reason to be very, very concerned.

Nuclear plant workers are somewhat like canaries in coal mines. Each is likely to be the first to sense danger. And when nuclear canaries morph into nuclear allegators in such large numbers, that sense of ominous danger cannot be downplayed.

California’s Cap-and-Trade Program and Low Carbon Fuel Standard Go Together Like Peanut Butter and Jelly

UCS Blog - The Equation (text only) -

This year is shaping up to be another action-packed year on climate change in the California Legislature. Last year, legislators passed a sweeping commitment to cut California’s global warming emissions to 40 percent below 1990 levels by 2030, and this year policy makers are considering how California should achieve these big goals. At the center of that conversation is a debate about whether to extend the state’s cap-and-trade program beyond 2020. (For a quick primer on cap-and-trade, check out our Carbon Pricing 101 webpage.)

Lawmakers should extend and refine California’s cap-and-trade program

California’s cap-and-trade program is an important tool for addressing climate change. This is because it sets a price on global warming emissions and that price helps incorporate the costs of climate change and the value of low carbon technologies into the decisions businesses and consumers make.

In addition, the program’s revenues have proven to be a critical source of funds for investments in clean vehicle, fuel, and energy technologies, particularly in communities that are most impacted by fossil fuel pollution.

In short, California’s cap-and-trade program, while not perfect, is helping to address climate change. As the state sets a course to make big cuts in pollution over the next decade, the program’s price signal and investments in clean technologies will become even more important.

However, given that the cap-and-trade program is now in its fifth year and needs updating for the post-2020 period, it also makes sense to consider refinements to the program. The Air Resources Board has proposed some changes, but we support lawmakers taking a closer look at further improvements.

In particular, UCS supports AB 378 (C. Garcia, Holden, E. Garcia), which seeks to better align the cap-and-trade program with air quality goals. This legislation aims to promote strategies that deliver equitable reductions in criteria emissions, toxic contaminants, and global warming pollution that also benefit low income communities and communities of color. Additionally, we advise the legislature to consider:

  • Raising the cap-and-trade program’s price floor (or “auction reserve price”)—This will ensure the price signal from the program is adequately driving investments in clean technologies.
  • Requiring auctioning of allowances except for proven leakage risks—This will ensure that the value of allowances is being used for the benefit of the public.
  • Taking a cautious approach to offsets—It is difficult to make sure some offset projects represent additional and permanent emission reductions. An abundant use of offsets would also outsource the co-benefits that come with emission reductions from covered sectors.
  • Pursuing opportunities to link with other jurisdictions—In addition to Quebec and Ontario, Canada, several U.S. states, including neighboring Oregon, may seek to link future economy-wide cap-and-trade programs with California’s large and proven market. The opportunity for linkage is one important way for California’s leadership to spread to other jurisdictions.

The oil industry supports cap-and-trade too?

The politics of extending the cap-and-trade program are starting to get interesting.

For example, the program has picked up new and unlikely supporters. First on that list is the Western States Petroleum Association (WSPA), the oil industry’s trade group. (A few Republicans have also started to voice support for the concept.) Just last year WSPA opposed setting limits on climate pollution in California and previously it fought vehemently against including gasoline and diesel fuel in the cap-and-trade program. Nonetheless, WSPA now supports extending cap-and-trade to 2030.

What’s going on here? Well, the oil industry’s idea of how to best design California’s cap-and-trade program looks quite different from UCS’s vision for the program. WSPA wants to see a limit on the price of allowances, more free allowances to refineries and other industrial sources, and greater use of offsets to maximize flexibility, among other changes. If they succeed, we will see fewer emission reductions from the oil industry, the largest sector of emissions in California.

But the biggest prize on WSPA’s wish list in cap-and-trade negotiations is to roll back California’s Low Carbon Fuel Standard (LCFS), a program that focuses directly on the transportation fuel industry. Earlier this month, WSPA launched a website devoted to ending the LCFS. And just last week, at an informational hearing about cap-and-trade, the oil industry’s lobbyist spent half of his testimony talking about the need to eliminate “redundant” programs such as the LCFS.

LCFS guarantees a market for cleaner fuels

In order to understand the importance of the LCFS—and why the oil industry has consistently sought to undermine the program—one must understand the basics of the program.

The LCFS requires petroleum refiners and fuel importers to reduce global warming pollution associated with the fuels they sell. The program regulates the “carbon intensity” of fuels, which is a measurement of global warming emissions per unit of fuel. Moreover, the program looks at emissions over the fuel’s entire life cycle, which means the emissions that come from both producing and using the fuel.

The LCFS requires a gradual reduction in carbon intensity, reaching a 10 percent reduction in 2020, relative to 2010. (ARB plans to extend the program to 2030.) Refineries and fuel importers can meet the requirement by selling fuels that, on average, meet the carbon intensity standard, or by selling fuels over the standard while also purchasing credits generated by sellers of lower-carbon fuels, such as biodiesel or electricity.
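
To make the credit accounting concrete, here is a minimal, hypothetical sketch in Python. The carbon intensity values, fuel volumes, and the standard itself are invented round numbers rather than actual Air Resources Board figures, and real-world details such as the energy economy ratios applied to electricity are ignored; the point is simply to show how fuels above the standard generate deficits and fuels below it generate credits.

```python
# Hypothetical illustration of LCFS credit/deficit accounting -- the numbers
# below are invented for clarity and do not come from the Air Resources Board.

STANDARD_CI = 96.5  # assumed annual carbon-intensity standard, gCO2e per MJ

# fuel name -> (lifecycle carbon intensity in gCO2e/MJ, energy sold in MJ)
fuels_sold = {
    "gasoline":    (99.0, 1.0e9),   # above the standard -> deficits
    "biodiesel":   (35.0, 6.0e7),   # below the standard -> credits
    "electricity": (30.0, 2.5e7),   # below the standard -> credits
}

net_tons = 0.0
for name, (ci, energy_mj) in fuels_sold.items():
    # (standard - CI) [g/MJ] * energy [MJ] = grams CO2e; divide by 1e6 for metric tons
    tons = (STANDARD_CI - ci) * energy_mj / 1e6
    net_tons += tons
    label = "credits" if tons >= 0 else "deficits"
    print(f"{name:12s} {label}: {abs(tons):,.0f} metric tons CO2e")

status = "compliant" if net_tons >= 0 else "must buy credits from cleaner-fuel sellers"
print(f"net position: {net_tons:,.0f} metric tons CO2e ({status})")
```

In this toy example the gasoline sales run a deficit, the biodiesel and electricity sales generate credits, and the regulated party ends the year compliant only because the credits outweigh the deficits; a party that sold only gasoline and diesel would have to buy credits from sellers of lower-carbon fuels.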

Since gasoline and diesel are above the standard, the LCFS creates a dependable market for cleaner fuels, which drives steady investment into non-petroleum fuel sources. The program’s performance speaks for itself. Between 2011 and 2016, use of alternative fuels grew by 50 percent in California, while the average carbon intensity of these fuels declined by 30 percent. All told, the program reported 25 million tons of reduced carbon emissions.

California’s LCFS has helped grow the state’s clean fuels market by 50 percent.

Like peanut butter and jelly

The oil industry argues the cap-and-trade program and LCFS just don’t mix—like oil and water, you might say. However, I see the two policies more like peanut butter and jelly—they are good on their own but so much better together.

The two programs fulfill different niches in California’s climate-fighting repertoire. The LCFS is fostering research, development, and deployment of new and better clean fuel options. That’s why more than 150 clean fuel producers, vehicle manufacturers, and fleet operators recently voiced their support for the program.

Meanwhile, the cap-and-trade program is helping to integrate the costs of climate change into business decisions throughout the economy while also supporting investments in deployment of clean technologies through the program’s revenues.

The two programs also complement one another because compliance with the LCFS eases compliance with cap-and-trade. For example, recent research showed that extending the LCFS to 2030 would cut cap-and-trade allowance prices, reducing compliance costs for all sources covered by the cap-and-trade program.

While the oil industry would love to rely only on cap-and-trade to cut carbon pollution from cars and trucks, the reality is that a carbon price alone is not enough to decarbonize our transportation system. The cap-and-trade program and LCFS are two key components of the state’s multifaceted approach to reduce the carbon content of fuels, improve the fuel efficiency of vehicles, and reduce vehicle use. It’s critical that both policies are designed wisely and extended to 2030, even if that means overcoming the oil industry’s opposition.

Three Steps Shell Can Take for the Climate—and to Earn Public Trust

UCS Blog - The Equation (text only) -

“Trust has been eroded to the point where it is an issue for our long-term future.”

—Ben van Beurden, Royal Dutch Shell CEO, at CERAWeek in March 2017

Royal Dutch Shell holds its Annual General Meeting (AGM) tomorrow in the Netherlands, and like other major fossil fuel producers the company is under pressure from its investors to do more to address climate risks.

UCS took an in-depth look at Shell’s climate-related positions and actions for The Climate Accountability Scorecard last year. We found a few bright spots, and we made several recommendations for improvement. Here are three steps company decision makers could take at tomorrow’s AGM to signal that Shell wants to earn the trust of investors, the public, and policy makers.

1) Stop supporting disinformation

A 1991 video recently unearthed by The Guardian shows that Shell clearly recognized the risks of climate change decades ago. The film, titled “Climate of Concern,” warned of climate change “at a rate faster than at any time since the end of the ice age—change too fast perhaps for life to adapt, without severe dislocation.”

Yet despite this knowledge, Shell funded—and continues to fund—trade associations and industry groups that spread climate disinformation and seek to block climate action.

For a decade, Shell was part of the Global Climate Coalition, which presented itself as an umbrella trade association coordinating business participation in the international debate on global climate change. As we now know, its real purpose was to oppose mandatory reductions in carbon emissions.

Shell was also a member of the American Legislative Exchange Council (ALEC), a US-based lobbying group that peddles disinformation about climate science and tries to roll back clean energy polices. In announcing its decision to leave ALEC in 2015, the company said that ALEC’s stance on climate change “is clearly inconsistent with our own.” In an interview aired last week, CEO van Beurden reiterated that “we could not reconcile ourselves” with ALEC’s position.

Indeed, in The Climate Accountability Scorecard UCS scored Shell “advanced” for its own public statements on climate science and the consequent need for swift and deep reductions in emissions from the burning of fossil fuels.

However, Shell has not applied the same standard to other trade associations and industry groups that it did to ALEC. Shell still plays leadership roles in the American Petroleum Institute (API), the National Association of Manufacturers (NAM), and the Western States Petroleum Association (WSPA)—all of which take positions on climate science and/or climate action that are inconsistent with Shell’s stated position. Shell has not taken any steps to distance itself from climate disinformation spread by these groups.

2) Set company-wide emissions reduction targets consistent with 2°C

Shell scored “fair” in The Climate Accountability Scorecard in the area of planning for a world free from carbon pollution. The company was ahead of most of its peers in expressing support for the Paris Climate Agreement and its goal of keeping warming well below a 2°C increase above pre-industrial levels.

Since the Scorecard release, Shell has made a couple of positive moves in this area:

  • In March, the company announced plans to sell most of its production assets in Canada’s oil sands in a deal worth $7.25 billion. Oil sands are among the most carbon-intensive fuel sources to extract and refine, and thus clearly disadvantaged in the transition to a low-carbon energy future. (Shell will maintain an interest in the Athabasca oil sands project, directly and via its proposed acquisition of Marathon Oil Canada Corp.)
  • Also in March, Shell announced that climate-related metrics will be factored into executive pay: 10% of bonuses will be based on how well the company manages heat-trapping emissions in its operations.

Some shareholders, however, don’t believe these steps go far enough. The Dutch organization Follow This has filed a shareholder resolution calling on Shell to “set and publish targets for reducing greenhouse gas (GHG) emissions that are aligned with the goal of the Paris Climate Agreement to limit global warming to well below 2°C.”

Shell’s directors unanimously oppose the resolution, arguing it would have a detrimental impact on the company. While affirming Shell’s support for the Paris Climate Agreement, they maintain that “in the near term the greatest contribution Shell can make is to continue to grow the role of natural gas.” Yet as my UCS colleagues have demonstrated, there are tremendous risks to our growing over-reliance on natural gas.

Meanwhile, the UK responsible investment charity ShareAction is urging shareholders to reject Shell’s proposed remuneration policy in a binding vote, and engage with the company over the need to make a clearer commitment to the low-carbon transition. Among other arguments, ShareAction notes that “Shell fails to include indicators that meaningfully focus executive attention on transitioning the firm’s business model for <2°C resilience. The 10% weighted GHG metric focuses on operational emissions, rather than long-term strategic changes required in the context of the transition.”

3) Stand up for full disclosure of climate-related risks

Shell lagged behind other major fossil fuel companies in disclosing climate-related risks to investors, scoring “poor” overall in this category. For example, the company generally acknowledges physical risks—such as weather—to its operations, but does not include discussion of climate change as a contributor to those risks.

Recognizing the potential systemic risks posed by climate change to the global economy, the Task Force on Climate-Related Financial Disclosures (TCFD) is recommending consistent, comparable, and timely disclosures of climate-related risks and opportunities in public financial filings. UCS participated in the TCFD’s public consultation process, through which a broad range of respondents were generally supportive of its recommendations.

Unfortunately, several of Shell’s competitors (BP, Chevron, ConocoPhillips, and Total SA) funded a report attacking the TCFD’s recommendations, which was rolled out last week at an event hosted by the US Chamber of Commerce.

Shell has an opportunity to demonstrate leadership on transparency and disclosure by publicly supporting the TCFD’s recommendations—including transparent discussion of the business implications of a 2° Celsius scenario.

I’ll be keenly awaiting the results of Shell’s AGM tomorrow to see whether the company rises to these challenges. And I look forward to discussing developments at recent and upcoming annual shareholders’ meetings of fossil fuel producers at an event in Houston on Wednesday night, organized by UCS in collaboration with Rice University faculty concerned about climate change.

Climate Change and Climate Risk: Critical Challenges for Fossil Fuel Companies and Their Investors will feature a distinguished panel of scientists, public health experts, investment experts, and community leaders exploring the fossil fuel industry’s role in transitioning to a carbon-constrained future. The event will be live-streamed and available for viewing at this link.

Congress vs. Trump: Are the President’s Anti-Science Budget Priorities Headed for Another Defeat?

UCS Blog - The Equation (text only) -

The president’s “America First” budget blueprint, a.k.a. the “skinny budget,” made a lot of noise when it was introduced two months ago and brought focus to the administration’s upcoming FY2018 budget priorities. The administration followed up shortly after by requesting reductions in the 2017 budget for the remaining five months of this fiscal year.

But then it came time for Congress to act, and they said, “Thank you for the very amusing budget Mr. President, but we are going to do our own thing …and incidentally, thank you for uniting Republicans and Democrats in opposition to your draconian cuts.”

After all, it’s members of Congress that have to figure out how to keep the federal government operating. So with a government shutdown looming, Congress effectively ignored the administration’s requests, and on May 4 passed a bill to fund the government for the rest of the fiscal year through September 30, 2017. The bill was a repudiation of the president’s budget priorities, as it increased funding to many agencies, offices, and programs that the administration specifically targeted for cuts or elimination.

The president is expected to release his full fiscal year 2018 budget this week (fleshing out the details of his “skinny budget”), and there aren’t expected to be any surprises. It will likely track the skinny budget pretty closely, which means it’s going nowhere in Congress.

To get a clearer sense of the prospects for the president’s FY2018 budget, let’s look at some of the budget choices Congress made for the FY2017 Omnibus Spending Bill that are at odds with what President Trump proposed for 2018:

Department of Energy (DOE)

President Trump’s FY18 budget request proposes to eliminate ARPA-E, DOE’s innovative clean energy technology R&D program, and the Loan Programs Office, which provides credit support to help deploy innovative clean energy technologies. It also targets critical programs in the Office of Energy Efficiency and Renewable Energy (EERE), like the Weatherization Assistance Program (which funds energy efficiency improvements for low-income households) and the State Energy Program (which provides funding and technical assistance to states for increasing energy efficiency or renewable energy). The president’s FY17 request specifically targeted EERE for a 25% cut ($516 million).

Instead of eliminating ARPA-E, Congress gave it a 5% increase in funding in FY17 (from $291 million to $306 million) and also provided an extension of current funding for the Loan Programs Office. The Weatherization Assistance Program was given a 6% increase, while the State Energy Program received sustained funding at the 2016 level. EERE ultimately received a very slight increase instead of a devastating cut.

National Oceanic and Atmospheric Administration (NOAA)

The president’s FY18 budget request proposed to cut over $250 million “in grants and programs supporting coastal and marine management, research, and education,” which essentially constituted 23% of the combined budget for the Office of Oceanic and Atmospheric Research (OAR) and the National Ocean Service (NOS).

The administration was more specific in its FY17 budget request, calling for cuts to coastal zone management grants, regional coastal resilience grants, and climate research grants. The administration also proposed reducing satellite capacity at the National Environmental Satellite, Data, and Information Service (NESDIS), which provides the data needed to produce National Weather Service forecasts.

Instead of cuts, in the FY17 Omnibus bill Congress provided a slight increase in funding for Coastal Science and Assessment, as well as for Ocean and Coastal Management Services, at NOS. OAR received a 6.6% increase in funding (from $482 million to $514.1 million), with the climate research budget untouched. And Congress increased funding for Environmental Satellite Observing Systems at NESDIS by 25% (from $130.1 million to $163.4 million).

Environmental Protection Agency (EPA)

The president’s FY18 budget request proposed cutting the EPA’s budget by 31% and eliminating 3,200 staff and over 50 programs, including those supporting international and domestic climate change research and partnership programs. His budget also reduces funds allocated to Superfund, Brownfields, compliance monitoring, and enforcement, which further endangers economically vulnerable communities and communities of color. While the administration would have the states take on more of the EPA’s responsibility, the president’s budget eliminates geographic programs and reduces funding for state categorical grants by a whopping 45 percent.

The EPA was spared any drastic cuts and staff layoffs in FY17. Its clean air and climate programs were funded at the previous year’s levels, as were the Compliance Monitoring Program (which helps ensure our environmental laws are followed), enforcement, and Superfund. State and Tribal Assistance Grants and Geographic Programs, which support Brownfields Projects, local air management, water protection, and lead and hazardous waste programs, actually received a slight increase in funding.

FEMA, NASA and more…

Congress rebuffed the president’s request to eliminate FEMA’s Pre-Disaster Mitigation Grant Program, which helps bring down the cost of disasters and protects communities by supporting preparedness efforts. Also escaping cuts was NASA’s Earth Science Program, which develops, launches, and maintains a network of satellites that collect data on Earth’s surface and atmosphere—a critical tool for improving predictive capacity for everything from agricultural commodities and water management to infrastructure.

There are examples like these all throughout the FY17 Omnibus spending bill that Congress passed two weeks ago. Some say the president was rebuffed because Congress was in no mood to shut down the government over spending, but it’s also true that many congressional Republicans opposed large parts of the president’s budget.

Appropriators are not interested in gutting the institutions they fund, and House Speaker Ryan is not interested in shutting down the government, which would call into question his party’s ability to govern. You can bet many Republicans breathed a private sigh of relief when leadership reached a deal on what effectively was another “continuing resolution” (CR).

It wasn’t all good

One significant flaw in the budget deal is the insertion of an anti-science policy rider that instructs the Departments of Agriculture and Energy to work with the EPA to establish policies that “reflect the carbon neutrality of forest bioenergy.”

Unfortunately, burning forest biomass to make electricity is not inherently carbon-neutral because “removing the carbon dioxide released from burning wood through new tree growth requires many decades to a century. All the while the added carbon dioxide is in the atmosphere trapping heat.”

Congress should not be legislating science, and this is a cautionary tale for the FY18 budget fight. Special interest amendments, or “riders,” have the ability to make a reasonable budget an unsavory bill. The biomass rider got in because it had bipartisan support, but going forward, both parties will need to reach a clear understanding on what constitutes a “clean budget” if they want to eventually reach an agreement. Constituents will also need to hold their members of Congress accountable if they don’t want government funding bills to become delivery devices for bad, long-lived policy.

The 2018 budget fight: government shutdown, continuing resolution, or “the nuclear option”?

So what does this mean for the 2018 budget? Where are we headed?

If Congress can’t pass another bill to fund the government for the 2018 fiscal year before October 1, the government will effectively shut down (and we all know what that looks like).

While the president has said “our country needs a good shutdown,” most Americans would strongly disagree …as would most members of Congress. But the president is angling to give himself some breathing room because he knows it is impossible for his budget priorities to pass the Senate’s 60-vote threshold for a filibuster.

A bill that continues funding the government at last year’s spending levels is a loss for the president, and there aren’t enough Democrats that would support a budget deal with the kinds of cuts to discretionary spending that he is proposing.

But the president is negotiating, and this tactic is straight out of “The Art of the Deal.” He’s betting that if he proposes extremely deep cuts, Congress will move slightly more in his direction on spending levels …and that government shutdowns don’t last forever.

The most likely outcome is a continuing resolution or “CR,” which would keep the federal government functioning at current spending levels for a limited period of time. Some Republican appropriators have already given up on the prospect of moving their subcommittee’s spending bills through the chambers and are instructing their staff to start developing a list of add-ons to the current spending package.

It takes Democratic votes to pass a spending bill out of the Senate, and the Democrats will not support budget cuts. Shutting down the government is bad for both the president and the majority party in Congress, so most Republicans don’t want to go in that direction. Continuing funding at existing spending levels would prevent the president from advancing his domestic agenda and would be a big loss, but it’s also the most likely outcome …that is, unless the Senate changes the rules.

Senate Majority Leader Mitch McConnell (R-KY) could potentially employ “the nuclear option” and get rid of the 60-vote requirement (the Senate filibuster), taking away the need for Democratic votes to pass a budget. McConnell has already done this once this year to get the Gorsuch Supreme Court nomination through the Senate. It’s possible that when faced with a choice between a CR the president won’t sign, a bill the Democrats won’t pass, and a government shutdown, McConnell could set aside his institutionalist tendencies and do away with the filibuster on federal spending.

Going nuclear is an unlikely outcome, but it’s definitely a possibility. Do most Americans see the Senate as the greatest deliberative body in the world? Do they even know what the filibuster is? I suspect not, and that means that the only political downside to changing the rules for the budget would be reciprocity by the Democrats at a future time when they have control of Congress. Is that enough to keep Senator McConnell from doing it?

What you can do to protect critical programs and spending

Watchdog the appropriations process this year and weigh in throughout the summer with your members of Congress on the spending priorities you care about. Tell them not to vote for a budget that cuts those priorities, and if there are no appropriators in your congressional delegation, tell them to weigh in with the appropriations subcommittees and advocate for your priorities.

If we get a CR, that’s a good thing because federal spending would be set at current levels; no cuts. But CRs don’t last forever; eventually Congress will pass another budget. Advocating with appropriators increases the likelihood of higher funding levels in those subcommittee appropriations bills for the things you care about. If you don’t work the appropriations process, if you don’t engage with your members of Congress, you get what you get (it may be cuts), and all you can do is pray for a never-ending CR.

We may be looking at a scenario where a federal budget is voted on by a simple majority, in which case the funding levels coming out of the appropriations subcommittees really matter. If you care about federal spending priorities, depending on the Senate filibuster as protection may not turn out to be a prudent strategy. Consider that there are also Republicans that care about some of these spending priorities, like research and innovation.

If constituents are actively engaged in communicating spending priorities with their members of Congress, then even without the 60-vote hurdle, meaningful cuts to programs and agencies that support things like scientific research, clean energy innovation, public health, and community preparedness for climate change won’t come to fruition.

So call your members of Congress! Show up to those town halls! And drop by your local congressional office!

North Korea’s May 21 Missile Launch

UCS Blog - All Things Nuclear (text only) -

A week after the test launch of an intermediate-range Hwasong-12 missile, North Korea has tested a medium-range missile. From press reports, this appears to be a Pukguksong-2 missile, which is the land-based version of the submarine-launched missile it is developing. It appears to be the second successful test of this version of the missile.

South Korean sources reported this test had a range of 500 km (300 miles) and reached an altitude of 560 km (350 miles). If accurate, this trajectory is essentially the same as the previous test of the Pukguksong-2 in February (Fig. 1). Flown on a standard trajectory, this missile carrying the same payload would have a range of about 1,250 km (780 miles). If this test was conducted with a very light payload, as North Korea is believed to have done in past tests, the actual range with a warhead could be significantly shorter.

Fig. 1: The red curve is reportedly the trajectory followed on this test. The black curve (MET = minimum-energy trajectory) is the same missile on a maximum-range trajectory.
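To get a feel for how a lofted test maps onto a standard-trajectory range, here is a minimal sketch (in Python) that integrates a ballistic arc over a spherical, non-rotating Earth with no atmospheric drag. The burnout speed and the two launch angles below are illustrative assumptions, not reported values; the point is only that the same burnout speed flown steeply gives a short range with a high apogee, while flown near the minimum-energy angle it gives a much longer range.

```python
# Minimal sketch: ballistic flight over a spherical, non-rotating Earth with
# inverse-square gravity and no atmospheric drag. Burnout speed and angles
# below are illustrative assumptions, not values reported for this test.
import math

R_E = 6371e3    # Earth radius, m
MU = 3.986e14   # Earth's gravitational parameter GM, m^3/s^2

def fly(speed, gamma_deg, dt=0.25):
    """Integrate from burnout at the surface to impact.
    Returns (ground range, apogee altitude), both in km."""
    gamma = math.radians(gamma_deg)
    x, y = 0.0, R_E                      # launch point on the surface
    vx, vy = speed * math.cos(gamma), speed * math.sin(gamma)
    apogee = 0.0
    while True:
        r = math.hypot(x, y)
        ax, ay = -MU * x / r**3, -MU * y / r**3
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        r = math.hypot(x, y)
        apogee = max(apogee, r - R_E)
        if r <= R_E:                     # back down to the surface
            break
    central_angle = math.atan2(x, y)     # angle swept from the launch point
    return central_angle * R_E / 1e3, apogee / 1e3

v_burnout = 3350.0   # m/s, an assumed value chosen only for illustration
for angle in (77.0, 45.0):   # steep lofted shot vs. near-minimum-energy shot
    rng, apo = fly(v_burnout, angle)
    print(f"launch angle {angle:4.1f} deg: range ~{rng:5.0f} km, apogee ~{apo:5.0f} km")
```

Adjusting the assumed burnout speed and steep angle until the first case matches the reported 500 km range and 560 km apogee, and then re-flying at the shallower angle, is essentially the comparison illustrated in Fig. 1.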

The Pukguksong-2 uses solid fuel rather than the liquid fuel used by most of North Korea’s missiles. For military purposes, solid-fueled missiles have the advantage that the fuel is already loaded in them, so they can be launched quickly after being moved to the launch site. With large liquid-fueled missiles you instead need to move them without fuel and then fuel them once they are in place at the launch site. That process can take an hour or so, and the truck carrying the missile must be accompanied by a number of trucks containing the fuel. So it is easier to spot a liquid missile before launch, and there is more time available to attack it.

However, it is easier to build liquid missiles, so that is typically where countries begin. North Korea obtained liquid fuel technology from the Soviet Union in the 1980s and built its program up from there. It is still in the early stages of developing solid missiles.

Building large solid missiles is difficult. If you look at examples of other countries building long-range solid missiles, e.g., France and China, it took them several decades to get from the point of building a medium-range solid missile, like North Korea has, to building a solid ICBM. So this is not something that will happen soon, but with time North Korea will be able to do it.

Infrastructure Spending Is Coming. Climate Change Tells Us to Spend Wisely

UCS Blog - The Equation (text only) -

News of new federal infrastructure proposals landed in timely fashion during this year’s Infrastructure Week, including a bill introduced by House Democrats (the LIFT America Act, HR 2479) and another expected shortly from the Trump administration. For years now, the American Society of Civil Engineers has graded U.S. infrastructure at near failing (D+). With the hashtag #TimetoBuild, Infrastructure Week participants are urging policymakers to “invest in projects, technologies, and policies necessary to make America competitive, prosperous, and safe.”

We must build for the future

Conversations in Washington, D.C. and across the country over the coming weeks and months are sure to focus on what projects to build. But first we need to ask: what future are we building for? Will it be one based on assumptions and needs similar to those we experience today, or a future radically reshaped by climate change? (Changing demographics and technologies will undoubtedly shape this future as well.)

It’s imperative that this changing climate future is incorporated into how we design and plan infrastructure projects, especially as we consider investing billions of taxpayer dollars into much needed enhancements to our transportation, energy, and water systems.

Climate change will shape our future

A vehicle remained stranded in the floodwater of Highway 37 on Jan. 24, 2017. Photo: Marin Independent Journal.

Engineers and planners know that, ideally, long-lived infrastructure must be built to serve needs over decades and withstand the ravages of time—including the effects of harsh weather and extended use—and with a margin of safety to account for unanticipated risks.

Much of our current infrastructure was built assuming that past trends in climate and weather were good predictors of the future. One example, near where I currently live, is the approach to the new Bay Bridge in Oakland, California, which was designed and built without considering sea level rise and will be permanently under water with 3 feet of rise, a likely scenario by the end of this century. Currently, more than 270,000 vehicles travel each day on this bridge between San Francisco and the East Bay.

Another example, near my hometown in New Jersey, is LaGuardia Airport in Queens, NY, which accommodated 30 million passengers in 2016. One study shows that if seas rise another 3 feet, the airport could be permanently inundated; the PATH system and Hoboken Terminal are at risk as well.

Instead, we must look forward to what climate models and forecasts tell us will be the “new normal”: higher temperatures, more frequent and intense extreme weather events like droughts and flooding, larger wildfires, and accelerated sea level rise. This version of the future will further stress our already strained roads, bridges, and water and energy systems, as well as the natural or green infrastructure that can play a key role in limiting these climate impacts (e.g., flood protection). As a result, the ability of these systems to reliably and safely provide the critical services that our economy, public safety, and welfare depend on is threatened.

The reality is we are not yet systematically planning, designing and building our infrastructure with climate projections in mind.

Recent events as a preview

We can look at recent events for a preview of some of the infrastructure challenges we may face with more frequency and severity in the future because of a changing climate. (These events themselves are not necessarily the direct result of climate change, but studies do show that climate change is making certain extreme events more likely, like the 2016 Louisiana floods.) For example:

  • In September 2015, the Butte and Valley Fires destroyed more than one thousand structures and damaged hundreds of power lines and poles, leaving thousands of Californians without power.
  • Earlier this year, more than 188,000 residents downstream of Oroville Dam were ordered to evacuate as water releases in response to heavy rains and runoff damaged both the concrete spillway and a never-before-used earthen emergency spillway, threatening the dam.
  • Winter storms also resulted in extreme precipitation that devastated California’s roads, highways, and bridges with flooding, landslides, and erosion, resulting in roughly $860 million in repairs.

View of the Valley Fire, which burned nearly 77,000 acres in Northern California from Sept. 12, 2015 to Oct. 15, 2015. Photo: U.S. Coast Guard.

Similar events have been occurring all over the country, including recent highway closures from flooding along the Mississippi River. Other failures are documented in a Union of Concerned Scientists’ blog series “Planning Failures: The Costly Risks of Ignoring Climate Change,” and a report on the climate risks to our electricity systems.

Will the infrastructure we start building today still function and meet our needs in a future affected by climate change? Maybe. But unlikely, if we don’t plan differently.

Will our taxpayer investments be sound and will business continuity and public safety be assured if we don’t integrate climate risk into our infrastructure decisions? No.

If we make significant federal infrastructure investments over the next few years without designing in protections against more extreme climate forces, we risk spending much more of our limited public resources on repair, maintenance, and rebuilding down the line–a massively expensive proposition.

Building for our climate future

UCS has recently joined, and started to amplify, a small but growing conversation about what exactly climate-resilient infrastructure entails. Participants include several of the Steering Committee members and sponsors of Infrastructure Week, among them the Brookings Institution, the American Society of Civil Engineers, AECOM, WSP, and HNTB. The LIFT America Act also includes some funding dedicated to preparing infrastructure for the impacts of climate change.

For example, last year UCS sponsored a bill, AB 2800 (Quirk), which Governor Brown signed into law, establishing the Climate-Safe Infrastructure Working Group. The group brings together climate scientists, state professional engineers, architects, and others for a nuts-and-bolts conversation about how to better integrate climate impacts into infrastructure design, examining topics like key barriers, important information needs, and the best design approaches for a range of future climate scenarios.

UCS also successfully advocated for the California State Water Resources Control Board to adopt a resolution to embed climate science into all of its existing work: permits, plans, policies, and decisions.

A few principles for climate resilient infrastructure

At UCS, we have also been thinking about key principles to ensure that infrastructure can withstand climate shocks and stresses, minimize disruptions to the system and to the safety of the communities that depend on it, and rebound quickly. Our report, “Towards Climate Resilience: A Framework and Principles for Science-Based Adaptation,” outlines fifteen key principles for science-based adaptation.

We sought input from a panel of experts, including engineers, investors, emergency managers, climate scientists, transportation planners, water and energy utilities, and environmental justice organizations, at a recent UCS convening in Oakland, California, focused on how we can start to advance policies and programs that will result in infrastructure that can withstand climate impacts.

The following principles draw largely from these sources. They are aspirational and not exhaustive, and will continue to evolve. To be climate-resilient, new and upgraded infrastructure should be built with these criteria in mind:

  • Scientifically sound: Infrastructure decisions should be consistent with the best-available climate science and what we know about impacts on human and natural systems (e.g., flexible and adaptive approaches, robust decisions, systems thinking, and planning for the appropriate magnitude and timing of change).
  • Socially just: New or upgraded infrastructure projects must empower communities to thrive, and ensure vulnerable groups can manage the climate risks they’ll face and share equitably in the benefits and costs of action. The historic under-investment in infrastructure in low-income and communities of color must be addressed.
  • Fiscally sensible: Planning should consider the costs of not adapting to climate change (e.g., failure to deliver services or costs of emergency repairs and maintenance) as well as the fiscal and other benefits of action (e.g., one dollar spent preparing infrastructure can save four dollars in recovery; investments in enhancing and protecting natural infrastructure that accommodates sea level rise, absorbs stormwater runoff, and creates parks and recreation areas).
  • Ambitiously commonsense: Infrastructure projects should avoid maladaptation (actions that unintentionally increase vulnerabilities and reduce the capacity to adapt) and should provide multiple benefits. They should also protect what people cherish and reflect a long-term vision consistent with society’s values.
  • Aligned with climate goals: Aggressive emissions reductions are essential to slowing the rate at which climate risks become more severe and common, and we also need to prepare for projected climate risks; infrastructure projects should therefore align with and complement long-term climate goals, both mitigation and adaptation.

Americans want action for a safer, more climate-resilient future

A 2015 study found that the majority of Americans are worried about global warming, with more than 40% believing it will harm them personally. As we engage in discussions around how to revitalize our economy, create jobs, and protect public safety by investing in infrastructure, climate change is telling us to plan and spend wisely.

From the current federal proposals to the recently enacted California transportation package, SB 1 ($52 billion) and hundreds of millions more in state and federal emergency funds for water and flood-protection, there is a lot at stake: taxpayer dollars, public safety and welfare, and economic prosperity. We would be smart to heed this familiar old adage when it comes to accounting for climate risks in these infrastructure projects: a failure to plan is a plan to fail.

No Rest for the Sea-weary: Science in the Service of Continually Improving Ocean Management

UCS Blog - The Equation (text only) -

Marine reserves, or no-fishing zones, are increasing in number throughout the world. Their goals are variable and numerous, often a mix of conserving our ocean’s biodiversity and supporting the ability to fish for seafood outside reserves for generations to come. California is one place where marine reserves have recently been implemented: the California Marine Life Protection Act led to the establishment of one of the world’s largest networks of reserves.

A number of scientific efforts have informed the design of marine reserves throughout the world and in California. Mathematical models were central to these research efforts as they let scientists and managers do simulated “experiments” of how different reserve locations, sizes, and distances from each other affect how well reserves might achieve their goals.

While a PhD student in the early 2000s, I began my scientific career as one of many contributing to these efforts. In the process, a key lesson I learned was the value of pursuing partnerships with government agencies such as NOAA Fisheries to ensure that the science I was doing was relevant to managers’ questions, an approach that has become central to my research ever since.

Map of the California Marine Protected Areas; courtesy of California Department of Fish and Wildlife

A transition from design to testing

Now, with many marine reserves in place, both managers and scientists are turning to the question of whether they are working. On average (but not always), marine reserves harbor larger fish and larger population sizes for fished species, as well as greater total biomass and diversity, compared both to before reserves were in place and to areas outside reserves. However, answering a more nuanced question—for a given reserve system, is it working as expected?—can help managers engage in “adaptive management”: using the comparison of expectations to data to identify any shortfalls and adjust management or scientific understanding where needed to better achieve the original goals.

Mathematical models are crucial to calculating expectations and therefore to answering this question. The original models used to answer marine reserve design questions focused on responses that might occur after multiple decades. Now models must focus on predicting what types of changes might be detectable over the 5-15 year time frame of reserve evaluation. Helping to develop such modeling tools as part of a larger collaboration, with colleagues Alan Hastings and Louis Botsford at UC Davis and Will White at the University of North Carolina, is the focus of my latest research on marine reserves in an ongoing project that started shortly after I arrived as a professor at UC Davis.

To date we have developed new models to investigate how short-term expectations in marine reserves depend on fish characteristics and fishing history. Now we have a new partnership with California’s Department of Fish and Wildlife, the responsible management agency for California’s marine reserves, to collaboratively apply these tools to our statewide reserve system. This application will help rigorously test how effective California’s marine reserves are, and therefore help with continually improving management to support both the nutrition and recreation that Californians derive from the sea. In addition, it will let California serve as a leading example of model-based adaptive management that could be applied to marine reserves throughout the world.
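As a toy illustration of why short-term expectations depend on fishing history (this is deliberately simplistic and is not one of the models developed in our project), consider a single population recovering logistically once fishing stops inside a reserve; the growth rate and starting depletion levels below are invented for illustration.

```python
# Toy illustration only (not the project's models): logistic recovery of a
# fished population after a no-take reserve is established. The growth rate
# and depletion levels are invented for illustration.
r, K = 0.15, 1.0          # assumed intrinsic growth rate and carrying capacity

def recovery(b0_frac, years=30):
    """Biomass (as a fraction of K) each year after fishing stops,
    starting at b0_frac of carrying capacity."""
    b, traj = b0_frac * K, []
    for _ in range(years):
        b += r * b * (1.0 - b / K)
        traj.append(b / K)
    return traj

for b0 in (0.2, 0.5):      # heavily fished vs. lightly fished before the reserve
    traj = recovery(b0)
    print(f"starting at {b0:.0%} of K: {traj[4]:.0%} of K after 5 yr, "
          f"{traj[14]:.0%} of K after 15 yr")
```

With these invented numbers, the more heavily depleted area shows the larger absolute change over the first 15 years, which is one reason fishing history shapes what a monitoring program should expect to detect.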

The role of federal funding

The cabezon is just one type of fish protected from fishing in California’s marine reserves. Photo credit: Wikimedia Commons.

Our project applying models to adaptive management started with funding in 2010–2014 from NOAA Sea Grant, a funding source uniquely suited to support research that can help improve ocean and fisheries management. With this support, we could be forward-looking about developing the modeling tools that the State of California now needs. NOAA Sea Grant would be eliminated under the current administration’s budget proposal.

My other experience with NOAA Sea Grant is through a graduate student fellowship program that has funded PhD students in my (and my colleagues’) lab group to do a variety of marine reserve and fisheries research projects. This fellowship funds joint mentorship by NOAA Fisheries and academic scientists towards student research projects relevant to managing our nation’s fisheries. Along with allowing these students to bring cutting-edge mathematical approaches that they learn at UC Davis to collaborations with their NOAA Fisheries mentors, this funding gives students the invaluable experience I had as a PhD student in learning how to develop partnerships with government agencies that spur research relevant to management needs. Both developing such partnerships and training students in these approaches are crucial elements to making sure that new scientific advancements are put to use. This small amount of money goes a long way towards creating future leaders who will continue to help improve the management of our ocean resources.

 

Marissa Baskett is currently an Associate Professor in the Department of Environmental Science and Policy at the University of California, Davis.  Her research and teaching focus on conservation biology and the use of mathematical models in ecology.  She received a B.S. in Biological Sciences at Stanford University and both an M.A. and Ph.D. in Ecology and Evolutionary Biology at Princeton University, and she is an Ecological Society of America Early Career Fellow.  

The views expressed in this post solely represent the opinions of Marissa Baskett and do not necessarily represent the views of UC Davis or any of her funders or partners.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

New Study on Smart Charging Connects EVs & The Grid

UCS Blog - The Equation (text only) -

We know that electric vehicles (EVs) tend to be more environmentally friendly than gasoline cars. We also know that a future dominated by EVs poses a problem—what happens if everyone charges their cars at the same time (e.g., when they get home from work)?

Fortunately, there’s an answer: smart charging. That’s the topic of a report I co-authored, released today.

As a flexible load, EVs could help utilities balance supply and demand, enabling the grid to accommodate a larger fraction of variable renewable energy such as wind and solar. As well, the charging systems can help utilities and grid operators identify and fix a range of problems. The vehicles can be something new, not simply an electricity demand that “just happens,” but an integral component of grid modernization.

When the timing and power of EV charging automatically adjust to meet both drivers’ needs and grid needs, adding EVs can reduce total energy system costs and pollution.

This idea has been around since the mid-1990s, with pilots going back at least to 2001. It has been the focus of many recent papers, including notable work from the Smart Electric Power Alliance, the Rocky Mountain Institute, the International Council on Clean Transportation, the Natural Resources Defense Council, the National Renewable Energy Laboratory, Synapse Energy Economics, and many more.

Over the past two years, I’ve read hundreds of papers, talked to dozens of experts, and convened a pair of conferences on electric vehicles and the grid. I am pleased to release a report of my findings at www.ucsusa.org/smartcharging.

Conclusions, but not the end

This is a wide-ranging and fast-moving field of research with new developments constantly. As well, many well-regarded experts have divergent views on certain topics. Still, a few common themes emerged.

  • Smart charging is viable today. However, not all of the use cases have high market value in all regions. Demand response, for example, is valuable in regions with rapid load growth, but is less valuable in regions where electricity demand has plateaued.
  • The needs of transportation users take priority. Automakers, utilities, charging providers, and regulators all stress the overriding importance of respecting the needs of transportation users. No stakeholder wants to inconvenience drivers by having their vehicles uncharged when needed.
  • Time-of-use pricing is a near-term option for integrating electric vehicles with the grid. Using price signals to align charging with grid needs on an hourly basis—a straightforward implementation of smart charging—can offer significant benefits to renewable energy utilization.
  • Utilities need a plan to use the data. The sophisticated electronics built into an EV or a charger can measure power quality and demand on the electric grid. But without the capabilities to gather and analyze this data, utilities cannot use it to improve their operations.

The report also outlines a number of near-term recommendations, such as encouraging workplace charging, rethinking demand charges, and asking the right questions in pilot projects.

Defining “smart”

One important recommendation is that “smart” charging algorithms should consider pollution impacts. This emerged from the analytical modeling that UCS conducted in this research.

Basic applications of “smart charging” lower electric system costs by reducing peak demand and shifting charging to off-peak periods, which reduces the need for new power plants and lowers consumer costs. But in some regions that have lagged in the transition to cleaner electricity supplies, “baseload” power can be dirtier than peak power. Our model of managed charging shifted power demand by the hour, without regard to lowering emissions or to the full range of services that smart charging performs today (like demand response or frequency regulation), let alone adding energy back with two-way vehicle-to-grid operation.

The model illustrated that encouraging off-peak charging without attention to emissions might, at a national scale, slightly increase pollution compared to unmanaged charging. Both charging strategies would reduce pollution compared to relying on internal-combustion vehicles, and the managed case would have lower system costs.

This is not a prediction, but one possible outcome under certain circumstances—a possibility also noted by NREL and by other research teams. It is a consequence of off-peak power that is cheap but dirty, and of a model that does not yet properly represent the full capabilities of smart charging. Charging when renewables are greatest, or employing policies that assign a cost to pollution, would change this outcome.

Fortunately, even before we have such policies, we have existing systems that can selectively charge when the greenest power is “on the margin.” This technology and other systems are discussed in the report.
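To make the cost-versus-emissions tension concrete, here is a toy scheduling sketch. It is not the model used in the report, and every price and marginal-emissions number in it is invented; it simply fills an overnight charging need into the hours ranked lowest by either a price signal or a marginal-emissions signal, which can point to different hours when cheap off-peak power comes from dirtier plants.

```python
# Toy scheduling sketch (not the UCS model): allocate an EV's overnight
# charging need to hours ranked by an hourly signal. All prices and marginal
# emission rates below are invented for illustration.
CHARGE_KWH = 10.0   # energy the vehicle needs by morning
MAX_KW = 3.3        # charger power limit, so at most 3.3 kWh per hour

hours = ["8pm", "9pm", "10pm", "11pm", "12am", "1am",
         "2am", "3am", "4am", "5am", "6am", "7am"]
price = [0.22, 0.20, 0.12, 0.08, 0.07, 0.07,
         0.07, 0.08, 0.10, 0.14, 0.18, 0.22]          # $/kWh
marginal_co2 = [0.35, 0.40, 0.55, 0.60, 0.62, 0.60,
                0.55, 0.45, 0.35, 0.30, 0.32, 0.36]   # kg CO2 per kWh

def schedule(signal):
    """Greedy managed charging: charge in the lowest-signal hours first."""
    plan = [0.0] * len(hours)
    remaining = CHARGE_KWH
    for i in sorted(range(len(hours)), key=lambda i: signal[i]):
        plan[i] = min(MAX_KW, remaining)
        remaining -= plan[i]
        if remaining <= 0:
            break
    return plan

for label, signal in (("price-driven", price), ("emissions-driven", marginal_co2)):
    plan = schedule(signal)
    cost = sum(p * kwh for p, kwh in zip(price, plan))
    co2 = sum(e * kwh for e, kwh in zip(marginal_co2, plan))
    print(f"{label:16s} cost ${cost:5.2f}, emissions {co2:5.2f} kg CO2")
```

In this toy example the late-night hours are the cheapest but also the dirtiest, so the two signals select different hours; real systems would also need to respect the driver’s departure time and the full set of grid services discussed above.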

The broader context

Smart charging of electric vehicles has a key role to play in the grid modernization initiatives happening around the country. EVs can be a flexible load that communicates with the grid, incorporates energy storage, benefits from time-varying rates, and participates in ancillary services markets, representing many of the innovations that can improve the economic and environmental performance of our electricity system.

Photo: Steve Fecht/General Motors

There’s an Elephant in the Room, and It Smells Like Natural Gas

UCS Blog - The Equation (text only) -

A curious thing happened in the aftermath of President Trump attempting to sign away the past eight years of work on climate and clean energy: the public face of progress didn’t flinch. From north to south and east to west, utilities and businesses and states and cities swore their decarbonization compasses were unswerving; yes, they said, we’re still closing coal plants, and yes, yes!, we’re still building ever more wind and solar—it just makes sense.

But here’s why all the subsequent commentary reiterating the inevitability of coal’s decline and cheering the unsinkable strength of renewables’ rise was right in facts, but incomplete in message:

Coal is closing. Renewables are rising. But right now, we need to be talking about natural gas.

We’re fine without a map…

President Trump accompanied his signature on the Executive Order on Energy Independence with a vow that the order would put the coal industry “back to work.” But shortly thereafter, even those in the business reported they weren’t banking on a turnaround. Coal plants just keep shutting down:

This map shows coal units that have retired just between 2007 and 2016—many more have been announced for closure in the near future.

At the same time, renewable resources have been absolutely blowing the wheels off expectations and projections, with costs plummeting and deployment surging. The renewable energy transformation is just that—a power sector transformation—and it certainly appears there’s no going back:

Wind and solar capacity has been growing rapidly since the early 2000s.

Now when you put these two trajectories together, you end up with an electric power sector that has, in recent years, steadily reduced its carbon dioxide emissions:

Three positive charts, and three tremendous reasons to cheer (which we do a lot, and won’t soon stop—clean energy momentum is real and it’s rolling). The problem is, these charts only capture part of the energy sector story.

What’s missing? Natural gas. Or, what is now the largest—and still growing—source of carbon emissions in the electric power sector.

…Until we finally realize we’re lost

There are two phases to conversations about reducing the emissions that drive climate change. In Phase 1, we acknowledge that a problem exists, we recognize we’re a big reason for that problem, and we take action to initiate change. With the exception of just a few of the most powerful people in our government (oh, them), we seem to have Phase 1 pretty well in hand. Cue the stories about the triumphant resilience of our climate resolve.

The trouble is Phase 2.

In Phase 2, we move to specifics. Namely, specifics about what the waypoints are, and by when we need to reach them. This is the conversation that produces glum replies—and it’s the source of those weighty, distraught affairs scattered among the buoyant takes on the recent executive order—because the truth is:

  • We know what the waypoints are,
  • We know by when we need to reach them, and
  • We know that currently, we’re not on track.

Without a map, we’re left feeling good about the (real and true) broad-brush successes of our trajectory—emissions reductions from the retirement of coal plants; technology and economic improvements accelerating the deployment of renewables—but we have no means by which to measure the adequacy of our decarbonization timeline.

As a result, we put ourselves at grave risk of failing to catch the insufficiency of any path we’re on. And right now? That risk has the potential to become reality as our nation, propelled by the anti-regulatory, pro-fossil policies of the Trump administration, lurches toward a wholesale capitulation to natural gas.

Natural gas and climate change

Last year, carbon dioxide emissions from coal-fired power plants fell 8.6 percent. But take a look at the right-hand panel in the graph below. See what’s not going down? Emissions from natural gas. In fact, carbon dioxide emissions from natural gas overtook coal emissions last year, even omitting the additional climate impacts from methane released during natural gas production and distribution.

Bridge fuel? Not so much.

There’s no sign of the trend stopping, either. Natural gas plants have been popping up all across the country, and new plants keep getting proposed—natural gas generators now comprise more than 40 percent of all electric generating capacity in the US.

Natural gas plants are located all across the country, and new projects keep getting proposed.

And all those natural gas plants mean even more gas pipelines. According to project tracking by S&P Global Market Intelligence, an additional 70 million Dth/d of gas pipeline capacity has been proposed to come online by the early 2020s (subscription). That is a lot of gas, and would require the commitment of a lot of investment dollars.

When plants are built, pipelines are laid, and dollars are committed, it becomes incredibly hard to convince regulators to force utilities to let it all go.

Still, that’s what the markets—and the climate—will demand. As a result, ratepayers may be on the hook for generators’ bad bets.

The thing is, we know today the external costs of these investments, and the tremendous risks of our growing overreliance on natural gas. So why do these assets keep getting built?

Because many of our regulators, utilities, and investors are working without a map.

Now a growing number of states are stepping up where the federal government has faltered and beginning to make thoughtful energy decisions based on specific visions of long-term decarbonization goals, among them California, the RGGI states, and, as recently as this week, Virginia. Further, an increasing number of insightful and rigorous theoretical maps are being developed, like the US Mid-Century Strategy for Deep Decarbonization, among many others (UCS included).

But for the vast majority of the country, the main maps upon which decarbonization pathways were beginning to be based—the Clean Power Plan and the Paris Climate Agreement—are both at immediate risk of losing their status as guiding lights here in the US, sitting as they are beneath the looming specter of the Trump administration’s war on facts.

Plotting a course to a better tomorrow

So where to from here? Ultimately, there is far too much at stake for us to simply hope we’re heading down the right path. Instead, we need to be charting our course to the future based on all of the relevant information, not just some of it.

To start, we recommend policies that include:

  • Moving forward with implementation of the Clean Power Plan, a strong and scientifically rigorous federal carbon standard for power plants.
  • Developing, supporting, and strengthening state and federal clean energy policies, including renewable electricity standards, energy efficiency standards, carbon pricing programs, and investment in the research, development, and deployment of clean energy technologies.
  • Defending and maintaining regulations for fugitive methane emissions, and mitigating the potential public health and safety risks associated with natural gas production and distribution.
  • Improving grid operation and resource planning such that the full value and contributions of renewable resources, energy efficiency, and demand management are recognized, facilitated, and supported.

We need to show that where we’re currently heading isn’t where we want to be.

We need to talk about natural gas.

Image credits: Zorandim/Shutterstock.com; U.S. EIA, Generator Monthly; U.S. EIA.

April 2017 Was the Second Hottest April on Record: We Need NOAA More Than Ever

UCS Blog - The Equation (text only) -

Today, NOAA held its monthly climate call, where it releases the previous month’s global average temperature and discusses future weather and climate outlooks for the US. According to the data released today, April 2017 was the second warmest April on record, behind only April 2016, with a temperature 0.90°C (1.62°F) above the 20th-century April average. Data for the contiguous US, released earlier, showed April 2017 to be the 11th warmest on record and 2017 to be the second warmest year to date (January through April).

That means that, yes, we are still seeing warming that is basically unprecedented.

Photo: NOAA

Today’s data release was just one of the myriad ways NOAA’s data and research touch our lives. I can’t help but wonder: when people check the weather forecast before leaving the house in the morning—will it rain? will it be hot or cold?—do they wonder how those numbers come about? Do they realize the sheer amount of science that goes into saying what will happen in every small town across the country (and the world)?

Do people think about science at all when they go about their lives? And do they wonder how that science comes to be?

Probably not. But here is why they should.

Science is essential for climate and weather predictions

NOAA (short for the “National Oceanic and Atmospheric Administration”) is one of the lead agencies that helps provide that science. But NOAA’s mission and budget are increasingly under attack from the Trump administration. President Trump’s pick for the new NOAA administrator will soon be announced, and it’s critical that s/he take a strong stance to defend the mission and the budget of the agency.

The National Weather Service, administered by NOAA, is one of the most essential federal institutions for regular citizens’ everyday lives. It is there (and at the Climate Prediction Center) that the data collected by instruments managed by Federal agencies all over the globe, on air, land, and sea, turns into something as important as weather forecasts and seasonal climate outlooks. Data from satellites is routinely used by local stations for tornado warnings, and hurricane tracking is also provided courtesy of those satellites and other instruments, like tide gauges that show the water rising to a flooding threshold, which in turn triggers warnings from the NWS for the affected areas.

It takes very specific and detailed scientific and engineering training to build those instruments in the first place—tide gauges, satellites, thermometers, you name it. And then, science is needed to interpret and make sense of the raw data. And because most people would agree that better forecasts make for improved planning of one’s life—from daily activities to crop planting to storm preparedness—yes, you guessed it, we need better science.

Unfortunately, what we are seeing in this administration is not very promising when it comes to leveraging and supporting science. On many fronts—NASA, NOAA, EPA, DOI, DOE, to name a few—science is being dismissed or ignored, to the detriment of the environment and people like you and me. Proposed budgets include cuts to many scientific programs within agencies. One can’t help but wonder what the consequences (especially unforeseen ones) would be.

NOAA needs more, not less funding

Current funding is already strained to produce enough research to prepare for the increased seasonal variability that we are observing, and that is expected to increase with climate change. We are seeing more devastating floods and worsening wildfire seasons, and many of our coastal cities are seeing significantly more flooding at high tides and during storms, due to sea level rise.

The weather that makes up our climate is behaving so erratically that we need more, not fewer, resources to help us predict and prepare appropriately. Fortunately, Congress has held the line so far on keeping budgets for FY17 close to prior-year levels rather than accepting the drastic reductions proposed by the administration. We are working hard to help ensure that this trend continues when Congress appropriates the FY18 budget. NOAA needs more funding to continue its climate monitoring program and to improve seasonal forecasts and operational programs, which in turn are essential for planning budgets at state and local levels, and for preparedness measures that can save resources, lives, and property.

Wouldn’t it be great if we could tell how much snow is REALLY coming so the right amount of road treatments can be allocated? Or how much rain is going to fall in a very short period of time, and how much that river will rise after that rain? I think we can all agree on that.

The Weather Research and Forecasting Innovation Act of 2017, which was signed into law in April 2017, is a breath of fresh air into NOAA’s forecasting lungs—but it is not enough. It focuses on research into sub-seasonal to seasonal prediction, and better forecasts of tornadoes, hurricanes, and other severe storms, as well as long-range prediction of weather patterns, from two weeks to two years ahead. One important aspect of the Act is its focus on communication of forecasts to inform decision-making by public safety officials.

The Act had bipartisan support and was applauded by the University Corporation for Atmospheric Research (UCAR), a well-respected research institution. It was also championed by Barry Myers, the CEO of AccuWeather and a frontrunner for the position of NOAA administrator. It is definitely a good step, and a long time coming, but we need more. We need continued support for these types of initiatives, and for the broader mission of NOAA.

We need a vision, and the resources to make it happen. We need an administrator who will turn that vision into reality.

NOAA is a lot more than weather forecasts

NOAA plays a large role in the US economy. It supports more than one-third of the US GDP, affecting shipping, commerce, farming, transportation, and energy supply. The data coming from NOAA also helps us maintain public safety and public health, and enable national security.

In addition to the NWS, other programs within NOAA are essential to track climate change and weather, such as the National Environmental Satellite, Data, and Information Service (NESDIS), which supports weather forecasts and climate research through the generation of over 20 terabytes of data daily from satellites, buoys, radars, models, and many other sources. Other important programs include the Office of Oceanic and Atmospheric Research (OAR) and the Coastal Zone Management Program at the Office for Coastal Management (OCM), within the National Ocean Service (NOS).

Those programs provide state-of-the-art data that directly or indirectly affect all the aforementioned segments of Americans’ daily lives.

The US needs talent and resources to continue its top-notch work

In a recent blog, Dr. Marshall Shepherd laid out the five things that the weather and climate communities need from a NOAA administrator: to offer strong support for research; to support the NWS; to fight back against the attack on climate science; to protect the satellite and Sea Grant programs; and to value external science expertise. I couldn’t agree more!

NOAA can be the cutting-edge science agency for a “weather ready nation,” helping communities become more resilient as they prepare for climate change risks. All it needs is a great administrator who will stand up for science and fight for the budget that the agency’s ever-growing needs require. Will the nominee be up for the job? And will Congress and the Trump administration continue to provide the budget the agency needs to do its job well?

Warhead Reentry: What Could North Korea Learn from its Recent Missile Test?

UCS Blog - All Things Nuclear (text only) -

As North Korea continues its missile development, a key question is what it may have learned from its recent missile test that is relevant to building a reentry vehicle (RV) for a long-range missile.

The RV is a crucial part of a ballistic missile. A long-range missile accelerates its warhead to very high speed—16,000 mph—and sends it arcing through space high above the atmosphere. To reach the ground it must reenter the atmosphere. Atmospheric drag slows the RV and most of the kinetic energy it loses goes into heating the air around the RV, which then leads to intense heating of the surface of the RV. The RV absorbs some of the heat, which is conducted inside to where the warhead is sitting.

So the RV needs to be built to (1) withstand the intense heating at its outer surface, and (2) insulate the warhead from the absorbed heat that is conducted through the interior of the RV.

The first of these depends on the maximum heating rate at the surface and the length of time that significant heating takes place. Number (2) depends on the total amount of heat absorbed by the RV and the amount of time the heat has to travel from the surface of the RV to the warhead, which is roughly the time between when intense heating begins and when the warhead detonates.

I calculated these quantities for the two cases of interest here: the highly lofted trajectory that the recent North Korean missile followed and a 10,000 km missile on a normal (MET) trajectory. The table shows the results.

The maximum heating rate (q) is only about 10% higher for the 10,000 km range missile than for the lofted missile. However, the total heat absorbed (Q) is nearly twice as large for the long-range missile, and the duration of heating (τ) is more than two and a half times as long.

This shows that North Korea could get significant data from the recent test—assuming the RV was carrying appropriate sensors and sent that information back during flight, and/or that North Korea was able to recover the RV from the sea. But it also shows that this test does not give all the data you would like to have to understand how effective the heatshield might be before putting a nuclear warhead inside the RV and launching it on a long-range missile.

Some details

The rate of heat transfer per area (q) is roughly proportional to ρV³, where ρ is the atmospheric density and V is the velocity of the RV. Since longer-range missiles reenter at higher speeds, the heating rate increases rapidly with missile range. The total heat absorbed (Q) is the integral of q over time during reentry.

This calculation assumes the ballistic coefficient (β) of the RV is 48 kN/m² (1,000 lb/ft²). The heating values in the table roughly scale with β. A large value of β means less atmospheric drag, so the RV travels through the atmosphere at higher speed. That increases the accuracy of the missile but also increases the heating. The United States worked for many years to develop RVs with special coatings that allowed them to have high β and therefore high accuracy, but could also withstand the heating under these conditions.
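For readers who want a rough sense of where numbers like those in the table come from, here is a simplified sketch. It is not the calculation behind this post: it assumes an exponential atmosphere, a fixed reentry angle for each case, heating proportional to ρV³, and it neglects gravity relative to drag (the classic Allen-Eggers simplification). The entry speeds, angles, and heating constant are illustrative guesses.

```python
# Rough sketch of the reentry quantities discussed above (q, Q, and heating
# duration), using an exponential atmosphere, a fixed reentry angle per case,
# gravity neglected relative to drag, and q = K_Q * rho * V**3. Entry speeds,
# angles, and K_Q are illustrative assumptions, not values behind the post.
import math

BETA = 48e3      # ballistic coefficient, N/m^2 (48 kN/m^2, as stated above)
RHO0 = 1.225     # sea-level air density, kg/m^3
H = 7000.0       # atmospheric scale height, m
G = 9.81         # m/s^2, used only to convert beta into a drag term
K_Q = 1.0e-4     # heating constant (arbitrary units; only ratios matter here)

def reenter(v_entry, gamma_deg, h0=100e3, dt=0.02):
    """Integrate a straight-line reentry; return peak q, total Q, and the
    time q stays above 10% of its peak (a proxy for heating duration)."""
    gamma = math.radians(gamma_deg)
    h, v = h0, v_entry
    q_history = []
    while h > 0.0:
        rho = RHO0 * math.exp(-h / H)
        v -= rho * v * v * G / (2.0 * BETA) * dt   # drag deceleration only
        h -= v * math.sin(gamma) * dt              # descend along the path
        q_history.append(K_Q * rho * v**3)
    q_peak = max(q_history)
    q_total = sum(q_history) * dt
    duration = sum(dt for q in q_history if q > 0.1 * q_peak)
    return q_peak, q_total, duration

# Compare a steep, slightly slower entry with a shallower, faster entry.
cases = [("lofted-style entry", 6200.0, 80.0),
         ("long-range-style entry", 7300.0, 35.0)]
for label, v0, gamma in cases:
    q_peak, q_total, duration = reenter(v0, gamma)
    print(f"{label:24s} peak q = {q_peak:9.3e}, total Q = {q_total:9.3e}, "
          f"duration ~ {duration:4.1f} s")
```

Because q scales with ρV³ while Q accumulates over the whole descent, a faster but shallower long-range entry tends to spread more total heat over a longer time even when its peak heating rate is comparable to that of a steep lofted entry.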

The results in the table can be understood by looking at how RVs on these two trajectories slow down as they reenter. Figs. 1 and 2 plot the speed of the RV versus time; the x and y axes of the two figures have the same scale. The maximum deceleration (slope of the curve) is roughly the same in the two cases, leading to roughly the same value of q. But the 10,000 km range missile loses more total energy—leading to a larger value of Q—and does so over a longer time than the lofted trajectory.

Ad Hoc Fire Protection at Nuclear Plants Not Good Enough

UCS Blog - All Things Nuclear (text only) -

A fire at a nuclear reactor is serious business. There are many ways to trigger a nuclear accident leading to damage of the reactor core, which can result in the release of radiation. But according to a senior manager at the US Nuclear Regulatory Commission (NRC), for a typical nuclear reactor, roughly half the risk that the reactor core will be damaged is due to the risk of fire. In other words, the odds that a fire will cause an accident leading to core damage equals that from all other causes combined. And that risk estimate assumes the fire protection regulations are being met.

However, a dozen reactors are not in compliance with NRC fire regulations:

  • Prairie Island Units 1 and 2 in Minnesota
  • HB Robinson in South Carolina
  • Catawba Units 1 and 2 in South Carolina
  • McGuire Units 1 and 2 in North Carolina
  • Beaver Valley Units 1 and 2 in Pennsylvania
  • Davis-Besse in Ohio
  • Hatch Units 1 and 2 in Georgia

Instead, they are using “compensatory measures,” which are not defined or regulated by the NRC. While originally intended as interim measures while the reactor came into compliance with the regulations, some reactors have used these measures for decades rather than comply with the fire regulations.

The Union of Concerned Scientists and Beyond Nuclear petitioned the NRC on May 1, 2017, to amend its regulations to include requirements for compensatory measures used when fire protection regulations are violated.

Fire Risks

The dangers of fire at nuclear reactors were made obvious in March 1975, when a fire at the Browns Ferry nuclear plant disabled all the emergency core cooling systems on Unit 1 and most of those systems on Unit 2. Only heroic responses by workers prevented damage to one or both reactor cores.

The NRC issued regulations in 1980 requiring electrical cables for a primary safety system to be separated from the cables for its backup, making it less likely that a single fire could disable multiple emergency systems.

Fig. 1 Fire burning insulation off cables installed in metal trays passing through a wall. (Source: Tennessee Valley Authority)

After discovering in the late 1990s that most operating reactors did not meet the 1980 regulations, the NRC issued alternative regulations in 2004. These regulations would permit electrical cables to be in close proximity as long as analysis showed the fire could be put out before it damaged both sets of cables. Owners had the option of complying with either the 1980 or 2004 regulations. But the dozen reactors listed above are still not in compliance with either set of regulations.

The NRC issued the 1980 and 2004 fire protection regulations following formal rulemaking processes that allowed plant owners to contest proposed measures they felt were too onerous and the public to contest measures considered too lax. These final rules defined the appropriate level of protection against fire hazards.

Rules Needed for “Compensatory Measures”

UCS and Beyond Nuclear petitioned the NRC to initiate a rulemaking process that will define the compensatory measures that can be substituted for compliance with the fire protection regulations.

The rule we seek will reduce confusion about proper compensatory measures. The most common compensatory measure is “fire watches”—human fire detectors who monitor for fires and report any sightings to the control room operators who then call out the onsite fire brigades.

For example, the owner of the Waterford nuclear plant in Louisiana deployed “continuous fire watches.” The NRC later found that they had secretly and creatively redefined “continuous fire watch” to be someone wandering by every 15 to 20 minutes. The NRC was not pleased by this move, but could not sanction the owner because there are no requirements for fire protection compensatory measures. Our petition seeks to fill that void.

The rule we seek will also restore public participation in nuclear safety decisions. The public had opportunities to legally challenge elements of the 1980 and 2004 fire protection regulations it felt to be insufficient. But because fire protection compensatory measures are governed only by an informal, cozy relationship between the NRC and plant owners, the public has been locked out of the process. Our petition seeks to rectify that situation.

The NRC is currently reviewing our submittal to determine whether it satisfies the criteria to be accepted as a petition for rulemaking. If it is accepted, the NRC will publish the proposed rule in the Federal Register for public comment. Stay tuned—we’ll post another commentary when the NRC opens the public comment period so you can register your vote (hopefully in favor of formal requirements for fire protection compensatory measures).

BP Hosts Annual General Meeting Amid Questions on Climate Change

UCS Blog - The Equation (text only) -

Tomorrow, BP holds its Annual General Meeting (AGM) in London. BP shareholders are gathering at a time of mounting pressure on major fossil fuel companies to begin to plan for a world free from carbon pollution—as evidenced by last week’s vote by a majority of Occidental Petroleum shareholders in favor of a resolution urging the company to assess how the company’s business will be affected by climate change.

BP was one of eight companies that UCS assessed in the inaugural edition of The Climate Accountability Scorecard, released last October. BP responded to our findings and recommendations, but left important questions unanswered. Here are four questions that we hope BP’s decision makers will address at the AGM.

1) What is BP doing to stop the spread of climate disinformation—including by WSPA?

BP 2016 Score: Fair

In its own public communications, BP consistently acknowledges the scientific evidence of climate change and affirms the consequent need for swift and deep reductions in emissions from the burning of fossil fuels. BP left the climate-denying American Legislative Exchange Council (ALEC) in 2015 (without explicitly citing climate change as its reason for leaving).

Still, the company maintains leadership roles in trade associations and industry groups that spread disinformation on climate science and/or seek to block climate action.

For example, the Western States Petroleum Association (WSPA) made headlines in 2015 for spreading blatantly false statements about California’s proposed limits on carbon emissions from cars and trucks. The association employed deceptive ads on more than one occasion to block the “half the oil” provisions of a major clean-energy bill enacted by California lawmakers.

In response to a question at last year’s AGM about the misleading tactics of WSPA in California, CEO Bob Dudley said, “of course we did not support that particular campaign.” Yet according to the most recent data available, BP remains a member of WSPA and is represented on its board of directors.

Shareholders should be asking how BP communicated its disapproval of WSPA’s tactics in California to the association, and how WSPA responded. And how is BP using its leverage on the board of WSPA to end the association’s involvement in spreading climate disinformation and blocking climate action?

BP is also represented on the boards of the American Petroleum Institute (API) and the National Association of Manufacturers (NAM), both of which are named defendants in a lawsuit brought by youth seeking science-based action by the U.S. government to stabilize the climate system.

UCS’s 2015 report, “The Climate Deception Dossiers,” exposed deceptive tactics by the Western States Petroleum Association (WSPA).

2) Why did BP fund an attack on disclosure of climate-related risks and opportunities?

BP 2016 Score: Fair

BP—along with Chevron, ConocoPhillips, and Total SA—funded a new report criticizing the recommendations of the Task Force on Climate-Related Financial Disclosures (TCFD). The TCFD was set up by the Financial Stability Board (FSB), an international body that monitors and makes recommendations about the global financial system, in recognition of the potential systemic risks posed by climate change to the global economy and economic system. Through an open, collaborative process, the TCFD is recommending consistent, comparable, and timely disclosures of climate-related risks and opportunities in public financial filings.

A broad range of respondents in the TCFD’s public consultation supported its recommendations, and on Monday the We Mean Business coalition issued a statement expressing support for the TCFD recommendations and calling for G20 governments to endorse them. Members of We Mean Business include BSR (Business for Social Responsibility) and the World Business Council for Sustainable Development—both of which, in turn, count BP among their members.

Meanwhile, the US Chamber of Commerce will reportedly roll out the oil and gas company-sponsored report at an event this week. (We found no evidence that BP is a member of the US Chamber.)

In its own financial reporting, BP provides a detailed analysis of existing and proposed laws and regulations relating to climate change and their possible effects on the company, including potential financial impacts, and generally acknowledges physical risks to the company, including “adverse weather conditions,” but does not include discussion of climate change as a contributor to those risks.

So where does BP stand on climate-related disclosures? The company’s shareholders and the business community at large deserve to know, and tomorrow’s AGM is a good opportunity for CEO Bob Dudley to explain why BP’s funding isn’t aligned with its stated positions.

3) How is BP planning for a world free from carbon pollution?

BP 2016 Score: Poor

Both directly and through its membership in the Oil and Gas Climate Initiative, BP has expressed support for the Paris Climate Agreement and its goal of keeping warming well below a 2°C increase above pre-industrial levels.

Last month, the company signed a letter to President Trump supporting continued U.S. participation in the Paris Climate Agreement.

BP has adopted some modest measures to reduce greenhouse gas emissions from its internal operations. The company has set a cost assumption of $40 per tonne of CO2-equivalent for larger projects in industrialized countries, but it is not clear whether BP applies the price to all components of the supply chain.

The company has undertaken efforts to reduce emissions as part of the “Zero Routine Flaring by 2030” pledge, reports annually on low-carbon research and development, and offers a limited breakdown of greenhouse gas emissions from direct operations and purchased electricity, steam, and heat for a year.

Yet BP has no company-wide plan for reducing heat-trapping emissions in line with the temperature goals set by the Paris Climate Agreement. BP’s April 2017 Sustainability Report does little to address BP’s long-term planning for a low-carbon future. CEO Bob Dudley continues to insist that “we see oil and gas continuing to meet at least half of all demand for the next several decades.”

BP’s Energy Outlook webpage confirms that the company’s “Most Likely” demand forecasts, plans for capital expenditure, and strategic priorities are premised on a greater-than-3°C global warming scenario. BP also fails to provide a corporate remuneration policy that incentivizes contributions toward a clean energy transition (read ShareAction’s thorough and thoughtful analysis of BP’s remuneration policy here).

We look forward to hearing how BP responds to shareholder questions about the misalignment of its business plans and executive incentives with its stated commitment to keeping global temperature increase well below 2°C.

4) When will BP advocate for fair and effective climate policies?

BP 2016 Score: Good

BP consistently calls for a government carbon policy framework, including a global price on carbon, and touts its membership in the Carbon Pricing Leadership Coalition.

The question here is simple: when will BP identify specific climate-related legislation or regulation that it supports, and advocate publicly and consistently for those policies?

We will be awaiting answers from BP’s leadership at tomorrow’s AGM.

Three Reasons Congress Should Support a Budget Increase for Organic Agriculture Research

UCS Blog - The Equation (text only) -

Recent headlines about the US Department of Agriculture’s leadership and scientific integrity have been unsettling, as have indications that the Trump administration intends to slash budgets for agriculture and climate research and science more generally. But today there’s a rare piece of good news: a bipartisan trio in Congress has introduced legislation that would benefit just about everyone—farmers and eaters, scientists and food system stakeholders, rural and urban Americans. Not only that, but the new bill promises to achieve these outcomes while maintaining a shoestring budget.

Organic dairy producers need sound science to be able to make informed decisions about forage production for their herds. At this on-farm demonstration at the Chuck Johnson farm in Philadelphia, Tennessee, Dr. Gina Pighetti and her research team from the University of Tennessee and the University of Kentucky grow organic crimson clover (right) and wheat to develop best management practices that will help farmers make production decisions. Source: University of Tennessee.

Representatives Chellie Pingree (D-ME), Dan Newhouse (R-WA), and Jimmy Panetta (D-CA) are sponsoring the Organic Agriculture Research Act of 2017, which calls for an increase in mandatory funding for a small but crucial USDA research program, the Organic Agriculture Research and Extension Initiative (OREI). Congress allocated this program a mere $20 million annually in both the 2008 and 2014 Farm Bills, but that small investment stretched across the country, with grants awarded in more than half of all states. The new bill proposes to increase that investment to $50 million annually in future years.

While a $30 million increase to a $20 million program may seem like a lot, it is worth noting that these numbers are small relative to other programs. For example, the USDA recently announced that its flagship research program, the Agriculture and Food Research Initiative (AFRI), will receive $425 million this year (another piece of good news, by the way). And many R&D programs at other agencies have much higher price tags (e.g., the NIH will receive $34 billion this year). But the return on investment in agricultural research is very high, so this increase could do a lot of good.

Students at UC Davis, under the leadership of Charles Brummer, Professor of Plant Sciences, examine their “jalapeño popper” crop, a cross between a bell pepper and a jalapeño pepper. This public plant breeding pipeline supports organic farming systems by designing new vegetable and bean cultivars with the particular needs of the organic farming community in mind. Source: UC Davis.

While there are many reasons we are excited about a possible budget boost for the Organic Agriculture Research and Extension Initiative (OREI), I’ll highlight just three:

1)  We need more agroecologically-inspired research. More than 450 scientists from all 50 states have signed our expert statement calling for more public support for agroecological research, which is urgently needed to address current and future farming challenges that affect human health, the environment, and urban and rural communities. This call is built upon agroecology’s successful track record of finding ways to work with nature rather than against it, producing nutritious food while also boosting soil health, protecting our drinking water, and more. Unfortunately, the diminishing overall support for public agricultural research is particularly problematic for agroecology: because this research tends to reduce farmers’ reliance on purchased inputs, gaps in funding are unlikely to be filled by the private sector. So programs that direct public funding toward agroecological research and practice are particularly needed, and OREI is one of them.

2)  When it comes to agroecology, this program is a rock star. The OREI funds some of the most effective federal agricultural research, especially around ecologically-driven practices that can protect our natural resources and maintain farmer profits. One highlight of the program is that it stresses multi-disciplinary research; according to the USDA, “priority concerns include biological, physical, and social sciences, including economics,” an approach that can help ensure that research leads to farming practices that are both practical and scalable. Importantly, this program also targets projects that will “assist farmers and ranchers with whole farm planning by delivering practical information,” making sure that research will directly and immediately benefit those who need it most. But it’s not just the program description that leads us to believe this is a strong investment. In fact, our own research on competitive USDA grants found that OREI is among the most important programs for advancing agroecology. And this in-depth analysis of USDA’s organic research programs by the Organic Farming Research Foundation further highlighted the vital importance of OREI.

3) Research from programs like OREI can benefit all farmers, while focusing on practices required for a critical and growing sector of US agriculture. The OREI program is designed to support organic farms first and foremost, funding research conducted on certified organic land or land in transition to organic certification. However, the research from OREI can benefit a much wider group of farmers as well, as such results are relevant to farmers of many scales and farming styles, organic or not. Of course, directing funds to support organic farmers makes lots of sense, since this sector of agriculture is rapidly growing and maintaining high premiums that benefit farmers. But it’s important to recognize that the benefits of the research extend far beyond the organic farming community.

For all of the reasons listed above, this bill marks an important step in the right direction. It is essential that the next farm bill increases support for science-based programs that will ensure the long-term viability of farms while regenerating natural resources and protecting our environment. Expanding the OREI is a smart way forward.

 

One of Many Risks of the Regulatory Accountability Act: Flawed Risk Assessment Guidelines

UCS Blog - The Equation (text only) -

Tomorrow, the Senate will begin marking up Senator Rob Portman’s version of the Regulatory Accountability Act (RAA), which my colleague Yogin wrote a primer about last week. This bill is an attempt to impose excessive burdens on every federal agency, freezing the regulatory process or tying up important science-based rules in years of judicial review.

One of the most egregious pieces of this bill, and an affront to the expertise at federal agencies, is the provision ordering the White House Office of Management and Budget’s (OMB) Office of Information and Regulatory Affairs (OIRA) to establish guidelines for “risk assessments that are relevant to rulemaking,” including criteria for how best to select studies and models, evaluate and weigh evidence, and conduct peer reviews. This requirement on its own is reason enough to reject this bill, let alone the long list of other glaring issues that together would fundamentally alter the rulemaking process.

The RAA is a backdoor attempt to give OIRA another chance to prescribe standardized guidelines for risk assessment that would apply to all agencies, even though each agency conducts different types of risk assessments based on its statutory requirements.

OIRA should not dole out science advice

The way in which agencies conduct their risk assessments should be left to the agencies and scientific advisory committees, whether it is to determine the risks of a pesticide to human health, the risks of a plant pest to native plant species, the risks of a chemical to factory workers, or the risks of an endangered species determination to an ecosystem. Agencies conduct risk assessments that are specific to the matter at hand; therefore an OIRA guidance prescribing a one-size-fits-all risk assessment methodology will not be helpful for agencies and could even tie up scientifically rigorous risk assessments in court if the guidelines are judicially reviewable.

OIRA already tried writing guidance a decade ago, and it was a total flop. In January 2006, OMB released its proposed Risk Assessment Bulletin which would have covered any scientific or technical document assessing human health or environmental risks. It’s worth noting that OIRA’s main responsibilities are to ensure that agency rules are not overlapping in any way before they are issued and to evaluate agency-conducted cost-benefit analyses of proposed rules. Therefore OIRA’s staff is typically made up of economists and lawyers, not individuals with scientific expertise appropriate for determining how agency scientists should conduct risk assessments.

OMB received comments from agencies and the public and asked the National Academy of Sciences’ National Research Council (NRC) to conduct an independent review of the document. That NRC study gave the OMB a failing grade, calling the guidance a “fundamentally flawed” document which, if implemented, would have a high potential for negative impacts on the practice of risk assessment in the federal government. Among the reasons for their conclusions was that the bulletin oversimplified the degree of uncertainty that agencies must factor into all of their evaluations of risk. As a result, the document that OIRA issued a year later, under Portman’s OMB, was heavily watered down. In September 2007, OIRA and the White House Office of Science and Technology Policy (OSTP) released a Memorandum on Updated Principles for Risk Analysis to “reinforce generally-accepted principles for risk analysis upon which a wide consensus now exists.”

Luckily, in this case, the OMB called upon the National Academies for an objective review of the policy, which resulted in final guidelines that were far less extreme. As the RAA is written, it does not require that same check on OIRA’s work, which means that we could end up with highly flawed guidelines with little recourse. And the Trump administration’s nominee for OIRA director is Neomi Rao, a law professor whose work at the George Mason University Law School’s Center for the Study of the Administrative State emphasizes the importance of the role of the executive branch, while describing agency policymaking authority as “excessive.” I think it’s fair to say that under her leadership, OIRA is unlikely to scale back its encroachment into what should be expert-driven policy matters.

Big business is behind the RAA push

An analysis of OpenSecrets lobbying data revealed that trade associations, PACs and individuals linked to companies that have lobbied in support of the RAA also contributed $3.3 million to Senator Rob Portman’s 2016 campaign. One of the most vocal supporters of the bill is the U.S. Chamber of Commerce, whose support for the bill rests on the assumption that we now have a “federal regulatory bureaucracy that is opaque, unaccountable, and at times overreaching in its exercise of authority.” Yet this characterization actually sounds a lot to me like OIRA itself, which tends to be fairly anti-regulatory and non-transparent, and has a history of holding up science-based rules for years without justification (like the silica rule). Senator Portman’s RAA would give OIRA even more power over agency rulemaking by tasking the agency with writing guidelines on how agencies should conduct risk assessments and conveniently not requiring corporations to be held to the same standards.

When OIRA tried to write guidelines for risk assessments in 2006, the Chamber of Commerce advocated for OIRA’s risk assessment guidelines to be judicially reviewable so they could be “adequately enforced,” claiming that agencies use “unreliable information to perform the assessments,” which can mean that business and industry are forced to spend millions of dollars to remedy those issues. It is no wonder, then, that the Chamber would be so supportive of the RAA, which would mandate OIRA guideline development for risk assessments, possibly subject to judicial review. OIRA issuing guidelines is one thing, but making those guidelines subject to judicial review ramps up the egregiousness of this bill. All sorts of troubling scenarios could be imagined.

Take, for example, the EPA’s human health risk assessment for the pesticide chlorpyrifos, a chemical that has been linked to developmental issues in children; the assessment is just one of the studies that will inform the agency’s registration review. The EPA sought advice from the FIFRA Scientific Advisory Panel on using a particular model, the physiologically-based pharmacokinetic and pharmacodynamic (PBPK/PD) model, to better determine a chemical’s effects on a person based on their age or genetics and to predict how different members of a population would be affected by exposure. The agency found that there is sufficient evidence that neurodevelopmental effects may occur at exposure levels well below previously measured exposure levels.
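To make the idea of a PBPK/PD approach concrete, here is a deliberately simplified, hypothetical sketch (nothing like the EPA’s actual model): a one-compartment pharmacokinetic calculation with random variability in clearance across individuals, showing how the same external exposure can translate into different internal doses across a population. All parameter values are invented for illustration.

    # Toy illustration of population variability in a pharmacokinetic model;
    # not the EPA's PBPK/PD model, and all numbers are hypothetical.
    import random
    import statistics

    def steady_state_concentration(dose_rate_mg_per_day, clearance_l_per_day):
        """Steady-state blood concentration (mg/L) = dose rate / clearance."""
        return dose_rate_mg_per_day / clearance_l_per_day

    dose_rate = 0.01        # mg/day of external exposure (assumed)
    mean_clearance = 50.0   # L/day population mean clearance (assumed)
    cv = 0.3                # spread in clearance across individuals (assumed)

    random.seed(0)
    concentrations = []
    for _ in range(10000):
        clearance = max(1e-6, random.gauss(mean_clearance, cv * mean_clearance))
        concentrations.append(steady_state_concentration(dose_rate, clearance))

    concentrations.sort()
    print("median internal dose (mg/L):", statistics.median(concentrations))
    print("95th percentile (mg/L):", concentrations[int(0.95 * len(concentrations))])

A real PBPK/PD model tracks many physiological compartments and links internal dose to a toxic effect, but even this toy version shows why such models matter for risk assessment: the most exposed or most susceptible members of a population, not the average person, determine how protective a rule needs to be.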

If OIRA were to produce risk assessment guidelines that were judicially reviewable, the maker of chlorpyrifos, Dow Chemical Company, could sue the agency on the grounds that it did not use an appropriate model or consider the best available studies, or that its peer review was insufficient. This would quickly become a way for industry to inject uncertainty into the agency’s process and tie up regulatory decisions about its products in court for years, delaying important public health protections. A failure to ban a pesticide like chlorpyrifos based on inane risk assessment criteria would allow more incidents of completely preventable acute and chronic exposure, like the poisoning of 50 farmworkers in California from chlorpyrifos in early May.

“Risk assessment is not a monolithic method”

A one-size fits all approach to government risk assessments is a bad idea, plain and simple. As the NRC wrote in its 2007 report:

Risk assessment is not a monolithic process or a single method. Different technical issues arise in assessing the probability of exposure to a given dose of a chemical, of a malfunction of a nuclear power plant or air-traffic control system, or of the collapse of an ecosystem or dam.

Prescriptive guidance from OIRA would squash the diversity and flexibility that different agencies rely on, depending on the issue, as well as the development of new models and technologies that best capture risks. David Michaels, head of OSHA during the Obama Administration, wrote in his book Doubt Is Their Product that regulatory reform, and in this case the RAA, offers industry a “means of challenging the supporting science ‘upstream.’” Its passage would allow industry to exert more influence in the process by potentially opening up agency science to judicial review. Ultimately, the RAA is a form of regulatory obstruction that would make it more difficult for agencies to issue evidence-based rules by blocking the use of science in some of the earliest stages of the regulatory process.

The bill will be marked up in the Senate Homeland Security and Governmental Affairs Committee tomorrow, and then will likely move on to the floor for a full Senate vote in the coming months. Help us fight to stop this destructive piece of legislation by tweeting at your senators and telling them to vote no on the RAA today.
