Combined UCS Blogs

This Beetle Lays its Eggs in Dead Mice Carcasses and then Covers Them With Mucus – But it’s Endangered and Important

UCS Blog - The Equation (text only)

Photo: Wayne National Forest

The US Fish and Wildlife Service (FWS) rushed a scientific assessment of the endangered American burying beetle (Nicrophorus americanus) in Nebraska, seemingly because the agency didn’t want to disrupt agribusiness. Two biologists who were working on the assessment, Wyatt Hoback and Douglas Leasure, told the Washington Post that the FWS pushed them to conduct their science on an extremely constrained timeline. The beetle has been a source of contention in federal government research since 2013. The species was listed as endangered in 1989, after scientific evidence showed that the beetle had disappeared from more than 90% of its historic range in the US.

Scientists pushed to conduct slapdash research

Drs. Hoback and Leasure are experts on the endangered American burying beetle, particularly its Nebraska population. In 2017, the two biologists published a paper with a map of the species’ distribution in the state. That expertise is why the FWS asked them to help with the beetle’s status assessment under the Endangered Species Act. The FWS wanted the scientists to overlay their 2017 map of the beetle’s distribution with another map projecting future farmland, to identify where agriculture may pose a threat to the beetle in the future.

When the scientists received the map projecting future farmland in Nebraska, which had been created with data from Montana, North Dakota, and South Dakota, they raised major concerns. Those states have very different environments and soils, so Hoback and Leasure didn’t think it would be accurate to overlay the two maps. Hoback said the two maps “were never intended to be put together.” Furthermore, they told the FWS that using a finer-scale map of the beetle’s distribution in the state would be more accurate. The agency gave the scientists “literally one day” to resolve these issues with combining the two maps – an impossible task, the biologists agreed.
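
To see why the biologists balked, it helps to make the requested overlay concrete. Below is a minimal, hypothetical sketch in Python (toy arrays and thresholds of my own invention, not the agencies’ data or code) of what “overlay these two maps” amounts to, and why it silently assumes the two grids describe the same place at the same scale.

```python
import numpy as np

# Hypothetical rasters on the SAME grid: each cell covers the same patch
# of Nebraska at the same resolution. All values are invented for
# illustration.
beetle_occupancy = np.array([[0.9, 0.7, 0.1],
                             [0.6, 0.8, 0.2],
                             [0.1, 0.3, 0.0]])  # chance the beetle is present

future_cropland = np.array([[0.2, 0.8, 0.9],
                            [0.1, 0.7, 0.9],
                            [0.0, 0.4, 0.8]])   # projected chance of conversion

# The overlay itself is trivial: flag cells where a likely beetle
# population coincides with likely agricultural conversion.
threat = (beetle_occupancy > 0.5) & (future_cropland > 0.5)
print(threat)

# The hard part is everything a one-day deadline ignores: if the cropland
# projection was fitted to Montana and the Dakotas, or if the two grids
# differ in extent, resolution, or alignment, the element-wise comparison
# above runs on mismatched quantities and the "threat map" it produces is
# not meaningful.
```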

Due to the constrained timeline, Hoback and Leasure voluntarily left the project, asked to have their names pulled from any resulting reports or publications, and asked that their data not be used. Yet even after the two biologists left the project and were removed as authors, Dr. Leasure received a draft of the assessment from an FWS scientist that he said had “copied word-for-word” a paragraph from his and Dr. Hoback’s 2017 publication. In February, Dr. Hoback alerted the regional director for the agency’s Mountain-Prairie Region, Noreen Walsh, to the plagiarism. Walsh replied that the paragraph was only used in a “pre-decisional draft of the analysis.” I’m glad that Dr. Hoback brought this to the agency’s attention, since plagiarism is clearly defined as a violation of its scientific integrity policy.

The American burying beetle is important

Yes, it’s true that burying beetles are not the most charismatic of creatures. They literally lay their eggs in dead mice (gross) and then cover those mice with mucus (ew…even more gross). Do they also feed their young by vomiting in their mouths? Yes – yes they do. Look, I’m not here to convince you that this beetle is a fluffy cute puppy that you should pick up, take home, pet, and call your “fur baby.” But just because this bug is not the most charismatic of creatures to humans doesn’t mean that it isn’t critically important to our lives.

Recycling dead mice is quite important to ecosystem functioning. The cycling of energy and matter in an ecosystem depends heavily on decomposition. Animals consume approximately 10% of the organic matter generated by plants, and eventually those nutrients are returned to the system, either through excreta or through the decomposition of the animal’s body when it dies (referred to as carrion). Dead animals are a highly nutrient-rich resource that affects everything from scavengers and predators down to the microbial community. Therefore, where an American burying beetle places its mummified mouse, how its mucus affects the decomposition of that mouse, and when that mouse decomposes all have profound effects on nutrient cycling and, subsequently, ecosystem functioning. And you know, you won’t see as many dead rodents around your neighborhood if the beetle does its thing!

Most scientists now agree that we are on the brink of a sixth major extinction event. Such an event will impact humans because we depend heavily on the services these species provide, often unknowingly. But once a species is gone, we cannot bring it back. That’s why we must fight to preserve the world’s rarest plants and animals.

Good science isn’t timed

Science, good science, takes time. For example, one of my PhD projects started in 2011, and it wasn’t published in a peer-reviewed journal until 2017. From conceptualizing the question, to understanding what was already known about it, to determining the appropriate methodology to answer it, to collecting the data, to analyzing the data, and finally to writing a paper for submission to a journal, which then went through a long process of peer review before it was finally accepted – that took six years!

And I would say that is an average time to produce a good scientific product from start to finish. The FWS just asked two biologists to resolve a major methodological issue in “literally one day.” That is simply ridiculous. I find it difficult to believe that the FWS believed this was possible. And if they knew it wasn’t possible, then it seems there may be other reasons why they rushed these biologists. Were they fishing for an answer the whole time?

Hoback and Leasure say that combining the two maps as the FWS wanted would have produced an analysis showing that the American burying beetle population is far safer from agricultural threats than it actually is. That sounds like a manipulation of methods to produce results that fit a political agenda, which is not science. In our 2018 survey of federal scientists, FWS scientists reported that the top barrier to science-based decision-making at their agency was political interference. This example certainly supports that result.

Op-Eds for Cheeseheads: Training New Scientists as Communicators in Wisconsin Food Systems Policy

UCS Blog - The Equation (text only)

“Facts aren’t impartial. They have great implications for people. They threaten people.” A few dozen graduate students and a handful of public employees and farmers in the room nod thoughtfully at Margaret’s comment, laughing as she says, “It has never been a rational world!” On a June afternoon at the University of Wisconsin-Madison, this group is looking to a panel of experts on science communication and advocacy with big questions: how should new scientists start communicating with the public, and where do they have leverage in food systems policy?

Communication and advocacy in Midwestern agriculture

Cattle grazing near Madison, WI

Wisconsin is a unique place to work in agriculture and food systems, which is what drew many people in the room to work here. The state is home to a huge breadth of agricultural activity across its 68,500 farms, with many examples of progressive, farmer-led research and stewardship, and initiatives using cutting-edge technology. However, even with agricultural science and industry woven into state culture, Wisconsin faces the same communication challenges we see in the news across the nation: tension among visions of agriculture as a business, a science, and a public service; conflicts between conservation and production; and differences in urban-rural priorities that leave plenty of new researchers wondering how to connect with the public and legislators over agricultural issues.

We organized a science communication and advocacy workshop, with help from the Union of Concerned Scientists’ Science for Public Good Fund, after hearing that graduate students in the plant sciences wanted to improve their writing, speaking, and tweeting to connect with public policy on food systems. In addition to developing our ability to frame our research for different audiences and issues, we wanted to learn more about how to advocate for and contribute to new policy. Here’s a little of what we learned.

Use language thoughtfully

Eric Hamilton and Kelly Tyrrell brought their experience as science writers at UW-Communications to the question of writing to connect with different audiences. They emphasized the importance of clear, straightforward language, and told the group to always avoid jargon or define it, and to think about buzzwords that carry baggage or might alienate an audience.

Keep it relevant

Workshop participants practice writing a “hook” for the beginning of an op-ed

To make an op-ed or blog post about your work timely, Eric and Kelly suggested using related current events to frame your research or expertise, whether in recent news or through the anniversaries of historical events. Google alerts and organizational newsletters are tools that help researchers stay tuned in to new research or activities on a given topic. Connecting with a new audience on their values and experience is more effective than rebutting their ideas point by point, and finding a topic of connection can frame your story or ideas. Finally, they encouraged us to get out there and use our resources: “the more you write, the more you’ll figure out how to write and what to write about.”

Recommended resources for researchers and new communicators: COMPASS, CaSP, California Council for Science and Technology, The OpEd Project, Pew Trust, National Academies, The Open Notebook, Medium, The Conversation, Massive Science

Greta Landis is a PhD student at the University of Wisconsin-Madison. Her agroecology research is focused on conservation partnerships and decision-making for grazing management on public land. She also works for University of Wisconsin-Extension as a student evaluator.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Photos: G. Landis, T. Campbell

Proposed Changes to the Endangered Species Act Threaten Wildlife

UCS Blog - The Equation (text only)

The endangered black-footed ferret. Photo: USFWS Mountain-prairie

The Trump Administration is threatening species, land conservation, and human health and wellbeing by rolling back our health, safety, and environmental protections. This time the U.S. Fish and Wildlife Service (FWS) and the National Marine Fisheries Service (NMFS) are attempting to undercut the scientific basis of the Endangered Species Act (ESA) by proposing changes that will make it less effective, even increasing the chances that species will go extinct.

FWS and NMFS jointly issued the proposed rule, “Endangered and Threatened Species: Listing Species and Designating Critical Habitat.” This proposal contains troublesome language around economic considerations and planning for the foreseeable future.

Economic considerations

The Departments of Interior and Commerce (via the agencies that implement the law, FWS and NMFS) are proposing to allow for considerations other than science to influence listing decisions. The proposal adds economic considerations in the process for deciding if a species is in fact threatened or endangered, instead of making it a purely scientific decision. Currently, the determination of endangered or threatened status is made without reference to possible economic or other impacts.

The agencies argue in their proposal that including economic information in the listing decision better informs the public. There is, however, ample opportunity in the process of determining how to conserve threatened or endangered species to inform the public of possible economic and other impacts.

Including economic information in the listing decision itself is contrary to the five factors that the law states should be the basis for determining status: “(A) the present or threatened destruction, modification, or curtailment of its habitat or range; (B) overutilization for commercial, recreational, scientific, or educational purposes; (C) disease or predation; (D) the inadequacy of existing regulatory mechanisms; or (E) other natural or manmade factors affecting its continued existence.”

Foreseeable future

The proposal also limits the extent to which the agencies can look to potential harms to a species in the foreseeable future, which could keep climate change impacts like drought, habitat loss from flooding, heat impacts, and range changes out of listing decisions and recovery plans. The definition of a species as “threatened” includes the phrase “within the foreseeable future.” In other words, it requires the agencies to look ahead as they make listing decisions. The proposal says that the agencies will evaluate what the foreseeable future means on a case-by-case basis, but in describing the factors to be considered it does not include a changing climate (only environmental variability). The agencies also state that the foreseeable future should only include the time period for which predictions are “reliable.”

Climate change is not the same as environmental variability. The ongoing effects of global warming are directional, very broad-scale, and have already caused observable and quantifiable harm to species and their habitats. To not account for a changing climate in listing decisions is to ignore a critical factor relevant to the listing criteria. Furthermore, stating that the timeframe to be considered should be based on when projections are “reliable,” without any indication of what that means, is intentionally vague. It should be sufficient to say that the timeframe should be based on the best science available.

Voice your concerns

The Endangered Species Act has prevented 99% of the species listed under the law from going extinct, and it has done so by basing decisions solely on the best available science, without reference to economic impacts. The proposed changes would weaken the ability of the FWS and NMFS to make science-informed decisions when implementing the Endangered Species Act, and would send a loud message to the public that economic considerations will prevail over scientific evidence, even at the cost of an entire ecosystem and the species dependent upon it. Fortunately, there is a way we can voice our concerns and block these changes from taking place. There is still time to submit a public comment to the record, but the comment period for the proposed rule ends September 24th. UCS has created a public comment guide with tips on writing a strong comment for this particular proposed rule, which can be found here.

I would like to thank my colleagues in the Center for contributing their expertise to this blog and for their dedication to protecting the Endangered Species Act as a science-based statute. 

What Happens in the Next 26 Days Could Change Our Food and Farm Future

UCS Blog - The Equation (text only)

Photo: Andy Ciordia/CC BY-NC-ND 2.0 (Flickr)

It feels like I’ve been thinking about the 2018 farm bill forever, but we may have finally reached the beginning of the end. Tomorrow, an unusually large group of 56 (!) negotiators from the House and Senate are expected to shoehorn themselves into a room on Capitol Hill to begin the formal process of reconciling two very different visions of our food and farm system.

What happens next will either help small and midsize farmers thrive, put more healthy food on the dinner tables of our most vulnerable neighbors, and invest in farming practices that prevent water pollution and build healthy soil for the future…or not. There’s also an unfortunate third option, in which the farm bill process fails completely, leaving farmers and eaters in limbo.

But first, let’s review where we are, because this day has been a long time coming. As always, there have been ups and downs in the process of crafting a new farm bill, but this time the resulting bills passed in the House and Senate are particularly far apart on several crucial issues.

Two houses of Congress, two very different visions

The divisive House bill’s attack on the SNAP program (formerly food stamps) would be a disaster for millions of people in this country who struggle to put food on the table—despite a strong economy, and even if they have a job. The House bill also cuts conservation incentives for farmers, completely eliminating the Conservation Stewardship Program (CSP), the USDA’s biggest and most comprehensive initiative that helps farmers take steps to protect their soil and prevent water pollution. The House bill even throws endangered species under the bus as a gift to the pesticide industry.

Contrast all that with the Senate bill, which passed with overwhelming bipartisan support. That bill protects and improves SNAP and includes funding for innovative programs that connect farmers with local consumers, expanding farming opportunities, enabling more people to afford nutritious food, and keeping food dollars in communities. Moreover, the Senate bill maintains CSP and improves the program in a way that would really pay off—returning $1.2 billion in net benefits to farmers and taxpayers (compared to the House bill’s CSP net loss of $4.7 billion in benefits).

More wonky details about the differences between the House and Senate farm bills can be found here and here.

Can negotiations bridge stark farm bill differences?

Leaders of the congressional agriculture committees (including the Senate committee’s chairman and the House committee’s ranking member) keep issuing statements reassuring farmers that the process is on track. But there are signs that negotiations will be very difficult, and as reported last week, hardliners bent on gutting SNAP are vowing not to budge, leaving little room for compromise.

It remains to be seen whether negotiators will come to some agreement before the current farm bill expires on September 30. But with every day that passes, that outcome seems less likely.

So what happens if they don’t beat the clock? Well, there are a couple of scenarios.

Scenario #1, A Farm Bill Fumble: If the House and Senate fail to pass an identical bill before the end of this month, or if the president decides he doesn’t like the bill they pass and refuses to sign it (a thing that could happen), Congress could vote to extend the current legislation for some period of time. That could be a week, a month, or even a year, depending on whether they just need a little more time to negotiate a sticky detail, or rather they want to kick the can to the next Congress.

Regardless of the extension’s duration, legislators would need to vote specifically to continue funding certain programs, due to a quirk in the legislative process (more on that below). And then the whole new-farm-bill-writing process would start over with a new Congress in 2019.

Scenario #2, A Total Farm Bill Fail: But if Congress fails to pass a new bill this month and fails to pass an extension keeping money flowing to key programs, that’s when bad things happen. In this situation—let’s call it a TFBF for short—money would immediately dry up for a variety of programs that lack so-called baseline funding. These are programs that farmers, rural communities, and low-income consumers depend upon.

The Congressional Research Service has a good explanation of farm bill “baseline” funding, with a full listing of programs that don’t have it and would thus be stranded without a new farm bill or an extension of the current one.

While Congress negotiates, we need to keep the pressure on

Let’s assume for the moment that Congress manages to avoid a TFBF at the end of this month. In that case, the entire Congress will need to vote on something—either a new farm bill or an extension of current law. So even for the majority of representatives and senators who won’t be actively negotiating over the next couple of weeks, the farm bill is a thing they need to be thinking about.

You can help keep the pressure up by contacting your representatives and senators. Urge them to reject any farm bill that undercuts SNAP, fails to include local food programs, or eliminates CSP—send an email today.

Why is ExxonMobil Still Funding Climate Science Denier Groups?

UCS Blog - The Equation (text only)

Photo: Mike Mozart/Flickr

A decade after pledging to end its support for climate science deniers, ExxonMobil gave $1.5 million last year to 11 think tanks and lobby groups that reject established climate science and openly oppose the oil and gas giant’s professed climate policy preferences, according to the company’s annual charitable giving report released this week.

Nearly 90 percent of ExxonMobil’s 2017 donations to climate science denier groups went to the US Chamber of Commerce and three organizations that have been receiving funds from the company since it started bankrolling climate disinformation 20 years ago: the American Enterprise Institute, Manhattan Institute and American Legislative Exchange Council, which — in a surprise move — ExxonMobil recently quit. (More on that later.)

The other ExxonMobil denier grantees last year were the Center for American and International Law ($23,000), Federalist Society ($10,000), Hoover Institution ($15,000), Mountain States Legal Foundation ($5,000), National Black Chamber of Commerce ($30,000), National Taxpayers Union Foundation ($40,000), and Washington Legal Foundation ($40,000).

ExxonMobil’s funding priorities belie the company’s purported support for a carbon tax, the Paris climate agreement and other related policies, which it reaffirmed in a January blog post by its public affairs director, Suzanne McCarron. If, as McCarron claims, ExxonMobil is “committed to being part of the solution,” why is the company still spending millions of dollars a year on groups that are a major part of the problem?

ExxonMobil’s history of deceit

There is ample evidence that Exxon has been fully aware of the danger its products pose to the planet since at least the 1980s, and likely even earlier. Nonetheless, the company helped initiate a fossil fuel industry-backed climate disinformation campaign in 1998, a year before it merged with Mobil.

The company’s behind-the-scenes role went largely unnoticed for nearly a decade, but in early 2007, a report by the Union of Concerned Scientists revealed that it had spent at least $16 million between 1998 and 2005 to fund a network of more than 40 think tanks and advocacy groups to manufacture doubt about climate science under the guise of being neutral, independent analysts.

In response to the negative press generated by the UCS report, ExxonMobil vowed in its “2007 Corporate Citizenship Report” to “discontinue contributions [in 2008] to several public policy research groups whose positions on climate change could divert attention from the important discussion on how the world will secure the energy required for economic growth in an environmentally responsible manner.”

Note that the company only promised to stop funding several policy groups, not all, and it did in fact drop some high-profile grantees, including the Cato Institute, Competitive Enterprise Institute, Heartland Institute and Institute for Energy Research. But it never completely ended its support for the disinformation network. From 1998 to 2007 — the year of the pledge — it spent nearly $23 million on it. From 2008 through last year, it spent another $13.17 million, for a total of $36.13 million over the last 20 years. As far as anyone has been able to determine from publicly available data, only Charles and David Koch, the multibillionaire owners of Koch Industries, have spent more to deceive the public about climate science and block government action on climate change.

Last year, $1.35 million of the $1.5 million ExxonMobil spent went to the following four organizations:

US Chamber of Commerce: Sponsoring slanted studies

In 2014, ExxonMobil committed to give $5 million to the US Chamber of Commerce’s Capital Campaign in $1 million-a-year increments on top of its annual dues, despite the lobby group’s history of misrepresenting climate science and the economics of transitioning to clean energy. Last year, the company kicked in another $15,000 for the Chamber’s Corporate Citizenship Center, bringing its total donation to $1,015,000.

If one takes ExxonMobil’s climate policy claims at face value, the Chamber’s positions are the polar opposite.

ExxonMobil has been very vocal about its support for the Paris climate agreement, for example, and during its former CEO Rex Tillerson’s brief stint as US secretary of state, he reportedly implored President Trump to keep the United States in it. What did Trump cite last year when he announced he was pulling out of the accord? A widely debunked report from the US Chamber of Commerce.

Cosponsored by a former ExxonMobil grantee — the American Council for Capital Formation (ACCF) — the report maintained that the Paris accord would cost the US economy nearly $3 trillion over the next several decades and eliminate 6.5 million industrial sector jobs by 2040.

According to analyses by the Associated Press (AP), Politifact and The Washington Post, however, the Chamber and ACCF cooked the books. As the AP put it: “The study makes worst-case assumptions that may inflate the cost of meeting US targets under the Paris accord while largely ignoring the economic benefits to US businesses from building and operating renewable energy projects.”

American Enterprise Institute: Undue faith in the market

The American Enterprise Institute (AEI), an 80-year-old free-market think tank in Washington, D.C., has received more from ExxonMobil than any other climate science denier organization. In 2017, ExxonMobil gave AEI $160,000, bringing its total to $4.49 million since 1998.

Economist Benjamin Zycher, the only AEI staff member who writes regularly about climate issues, rejects mainstream climate science, insists a carbon tax would be “ineffective,” and has called the Paris agreement an “absurdity.” He not only disagrees with ExxonMobil’s professed climate policy positions, he has attacked the company for taking them.

Zycher’s colleague Mark Thiessen, a regular contributor to The Washington Post, also dismisses the Paris accord, maintaining that “free enterprise, technology, and innovation—not pieces of parchment signed in Paris and Kyoto — will revolutionize how we produce and consume energy.” Never mind that it often takes regulations to drive innovation and force corporations to adopt cleaner technology. Without federally mandated air pollution controls, for example, power plants and other industrial facilities would be emitting considerably more toxic pollution than they do today.

Manhattan Institute: Propaganda masquerading as news

Another free-market think tank, the Manhattan Institute, received $115,200 from ExxonMobil last year for its Center for Energy Policy. Since 1998, it has received $1.25 million. Like Zycher and Thiessen at AEI, Manhattan Institute fellows oppose a carbon tax and the Paris accord.

Earlier this year, the New York City-based organization hired longtime TV newsman John Stossel, former host of Fox Business Network’s “Stossel” and ABC’s “20/20,” to interview Manhattan Institute Senior Fellow Oren Cass for a slickly produced, 4-minute YouTube segment titled “The Overheated Costs of Climate Change.”

Cass, who regularly testified before Congress against Obama administration climate efforts, told Stossel that the Paris climate agreement “was somewhere between a farce and a fraud.” Stossel wholeheartedly agreed. “The Earth is warming,” Stossel intoned in his wrap-up. “Man may well be increasing that. But the solution isn’t to waste billions by forcing emissions cuts here while other countries do nothing. Well, pretend to make cuts. Trump was right to repudiate this phony treaty.”

Waste billions while other countries do nothing? Besides the fact that it is now cheaper to produce electricity from utility-scale solar and wind energy in the United States than from nuclear, coal, and even natural gas, as of last November — a year after the Paris agreement officially went into effect — China, India, and other major carbon emitters were already making significant progress in meeting their Paris accord commitments.

The other glaring problem with the segment is that it’s a prime example of fake news. With a former network news show host playing anchor, viewers could easily mistake the piece for a clip from a legitimate newscast. At least one member of the conservative echo chamber treated it that way. The Washington Free Beacon, an online news organization funded by GOP megadonor Paul Singer, ran a news story about the Stossel-Cass interview on March 19.

American Legislative Exchange Council: Fossil fuel industry ‘bill mill’

On July 12, ExxonMobil announced it had ended its longtime membership in the American Legislative Exchange Council after a disagreement over the corporate lobby group’s climate policy. From 1998 through last year — when ExxonMobil reported it gave the group $60,000 — ALEC received $1.93 million from the oil company.

Over the last two decades, ALEC has routinely featured climate science deniers at its conferences and supplied state lawmakers with a range of fossil fuel industry-drafted sample legislation, including bills that would restrict investment in renewables, eliminate incentives for electric vehicles, and hamper the solar industry from selling electricity directly to residential and business customers.

Since 2012, more than 100 corporations, including BP, ConocoPhillips, Royal Dutch Shell and electric utilities Entergy, Pacific Gas & Electric and Xcel Energy, have severed ties with ALEC, in many cases because of its regressive policy positions.

ExxonMobil’s exit from ALEC came just months after the company fought to defeat a draft resolution sponsored by the Heartland Institute — an ExxonMobil grantee from 1998 through 2006 — calling on the Environmental Protection Agency to “reopen and review” its “flawed” conclusion that climate change poses a threat to human health. The EPA’s “endangerment finding” requires the agency to regulate carbon dioxide and other global warming emissions as hazardous pollutants under the Clean Air Act.

After ExxonMobil and the Edison Electric Institute (EEI), a utility trade group, objected to the resolution, the Heartland Institute withdrew it and accused the two of being in league with the likes of Greenpeace and the Sierra Club.

“Big corporations like ExxonMobil and trade groups like EEI have long been members of the discredited and anti-energy global warming movement,” Heartland’s president, Tim Huelskamp, said in a December 7 press release. “They’ve put their profits and ‘green’ virtue signaling above sound science and the interests of their customers.”

Huelskamp’s ludicrous assertion notwithstanding, some might construe ExxonMobil’s exit from ALEC as a welcome change in direction. The company’s money trail, however, clearly shows that it is still financing climate science denier groups that denigrate any and all climate policy options and provide cover for Congress and the current administration to do nothing. Until ExxonMobil stops funding these groups, its avowed support for a carbon tax, the Paris agreement and other climate initiatives can’t be seen as anything more than a cynical PR ploy.

Trump Administration’s “Affordable Clean Energy” Rule Is Anything But

UCS Blog - The Equation (text only)

Photo: Wigwam/Flickr

If there’s one thing you need to know about the Affordable Clean Energy (ACE) rule, the Trump Administration’s new proposal for limiting carbon emissions from power plants, it’s this: ACE was not designed to reduce emissions; ACE was designed to boost generation from coal plants.

Which is audacious! A clean air standard that somehow manages to increase the nation’s use of its dirtiest power source, even when compared against a scenario with no carbon standards at all?

Remarkably, yes.

Because under the cover of establishing emissions guidelines, ACE is actually peddling regulatory work-arounds that aim to increase coal generation, a brazen attempt at stalling the industry’s precipitous decline.

How could something like this possibly come to pass from an agency whose core mission is to protect human health and the environment? A proposal that not only manages to increase emissions, but also worsens public health and raises costs?

Here, we’ll take a look.

With ACE, something is worse than nothing

ACE is the Trump Administration’s proposed replacement to the Clean Power Plan (CPP), a standard developed by the Obama Administration’s Environmental Protection Agency (EPA) to cut carbon pollution from power plants. Both ACE and the CPP are underpinned by the agency’s Endangerment and Cause or Contribute Findings, which necessitate that EPA regulate carbon emissions to protect human health and welfare.

Importantly, ACE doesn’t question the Endangerment Finding, nor EPA’s responsibility to act. Instead, this replacement reflects the fact that EPA’s current leadership believes—though will not allow the courts to decide—that the CPP exceeded the agency’s statutory authority, and thus that a far narrower approach to standard setting is appropriate instead.

Consequently, whereas the CPP had allowed achieving emissions reductions by letting cleaner plants run more—an efficiency made possible by the interconnected nature of the power system—the new proposal only considers emissions improvements from upgrades at individual coal plants.

Specifically, EPA’s new analysis concludes that the only viable carbon emissions reductions from the power sector are heat-rate improvements (HRI) at select coal-fired power plants, resulting in emissions efficiency gains on the order of <1 to a few percent on a unit-by-unit basis. EPA projects that this approach would lead to total additional carbon dioxide emissions reductions of 0 to 1-2 percent compared to no policy at all.
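
For a sense of scale, here is a back-of-the-envelope sketch of how a heat-rate improvement maps onto a plant’s emissions rate. The heat rate and coal carbon content are assumed round numbers of my own, not EPA modeling inputs.

```python
# Back-of-the-envelope arithmetic for a heat-rate improvement (HRI).
# The heat rate and carbon content below are assumed round numbers,
# not EPA modeling inputs.

heat_rate = 10_500              # Btu of coal burned per kWh generated
co2_per_btu = 210 / 1_000_000   # lb CO2 per Btu (~210 lb per MMBtu of coal)

rate_before = heat_rate * co2_per_btu               # lb CO2 per kWh
rate_after = heat_rate * (1 - 0.02) * co2_per_btu   # after a 2% HRI

print(f"before: {rate_before:.3f} lb CO2/kWh")
print(f"after:  {rate_after:.3f} lb CO2/kWh")
# Burning 2% less coal per kWh emits 2% less CO2 per kWh -- the per-unit
# gain can never exceed the heat-rate improvement itself.
```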

But if only this proposal stopped at the point of being shamefully unambitious! At excluding improvements viable even within its unreasonably narrow approach, like co-firing cleaner fuels alongside coal, or deploying carbon capture and sequestration (CCS). At concluding that no reductions are viable at natural gas plants, even though their emissions present a major and growing challenge. At trying to do the absolute bare minimum to meet the agency’s statutory obligation to act.

If only, if only. But in fact, this feeble designation of the “best system of emissions reduction” (BSER) is only just the start.

Easing protections to boost coal

Alongside a dramatic weakening of the BSER, EPA advances two additional modifications that—though obscure—have the net effect of shifting ACE from impotent to unfortunately significant.

First, EPA proposes affording states wide latitude in determining whether their individual coal plants should pursue HRI, emphasizing extreme deference to states in the face of “source-specific factors.” As a result, even though EPA’s modeling allowed for coal plants to either make improvements or retire, almost certainly many units would choose the third option instead: keep running and change nothing. Which casts EPA’s projections of nearly negligible carbon emissions reductions from ACE into even more doubt.

But the stunning truth for the coal industry is that protecting plants from the costs of new regulations is still not enough to overcome its true existential threat: dismal economics compared to other energy sources.

And thus the second proposed change: an amendment to an EPA program called New Source Review (NSR).

NSR exists to protect the public from potentially significant increases in pollution from sites undertaking major construction projects, requiring those emitters that could trigger such increases to simultaneously upgrade their pollution controls.

Industry has repeatedly worked to eliminate NSR, balking at the potential costs of implementing new public health safeguards; however, the courts have consistently struck these efforts down. Now, EPA air chief Bill Wehrum is trying again.

The agency is proposing that if the hourly emissions rate of a plant doesn’t increase, the plant should be excused from NSR requirements. And the hourly rate won’t increase, because upgrades are specifically intended to improve it. But at an annual level? Well, if a plant incorporates a major efficiency upgrade, it’s likely to run a whole lot more as a result. Therefore, even though hourly emissions will not increase, annual emissions will.
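
A toy calculation (all figures invented for illustration; this is not EPA’s analysis) shows the arithmetic of the loophole:

```python
# Hourly vs. annual emissions after an efficiency upgrade.
# All figures are invented for illustration.

tons_per_hour_before = 500     # CO2 emitted per hour of operation
hours_per_year_before = 4_000  # roughly a 46% capacity factor

# A 3% efficiency upgrade lowers the hourly emissions rate...
tons_per_hour_after = tons_per_hour_before * 0.97
# ...but the now-cheaper-to-run plant gets dispatched more often.
hours_per_year_after = 5_000

annual_before = tons_per_hour_before * hours_per_year_before  # 2,000,000
annual_after = tons_per_hour_after * hours_per_year_after     # 2,425,000

print(f"{annual_before:,} tons -> {annual_after:,.0f} tons")
# The hourly rate fell 3%, so under the proposal the plant escapes NSR
# review -- yet annual emissions rose by more than 20%.
```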

The result is a green light from EPA to sink major investment dollars into plants—regardless of pollution, regardless of payoff, and regardless of public health costs.

In the ACE proposal, costs are high and benefits are low

And the upshot?

When the three tactics come together—an appallingly weak BSER, wide latitude afforded states to excuse plants from compliance, and work-arounds for NSR—we’re left with an emissions standard that is projected to increase coal generation even beyond that expected in a future with no carbon standard at all.

Generation Mix (thousand GWh). EPA’s analysis projects more generation from coal plants in the ACE scenarios (HRI options) than in the scenarios for CPP or full CPP repeal (RIA Figure 3-1).

It’s unsurprising, then, that when compared against the CPP, ACE is projected to result in increased emissions and worse pollution, leading to more illnesses and more premature deaths. EPA’s own analysis projects annual increases on the order of tens of thousands of asthma attacks and lost work days, and hundreds to as many as 1,500 more deaths annually from changes in fine particulate matter and ozone alone.*

EPA’s analysis projects annual increases in premature deaths from higher levels of fine particulate matter and ozone in the ACE scenarios (HRI options) relative to the CPP (RIA Table 4-6).

What’s more surprising, though, is that despite this administration’s continued rhetoric about the extreme burden of the CPP, the ACE proposal could—wait for it—actually cost more.

EPA’s analysis projects higher compliance costs in multiple years for ACE scenarios (HRI option) compared to the base case of the CPP (RIA Table 3-11). When foregone climate and health benefits are included, the value of the CPP vastly exceeds ACE across all years (RIA Tables 6-9 – 6-11).

Because unlike the CPP, this proposal forces compliance on a plant-by-plant basis, meaning operators don’t have the option of turning to other resources to achieve far greater reductions at far lower costs. That myopic view of our integrated power sector is extremely economically inefficient.

Further, these cost comparisons are even less favorable for ACE than at first glance, because when EPA compared ACE against the CPP, it did not include energy efficiency as a possible CPP compliance mechanism and it did not include trading between states—two major opportunities for cost reductions. This means that in reality, the CPP likely would have cost even less, even though it would have achieved far more.

ACE deals the nation a losing hand

In releasing this proposal, Acting EPA Administrator Andrew Wheeler triumphantly declared that the agency would no longer be “picking winners and losers.”

Unfortunately, it appears his strategy is to just pick losers instead.

And that deals us all a losing hand. For our health, our environment, our savings, our future.

Because it’s hard to imagine a worse long-term investment than pumping hundreds of millions of dollars into old coal plants for net societal losses in the face of inevitable industry decline. But that’s the exact course that the Trump Administration’s EPA is trying to force our nation to take.

Indeed, the administration willfully ignores the fact that for effectively no upside, real communities will be left in the lurch for a long time to come, paying for misplaced investments with their wallets, and paying for worsened pollution with their lives.

 

* NOTE: In parallel rulemakings, EPA is working to: 1) restrict the use of scientific studies that let us ascertain such impacts, and 2) limit the degree to which such impacts will factor into future decision-making.

The sociopolitical evolution of a scientist: incorporating advocacy into my graduate school experience

UCS Blog - The Equation (text only)

During September of 2016, I was excited to begin my bioengineering master’s program in Boston, home to the world’s largest community of biomedical researchers. But on November 8th, the US political landscape abruptly transformed, and suddenly my research studying how cancer spreads throughout the body felt microscopic. The aftermath of the 2016 election forced me to examine my identity; I saw how the wave of anti-LGBT rhetoric and violence left my community feeling unsafe. Raised by a family of immigrants, I saw my lab mate barred from entering the country after visiting her family in Iran. And as a scientist, I saw how the spread of misinformation caused public distrust in science, permeating our highest levels of government.

I’ve always believed that science could and should have an impact on people’s lives. My interest in science was sparked by my cardiologist, who explained how engineers built the device that allowed her to visualize my heart’s electrical pathways, find my arrhythmia, and fix it. But in this climate, I worried that scientific research would not have the same impact on society – that our knowledge would not be reflected in our policies.

Finding my community: early career scientists making an impact

Amidst the barrage of misinformation and climate change deniers in positions of power, I knew that input from scientists was needed, but I wasn’t sure how I could make an impact as a graduate student. I started attending meetings of MIT’s Science Policy Initiative (SPI), and the discussions planning SPI’s annual visit to Capitol Hill for STEM on The Hill Day gave me a sense of purpose. We were there for the same reason – to ensure scientists have a role in policy-making.

On the Hill, I met with staffers from Senator Coons’ office to advocate against proposed cuts to the National Institutes of Health budget and the National Oceanic and Atmospheric Administration’s Sea Grant program. Fortunately, the senator’s office agreed, and we asked Senator Coons to circulate a dear-colleague letter to gather wide support in opposing these cuts. A small but important endeavor, this ask made the meeting effective and opened the door for future dialogue. Overall, this experience was valuable training in communicating my science to policy-minded people.

Science advocacy on campus

I realized that effective advocacy and communication were skills most graduate students were interested in but didn’t know where to develop. Then I learned of a grant offered by the Union of Concerned Scientists (UCS) to expand community-based science advocacy, and I was awarded a Science for Public Good Fund grant to implement a science advocacy workshop series at Northeastern University.

Planning a three-part workshop by myself was no easy task, and I suffered from a serious case of impostor syndrome – there were moments where I felt unprepared to lead a workshop on advocacy. However, the mentorship provided by the staff at UCS helped me craft an effective event. They connected me with resources and experts in science advocacy, some of whom served as speakers. Importantly, the workshop helped pull together a group of graduate students whose passion for science-backed decision making formed the base of a new advocacy community at Northeastern. I realized it’s never too early to reach out and find a group of graduate students with similar passions to help initiate more formal skill-based programming efforts.

Citizen-scientists: Re-thinking graduate education and the roles of scientists outside the lab

Planning the workshop would not have been possible without leaning on my network of science communicators. Be that as it may, more structured university-driven science advocacy resources are needed at the student level. Likewise, while my experience in science advocacy took place in the context of my university, graduate programs must place more curricular emphasis on communicating the real-world implications of the important science being generated by their graduates.

For now, we graduate students need to reclaim our graduate school experiences to be that source of change. We need to push our universities and fellow scientists to think about how their scientific findings impact society, and more generally how their scientific training is valuable to the policy-making process. Building on existing university support systems to create student groups with funding and meeting space helps establish a local network. University government liaison offices are often willing to support student-driven efforts, and meeting with state representatives can be an easy way to start conversations and build long-lasting relationships with policy-makers.

While we as scientists gather information, as citizens and inhabitants of the world, we have a responsibility to ask “How is my work being used in the world?” STEM graduates are asking this question now, more than ever. To support early career scientists stepping into these roles, we need to support motivated graduate students in building networks, seeking out real-world experiences, and demonstrating to universities the importance of supporting these efforts.

Alex Hruska is a bioengineering PhD student at Brown University, working to develop new biomimetic models to study cancer cell invasion and phenotypic plasticity. He is passionate about amplifying the role of scientists in policy and governance. Find him on Twitter @alex_hruska

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

 

Brett Kavanaugh: Enemy of Innovation

UCS Blog - The Equation (text only)

The Supreme Court at night. Photo: Wikimedia

The confirmation fight over Supreme Court nominee Brett Kavanaugh begins next week with a hearing on Tuesday. Supporters and opponents are drawing battle lines over crucial issues such as abortion, health care, immigration, and whether the President is subject to criminal processes. But, as I wrote in an earlier blog, the nominee’s views on the role of federal agencies in protecting public health, safety and the environment deserve our attention as well.

Unlike others before him, Brett Kavanaugh is no “stealth nominee.” As a judge on the DC Circuit Court of Appeals, Judge Kavanaugh authored many opinions on the role of federal agencies, and these opinions provide an unusually expansive window into his thinking.

Unfortunately, a careful review of his opinions reveals a disturbing pattern:

Judge Kavanaugh is hostile to innovation by executive branch agencies. He has such rigid and antiquated views of the respective roles of Congress and executive agencies that he leaves little room for federal agencies to try new approaches to existing problems or to take on new challenges. This should alarm not just those on the left who would like to see a more robust federal response to threats to public health, the environment, worker safety, and the like, but conservatives as well, who should also want government to be nimble and able to adjust to new circumstances.

To see this pattern, follow me on a guided tour of his thinking in three key cases.

Interstate air pollution and the “Good Neighbor” rule

Air pollution crosses state boundaries, and many states are in the unenviable position of having dirty air even though they are effectively controlling pollution sources within their state. For example, even if Maryland were to shut down every business in its state that emits ozone-causing pollutants, portions of the state would still be in violation of federal ozone standards due to pollution from neighboring upwind states. There is a provision in the federal Clean Air Act, colloquially called “the Good Neighbor” rule, that prevents one state from causing or significantly contributing to another state’s violation of federal air quality standards.

The problem is that it is fiendishly complex to implement the good neighbor rule. Many “upwind” states emit multiple pollutants to many downwind states, many downwind states receive multiple pollutants from multiple upwind states, and some states are both upwind and downwind states. Thus, it is exceedingly difficult to point a finger at any one particular upwind state and say that it is “responsible” for any downwind’s state air quality, and even more difficult to devise a formula to fairly and effectively apportion responsibility.

In 2011, after many false starts, the Environmental Protection Agency (EPA) crafted an ingenious “Transport Rule” to address the problem. The EPA conducted extensive analysis of the costs of pollution control to determine how expensive it would be, per ton of pollutant reduction, to ensure that upwind states in the aggregate do not cause downwind states’ air quality in the aggregate to exceed federal standards. The EPA then gave each upwind state a pollution “budget” for the state to use to reduce the pollutants that were wafting beyond their borders, based on this “cost per ton” reduction benchmark. In this way, just enough pollution would be reduced so that upwind states would not tip a downwind state into non-compliance, and the amount of each state’s pollution reduction would be based on a common yardstick of cost-effectiveness.
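
To make the mechanics concrete, here is a toy model of the Transport Rule’s logic. The states, tonnages, costs, and reduction target below are all invented; this is a sketch of the idea, not the EPA’s actual data or algorithm.

```python
# Toy version of the Transport Rule's uniform cost-per-ton benchmark.
# States, tonnages, costs, and the target are all made up.
# Each tuple: (state, tons of pollution it can reduce, cost per ton in $).
abatement_options = [
    ("State A", 100, 500), ("State A", 80, 2_000),
    ("State B", 150, 800), ("State B", 60, 3_500),
    ("State C", 120, 1_200), ("State C", 90, 5_000),
]

target = 350  # total upwind reductions needed to protect downwind air

# Walk up the cost curve until the target is met; the cost of the last
# option used becomes the common cost-per-ton benchmark.
reduced, benchmark = 0, None
for state, tons, cost in sorted(abatement_options, key=lambda o: o[2]):
    if reduced >= target:
        break
    reduced += tons
    benchmark = cost

print(f"benchmark: ${benchmark}/ton, reductions achieved: {reduced} tons")
# Each state's "budget" is then its emissions minus everything it can
# reduce at or below the benchmark: a common yardstick of cost-
# effectiveness, rather than a (fiendishly hard) attempt to assign each
# state its proportionate share of each downwind violation.
```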

But Judge Kavanaugh struck this plan down. In his view, Congress had not expressly embraced this particular approach, and therefore the EPA was not allowed to implement it. His decision instead required EPA to determine each upwind state’s “proportionate responsibility” for pollution in downwind states and base the required reductions on that (even though the statute does not explicitly require that approach). Judge Kavanaugh’s decision largely ignored the compelling practical difficulty of assigning proportionate responsibility, or the many economic benefits of the EPA’s proposed approach.

As a result, his ruling would have consigned downwind states to many more years of air pollution while the EPA grappled with how to implement it.

Had Judge Kavanaugh’s “proportionate responsibility” approach been required by the law, that would be one thing. But it wasn’t. The Supreme Court, on a 6-2 vote that included Justices Kennedy and Roberts, found that the statute did not require a proportionate responsibility approach (even assuming one could be fashioned). Instead, the justices ruled that Congress had vested the EPA with broad discretion to devise an appropriate remedy, and that the Transport Rule was both fair and cost-effective.

The Clean Power Plan oral argument

This same apparent hostility to agency innovation was on display in Judge Kavanaugh’s comments on the Clean Power Plan during a court hearing. That case involved a challenge to the Obama Administration’s Clean Power Plan, the nation’s first-ever rules to limit carbon pollution from coal- and gas-fired power plants, one of the largest sources of greenhouse gases in the United States. The Clean Power Plan, a measure that received extensive input from UCS and many others, relied on an infrequently used provision of the Clean Air Act that allows the EPA to require polluters to use the “best system of emissions reduction” to address pollutants such as greenhouse gases.

After years of review and receipt of over 4 million comments, the EPA issued a final rule in October 2015. The EPA determined that the “best system of emissions reduction” for carbon pollution from power plants included three strategies that are in widespread use today—improving the efficiency of coal plants, switching from coal to gas, and substituting low or no carbon generation, such as wind, solar and nuclear. The EPA quantified the emissions reduction that would be possible using these strategies, and devised a national standard based on this quantification. The rule was intended to cut carbon emissions from power plants by approximately 30 percent by 2030, and formed a key component of the United States’ pledge to reduce its overall emissions as part of the Paris Climate agreement.

Industry and states filed suit to challenge the Clean Power Plan, and the case was heard by the DC Circuit court of appeals. No decision was ever issued on the case, but the court held an all-day oral argument in which Judge Kavanaugh participated. His questions and comments were revealing.

A major point of debate focused on the unusual nature of the regulation. When regulating conventional air pollutants, EPA often sets pollution control standards by focusing on what each plant can do with pollution controls at the source to cut pollution, e.g. a scrubber to lower sulfur dioxide emissions, or a baghouse to collect soot. In the Clean Power Plan, in contrast, EPA established CO2 limits by focusing not on what each individual plant could do to cut CO2, but rather what the system as a whole could do by shifting away from coal-based generation towards gas and renewables.
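
A rough worked example shows why the system-wide framing matters. The emission rates below are ballpark figures for typical coal and combined-cycle gas units, and the generation shares are assumptions for illustration, not numbers from the rule.

```python
# Why shifting generation cuts more CO2 than at-the-smokestack tweaks.
# Emission rates are ballpark figures; the shares are assumptions.

COAL_RATE = 2_200  # lb CO2 per MWh, typical coal steam unit
GAS_RATE = 900     # lb CO2 per MWh, typical combined-cycle gas unit
WIND_RATE = 0      # lb CO2 per MWh at the point of generation

demand_mwh = 1_000_000  # a hypothetical slice of annual demand

# Option 1: keep everything on coal, with a 2% heat-rate improvement.
option1 = demand_mwh * COAL_RATE * 0.98

# Option 2: serve the same demand with 40% coal, 50% gas, 10% wind.
option2 = demand_mwh * (0.4 * COAL_RATE + 0.5 * GAS_RATE + 0.1 * WIND_RATE)

print(f"source-level fix: {option1:,.0f} lb CO2")
print(f"generation shift: {option2:,.0f} lb CO2")
# The shift cuts emissions by roughly 40%, versus roughly 2% for the
# at-the-source fix: the "system" in "best system of emissions
# reduction" is the grid, not each plant in isolation.
```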

The opponents contended that this “beyond the fenceline” approach rendered it illegal, because Congress had not specifically authorized it.

Judge Kavanaugh’s questioning at the hearing demonstrated that he bought into this line of thinking. Judge Kavanaugh stressed repeatedly that the rule would have significant economic consequences, that the EPA was using a previously unused provision of the Clean Air Act to implement this approach, and that Congress had not specifically embraced the policy of shifting to low or no carbon generation. Judge Kavanaugh seemed unmoved by the strong counterarguments that: 1) EPA had a mandatory duty under the act to lower carbon pollution from power plants; 2) this was the most cost-effective and tested method of doing so; and 3) it fit the statutory command to deploy “the best system of emissions reduction.”

While the court never issued a ruling, it seemed clear that Judge Kavanaugh was prepared to strike down the rule on this basis, leaving behind no remedy for carbon pollution from power plants.

The Case of the Killer Whale

In 2010, an employee of Sea World was lying on a platform above a pool during a whale training show when a killer whale dragged her into the water, maiming and drowning her. This marked the third death by killer whales in a roughly 30-year period.

The Occupational Safety and Health Administration (OSHA) responded by requiring the company to ensure minimum distances and physical barriers between a trainer and a whale.

Sea World challenged this order, claiming that OSHA impermissibly extended its authority to regulate the risks of sporting events. Two of the three judges, including Merrick Garland, President Obama’s ill-fated Supreme Court nominee, dispensed with the challenge, ruling that OSHA had the authority to require these commonsense safeguards for workers.

Not so Judge Kavanaugh. His dissenting opinion begins as an elaborate paean to the thrill of sporting events in which physical risks are present. He never actually critiques the solution that OSHA devised on the merits, but rather deploys the familiar lawyer's trick of a "parade of horribles," claiming, e.g., that if OSHA can regulate killer whale shows, it can prohibit tackling in football or set speed limits for NASCAR racing (things that OSHA has never done). All of this, according to Kavanaugh, would go well beyond the authority that Congress intended OSHA to have.

As for the physical safety of employees who work with whales—according to Kavanaugh’s logic, that would be up to Congress to legislate.

Common threads

What unites these opinions—and others like them—is that, in each of these cases, Judge Kavanaugh struck down solutions (or appeared poised to do so) when a federal agency responded to an existing problem with a novel approach or sought to address a new problem in a manner we should all value—with creativity, scientific evidence, consideration of costs and benefits, and an eye toward feasibility and practicality. In none of these cases did the agency violate any specific provision of its authorizing statute. But in all of these cases, Judge Kavanaugh opposed these solutions on the theory that Congress had not specifically blessed the choice the agency had made.

Judge Kavanaugh and his defenders claim that curbing the power of agencies is essential to ensuring that elected leaders in Congress, rather than unelected bureaucrats, make the fundamental policy choices. This seemingly benign principle is either naïve, malevolent, or both.

The fact of the matter is that Congress is largely paralyzed and incapable of passing legislation on virtually any important issue—witness the stalemates on immigration, gun control, climate, health care, and many others. And even when Congress manages to overcome gridlock, it necessarily legislates in broad generalities, not specifics. This is because Congress does not have a crystal ball to foresee all the possible variations of a problem or all the best solutions to it. That is why Congress wisely delegates implementation to agencies staffed with experts, and why we use a process of notice and comment to ensure that all views are heard before a regulation becomes final.

There is an important role for the courts in this rulemaking process: judges must make sure that agencies do not violate the law or disregard sound reasoning and evidence. But Judge Kavanaugh takes the judicial role too far. His insistence that Congress specifically endorse an agency plan that is otherwise scientifically sound and legally within its discretion is a formula for paralysis, and the maintenance of the status quo (which helps explain his appeal to groups such as the Koch Brothers and the US Chamber of Commerce).

All of us will regret it if Judge Kavanaugh's reactionary view becomes the guiding principle of a new Supreme Court majority. With Congress already deadlocked and demonstrating on an almost daily basis its inability to respond to pressing challenges, we cannot thrive if executive branch agencies are paralyzed as well.

Photo: Wikimedia

Breaking Containment at Crystal River 3

UCS Blog - All Things Nuclear (text only) -

Role of Regulation in Nuclear Plant Safety #10

The Crystal River 3 pressurized water reactor in Florida was shut down in September 2009 for refueling. During the refueling outage, the original steam generators were scheduled to be replaced. The Nuclear Regulatory Commission (NRC) was reviewing the owner’s application to extend the reactor operating license for another 20 years. The replacement steam generators would enable the reactor to operate through the end of its current operating license period as well as to the end of a renewed license.

But those plans changed drastically when the process of cutting an opening in the concrete containment wall for the steam generator replacement inflicted extensive damage to the concrete. When the cost of fixing the broken containment rose too high, the owner opted to permanently shut down the facility before its original operating license expired.

Background

Crystal River 3 is located on the western coast of Florida and featured a pressurized water reactor (PWR) designed by Babcock & Wilcox. The NRC issued the reactor operating license on December 3, 1976.

Refueling Outage and Steam Generator Replacements

Operators shut down the reactor on September 26, 2009, to begin the plant’s 16th refueling outage. Workers planned to replace the steam generators during the outage. The original steam generators were wearing out and were to be replaced with steam generators made from materials more resistant to wear and tear. Since the first steam generator replacements more than two decades earlier, so many PWRs had performed this exercise that it was almost routine.

Figure 1 shows a simplified side view of the containment structure at Crystal River 3. The reactor core is the green rectangle within the capsule-shaped reactor vessel. The reactor vessel is flanked by the two larger steam generators. In front of the steam generator on the right is the pressurizer. The vertical portion of containment is a cylinder about 137 feet in diameter.

Fig. 1 (Source: Progress Energy)

The containment at Crystal River 3 was a 3-D post-tensioned concrete cylinder with a steel liner. The 0.475-inch thick steel liner formed the inner surface of the containment wall. Behind it were 42-inch thick concrete walls and a 36-inch thick concrete dome. Embedded in the concrete walls were 5.25-inch round tendons encased within metal sleeves. These tendons functioned like reinforcing bands—workers tightened, or tensioned, them to give the concrete wall additional strength against the internal pressure that could occur during an accident. This containment design was used for more than half of the PWRs operating in the United States.
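To get a rough feel for why the tendons matter, here is a back-of-the-envelope hoop-stress estimate using the standard thin-wall approximation. The diameter and wall thickness come from the description above; the accident design pressure is an assumed round number for illustration, not a figure from Crystal River 3's licensing documents.

# Rough hoop-stress estimate for a cylindrical containment wall
# (thin-wall approximation: stress = pressure * radius / thickness).
# The internal design pressure below is an assumed illustrative value.
p_psi = 55.0          # assumed internal accident pressure, psi
diameter_ft = 137.0   # containment diameter, from the post
wall_in = 42.0        # concrete wall thickness, inches

radius_in = diameter_ft * 12.0 / 2.0
hoop_stress = p_psi * radius_in / wall_in
print(f"hoop stress ~ {hoop_stress:.0f} psi")   # ~1,076 psi

Plain concrete can carry only a few hundred psi in tension, so a wall of this size could not resist accident pressure on its own; the tightened tendons precompress the concrete so that it stays in compression instead.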

The containment featured a large round opening called the equipment hatch. Figure 2 shows the equipment hatch in late November 1972 during plant construction. The concrete has not yet been poured in that section of containment, so the metal reinforcing bars and horizontal tendon sleeves (the vertical rows of white dots on either side of the equipment hatch) embedded in the concrete are visible.


Fig. 2 (Source: Progress Energy)

Because the original steam generators were expected to last throughout the 40-year operating life of the reactor, the equipment hatch was not made large enough for a steam generator to pass through intact. The originals could have been cut into sections and the pieces removed through the equipment hatch. But the hatch was also too small for the replacement steam generators to enter intact, and cutting them into sections was not an option. Plan B involved cutting an opening approximately 25 feet by 27 feet through the containment concrete wall and liner above the equipment hatch, as shown in Figure 3.

Fig. 3 (Source: Progress Energy)

The Butterfly Defect

The operators began reducing the reactor power level at 7:03 pm on September 25, 2009, to enter the refueling outage. They shut down the reactor at 12:29 am September 26. They continued cooling the reactor water down over the next few hours and entered Refueling mode at 4:51 pm that afternoon. Seven minutes later, the contractor hired to cut through the containment wall was authorized to begin that work. An early step involved loosening and removing the horizontal tendons from the containment wall in the region where the opening would be cut.

On September 30, workers began using high-pressure water—at pressures up to 25,000 pounds per square inch—to cut and remove the concrete from an 8-foot-wide by 6-foot-tall test section of the concrete containment wall. Full-scale removal of the concrete began at 4:30 am on October 1. Workers installed a debris chute to carry away the excavated concrete and water.

About 5:00 am on October 2, the concrete cutting and removal work was halted because an obstruction in the debris chute caused water to spill. Workers noticed water streaming from a crack in the containment wall below and to the right of the new opening. Investigation into this unexpected waterfall identified a vertical crack in the concrete between the tendon sleeves and interior liner.

Fig. 4 (Source: Progress Energy)

It was not a tiny crack. It was visible along all four edges of the square opening cut through the containment wall. The defect in the concrete was termed delamination.

Fig. 5 (Source: Progress Energy)

Workers drilled dozens of bore holes into the containment wall, supplemented by impulse response testing (essentially ultrasonic probing of the wall to look for voids within the concrete), to map out the extent of the delamination. Figure 6 shows that the delamination area resembled a butterfly, extending far beyond the crack around the steam generator replacement (SGR) opening. Figure 6 also shows, in blue, the horizontal tendons that were loosened and removed for the opening; the tendons left tensioned are shown in red.

Fig. 6 (Source: Progress Energy)

The NRC Dispatches its Crack Inspection Team

The NRC formed a Special Inspection Team on October 13, 2009, to go to Crystal River 3 and investigate the containment damage. Because the reactor was shut down, the damage did not pose an immediate safety hazard. But the NRC recognized that the damage might have generic implications as other owners cut through containments for steam generator and reactor vessel head replacements. In addition, the NRC needed to understand the extent of the damage to ensure the containment was properly restored before the reactor restarted.

Delamination Déjà vu

The NRC team reported that the Crystal River 3 containment experienced concrete delamination about a year after the tendons had been initially tightened. In April 1976, electricians were drilling into the outer surface of the containment dome to secure anchors for the conduit they were installing. In certain areas, the anchors would not hold. Investigation found a region about 105 feet in diameter where the concrete had delaminated. The delamination affected about 15 inches of the 36-inch thick concrete dome, with the maximum gap between layers being about two inches wide. Cracks were not evident on the inner or outer surfaces of the dome, but workers reported a "springiness" when walking across the dome's delamination region. The degraded concrete was removed and replaced with the standard, non-springy kind.

Containment concrete delamination also occurred during construction at the Turkey Point nuclear plant in Florida in June 1970 and at the Unit 2 reactor at the Kaiga nuclear plant in India in May 1994.

Causes of the Concrete Cracking

The plant’s owner formed a team to determine the cause for the cracking experienced in fall 2009. The team developed a list of 75 potential causes and then evaluated each candidate. 67 suspects were dismissed due to lack of evidence. The remaining eight potential causes were determined to have conspired to cause the delamination—had any single factor been absent, the delamination would likely not have occurred.

The Crystal River 3 containment design featured higher stresses than most other designs. The concrete used in the containment met design specifications, but with considerably less margin than normal. And the sequencing used to loosen the tendons prior to cutting the steam generator replacement opening resulted in high localized stresses that exacerbated the design and material conditions to cause cracking.

NRC Sanctions

The NRC imposed no sanctions following the investigation by its Special Inspection Team. The team determined that the containment was damaged after the reactor entered the Refueling mode. In that mode, containment integrity was not required. The equipment hatch is wide open much of the time during Refueling mode, so having a damaged section of containment wall above that large opening did not violate regulatory requirements.

NRC Nuclear Fleet Outreach

The NRC’s Generic Communications program is its means for conveying operating experience to plant owners. The program uses Information Notices to provide warnings and updates about safety problems and Generic Letters and Bulletins to also require owners to take steps intended to prevent a common problem from rippling across the reactor fleet. While it is not uncommon for the NRC to send out at least an Information Notice to owners about problems like that experienced at Crystal River 3, the NRC did not exercise this option in this case. The NRC did post information to its website about the problem and made a presentation about the Special Inspection Team sent to the plant during the annual Regulatory Information Conference in March 2010.

The NRC’s Office of Nuclear Regulatory Research issued NUREG/CR-7208, “Study on Post Tensioning Methods,” in November 2015. While far from a treatise on what caused the delamination at Crystal River 3, it shed considerable insight on the analysis of stresses impacted on concrete structures when the embedded tendons are tightened.

Delamination to Defueled to Decommissioning

The plant’s owner made several attempts to repair the damaged concrete containment wall, but the efforts proven unsuccessful. During the efforts, workers completed offloading all the fuel assemblies from the reactor vessel into the spent fuel pool on May 29, 2011. After another repair failed, the company decided to permanently shut down the facility rather than undertake the cost—and uncertain outcome—of yet another attempt. On February 5, 2013, the company announced that the reactor had been permanently shut down and would transition into decommissioning.

UCS Perspective

This event reflects just right regulation by the NRC.

The NRC dispatched a Special Inspection Team to investigate the cause and corrective actions for the concrete degradation at Crystal River 3 even though the problem had no adverse safety implications for the reactor in refueling mode.

Had the NRC not done so or delayed doing so, any potential generic implications that adversely affected safety at operating reactors might have been missed. While no such implications were found, it’s far better to have looked for them and not found them than to have not looked for them and had them “surprise” us later.

Had the NRC not done so or delayed doing so, the agency would not have clearly understood the cause of the concrete degradation in order to make informed decisions about the effectiveness of repairs. Restart of the plant would have been delayed as the NRC belatedly sought to acquire that awareness, or restart of the plant would have happened lacking the NRC's independent verification that proper safety levels had been restored. The former would have placed an undue economic burden on the owner; the latter would have placed an undue risk burden on workers and the public.

But the NRC took just the right actions at just the right time to properly oversee safety at the plant. The owner's decision to permanently retire rather than repair the plant was made without the NRC's thumb on either side of the scales.

* * *

UCS’s Role of Regulation in Nuclear Plant Safety series of blog posts is intended to help readers understand when regulation played too little a role, too much of an undue role, and just the right role in nuclear plant safety.

Transportation Pollution is on the Rise in Massachusetts  

UCS Blog - The Equation (text only) -

Photo: Billy Hathorn/Flickr

Pollution from cars and trucks is on the rise in Massachusetts, undermining the Commonwealth's ability to achieve the mandates of the Global Warming Solutions Act, according to preliminary numbers released by the Department of Environmental Protection on Thursday.

DEP’s updated emissions inventory showed a significant jump in emissions from transportation, from 29.7 MMT in 2015 to 31.7 MMT in 2016, an increase of over 6 percent. Transportation pollution is higher today than it has been at any point since 2008. It is the only sector where emissions are higher today than they were in 1990. Even as the state makes significant progress in other areas, the challenge of transportation pollution threatens to undermine our ability to achieve our legally mandated climate limits.

The growth in transportation pollution is occurring even though our cars and trucks are getting cleaner and more efficient every year, thanks to national vehicle emission standards in place since 2009.

Why are transportation emissions increasing?

Transportation emissions are growing because the economy of Massachusetts and the Boston metro area is booming: there are over 400,000 more jobs in Massachusetts today than 10 years ago. That's a good thing for the state, but it is also putting unprecedented pressure on our transportation system. More jobs mean more commuters traveling more miles, consuming more gasoline, and producing more pollution.

In addition to the spike in emissions, Boston commuters are spending more time than ever before stuck in traffic. The average Boston driver spent 60 hours (more than two days!) in traffic in 2017, making Boston the seventh-most congested city in the country.

One thing that is not growing right now in Massachusetts: use of public transportation. MBTA bus and light rail ridership are down 6.5 percent and 3.5 percent, respectively, over the past three years. Insufficient funding, unreliable service, and increasing competition from ridesharing services such as Uber and Lyft are all playing a role in reducing the use of public transit. Housing near public transportation centers is also becoming prohibitively expensive for many Massachusetts residents.

Another important factor: with gas prices relatively low and greater disposable income, consumers are buying bigger cars. Sales of SUVs and light trucks grew to over 65 percent of the U.S. vehicle market in 2017 – though the largest growth has been in smaller, car-like SUVs. While national emission standards are improving the efficiency of all vehicles, including SUVs and pickup trucks, this trend toward larger vehicles is nevertheless undermining some of our expected gains in fuel efficiency.

Unfortunately, the Trump administration is now proposing to freeze federal vehicle standards – and to strip Massachusetts, California and other states of our right to set aggressive emission standards. If this federal attack is successful, it would be a critical blow to Massachusetts’ climate strategy. The vast majority of the projected emission reductions from transportation in the state’s recent Clean Energy and Climate Plan come from these standards.

What can we do?

The good news is that we have the tools to achieve dramatic reductions in transportation emissions regardless of what happens in Washington, DC.

Moreover, Massachusetts now has numerous studies and commissions working on the problem of transportation emissions. In addition to the Future of Transportation Commission, the Comprehensive Energy Plan, and the Clean Energy and Climate Plan, the state also announced on Thursday new deep decarbonization studies looking out to 2050, which will examine how we can achieve dramatic reductions in emissions throughout our economy.

Here are three things the Commonwealth can do to get a handle on pollution from transportation:

Create a market-based limit on transportation emissions. One option would be to work with Northeast states to create a cap-and-invest program covering transportation fuels. The "cap" sets an overall limit on tailpipe pollution. The limit is enforced through a requirement that polluters purchase allowances based on the carbon associated with burning that fuel, and the state issues allowances only up to the cap. As the cap gradually lowers, emissions reductions are guaranteed, while market forces raise the cost of allowances, generating proceeds. The state can then invest those proceeds in clean transportation solutions, like electric cars, trucks, and buses, better public transportation, and walking and biking options. (A toy sketch of this arithmetic appears after this list.)

We’ve seen a similar program work before. In 2004, Massachusetts joined with the other states of the Northeast to create the Regional Greenhouse Gas Initiative (also known as “RGGI”) for the electric sector. Today, RGGI stands as a triumph of smart climate policy. Thanks to RGGI, in addition to other complimentary policies, the Northeast is on track to cut pollution from power plants by 65% by 2030. Funding from RGGI is used to support some of Massachusetts’ most innovative and important climate policies, including the MassSave program and the Green Communities Act. Overall, independent analysis shows that RGGI has created 44,000 jobs in the region while saving consumers over $773 million in reduced energy costs.

Promote responsible growth of ride hailing services. Ride hailing services such as Uber and Lyft are already changing the way people get around in our cities, and with autonomous vehicles on the horizon, these services will continue to shape our mobility choices in the years to come. However, these services can only operate effectively if they work hand in hand with a strong public transportation system. Massachusetts should consider fees, regulations, and incentives for these companies. Proceeds could be used to support public transit, while requirements and incentives could encourage electrification of ride hailing fleets and encourage pooling to provide more rides with less congestion.

Increase incentives for vehicle electrification. Electric vehicles are a critical technology for the future of the Commonwealth, but right now they are too expensive for many low or moderate-income residents. As the state considers future program models, there should be increased funds available for rebates targeted toward low and moderate-income residents so that these vehicles are truly affordable for everyone.  In addition, the state should consider additional rebates to encourage people to trade in old and dirty pickup trucks and SUVs for cleaner and more efficient models.
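To make the cap-and-invest mechanics from the first recommendation concrete, here is a toy sketch of the arithmetic. Every number in it – starting cap, decline rate, allowance price – is a made-up placeholder, not a policy proposal or a modeled estimate.

# Toy cap-and-invest sketch: a declining cap guarantees emissions
# reductions, while allowance sales generate proceeds to reinvest.
# All numbers are made-up placeholders, not policy figures.
cap_mmt = 30.0        # starting cap on transportation emissions, MMT CO2
decline = 0.03        # assumed 3 percent cut in the cap each year
price_per_ton = 5.0   # assumed allowance price, dollars per metric ton

for year in range(2021, 2031):
    proceeds = cap_mmt * 1e6 * price_per_ton   # allowances sold up to the cap
    print(f"{year}: cap {cap_mmt:.1f} MMT, "
          f"proceeds ${proceeds / 1e6:.0f}M for clean transportation")
    cap_mmt *= 1 - decline   # the cap ratchets down, guaranteeing reductions

However the allowance price is set in practice (by auction, as in RGGI), the structure is the same: the cap, not the price, is what guarantees the emissions outcome.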

One thing that we cannot do is continue to ignore the challenges facing transportation and climate in our Commonwealth. Massachusetts climate law requires reductions from all sources of pollution in the state, and we will not meet the requirements of that law without addressing transportation. Beyond emissions, we need to address the interconnected challenges of increasing congestion, the increasing cost of housing, and the declining state of our public transportation services or these problems will grow more difficult and more frustrating for Massachusetts residents.

Photo: Billy Hathorn/Flickr

Transitioning the Workforce in the Era of Autonomous Vehicles: Meet Dr. Algernon Austin

UCS Blog - The Equation (text only) -

Photo: Dllu/Wikimedia Commons

Autonomous vehicles (AVs) are sure to bring about a significant shift in the job market. While it is important to think about how many jobs will be lost or created because of this change, there must be a focus on the workers themselves and what they will need to support a just transition. As explained in our policy brief Maximizing the Benefits of Self-Driving Vehicles:

Self-driving technology will create jobs for some, but it will change or reduce employment opportunities for others, especially in the trucking, delivery, taxi, and ridesharing industries. Before self-driving vehicles comprise a significant share of the markets for passenger cars and heavy-duty trucks, policy must recognize the economic impact of this technology, and must support career pathways and transitions for the Americans who will be affected by automated driving technology. In addition, jobs created in the self-driving vehicle industry should be accessible to all, with a focus on increasing career opportunities for populations historically underrepresented in transportation and technology industries.

I spoke with Dr. Algernon Austin*, an economist with the think tank Dēmos and co-author of “Stick Shift: Autonomous Vehicles, Driving Jobs, and the Future of Work,” to get an expert’s opinions on the future of the driving workforce. I asked him about potential impacts of AVs on the labor market and he discussed ways to provide job training opportunities for transportation workers that will be affected by the AV revolution.


Richard Ezike: When discussing the possible negative impacts of autonomous vehicles, job loss will always come up. What do we know about the Americans that hold driving jobs? Who, exactly, is at risk here?

Dr. Algernon Austin: Driving occupations are disproportionately held by men, and when broken down by race and ethnicity, those overrepresented in the driving industry include minority groups such as African-Americans, Latinos, and Native Americans. People within those groups face a potential loss of jobs that pay very good wages.

RE: Are all drivers’ jobs threatened equally? Will self-driving vehicles be able to replace human critical tasks like unloading packages from a delivery truck?

AA: There are varying levels of threat to the jobs that will be affected by AVs. Currently, the jobs that appear to be most at risk are taxi drivers, followed by jobs in the freight industry. It is feasible to see autonomous trucks replacing human drivers for long freight hauls. Those in the package delivery space are also at risk, but because those jobs require manually delivering a package to an individual, the threat level is not as severe.

RE: Unions have raised concerns about self-driving vehicles affecting employment. What are ways labor groups can work to protect employees as automated technologies become more widespread?

AA: There was a recent article published in the Washington Post about how the gig economy is not competitive at all. The researchers stated that an online market is not ideal – there are a few power players that control the market and prices, and potential employees cannot easily compare wages across different job offers. The researchers suggested supporting worker cooperatives or labor unions that would set up online labor markets to prevent the system from being stacked against workers.


What it really comes down to, as suggested by Jean Tirole, the Nobel Prize-winning economist and author of "Economics for the Common Good," is the protection of workers instead of jobs. Both unions and governments should adopt this stance, develop a strong safety net, and encourage retraining and skill development. With such strong programs available, people would be less resistant to technological change, a resistance we often see now when people are asked about the impact of technology on their lives.

RE: People in other professions have seen unemployment due to automation in the past, notably in manufacturing and mining, and will continue to see these effects across sectors in the future. Is the loss of driving jobs unique? Should we be trying to solve this problem systemically, or looking at solutions focused specifically in this sector?

AA: This is not a unique situation. The history of technological displacement has been one of constant evolution and the problem needs to be tackled in a systemic manner. We need to protect workers with a safety net and to eliminate poverty and homelessness. We are a rich country and we should not have so many people suffering from financial hardship. Simply put, if a commitment is made to protect workers and not jobs, we can easily make this technological transition.

RE: Education and retraining programs have been tried on different scales to varying levels of success in the past. What do you think makes an effective job retraining program, and what kinds of resources are needed to really help workers transition into a new career?

AA: The most effective programs are apprenticeships, which combine on-the-job training with classroom instruction. Such programs allow people to gain new skills while working and earning an income. You must remember, most people needing retraining are adults who have families. It is not reasonable to believe they can stop working and take on student debt to pursue a four-year degree.

Sector-specific training is also a viable strategy and of interest to many companies. What companies can do is set up programs at a community college that train people for the skills most in demand by the company. Sectoral training focuses on quickly growing industries, and usually leads to jobs that pay well.

We also need to ensure that there is a variety of affordable training programs – both short- and long-term – for workers of all ages. Increasingly we are seeing older adults going to two-year colleges and working part time, and the American education system needs to adjust to accommodate those adults' lives.

RE: Assuming the federal government doesn’t make any significant changes to mitigate these job losses, what can state and local governments do to proactively deal with this issue? What systems can they put into place to help those that will be out of a job when AVs hit the market in full force?

AA: State and local governments can support sectoral and apprenticeship training programs. They can reform their educational policies with adults who are transitioning in mind and support strong safety nets.

They should also use their voices to speak to the federal government. Job loss is a looming issue and the transition in the AV environment needs to be managed. Unfortunately, there has been very little attention given to the negative impacts this new technology could have on the labor market. We are getting warning signs, however, and state and local leaders should ask federal officials to help protect workers.

* Dr. Algernon Austin conducts economic policy research for the Dēmos think tank. He has been a Senior Research Fellow at the Center for Global Policy Solutions, and he was the first Director of the Economic Policy Institute’s Program on Race, Ethnicity, and the Economy. Prior to his shift to the “think tank world”, he served on the faculty of Wesleyan University. He has discussed racial inequality on PBS, CNN, NPR, and on other national television and radio networks.

Photo: Dllu/Wikimedia Commons

Art & Climate Change

UCS Blog - The Equation (text only) -

Wouldn’t it be bizarre to explore how an artist would go about preparing their artwork for climate change? In a natural disaster scenario, no one says “Grab the passports and the dog… and the Modigliani!” So, the works presumably will have to fend for themselves. This thought launched my most recent series of work. Now, a year and a half later, I will be floating down the Potomac River on top of one of my “climate change ready” paintings as both a demonstration of its capabilities and a symbolic gesture to the government establishments I feel are ignoring the problem.

I’ve always believed that art and science strengthen each other when combined. In an era of half-truths, full-lies, and intentionally blurred lines, the relationship between art and science is more important than ever. Science provides us with the evidence, but I believe art can plant the message deep enough emotionally to sprout into useful action.

As a Miami native, I've decided to focus for now on rising sea levels, since it is an issue that I have experienced first-hand. The streets of Miami and Miami Beach regularly flood during high tides, a change from years past. As an April edition of Business Insider explained, "Because of ocean currents and Miami's location, sea levels are rising in and around the city and Miami Beach faster than in most of the world." However, Miami isn't the only city at risk. According to the National Ocean Service of NOAA, "In the United States, almost 40 percent of the population lives in relatively high-population-density coastal areas, where sea level plays a role in flooding, shoreline erosion, and hazards from storms. Globally, eight of the world's 10 largest cities are near a coast."

The idea of climate change isn't anything new. Before Al Gore and Leonardo DiCaprio, there were others who warned us of our trajectory. It was in 1824 that French physicist Joseph Fourier first observed and wrote of the Earth's "greenhouse effect." He was followed by Swedish chemist Svante Arrhenius, who in 1896 concluded that the huge influx of coal burning from the industrial revolution would greatly augment the greenhouse effect. By the 1960s, it was becoming increasingly clear that human-induced carbon dioxide was warming the atmosphere at a rate faster than ever experienced in recorded history. As stated in the Union of Concerned Scientists' article Global Warming Science, "Over the past 130 years, the global average temperature has increased 1.5 degrees Fahrenheit, with more than half of that increase occurring over only the past 35 years… [and] …scientists have detailed records of past CO2 levels from ice core studies, which show that CO2 levels are higher today than at any point in at least the last 800,000 years."

What’s most alarming to me is that despite so many years of mounting research and warnings, we’ve done so little to solve the problem. There is a general sense of apathy when speaking to the general public about climate change. “Yes, it’s a problem, but what can I personally do about it?” they ask with a shrug. Photographs of starving polar bears and melting icecaps just aren’t jarring enough to a public exposed to far worse, albeit fictional, scenes on television and in movies. I hope to sneak through this wall of apathy that has currently built up around the issue, and I’ve found that my Trojan horse is a sense of humor. Humor catches people off guard, and I can often find myself in conversations with people who otherwise wouldn’t have been open to the topic.

When I began researching various ways to make my paintings "climate change ready," one place I looked was the tomes of art history, and I found that creating paintings has always involved planning for their long-term stability. Artists of the past learned how to layer their oils in such a way as to minimize future cracking. They created new pigments that were less volatile and held their original color far longer. Knives were hung beneath portraits so that should a home catch fire, the owner could quickly cut the painting from its heavy frame and toss it out the window to safety. With that history in mind, I decided to follow the long evolutionary lineage of pre-emptive art preservation, but today's issues call for new solutions. I consulted with an expert in the field of art conservation, and through a process of trial and error, I began experimenting. I learned to weave my own canvas and embed buoys and other found flotation devices into the fabric of my paintings. I have tried approaches as simple and playful as wrapping existing paintings in common pool noodles, and as serious and complex as waterproof paintings made of synthetic sailcloth and marine foam.

People generally laugh and engage when I point out that my artwork is designed to survive rising sea levels. For my own amusement, I imagine that someday in the future, an art collector will be safely sitting on top of their floating artwork exclaiming "Thank goodness we bought a Noel Kassewitz!"


Kassewitz plans to float down the Potomac during the last week of August, weather permitting. For more information on her art and when and where Kassewitz will be floating next, you can visit her website: www.noelkassewitz.com or follow her on social media via Instagram @noelkassewitzart.

Miami-born Noel Kassewitz is a Washington, D.C.-based artist whose paintings and sculptures explore metaphysical, gender, and environmental concerns. Her works have been exhibited nationally in New York, Chicago, Philadelphia, and Miami, as well as internationally in Milan and Bologna, Italy.

Facing fiscal, climate, and humanitarian crises, the Puerto Rican scientific community mobilizes to inform post-María public policy

UCS Blog - The Equation (text only) -

Puerto Rico is living through the most critical moment in its history, battered by the grave fiscal and climate crisis that devastated the island a year ago with the passage of Hurricane María. As the unincorporated US territory faces reconstruction, Puerto Ricans on and off the island are demanding a place at the table where the decisions will be made that define the kind of country they will live in and to which we long to return. Are boricuas willing to allow a repeat of the past mistakes that set us on the path to fiscal ruin and climate vulnerability? Or will Puerto Ricans instead demand energy, housing, education, and economic infrastructure designed to meet both present needs and those already forecast as a result of climate change?

Since the invasion of 1898, the development of energy, industrial, urban, and economic infrastructure on the island has answered more to the geopolitical needs of the United States than to Puerto Rico's own, with no account taken of the island's geography, its location in the Caribbean, or the needs and aspirations of the Puerto Rican people. Puerto Rico was violently forced into the constitutional and development framework of the United States, and in 120 years of colonization boricuas have not had a single opportunity to make their voice heard in the federal decision-making and policymaking that affect their lives and that determine issues as fundamental as the cost of living (for example, the rise in the cost of consumer goods due to the Jones Act) and economic development (for example, the tax on US manufacturers on the island). At the local level, the political interference in the Puerto Rico Science, Technology and Research Trust through a late-night House vote behind the public's back, and the attempt to eliminate the Puerto Rico Institute of Statistics, lay bare how poorly democratic rights are exercised in Puerto Rico when it comes to science policy.

All of this, combined with a homegrown partisan machine more interested in advancing political ideologies than in improving Puerto Ricans' living conditions, has finally caught up with us: an electrical grid in ruins, liable to collapse under the slightest stress; a massive exodus of people to the United States; mass closures of public schools. Puerto Rican civil society, meanwhile, demands decent living conditions and an equitable recovery; Dr. Rosselló's government responds with pepper spray, having already criminalized the right to protest before María.

The Puerto Rican scientific community presents the conference "Ciencia en Acción: Política Pública Puertorriqueña Apoyada por Evidencia" (Science in Action: Evidence-Backed Puerto Rican Public Policy)

Against this backdrop, the scientific community in Puerto Rico and in the diaspora is mobilizing to serve as a bridge between the government and civil society. The Union of Concerned Scientists, in keeping with its commitment to environmental and climate justice and to supporting public policies based on scientific evidence for the benefit of all, has joined the leadership of the Caribbean Division of the American Association for the Advancement of Science (AAAS-CD) and Ciencia Puerto Rico (CienciaPR) to present the conference Ciencia en Acción: Política Pública Puertorriqueña Apoyada por Evidencia, to be held on Saturday, September 1, 2018, at the Centro Criollo de Ciencia y Tecnología (C3Tec) in Caguas, Puerto Rico. The event will also launch the Puerto Rico Science Policy Action Network (PR-SPAN), a network of scientists committed to acting as experts on science and technology matters and to serving as liaisons in their respective fields to ensure the scientific community's participation in the development of public policies relevant to Puerto Rico.

UCS is proud to support the leadership of Dr. Zulmarie Pérez Horta (AAAS), Dr. Giovanna Guerrero Medina (CienciaPR), and Dr. Juan Ramírez Lugo (AAAS-CD) in convening the conference and launching PR-SPAN. We hope the event will bring together the scientific community of the Caribbean and Puerto Rico to pursue the science-based public policy that the region and the island so badly need for a sustainable and equitable recovery.

The other day a colleague remarked that Hurricane María did not destroy Puerto Rico; the fiscal crisis we have been dragging along for decades had already destroyed the island. It made me picture María as a kind of climatological broom that swept up the rubble of what the fiscal crisis had wrecked. Now that we have seen so much of Puerto Rico's infrastructure reduced to rubble, we scientists have an obligation to mobilize our collective knowledge and power to create a future resilient to fiscal, energy, social, and climate crises. The country demands it, and scientists have answered the call.

The event is open to the public – register here: Ciencia en Acción: Política Pública Puertorriqueña Apoyada por Evidencia


The Importance of Briefing NASA Deputy Administrator Nominee on the Latest Climate Science

UCS Blog - The Equation (text only) -

Photo: NASA

At his Senate nomination hearing yesterday, when asked whether he agrees with the scientific consensus that climate is changing and humans are the dominant cause, NASA Deputy Administrator nominee James Morhard stated that he believes “the climate is changing and man has a significant impact on it.” When pressed further about whether he accepts the scientific consensus that humans are the dominant cause, he replied that he cannot speak authoritatively to make that statement. Given James Morhard’s discomfort with speaking to this topic, it is critical that moving forward, he be briefed expeditiously by experts from NASA’s Earth Science Division to fill this knowledge gap.

We face the unprecedented prospect of having both a NASA Administrator and a Deputy Administrator with neither a formal STEM education nor professional space credentials. As a result, it is ever more important that NASA's leadership be closely advised by NASA's world-class experts, to ensure that the agency's leaders have the knowledge necessary to advocate for the research NASA conducts to make our life on Earth safer and more prosperous. This will also ensure that the agency maintains the scientific leadership that Americans and many around the world depend on.

NASA’s Earth Science Division carries out timely, critical research on how our Earth functions and how it responds to both human-caused and natural drivers. For example, NASA’s Earth Science Division carries out research on how hurricanes develop, which when coupled with research and operations at NOAA, improves our forecasts. NASA Earth scientists conduct research to improve understanding of where there is wildfire risk, and NASA satellites help track wildfires in near real-time. NASA Earth scientists also study how and why the composition of our atmosphere is changing, and how those changes will affect both the quality of the air that we breath each day, as well as the climate that our children will grow up in. These are just a few topics that NASA scientists study and NASA satellites observe. It is imperative that NASA Deputy Administrator be up-to-speed on the science coming from the Earth Science Division’s full suite of portfolios. 

The progression of NASA Administrator Jim Bridenstine’s understanding of the scientific consensus around climate change has been positive. Hopefully, James Morhard, if confirmed, will follow suit.

Photo: NASA

Trump Twists the Law to Bail Out Coal

UCS Blog - The Equation (text only) -

Photo: Tammy Anthony Baker/Wikimedia Commons

As you may have heard, President Trump has a new toy – national security – that he’s using to sidestep congressional oversight and funnel taxpayer dollars to his fossil fuel buddies.

First, he weaponized "national security" to impose tariffs designed to stifle the economic competitiveness of solar power (it didn't work). Now, he's using it as a misguided rationale for ordering the Department of Energy (DOE) to bail out uneconomic coal plants on our dime – to the tune of billions of dollars, according to estimates. His hiding behind national security is like me hiding behind a lunchbox – it doesn't work.

Unfortunately, if the Trump Administration gets away with it, there are profound consequences for our wallets, our environment, and yes – our national security.


President Trump’s efforts to bailout uneconomic coal plants could have profound consequences for our wallets, our environment, and our national security.

A leaked memo first reported by Bloomberg News details a plan by the DOE – officially ordered by President Trump on June 1st – to artificially prop up uneconomic coal and nuclear plants. “But wait,” you may be thinking, “hasn’t this administration already tried this and failed?” Yes, they have, and it did.

What’s different this time is the President’s inappropriate use of the federal government’s authority under two laws, the Federal Power Act and the Defense Production Act, to keep uneconomic plants operating based on the uninformed (and purposely misleading) contention that bailing out these plants improves the resiliency of America’s electricity supply and protects against theoretical cyber attacks on our energy infrastructure.

Where to begin on the many ways this proposal falls flat?

Before we dig into how these two laws are actually meant to be used, consider for a moment that grid operators, state regulators, and much of the electric power industry itself have said there is no reliability emergency. That is almost sector-wide agreement that there is no specific crisis that would warrant emergency orders under the guise of national security.

Outgoing FERC Commissioner Robert Powelson described Trump's proposal as "the greatest federal moral hazard we've seen in years and something that would be the wrong direction for us to venture down." In sum, when considered in the context of what the electric industry is saying, President Trump's and DOE's claim of a national security crisis falls flat.

Our president – charged with enforcing the law – is abusing the law (again)

What it all boils down to is the distinction between:

  • Appropriately using the law to solve a specific, identified problem with a specific, targeted solution, as opposed to
  • Stoking fear over broad, generalized claims of risk, then offering half-baked solutions that are poorly-disguised handouts to your political cronies.

Both the Federal Power Act and the Defense Production Act give authority to the DOE and the President, respectively, to order power plants to continue operating when the nation's electricity supply is truly threatened or to address a well-defined national security threat. But neither has ever been used in as broad and sweeping a way, or on such flimsy rationale, as President Trump envisions.

The two laws at the center of it all

The Federal Power Act has been used to enable DOE to order specific power plants to continue operating when their shutdown threatens reliability. But it has historically been used as a scalpel, not a sledgehammer – applying to a very select group of power plants for a short period of time in response to a well-defined, specific reliability issue. By "well-defined," I mean based on analysis, input from experts, and initiated by those responsible for keeping the lights on. DOE pointed to the Federal Power Act in last year's similar, failed proposal that was unanimously struck down by FERC.

Forcing consumers to bail out coal plants in the name of grid resilience is akin to forcing us to buy rotary phones to protect against cyberattacks on our communications system.

What’s new this time is the Defense Production Act that has been thrown into the mix. This law was passed in 1950 at the beginning of the Korean War. For over fifty years, this law has stood to ensure the nation’s industries are responsive to the needs of the U.S. during times of legitimate crisis, such as when a hostile force invades the U.S. or one of our allies, or when materials are needed to respond to a national disaster. These are actual events that require immediate and effective responses, not hypothetical and wildly broad threats dreamed up to push a political agenda. By twisting the law into this broad authority to interfere with the nation’s free markets, President Trump is abusing his authority and throwing out decades of precedent.

Take, for example, America’s communications network. Obviously, maintaining channels of communication is of a national security interest and faces its own cybersecurity threats. But does that mean we should all be required to install rotary phones in our homes because the CEO of a rotary phone company sat next to President Trump at a fundraiser, wrote him a check, and over their steak dinner lamented how the rotary phone business isn’t what it used to be? I hope not.

That’s exactly the type of scheme President Trump is trying to force on American consumers.

How about real solutions that build resilience?

To be clear, cybersecurity threats are a real issue and we need to be vigilant about addressing them. And there are agencies and organizations working together to assess and respond to real, identified threats to our electricity system. Recommendations are on the table to improve coordination and communication, strengthen investments, and harden our energy infrastructure against cyberattack. None of the recommendations I've seen mention paying billions of dollars to outdated, inefficient, and dirty coal plants. In fact, a number of cyber experts have rejected the notion that this bailout plan would keep us safe from cyberattacks.

The question in my mind is: if we're spending billions to bail out coal and nuclear plants on the premise of national security, what real solutions aren't we spending that money on? By diverting attention from real solutions, President Trump's proposal actually makes us less safe. Further, keeping uneconomic coal plants operating on the consumer's dime only exacerbates the real national security threat – both here and abroad – that is climate change.

Photo: Tammy Anthony Baker/Wikimedia Commons Photo: swanksalot/Flickr  Photo: Nition1 [CC BY-SA 3.0]/Wikimedia Commons

Naughty and Nice Nuclear Nappers

UCS Blog - All Things Nuclear (text only) -

Role of Regulation in Nuclear Plant Safety #9

The Peach Bottom Atomic Power Station in Delta, Pennsylvania is known for its tireless workers. They stop working long before getting tired and nap while on duty. The Nuclear Regulatory Commission (NRC) treated the nuclear nappers as naughty in 1987 but as nice in 2007. The reason for such disparate handling of the same problem isn’t apparent. Maybe if I took a nap it would come to me in a dream.

Peach Bottom is home to three reactors. Unit 1 was a high temperature gas-cooled reactor that got its operating license in January 1966 and was permanently shut down in October 1974. Units 2 and 3 are boiling water reactors that began operating in 1974.

Naughty Nuclear Nappers in 1987

On March 31, 1987, the NRC ordered both operating reactors at Peach Bottom to be shut down. The NRC had received allegations that control room operators were routinely sleeping in the control room. Victor Stello, the NRC’s Executive Director for Operations, wrote in the order:

… it is apparent that the licensee, through its enforcement history and from what has been developed by the ongoing investigation, knew or should have known of the unwillingness or inability of its operations staff to comply with Commission requirements, and has been unable to implement effective corrective action. Consequently, the NRC lacks reasonable assurance that the facility will be operated in a manner to assure that the health and safety of the public will be protected. Pending the development of other relevant information, I am unable to determine that there is reasonable assurance that the facility will be operated in a manner to assure that the health and safety of the public will be protected. Accordingly, I have determined that continued operation of the facility is an immediate threat to the public health and safety.

Fig. 1 (Source: CBS Evening News, March 31, 1987)

Nucleonics Week reported on August 18, 1988, that the NRC proposed a then-record $1,250,000 fine on the company and fines ranging from $500 to $1,000 for 33 of the plant’s 36 licensed operators for the nuclear naps. The remaining three operators were cited for violating federal regulations, but not fined.

The NRC issued amendments to the operating licenses for Peach Bottom Units 2 and 3 on March 22, 1989, to add limits on how many hours the operators could work. The added requirements limited operators to 16 hours worked in any 24-hour period, 24 hours in any 48-hour period, and 60 hours in any week. The amendment wasn't clear whether hours spent sleeping on duty counted against the limits or not.
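Because the limits read like a simple rule set, a sliding-window check makes them concrete. This is a minimal sketch only; the hour-by-hour log format is invented here for illustration and is not how the plant or the NRC actually tracked hours.

# Minimal checker for the 1989 Peach Bottom work-hour limits:
# at most 16 hours in any 24, 24 in any 48, and 60 in any 168 (one week).
# The 0/1 hour-by-hour log format is invented for illustration.

def violations(worked):
    """worked: list of 0/1 flags, one per clock hour (1 = on duty)."""
    limits = [(24, 16), (48, 24), (168, 60)]
    found = []
    for window, max_hours in limits:
        for start in range(max(1, len(worked) - window + 1)):
            total = sum(worked[start:start + window])
            if total > max_hours:
                found.append((start, window, total))
    return found

# Example: 18 straight on-duty hours breaks the 16-in-24 limit.
log = [1] * 18 + [0] * 30
print(violations(log))   # [(0, 24, 18), (1, 24, 17)]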

Unit 2 remained shut down until May 22, 1989, while Unit 3 remained shut down until December 11, 1989. The outages lasted longer than two years not to let the operators get plenty of rest but to remedy the many problems caused by the same inadequate management oversight that condoned operators sleeping in the control rooms.

Nice Nuclear Nappers in 2007

On March 27, 2007, the NRC received allegations that individuals working for the contract firm providing security at Peach Bottom were routinely sleeping in the “ready room” and that management of the security contractor and the plant owner knew about it. (The “ready room” is where armed responders wait. When security force personnel in another room monitoring video cameras and sensors detect unauthorized intruder(s), the armed responders are deployed to deter the intrusion.)

On April 30, 2007, the NRC wrote the plant owner a letter asking whether security officers were inattentive on duty. On May 30, 2007, the owner wrote back to the NRC saying that security officers were properly attentive, and that additional radio checks and periodic post checks were being instituted to boost and sustain that attentiveness level.

In mid-June 2007, a security officer informed security management about his videotapes showing fellow security officers still sleeping on duty. In late June 2007, the security officer was instructed by security management to stop videotaping sleeping security officers. On August 22, 2007, NRC inspectors confirmed that security officers were attentive while on duty.

On September 10, 2007, WCBS-TV (New York City) broadcast videos of security officers sleeping at Peach Bottom on June 9, June 20, and August 10, 2007. On September 17, 2007, the security officer who reported sleeping security officers to security management, plant management, and the NRC was suspended due to “trustworthiness concerns.”

Fig. 2 (Source: CNN Situation Room, September 2007)

The ensuing NRC investigation commended the company’s handling of the situation and reported:

Overall, Security Plan implementation provided assurance that the health and safety of the public was adequately protected at all times. Notwithstanding, the security officer inattentiveness adversely impacted elements of the defense-in-depth security strategy. In addition, actions by security guard force supervision were not effective in ensuring that unacceptable security officer behavior was promptly identified and properly addressed.

The NRC asked other owners on December 12, 2007, about their ways and means for keeping security officers bright-eyed or bushy-tailed (not both; requiring both attributes would not have passed the backfit rule) while protecting nuclear power plants. The NRC's request clearly resulted from the nuclear nappers at Peach Bottom, but for unknown reasons it did not mention the incidents, the company's name, or the plant's name.

The NRC did not order either Peach Bottom reactor to reduce power, let alone shut down.

The NRC did not fine the company, Exelon, or the napping security officers.

Instead, the NRC issued a White finding to the company on February 12, 2008, for the inattentive security officers. If you ever had a bad report card signed by your parents or paid a nickel for an overdue library book, you suffered a harsher sanction than the NRC imposed on the nice nuclear nappers.

UCS Perspective

There were two sequences involving nuclear nappers at Peach Bottom. The series leading up to the March 1987 shutdown order did not involve an operator nodding off, but rather a deliberate practice of sleeping on duty with management’s awareness and tolerance.

The series leading up to the February 2008 White finding also did not involve one security officer nodding off at his or her post, but rather a sustained practice of sleeping on duty with management’s awareness and tolerance.

Clearly, the NRC considered the nuclear nappers to be naughty in one case and nice in the other.

Such disparate regulatory responses to the same underlying situation mean that one series represented over-regulation and the other under-regulation. My vote on which goes where should be obvious. I’ll leave it up to the reader to place the 1987 series into either the under-regulation or over-regulation bin, with the 2007 series going into the other bin.

Two wrongs still don’t make a right, so these two cases cannot be melded into one just-right regulation story. That just wouldn’t be right.

* * *

UCS’s Role of Regulation in Nuclear Plant Safety series of blog posts is intended to help readers understand when regulation played too little a role, too much of a role, and just the right role in nuclear plant safety.

How to Think about Space-Based Missile Defense

UCS Blog - All Things Nuclear (text only) -

The idea of a space-based missile defense system has been around for more than 30 years. There are at least two reasons for its continuing appeal.

The first is that it is seen as a global system that could defend against missile launches from anywhere in the world.

The second is the attraction of intercepting long-range ballistic missiles during their “boost phase”—the few minutes when their engines are burning. Hitting a missile while it is still boosting sidesteps the problem posed by decoys and other countermeasures, which a missile can release during the midcourse phase after its engines shut off. Defenses intended to intercept during the midcourse phase, like the US Ground-based Midcourse Defense and Aegis systems, are highly susceptible to such countermeasures.

But for an interceptor to be able to reach a missile during the short boost phase, it must be stationed close to where the missile is launched—which is the motivation for putting interceptors in orbit so they can pass over the launch site.

However, the reality of space-based defenses is not so appealing.

Technical studies (for example, by the American Physical Society (APS) (2004) and National Academies of Science and Engineering (2012)) show that even a system with many hundreds of space-based interceptors would not provide an effective defense—in part because the interceptor constellation would be vulnerable to anti-satellite weapons and to being overwhelmed by a salvo of missile launches.

Yet it would be extremely expensive. The National Academy study concluded that a space-based boost-phase missile defense would cost 10 times more than any terrestrial alternative. It said that even an “austere and limited-capability” system would cost at least $300 billion.

These problems are intrinsic to the system because of the physics of operating in space. A few diagrams can make clear why—see below.

Basics, and Implications

The technology does not exist for space-based lasers powerful enough for missile defense, so the defense systems being discussed would use kinetic interceptors that would accelerate out of orbit and physically collide with a missile. Since a missile’s boost phase lasts only a few minutes, in order to reach the missile the interceptors need to be in low-altitude orbits (typically 300 to 500 km (200 to 300 miles)) that pass over the launch site.

Fig. 1. An orbit lies in a plane that passes through the center of the Earth. The angle between that plane and the plane that contains the equator is called the “inclination” of the orbit. The “ground track” of an orbit is the line of points on the Earth directly below the satellite. (Source)

The fact that the interceptors are in low-altitude orbits has three important implications:

  1. The system needs a very large number of interceptors in orbit: An interceptor can’t sit over one location on Earth (the orbit that allows satellites to appear stationary over a point on the ground is 100 times higher—in the geostationary band—which is much too far away). Instead, to remain in orbit the interceptor constantly moves at very high speed (25 times the speed of a jet); at this speed it circles the Earth in about 90 minutes. As a result, it spends very little time over any particular spot on the Earth.

That means the system needs many interceptors in orbit so that one moves into position as the one in front of it moves out of position. As I show below, 300 to 400 interceptors are needed in orbit just to cover North Korea, and 1,000 or more for global defense coverage.

  2. An adversary will know where the interceptors are at all times: At these low altitudes, the interceptors can be easily tracked by an adversary, who can then calculate where they will be in the future since objects in orbit move in a predictable way. An adversary will therefore also know where there are any holes in the defense coverage. A defense with predictable holes in it is not an effective defense.

Fig. 2. Even a 1,200 km (750 mile) range missile could lift an anti-satellite weapon high enough to attack a space-based interceptor in a 300 to 500 km altitude orbit.

  3. The interceptors will be vulnerable to attack from low-cost ground-based weapons: To launch objects into orbit you need to lift them to high altitude AND accelerate them to very high orbital speed. That requires a large space-launch rocket and is very expensive, which contributes to the high cost of creating a large constellation of interceptors in space.

However, firing an anti-satellite (ASAT) weapon at an interceptor as it passes overhead requires only lifting the ASAT to the altitude of the interceptor, and that can be done with a relatively cheap short-range or medium-range missile. Interceptors orbiting at 300 to 500 km would easily be within range of the Chinese DF-21D missile. Figure 2 shows that even a missile like a North Korean Nodong or Iranian Shahab 3, fired vertically, could reach high enough altitudes to attack these interceptors if these countries developed or acquired an ASAT payload to mount on them. (A rough check of these orbital speeds and altitudes follows below.)
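These claims are easy to sanity-check with textbook formulas. Below is a short Python sketch of my own (a back-of-the-envelope check, not part of the original analysis); it computes the speed and period of a circular orbit at interceptor altitudes, and uses the standard flat-earth rule of thumb that a missile of maximum range R, fired straight up, coasts to an apogee of about R/2 (drag and burnout altitude neglected).

    import math

    GM = 3.986e14   # Earth's gravitational parameter, m^3/s^2
    R_E = 6.371e6   # Earth's mean radius, m

    def circular_orbit(alt_km):
        """Speed (km/s) and period (minutes) of a circular orbit at alt_km."""
        r = R_E + alt_km * 1e3
        v = math.sqrt(GM / r)          # circular-orbit speed
        period = 2 * math.pi * r / v   # circumference divided by speed
        return v / 1e3, period / 60

    def vertical_apogee_km(max_range_km):
        """Apogee of a missile fired straight up, given its maximum range.
        Flat-earth ballistics: burnout speed v = sqrt(g*R), so h = v^2/(2g) = R/2."""
        return max_range_km / 2

    for alt in (300, 500):
        v, T = circular_orbit(alt)
        print(f"{alt} km orbit: {v:.1f} km/s, period {T:.0f} min")

    print(f"1,200 km-range missile fired vertically: ~{vertical_apogee_km(1200):.0f} km apogee")

The sketch reproduces the numbers used here: roughly 7.6 to 7.7 km/s (about 25 to 30 times the speed of a jetliner), a period near 90 minutes, and a 600 km apogee for a 1,200 km-range missile, comfortably above a 300 to 500 km interceptor orbit.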

Estimating the Number of Space-based Interceptors to Cover North Korea

This section shows why the physics of space-based boost-phase interceptors requires such a large constellation.

For a system optimized to defend against launches from North Korea, a space-based interceptor would be in an orbit like the white one in Figure 3, which is inclined at 45° to the equator and can carry the interceptor over North Korea.

Fig. 3. The white circle is the ground track of an interceptor orbit that is inclined at 45° to the equator (red circle).

Figure 4 shows missile trajectories (yellow lines) from North Korea to the east and west coasts of the United States. The yellow circle shows the region in which a space-based interceptor traveling on the white orbit could intercept a missile below it. This circle is 1,600 km (1,000 miles) in diameter, which assumes a very capable interceptor in a low-altitude orbit against liquid-fueled missiles like North Korea has. Against solid-fueled missiles, which typically have shorter burn times, the circle would be smaller.

Fig. 4. The white curve is the ground track of the interceptor’s orbit. The yellow circle is the region in which the interceptor could reach a missile launched below it. The circle is 1,600 km in diameter, which assumes δV = 4 km/s for the interceptor, in line with the assumptions in the APS and National Academies studies.

The interceptor moves rapidly in orbit, circling the Earth in about 90 minutes. That means the yellow circle will only be over North Korea for 3.5 minutes. To keep an interceptor over North Korea at all times there must be other interceptors in the orbit (black dashed circles) that move into place when the ones in front of them move out of place (Fig. 5).

Fig. 5. As the interceptor moves in orbit, the yellow circle will not stay over North Korea and additional interceptors—indicated here by the black dashed circles—must be in position to take its place.

To have constant coverage over North Korea, there must be interceptors all around the orbit. In the case shown here, it takes 25 interceptors to fill up this orbit so that one of them is always over some part of North Korea. Since you would want overlap between the circles, you would need more than that—probably 40 to 50 interceptors in the orbit.
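Both the 3.5-minute dwell time and the 25-interceptor figure follow from the geometry alone. Here is a minimal sketch of my own, assuming the 90-minute period from above and the 1,600 km coverage circle from the APS and National Academies studies:

    import math

    R_E = 6371.0          # Earth's radius, km
    T_ORBIT_S = 90 * 60   # orbital period at these altitudes, s
    D_COVER_KM = 1600.0   # diameter of the area one interceptor can defend

    # Speed at which the coverage circle sweeps along the ground track
    v_ground = 2 * math.pi * R_E / T_ORBIT_S             # ~7.4 km/s

    dwell_min = D_COVER_KM / v_ground / 60               # time the circle covers a point
    n_per_orbit = round(2 * math.pi * R_E / D_COVER_KM)  # circles laid edge to edge

    print(f"time over a launch site: {dwell_min:.1f} min")       # ~3.6 min
    print(f"interceptors per orbit, no overlap: {n_per_orbit}")  # 25

Requiring overlap between adjacent circles, as noted above, pushes the per-orbit count up to the 40 to 50 range.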

So far we have taken into account the motion of the interceptor in its orbit but not the fact that the Earth is rotating under this orbit. Three and a half hours after the situation shown in Figure 5, North Korea will have moved 4,000 km (2,500 miles) east. The interceptors on this orbit will no longer be able to reach missiles launched from North Korea: Figure 6 shows that the yellow circle no longer contains any part of the missile trajectories. That means the system would need seven or eight orbits spaced around the Earth, each with 40 to 50 interceptors, so that interceptors on these other orbits will be over North Korea as the Earth rotates.

Fig. 6. Three and a half hours later than the situation shown in Figure 5, the Earth will have rotated under the orbit and the interceptor in the yellow circle will no longer be able to reach missiles launched from North Korea toward the United States.
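The rotation figure can be checked the same way. In this sketch (my own approximation, taking North Korea to sit at roughly 40° N), a point on that parallel drifts east by the parallel’s circumference times the fraction of a day elapsed, and the constellation total is just planes times interceptors per plane:

    import math

    R_E = 6371.0    # Earth's radius, km
    LAT_DEG = 40.0  # rough latitude of North Korea (my assumption)

    parallel_km = 2 * math.pi * R_E * math.cos(math.radians(LAT_DEG))
    drift_km = parallel_km * 3.5 / 24     # eastward drift in 3.5 hours

    planes = 8                            # orbital planes, from the estimate above
    per_plane = (40, 50)                  # interceptors per plane, from above

    print(f"eastward drift in 3.5 h at {LAT_DEG:.0f} deg N: {drift_km:,.0f} km")
    print(f"total constellation: {planes * per_plane[0]} to {planes * per_plane[1]}")

The drift comes out a bit above 4,000 km, in line with the figure quoted above, and eight planes of 40 to 50 interceptors give the 300-to-400 total discussed next.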

Figure 7 shows eight equally spaced orbits (white lines) for a constellation optimized to cover North Korea, with a total of 300 to 400 interceptor satellites. That constellation, however, would only give constant coverage over latitudes near North Korea (red dot). Below about 35° latitude there would be big gaps in the coverage through which a country could fire a missile. And the constellation gives no coverage at all above about 55° latitude, which includes almost all of Russia (Fig. 8).

Fig. 7. Eight orbits (white lines) making up a constellation to cover North Korea.

Fig. 8. This figure shows the ground coverage (gray areas) of interceptor satellites in a constellation using equally spaced orbital planes with 45° inclination, assuming the interceptors can defend an area 1,600 km in diameter. The two dark lines are the ground tracks of two of the interceptors in neighboring planes. As the gray areas show, this constellation can provide complete ground coverage for areas between about 30° and 50° latitude (both north and south), less coverage below 30°, and no coverage above about 55°.

Achieving more global coverage would require a constellation of 1,000 or more interceptor satellites. Figure 9 shows a constellation of 24 orbits with inclinations of 65°. With 40 to 50 interceptor satellites per orbit, this system would have a total of 960 to 1,200 satellites.

Such a system would still only be able to engage a few missiles fired in a volley from the same place. It would give thin coverage at all latitudes between 70° north and south, assuming a boost-phase interceptor that could defend an area the size of the yellow circle in Figure 9.

Fig. 9. This figure shows a constellation of 24 orbits with inclinations of 65°. With 40 to 50 interceptor satellites per orbit, this system would have a total of 960 to 1,200 satellites and could give thin coverage of the Earth between 70° north and south latitude. The yellow circle is the area one interceptor could cover, which we assume is 1,600 km in diameter, as in Figures 4-6.

Two final notes:

  1. It doesn’t make sense to put midcourse interceptors in space: midcourse interceptors do not need to be close to the launch site, and deploying them in space leads to a very expensive system compared to ground-based systems.
  2. For a geographically small country bordered by water—in particular, North Korea—boost-phase intercepts may be possible from airborne drones or ships, which are options currently being researched.

For more on space-based defenses, click here.

What Congress Does Next Could Cost Farmers and Taxpayers Billions

UCS Blog - The Equation (text only) -

Management intensive rotational grazing of beef cattle is one example of a conservation practice incentivized by CSP. Here, the author moves cows at the Michigan State University AgBioResearch Center in Lake City, Michigan. Photo: Paige Stanley

This year has been hard for all farmers—they have faced an ongoing trade war from the Trump administration and an uphill battle with climate change. But farmers who want to use sustainable practices are being particularly hard hit, as their interests are sidelined for the benefit of agribusinesses. And for the rest of us, 2018 has—almost like clockwork—shown the failure of half-hearted efforts to control farm-sourced water pollution that contaminates drinking water and destroys fisheries.  

The House Committee on Agriculture’s farm bill proposal to eliminate a program that offers tangible hope in difficult times is the biggest blow yet. Not only is the Conservation Stewardship Program (CSP) popular among farmers, it addresses agricultural challenges and delivers environmental benefits that impact us all. As the deadline to complete the 2018 farm bill approaches, Congress should think long and hard before giving CSP the axe. According to new UCS analysis, they’d sacrifice as much as $4.7 billion in annual taxpayer value to do it.

Maybe you’re not familiar with the farm bill (no one completely understands it), or maybe you’re more concerned with other happenings, like the current attack on science at the EPA. I can’t say I blame you. But if you like to eat food and drink clean water—and you want strong returns on your tax dollars—then listen up.

The good, the bad, and the ugly of today’s farming system

Industrialized US agriculture is highly productive, but it comes at an enormous cost. UCS has documented how the two most widely-grown commodity crops, corn and soy, are failing to feed people and are grown in ways that degrade our soil, increase damage from droughts and floods, pollute drinking water, and create vast dead zones along our coasts. Industrialized animal agriculture often leads to even worse outcomes for the environment.

Luckily, there are better methods of agricultural production, and scaling them up is within reach. Conservation and ecologically based farming (agroecology) can not only prevent pollution and soil loss, they can help regenerate ecosystems, increase productivity, and improve farmer livelihoods. And while federal policies have played a big role in incentivizing many of today’s damaging practices, there are also federal programs that deliver solutions.

Introducing the Conservation Stewardship Program

Conservation practices can improve soil health and soil ecosystem function, which leads to reduced erosion and runoff, improved water quality, and taxpayer savings.

The five-year farm bill funds several such programs run by the US Department of Agriculture (USDA). Some, like the Conservation Reserve Program, pay for farmers to retire sensitive land from production. Others, like the Environmental Quality Incentives Program (EQIP) and the Conservation Stewardship Program (CSP), termed “working lands programs,” incentivize farmers to adopt more sustainable practices on farm lands that stay in production. Although these programs make up only 6 percent of total farm bill spending, they pack in co-benefits like soil, air and water quality, climate change mitigation, and wildlife habitat. Tiny, but mighty.

Of these, CSP is the crown jewel. It is the only program that promotes comprehensive, whole farm sustainability. As the largest conservation program covering over 72 million acres, CSP targets high priority sustainability concerns and ensures we’re getting the most bang for our buck. Not only does this program pay for practices that are scientifically proven to produce results, such as resource conserving crop rotations, management intensive rotational grazing, cover cropping, and establishment of wildlife habitats, it pays for farmers to implement such practices in combination. And that is where the money is, literally, as our analysis shows.

CSP offers taxpayers an eye-popping deal

We sought to quantify the return on investment of public dollars in CSP and compare the effects that Senate and House farm bill changes to CSP could have on farmers and taxpayers. You can dig into our detailed methodology but, in short, we did this by considering the cost of CSP to taxpayers (according to the USDA budget) and estimating the benefits that CSP is known to deliver, including things like reduced erosion, increased grazing land productivity, improved air quality, carbon sequestration, and more.

Benefits included cost savings to farmers (like reduced need for fertilizer) and consumers (like reduced expenses for contaminated water), as well as the projected benefits of other ecosystem services (like increased productivity and reduced greenhouse gases).

Here’s what we found:

  • For every dollar of taxpayer money invested in CSP, we get about $3.95 in returned value. This value is notably higher than ROIs estimated for other conservation programs, thanks to CSP’s holistic approach and synergistic benefits that maximize returns.
  • Using this ROI, we estimated that the House bill eliminating CSP would result in lost benefits of $4.7 billion per year (see the quick check after this list). Pause for shock effect (I know we did). These are costs that would affect us all: increased input costs for farmers, greater environmental degradation, and heightened risks to food security in a changing climate.
  • Conversely, we estimate that the Senate bill would lead to a net increase in benefits likely valued at around $1.2 billion per year. Though the Senate bill does include some CSP funding cuts, it also improves the program in ways that emphasize high-value practices, so it’s more efficient.
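To see how these numbers fit together, here is a quick consistency check of my own (not part of the UCS analysis), which assumes the lost-benefit figure is simply annual program outlays multiplied by the estimated ROI:

    ROI = 3.95               # returned value per taxpayer dollar (UCS estimate)
    LOST_BENEFITS_BN = 4.7   # $ billions per year lost if CSP is eliminated

    implied_outlay_bn = LOST_BENEFITS_BN / ROI
    print(f"implied annual CSP outlay: ${implied_outlay_bn:.1f} billion")  # ~$1.2 billion

Under that assumption, roughly $1.2 billion a year in program spending generates the $4.7 billion in annual benefits the analysis projects.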

The chart below summarizes the economic impacts of each of our farm bill scenarios.

Change in benefits calculated across four possible farm bill outcomes: 1) House bill is adopted and CSP is eliminated, 2) Senate bill is adopted, but without program improvements, and 3) Senate bill is adopted but with improvements resulting in a) a small increase (10 percent) to the expected ROI in both the Minimal and Likely ROI scenarios and b) a larger (33 percent) increase in both the Minimal and Likely ROI scenarios (see appendix for more details)

Farmers want—and need—incentives to pursue conservation goals

This program is in high demand. On average, between 50 and 75 percent of the farmers and ranchers who apply each year are turned away. Recently, more than 165 farmers and ranchers wrote a letter to House Agriculture Committee Ranking Member Collin Peterson urging him to not only maintain the program, but to keep the promise of enrolling 10 million new acres per year. The results of a survey of more than 2,800 farmers earlier this year provided even more evidence that farmers are interested in this type of support from the farm bill.

The fate of the CSP rests in the hands of those negotiating the 2018 farm bill. With the current farm bill set to expire on September 30, 2018, the House and Senate agriculture committees are scrambling to reauthorize a bill in time. Considering the intense backlash from farmers over Trump’s current tariff war and hot debates over proposed cuts to SNAP, this egregious side-stepping of the environment by cutting CSP is happening largely under the radar.

It’s illogical to eliminate, or even cut, a program that so efficiently provides broad-ranging environmental benefits to so many people across the country. Farmers are begging to keep it. Taxpayers benefit from it. The environment depends on it. So, while the House is busy trying to eliminate it – which would effectively cost us billions of dollars – all evidence suggests that strengthening it should be the real priority.

House and Senate negotiators are now deciding on the final outlines of the 2018 farm bill, including what happens to CSP. Tell them to prioritize this and other proven, science-based policies and programs that are good for all of us.

Paige L. Stanley holds a Master’s in Animal Science from Michigan State University and is currently a doctoral researcher at the University of California, Berkeley in the Department of Environmental Science, Policy, and Management. She is interested in transitions toward sustainable and humane livestock production systems, with a focus on beef cattle. Her research currently focuses on the barriers farmers and ranchers face in adopting sustainable management practices.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Photos: Paige Stanley; NRCS/Ron Nichols, Flickr Creative Commons

Bringing Communication Back Into Science: Incentivizing #scicomm Efforts for Early Career Scientists

UCS Blog - The Equation (text only) -

Photo: Laura Hamilton

About 10% of STEM PhD students ultimately go on to secure a coveted tenured position at a university. That is a discouraging statistic for those keen on an academic career track, especially considering that the overall number of new PhDs per year far outpaces the number of new faculty jobs per year.

Of course, that percentage may vary by specific discipline or sub-field, but the overall sentiment is the same no matter where you look — staying on the academic track has become the “alternate career.” Yet, landing one of these coveted faculty positions remains the pinnacle of success to many academics, so young scientists increasingly feel pressure to spend a majority of their time writing papers to be competitive for tenure-track positions. They might not even land that job when all is said and done anyway.

Worse, more time writing papers means less time for other valuable activities, such as education and outreach, that benefit the general public. The net result is that the public often learns of new scientific findings through a complicated reporting path that can grossly distort the original message, just like in a game of “Telephone.”

From correlation to doomsday

Imagine this: your research group discovers a new, small object orbiting the Sun many times farther away than Neptune. You know it will hardly affect anyone’s life, except maybe your collaborator’s down the hall, who works on similar topics. But the properties of this new object are weirdly similar to those of a small collection of other known objects, a resemblance that might suggest a new, undiscovered planet in the distant solar system.

News of your discovery is picked up first by your institution and then by other news outlets. You enjoy your time in the spotlight as people talk about your new discovery. Most outlets get most of the facts right. However, the concept of new and undiscovered planets in our solar system is not a new one, and a few outlets conflate this latest proposed planet with Planet X or Nibiru. Next thing you know, there’s a video on the internet claiming your new object is actually Planet X or Nibiru and is going to hit the Earth and destroy all civilization. It even goes so far as to accuse NASA of covering up the impending apocalypse.

The scenario just described is a true story that recently happened to my research group. While the video in question is admittedly a conspiracy video likely followed by believers of conspiracy theories, the sentiment of this story is all too familiar to many researchers. Too often, scientific results are subject to a game of Telephone that distorts the original result beyond recognition. These distorted results can then influence the beliefs of the public or even make their way into misguided governmental policy decisions. A much better scenario would be bypassing Telephone altogether, allowing the scientists themselves to share their work directly with the public.

Research vs. outreach, or research + outreach?

While the idea of having the scientists themselves share their results seems excellent in theory, the current cut-throat and competitive nature of academia renders it infeasible. The result is that many scientists, frequently early career scientists, yearn to do outreach work and acknowledge its importance, but have no time to spare from their research.

I’ve seen this exact phenomenon at play in my collaboration, the Dark Energy Survey, where I have been both very active in outreach efforts and chair of the Early Career Scientists Committee. With terabytes upon terabytes of beautiful images of the Southern Hemisphere sky, the science communication possibilities are practically endless. But the person-power available to make those scicomm possibilities a reality is painfully limited. A large part of the problem is simply that there are no incentives or rewards for doing outreach work. In fact, scientists who prioritize outreach are often punished for doing so in the form of one or two fewer papers to their names.

We need to reevaluate our priorities. Passion for outreach and science communication definitely exists, but the incentives don’t. In an era where young scientists need to publish more frequently to stay competitive for those coveted tenure-track faculty positions, there’s not much time for other things, including science outreach and communication. Publishing one more peer-reviewed paper won’t convince your uncle to pay taxes to fund basic science or your senator to vote “No” on that bill with detrimental consequences for your field — but science communication will.


Stephanie Hamilton is a physics graduate student and NSF graduate fellow at the University of Michigan. For her research, she studies the orbits of the small bodies beyond Neptune in order to learn more about the Solar System’s formation and evolution. As an additional perk, she gets to discover many more of these small bodies using a fancy new camera developed by the Dark Energy Survey Collaboration. When she gets a spare minute in the midst of hectic grad school life, she likes to read sci-fi books, binge TV shows, write about her travels or new science results, or force her cat to cuddle with her. Find her on Twitter or LinkedIn.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Public Discussion of Science Policy Surges Nationwide as Thousands Engage in Science Rising

UCS Blog - The Equation (text only) -

Photo: Omari Spears/UCS

The great awakening of the science community is only gaining steam in the wake of increased attacks on science. Since the spring launch of Science Rising, we’ve recorded more than 125 events submitted by 118 organizations around the country who are focused on making sure science is front and center in the decision-making processes that affect us all. As we get closer to the midterm elections, it becomes ever more critical for us to create conversations that encourage elected officials to protect science. Here are 5 Science Rising events you may have missed—and 5 more coming up, including a Twitter chat later today.

At a Science Rising event in June, residents of the Amani neighborhood in Milwaukee and community organizations gathered to discuss lead abatement strategies and organize around preventing lead exposure.

Gun Violence Prevention Challenge Summit & Hackathon: In Boston in April, public health advocates, local government officials, and neighborhood residents gathered to discuss evidence-based gun violence prevention solutions. The following day, they held a hack-a-thon to collaborate on innovative local solutions to gun violence. This event, supported by the Consortium for Affordable Medical Technologies, demonstrated a creative way to bring together community members and harness their collective power to explore new ways to address the issue.

Environmental Lobby Day: The Illinois Environmental Council, Illinois Sierra Club, and Faith in Place supporters gathered in Springfield for an April Environmental Lobby Day. Building relationships with legislators at the state level is an important way to make sure that science and environmental issues are being taken seriously. If you’d like to organize your own lobby day, check out this blog from the Illinois Environmental Council on how to organize a DIY Advocacy Day.

Black Panther Lives – Wakanda STEM Equity Outreach: This Oakland event began with the idea that science is relevant to all people—but science still struggles with a history of racism, exclusion, and inequity. By bringing in pop culture references like Black Panther, as well as well-respected scientists of color to speak at the event, the organizers were able to reach and resonate with a broader audience and help break down some of the barriers traditionally placed around science talks.

Hosting a Wikipedia Edit-a-thon: Many organizations are thinking about the broader science advocacy movement, and how to build a more diverse and inclusive community to stand up for science. 500 Women Scientists hosted an edit-a-thon as part of Caveat NYC’s Underground Science Festival to recognize the scientific contributions of women, gender and sexual minorities, and people of color, who remain underrepresented on Wikipedia’s pages.

And here’s a quick look at 5 upcoming Science Rising events, both in person and virtual. (Don’t see any in your area? Check the full list at www.sciencerising.org)

  • Tidal Town Halls in coastal Florida (August-October). Tidal Town Halls allow voters to hear ideas for solutions to sea level rise and other related topics directly from candidates running for public office, so that they may make an informed choice at the polls.
  • #ScienceRising Twitter chat: A Healthy Democracy Requires Honesty and Accountability: August 22, 12-1pm ET. Lying to the public for private or political gain is always wrong. We should all be able to know the facts, even when they are inconvenient—especially when they are inconvenient. Public officials and private interests should face consequences when they mislead the public. Future of Research and Science Rising invite you to the fourth in a series of Twitter Chats on the principles of #ScienceRising and the science advocacy movement.
  • Art for Science Rising: Interested Parties Hearing: Tonight in Columbia, Missouri, the city’s Office of Cultural Affairs will be seeking feedback on the design of a proposed large-scale mural that will showcase Columbia’s Climate Action and Adaptation Plan. Resident Arts is a recipient of an Art for Science Rising grant.
  • Empowering Puerto Rican Scientists to be Science Policy Stewards: In Puerto Rico on September 1 (and live-streamed online), you’re invited to attend the kickoff event for the Puerto Rico Science Policy Action Network.
  • Women’s Assembly for Climate Justice: In San Francisco on September 11, join an impressive gathering of women leaders from around the world to discuss how women are leading solutions on the front lines of climate change.
Getting your own event started

How to organize an event that makes an impact: If you’re inspired by these past events, you can organize an event in your community with guidance from the Union of Concerned Scientists. Watch the recording of the training, or use this checklist on How to organize an event.

When you’ve fleshed out the details of your event, submit it here so we can help you make it a success. Science Rising has plenty of additional resources, stories, and events to check out; more will be added regularly between now and the election. Help us send the message that the scientific community—and indeed, anyone who cares about the crucial role of science in our democracy—will resist attacks on science and fight to advance the role of science in public life.

Photos: Omari Spears/UCS; John Saller
