Combined UCS Blogs

Fake News about Chinese Nuclear Weapons

UCS Blog - All Things Nuclear (text only) -

On the left, a still from a video shot at an intersection in the Chinese city of Daqing. On the right, a picture of the Russian Topol-M taken during a military parade in Moscow. Both are carried on eight-axle TEL vehicles, indicating they are approximately the same size.

A video recently discovered on a Chinese internet service appears to show a new Chinese road-mobile missile making a turn at an intersection in the city of Daqing. The discovery generated sensational claims about changes in Chinese nuclear strategy. However, a careful search of Chinese sources shows that none of those claims can be substantiated. Some are obvious distortions.

The Dongfeng (DF)-41 Missile

Multiple foreign media sources claimed the missile in the video was a new nuclear-armed long-range ballistic missile called the DF-41. The Chinese government does not comment on the composition and capabilities of its nuclear missile force and has neither confirmed nor denied the existence of the DF-41.

The missile seen in the video appears slightly larger than the DF-31A long-range ballistic missiles China displayed in a national military parade in 2015. The U.S. National Air and Space Intelligence Center states the DF-31A has a range of 11,000+ kilometers and could deliver a single Chinese nuclear warhead, estimated to weigh approximately 500 kg, to targets within the continental United States.

Almost all of the reported information about the existence and characteristics of the DF-41 can be traced to a handful of foreign media sources that have a questionable track record when reporting on Chinese missile technology. However, the U.S. Department of Defense recently reported China “is developing a new road-mobile ICBM, the CSS-X-20 (DF-41) capable of carrying MIRVs.”

Although foreign media sources routinely claim the DF-41 could carry 10 or 12 nuclear warheads, the missile seen in the video could not. It’s too small given the mass of Chinese warheads. Similar in size and appearance to the Russian Topol-M, which can carry a payload of 1,200 kg approximately 10,500 km, the missile in the video may be able to carry two Chinese warheads, but it most likely is designed, like the DF-31A, to carry a single warhead and a set of countermeasures to confuse missile defenses.

If the missile seen in the video is the new road-mobile missile discussed in the Pentagon report, it purportedly has a slightly longer range than the DF-31A. This would allow China to reach US targets it could previously reach only with a liquid-fueled, silo-based missile called the DF-5—which was also displayed during the 2015 military parade. Because silo-based missiles are more vulnerable to a preemptive first strike, having a road-mobile missile with the same range as the DF-5 increases Chinese confidence in their ability to retaliate.

False Claims About Chinese Nuclear Strategy

On January 24, Popular Mechanics published a story with a still from the video that claimed the Chinese government “publicly announced the deployment” of the DF-41 and that announcement “is likely a warning to U.S. President Donald Trump, who is known for sharply worded anti-Chinese rhetoric and has announced plans for a new ballistic missile system.” Two days later The Independent ran the same story with the same claims. The Sun, the Daily Caller, the International Business Times, the Moscow Times, Quora.com, ZeroHedge.com, STRATFOR, TASS, RT, and Sputnik International all ran stories about the alleged Chinese nuclear missile “deployment” and what it supposedly revealed about the intentions of the Chinese government.

Breitbart ran the same story with the same claims on January 27, but with the added twist that the so-called deployment of the missile in Heilongjiang province, which shares a border with Russia, is a prelude to an “approaching Clash of Civilizations world war” where “Russia and the United States will be allied against China.”

The sole basis of the claim that China announced the existence and deployment of the DF-41 is a commentary in the English-language edition of China’s Global Times. The commentary is a response to the publication of images from the posted video in the Hong Kong and Taiwan media, which in turn seem to have their origins in a French website. Yet, the Global Times clearly states, “there has been no authoritative information on whether China has a Dongfeng-41 strategic missile brigade, how many such brigades it has and where they are deployed.” The Chinese commentary is critical of President Trump, and does express the hope that the existence of the DF-41 “will be revealed officially soon.” But that is a far cry from the “official announcement” described in many of the foreign news reports on the posted video.

Fake News about Nuclear Weapons is a Cause for Concern

The fabrication and distribution of misinformation about the size, capability and intent of China’s nuclear arsenal is nothing new. Several years ago an adjunct faculty member at Georgetown University cited Chinese-language blog posts to recast decades-old rumors from a Hong Kong tabloid as a “leaked Chinese military document” that allegedly proved China’s nuclear arsenal was ten times larger than existing US estimates. His assertions and sources are demonstrably not credible. Yet, Dr. Peter Navarro, an adviser to President Trump, repeated these alternative facts about the size of China’s nuclear arsenal in a recent book on Chinese military strategy.

President Trump recently directed Secretary of Defense Mattis to initiate a review of the US nuclear posture. This follows a series of statements in the wake of the November election that indicated Mr. Trump supported a major build-up of US nuclear forces. While the new U.S. president’s comments on the need for US nuclear modernization are not unprecedented, his ability to push modernization plans through a Republican-led Congress, despite the enormous projected costs, may be enhanced by exaggerated perceptions of a Chinese nuclear threat to the United States.

As the debate on US nuclear weapons policy takes shape under the direction of Secretary Mattis, who may have reservations about the need for a US nuclear build-up, it is important that US decisions be made on the basis of the best available information, rather than the alternative facts now circulating in Washington.

How to Ensure Self-Driving Vehicles Don’t Ruin Everything

UCS Blog - The Equation (text only) -

Zipcar’s former CEO has cast the self-driving future as a “heaven or hell” scenario, and she has a point. Self-driving cars could save lives, smooth traffic congestion, expand access to jobs or schools—especially for people who can’t drive themselves today—and reduce the number of vehicles on our roads. On the other hand, they could worsen smog and local air pollution, disrupt the US economy by putting millions of people out of work, justify cuts in public transit funding and services, and force urban planners to focus more on providing space for vehicles instead of for parks, bicyclists, or pedestrians.

To maximize the potential benefits of self-driving vehicles and minimize their potential harms, UCS developed this set of principles that we will be pushing policymakers, businesses, and other stakeholders to follow. Doing so will ensure that self-driving vehicles reduce oil consumption and global warming emissions, protect public health, and enhance mobility for everyone.

Science-based policy will be key for shaping the introduction of self-driving technology

Many are rallying against any regulation of self-driving technology beyond ensuring it’s safe to use. I’ve even heard the claim that over-regulating this technology will literally kill people by slowing the speed at which self-driving cars are introduced, thus delaying their potential safety benefits.

To be fair, this argument has merit. Self-driving vehicles are forecast to reduce the tens of thousands of roadway fatalities that occur each year in the US by as much as 90 percent, and can offset the rise of distracted driving that may have caused the biggest spike in traffic deaths in 50 years (though reaching these improved safety levels will take further advances in the technology and widespread deployment).

But, self-driving technology won’t just impact transportation safety. Researchers are forecasting how it will affect traffic congestion, vehicle-related emissions, land-use decisions, public transit systems, data security, and the economy. Unfortunately, the emphasis that many, including the US Department of Transportation, have placed on the safety benefits can distract from the need to consider how policy should address the other equally great potential impacts of self-driving technology.

I’m not saying self-driving technology should be regulated to the scrapheap. The technology is highly likely to improve traffic safety and increase access to transportation—both important outcomes. Yet self-driving vehicles will need to be regulated on issues other than safety, as their full breadth of potential impacts won’t be addressed by safety-focused policy or market forces alone.

For example, studies have found that self-driving vehicles could double transportation emissions (already the largest source of climate change emissions in the US), put millions of Americans out of work as automated driving replaces truckers and taxi drivers, and/or exacerbate urban sprawl.

The jackpot for winning the race to produce the best self-driving vehicle can still be won even if these negative effects occur, and today’s policy frameworks may be insufficient to effectively curtail these future impacts. Let’s not forget that automakers have historically been against regulation (see: seat belts, fuel economy, air bags) and are encouraging policymakers to clear the way for self-driving vehicles not only because they seek to improve transportation safety, but because they see a potential to make a profit.

So science-based policy covering the broader implications of self-driving cars, including how they affect emissions and our economy, will be needed to ensure the best possible self-driving future, and these discussions need to happen today. To kickstart these conversations, UCS released these principles aimed at creating a safe, healthy, and equitable autonomous future. Join the conversation on whether and how self-driving technology should be regulated by checking out our new self-driving vehicle web content and signing up for future action alerts here.

North Korea’s February Missile Launch

UCS Blog - All Things Nuclear (text only) -

North Korea reportedly launched a medium-range missile Sunday morning local time (about 6 pm Saturday on the US east coast).

People are speculating about what missile it could have been. Based on the range, there are at least two candidates, which would be distinguishable by US intelligence if it was able to observe the launch.

Fig. 1

The missile was apparently launched eastward from the Panghyon air base near Kusong, northwest of Pyongyang and traveled 500 km, splashing down in the Sea of Japan. According to the South Korean military, it flew on a lofted trajectory, reaching an apogee of about 550 km.

A missile flown on this trajectory would have a range of 1,200-1,250 km if flown on a standard trajectory with the same payload (Fig. 1).
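For readers who want a sense of where an estimate like that comes from, here is a rough sketch using a flat-earth, no-drag ballistic model. It is not the model behind the figure above, which accounts for the Earth's curvature, rotation, and other effects; it only shows that the reported 500 km range and roughly 550 km apogee imply a standard-trajectory range in the right ballpark.

    # Rough, illustrative sketch only: convert a lofted test trajectory into
    # an approximate maximum range using a flat-earth, vacuum (no-drag) model.
    # This is not the model used for the estimate above, which accounts for
    # Earth's curvature, rotation, and other effects.
    import math

    g = 9.81  # m/s^2

    def standard_range_km(test_range_km, apogee_km):
        """Approximate max range from a lofted shot's range and apogee."""
        R = test_range_km * 1000.0
        h = apogee_km * 1000.0
        # For a vacuum trajectory, apogee/range = tan(theta)/4 gives the launch angle.
        theta = math.atan(4.0 * h / R)
        # Burnout speed squared, from apogee = v^2 * sin^2(theta) / (2 g).
        v_squared = 2.0 * g * h / math.sin(theta) ** 2
        # The same speed fired at 45 degrees gives the maximum range, v^2 / g.
        return v_squared / g / 1000.0

    print(round(standard_range_km(500, 550)))  # roughly 1,160 km

The simple model lands near 1,150-1,160 km; including the Earth's curvature at these speeds adds modestly to that, broadly consistent with the 1,200-1,250 km figure.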

Nodong or KN-11?

That range is similar to that of the North Korean Nodong missile, which was first tested in the early 1990s and has been launched repeatedly since then. Another launch of the Nodong would not be particularly useful for advancing Pyongyang’s missile program, so if that was what was launched it would have had a political motivation.

However, as Jeffrey Lewis points out, the trajectory is very similar to the trajectory the submarine-launched KN-11 missile flew in its first successful test last August. While similar in range to the Nodong, the KN-11 has the advantage that it uses solid rather than liquid fuel, which means it would take less preparation time before a launch. The North is likely to be interested in developing and testing a land-based version of the missile.

If this is what was launched, it would represent a useful developmental step for North Korea, no matter what may have driven the timing of the launch.

The KN-11 would have a clear fingerprint that would distinguish it from the Nodong (or the Musudan, see below), since it has two stages rather than one, and that difference would be clear if US, Japanese, etc., sensors were able to observe the test.

Other options?

Some of the reports have speculated the test was of a Musudan missile, but I haven’t seen anything about the test that supports that. The Musudan range is considerably longer. The one successful Musudan launch, which took place last June, suggested a maximum range of about 3,000 km, although a recent analysis suggests that the range is probably less than 2,500 km if it carries a payload similar to the mass of a nuclear warhead. (Note that repeated claims that the Musudan can reach up to 4,000 km are not credible.)

It’s also worth noting that North Korea apparently fired several extended-range Scud missiles last September, which have a similar but somewhat shorter range than that seen in this test, depending on the payload. These are also single-stage missiles and could be distinguished from a KN-11 test.

Of course, the North may surprise us with something else entirely.

Can Republicans Find Their Voice on Climate Change via a Carbon Tax?

UCS Blog - The Equation (text only) -

Earlier this week a group of conservative opinion leaders and experts launched the Climate Leadership Council, championing a national carbon tax to cut emissions and help achieve climate goals.

As with any carbon pricing proposal, the politics are complicated and there is no telling how much traction this particular initiative will get. There are also definite concerns about some of the details of the proposal. But it’s very encouraging to see a meaningful solution to climate change put forth by conservatives. I look forward to seeing where this will go, especially with Republican lawmakers and the Trump administration.

Starting from the facts

This proposal begins with recognizing the scientific facts about climate change and the urgency of acting on solutions. To see leading conservatives articulate those basic realities is important, and I hope Republicans in Congress and the Trump administration are listening.

Climate change should not be a partisan issue. There’s no time to waste on the dangerous new types of denial or delay tactics that were in evidence during the nomination hearings for Rex Tillerson and Scott Pruitt, for example.

Just like the near-universal consensus among climate scientists about the facts of climate change, there is an overwhelming consensus among economists that a carbon price is an essential policy tool for driving down carbon emissions. The CLC proposal’s starting price of $40/ton CO2, escalating over time, shows the seriousness of their effort.
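For a rough sense of scale, here is a back-of-the-envelope sketch (mine, not the CLC’s) translating that $40-per-ton starting price into a surcharge at the gas pump, using the commonly cited EPA estimate of roughly 8.9 kg of CO2 emitted per gallon of gasoline burned:

    # Rough, illustrative arithmetic only: converting a $40 per metric ton
    # CO2 price into a gasoline surcharge. The 8.9 kg CO2 per gallon figure
    # is the commonly cited EPA combustion estimate, used here as an assumption.
    carbon_price_per_ton = 40.0   # dollars per metric ton of CO2
    co2_per_gallon_kg = 8.9       # kg of CO2 per gallon of gasoline

    surcharge_per_gallon = carbon_price_per_ton * co2_per_gallon_kg / 1000.0
    print(f"about ${surcharge_per_gallon:.2f} per gallon")  # about $0.36 per gallon

That works out to roughly 36 cents per gallon, noticeable but modest next to ordinary swings in gasoline prices, which is part of why the complementary transportation policies discussed below still matter.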

What’s more, the authors of the proposal recognize that we have to act on climate as a global community and the US must live up to its international commitments under the Paris Climate Agreement. Yes, to meet long-term climate goals countries will have to do a lot more than they have currently committed to, but walking away from the Paris Agreement would be a serious mistake.

Notes of caution

There is obviously room for discussion about ways to improve the policy proposal, as and when it gets serious consideration from policymakers. Some aspects of the proposal that could definitely use further scrutiny include:

  • Regulatory rollbacks that harm public health or undermine key legal protections are cause for concern. The EPA’s authority to regulate global warming emissions is a critical safeguard that cannot be negotiated away. There may be middle ground possible here but further conversations with a wide set of stakeholders, including environmental justice groups, are critical.
  • A carbon price alone will not be sufficient to deliver on the deep emission reductions consistent with climate goals; we need complementary policies to address other market failures. For example, policy incentives for innovation in low carbon technologies are important. In sectors like transportation, a small surcharge on fuel prices won’t be enough to drive the big changes needed in vehicle fleets and the investments in infrastructure for public transit or electric vehicles, so other policies are needed. And we need policies to address non-CO2 emissions, such as methane.
  • What happens with the (considerable) carbon revenues is obviously a hugely important policy choice that must be made in consultation with lawmakers, with the interests of the broader public squarely in mind. Priorities—such as appropriately offsetting the disproportionate impacts of energy price increases associated with a carbon tax; transition assistance for coal workers and coal-dependent communities; assistance for communities facing climate impacts, especially frontline low income and minority communities; and investments in low-carbon infrastructure—require dedicated funding which could come from carbon revenues, or would require appropriations from Congress.
Getting (back) to bipartisan approaches on climate policy

In recent years, views on climate change have become politicized to the point that climate denial has become a form of tribal identity for most conservative-leaning politicians, and one more instance of the ‘just say no’ approach to any issue championed by the Obama administration.

Given the anti-science rhetoric from many Republicans in Congress, it’s hard to remember that there was a time when climate change was not a partisan issue. There was a time when Senators John McCain and Lindsey Graham and other leading Republicans not only openly accepted climate science but worked hard, together with Democrats, to find bipartisan solutions.

We got tantalizingly close to a national climate policy in the form of the American Clean Energy and Security Act of 2009 (aka the Waxman-Markey bill), which passed the House but was never brought to a Senate vote because of insufficient support. The failure of that legislative effort is what led to the EPA’s Clean Power Plan as an alternative. Regulation was not the first choice of the Democrats or of the Obama administration.

There is lots of blame to go around about how and why bipartisan approaches to addressing climate change have failed thus far. But we don’t have the luxury to wallow in past mistakes; we have to break through the partisan divide and act on climate now.

And that’s why I am particularly encouraged by a proposal from conservatives that attempts to bridge that divide, albeit imperfectly.

The future can be different

Call me a delusional optimist, but I fervently hope that Republicans in Congress will now feel free to acknowledge the reality of climate change because that position will no longer be associated with a Democratic administration. And that they will work to advance solutions that can help meet the urgency of the challenge we face.

Even during the Obama years, there were some who stepped out of the party line, including a group of Republicans who joined the bipartisan Climate Solutions Caucus in the House and those who signed on to the Gibson Resolution.

Yesterday, along with the news of the CLC carbon tax proposal, we also heard news of four new members added to the bipartisan Climate Solutions Caucus. The Caucus now has 12 Republican members and 12 Democratic members.

Maybe these types of bipartisan efforts will grow in strength and size and we will get to a political tipping point on climate action. Maybe climate science and smart solutions can take center stage instead of partisan politics. One can hope this happens soon…

Actually, no. Hope is simply not enough. We need action urgently.

Republicans (and Democrats) must step up

We cannot afford another four years of denial, obstruction, artful lies, and ‘just say no’ politics, aided by fossil fuel interests. Climate impacts are already imposing harms on Americans and a costly burden on our economy. The recent climate data are stunning and sobering. Just a few examples:

Meanwhile, solutions like ramping up wind and solar energy are getting cheaper every year, and bring the promise of huge new economic opportunities IF we accelerate the momentum already underway.

Let’s build that clean energy infrastructure and create jobs. Let’s cut pollution from fossil fuels, which causes numerous health problems, including exacerbating asthma in children, contributing to other heart and lung ailments, and even causing premature death. Let’s help coastal communities struggling with flooding worsened by sea level rise.

And let’s put a price on carbon while we’re going about it. There’s nothing partisan about any part of this bright vision for our future.

Still waiting for Republican leadership on climate change

Of course, President Trump must also show leadership from the top. His administration’s threats to dismantle existing climate and energy policies without any clear alternative plan are not a promising start. Thus far, the administration doesn’t show any indication of an interest in helping Americans facing the impacts of climate change, or recognizing the serious consequences of our continued dependence on fossil fuels.

If the president won’t lead, then Congress—including members of his own party—needs to have the courage to hold him accountable and advance their own climate solutions, perhaps along the lines of the CLC proposal.

The future will not be kind to this Congress and this administration if all they do is continue to find new creative ways to deny the science and dodge their responsibility to act on climate. We the people—Democrats, Republicans, and Independents alike—deserve much better from our government.

Electricity Rates Are Sorely Outdated. Let’s Give them an Upgrade.

UCS Blog - The Equation (text only) -

Last month, to great and enthusiastic email fanfare, my utility presented me with a redesigned electricity bill, one meant to help me better understand the various costs and components that make up the final amount due. In an entirely relatable manner, my household met such news with chortles of joy. What a day!

But the utility’s trick? Colors and a stacked bar chart. They were nice colors, and yet…it proved a letdown. If our electricity bills contained just a bit more of the right information, we could collectively be saving billions of dollars a year, reducing air pollution all around us, and helping to bring ever more renewables online—a true step forward toward our vision of the modern grid. Now tell me that’s not a neat trick.

Shining a light on system costs

So what’s the right information, and how do we get it? Time-varying electricity rates, or rates that go up and down to let us know when it’s costlier and less efficient to be using electricity, and when it’s cheaper and cleaner.

As my colleagues and I explain in a new issue brief, Flipping the Switch for a Cleaner Grid, with that extra information we can make more informed decisions about how and when to use electricity, and save money and clean our air in the process.

Right now, most of us get charged the same flat rate for electricity no matter when we use it. But in reality, the actual cost to the system varies widely by time of day, day of the week, and even season. These fluctuations in price are driven in large part by the need to meet ever-changing customer demand.

In particular, though we can’t see it with flat rates, our last bits of ill-timed load can mean sky-high prices as the system powers up inefficient plants, which we pay to build and maintain even though we use them for just a small amount of time each year. Talk about a wasteful design. By using price signals to mobilize flexible demand, time-varying rates flip this operations paradigm on its head.

Rates as guides

Time-varying rates use price signals to encourage customers to use electricity at some times and not others. Credit: UCS.

Time-varying rates are designed to encourage customers to alter when and how they use electricity. Different structures go about it in different ways to target different points of inefficiency. The figure on the right shows three of the most common forms: time-of-use (TOU) rates, critical peak pricing (CPP), and real-time pricing.

  • TOU rates (top right) target daily repeating patterns of peak and off-peak periods,
  • CPP rates (middle right) focus on just those few hours a few days a year when electricity use is at its very highest, and
  • Real-time pricing (bottom right) approximates the actual system cost in 5-minute to 1-hour intervals, which allows interested customers to best take advantage of the dynamic up-and-down swings of prices.
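To make the comparison concrete, here is a minimal sketch of how a simple two-period TOU rate changes a household bill relative to a flat rate. Every number in it (the rates, the peak window, the hourly load profile) is hypothetical and chosen only for illustration; it is not an actual utility tariff.

    # Minimal, hypothetical sketch: flat rate vs. a simple two-period
    # time-of-use (TOU) rate. All rates, hours, and loads are made-up
    # numbers for illustration, not an actual tariff.
    FLAT_RATE = 0.15                               # $/kWh at all hours
    TOU_RATES = {"peak": 0.25, "off_peak": 0.10}   # $/kWh
    PEAK_HOURS = set(range(16, 21))                # hours starting 4-8 p.m. (the 4-9 p.m. window)

    # One day of hypothetical household use, in kWh for each of 24 hours,
    # with an evening peak.
    hourly_use = [0.4] * 16 + [1.5, 1.8, 2.0, 1.6, 1.2] + [0.5] * 3

    flat_bill = sum(hourly_use) * FLAT_RATE
    tou_bill = sum(
        kwh * (TOU_RATES["peak"] if hour in PEAK_HOURS else TOU_RATES["off_peak"])
        for hour, kwh in enumerate(hourly_use)
    )
    print(f"flat: ${flat_bill:.2f}   TOU: ${tou_bill:.2f}")
    # With this evening-heavy profile the TOU bill is higher; shifting a few
    # kilowatt-hours of that 4-9 p.m. load to off-peak hours brings it below
    # the flat bill, while the flat bill never changes.

Real tariffs layer on fixed charges, seasonal rates, and CPP events, but the basic mechanics are the same: the price signal rewards moving use out of the peak window.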

Time-based rates are not new; TOU and CPP rates in particular have been around for a long time, especially for commercial and industrial electricity customers. However, only with the deployment of tens of millions of smart meters over the last few years have wide-scale, administratively low-cost programs become readily attainable at the residential level.

Still, except for a few places where state-wide implementation of time-varying rates is on the table (see California and Massachusetts, for example), most utilities continue to see these rates as a boutique approach.

Put me in, Coach!

Despite their simplicity, time-varying rates can deliver significant benefits for the grid by shepherding lots of individuals into taking small actions at the same time—in aggregate, all these little contributions can add up to major effects. Take a look at the example below from New England to get a sense:

New Englanders move as one when the Patriots are in the Super Bowl – namely, to the front of the TV at start time, and into the kitchen at the half. Credit: ISO-NE.

The left panel shows the load curve, or total electricity demand, for a regular winter Sunday in 2012; the right shows Super Bowl Sunday of that year, when New England played New York. Notice the narrowing of the peak and the spikes on the far right of the Super Bowl curve around 6:30, 8, and 10 p.m.? They correspond with the start, half-time, and end of game, respectively.

Now the half-time spike might look small, but it’s actually in the range of a whole natural gas generator needing to come online. Time-varying rates provide a mechanism for coordinating that type of chip-and-dip-refill fervor in our everyday lives.

In practice, the options for shifting demand run from simple to high-tech. For example, doing something like pressing the “delay start” button on a dishwasher (or just waiting to press start) is an easy, no-upgrades-required fix. On the other hand, some forms of flexibility require a technology intervention before they can be used, like turning water heater tanks—commonly a large residential electricity load—into energy storage devices that heat water during off-peak periods for use whenever needed. Because these resources can be so valuable to the system overall, it can be worth it for utilities to sponsor some of the upgrades themselves.

Excitingly, the recent mass deployment of smart meters means that many new opportunities for shifting electricity use and responding to price signals are beginning to be explored. In particular, innovation around third-party aggregators controlling electricity-dependent devices—from air conditioners to electric vehicles, in ways that are imperceptible to users—could mean even bigger opportunities for savings.

Still, it’s important to look back at that Super Bowl example to remember that it doesn’t actually take much to make a big difference to the grid, and that what we can do today is already a lot.

Fast-tracking our clean energy future

When we talk about the benefits of flexible demand—including those resulting from time-varying rates—we usually focus on the immediate (and persistent) cost savings that occur from not bringing those last costly power plants online. But such benefits are only the beginning of the story. This is especially the case when we consider the needs our grid will have as we race toward a clean energy future supplied by vast amounts of renewable resources.

Time-varying rates can help support a brighter, cleaner, more joyful wind-powered world. Credit: UCS.

Because wind and solar power production is variable, we need ways to fill the gaps when the wind eases or a cloud passes. Additionally, as more and more solar power comes online, the grid can start to run into challenges when the sun sets; solar resources decrease electricity production right around when people are returning home for the night and starting to use lots of electricity.

To manage this variation, we’ve traditionally relied on fossil-fueled power plants. But that reliance comes with a number of strings attached, and often at the expense of renewables, as my colleagues in California have detailed.

Enter flexible demand. If we can guide electricity use to times when our renewable resources are most abundant—and away from when they aren’t—we can take a vitally important step forward on the path to a clean energy future, and make the many and varied goals of our modern, clean grid easier to reach.

Critically, to ensure that access to these benefits is equitable and widespread, it takes a well-designed, well-considered program, as we lay out in our issue brief and as our peers have been diligently monitoring in California.

Think time-varying rates are neat? Take a peek at all the other wonders of an upgraded grid

Here at UCS, we’re working hard to make sure the electricity grid is ready and able to bolster our vision of a clean energy future. Time-varying rates, and their ability to unleash the incredible power of flexible demand, are but one part of this vision. In the time to come, my colleagues and I will be sharing exactly how we see upgrades to the grid enabling this pursuit; for now, though, allow our new video calling for an upgraded grid to brightly shine a light:

Missile Defense Agency to Choose Preferred Location for Third GMD Site

UCS Blog - All Things Nuclear (text only) -

Sometime in early 2017, and it could be any day now, one of the communities on the map below (designated by red dots) will get big news from the Missile Defense Agency (MDA). Congress directed the MDA to choose a preferred location in case the United States decides to build an additional deployment site for the Ground-based Midcourse Defense (GMD) system.

The site studies were on track to wrap up at the end of 2016. We’ve updated our fact sheet on it, posted here.

Fig. 1. Sites being studied as a potential third site for the GMD system: Fort Custer Training Center, near Battle Creek, MI; Camp Ravenna Joint Military Training Center, near Akron, OH; and Fort Drum, NY. (Source: Google Earth)

There’s no military requirement for an additional missile defense site. Nor was the idea of building a third site (in addition to the two existing ones in Alaska and California) the result of a rigorous study of what would best improve the system’s ability to intercept ballistic missile threats to the homeland.

But you can count on Congress to run with this idea and push as hard as it can.

Fig. 2. Workers preparing an interceptor in Alaska (Source: Missile Defense Agency)

Every year since 2012, Congress has attempted to dedicate/earmark money to build such a site, despite Pentagon budgets that never included a dime for it. When asked, missile defense officials have said repeatedly that they have higher priorities for their next dollar. And they are skeptical about what starting this expensive project would do to their priorities in a constrained budget environment, including improving the reliability and effectiveness of the existing system. Improving reliability and effectiveness would be a good thing. The GMD system has been plagued with serious reliability problems and has a poor test record.

However, congressional delegations (with a few exceptions) from Michigan, New York and Ohio have crossed party lines and asked the Missile Defense Agency to support locating the site in their respective states. Their support appears to be largely driven by an interest in creating jobs. Each proposed site is in an economically depressed area, and many in the local communities are understandably eager for an infusion of federal cash to generate new job opportunities.

But is this an effective way to create jobs?

Let’s talk about money. This would be an expensive project. The Congressional Budget Office estimated that a new site would cost at least $3.6 billion to build and operate over the first five years. This includes ground equipment ($1.2 billion); developing the site, building the facilities, and constructing the silos ($1 billion); the cost of buying 20 interceptors ($1.3 billion); and operations costs ($100 million). For the full complement of 60 interceptors, it would cost at least $2.6 billion more.

Note, however, that the interceptors would not be built at the new sites, and neither the $1.3 billion for the first 20 interceptors nor money for extra interceptors would be spent locally. For example, Raytheon builds the GMD system’s kill vehicles in a facility outside Tucson, which it recently expanded to increase its capacity. The GMD interceptor’s boosters are also produced primarily in Arizona, at Orbital ATK’s facility outside of Phoenix.

So support for local industry and jobs for constituents may partially explain why Sen. John McCain, who usually provides a healthy dose of skepticism about defense expenditures, has endorsed the plan to build a third site.

Turning back to the potential sites in the Midwest, the above estimates indicate that under this plan, the Pentagon would spend at most about $2.3 billion in the local community. While that sounds enticing, studies show that military spending is not a particularly effective way to generate good-paying jobs. Investing a comparable amount in clean energy technologies, health care, or education is likely to create a much larger number of jobs across all pay ranges than military spending.
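For the arithmetic behind that $2.3 billion ceiling, here is the CBO breakdown from above tallied up, with the interceptor purchases (built out of state) subtracted. This is just a restatement of the numbers already cited, not new data.

    # Tallying the CBO estimate cited above (first five years, billions of
    # dollars) and subtracting the interceptors, which would be built out
    # of state, to get a rough ceiling on local spending.
    costs = {
        "ground equipment": 1.2,
        "site development, facilities, and silos": 1.0,
        "first 20 interceptors": 1.3,
        "operations": 0.1,
    }
    total = sum(costs.values())                              # ~$3.6 billion
    local_ceiling = total - costs["first 20 interceptors"]   # ~$2.3 billion
    print(f"five-year total: ${total:.1f}B, local ceiling: ${local_ceiling:.1f}B")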

The GMD site studies provided detailed information about what kinds of jobs would be created by building a new site. While it varies from site to site, the estimate is that construction would generate 600 to 800 temporary jobs. A large fraction of those jobs, 15 to 50 percent, could be filled by workers from outside the region, depending on the skills of local residents.

After construction, the site would require an operations staff of 650 to 850 people. About 85 percent of the permanent staff jobs would be filled by workers from elsewhere because these positions demand specialized expertise.

The facility would indirectly generate a larger number of jobs, mainly low-to-median wage service jobs spurred by the economic activity. During construction, estimates range from 1,800 to 2,300 indirect jobs, while after the facility is completed, an estimated 300 to 400 indirect jobs would remain.

How does that compare to other types of investment?

Investing in wind projects would be a good bet—and both Michigan and New York are among the top 20 states for wind energy potential. As I noted a few years ago, a 2008 study by the National Renewable Energy Laboratory, which looked at the economic impact of building wind turbines in Colorado, estimated that developing 1,000 megawatts of wind-generated power would create 1,700 full-time equivalent jobs (including engineering and manufacturing jobs), and operation and maintenance would provide 300 permanent jobs in rural areas. In a 2013 report, Lawrence Livermore Laboratories calculated an average cost of building wind power to be $1,940 per kilowatt (and this cost is dropping). So these wind industry jobs would require an initial outlay of around $2 billion, comparable to the investment in a third GMD site, and would continue to provide a return on investment.
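The back-of-the-envelope behind that "around $2 billion" figure is simply 1,000 megawatts of capacity priced at the cited $1,940 per kilowatt:

    # Back-of-the-envelope check on the wind comparison above:
    # 1,000 MW of capacity at the cited $1,940 per kilowatt.
    capacity_kw = 1_000 * 1_000   # 1,000 MW expressed in kilowatts
    cost_per_kw = 1_940           # dollars per kW (2013 figure cited above)
    total_cost = capacity_kw * cost_per_kw
    print(f"${total_cost / 1e9:.2f} billion")   # ~$1.94 billion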

For roughly the same amount of money, Hemlock Semiconductor, in Saginaw County, Michigan, created 1,000 new jobs, spending $2.5 billion over five years on manufacturing facilities that produce materials for solar panels.

Building a third GMD missile defense site isn’t the result of a considered study of priorities to strengthen U.S. security, nor is it a sensible next step to improve strategic missile defense capabilities. It is symptomatic of a broader problem with strategic missile defense: Congress is providing neither adequate oversight nor the necessary skepticism.

Regardless, we expect Congress to continue to push for a new site anyway once a preferred site is selected. However, if Congress has an extra few billion dollars available for one of these locations, it is fair to ask that it be spent in a way that provides economic security for the chosen community and a much better return on investment.

Congress is Trying to Protect Federal Scientists Because President Trump Isn’t

UCS Blog - The Equation (text only) -

Today members of the Senate, led by Senator Bill Nelson, introduced a bill to strengthen scientific integrity in federal decision making. If ever there was a time that such a bill is needed, it is now.

Today, members of Congress introduce a bill to strengthen scientific integrity at federal agencies and enhance protections for government scientists. Photo: USDA

The Trump administration has already revealed its disrespect for the use of science in federal decision-making. From instating sweeping gag orders on federal scientists right out of the gate, to across-the-board hiring freezes and disruptive holds on grants and contracts, early indications suggest that this administration is not likely to be a leader in championing scientific integrity in government decision-making.

Moreover, the administration’s pick to lead the EPA, Scott Pruitt, has expressed limited understanding and respect for the EPA’s scientific integrity policy, noting in his confirmation hearing, “I expect to learn more about EPA’s scientific integrity policies.” In the face of such abuses, a move to strengthen scientific integrity at federal agencies is certainly welcome.

A bill to strengthen federal scientific integrity

Aimed “to protect scientific integrity in federal research and policymaking,” the bill requires federal agencies that fund or conduct science to adopt and implement scientific integrity policies, an idea initially introduced by the Obama administration in 2009. Specifically, the bill compels science agencies to develop scientific integrity policies that include specific provisions to enhance scientific integrity.

Importantly, the bill reinforces key elements of some federal agencies’ scientific integrity policies. It includes provisions requiring agencies to develop procedures that allow scientists to review and ensure the accuracy of public-facing materials that rely on their work, such as reports, press releases, and factsheets. This provision could help safeguard against political interference that might come from political appointees or public affairs staff that edit scientific documents before they are released. This type of political interference happened in several instances under the George W. Bush administration. Julie MacDonald, for example, a political appointee at the Department of the Interior, edited the scientific content of a document that served as evidence for listing the sage grouse under the Endangered Species Act.

A safeguard against such abuse could prove useful under a Trump administration, which has already suggested that it will emphasize uncertainty on climate science on NOAA websites and appears to be keeping a tight control on agencies’ scientific communications. The provision could be made even stronger by granting scientists the right to approve the scientific content of the public-facing materials that rely on their work.

Preventing political tampering

Another provision of the bill requires agencies to develop procedures that “identify, evaluate the merits of, and address instances in which the scientific process or the integrity of scientific and technological information may be compromised.” This is an important inclusion since to date, not all scientific integrity policies at federal agencies have detailed procedures for assessing the validity of and addressing allegations of scientific integrity abuses.

This lack of clarity in current agency policies has had damaging impacts on scientists who raise, or are accused of, scientific integrity violations. A scientist at Los Alamos National Laboratory, for example, appeared to have lost his job over publishing a paper that the Department of Energy didn’t like. When a scientist at the US Department of Agriculture was accused of violating the scientific integrity policy, he was subjected to a long review process that may not have included an independent assessment of the claims. Thankfully, both the DOE and USDA have revised their scientific integrity policies to strengthen the allegation evaluation procedures.  A law requiring all science agencies to make allegation procedures clearer would improve evaluation of scientific integrity violations across the government and give federal scientists fairer assessments.

The bill also requires the National Academy of Public Administration to conduct a study of scientific integrity across the government. This is a great idea and one that was included in our recent policy recommendations to the Trump administration. An independent assessment of the effectiveness of scientific integrity policies would provide illuminating findings on how the relatively new policies and procedures could be further improved.

A positive step in uncertain times

To date, 24 federal agencies have developed scientific integrity policies. The policies vary in quality, but in general they afford federal scientists rights to communicate, and include provisions to safeguard against political interference in science-based decisions. The bill would strengthen these provisions by uniformly applying some basic protections across all science agencies. This raises the floor on scientific integrity in the government.

Kudos to all 26 senators co-sponsoring this welcome legislation. This includes the ranking members of the Senate Environment and Public Works Committee (Sen. Carper), the Senate Energy and Natural Resources Committee (Sen. Cantwell), the Senate Health, Education, Labor, and Pensions Committee (Sen. Murray), the Senate Armed Services Committee (Sen. Reed), and of course, the Senate Commerce, Science, and Transportation Committee (Sen. Nelson).

Advancing Scientific Integrity Through Federal Advisory Committees

UCS Blog - The Equation (text only) -

Back in October, I provided a comment at a public meeting for a National Academies of Sciences, Engineering, and Medicine (NASEM) advisory committee that was set up to review the process to update the Dietary Guidelines for Americans. Their first charge was to write a report with recommendations on how the Dietary Guidelines Advisory Committee (DGAC) selection process could be improved to provide more transparency, minimize bias, and include committee members with a range of viewpoints.

After some time to assess the DGAC’s process and consider the public feedback they received, the committee released the report last Friday. It includes several important proposals that would be beneficial for the DGAC, and really all federal advisory committees (FACs), to employ. My assessment of the report will come later, but first, I want to talk a little bit more about the importance of FACs, generally.

Quick facts on FACs

FACs play an indispensable role in providing independent science advice for our government’s decision making. The government relies on this technical advice from scientists outside the government on everything from drug approvals to air pollution standards to appropriate pesticide use. There are over 1,000 advisory panels within the federal government, some of which offer technical scientific advice that may be used by agencies to inform key policy decisions. Some advisory committees are mandated by law, while others are created for ad hoc policy guidance. The Federal Advisory Committee Act requires that agencies take measures to ensure transparency and ample public participation, but how and the degree to which these are implemented varies depending on the agency.

In our most recent report, “Preserving Scientific Integrity in Federal Policymaking,” we discuss the opportunity to improve the way in which federal agencies obtain objective technical advice from advisory committees so that conflicts of interest are minimized and fully disclosed. Several studies have shown a positive association between authors’ financial conflicts of interest and recommendations that benefit those vested interests. Likewise, an individual on an advisory committee may choose to sideline the evidence and instead make recommendations that favor their special interest, especially if they stand to profit in some way. Federal advisory committees have been co-opted by industry for political reasons before, including when G.W. Bush administration officials pushed existing committee members out and replaced them with appointees in order to reject the prospect of stricter lead poisoning standards.

The DGAC plays the essential role of analyzing heaps of nutrition and epidemiological data and making recommendations to the U.S. Department of Agriculture (USDA) and the Department of Health and Human Services (HHS) to inform the Dietary Guidelines for Americans that is released every five years. As a lover of food and a student of food policy, I rely on the DGAC to translate science into objective recommendations that will ultimately shape federal nutrition guidance and regulations spanning from school lunches to nutrition facts labels. UCS commended the DGAC on its 2015 report to HHS and USDA, most notably for the way in which it followed the science to recommend that Americans consume no more than 10 percent of daily calories from added sugars.

NASEM’s report challenges undue influence of science

The NASEM committee’s report identified five values upfront that would enhance the integrity of the DGAC selection process, which closely echo the core values we identified for ensuring scientific integrity in federal policymaking:

  • Enhance transparency
  • Promote diversity of expertise and experience
  • Support a deliberative process
  • Manage biases and conflicts of interest
  • Adopt state-of-the-art processes and methods

For the reasons I mentioned earlier, the fourth value could use strengthening to something more like “Minimize and manage biases and conflicts of interest,” to emphasize that conflicts should be avoided, if possible, to maximize objectivity.

Figure: NASEM

As for its concrete guidance, the NASEM committee suggested changes to HHS and USDA’s process (see figure at right), including that when the departments first solicit nominations for the DGAC, they should “employ a third party to review nominations for qualified candidates.” This would add a crucial layer of independent review into the process, especially if, as NASEM recommends, the third party is an “organization without a political, economic, or ideological identity,” and not necessarily an expert in nutrition or dietary guidance. The NASEM committee would also add a public comment period after the provisional committee is selected by the departments, allowing an opportunity for the public to weigh in on any potential biases or conflicts of interest of the proposed members. We strongly agree with NASEM’s assertion that “candid information from the public about proposed members is critical for a deliberative process.”

The report also recommended that the departments create and make public strict policies on how to identify and manage conflicts of interest and mandate that committee members sign a form that captures nonfinancial conflicts of interest and biases, since that is not currently covered by the required Office of Government Ethics form. Additionally, the committee elaborated on what “management” of conflicts of interest looks like in practice and had some helpful ideas like granting waivers in limited amounts (and making them public) depending on the type of conflict, asking that individuals sell stock or divest property to avoid conflicts, excluding members with conflicts from certain discussions and voting, or allowing for a review of potential conflicts of interest to be discussed at the beginning of each meeting. The committee also suggested that a statement be added to the final DGAC report to review how biases and conflicts of interest were managed throughout the advisory committee’s work.

Overall, the report managed to cover most of the recommendations I made in my public comment, but one thing that I hope the committee explores in its future deliberations is the prevention of undue influence from department leadership after the DGAC report has been submitted, since that is where the translation of science into policy is most critical. DGAC is solely advisory and should not have a role in writing the final Dietary Guidelines report, but it would be appropriate for former DGAC members to have a role in peer review and to make sure that the report language fairly considers the best available science and aligns with DGAC’s recommendations. This last part of the process proved controversial in the most recent version of the Dietary Guidelines: the DGAC recommended that environmental sustainability concerns be included in the DGA, because the overall body of evidence points to a dietary pattern higher in plant-based foods and lower in meat, but the final report left out these important concerns.

NASEM should follow its own advice on conflicts of interest

In light of this report, it seems that NASEM should follow its own advice as it considers itself to be a purveyor of nonpartisan, objective guidance for policymakers, but has been recently scrutinized for conflicts of interest on its own panels. This past December, the New York Times reported that NASEM put together a committee of 13 scientists to make recommendations on regulation of the biotechnology industry, and failed to disclose the clear conflicts of five of the committee members. In fact, the majority of committee members had conflicts (7 out of 13), and the NASEM study director was applying for a job at a biotechnology organization while he was putting together his recommendations for committee members. If that isn’t egregious enough, three of the committee members he recommended for the NASEM biotech panel were actually on the board of the organization at which he was seeking employment. This level of undisclosed conflict is completely inappropriate and should have been caught in the early stages of the committee selection process, not uncovered after the final report had already been released. NASEM should strive to “promote diversity of expertise and experience,” as the committee identified as a core value, rather than stack committees with individuals that have similar industry experience and connections.

Ode to independent science

Independent science at its core must be free from undue political or financial pressure. We of course acknowledge that all policy decisions are not made based on science alone, but in order to create the best possible government policies, the relied-upon science must be independent. We appreciate the work that this committee is putting into advising DGAC on how best to ensure the process facilitates truly objective science advice, because FACs are vulnerable to politicization or interference if not carefully managed. This report should be considered by all federal agencies and other entities, including NASEM itself, that seek to provide scientific advice to policymakers for the benefit of us all.

Restoring America’s Wetland Forest Legacy

UCS Blog - The Equation (text only) -

Like many white, middle-class, suburban kids, I grew up with one foot in the forest. To me, that small woodlot, a green buffer along a half-polluted tributary, was a paradise unmatched by any other forest in the world. Unfortunately, like many other tracts of land across the United States, my childhood forest is gone—cleared for a housing development.

Wetlands, including wetland forests, are the “filters” of our natural system, combating pollution, removing excess nutrients, and securing fresh drinking water for surrounding and downstream communities. Photo: Dogwood Alliance.

Wetland forests offer massive economic benefits

Even small forests across the United States work to provide “ecosystem services”—non-monetary benefits like clean water, clean air, carbon sequestration, hunting, fishing, and yes—recreation for children. Ecosystem services may sound like “lip service” to the natural world, but they’re not. New York City chose to spend $500 million to protect and preserve its upstream watershed (and resulting water quality), to avoid the $3-5 billion price tag of a new water supply system on the Hudson River. Forests in the U.S. offset about 13% of our yearly carbon emissions. In 2002, southern forests supported over a million jobs in the recreation/tourism sectors, generating $19-76 billion in annual revenue. All of these services require healthy, standing forests across the landscape.

As our country continues to grow, we are increasing the pressures on our forests. We need clean air and clean water, but we also need wood products, food, and housing. As Research Manager at Dogwood Alliance, I work every day with other organizations and communities to improve the quality and quantity of southern forests. Much of my day-to-day is focused on coordinating and organizing a new initiative, the Wetland Forest Initiative, to conserve, restore, and improve southern wetland forests.

Cypress-tupelo forests, also known as bottomland hardwood forests, can occasionally have trees live for over a thousand years. Photo: Dogwood Alliance

Wetland forests are the best of both worlds. You can visit during a dry season to walk beside ancient trees, or explore during the wet season by kayaking in submerged habitat, teeming with aquatic invertebrates, migratory birds, fish, reptiles, and amphibians. “Wetland forest” describes so much of the American landscape—from forests edging creeks and the culturally treasured bayous; to coastally influenced forests, which somehow survive the onslaught of the ocean. Wetland forests span 35 million acres across 14 southern states, and provide twice the ecosystem services value of upland forests.

Taking action to save our wetlands

Yet, with a majority of wetland forests lost—cleared for agriculture, drained for commercial or residential development, even cut and converted to fast-growing commercial pine plantations—we are at a fork in the road. Will we allow our wetland forests to dwindle to less than one percent of their original range, like we did with longleaf pine? Or will we take action now to conserve these vital ecosystems, before it’s too late?

Wetland forests are home to many endemic species, found nowhere else on earth. This photo was taken during a flyover search for the swallow-tailed kite, a bird native to southern wetland forests. Photo: Maria Whitehead

The Wetland Forest Initiative is working to conserve, restore, and improve these habitats. In special places, we will work to protect the legacy of rare, threatened, and endangered species, ensuring that they will have habitat for decades to come. In places where wetland forests have been degraded by lack of management, changes in hydrology, or pollution, we will work with local groups and governments to restore the land to ecological function. Beyond the tree line, we will work with politicians and government agencies to ensure that landowners receive fair compensation for their restoration and conservation efforts. And perhaps most importantly, we will work with communities to educate them about the beauty and importance of what’s happening on the ground in their local wetland forests.

Although I never thought I would leave academia, I am happy to spend my working hours on a project that has the potential to impact 35 million acres across 14 states. Despite the differences in opinion that some of our member organizations may have, it is inspiring to see so many people from different walks of life (academic, community, environment, forestry, government, land management, landowners, social justice, and tribal) come together and create meaningful change. I am excited for the future of our southern wetland forests.

I encourage you to head over to the Wetland Forest Initiative website to learn more, endorse the platform, and get your organization or university involved.

Sam Davis is Research & Program Manager at Dogwood Alliance. A life-long treehugger, Sam earned a Ph.D. in Environmental Science in 2015 at Wright State University, and completed a postdoc at University of California Merced before leaving academia for greener forests. Sam is thrilled to be translating science into action with Dogwood Alliance. On the weekends, Sam enjoys hiking, home improvement, and gaming with friends and family.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Oregon’s Climate Check-Up Offers Serious Prognosis Without Preventative Action

UCS Blog - The Equation (text only) -

Each January, I journey to my doctor’s office for my annual physical. She briefly reviews my medical history before conducting an examination, and we end our visit by discussing key risk factors and a plan to manage them.

Well, just in time for the start of the 2017 legislative session, Oregon received its periodic “climate physical.” The results are sobering, and the treatment plan involves further action to put the Beaver State on the path to a low carbon, climate-resilient economy – a path to good “climate health.”

Oregon, like other states, is already experiencing climate change

The Third Oregon Climate Assessment Report by the Oregon Climate Change Research Institute (OCCRI) incorporates findings from recently published studies on climate science and impacts in Oregon.

Hotter and drier conditions caused by climate change contribute to increased wildfire risks and other key impacts in Oregon. Source: UCS

The legislatively mandated report reaffirms what scientists have been telling us. Oregon is already experiencing the impacts of climate change, and human activity has played a key role. It’s a stark contrast to statements by several of the Trump administration’s cabinet nominees.

According to the authors, global emissions of heat-trapping gases are largely responsible for the overall increase in average annual temperatures in the Pacific Northwest over the past century. (Yes, despite an unusually cold winter, the statewide average temperature for 2016 was still much warmer than average.) They found additional signs of human-caused global warming in the 2015 record-low snowpack, more acidic waters off the Oregon coast in 2013, and wildfire activity over the past three decades.

A future of more extremes in every region of the state

Oregonians will face more severe impacts in the future if we continue on our current global carbon emissions trajectory. As shown in the table below, annual temperatures could increase by an average of 8 degrees by the century’s end compared to the late 20th century.

Average temperatures will continue to rise in Oregon compared to the late 20th century under both low and high emissions pathways. Source: Oregon Climate Change Research Institute

Rising temperatures will mean a shrinking snowpack, earlier snowmelt, and diminished summer water supplies as well as increased wildfires and more acidic oceans that affect coastal ecosystems. Sea-level rise will lead to more coastal flooding and erosion. There also will likely be overall negative impacts to agriculture over time.

The 100+ page report provides detailed information and projections for each of these impacts. One of the most striking findings is that higher temperatures and a record-low snowpack despite normal precipitation levels – conditions that led to the devastating 2015 snow drought – could become commonplace by mid-century.

Another key takeaway is that climate change will affect every region of Oregon. It will also disproportionately impact tribal communities, as well as low-income and rural residents and communities of color. The assessment divides the state into four regions, with snapshots of anticipated climate impacts over the rest of the century:

  • The Coast: Due to rising sea levels, thousands of homes and more than 100 miles of road face a greater risk of inundation. Warmer and more acidic oceans will affect near-shore fisheries and hatcheries, endangering the local shellfish economy and the workers who rely on that industry. Wildfires in coastal forests will likely become increasingly common as well.
  • The Willamette Valley: Heat waves will grow in frequency and intensity as temperatures continue to climb, increasing heat-related illnesses and deaths among the region’s residents. Studies project increasing summer water scarcity and growing wildfire risks that could significantly expand burn areas.
  • The Cascade Range: Precipitation will increasingly fall as rain instead of snow, affecting the ski industry and water supplies. At the same time, forests will likely become even more vulnerable to wildfire, insect infestations and disease. Increased risk of wildfire-related respiratory illnesses is a key health concern for Jackson County.
  • Eastern Oregon: As snowpack shrinks, water supply will be a concern, especially for residents in the John Day basin with no man-made water storage capacity. Drought is a key health risk for Wasco, Sherman, Gilliam, and Crook counties. The Blue Mountains will also likely experience higher tree mortality and wildfire activity.
Ambitious climate action is the prescription

The Third Oregon Climate Assessment Report includes good news for Oregonians. The worst climate impacts can be avoided through ambitious efforts to curb global carbon emissions.

The Beaver State has already taken significant steps to decarbonize its economy, yet it’s still not on track to meet its near-term 2020 emissions goal. Two key next steps for Oregon are ensuring that any transportation funding package helps reduce global warming emissions from the transportation sector, and putting a price on carbon. A carbon price is an important tool in the overall portfolio of critical policies for cutting heat-trapping pollution.

The Oregon legislature should show continued leadership by heeding the experts’ prognosis and taking further preventative climate action today to ensure its climate health tomorrow!


Standing Up to Pernicious New Attacks on Federal Climate Scientists

UCS Blog - The Equation (text only) -

The time-tested climate denial strategy of attacking the reputations of prominent climate scientists in order to sow doubt about the evidence and risks of climate change is being trotted out again.

Exhibit A: The Daily Mail, a British tabloid, has published a screed by David Rose alleging serious scientific misconduct by Dr. Tom Karl, a leading climate scientist recently retired from the National Oceanic and Atmospheric Administration (NOAA).

A writer with a history of inaccurate reporting on climate science, Rose claims that Karl and coauthors deliberately used misleading global temperature data, side-stepped NOAA scientific integrity policies, and “rushed to publish” a 2015 paper in the prestigious journal Science in order to influence the climate negotiations held that year in Paris. His piece draws in part on a blog post by former NOAA scientist John Bates.

The Science paper is one of several recent studies refuting the notion that the rate of global warming had slowed down, or “paused”, in recent decades, an idea that opponents of climate policies have often used to justify inaction on reducing emissions. Karl and coauthors showed the apparent “pause” in warming was simply an artifact of how earlier studies had over time incorporated data on ocean surface temperatures from different sources (satellites, ships, buoys and so on); when temperature data sources and quality were properly taken into account, no slowdown was detectable.

Repeated and amplified through the climate denial echo-chamber, Rose’s allegations of misconduct have now been taken up by Rep. Lamar Smith (R-TX), Chairman of the House Science, Space, and Technology Committee. Smith, who has long used his perch to harass NOAA scientists, issued a press release reiterating these unsubstantiated claims and accusing Karl and colleagues of manipulating data for political purposes.

Along with other recent high-profile attacks on prominent climate scientists and science agencies, this may well be part of a larger political strategy to intimidate federal scientists; justify cuts to agency budgets, staffing, and missions; weaken support for US and international climate policies; and, most fundamentally, erode public trust in the science and evidence so central to a functioning democracy.

At its core, it is a very old strategy.

As the Irish essayist Jonathan Swift wrote in 1710, “Falsehood flies, and the Truth comes limping after it; so that when Men come to be undeceiv’d, it is too late; the Jest is over, and the Tale has had its Effect…”

But today, scientists are fighting back.

Rose’s claims have been quickly and forcefully rebutted:

  • Top experts on temperature record research have called attention to several errors in Rose’s piece and his failure to mention that multiple independent published analyses support and corroborate the corrected temperature data in the NOAA scientists’ findings.
  • To claims that Karl and colleagues violated NOAA guidelines on scientific integrity, Rear Admiral David Titley (Ret.), former chief operating officer at NOAA, points out that “[t]here is both a NOAA internal process on scientific integrity….and the opportunity to submit allegations of wrongdoing to the Department of Commerce Inspector General who if there is reasonable evidence to substantiate the allegation, would undertake an independent investigation.” Yet no allegations of violations of the NOAA scientific integrity policy were brought to the agency’s scientific integrity office regarding this research.
  • Jeremy Berg, editor of Science, firmly rejects the notion of a “rush to publish”: “The article by Karl et al. underwent handling and review for almost six months [longer than average for this journal]. Any suggestion that the review of this paper was ‘rushed’ is baseless and without merit. Science stands behind its handling of this paper, which underwent particularly rigorous peer review.”

Global land/ocean temperature records from NOAA, NASA, Berkeley Earth, Hadley/UEA, and Cowtan and Way. The old (pre-Karl et al. 2015) NOAA temperature record is only available through the end of 2014. Source: Hausfather et al. (2017), Assessing recent warming using instrumentally homogeneous sea surface temperature records, Science Advances.

Attacks on the reputations and research findings of federal climate scientists are a deplorable attempt to distract attention from the overwhelming evidence of climate change and the urgent need to deeply reduce carbon emissions from the burning of fossil fuels and other sources.

We can’t keep tabloids from publishing misinformation. But we can and must hold elected officials accountable for doing their jobs to protect science and evidence-based decision-making.

As former Congressman and Chair of the House Science Committee Sherry Boehlert (R-NY) puts it: “The current attacks should be received with extreme skepticism, given the enormous body of evidence supporting the conclusion that the climate is changing and poses a danger that needs to be addressed. And public officials have an obligation to follow the scientific consensus…”

Chairman Smith, it’s high time for you to follow suit.

What is Oil Used For? What the Super Bowl Commercial Didn’t Tell You…

UCS Blog - The Equation (text only) -

A commercial during yesterday’s Super Bowl about oil may have given you pause.

Besides the sports car (about to go off-roading), the commercial was about things you probably don’t associate with oil. Like graffiti; makeup; prosthetics; a heart; and outer space.

Is oil really diversifying? Or is this ad just a marketing ploy?

Looking at data from the U.S. Energy Information Administration, it is pretty clear that oil and natural gas are still being used overwhelmingly for what they have always been used for—combustion, whether in vehicles or power plants.

The American Petroleum Institute (API) ran the commercial in question. API is the largest oil trade association in the United States; member companies include BP, ConocoPhillips, Chevron, ExxonMobil, and Shell. You may have heard of API for its role in a concerted campaign to spread denial about climate change. API merged with America’s Natural Gas Alliance last fall, so it now lobbies for both oil and natural gas interests. The merger came about because major oil companies now hold large natural gas assets.

As a chemist, I know that many common products like asphalt, paint, and plastics have oil or natural gas as a precursor ingredient. And while these products have many positive impacts on society, they represent only a tiny fraction of the oil and gas industry’s output and should not be used to justify the bulk of its business. Over 90% of oil and gas is used for combustion, either in power plants or vehicles.

Let’s not discount the many benefits energy provides society

But while coal, oil, and natural gas have been our primary sources of energy for many decades, we will not rely on them in the future. We are moving to a world that gets most of our energy from clean, renewable resources like wind and solar. This is in large part because the cleanest sources of energy are becoming the cheapest. Our cars and trucks can plug into that clean grid for their future fueling needs.

Many chemists are exploring ways to make plastics and other materials from non-petroleum resources such as plants. This is great work (and tough chemistry) that will lead to a more sustainable world. But if we are going to stop the worst effects of global warming and clean our air, we must remember the most obvious effects oil and natural gas are having on our communities and our world.

We have solutions

While oil may currently play a role in making paint, plastics, or rocket fuel, it doesn’t “gush art,” “pump life,” or “explore space”–that would be artists, doctors, and scientists. And it is artists painting a picture of environmental justice; doctors treating patients suffering from asthma; and scientists discovering clean energy solutions.

The Fate of the Clean Power Plan under President Trump

UCS Blog - The Equation (text only) -

Shortly, we are likely to see and hear much more about what jurists, Congress, and the new Administration think about the Clean Power Plan, the cornerstone of our nation’s efforts to reduce carbon emissions. Regardless of how the court rules—and how Congress and President Trump respond—there’s no denying the reality of climate change or the many compelling reasons to double down on the clean energy transition already underway.

Imposing limits on carbon pollution would help the President deliver on two campaign promises—to create jobs and protect clean air.

Protesters rallied outside the US Court of Appeals for the District of Columbia Circuit early yesterday as judges prepared to hear arguments on the Clean Power Plan.

Accelerating the clean energy transition

Market trends are already driving a transition to cleaner energy. The costs of wind and solar energy are dropping dramatically, driving new renewable energy deployment that is outpacing all other new energy resources. This transition is delivering huge health and economic benefits to communities around the country.

The Clean Power Plan would lock in those gains and create a framework for continuous improvement, just as the Clean Air Act took on pollution problems in previous eras (acid rain in the 1970s, soot and smog in the 1980s, and mercury earlier this century). While these pollutants still cause problems, sometimes concentrated in low-income or racially diverse neighborhoods, the CAA required significant pollution prevention measures to be taken. We need to do the same for carbon dioxide and other greenhouse gases.

How Tillerson and Pruitt view US Climate Action

As we wait for the DC Circuit Court of Appeals decision, expected in the near future, we’ll likely see confirmation votes for Rex Tillerson, the former CEO of the world’s largest fossil fuel company, and possibly Scott Pruitt, one of the state attorneys general who sued to have the Clean Power Plan overturned.

On left: Scott Pruitt. On Right: Rex Tillerson

As Secretary of State, Tillerson will be called upon by the foreign ministers of 190 countries to account for how the US plans to meet its commitment to the Paris Agreement. While additional policies to limit harmful global warming emissions beyond the CPP would still be needed to meet the US international climate targets, the CPP  is the down payment.

Tillerson has said he would like to see the US maintain a ‘seat at the table’ of the climate talks. If the Administration is casting aside cost-effective emission reducing actions like the CPP, he’ll find that seat more than a little warm.

As part of the EPA Administrator confirmation process, Scott Pruitt conceded that carbon is a pollutant subject to Clean Air Act regulation, indicating that the CPP has a strong legal foundation. The Clean Air Act itself, and subsequent elaborations through the 2007 Mass v. EPA Supreme Court decision and a 2009 Endangerment Finding by the EPA, make this absolutely clear.

However, when asked if there was an EPA program or rule he supported, he could not or would not cite a single one—which doesn’t bode well for his leadership of the agency.

The Clean Power Plan is the Clean Air fight of this generation

I’ve had the privilege of working with clean air advocates for 20 years. I’ve heard the stories of how they successfully fought for laws that would curb the acid rain contributing to dying lakes in the Northeast; measures to reduce the emissions of soot that settled on cars downwind of Midwest coal plants; tailpipe standards to clear smog-choked cities; and limits on the mercury that was contaminating fish in our streams.

The pattern is always the same: scientists study the problem and identify the causes; advocates petition EPA and Congress for action; and industry casts doubt about the science and fights the solutions with claims of economic collapse.

Ultimately, when all legal remedies are exhausted, industry complies at a cost far less than predicted and the promised health improvements from cleaner air are realized. My colleague Rachel Cleetus noted in her blog the benefits of EPA for real people and cited the finding that “over a 20-year period from 1990 to 2010 the Clean Air Act helped drive down total emissions of the six major air pollutants by more than 40 percent while GDP grew more than 64 percent.”

While we are far from having pristine air quality, the Clean Air Act rests on a science-based process that ratchets regulations tighter as better information and new cost-effective pollution control technologies become available.

My career has largely been spent trying to get carbon pollution treated the same way as these other pollutants.  Carbon is the pollutant driving the most pressing environmental problem of our generation. Its impacts go beyond typical local and regional air pollution effects, like the aggravation of asthma and other respiratory diseases, to threaten the ‘regulator’ of the planet, the very climate that makes human existence possible.

Climate impacts demand a response

As global average temperatures rise, arctic ice melts, sea levels rise, heat waves are more frequent and last longer, and extreme weather events intensify. Scientists and advocates began calling for action to reduce carbon emissions at least as far back as the early 1990s, hoping to prevent these events from coming to pass.  We are now seeing these impacts as our reality. They are becoming more common as every day passes, leaving little room for doubt that our climate is changing.

The predicted impacts are coming to pass, and despite the doubt continuing to be peddled by the likes of Tillerson and Pruitt, scientists do know—with a great deal of certainty—that burning fossil fuels is the primary cause of those impacts and they can predict, with ever improving reliability, what a warmer world would look like.

And it’s not good; it’s not something we can simply ‘adapt to,’ and it’s coming to pass faster than expected.

Both legally and morally, this Administration is compelled to act on clean air and climate. Many local and state governments are fully committed to continuing clean energy and climate progress because it’s good for public health and their local economies, and many businesses will continue to ramp up their clean energy investments because it’s good for their bottom line.

Throwing out the Clean Power Plan won’t bring back coal. Coal is increasingly uneconomic for a variety of reasons, including cheaper alternatives like natural gas and, increasingly, wind and solar. Those market conditions will exist with or without the CPP. That’s why the Trump Administration and Congress must do something real to help miners and coal-dependent communities instead of engaging in meaningless posturing around the CPP. The clean energy transition is good for our health and is one of the fastest-growing job creators. Now we need to make it work for all Americans.

The Clean Power Plan could also prevent us from becoming over-reliant on natural gas. A rush to gas would hit consumers the hardest, due to the price volatility that results from the boom-and-bust cycles of gas exploration. I’m sure it is hard for an Oklahoma oil company attorney like Mr. Pruitt to believe, but too much natural gas is bad for the economy and our health.


What’s your climate plan, President Trump?

So the real question is, regardless of how the court rules, what will this Administration do to tackle today’s air pollution crisis: the need to reduce the carbon pollution that is fueling global warming?

The Clean Power Plan rule did not come about on a whim. It wasn’t rushed out the door as the Obama Administration was leaving. After decades of inaction by Congress, the EPA crafted these rules over a three-year period that included consultation with scientists, state officials, and power companies, as well as public hearings. The agency reviewed millions of comments from citizens around the country. As with healthcare, this Administration has an obligation to replace the rule if it intends to repeal it.

Before Pruitt is confirmed, Senators and all Americans are entitled to know, if not the Clean Power Plan, then what? President Trump, how will your Administration address this huge environmental and public health problem?


Nuclear Safety Performance at Pilgrim

UCS Blog - All Things Nuclear (text only) -

The Nuclear Regulatory Commission (NRC) held a public meeting on Tuesday, January 31, 2017, in Plymouth, Massachusetts. A large crowd of over 300 individuals (perhaps thousands more by White House math) attended, including me. Elected officials in Massachusetts—the attorney general, the governor, the entire US Congressional delegation, and state senators and representatives—had requested the meeting. Many of these officials, or their representatives, attended the meeting.

The elected officials asked the NRC to conduct a public meeting to discuss the contents of an email from the leader of an NRC inspection team at Pilgrim to others within the agency regarding the results from the first week’s efforts. An NRC staffer forwarded this email to others within the agency, and inadvertently to Diane Turco of the Cape Downwinders, a local organization. The contents of the leaked email generated considerable attention.

Unique NRC Meeting
During my nearly two decades at UCS, I have attended dozens, perhaps hundreds (maybe even millions by White House accounting) of NRC meetings. The Plymouth meeting was unique. It was the only NRC meeting I’ve attended to discuss an email.

And it was the only NRC meeting I’ve attended where public speaking slots were chosen by raffle. In all prior meetings, members of the public raised their hands to be called upon by the NRC staff, queued behind a microphone in the room in order to speak, or added their names to a list to speak in the order specified by the sign-up sheet. At this meeting, the NRC used a raffle system. I received Ticket #4 (see Figure 1), giving me an opportunity to “win” a chance to speak for up to 3 minutes (or 180 seconds, whichever came first) during the meeting.

Fig. 1 (Source: Nuclear Regulatory Commission)

Fig. 2 (Source: Nuclear Regulatory Commission)

My ticket, along with at least 74 other tickets, was placed into a fishbowl. Brett Klukan, an attorney in NRC Region I, drew tickets from the bowl to establish the speaker order. Because the fishbowl was clear glass, Brett gazed at the ceiling to avoid charges of cherry-picking preferred ticket numbers (see Figure 2). Brett then wrote the number drawn on a whiteboard without showing the number to anyone else, somewhat offsetting the averted-gaze tactic since he could have jotted down any number he wished.

Unique NRC Discussion

Brett Klukan opened the meeting by introducing the NRC panelists and covering some ground rules for the meeting. The ground rules included a decorum standard—any audience member disrupting the meeting three times would be asked to leave. If the individual did not leave voluntarily, Brett explained that law enforcement officers (and there were numerous uniformed officers in the room and in the hallway outside) would escort the person from the room.

Brett then turned the meeting over to the NRC panel of Dan Dorman, the Regional Administrator for NRC’s Region I, Bill Dean, the NRC’s Director of the Office of Nuclear Reactor Regulation, Raymond Lorson, the Director of the Division of Reactor Safety in Region I, and Don Jackson, the leader of the NRC inspection team at Pilgrim and author of the email.

Don went through the leaked email, which he had written, updating the audience on each issue and supplementing the email with results from the team’s efforts since that initial week. I had expected the NRC to talk about what systems, components, and administrative processes the inspection team examined, but anticipated the NRC would not discuss results until the team’s report was approved and publicly released. But Don candidly provided the results, too. More than once, Don explained that the team identified an apparent violation of NRC’s regulations—in fact, he stated that 10 to 15 potential violations had been identified.

After the NRC panel finished their remarks, the meeting moved to comments and questions from the public. I was the third member of the audience to speak to the NRC. Figure 3 shows Brett Klukan at the podium to the left, the NRC panel in the center, and several members of the audience turning to look at the speaker standing at the microphone located towards the back of the room out of view to the far right.

Fig. 3 (Source: Nuclear Regulatory Commission)

I asked the NRC four questions, and the panel answered after I had posed all four. My questions and the NRC’s answers:

UCS Question #1

The NRC’s 20-member inspection team covered a lot of ground, but still examined a small fraction of the safety systems at Pilgrim. Based on the large number of safety violations in the small sample the team examined, what assurance can the NRC provide about the state of the majority of safety systems the team did not examine?

NRC Answer: The NRC’s reactor oversight process (ROP) features periodic inspections of safety systems at Pilgrim with the team inspection being supplemental to those activities. If there were problems in those other safety systems, the periodic inspections would reveal them.

UCS Response: Don Jackson described his team identifying 10 to 15 apparent violations of federal safety regulations in the small sample of safety systems they examined—violations that apparently were NOT revealed previously by the ROP’s periodic inspection efforts. Because those routine inspections failed to find violations in the small sample the team examined, there is little assurance they are finding violations in the much larger set of safety systems the team did not examine.

UCS Question #2

Don Jackson explained that the text in his email about the staff at Pilgrim appearing overwhelmed or shocked referred to their reaction to the arrival of the NRC’s 20-member inspection team. Does the NRC believe that this staff might also be overwhelmed or shocked in response to an accident?

NRC Answer: Don Jackson explained that his email comments referred primarily to the plant’s support staff (e.g., engineers and maintenance workers) rather than to the control room operators. Don said that his assessment of the operators at Pilgrim during their duties in the control room and during exercises on the control room simulator gave him complete confidence that the operators would be able to respond successfully to an accident.

UCS Response: Even if Don’s assessment is correct (and the operators losing control of the reactor during a routine startup and causing it to automatically shut down to avoid fuel damage, the operators mis-operating numerous safety components following Winter Storm Juno, and the operators not receiving proper training on the use of the high pressure coolant injection system all leave room for doubt), it is incomplete. The response to an accident involves considerably more than the handful of operators on duty at the time. NRC’s regulations require dozens of other plant workers to staff the Technical Support Center, the Operations Support Center, and the Emergency Operations Facility. A work force that freaks out because 20 NRC inspectors arrive on site—by an appointment made weeks in advance—could be equally stressed out responding to an unannounced accident.

UCS Question #3

Dan Dorman mentioned the NRC planned to conduct another public meeting in late March about this inspection and to release the team’s final report in mid-April. Would it be possible for the NRC to issue the final report before the public meeting to allow the public to review the report and participate meaningfully in the meeting?

NRC Answer: Don Jackson mentioned that the report for a recent team inspection at another nuclear plant was over 350 pages due to all the information it contained. He said it would take a sustained effort to issue the Pilgrim team’s report by mid-April, with no real opportunity to put it out sooner.

UCS Response: There are two items both under full control of the NRC—the public meeting and the team inspection report. I have no reason to doubt Don’s word that mid-April is the soonest that the report can be released. I have every reason to doubt why the NRC must hold the public meeting in late March. The NRC could conduct the public meeting in late April, or early May, or mid-May, or late-May, or early June, or any time after they release the team’s report. The only reason for the NRC to conduct a public meeting about a non-existent report is because that’s the way they prefer to do it.

UCS Question #4

Audience members for this meeting are given three strikes before they are out of the meeting. How many strikes has the NRC given Pilgrim before it is out?

NRC Answer: Bill Dean began to answer the question, but Dan Dorman interrupted him. Dan labeled the question rhetorical and directed Brett to proceed with the next speaker.

UCS Response: I appreciate NRC bringing back Bert the turtle with this Duck and Cover gimmick. To be sure, I’d have better appreciated the NRC’s explanation why audience members get dragged out of the room after three strikes while Pilgrim does not get shut down after 10 to 15 violations of federal safety regulations. But this is America where everyone has the right to chicken out. My apologies if I put the NRC in a fowl mood.

To Be (Shut Down) or Not to Be (Shut Down)

The recurring theme during the meeting was whether the known performance problems warranted the shutdown of Pilgrim (either permanently or until the problem backlog was eliminated) or if Pilgrim could continue operating without exposing the community to undue risk.

Best I could tell, the meeting did not change any participant’s viewpoint. If one entered the room believing Pilgrim was troubled but sufficiently safe, one left the room with this belief intact. If one entered the room feeling Pilgrim’s problems posed too great a hazard, one probably left the room with even stronger convictions.

The meeting was somewhat like a court trial in that two reasonably supported but entirely opposite arguments were presented. The meeting was unlike a court trial in that instead of a jury, only time may decide which argument is right.

The Argument for Pilgrim Continuing to Operate

The team inspection led by Don Jackson is a direct result of an increasing number of problems at Pilgrim that caused the NRC to drop its performance assessment from Column 1 of the ROP’s Action Matrix into Column 2, 3 and eventually 4. The NRC developed the ROP in the late 1990s in response to high-profile troubled nuclear plants like Millstone, Salem, and Cooper.

The Action Matrix has five columns. A reactor with performance so bad that the NRC places it into Action Matrix Column 5 cannot operate until the NRC is satisfied enough of the problems have been corrected to permit restart.

Dan Dorman and Don Jackson tried to explain during the meeting that it was not the number of problems that determined placement into Column 5, it was the severity of the problems that mattered. They said several times that the 10 to 15 apparent violations identified by the team reinforced the NRC’s determination that Pilgrim was a Column 4 performer, but did not cause them to feel movement into Column 5 was warranted.

The Action Matrix is like our legal system. Persons guilty of a single misdemeanor generally receive lesser sanctions than persons guilty of multiple misdemeanors who in turn generally receive lesser sanctions than persons guilty of a single felony. Persons guilty of multiple felonies tend to be those receiving the severest sanctions and incarceration.

Pilgrim got into Column 4 as the result of several violations identified by NRC inspectors that were classified as White, the second least severe classification in the NRC’s Green, White, Yellow, and Red system. The data suggest performance shortcomings warranting regulatory attention, but they do not suggest a trip to nuclear jail.

The Argument for Pilgrim Shutting Down

The NRC panelists stated several times during the meeting that they did not see any immediate safety concern that required Pilgrim to be shut down. Those assurances would be more meaningful and credible had the panelists or their NRC colleagues periodically seen an immediate safety concern, even from a distance.

The last time the NRC saw an immediate safety concern and ordered an operating reactor to shut down was March 31, 1987 when the agency ordered the Unit 2 and 3 reactors at the Peach Bottom nuclear plant in Pennsylvania to be shut down (the Unit 1 reactor had already been permanently shut down). Dan Dorman and Ray Lorson did not join the NRC staff until 1991. Don Jackson did not come to the NRC until 2003. Of the four NRC panelists, only Bill Dean was with the agency the last time an immediate safety concern was spotted.

Yet there have been times since 1987 when immediate safety concerns have existed:

Davis-Besse Safety Blindspot

In the fall of 2001, the NRC staff drafted an order that would require the Davis-Besse nuclear plant to be shut down. To justify the order, the NRC staff assembled the strongest circumstantial case one could hope to build that an operating reactor was unsafe. The NRC staff evaluated the reactor against five criteria in Regulatory Guide 1.174 (RG 1.174). All five criteria had to be satisfied for a reactor to be considered safe. The NRC staff determined that one criterion was not met and the other four criteria were most likely not met. Absent dead bodies or a mushroom cloud, you cannot build a stronger case that an operating reactor is unsafe.

Fig. 4 (Source: Nuclear Regulatory Commission)

But NRC senior managers shelved the order and allowed Davis-Besse to continue operating. When the reactor finally shut down, workers discovered the reactor was less safe than the NRC staff had feared. Per the NRC, Davis-Besse came closer to a meltdown than any reactor since the Three Mile Island accident in March 1979 (much closer than Peach Bottom ventured in March 1987).

Worse still, when interviewed under oath by the NRC’s Office of the Inspector General, the NRC senior managers stood behind their decision. They claimed they needed absolute proof that an operating reactor was unsafe before they would order it shut down. Somehow, failing to meet five of five safety principles does not constitute absolute proof to the NRC. Perhaps not meeting eight or nine out of five safety principles would suffice.

Oconee Safety Blindspot

In June 2010, the NRC issued a confirmatory action letter (CAL) to the owner of the Oconee nuclear plant in South Carolina. The CAL required that the owner take fifteen steps to reduce risk of failure at the upriver Jocassee Dam (which was also owned by Oconee’s owner) and to lessen the flooding vulnerability at Oconee should the dam fail.

The NRC staff discovered that the failure rate for the Jocassee Dam was as high as other hazards that Oconee was protected against. Thus, failure of the dam could not be dismissed as incredible or overly speculative.

The NRC staff further estimated that if the Jocassee Dam failed, flooding at the Oconee site created a 100 percent chance of causing all three operating reactors to melt down, all cooling of the spent fuel pools to be lost, and all three reactor containments to fail.

The high risk of flooding causing three operating reactors to melt down prompted the NRC to issue the CAL to Oconee’s owner nine months before flooding caused three operating reactors at Fukushima to melt down.

The hazard was real enough to cause NRC to require the owner to take steps to lower the risk, but not real enough to warrant the reactors to shut down until the risk was better managed.

Most galling is the fact that the NRC withheld information about this hazard from the public. Their June 2010 CAL was issued in secret. When the NRC conducted their annual public meeting in the Oconee community in April 2011—about six weeks after flooding melted three operating reactors at Fukushima—they said nothing about the CAL being issued to better manage flooding vulnerabilities at Oconee. The public cannot trust an agency that withholds relevant information from them.

It may be true that the NRC would order an operating reactor to be shut down if it saw an immediate safety concern. But it’s been nearly thirty years since the NRC noticed an immediate safety concern at an operating reactor. Since then, the NRC has noticed very serious safety problems at Davis-Besse and Oconee, yet allowed those reactors to continue operating.

The Davis-Besse and Oconee cases occurred after the NRC adopted the ROP and its Action Matrix. None of the safety problems that led to the NRC staff drafting a shutdown order for Davis-Besse or issuing a CAL for flood protection problems at Oconee were considered in the ROP. Thus these safety problems were entirely invisible as far as the Action Matrix was concerned.

The NRC should not rely on a safety yardstick that ignores significant safety issues.

UCS’s Argument about Pilgrim

Because the NRC has demonstrated its ability to jettison safety standards when an operating reactor doesn’t measure up, and because it has not recently demonstrated an ability to spot an immediate safety concern, it is entirely reasonable for the community around Pilgrim to have anxiety about the plant’s known performance problems. Shutting down Pilgrim would lessen that anxiety.

Should public anxiety be used as a pretext for shutting down an operating reactor?

Absolutely not.

Instead, the public should have trust and confidence in the NRC to protect them from Pilgrim’s problems. But the NRC has not done much to warrant such trust and confidence. If public anxiety is high, it’s because public trust and confidence in the NRC is low.

Public trust and confidence in the NRC should be the proper context for a troubled reactor continuing to operate.

That proper context is missing.

The NRC must take steps to restore public trust and confidence. They should consistently establish and enforce safety regulations. NRC senior managers must stop looking for absolute proof that operating reactors are unsafe and instead look for absolute proof that operating reactors comply with federal safety regulations.

And when NRC senior managers see safety problems, they must disclose that finding to the public. Hiding such information, as they did with the flooding vulnerabilities at Oconee, provides the public with a distorted view. And such antics provide the public with zero reason to trust anything the NRC utters. When you cherry-pick what you say and when you say it, you stop being a credible authority.

If the NRC allows Pilgrim to continue operating and the reactor has an accident, will the agency be able to honestly look victims and survivors in the eye and say they did everything they could to protect them?

Will the FDA’s Picture of “Health” Match Ours?

UCS Blog - The Equation (text only) -

As we enter month two of 2017, our New Year’s resolutions of leading healthier lives might be starting to plateau. But that of course depends on how we are defining “healthy.” What’s healthy to me might not be the same kind of healthy to you. My vision of a healthy day done right might be eating a Sweetgreen salad for lunch and walking back and forth to the metro, while yours might entail a ten-mile morning run and a steak dinner.

What does the Food & Drug Administration (FDA) consider “healthy”? Well, the agency currently has an open comment period asking the public to weigh in on how it should redefine the term to stay up to date with evolving nutrition science. You would think that the FDA’s definition of “healthy” would be a bit more straightforward, since it has a wealth of consumption and nutrition data at its fingertips. However, in draft guidance to industry on the term “healthy,” the FDA has so far failed to include added sugar among the ingredients subject to an enforceable limit that a food must meet before it can bear a “healthy” claim, despite the scientific consensus surrounding added sugar’s role in chronic disease risk. And depending on who ends up being appointed to run the FDA, the definition of “healthy” could be scrapped completely if it’s deemed too burdensome for food manufacturers (more on that later).

What’s “healthy,” anyway?

Under the FDA’s current definition, in order to bear a “healthy” claim on a food package, a food must have at least 10 percent of the daily reference value (DRV) for at least one of vitamin A, vitamin C, iron, calcium, protein, or fiber, and must not have more than a certain limited amount of fat, saturated fat, sodium, and cholesterol. Unacceptably high levels of these ingredients, known as disqualifying levels, bar a food from being labeled as “healthy.” Notably absent from the list is added sugar.
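To make the structure of that two-part test concrete, here is a minimal sketch of the rule as described above; the threshold numbers are hypothetical placeholders for illustration only, since the FDA’s actual disqualifying levels vary by food category and are not spelled out here:

```python
# Illustrative sketch of the current "healthy" claim test described above.
# The disqualifying limits below are hypothetical placeholders, NOT the
# FDA's actual per-serving thresholds.
BENEFICIAL_NUTRIENTS = {"vitamin_a", "vitamin_c", "iron", "calcium", "protein", "fiber"}
DISQUALIFYING_LIMITS = {        # hypothetical per-serving limits
    "fat_g": 3.0,
    "saturated_fat_g": 1.0,
    "sodium_mg": 480.0,
    "cholesterol_mg": 60.0,
    # note: no entry for added sugar -- the gap the petition asks the FDA to close
}

def may_bear_healthy_claim(pct_drv, per_serving):
    """pct_drv: percent of DRV supplied for each beneficial nutrient.
    per_serving: amounts of fat, saturated fat, sodium, and cholesterol per serving."""
    has_beneficial = any(pct_drv.get(n, 0) >= 10 for n in BENEFICIAL_NUTRIENTS)
    under_limits = all(per_serving.get(k, 0) <= v for k, v in DISQUALIFYING_LIMITS.items())
    return has_beneficial and under_limits

# A sugary cereal fortified with iron passes, because added sugar is never checked.
print(may_bear_healthy_claim({"iron": 25}, {"fat_g": 1, "sodium_mg": 200, "added_sugar_g": 18}))  # True
```

The point of the sketch is simply that, under the current definition, a product loaded with added sugar can still qualify as “healthy” so long as it clears the other hurdles.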

How does this play out at the grocery store? Well, have you ever reached for a box of cereal with a big “healthy” claim on the front, only to find out that it has more sugar in a serving than you might like to eat in an entire day? This is entirely common, and especially concerning given the fact that these claims are allowed on packages for children as young as two years old. And it is these kinds of deceiving claims that contribute to the excess amount of added sugars that Americans consume every year.

The FDA must take further action to protect consumers from misleading food claims

That is why we submitted a citizen petition to the FDA last week to ask that the agency set a disqualifying level for added sugars that would apply to nutrient content and health claims, including the term “healthy.” Over 30,000 men and women across the country signed onto our petition in support of this measure!

It’s high time that the agency take action to protect consumers from misleading statements about the health of a product with regard to added sugar. There should be a clear limit on added sugars in products that food manufacturers deem “healthy,” to help consumers navigate a food environment that has become chock full of sugar. A brand new U.S. Department of Agriculture Economic Research Service report looking at trends in food and nutrient availability data revealed that Americans are still eating far too much added sugar: about 366 calories (23 teaspoons) per day, which is 83 percent higher than the Dietary Guidelines’ recommended limit of no more than 10 percent of calories (less than 200 calories or 12.5 teaspoons per day).
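For readers who want to check the arithmetic behind those figures, here is a minimal sketch; it assumes a 2,000-calorie reference diet and roughly 16 calories per teaspoon of added sugar, conversions the cited numbers imply rather than values stated in the report excerpt:

```python
# Rough check of the added-sugar figures cited above.
CALORIES_PER_TEASPOON = 16        # approximate calories per teaspoon of added sugar (assumption)
REFERENCE_DIET_CALORIES = 2000    # assumed reference diet behind the 10% guideline

actual_intake_cal = 366           # average daily added-sugar calories (USDA ERS figure cited above)
guideline_cal = 0.10 * REFERENCE_DIET_CALORIES                 # ~200 calories
excess_over_guideline = actual_intake_cal / guideline_cal - 1  # ~0.83, i.e. "83 percent higher"

print(f"Guideline: {guideline_cal:.0f} cal (~{guideline_cal / CALORIES_PER_TEASPOON:.1f} tsp)")
print(f"Actual:    {actual_intake_cal} cal (~{actual_intake_cal / CALORIES_PER_TEASPOON:.1f} tsp)")
print(f"Excess over guideline: {excess_over_guideline:.0%}")
```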

While President Trump’s “2 for 1” executive order will certainly make rulemaking an even tougher lift for agencies, as they’ll have to get rid of two rules for every new rule issued, the FDA should continue to build on its progress around added sugar. Just last May, the agency released its nutrition facts label revisions that created daily reference values (DRVs) for added sugar so that new labels will include a discrete line for added sugars beginning in July 2018.  Now that the FDA has set DRVs for added sugar, and overwhelming evidence—supported by leading medical and public health organizations like the American Heart Association, the American Academy of Pediatrics, and the World Health Organization—has illustrated that excessive added sugar consumption is linked to several chronic diseases, the FDA has the science on its side and the authority to add a disqualifying level for added sugar.

A strong FDA means a healthier America

The science certainly supports the FDA moving forward with this commonsense measure on added sugar, but the political reality is that the Trump administration seems to be fairly uninterested in science-informed policies so far. Last week began with scientists at agencies like the EPA and USDA being told by leadership not to communicate their taxpayer-funded scientific findings to the public, and that there would be a freeze on hiring, grants, and contracts at the EPA. And then earlier this week, President Trump signed an executive order requiring that agencies eliminate two rules for every new rule issued (which is likely illegal, according to UCS president Ken Kimmell). All of these directives have a chilling effect on federal scientists, with the “2 for 1” order forcing agencies to make impossible choices between protecting the public from one threat to their health and another.

The Trump administration’s cabinet selections haven’t been heartening, either. Whether it’s the climate-denying and EPA-suing Scott Pruitt or the agribusiness-supporting Sonny Perdue, it’s looking pretty clear that the corporate cabinet will favor industry talking points over actual science to inform policies. The FDA commissioner has yet to be nominated, and while this job usually goes to someone with a science background and an interest in protecting public health, the Trump administration appears to be focusing its search on individuals with experience working in the biotechnology industry, advised by venture capitalist Peter Thiel, who has some pretty radical ideas about how to run the FDA more like a Silicon Valley startup. Some of the names that have been mentioned as being in the running for FDA commissioner include Thiel’s associate Jim O’Neill, American Enterprise Institute fellow Dr. Scott Gottlieb, executive director of the Lewis Center for Healthcare Innovation and Technology Dr. Joseph Gulfo, and former biotechnology company executive Dr. Balaji Srinivasan.

This shortlist of men is riddled with conflicts of interest in their former and current ties to biotechnology companies, and features a man who thinks drugs should be approved if proven safe, regardless of efficacy (O’Neill), a man who has criticized the FDA for being too restrictive in its regulations (Gulfo), a man who has claimed that FDA regulations have nothing to do with health and are merely “safety theater” (Srinivasan), and a man who has accused the FDA of “evading the law” due to an overregulated drug approval process (Gottlieb). Note that none of these men have expertise in the food and nutrition space, and it seems like any regulation that inhibits the ability of drug or food manufacturers to approve and introduce an endless stream of new drugs and food additives will be unpopular under this administration.

Whether it’s one of these men or not, whoever is selected to lead the FDA must respect the role of public servant and abide by the agency’s mission to first and foremost “protect public health,” guided by science, not by drug and food manufacturers’ interest in increasing their quarterly earnings. In this case, there’s only one way to define a “healthy” public, and that’s one whose safety and well-being is protected over the profits of Big Pharma and Big Food. Taking further action to regulate added sugar amounts on front-of-package labels would be a strong, science-backed policy maneuver that would advance the crucial fight against obesity and help all Americans make clearer decisions to improve their health. That’s my kind of “healthy.”

Join UCS and urge the FDA to include a limit for added sugar in its “healthy” definition by submitting a comment on regulations.gov before April 26.

Massachusetts Moves to Limit Pollution from Transportation: 5 Things you Should Know

UCS Blog - The Equation (text only) -

The state of Massachusetts has been an important leader in the fight to protect our climate from global warming. But there’s one area where Massachusetts continues to struggle: controlling pollution from transportation. New limits on transportation emissions now under consideration by the Massachusetts Department of Environmental Protection (DEP) could determine whether the Commonwealth can stay on track to achieve our climate mandates, or whether transportation emissions will undermine the progress the state has been able to make building a clean energy future.


Transportation and the Global Warming Solution Act

The Bay State has passed one of the strongest climate laws in the country, the Global Warming Solutions Act (GWSA), which requires the state to reduce emissions throughout our economy to at least 80 percent below 1990 levels by 2050. Massachusetts also leads the nation in energy efficiency, and last year passed an energy bill that will see the largest-ever procurement of offshore wind in the United States.

Massachusetts has been able to make significant progress on these issues because the people of the Commonwealth care a lot about climate, because our state is uniquely threatened by the impacts of sea level rise and other climate change impacts, and because our state boasts a proud bipartisan tradition of leadership on climate and energy.

But transportation has been a challenge for Massachusetts. Pollution from our cars and trucks is the largest source of emissions in the state, and it’s the one area of our economy where emissions have actually grown since 1990, as increased total driving in the state has outpaced gains in fuel efficiency.

Achieving significant reductions in transportation emissions basically boils down to using a lot less oil. The good news is that we know how to do this! More efficient cars, cleaner fuels, electric vehicles, and a transportation system that gets us where we need to go without spending so much time behind the wheel, can all help cut pollution from transportation.

Kain v. Department of Environmental Protection

This week, Massachusetts will take an important step towards tackling the pollution from transportation, as the state’s Department of Environmental Protection (DEP) considers new limits on emissions in the sector.

These proposed regulations are in response to last year’s landmark decision in Kain v. Department of Environmental Protection, in which the Massachusetts Supreme Judicial Court ruled that the state must set mandatory and enforceable limits on the total mass of pollution emitted within the state from different sources, including transportation. So how did the DEP do? Here’s what you need to know:

#1: DEP is proposing to limit most, but not all, emissions in the transportation sector.

The proposed DEP regulation covers the “surface transportation system” within Massachusetts, which means emissions that come from passenger vehicles, light and heavy duty trucks, and transit systems. The new regulations do not cover aviation or marine transportation. All told, that means that approximately 85 percent of Massachusetts transportation emissions are covered by this regulation.

Leaving aviation and marine travel out of the current regulation may make sense, given that these areas present different administrative challenges. In the long run, however, Massachusetts will need to make progress in these areas as well, and the state should consider additional regulations that will establish limits on boats and airplanes.

#2: The proposed limits are ambitious.

Overall the state is proposing to cut emissions in the transportation sector by approximately 1.87 percent per year for each of the three years covered by this regulation (2018, 2019 and 2020). That’s pretty challenging! Massachusetts has not been able to achieve a 1.87 percent reduction in transportation emissions for three consecutive years since 1990-1993, 25 years ago.

But, while ambitious, a 1.87 percent linear decline isn’t quite enough to achieve our long-term climate goals. Overall, the DEP proposal would put the state on track to achieve a 35% reduction by 2030 and a 57% reduction by 2050. So while these regulations represent an ambitious effort to begin to get transportation emissions under control, we’ll need to accelerate progress over the coming years to achieve our climate mandates.
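As a back-of-the-envelope way to see why a constant rate of decline eventually falls short, here is a minimal sketch of how a fixed annual percentage reduction compounds over time. The starting year and compounding convention are illustrative assumptions, and the reductions are measured against the starting year rather than the 1990 baseline the GWSA uses, so the numbers are not the DEP’s own projections:

```python
def cumulative_reduction(annual_rate, years):
    """Fraction of starting-year emissions eliminated after `years` of a
    constant annual percentage decline (compounding year over year)."""
    return 1 - (1 - annual_rate) ** years

rate = 0.0187  # ~1.87 percent per year, as in the proposed DEP regulation
for years in (3, 12, 32):  # the 2018-2020 regulation window, then out to ~2030 and ~2050
    print(f"{years:>2} years at {rate:.2%}/yr -> "
          f"{cumulative_reduction(rate, years):.0%} below the starting year")
```

Even sustained for decades, the same annual pace delivers reductions well short of the long-term targets described above, which is consistent with the point that progress will need to accelerate.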

#3: Achieving these limits will require additional policies.

The two biggest challenges with this regulation are that it isn’t clear how we are going to achieve these limits, and it isn’t clear what happens if we fail to achieve them.

Right now, Massachusetts is relying heavily on federal and regional policies to reduce emissions in transportation. In fact, 93 percent of the projected emission reductions in the state’s most recent Clean Energy and Climate Protection plan come from National Greenhouse Gas and Fuel Economy Standards that, if fully implemented, will approximately double the fuel efficiency of new vehicles by 2025. These standards are now very much under threat from a combination of automaker intransigence and the current administration in Washington.

The new federal administration means that Massachusetts and other states are probably on our own when it comes to achieving our climate limits.  Massachusetts needs to think big about new policies that will help our residents and businesses drive less or purchase cleaner vehicles. Aside from reducing emissions in state fleets, the DEP is not yet proposing new policies to achieve the limits that they lay out in this regulation. But they are going to have to if they want to be successful.

#4: It’s not clear how these regulations will be enforced.

What happens if we go over our limit? The regulations are not clear on this very important point.

The most straightforward way to make the limits on transportation emissions enforceable is through a requirement that polluters purchase allowances from a limited pool (or cap). This market-based approach would build on the successful model of the Regional Greenhouse Gas Initiative, which has been really effective in reducing emissions while promoting economic growth in the electricity sector. RGGI is also an important source of funding for Massachusetts’ clean energy and efficiency programs. A market-based approach to ensuring emission reductions is explicitly authorized by Section 7 of the GWSA.

Without some kind of mechanism to ensure that the state actually achieves the reductions, this regulation will not be the kind of mandatory and enforceable limits required by the Supreme Judicial Court.

#5: Achieving long-term reductions in the transportation sector will require regulations that extend past 2020.

One major challenge facing DEP throughout this whole process is that the GWSA regulations it is in charge of implementing sunset by statute in 2021. Achieving short-term reductions is challenging in the transportation sector, as vehicles, community development patterns, and transportation infrastructure investments all change slowly.

A more sensible approach would be for the state to establish limits through 2030. Several proposals in the Massachusetts legislature would eliminate the 2020 sunset and allow DEP to consider limits on a longer time horizon.

Working together, both parties and all three branches of government in Massachusetts have made significant progress reducing emissions from electricity generation and increasing the efficiency of our homes. Massachusetts’ policies to promote solar energy, for example, have allowed the technology to explode into the mainstream, providing thousands of Massachusetts residents with affordable zero-emission energy. With the growth of new technologies such as electric vehicles, new transportation systems such as car sharing, and ever-increasing use of public transportation and cycling in the Bay State, we have more options than ever before to promote clean transportation. It is time for policy leaders in Massachusetts to bring the same urgency and focus that has led to so much success in the electric sector to the task of reducing pollution from transportation.

 

Climate Change, Resilience, and the Future of Food

UCS Blog - The Equation (text only) -

The United States food system has proven remarkably adaptable over the last 150 years, producing an abundant supply of food, feed, and fiber crops for national and international markets amidst dynamic social change, and despite dramatic natural resource variability across North America.

The story of American agriculture’s rise to world class status is usually told with technology in the hero’s role. In the typical story, the major “revolutions” in the industrialization of American agriculture came about as the result of one or more technological innovations—such as mechanical harvesters, hybrid corn and more feed-efficient livestock, chemical pesticides and fertilizers, and genetic engineering. As awareness of the current and potential costs of climate change to agriculture and food systems increases, this singular focus on technological solutions continues through widespread enthusiasm for sustainable intensification.

Public investment: The true hero of the story

Rarely acknowledged is the real, underlying reason for the success of industrial agriculture: the continuous intended and unintended investment of public resources to develop, support, promote, and enable the industrial food system. These resources have taken many forms:

  • Financial resources such as direct and indirect payments designed to stabilize production, recover from disasters, and reduce environmental harms
  • Public financing of the education, research and development programs and institutions that serve the agricultural-industrial complex
  • Unintended human resource subsidies as farm families struggle to balance the demands of full-time farming with full-time off-farm work to maintain family well-being in the face of steadily declining farm profitability
  • Unintended natural resource subsidies in the form of degraded soil, water, and air quality, biodiversity, and ecosystem services
  • Unintended social resource subsidies in the form of degraded health and well-being of rural communities both at home and abroad

Resilient Agriculture grower Jim Koan explains to USDA-FSA administrator Val Dolcini how FSA programs have helped him reduce climate risk on his 500 acre organic farm located near Flushing, MI.

Although the costs of industrial food and the benefits of sustainable food systems are widely recognized, and despite new evidence that the global industrial food system is uniquely vulnerable to climate change and other 21st-century challenges, national and international agricultural policy continues to support public investment in an unsustainable global industrial food system.

Sustainable agriculture is the future of agriculture

Sustainable intensification, the newest chapter in the industrialization of agriculture, is just business as usual for many actors in the global industrial food system. Sustainable intensification rhetoric often promotes the widely discredited myth that low agricultural productivity is the root cause of world hunger and suggests that new resource-efficient technologies that reduce the environmental degradation associated with agriculture are the solution to global food security.

My work to apply resilience theory to questions of agricultural and food system sustainability suggests that sustainable intensification, rather than advancing sustainability and the broader public good, actually keeps us locked into a clearly maladaptive path. Measures to reduce the environmental damages associated with industrial practices are welcome and needed, but agricultural innovations that do not also regenerate the natural, human, and social resources degraded by 150 years of industrialism will do little to enhance the climate resilience of the global food system. In contrast, sustainable agriculture and food systems offer successful models of locally-adapted, climate-resilient alternatives that we can build upon to put humanity on a path to a sustainable and resilient food future.


Texas ranchers Gary and Linda Price produce cattle for the source-verified wholesale market on 2000 acres of restored tallgrass prairie in Blooming Grove. Credit: Karl Wolfshohl

Local and regional actions, supported by enabling policies at local, regional, national, and international levels, can be used to enhance the sustainability and resilience of existing agriculture and food systems. My research indicates that we can use existing USDA programs, integrative initiatives, and international partnerships to address six significant levers of change:

  1. Redirect USDA credit and crop insurance investments through programs such as the Farm Service Agency’s (FSA) Direct Operating Loans Program and the Risk Management Agency’s Whole Farm Revenue Protection Program to increase support for farmers and ranchers transitioning to or already using ecosystem-based, diversified production and marketing practices, especially small and mid-sized agricultural businesses supplying local and regional markets with minimally-processed, nutrient dense foods.
  2. Expand incentives and rewards for producers who use production practices that enhance sustainability and resilience of the U.S. food system through the protection and regeneration of ecosystem services. Programs such as the Natural Resources Conservation Service’s Regional Conservation Partnership Program, and the FSA’s Conservation Programs could be reoriented to achieve these goals.
  3. Redirect economic development investments, such as those funded by the National Institute of Food and Agriculture’s (NIFA’s) Community Food Projects Program and the Rural Business Development Grants Program, to promote the re-regionalization of the U.S. food system.
  4. Redirect agricultural education, research, and extension investments to promote the study, investigation, and development of sustainable and resilient agroecosystems as a core mission of the land-grant university system. This goal can be addressed through the expansion of existing programs such as NIFA’s Sustainable Agriculture Research and Education Program, Higher Education Programs, and the Know Your Farmer, Know Your Food Program.
  5. Expand nutrition assistance and education programs that support sustainable and resilient regional food systems, such as the Farm to School Grants and Seniors Farmers Market Nutrition Program.
  6. Redirect U.S. international development investments such as those made through the Global Partnership on Nutrient Management, USAID Sustainable Agriculture and Natural Resource Management Innovation Lab, and Feed the Future to support collaborative, place-based development of sustainable and resilient regional food systems.

Farmers markets, like this one in Raleigh, NC, increase consumer access to fresh, locally-produced farm products and help build relationships between producers and consumers. Credit: Climate Listening Project

The global industrial food system faces unprecedented challenges that are projected to increase in intensity in the years ahead.  Persistent hunger and poverty, growing human population, a degraded and eroding natural resource base, failing agricultural communities, increasing and shifting consumer demands, and the uncertainties of climate change demand a reexamination of the basic underlying assumptions of industrialism. We must accept that we cannot burn or build our way to global food security, that we cannot depend on human ingenuity alone, but must finally acknowledge the fundamental role that healthy ecosystems play in human well-being. We know enough to begin now to cultivate a new kind of food system, a sustainable food system that has the capacity to produce global food security as it protects us from the inevitable challenges ahead.

 

Laura Lengnick is an award-winning soil scientist who has explored agricultural sustainability for more than 30 years as a researcher, policy-maker, educator, and farmer.  Her work in sustainable farming systems was nationally recognized with a USDA Secretary’s Honor Award and she contributed to the 3rd National Climate Assessment as a lead author of the USDA report Climate Change and U.S. Agriculture: Effects and Adaptation. In 2016, Laura launched Cultivating Resilience, LLC, a private consulting firm offering ecosystem-based climate risk management services to government, business, and communities. Her book, Resilient Agriculture: Cultivating Food Systems for a Changing Climate (New Society Publishers, 2015), examines climate change, resilience and the future of food through the adaptation stories of 25 award-winning sustainable producers located across the U.S. You can learn more about Laura and her work at http://www.cultivatingresilience.com

 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

These Investments in Food and Farm Research Will Pay Us Back—Urban and Rural Alike

UCS Blog - The Equation (text only) -

This fall, the Department of Agriculture (USDA) sent out an important announcement that went largely unnoticed (those of us interested in food and agriculture were, and continue to be, preoccupied with other things). Namely, the USDA’s National Institute of Food and Agriculture (NIFA) reported recent investments in research designed to improve food, fiber, and fuel production while protecting the natural resources that farms and communities depend on and recognizing the pivotal importance of the farmer’s bottom line.

To refresh your memory, NIFA is part of the USDA’s integral Research, Education, and Economics Mission Area (which also includes the Agricultural Research Service (ARS), Economic Research Service (ERS), and National Agricultural Statistics Service (NASS)). Before this news falls too far back in the rearview mirror, let’s take a moment to recognize its importance in strengthening America—as a whole.

In public agroecological research, spare change can lead to big change

The $6.7 million investment by NIFA that I am so excited about came via the Agriculture and Food Research Initiative (AFRI) Foundational Bioenergy, Natural Resources, and Environment (BNRE) program area. While this is pennies compared to the $156 billion USDA budget, it’s still big.

For one thing, investment in agricultural research and development just tends to pay off. Further, this particular funding program is relatively young and is already helping to fill a gap that is important and urgent, as noted in a statement signed by over 400 PhD experts. While much agricultural research has focused largely on yields, this program encourages research on how anything from “soil, water and sun to plants, animals and people, interact with and affect food production.” It requires attention to economic, societal and environmental benefits to uncover solutions that don’t unintentionally create costly consequences.

Critically, because the current agricultural system—even in a high production year—doesn’t ensure economic success, BNRE focuses on solutions that not only work for the environment but can also provide better economic incentives and options for farmers and for rural America.

The recent announcement explained how BNRE is enabling progress on working lands that range from grasslands to croplands to forests. Details on new projects (including a powerful workshop on the critical role of soils) are provided here. A few highlights:

In grazing lands ranging from the Chihuahuan Desert to the Corn Belt to Florida, researchers are

  • Considering how adding legumes to pasture can reduce greenhouse gas emissions and nutrient loss, increase soil carbon, improve access to local healthy food, and benefit farmers
  • Developing ways to convert lands dominated by invasive species to diverse grasslands that improve pollinator health, biodiversity, and cattle and landowner well-being
  • Evaluating how shrub control methods affect wildlife, plants, livestock productivity, and communities, and searching for ways to use keystone species to speed up restoration
  • Using native grasses and cover crops to reduce fertilizer needs, maximize profits and foster environmental benefits
  • Finding the best grazing and fire management strategies to improve water use, reduce climate impacts, and increase forage production and farm economics

In rural and urban farms from the Midwest to California, teams are

  • Working with farmers to optimize configurations of diversified farms to improve insect management and meet growing demands for local produce
  • Investigating how using multiple species of crops in fields (polycultures versus monocultures) affects yields, economics, weeds, pests, and drought resilience
  • Studying how urban garden management affects biodiversity, pests, pollinators, and food access
  • Evaluating how management practices in farms and gardens have revitalized industrial urban areas, and investigating the role of soil health in sustained improvements

The unique significance of public funding

One of the most important aspects of this new research is quite simply that it is funded publicly.

While the private sector has undoubtedly supported some needed areas of research, private sector funding cannot be expected to fill all the research gaps. When research investments can be recovered through products and profits, for example, private funding is logical. However, as reported in a recent ERS report, there is often little or no incentive for private investment into research geared toward valuable outcomes that are harder to put a price on (such as reduced reliance on fertilizers, or cleaner air and water). To enable this type of research, public funding is imperative.

Even beyond the need for public funds to fill certain research gaps, there are broader reasons why a strong contribution of public funds to food and farm research is critical. As a recent Policy Pennings post noted, public funding supports academic freedom, independent analysis, and research targeted toward the good of farmers, farm families, and the public. As public institutions become squeezed for resources and private sector funding starts playing a bigger role, additional risks—such as potential bias in academic research—can become a concern.

Taking strides to protect the US lead in public investments for agriculture

The role of the US as a world leader in public investment in agriculture, with all the benefits that can accrue as a result, is at risk.

Historically, the public sector made up the majority of total US agricultural funding (50% between 1970 and 2008), and public funding from the US made up the largest portion of the global investment (20-23% between 1990 and 2006). But recent research has documented that the US is falling behind. The US has cut back on public funding for agricultural R&D while private sector contributions have grown, bringing public sector contributions down to less than 30% of the total, behind private investments. Also, as the US has reduced public investments, other countries have ramped up. As a result, the US share of global public agricultural funds has dropped significantly, to just 13%.

It is time to up our game. And, by the way, the full burden doesn’t have to be on USDA. Other agencies have made relevant investments, which could be built upon. For example, the Department of Energy recently announced $35 million for new projects that could help develop new crops to replenish soil health, conserve water, and reduce climate emissions.

Whether we live in corn country or among skyscrapers, research on agriculture that jointly considers the economy, society, and environment is a smart investment. Thanks to programs like BNRE, to the committed administrators and staff in NIFA and across the USDA who have made such programs possible, and to the researchers taking on some of the work outlined above, we are on the right path.

P.S. If you are interested in taking action to support public funding for agriculture, and agroecology, click here!

What’s Congress Doing to our Methane Waste Regulations?

UCS Blog - The Equation (text only) -

Yesterday I spoke at a forum in the Capitol on the Bureau of Land Management’s Methane Waste Rule, an event organized by Democratic members of the House Natural Resources Committee. I offered testimony on a panel of experts including a former BLM official involved in developing the rule, a nurse speaking about the public health benefits of the rule, a scientist from Clean Air Task Force who discussed the Colorado rules on which parts of the BLM rule were modeled, and a pastor who talked about the moral imperative to use natural resources responsibly, and limit the harms caused by climate change.

I joined a panel of experts testifying on the BLM methane waste rule (I am wearing a red tie).

Four Democratic representatives asked questions and made statements. These included Congressman Grijalva from Arizona, the ranking member of the Natural Resources Committee, along with Congressman Huffman (CA), Congressman Lowenthal (CA), and Congressman McEachin (VA).

Representatives (from left to right) Lowenthal, Huffman, Grijalva, and McEachin

Republicans are threatening to eliminate the BLM Methane regulation using an obscure, radical, and rarely used congressional trick called the Congressional Review Act (CRA).

The CRA allows Congress, with a simple majority, to completely revoke any rules made in the last 6 months of the Obama administration. It is a blunt tool that would revoke regulations that went through extensive stakeholder review, used evidence-based science, had public notice and comment, and took a few years on average to be finalized.

In addition, it stipulates that any rule that is similar to the rule can NEVER be done again, unless Congress gives explicit permission–thus salting the earth.

The BLM methane regulation updates rules issued in the Carter administration governing how oil and gas are produced on Federal and tribal land. The new rules will reduce leaks, venting, and flaring of natural gas, which not only wastes a resource that belongs to the American people, but also turns it into a health and climate hazard.

Apparently, the oil industry likes the 1979 vintage rules better, and the new Congress is rushing to do their bidding, quickly moving to revoke a rule that was three years in the making. But rolling back the regulatory clock to 1979 would be as dumb as removing requirements for airbags and anti-lock brakes from modern cars.

A lot has changed in the last 38 years, including the rise of fracking and the associated methane pollution from tight oil production. Rapidly reducing methane pollution–the leading non-CO2 pollutant responsible for climate change–is more urgent than ever before.

The last few decades have also seen new technologies to measure, manage, and use natural resources responsibly. An up-to-date regulatory framework for the oil and gas industry is essential to holding a massively polluting industry accountable.

The CRA is touted as a tool to exert control over unauthorized, unnecessary, or unreasonable agency regulation, but the methane and waste prevention rule is clearly authorized, necessary and reasonable.

Former Counselor to the Director of the Bureau of Land Management Alexandra Teitz explains in her testimony that BLM is required by law to prevent waste and ensure that resource extraction on public lands is conducted in a safe and responsible manner. The Government Accountability Office (GAO) estimated that State and Federal taxpayers are losing as much as $23 million per year in royalty revenue due to this waste, and GAO found that the BLM needed to update its rules to address this waste.

The BLM worked on these rules for three years, holding numerous hearings around the country. They received more than 300,000 public comments and made changes to the final regulation based on this feedback. As Dr. David McCabe, Senior Scientist at the Clean Air Task Force, explained in his testimony, the waste rule was modeled on policies already implemented in Colorado, Wyoming, and North Dakota.

As these states’ experience shows, sensible up-to-date standards work to cut pollution and waste, and their requirements are easily implemented. These rules are not going to stop the oil industry from drilling for oil and gas; they just set reasonable standards of performance that reflect the current best practices modeled in states.

Responsible industries recognize that an up-to-date regulatory framework is necessary to protect the public and ensure that irresponsible actions by a few bad actors do not tarnish the whole industry. Cars, trucks and even appliances are subject to numerous standards that ensure that as technology changes, so do requirements for safety, pollution and efficiency.

The history of oil and gas extraction is filled with egregious examples illustrating the need for strong regulations to protect the public, and it is especially obvious that oil companies operating on public lands, who are extracting resources that belong to the American people, should be held to reasonable standards to avoid waste and unnecessary pollution.

The same day I was speaking to House Democrats, Former ExxonMobil CEO Rex Tillerson was being narrowly confirmed as Secretary of State and Jack Gerard of the American Petroleum Institute was speaking at a hearing on regulations in the Senate.

Under cover of the maelstrom that has been DC these last couple of weeks, Mr. Gerard invented some alternative facts and head-spinning doublespeak about the CRA. API’s press release on Mr. Gerard’s testimony was titled “smart, science-based regulation needed to advance America’s energy renaissance and jobs.” Apparently even API knows that is what people expect and demand. But what Mr. Gerard actually said was:

This week, we support the efforts of Congress as it takes the first step to pull back a number of these ill-considered and hasty regulations under the CRA. These include Section 1504 of Dodd-Frank, which places U.S.-based energy companies at a competitive disadvantage in the world marketplace, and BLM’s methane regulations, which are technically flawed and redundant to state regulation. Furthermore, we look forward to the anticipated CRA resolution on EPA’s redundant and unnecessary Risk Management Program rulemaking.

Despite Mr. Gerard’s doublespeak, the CRA has the opposite effect – killing smart, science-based regulations and blocking agencies from issuing any similar updates in the future, unless Congress passes new legislation specifically authorizing it.

The CRA is the opposite of a smart science-based regulation; it is a dirty trick that Congress can use to do the oil industry’s bidding.

What Can “Local” Food Do?

UCS Blog - The Equation (text only) -

What does “local food” mean? Most of us think of local food as something that was grown nearby geographically, although the distances can vary a lot.

We also tend to make a lot of assumptions about what local food can do. For example, we think of “local” food as a more sustainable alternative to the global, industrial food system that produces lots of food, but is also environmentally destructive, makes people sick, and leaves many hungry.

Thinking critically about the role of local food in creating more sustainable food systems.

Supporters of local food often assume that it’s fresher, more nutritious, and that it’s better for farm and other food system workers, the environment, and local communities. One of the themes of my research on food systems has been that we need to question assumptions like these, and to separate as much as possible our assumptions of how the world is, from our goals for how we think it should be. One of the biggest challenges of local food is disentangling these two kinds of assumptions.

Local food can do a lot to improve our food system, but our assumptions about what it’s doing may or may not be true in any specific case, and if they aren’t tested, they can fool us (what I call drinking green Kool-Aid®), and enable corporate greenwash. This means our food choices won’t be helping change the food system the way we hope they will, and can even work in the opposite direction.

So, we need to keep asking questions: What are our specific goals for a more sustainable alternative to the global industrial food system? Is promoting local food helping us to make progress toward those goals? Is “local” a good indicator of progress toward those goals? How can we adjust our actions and policies, and the indicators we use to measure them, to make more progress? I’ll give a few examples of how this works, from our research in Santa Barbara County, California.

Local food, transportation, and climate change

The effect of localizing fruit and vegetable consumption in Santa Barbara County, California.

We often assume that because local food doesn’t travel very far to get to us, it produces fewer greenhouse gas emissions (GHGE) overall, because of less transportation. So, a question we asked in our Santa Barbara research was, “Is reducing food miles a good way to reduce GHGE?”

Santa Barbara County (SBC) is a prime example of the missed potential for local food; despite having an active local food movement, 95% of fruits and vegetables consumed in 2008 were imported from outside the county, while 99% of the more than $1 billion worth (2.36 billion pounds) of vegetables and fruits grown in Santa Barbara County in 2008 were exported.

To see what contribution localization could make to reducing GHGE, we modeled the effect on GHGE if all fruits and vegetables consumed in the county were grown in the county. We found that this would save each household only 0.058 MT of GHGE per year, or about 9% of the average U.S. household’s annual GHGE for produce. However, that amounts to only about 0.7% of a U.S. household’s total GHGE for food, and less than 0.1% of total U.S. GHGE per person.
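
The relative shares quoted above follow from straightforward division. As a rough illustration, the sketch below reproduces them in Python; the household baseline values are approximations assumed for illustration, not figures from the underlying study.

    # Rough illustration of the relative shares quoted above. The 0.058 MT
    # savings figure comes from this post; the household baselines below are
    # approximate values assumed for illustration, not study outputs.
    localization_savings = 0.058   # MT CO2e saved per household per year (from the post)
    produce_ghge = 0.64            # assumed household GHGE for produce, MT CO2e/year
    food_ghge = 8.3                # assumed household GHGE for all food, MT CO2e/year

    print(f"Share of produce emissions: {localization_savings / produce_ghge:.0%}")   # ~9%
    print(f"Share of total food emissions: {localization_savings / food_ghge:.1%}")   # ~0.7%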

In fact, most GHGE from food are from production, especially of animal foods. So if fighting climate change is a goal, maybe we need to look beyond localization. For example, the only life cycle assessment of the complete US food system found that eliminating meat and dairy from our diets just one day a week could reduce GHGE more than totally localizing the entire food system.

What about food gardens, food waste, and composting?

You can’t get more local than growing food in your home, community, or school garden. So, we modeled the effect of converting an area of lawn to a household vegetable garden in Santa Barbara County, and composting household organic waste at home for use in the vegetable garden. We found that gardens reduced GHGE by about 2 kg per kg of vegetable, compared to households with no gardens, purchasing all their vegetables, an 82% reduction in GHGE. And if 50% of single-family housing units in Santa Barbara County had a 200 square foot garden, they could contribute 3.3% of the official county GHGE reduction target, and if scaled to the state level, 7.8% of California’s target.
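
For a rough sense of scale, the sketch below applies the roughly 2 kg of CO2e saved per kilogram of home-grown vegetables to a hypothetical number of gardens; the yield per garden and the number of participating households are placeholders assumed for illustration, so this does not reproduce the study’s 3.3% and 7.8% results.

    # Rough scaling sketch: apply the ~2 kg CO2e saved per kg of home-grown
    # vegetables (from the post) to many gardens. The yield per garden and the
    # number of gardens are hypothetical placeholders, not study inputs.
    SAVINGS_PER_KG = 2.0        # kg CO2e avoided per kg of vegetables grown at home
    YIELD_PER_GARDEN = 100.0    # hypothetical annual yield of one 200 sq ft garden, kg
    N_GARDENS = 50_000          # hypothetical number of participating households

    per_garden_kg = SAVINGS_PER_KG * YIELD_PER_GARDEN    # kg CO2e per garden per year
    total_mt = per_garden_kg * N_GARDENS / 1000.0        # metric tons CO2e per year

    print(f"Per garden: {per_garden_kg:.0f} kg CO2e avoided per year")
    print(f"All gardens combined: {total_mt:,.0f} MT CO2e avoided per year")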

We also looked into the effect of the way household organic waste was managed, since this accounted for the largest portion of garden emissions savings, even greater than the emissions savings from reducing purchased vegetables. We found that if landfills that efficiently captured and burned methane for energy and efficient aerobic composting operations were an option, gardeners could have the greatest emission reductions by exporting their organic waste to those operations. They could then import the compost, rather than composting at home, so gardeners need to ask questions about their options for processing their organic waste—it may be more climate-friendly to advocate for municipal composting facilities, rather than the more local option of composting on site.

What about the bottom line?

Wesley Sleight and Anna Breaux, founders of Farmer Direct Produce local food hub

Can local food be economically profitable? Local food hubs that consolidate local farm harvests and redistribute them are an important tool for localizing food. But when they try to scale up volume to have a larger impact and more revenue, they need to adapt to the dominant industrial food system, from infrastructure to economics. This can compromise their goals, because there are often tradeoffs among environmental, social, and economic aspects of sustainability. Can local food be economically viable while prioritizing people and the environment?

In our case study of a local food hub in Santa Barbara, we found that the key to success in meeting the goals of environmental sustainability and improved community nutrition was prioritizing those over the goal of economic profit, while still being economically viable.

Helping local food do more

On September 28, 2016, Senator Debbie Stabenow [D-MI] introduced S.3420, the Urban Agriculture Act of 2016. It includes support for a wide range of urban agriculture, from community gardens to technology intensive methods like aeroponics, based on the assumption that these will support local food infrastructure and economies, better nutrition, and environmental sustainability.

This bill is timely, as urban agriculture has become a popular form of local food production. For example, in our survey of Santa Barbara County residents, we found that the majority favored not building on land used for urban agriculture.

I think one of the strongest parts of this bill is the provision calling for research on the funded projects. This means asking if the goals of urban agriculture are actually being promoted, and providing information for improving them.

As our research has demonstrated, while local food systems can do a lot to promote more sustainable alternatives to the industrial system, we need to keep asking questions to ensure that our good intentions aren’t unintentionally compromised. In many cases other actions, such as changing production practices, and especially changing diets, may be more effective, or are needed to complement localization.

 

Bio: David Arthur Cleveland is Research Professor in the Environmental Studies Program and the Department of Geography at the University of California, Santa Barbara. He is a human ecologist whose research and teaching focuses on sustainable, small-scale agrifood systems, including work with small-scale farmers and gardeners around the world. He is currently researching the potential for agrifood system localization and diet change, to improve nutrition, reduce greenhouse gas emissions, and promote food and climate justice, in California, the US, and globally. His latest book is Balancing on a Planet: The Future of Food and Agriculture.

For copies of studies by David Cleveland not available on his website, please email him.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.
