Combined UCS Blogs

The Man Who Sued the EPA is Now Running It. What Does That Mean for the Environment?

UCS Blog - The Equation (text only) -

Voting largely along party lines, Congress just confirmed Scott Pruitt as Administrator of the Environmental Protection Agency (EPA)—an attorney who has spent his professional career suing the EPA to stop the agency from performing its fundamental mission of ensuring clean air and water for all Americans. This confirmation marks a sharp break with precedent; most EPA Administrators from both parties have come to the office with a demonstrated commitment to the EPA’s mission.

One might even say that this vote signals the end of an era of bipartisan congressional support for a strong federal role in protecting our environment, as this newly confirmed Administrator is likely to dismantle the safeguards that both parties have supported since the 1970s.

What that means for all of us who care about clean air and water and the protection of our environment is this: It is up to us to monitor carefully what happens next, and to be prepared to spring into action as needed.

Here are some of the key developments I’m watching for:

Will Scott Pruitt recuse himself?

As repeatedly noted in his nomination hearing, Pruitt has represented the State of Oklahoma in numerous lawsuits against the EPA. Many of these cases are still active today, directed at major EPA regulations, including the Clean Power Plan (which limits carbon emissions from power plants); national air quality standards; mercury emissions from coal plants; methane limits for oil and natural gas extraction; and a Clean Water Act rule that clarifies federal jurisdiction over bodies of water.

During the nomination hearing, Pruitt did not commit to recusing himself from these cases, but he did say he would rely on advice from the EPA ethics counsel. Common sense tells us that he cannot possibly be impartial on these issues, and conflicts of interest abound. For example, the state attorneys general who joined him in the suit against the Clean Power Plan have written a letter to the Trump Administration, asking the President to issue an executive order declaring that the rule is unlawful. Responding to this request would, in the normal course of business, require EPA input, since it is an EPA regulation. How can Scott Pruitt possibly participate in any review of that request given that, just a few weeks ago, he himself was one of the attorneys general making this claim?

He must recuse himself, as thirty senators have made clear in a recent letter.

Will Scott Pruitt cut federal law enforcement?

As a candidate, Mr. Trump pledged to dismantle the EPA. He lacks a filibuster-proof majority to change the laws that underpin the EPA's work, such as the Clean Air Act and the Clean Water Act. But he could cripple the EPA with budget cuts, which are much harder for a minority to stop.

By wide margins, most Americans favor enforcement of laws that protect our air and water. Cutting EPA enforcement will therefore be unpopular—but Scott Pruitt is likely to argue that we can rely on states to enforce environmental laws, so cutting the EPA’s budget won’t do any real harm.

This is a dangerous myth.

First, state environmental agencies are already strapped; having served as a state environmental commissioner, I know this from personal experience. They typically lack the technical experts employed at the EPA and are in no position to take on additional enforcement responsibilities shed by the federal agency.

In Massachusetts where I served, for example, my former agency’s staff was cut nearly in half between 2002 and 2012 due to budget cuts, even as the agency’s responsibilities grew. That occurred in a state well known for its strong commitment to environmental protection. As a result, my agency was forced to cut back on important and effective programs, such as water sampling to locate sources of bacteria that pollute rivers. If the EPA’s budget is cut, it will mean even fewer resources for states, because states now receive a significant share of the EPA’s budget to cover enforcement activities.

Second, state environmental agencies sometimes experience political pressure against enforcement that might harm a large employer or impose significant costs on residents. We saw some of this in play in Flint, Michigan, where a state agency did not enforce a law requiring corrosion treatment of pipes to reduce lead contamination; it took an EPA staffer and outside scientists, as well as the residents themselves, to blow the whistle on lax state enforcement.

Third, states are not equipped to deal with the widespread problem of interstate pollution. To cite one of the most egregious examples, the state of Maryland could shut down virtually all in-state sources of air pollution and yet still not be in compliance with health-based air quality standards due to pollution from neighboring “upwind” states. A strong federal law enforcement presence is needed to address the simple fact that air and water pollutants do not honor state boundary lines.

We and others stand prepared to fight crippling budget cuts at the EPA, and explain that the protection of our air and water requires both federal and state environmental law enforcement.

Scott Pruitt will likely gut the Clean Power Plan; what will he replace it with?

Photo: Gage Skidmore/CC BY-SA (Flickr)

During the campaign, President Trump called for abolishing the Clean Power Plan, the EPA regulations that limit carbon emissions from power plants. And as noted, Administrator Pruitt sued to block it. It now seems nearly inevitable that he will move to drastically undermine the plan.

The question is, what will he propose to replace it? The EPA does not have the option of doing nothing. The United States Supreme Court ruled in 2007 that the EPA has a duty to regulate greenhouse gases under the Clean Air Act if it makes a determination that such gases endanger public health and the environment. In 2009, EPA made such a finding (which Mr. Pruitt fought, though unsuccessfully).

Thus, the EPA remains obligated to regulate carbon dioxide emissions in general, and in particular with respect to power plants, which are among the nation's largest sources of these emissions.

One predictable approach would be a revised regulation that reduces emissions, but by a much smaller percentage. The current litigation over the Clean Power Plan could serve as a roadmap for a diminished rule. The Clean Power Plan relies on three strategies to reduce emissions—improving efficiency of coal plants, switching from coal to gas, and switching to renewables. During the litigation, Scott Pruitt conceded that the EPA had the authority to require improvements to coal plant efficiency, but claimed that the other two strategies, which go “beyond the fenceline” of an individual source, were unlawful.

Thus, one might expect that a revised rule will mirror what Mr. Pruitt called for in court. If so, rather than cutting carbon emissions by approximately 32 percent by 2030, the rule would result in barely noticeable emission reductions.

If this happens, litigation will be necessary. The court that mandated the EPA to address greenhouse gas emissions should not be satisfied with a rule that does little to cut one of the nation’s largest sources of CO2 emissions.

How about vehicles?

The second biggest carbon-cutting program of the Obama Administration is the UCS-backed fuel economy standards for cars, which are estimated to roughly double fuel economy between 2012 and 2025. Those standards were agreed to by the automakers at the time. They are projected to cut billions of tons of CO2, reduce oil use by billions of barrels, and save consumers an average of $8,000 over the lifetime of a vehicle.
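A rough back-of-the-envelope check shows how doubling fuel economy translates into savings of that magnitude. The lifetime mileage and gas price below are illustrative assumptions, not figures from this post:

```python
# Hypothetical sketch: lifetime fuel savings from roughly doubling fuel economy.
# LIFETIME_MILES and GAS_PRICE are assumed inputs, not figures from the post.

LIFETIME_MILES = 150_000   # assumed vehicle lifetime mileage
BASELINE_MPG = 27          # assumed ~2012-era fleet average
DOUBLED_MPG = 54           # "roughly double" per the standards
GAS_PRICE = 2.90           # assumed price in dollars per gallon

# Halving fuel use per mile saves the difference in lifetime gallons burned
gallons_saved = LIFETIME_MILES / BASELINE_MPG - LIFETIME_MILES / DOUBLED_MPG
dollars_saved = gallons_saved * GAS_PRICE

print(f"~{gallons_saved:.0f} gallons, ~${dollars_saved:,.0f} saved per vehicle")
```

Under these assumptions the savings land near $8,000 per vehicle, consistent with the projection cited above.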

When the standards were put in place, they included a “mid-term review” provision under which the EPA would assess whether changes in technology, costs, or other factors might warrant adjusting the standards. The review was to be completed by April 2018, but the Obama administration finished it in its closing days and determined, based on a thorough assessment, that there was no reason to change the standards: automakers are ahead of schedule in meeting them, and at a lower cost than originally predicted.

Some automakers are calling for this determination to be re-opened, presumably so that the rules can be modified and perhaps weakened. And one can justifiably be anxious that they could offer something that the Trump administration is keen to secure—a commitment to increased manufacturing in the United States—in exchange for relaxing these standards.

It would be a disaster for these historic standards to be rolled back, and we’ll fight any such rollback along with many allies.

How about science?

As I wrote recently, Mr. Pruitt’s record shows little evidence of deference to scientists. After all, he sued the EPA for relying upon the world’s most prominent climate scientists, including many employed by the federal government, in finding that greenhouse gases endangered the environment. And he has claimed that climate change, and the role of human activity in causing it, remains an open question for debate.

As EPA Administrator, he will hear from EPA scientists whose expert judgment will not align with his deregulatory agenda in some cases. Will these scientists’ findings be suppressed or disregarded?

We call on Mr. Pruitt to declare that scientific integrity is a core guiding principle for the EPA, to abide by the existing EPA scientific integrity policy, and even to look for ways to improve it, as recommended by UCS.

Vigilance required

Scott Pruitt comes to his new position with the heavy baggage of having devoted a good part of his career to opposing EPA, not to mention the apparent antipathy of his boss towards the agency. The Trump transition team, composed of career ideologues, further fueled anxiety over the EPA’s fate, with threats of gag orders on agency scientists, deletion of climate data from the website, and draconian budget cuts. This is why we see, for example, hundreds of career civil servants risking their jobs by publicly protesting Mr. Pruitt’s confirmation.

Scott Pruitt has a chance now to push the reset button, and position himself as an open-minded and principled conservative, rather than a deregulatory ideologue. Most helpful to him will be to invest significant time in hearing from the agency’s talented scientists, engineers, policy analysts and attorneys.

No matter what, we will be watching his actions vigilantly and stand prepared to fight to retain key protections of Americans’ health and safety at the agency he now oversees.


Learning from Oroville Dam Disaster: State Water Board Proposes Climate Change Resolution


Earlier this week, while areas downstream of Oroville Dam were still under an evacuation order, California’s State Water Resources Control Board (State Water Board) released a draft resolution for a comprehensive response to climate change. It resolves that the agency will embed climate science into all of its existing work, both to mitigate greenhouse gas emissions, and to build resilience to the impacts of climate change. In doing so, the State Water Board demonstrates how public agencies can respond more proactively to the very real challenges that global warming is bringing our way.

A failure to plan is a plan to fail

After five years of record drought conditions, California has received more rain in just a couple of months than its reservoirs can store. This may seem strange, but it is exactly what climate scientists have predicted for the state since the 1980s: prolonged warm and dry conditions punctuated by intense wet spells, with more rain and less snow, causing both drought and floods.

Despite having a wealth of science at our fingertips describing how our water system is changing due to global warming, too often we have not put this information to use. During the federal relicensing of the Oroville Dam, the California Department of Water Resources (DWR) chose not to assess how climate change might affect the dam’s operation. In response to this “foundational error,” Butte County and Plumas County sued DWR. Their suit argues that the environmental analysis associated with the dam relicensing should be rejected as unscientific:

“Rather than rigorously assessing climate change, DWR’s Oroville FEIR [Final Environmental Impact Report] presumes that hydrologic variability from the previous century ‘is expected to continue in the foreseeable future’ and that it would be ‘speculative’ to further analyze other climate change scenarios…Due to this error, the FEIR is predicated upon a hypothetical future that DWR knows to be dangerously false.”

While we know that the past is no longer a predictor of the future, we continue to plan for the past. It is easier, and it seems less expensive, but it has huge hidden costs: costs now being borne by the nearly 200,000 residents who were evacuated, by affected counties, and, eventually, by taxpayers who will pay to repair the damage.

This is why it is incredibly important to plan for the future, and particularly for more “extreme” climate conditions. We are on the precipice of giving away almost $3 billion of public money for new water infrastructure without requiring that these new water projects use climate science and existing modeling results to assess how they would fare under more “extreme” climate conditions. We have repeatedly encouraged the California Water Commission to require that new water projects provide a quantitative assessment of the impact of climate “extremes” on project operations. However, in December 2016, the California Water Commission approved regulations without this requirement.

State Water Board commits to using climate science

Mistakes are an inevitable part of life, but we need to learn from our mistakes. The State Water Board has taken an important step forward by drafting this resolution, which requires that the State and Regional Water Boards rely on sound modeling and analyses that incorporate relevant climate change data and model outputs to account for and address impacts of climate change in permits, plans, policies, and decisions.

There are many lessons from the Oroville Dam crisis, including the critical importance of using science to prepare for a future that will be different from the past due to global warming. We applaud the State Water Board for their leadership and hope other agencies will soon follow and commit to making better decisions using climate science.

Kudos to NRC for Lessons-Learned Review at Columbia Fuel Fabrication Facility

UCS Blog - All Things Nuclear (text only) -

Disaster by Design/Safety by Intent #63

Safety by Intent

Westinghouse Electric Corporation notified the Nuclear Regulatory Commission (NRC) on July 14, 2016, that workers at its Columbia Fuel Fabrication Facility (CFFF) in South Carolina found significant accumulation of uranium in a ventilation system. The amount of enriched uranium exceeded limits established at the facility as protection against inadvertent criticality.

The uranium accumulated in process vent scrubber S-1030 shown towards the upper left side of Figure 1.

Fig. 1 (Source: Nuclear Regulatory Commission)

The NRC dispatched an Augmented Inspection Team (AIT) to the site to investigate the causes and corrective actions for the event. The NRC sends Special Inspection Teams and Augmented Inspection Teams to investigate discoveries like the one reported at CFFF that have the potential for increasing the risk of an accident.

The AIT concluded in its report dated October 26, 2016, that “Westinghouse failed to provide adequate levels of oversight, enforcement, and accountability to the organizations directly involved with configuration management, operations, and maintenance of the wet ventilation systems.” Specifically, Westinghouse had assumed that only minute quantities of uranium could collect in that portion of the ventilation system and took no actions to either validate or confirm that key assumption.

To this point, both Westinghouse and the NRC followed established practices. Upon discovering a condition above the reporting threshold, Westinghouse notified the NRC. Upon receiving notification from Westinghouse about a condition above its normal response threshold, the NRC dispatched an Augmented Inspection Team.

The NRC’s Extra Effort

The NRC did not stop with its AIT probe into whatever problems Westinghouse had that resulted in the event at CFFF. Two days after issuing the AIT report, the NRC chartered a team to examine lessons the agency could learn from the event. This second team was not tasked with supplemental Westinghouse bashing; that had been the AIT’s role. The lessons-learned team was tasked with assessing whether the NRC could make changes in its efforts so as to lessen the likelihood that events like the one at CFFF would recur. Specifically, the lessons-learned team was asked to evaluate the NRC’s license review process, inspection program, operating experience program, organization of oversight groups, and knowledge management programs.

It is commendable that the NRC undertook this introspective review. The review would either confirm that the agency is effective in applying its resources or recommend ways to reallocate resources for increased effectiveness.

The NRC’s Extra Safety Gains

The AIT verified that Westinghouse had taken or would be taking appropriate corrective actions to lessen the likelihood of recurrence of this problem at its CFFF. The lessons-learned task force identified steps the NRC could take in all five focus areas to lessen the likelihood that such an event could recur at any NRC-licensed fuel cycle facility.

The team concluded that the NRC’s license review process and its inspection program allocate resources based on perceived risk significance. In other words, items with high and moderate risk significance receive more attention than items with low risk significance. The team did not find this triage system unacceptable; it is imperative to focus limited resources properly. But the team did recommend ways NRC’s reviewers and inspectors could verify that items deemed low risk truly are low risk.

The team characterized the agency’s operating experience and knowledge management programs as being more supplemental than integral parts of business. Some of the NRC staff interviewed by the team used the programs extensively; other staffers were aware of the programs but had not used them. The team made several recommendations intended to integrate the operating experience and knowledge management programs into day-to-day work practices. For example, the team recommended training on using the operating experience database to lower the height and shorten the duration of the learning curve needed for users to become proficient with this tool.

The NRC’s Safety Backstop

In theory, NRC’s reviewers and inspectors should find no safety problems. NRC’s licensees—the owners of nuclear power plants and fuel cycle facilities—are responsible under the law for complying with regulations intended to manage risk to workers and the public.

In practice, NRC’s reviewers and inspectors can, and do, find safety problems. This is not because NRC’s licensees deliberately violate safety regulations, but because compliance is a dynamic challenge.

By undertaking the lessons learned review of the CFFF event, the NRC makes its safety backstop more robust and reliable. The recommendations made by the team will, when implemented, improve the effectiveness of NRC’s reviewers and inspectors. The NRC’s reviewers and inspectors were already good, but the agency’s efforts to make them better result in making workers and the public safer.

It may not be the ultimate win-win situation, but it’s got to be among the top ten.


UCS’s Disaster by Design/Safety by Intent series of blog posts is intended to help readers understand how a seemingly unrelated assortment of minor problems can coalesce to cause disaster and how effective defense-in-depth can lessen both the number of pre-existing problems and the chances they team up.

Solar vs Nuclear: Is this the Last Chapter?


Last year’s solar deployment numbers just came in, and they are, in a word, phenomenal. Utilities bought more new solar capacity than they did natural gas capacity: an astounding 22 states added more than 100 MW of solar each.

At the same time, there is grim news about construction delays and associated cost overruns for nuclear plant construction projects in Georgia and South Carolina. SCANA—owner of South Carolina Electric & Gas and sponsor of the VC Summer Nuclear Project—has just reported new delays pushing the in-service dates of its new reactors to 2020. Construction started more than 7 years ago, with energy deliveries promised to begin in 2016.

Neighbors with solar. Courtesy of Grid Alternatives.

Past hopes for a “renaissance” in nuclear power in the United States, with four to eight new nuclear facilities projected to come online in America between 2016 and 2018, have been overwhelmed by competition. UCS predicted this cost trend many times.

Great solar news

Meanwhile, there is much to say about the solar boom. Just ask one of your 1,300,000 neighbors who have solar on their property.

To put these achievements in perspective, let’s talk about solar jobs and productivity. The solar industry employs more than 260,000 people in the United States. Continuous improvement in construction techniques and manufacturing know-how drives solar deployment costs down every three months. Pricing for new solar projects is coming in at 4 cents (Texas) to 5 cents (California) per kilowatt-hour.

In comparison with nuclear, the amount of solar power built in 2016, taking into account how many hours each can operate each day, is the equivalent of more than 3 new nuclear plants.

To dive in a little deeper: let’s use a 25 percent capacity factor for new solar, making the 14,626 MW installed equivalent to about 3,650 MW of theoretically perfectly running nuclear plants. The Westinghouse AP1000 units under construction for the last 7-10+ years produce about 1,100 MW each. So, in one year, solar added the equivalent of more than three of these reactors, each of which takes more than seven years to build. This difference in speed of deployment is why UCS is clear that nuclear power isn’t a near-term climate solution.
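The capacity-factor equivalence above can be sketched in a few lines. The inputs are the figures given in the post; nuclear is idealized at a 100 percent capacity factor, as the comparison assumes:

```python
# Capacity-factor equivalence from the post: how much "perfectly running"
# nuclear capacity does 2016's solar build-out match?

SOLAR_INSTALLED_MW = 14_626  # US solar capacity added in 2016
SOLAR_CF = 0.25              # assumed capacity factor for new solar
NUCLEAR_CF = 1.0             # idealized, theoretically perfect nuclear
AP1000_MW = 1_100            # approximate output of one Westinghouse AP1000

# Scale nameplate solar capacity by the ratio of capacity factors
equivalent_nuclear_mw = SOLAR_INSTALLED_MW * SOLAR_CF / NUCLEAR_CF
equivalent_units = equivalent_nuclear_mw / AP1000_MW

print(f"~{equivalent_nuclear_mw:,.0f} MW, or ~{equivalent_units:.1f} AP1000 units")
```

The result, roughly 3,650 MW or a bit more than three AP1000 reactors, is the equivalence the post cites.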

The demise of the nuclear option

In the energy business, nuclear is fading fast. Struggles to keep existing plants open in competitive markets are roiling the electricity markets. But the recent news about the very few manufacturing firms supplying nuclear construction illustrates how very different the nuclear industry is from solar.

Cost overruns at the US plants are so large that when state regulators finally put a cap on what South Carolina and Georgia consumers would pay, manufacturer Toshiba (owner of Westinghouse) found itself with $6 billion in losses and the likely end of its business in nuclear power plant construction.

The concentration of nuclear component manufacturing in so few companies has shown how a single quality problem can become a “single point of failure” plaguing an entire fleet, as one did for the French nuclear plants. US policy has been to shield utility companies from the risks of their business decisions to construct nuclear plants, a practice that continues with the Vogtle plant in Georgia.

Would we ever go 100% solar?

Would we ever build only solar? Maybe, but that’s not the right question. “What can we do with lots of solar?” is a better one.

We can keep absorbing the solar pattern of production with the tools we have. We can plan to adjust to cheap energy in the middle of the day with time-varying rates. And if we can get energy storage further along, we can get to the end of this debate.


One Way You Can Help Fight Against Political Interference in Science: Tell Us About It


Since Election Day and into the first weeks of the Trump presidency, we’ve heard a lot about “alternative facts” and clampdowns on the ability of scientists to present scientific evidence or speak to the press. Congress last week signaled its intent to neutralize the Environmental Protection Agency and other federal departments by cutting science out of the way they make policy.


Federal employees can help create an accountable government by reporting political interference in science (even anonymously).

But together, we can raise the political price of manipulating science or censoring scientists by exposing these actions and publicly communicating their consequences for public health and the environment. Sometimes, this requires people within government or who are funded by government to speak up and share challenges that they experience or perceive.

Learn how to securely and/or anonymously communicate with UCS here.

UCS has many years of experience working with government employees, journalists, and members of Congress to get stories out in a way that protects those with information to share. We want to hear about actions that compromise the ability of science to fully inform the policymaking process—and the consequences of those actions. We also want to hear your stories that describe how government data and government experts protect public health and safety.

Just as there are many steps in the policymaking process, so too are there many ways to attack and politicize science. People often think of the muzzling of scientists, or the censorship of documents. This happens, of course. But there are other, more subtle ways of inappropriately influencing how science is used to make decisions. A partial list is at the end of this post.

Political interference in science can be difficult to assess. It’s often not clear whether a person’s actions are normal or crossing the line—especially within an administration where some don’t want to leave a paper trail. To that end, feel free to share what you’ve heard or what you’ve been told verbally. Our staff are ready and willing to help you figure out the best course of action.

You should also consider approaching the official responsible for implementing your agency’s scientific integrity policy for advice. Outside of government, in addition to UCS, Public Employees for Environmental Responsibility, the Government Accountability Project, and the Climate Science Legal Defense Fund are all good resources for learning more about your rights and responsibilities.

Here is that partial list of the subtle and overt ways vested interests have undermined or politicized science, in no particular order:

  1. Prevent scientists from publishing research, or delay publication of research (see: former EPA clearance process)
  2. Prevent scientists from presenting at or attending scientific meetings that are relevant to their work (see: airborne bacteria)
  3. Diminish or destroy agency scientific libraries and library content or similar resources (See EPA, Department of Fisheries Canada)
  4. Allow agencies with conflicts of interest to second-guess or undermine the work of agency scientists through the inter-agency review process (see: the chemical perchlorate)
  5. Require scientists to manipulate scientific methods (See: lead in children’s lunch boxes)
  6. Restrict the types of information and methods that experts can use (See: attempts to prevent climate scientists from using scientific models)
  7. Manipulate or censor scientific information in testimony before Congress (see: CDC testimony on climate change and public health)
  8. Place misinformation on official government websites (see: breast cancer)
  9. Redefine terms to prevent the successful application of science to policymaking (see: OMB peer review guidelines, critical habitat under the Endangered Species Act)
  10. Promote scientifically inaccurate educational curricula (see: abstinence-only sex education)
  11. Refuse to comply with court-mandated analysis (see: endangerment finding)
  12. Waste scientists’ time with baseless subpoenas or open records requests
  13. Manipulate agency scientific documents before release to create false uncertainty or otherwise change the scientific meaning (see: endangered species)
  14. Limit or prevent scientists from communicating with the media, the public, or Congress, including social media, or through requiring minders that sit in on interviews with agency scientists (see: numerous reports from journalists)
  15. Prevent scientists from speaking to the press, or have “minders” present to ensure that scientists say the “right” thing
  16. Selectively route interviews away from scientists with inconvenient scientific analysis (see climate change and hurricanes)
  17. Remove or decrease accessibility to government data sets, tools, models, and other scientific information, or stop collecting data altogether (see Canada’s Harper Government)
  18. Appoint technically unqualified people or people with clear conflicts of interest to federal science advisory committees (see childhood lead poisoning)
  19. Use political litmus tests for federal advisory committee membership (see workplace safety panel)
  20. Threaten, demote, or defund scientists who refuse to change information (see Vioxx)
  21. Create a hostile work environment that causes scientists to self-censor (see FDA surveillance)
  22. Disregard the law by not making decisions solely on best available science when statutorily required to do so (see air pollution limits)

Threats to science-based policymaking and public access to scientific information—essential components of democracy—have never been more real. But scientists are also ever more committed to defending the integrity of science in the policymaking process. We depend on sources with knowledge of what’s happening within government to help us prevent a weakening of the federal scientific enterprise and the public protections that science informs.

Once again, that link for reporting what you see:

UCS Founder Kurt Gottfried Wins AAAS Award


Kurt Gottfried, a founder of UCS in 1969 and a guiding spirit and intellect since then, has won the prestigious 2017 Scientific Freedom and Responsibility Award given by the American Association for the Advancement of Science (AAAS). AAAS is the world’s largest general scientific society and publisher of the journal Science.

I can’t think of anyone more deserving of this award, which recognizes Kurt’s lifetime of dedication and achievements. AAAS said it is to recognize Kurt’s “long and distinguished career as a ‘civic scientist,’ through his advocacy for arms control, human rights, and integrity in the use of science in public policy making.”

Source: UCS

Kurt receiving this award also means a lot to me personally, since he has been one of the biggest influences on my professional life. I first met him in 1978 when I took his quantum mechanics course as a physics grad student at Cornell. He was a wonderful teacher and communicator, and generations of students have learned the subject from his classic textbook (now in its second edition).

But I actually got to know him a couple of years later—early in the Reagan presidency—when we were part of a group at Cornell that brought high-level speakers to campus to talk about the nuclear arms race, which was heating up. I’ve been privileged to continue working with him since that time. Kurt’s way of thinking about the world and of approaching the problems he worked on has helped shape my own.

Kurt’s history

I would guess that even the people who know him may not be aware of the range of activities Kurt has taken on over the years.

Kurt was born in Vienna, Austria, in 1929. He has had a long and distinguished career as a theoretical physicist. He received his PhD from MIT, became a Junior Fellow at Harvard, and has been a physics professor (now emeritus) at Cornell since 1964.

At the same time, he has dedicated boundless energy to improving the world, in areas including international security and nuclear arms control, human rights, and preventing political intervention in scientific input in policymaking. For example:

Science, International Security, and Arms Control

On leave at MIT in 1968-9, Kurt helped draft a statement encouraging scientists to consider society’s use of technical knowledge, and calling on scientists and engineers across the country to join a national effort to discuss these issues in university classes on March 4, 1969.

Following the success of that effort, Kurt co-founded UCS that same year. His goal was to help scientists bring their expertise to bear on public policy issues that had an important technical component. From the beginning, the vision was to build a research and advocacy organization that combined technical experts with experts in policy analysis, media engagement, and outreach and education for the public and policy makers, while keeping issues of science and technology at the core of its work.

Today, UCS has grown to more than 180 staff members and has an annual budget of more than $27 million. More than 45 years after UCS’ founding, Kurt remains a valuable member of the Board of Directors.

Over the years, UCS not only helped inform debates and shape policy on a wide range of issues, it also helped legitimize the active role of scientists in these debates and created staff positions allowing scientists to work on these issues full time. And it helped engage a broad set of scientists in part-time policy work, educating them about the issues and training them in writing and speaking for policy makers.

Working with UCS, Kurt was among the first people to raise concerns about the development of missile defenses, co-authoring a report on the topic in 1969. Kurt and UCS were particularly active in the debate in the 1980s and 1990s following President Reagan’s “Star Wars” speech. Kurt weighed in with articles and op-eds in Scientific American, the New York Times, the Washington Post, and elsewhere, and co-authored the influential books The Fallacy of Star Wars (1984) and Countermeasures: A Technical Evaluation of the Planned U.S. National Missile Defense System (2000).

Kurt at a 2000 press conference in Washington. Source: UCS

Kurt also worked to prevent the development of anti-satellite weapons and weapons based in space. He wrote and spoke widely about this issue and worked with Dick Garwin to develop a draft treaty banning anti-satellite weapons, which he presented to the Senate and House Foreign Relations Committees in 1983 and 1984.

In addition, he authored or co-authored articles on nuclear weapons, command and control systems and crisis stability, and cooperative security in Nature, the New York Review of Books, and elsewhere. He edited two books on these issues—Crisis Stability and Nuclear War (1988), and Reforging European Security: From Confrontation to Cooperation (1990)—and contributed chapters to several others.

Scientists and Human Rights

Kurt was also very active in human rights issues for many years—activities he undertook outside his work with UCS. During the 1980s he traveled to the Soviet Union to meet with and support refuseniks, and he urged others in the scientific community to actively support these dissidents.

Kurt was a major figure in the American Physical Society (APS) Committee on International Freedom of Scientists (CIFS), which helped oppressed scientists in the Soviet Union and other countries. CIFS described its goal as:

The Committee was formed to deal with those matters of an international nature that endanger the abilities of scientists to function as scientists. The Committee is to be particularly concerned with acts of governments or organizations, which through violation of generally recognized human rights, restrict or destroy the ability of scientists to function as such.

Kurt served as CIFS’ first chair in 1980 and 1981. One of CIFS’ innovations was its use of “small committees,” typically consisting of three or four people, who would pick a persecuted scientist and regularly write to the scientist and his/her family, friends, and local officials.

Even when these letters were intercepted by the authorities, they raised the profile of the scientist and made clear that international attention was focused on this person. By 1983, these committees were writing to 63 scientists, and the number continued to increase through the mid-1980s.

Kurt also helped found the organization Scientists for Sakharov, Orlov, and Sharansky (SOS) to focus attention on three of the most prominent Soviet refuseniks. He served on the SOS Executive Committee from 1978-90. SOS’s call for a moratorium on scientific cooperation with the Soviet Union to highlight concern about the treatment of scientists was joined by nearly 8,000 scientists and engineers from 44 countries, and gained international attention.

Soviet physicist Yuri Orlov was jailed for a decade after founding Moscow Helsinki Watch, which monitored Soviet compliance with the human rights provisions of the 1975 Helsinki Accords. Kurt’s involvement in his case led Orlov to come to Cornell and join the physics faculty after his release in 1986.

Kurt was also instrumental in winning the release in 1978 of the physicist Elena Sevilla, who was imprisoned in Argentina because of political activities by her husband, a newspaper reporter. On her release, Kurt arranged for her to come to Cornell to finish her graduate studies in physics.

Kurt’s work not only helped the refuseniks and other oppressed scientists. His actions over the years have helped inspire others in the scientific community to recognize and act on their ability and responsibility to help scientists who were denied basic human rights.

For his work on these issues, Kurt was awarded the APS Leo Szilard Award in 1992.

Scientific Integrity/Science and Democracy

In the wake of growing evidence that some officials in the George W. Bush Administration were distorting scientific knowledge and the scientific advisory process to an unprecedented degree, Kurt recruited 62 preeminent scientists to sign a statement titled Restoring Scientific Integrity in Policy Making, which was released in February 2004.

The statement charged the Bush Administration with widespread “manipulation of the process through which science enters into its decisions” and called out the administration’s misrepresentation of scientific evidence, appointment of unqualified members of scientific advisory committees, and silencing of federal government scientists—actions that threatened the integrity of science in policy making.

The statement drew wide public attention to these issues and was eventually signed online by more than 12,000 scientists.

Subsequently, Kurt led the effort to create a new program at UCS to work on this issue, which researched examples of abuse, engaged the scientific community, and worked with federal agencies to reform their practices, including drafting scientific integrity rules for these agencies. Kurt was also the force behind evolving that program into the UCS Center for Science and Democracy in 2012, arguing there was a need to address a broader set of issues related to the role of science and evidence-based analysis in democratic society.

* * *

Kurt, Hans Bethe, Dick Garwin, and Henry Kendall at a press conference on missile defense, March 22, 1984 (Source: James J. MacKenzie)

For half a century, Kurt has engaged the scientific community, policy makers, and the general public on important issues related to international security, human rights, and the role of science in democratic society. Moreover, he has encouraged his colleagues to become involved, mentored younger scientists in these issues, and created an organization that has magnified his efforts and will continue this work well beyond his lifetime.

Kurt has been an inspiration to me and other scientists who decided to make a career of applying our technical backgrounds to important policy issues, and helped break the ground to make a career of this kind more possible.

Love Local Food? Here’s a Promising Way to Protect the Local Land that Grows It

UCS Blog - The Equation (text only) -

Does your heart beet for farmer’s markets? Do you carrot all about protecting the soil? This Valentine’s Day, lettuce dive deeper into a promising solution for simultaneously protecting land for local food production, ensuring more sustainable agriculture, and creating opportunities for beginning farmers: land trusts.

If you heart local food, remember that the farmland that grows it needs protecting, and land trusts are one part of the solution.

Agricultural puns aside, land trusts are nonprofit organizations designed to protect land in perpetuity. Essentially, landowners donate or sell long-term rights to their property to a land trust, an outside organization that ensures the land is used in the future only for specific purposes, such as wildlife habitat or agriculture.

There are several reasons why agricultural land trusts can be beneficial. The American Farmland Trust estimates that 40 acres of farmland (roughly the size of 36 football fields) are lost every hour to urban sprawl and development in the United States (that’s over 350,000 acres per year). And there is also no shortage of concerns around existing agricultural lands, including water pollution, soil degradation, and a recent dramatic drop-off in farm incomes. Agricultural land loss and degradation necessitate conservation options such as trusts.
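The AFT figures above are internally consistent, as a quick back-of-envelope check shows (using the playing area of a football field, roughly 1.1 acres, as the comparison unit):

```python
# Sanity check of the American Farmland Trust figures cited above.
ACRE_SQFT = 43_560            # square feet in one acre
FIELD_SQFT = 300 * 160        # football playing field (no end zones), sq ft

acres_per_hour = 40
football_fields = acres_per_hour * ACRE_SQFT / FIELD_SQFT  # comparison unit
acres_per_year = acres_per_hour * 24 * 365                 # annual total

print(round(football_fields))  # ≈ 36 fields
print(acres_per_year)          # 350,400 acres, i.e. "over 350,000 per year"
```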

Protecting land for beginner farmers and sustainable agriculture

Land trusts, such as the Sustainable Iowa Land Trust (SILT), are nonprofit organizations that work with landowners to facilitate arrangements, such as long-term leases or land donations, that legally protect or ensure particular uses of land in the future. Land trusts fill an important need in facilitating the major transfer of land anticipated in agriculture, driven by the average farmer’s age of 58 combined with growing competition for land from urbanization and energy development. Suzan Erem, SILT’s Board President, pointedly reminded me that “the history of the U.S. is that we haven’t seen cities shrink.”

Photo: SILT.

One example of an organization with a dedicated focus on sustainable agriculture is the Sustainable Iowa Land Trust (SILT). SILT launched in 2015 with a mission to permanently protect land to grow healthy food, and this is the major distinction between SILT and other non-profit land trusts: the requirement for sustainable food production on their farms. While most land trust agreements include prohibitive language to prevent development-related activities, SILT also adds affirmative language requiring sustainable farming (defined by several different sustainability certifications).

SILT also hopes that more and more landowners will donate or participate in long-term leases through their model to institutionalize affordable land access. This will help make land—particularly land for sustainable food production—available so that it is not just about “where you’re born or sheer dumb luck,” according to Suzan Erem, SILT’s Board President. SILT is proud of its relationships with both national organizations such as the National Young Farmer’s Coalition and statewide programs including Lutheran Services, which assists refugee populations in finding land to launch farm businesses.

That’s another crucial benefit of SILT’s approach: landowners who hope to preserve the integrity of their land are paired with beginner farmers looking for an affordable way to get started. Erem explains that the popularity of programs like SILT is related to the excitement of seeing it “giving people a place and a purpose,” and because they provide opportunity to “redefine what you can do with your legacy.”

Local food demand and supporting midsize farms are further reasons to protect agricultural land near cities

Another important piece of this puzzle is strong consumer demand for local food. Late last year, USDA released the results of its first-ever survey of direct marketing (food products sold by farmers directly to consumers, retailers, institutions, or other local food intermediaries), and reported that total sales generated this way across the country were an estimated $8.7 billion. The survey estimated that 67% of these sales were from farms located in metropolitan counties, that 38% of the producers making these sales were women (a greater proportion than in the general farming population), and that 14% were veterans. As I’ve noted previously, women and veterans are groups that have plenty of room to expand in the agricultural sector.

One component of the most profitable farms—regardless of size—is direct marketing, as Dr. Dawn Thilmany McFadden, a member of our Science Network, explained in a blog post last year. This form of sales is particularly important to protect “agriculture of the middle,” the midsize farms and ranches that have been declining for many decades (a trend likely to worsen in the present tightening agricultural economy). Growing Economies, our 2016 report, similarly noted that more direct sales to institutional food purchasers could be a multi-billion dollar boon for the state of Iowa.

Despite the benefits of protecting local farms and food, it’s important to recognize that local food is certainly not a panacea for all environmental concerns. Tradeoffs with impacts such as greenhouse gas emissions require careful consideration, as another Science Network colleague, Dr. David Cleveland, recently noted on our blog. Still, given the stimulus for local economies, and the need to protect farmland in general, how we protect land for local food deserves an important part of the conversation.

And remember for Valentine’s Day, let’s turnip attention to the idea that land trusts and local food make a great pear!

Fake News about Chinese Nuclear Weapons

UCS Blog - All Things Nuclear (text only) -

On the left, a still from a video shot at an intersection in the Chinese city of Daqing. On the right, a picture of the Russian Topol-M taken during a military parade in Moscow. Both are carried on eight-axle transporter-erector-launcher (TEL) vehicles, indicating they are approximately the same size.

A video recently discovered on a Chinese internet service appears to show a new Chinese road-mobile missile making a turn at an intersection in the city of Daqing. The discovery generated sensational claims about changes in Chinese nuclear strategy. However, a careful search of Chinese sources shows that none of those claims can be substantiated. Some are obvious distortions.

The Dongfeng (DF)-41 Missile

Multiple foreign media sources claimed the missile in the video was a new nuclear-armed long-range ballistic missile called the DF-41. The Chinese government does not comment on the composition and capabilities of its nuclear missile force and has neither confirmed nor denied the existence of the DF-41.

The missile seen in the video appears slightly larger than the DF-31A long-range ballistic missiles China displayed in a national military parade in 2015. The U.S. National Air and Space Intelligence Center states the DF-31A has a range of 11,000+ kilometers and could deliver a single Chinese nuclear warhead, estimated to weigh approximately 500 kg, to targets within the continental United States.

Almost all of the reported information about the existence and characteristics of the DF-41 can be traced to a handful of foreign media sources that have a questionable track record when reporting on Chinese missile technology. However, the U.S. Department of Defense recently reported China “is developing a new road-mobile ICBM, the CSS-X-20 (DF-41) capable of carrying MIRVs.”

Although foreign media sources routinely claim the DF-41 could carry 10 or 12 nuclear warheads, the missile seen in the video could not. It’s too small given the mass of Chinese warheads. Similar in size and appearance to the Russian Topol-M, which can carry a payload of 1,200 kg approximately 10,500 km, the missile in the video may be able to carry two Chinese warheads, but it most likely is designed, like the DF-31A, to carry a single warhead and a set of countermeasures to confuse missile defenses.

If the missile seen in the video is the new road-mobile missile discussed in the Pentagon report, it purportedly has a slightly longer range than the DF-31A. This would allow China to reach US targets it could previously reach only with a liquid-fueled, silo-based missile called the DF-5—which was also displayed during the 2015 military parade. Because silo-based missiles are more vulnerable to a preemptive first strike, having a road-mobile missile with the same range as the DF-5 increases Chinese confidence in their ability to retaliate.

False Claims About Chinese Nuclear Strategy

On January 24, Popular Mechanics published a story with a still from the video that claimed the Chinese government “publicly announced the deployment” of the DF-41 and that announcement “is likely a warning to U.S. President Donald Trump, who is known for sharply worded anti-Chinese rhetoric and has announced plans for a new ballistic missile system.” Two days later The Independent ran the same story with the same claims. The Sun, the Daily Caller, the International Business Times, the Moscow Times, STRATFOR, TASS, RT, and Sputnik International all ran stories about the alleged Chinese nuclear missile “deployment” and what it supposedly revealed about the intentions of the Chinese government.

Breitbart ran the same story with the same claims on January 27, but with the added twist that the so-called deployment of the missile in Heilongjiang province, which shares a border with Russia, is a prelude to an “approaching Clash of Civilizations world war” where “Russia and the United States will be allied against China.”

The sole basis of the claim that China announced the existence and deployment of the DF-41 is a commentary in the English-language edition of China’s Global Times. The commentary is a response to the publication of images from the posted video in the Hong Kong and Taiwan media, which in turn seem to have their origins on a French website. Yet the Global Times clearly states, “there has been no authoritative information on whether China has a Dongfeng-41 strategic missile brigade, how many such brigades it has and where they are deployed.” The Chinese commentary is critical of President Trump, and does express the hope that the existence of the DF-41 “will be revealed officially soon.” But that is a far cry from the “official announcement” described in many of the foreign news reports on the posted video.

Fake News about Nuclear Weapons is a Cause for Concern

The fabrication and distribution of misinformation about the size, capability and intent of China’s nuclear arsenal is nothing new. Several years ago an adjunct faculty member at Georgetown University cited Chinese-language blog posts to recast decades-old rumors from a Hong Kong tabloid as a “leaked Chinese military document” that allegedly proved China’s nuclear arsenal was ten times larger than existing US estimates. His assertions and sources are demonstrably not credible. Yet, Dr. Peter Navarro, an adviser to President Trump, repeated these alternative facts about the size of China’s nuclear arsenal in a recent book on Chinese military strategy.

President Trump recently directed Secretary of Defense Mattis to initiate a review of the US nuclear posture. This follows a series of statements in the wake of the November election that indicated Mr. Trump supported a major build-up of US nuclear forces. While the new U.S. president’s comments on the need for US nuclear modernization are not unprecedented, his ability to push modernization plans through a Republican-led Congress, despite the enormous projected costs, may be enhanced by exaggerated perceptions of a Chinese nuclear threat to the United States.

As the debate on US nuclear weapons policy takes shape under the direction of Secretary Mattis, who may have reservations about the need for a US nuclear build-up, it is important that US decisions be made on the basis of the best available information, rather than the alternative facts now circulating in Washington.

How to Ensure Self-Driving Vehicles Don’t Ruin Everything

UCS Blog - The Equation (text only) -

Zipcar’s former CEO has cast the self-driving future as a “heaven or hell” scenario, and she has a point. Self-driving cars could save lives, smooth traffic congestion, expand access to jobs or schools—especially for people who can’t drive themselves today—and reduce the number of vehicles on our roads. On the other hand, they could worsen smog and local air pollution, disrupt the US economy by putting millions of people out of work, justify cuts in public transit funding and services, and force urban planners to focus more on providing space for vehicles instead of for parks, bicyclists, or pedestrians.

To maximize the potential benefits of self-driving vehicles and minimize their potential consequences, UCS developed this set of principles that we will be pushing policymakers, businesses, and other stakeholders to follow. Doing so will ensure that self-driving vehicles reduce oil consumption and global warming emissions, protect public health, and enhance mobility for everyone.

Science-based policy will be key for shaping the introduction of self-driving technology

Many are rallying against any regulation of self-driving technology beyond ensuring it’s safe to use. I’ve even heard the claim that over-regulating this technology will literally kill people by slowing the pace at which self-driving cars are introduced, thus delaying their potential safety benefits.

To be fair, this argument has merit. Self-driving vehicles are forecast to reduce the tens of thousands of roadway fatalities that occur each year in the US by as much as 90 percent, and could offset the rise of distracted driving that may have caused the biggest spike in traffic deaths in 50 years (though reaching these safety levels will take further advances in the technology and widespread deployment).

But self-driving technology won’t just impact transportation safety. Researchers are forecasting how it will affect traffic congestion, vehicle-related emissions, land-use decisions, public transit systems, data security, and the economy. Unfortunately, the emphasis that many, including the US Department of Transportation, have placed on the safety benefits can distract from the need to consider how policy should address the other, equally great, potential impacts of self-driving technology.

I’m not saying self-driving technology should be regulated to the scrapheap. The technology is highly likely to improve traffic safety and increase access to transportation—both important outcomes. Yet self-driving vehicles will need to be regulated on issues other than safety, as their full breadth of potential impacts won’t be addressed by safety-focused policy or market forces alone.

For example, studies have found that self-driving vehicles could double transportation emissions (already the largest source of climate change emissions in the US), put millions of Americans out of work as automated driving replaces truckers and taxi drivers, and/or exacerbate urban sprawl.

The jackpot for winning the race to produce the best self-driving vehicle can still be claimed even if these negative effects occur, and today’s policy frameworks may be insufficient to effectively curtail them. Let’s not forget that automakers have historically opposed regulation (see: seat belts, fuel economy, air bags) and are encouraging policymakers to clear the way for self-driving vehicles not only because they seek to improve transportation safety, but because they see a potential for profit.

So science-based policy covering the broader implications of self-driving cars, including how they affect emissions and our economy, will be needed to ensure the best possible self-driving future, and these discussions need to happen today. To kickstart these conversations, UCS released these principles for creating a safe, healthy, and equitable autonomous future. Join the conversation on whether and how self-driving technology should be regulated by checking out our new self-driving vehicle web content and signing up for future action alerts here.

North Korea’s February Missile Launch

UCS Blog - All Things Nuclear (text only) -

North Korea reportedly launched a medium-range missile Sunday morning local time (about 6 pm Saturday on the US east coast).

People are speculating about what missile it could have been. Based on the range, there are at least two candidates, which would be distinguishable by US intelligence if it was able to observe the launch.

Fig. 1

The missile was apparently launched eastward from the Panghyon air base near Kusong, northwest of Pyongyang and traveled 500 km, splashing down in the Sea of Japan. According to the South Korean military, it flew on a lofted trajectory, reaching an apogee of about 550 km.

A missile flown on this trajectory would have a range of 1,200-1,250 km if flown on a standard trajectory with the same payload (Fig. 1).
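The conversion from a lofted trajectory to an equivalent standard range can be roughly sketched with a simple flat-Earth, vacuum projectile model. This is a back-of-envelope approximation only; real analyses account for Earth curvature, rotation, and drag, which is why the figures cited above come out somewhat higher:

```python
import math

# Back-of-envelope estimate: infer burnout speed from the reported lofted
# trajectory, then compute the maximum range at a 45-degree launch angle.
# Vacuum parabola: range R = v^2*sin(2θ)/g, apogee h = v^2*sin^2(θ)/(2g),
# which together give cot(θ) = R/(4h).
g = 9.81               # m/s^2
apogee = 550e3         # reported apogee, m
lofted_range = 500e3   # reported ground range, m

theta = math.atan(4 * apogee / lofted_range)   # launch angle, ~77 degrees
v_sq = 2 * g * apogee / math.sin(theta) ** 2   # burnout speed squared
max_range_km = v_sq / g / 1000                 # flat-Earth max range, km

print(round(max_range_km))  # ~1,150 km before curvature corrections
```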

Nodong or KN-11?

That range is similar to that of the North Korean Nodong missile, which was first tested in the early 1990s and has been launched repeatedly since then. Another launch of the Nodong would not be particularly useful for advancing Pyongyang’s missile program, so if that was what was launched it would have had a political motivation.

However, as Jeffrey Lewis points out, the trajectory is very similar to the trajectory the submarine-launched KN-11 missile flew in its first successful test last August. While similar in range to the Nodong, the KN-11 has the advantage that it uses solid rather than liquid fuel, which means it would take less preparation time before a launch. The North is likely to be interested in developing and testing a land-based version of the missile.

If this is what was launched, it would represent a useful developmental step for North Korea, no matter what may have driven the timing of the launch.

The KN-11 would have a clear fingerprint that would distinguish it from the Nodong (or the Musudan, see below), since it has two stages rather than one, and that difference would be clear if US, Japanese, etc., sensors were able to observe the test.

Other options?

Some of the reports have speculated the test was of a Musudan missile, but I haven’t seen anything about the test that supports that. The Musudan range is considerably longer. The one successful Musudan launch, which took place last June, suggested a maximum range of about 3,000 km, although a recent analysis suggests that the range is probably less than 2,500 km if it carries a payload similar to the mass of a nuclear warhead. (Note that repeated claims that the Musudan can reach up to 4,000 km are not credible.)

It’s also worth noting that North Korea apparently fired several extended-range Scud missiles last September, which have a similar but somewhat shorter range than that seen in this test, depending on the payload. These are also single-stage and could be distinguished from a KN-11 test.

Of course, the North may surprise us with something else entirely.

Can Republicans Find Their Voice on Climate Change via a Carbon Tax?

UCS Blog - The Equation (text only) -

Earlier this week a group of conservative opinion leaders and experts launched the Climate Leadership Council, championing a national carbon tax to cut emissions and help achieve climate goals.

As with any carbon pricing proposal, the politics are complicated and there is no telling how much traction this particular initiative will get. There are also definite concerns about some of the details of the proposal. But it’s very encouraging to see a meaningful solution to climate change put forth by conservatives. I look forward to seeing where this will go, especially with Republican lawmakers and the Trump administration.

Starting from the facts

This proposal begins with recognizing the scientific facts about climate change and the urgency of acting on solutions. To see leading conservatives articulate those basic realities is important, and I hope Republicans in Congress and the Trump administration are listening.

Climate change should not be a partisan issue. There’s no time to waste on the dangerous new types of denial or delay tactics that were in evidence during the nomination hearings for Rex Tillerson and Scott Pruitt, for example.

Just like the near-universal consensus among climate scientists about the facts of climate change, there is an overwhelming consensus among economists that a carbon price is an essential policy tool for driving down carbon emissions. The CLC proposal’s starting price of $40/ton CO2, escalating over time, shows the seriousness of their proposal.
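To put the starting price in context, a short sketch shows how it would scale. The $40/ton figure comes from the CLC proposal, but the emissions total and escalation rate below are assumptions made here for illustration, not figures from the proposal itself:

```python
# Illustrative only: $40/ton is the CLC starting price; the ~5 billion tons
# of annual US CO2 emissions and the 2% escalation rate are assumed values.
start_price = 40.0    # $/ton CO2 (CLC starting price)
emissions_gt = 5.0    # assumed US CO2 emissions, billions of tons per year
escalation = 0.02     # assumed real annual price increase

revenues = []  # projected gross revenue, $ billions, holding emissions flat
for year in range(3):
    price = start_price * (1 + escalation) ** year
    revenues.append(price * emissions_gt)

print(revenues[0])  # 200.0, i.e. ~$200 billion in year one
```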

What’s more, the authors of the proposal recognize that we have to act on climate as a global community and the US must live up to its international commitments under the Paris Climate Agreement. Yes, to meet long term climate goals countries will have to do a lot more than they have currently committed to, but walking away from the Paris Agreement would be a serious mistake.

Notes of caution

There is obviously room for discussion about ways to improve the policy proposal, as and when it gets serious consideration from policymakers. Some aspects of the proposal that could definitely use further scrutiny include:

  • Regulatory rollbacks that harm public health or undermine key legal protections are cause for concern. The EPA’s authority to regulate global warming emissions is a critical safeguard that cannot be negotiated away. There may be middle ground possible here but further conversations with a wide set of stakeholders, including environmental justice groups, are critical.
  • A carbon price alone will not be sufficient to deliver the deep emission reductions consistent with climate goals; we need complementary policies to address other market failures. For example, policy incentives for innovation in low-carbon technologies are important. In sectors like transportation, a small surcharge on fuel prices won’t be enough to drive the big changes needed in vehicle fleets and the investments in infrastructure for public transit or electric vehicles, so other policies are needed. And we need policies to address non-CO2 emissions, such as methane.
  • What happens with the (considerable) carbon revenues is obviously a hugely important policy choice that must be made in consultation with lawmakers, with the interests of the broader public squarely in mind. Priorities—such as appropriately offsetting the disproportionate impacts of energy price increases associated with a carbon tax; transition assistance for coal workers and coal-dependent communities; assistance for communities facing climate impacts, especially frontline low income and minority communities; and investments in low-carbon infrastructure—require dedicated funding which could come from carbon revenues, or would require appropriations from Congress.
Getting (back) to bipartisan approaches on climate policy

In recent years, views on climate change have become politicized to the point that climate denial has become a form of tribal identity for most conservative-leaning politicians, and one more instance of the ‘just say no’ approach to any issue championed by the Obama administration.

Given the anti-science rhetoric from many Republicans in Congress, it’s hard to remember that there was a time when climate change was not a partisan issue. There was a time when Senators John McCain and Lindsey Graham and other leading Republicans not only openly accepted climate science but worked hard, together with Democrats, to find bipartisan solutions.

We got tantalizingly close to a national climate policy in the form of the American Clean Energy and Security Act of 2009 (aka the Waxman-Markey bill), which passed the House but was never brought to a Senate vote because of insufficient support. The failure of that legislative effort is what led to the EPA’s Clean Power Plan as an alternative. Regulation was not the first choice of the Democrats or of the Obama administration.

There is lots of blame to go around about how and why bipartisan approaches to addressing climate change have failed thus far. But we don’t have the luxury to wallow in past mistakes; we have to break through the partisan divide and act on climate now.

And that’s why I am particularly encouraged by a proposal from conservatives that attempts to bridge that divide, albeit imperfectly.

The future can be different

Call me a delusional optimist, but I fervently hope that Republicans in Congress will now feel free to acknowledge the reality of climate change because that position will no longer be associated with a Democratic administration. And that they will work to advance solutions that can help meet the urgency of the challenge we face.

Even during the Obama years, there were some who stepped out of the party line, including a group of Republicans who joined the bipartisan Climate Solutions Caucus in the House and those who signed on to the Gibson Resolution.

Yesterday, along with the news of the CLC carbon tax proposal, we also heard news of four new members added to the bipartisan Climate Solutions Caucus. The Caucus now has 12 Republican members and 12 Democratic members.

Maybe these types of bipartisan efforts will grow in strength and size and we will get to a political tipping point on climate action. Maybe climate science and smart solutions can take center stage instead of partisan politics. One can hope this happens soon…

Actually, no. Hope is simply not enough. We need action urgently.

Republicans (and Democrats) must step up

We cannot afford another four years of denial, obstruction, artful lies, and ‘just say no’ politics, aided by fossil fuel interests. Climate impacts are already imposing harms on Americans and a costly burden on our economy. The recent climate data are stunning and sobering.

Meanwhile, solutions like ramping up wind and solar energy are getting cheaper every year, and bring the promise of huge new economic opportunities IF we accelerate the momentum already underway.

Let’s build that clean energy infrastructure and create jobs. Let’s cut pollution from fossil fuels, which causes numerous health problems, including exacerbated asthma in children, other heart and lung ailments, and even premature death. Let’s help coastal communities struggling with flooding worsened by sea level rise.

And let’s put a price on carbon while we’re going about it. There’s nothing partisan about any part of this bright vision for our future.

Still waiting for Republican leadership on climate change

Of course, President Trump must also show leadership from the top. His administration’s threats to dismantle existing climate and energy policies without any clear alternative plan are not a promising start. Thus far, the administration has shown no interest in helping Americans facing the impacts of climate change, or in recognizing the serious consequences of our continued dependence on fossil fuels.

If the president won’t lead, then Congress—including members of his own party—needs to have the courage to hold him accountable and advance their own climate solutions, perhaps along the lines of the CLC proposal.

The future will not be kind to this Congress and this administration if all they do is continue to find new creative ways to deny the science and dodge their responsibility to act on climate. We the people—Democrats, Republicans, and Independents alike—deserve much better from our government.

Electricity Rates Are Sorely Outdated. Let’s Give Them an Upgrade.

UCS Blog - The Equation (text only) -

Last month, to great and enthusiastic email fanfare, my utility presented me with a redesigned electricity bill. One meant to help me better understand the various costs and components that make up the final amount due. In an entirely relatable manner, my household met such news with chortles of joy. What a day!

But the utility’s trick? Colors and a stacked bar chart. They were nice colors, and yet…it proved a letdown. If our electricity bills contained just a bit more of the right information, we could collectively be saving billions of dollars a year, reducing air pollution all around us, and helping to bring ever more renewables online—a true step forward toward our vision of the modern grid. Now tell me that’s not a neat trick.

Shining a light on system costs

So what’s the right information, and how do we get it? Time-varying electricity rates, or rates that go up and down to let us know when it’s costlier and less efficient to be using electricity, and when it’s cheaper and cleaner.

As my colleagues and I explain in a new issue brief, Flipping the Switch for a Cleaner Grid, with that extra information we can make more informed decisions about how and when to use electricity, and save money and clean our air in the process.

Right now, most of us get charged the same flat rate for electricity no matter when we use it. But in reality, the actual cost to the system varies widely over times of day, days of week, and even seasons. These fluctuations in price are driven in large part by the need to meet ever-changing customer demand.

In particular, though we can’t see it with flat rates, our last bits of ill-timed load can mean sky-high prices as the system powers up inefficient plants, which we pay to build and maintain even though we use them for just a small amount of time each year. Talk about a wasteful design. By using price signals to mobilize flexible demand, time-varying rates flip this operations paradigm on its head.

Rates as guides

Time-varying rates use price signals to encourage customers to use electricity at some times and not others. Credit: UCS.

Time-varying rates are designed to encourage customers to alter when and how they use electricity. Different structures go about it in different ways to target different points of inefficiency. The figure on the right shows three of the most common forms: time-of-use (TOU) rates, critical peak pricing (CPP), and real-time pricing.

  • TOU rates (top right) target daily repeating patterns of peak and off-peak periods,
  • CPP rates (middle right) focus on just those few hours a few days a year when electricity use is at its very highest, and
  • Real-time pricing (bottom right) approximates the actual system cost in 5-minute to 1-hour intervals, which allows interested customers to best take advantage of the dynamic up-and-down swings of prices.
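To make these rate structures concrete, here is a minimal billing sketch. The prices, peak window, and usage pattern are all hypothetical, chosen only to illustrate how the same consumption is charged differently under a flat rate versus a simple two-period TOU rate:

```python
# A minimal sketch of flat vs. two-period time-of-use (TOU) billing.
# All prices and usage numbers are hypothetical, for illustration only.
# Prices are in cents/kWh so the arithmetic stays exact.

FLAT_RATE = 15                   # cents/kWh at every hour
TOU_PEAK = 30                    # cents/kWh during the peak window
TOU_OFF_PEAK = 10                # cents/kWh at all other hours
PEAK_HOURS = set(range(16, 21))  # 4 p.m. up to 9 p.m.

def bill_flat(hourly_kwh):
    """Monthly charge in cents under the flat rate."""
    return sum(hourly_kwh) * FLAT_RATE

def bill_tou(hourly_kwh):
    """Monthly charge in cents under the TOU rate (hour 0 = midnight, day 1)."""
    return sum(
        kwh * (TOU_PEAK if h % 24 in PEAK_HOURS else TOU_OFF_PEAK)
        for h, kwh in enumerate(hourly_kwh)
    )

# A 30-day month of perfectly even use: 1 kWh every hour.
even_use = [1] * (24 * 30)
print(bill_flat(even_use) / 100)   # 108.0 dollars
print(bill_tou(even_use) / 100)    # 102.0 -- even use already skews off-peak

# Delay 2 kWh per day (say, the dishwasher) from the peak window to late night:
shifted = list(even_use)
for day in range(30):
    shifted[day * 24 + 17] -= 1    # one less kWh at 5 p.m.
    shifted[day * 24 + 18] -= 1    # one less kWh at 6 p.m.
    shifted[day * 24 + 2] += 2     # two more kWh at 2 a.m.
print(bill_tou(shifted) / 100)     # 90.0 -- same total use, smaller bill
```

The shifted household uses exactly as much electricity as before; only the timing changes, which is the whole point of the price signal.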

Time-based rates are not new; TOU and CPP rates in particular have been around for a long time, especially for commercial and industrial electricity customers. However, it’s only with the recent deployment of tens of millions of smart meters that wide-scale, administratively low-cost programs have become readily attainable at the residential level.

Still, except for a few places where state-wide implementation of time-varying rates is on the table (see California and Massachusetts, for example), most utilities continue to see these rates as a boutique approach.

Put me in, Coach!

Despite their simplicity, time-varying rates can deliver significant benefits for the grid by shepherding lots of individuals into taking small actions at the same time—in aggregate, all these little contributions can add up to major effects. Take a look at the example below, from New England, to get a sense:

New Englanders move as one when the Patriots are in the Super Bowl–namely, to in front of the TV at start time, and into the kitchen at the half. Credit: ISO-NE.

The left panel shows the load curve, or total electricity demand, for a regular winter Sunday in 2012; the right shows Super Bowl Sunday of that year, when New England played New York. Notice the narrowing of the peak and the spikes on the far right of the Super Bowl curve around 6:30, 8, and 10 p.m.? They correspond to the start, half-time, and end of the game, respectively.

Now the half-time spike might look small, but it’s actually in the range of a whole natural gas generator needing to come online. Time-varying rates provide a mechanism for coordinating that type of chip-and-dip-refill fervor in our everyday lives.

In practice, the options for shifting demand run from simple to high-tech. For example, doing something like pressing the “delay start” button on a dishwasher (or just waiting to press start) is an easy, no-upgrades-required fix. On the other hand, some forms of flexibility require a technology intervention before they can be used, like turning water heater tanks—commonly a large residential electricity load—into energy storage devices that heat water during off-peak periods for use whenever needed. Because these resources can be so valuable to the system overall, it can be worth it for utilities to sponsor some of the upgrades themselves.

Excitingly, the recent mass deployment of smart meters means that many new opportunities for shifting electricity use and responding to price signals are beginning to be explored. In particular, innovation around third-party aggregators controlling electricity-dependent devices—from air conditioners to electric vehicles, in ways that are imperceptible to users—could mean even bigger opportunities for savings.

Still, it’s important to look back at that Super Bowl example to remember that it doesn’t actually take much to make a big difference to the grid, and that what we can do today is already a lot.

Fast-tracking our clean energy future

When we talk about the benefits of flexible demand—including those resulting from time-varying rates—we usually focus on the immediate (and persistent) cost savings that occur from not bringing those last costly power plants online. But such benefits are only the beginning of the story. This is especially the case when we consider the needs our grid will have as we race toward a clean energy future supplied by vast amounts of renewable resources.

Time-varying rates can help support a brighter, cleaner, more joyful wind-powered world. Credit: UCS.

Because wind and solar power production is variable, we need ways to fill the gaps when the wind eases or a cloud passes. Additionally, as more and more solar power comes online, the grid can start to run into challenges when the sun sets; solar resources decrease electricity production right around when people are returning home for the night and starting to use lots of electricity.

To manage this variation, we’ve traditionally relied on fossil-fueled power plants. But that reliance comes with a number of strings attached, and often at the expense of renewables, as my colleagues in California have detailed.

Enter flexible demand. If we can guide electricity use to times when our renewable resources are most abundant—and away from when they aren’t—we can take a vitally important step forward on the path to a clean energy future, and make the many and varied goals of our modern, clean grid easier to reach.

Critically, to ensure that access to these benefits is equitable and widespread, it takes a well-designed, well-considered program, as we lay out in our issue brief and as our peers have been diligently monitoring in California.

Think time-varying rates are neat? Take a peek at all the other wonders of an upgraded grid

Here at UCS, we’re working hard to make sure the electricity grid is ready and able to bolster our vision of a clean energy future. Time-varying rates, and their ability to unleash the incredible power of flexible demand, are but one part of this vision. In the time to come, my colleagues and I will be sharing exactly how we see upgrades to the grid enabling this pursuit; for now, though, allow our new video calling for an upgraded grid to brightly shine a light:

Missile Defense Agency to Choose Preferred Location for Third GMD Site

UCS Blog - All Things Nuclear (text only) -

Sometime in early 2017, and it could be any day now, one of the communities on the map below (designated by red dots) will get big news from the Missile Defense Agency (MDA). Congress mandated that the MDA choose a preferred location in case the United States decides to build an additional deployment site for the Ground-based Midcourse Defense (GMD) missile defense system.

The site studies were on track to wrap up at the end of 2016. We’ve updated our fact sheet on it, posted here.

Fig. 1. Sites being studied as a potential third site for the GMD system: Fort Custer Training Center, near Battle Creek, MI; Camp Ravenna Joint Military Training Center, near Akron, OH; and Fort Drum, NY. (Source: Google Earth)

There’s no military requirement for an additional missile defense site. Nor was the idea of building a third site (in addition to the two existing ones in Alaska and California) the result of a rigorous study of what would best improve the system’s ability to intercept ballistic missile threats to the homeland.

But you can count on Congress to run with this idea and push as hard as it can.

Fig. 2. Workers preparing an interceptor in Alaska (Source: Missile Defense Agency)

Every year since 2012, Congress has attempted to earmark money to build such a site, despite Pentagon budgets that never included a dime for it. When asked, missile defense officials have said repeatedly that they have higher priorities for their next dollar. And they are skeptical about what starting this expensive project would do to those priorities in a constrained budget environment, including improving the reliability and effectiveness of the existing system. Improving reliability and effectiveness would be a good thing: the GMD system has been plagued with serious reliability problems and has a poor test record.

However, congressional delegations (with a few exceptions) from Michigan, New York and Ohio have crossed party lines and asked the Missile Defense Agency to support locating the site in their respective states. Their support appears to be largely driven by an interest in creating jobs. Each proposed site is in an economically depressed area, and many in the local communities are understandably eager for an infusion of federal cash to generate new job opportunities.

But is this an effective way to create jobs?

Let’s talk about money. This would be an expensive project. The Congressional Budget Office estimated that a new site would cost at least $3.6 billion to build and operate over the first five years. This includes ground equipment ($1.2 billion); developing the site, building the facilities, and constructing the silos ($1 billion); buying 20 interceptors ($1.3 billion); and operations costs ($100 million). For the full complement of 60 interceptors, it would cost at least $2.6 billion more.

Note, however, that the interceptors would not be built at the new sites, and neither the $1.3 billion for the first 20 interceptors nor money for extra interceptors would be spent locally. For example, Raytheon builds the GMD system’s kill vehicles in a facility outside Tucson, which it recently expanded to increase its capacity. The GMD interceptor’s boosters are also produced primarily in Arizona, at Orbital ATK’s facility outside of Phoenix.

So support for local industry and jobs for constituents may partially explain why Sen. John McCain, who usually provides a healthy dose of skepticism about defense expenditures, has endorsed the plan to build a third site.

Turning back to the potential sites in the Midwest, the above estimates indicate that under this plan, the Pentagon would spend at most about $2.3 billion in the local community. While that sounds enticing, studies show that military spending is not a particularly effective way to generate good-paying jobs. Investing a comparable amount in clean energy technologies, health care, or education is likely to create a much larger number of jobs across all pay ranges.
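The dollar figures above are simple sums, and worth checking. This back-of-the-envelope sketch uses only the CBO numbers quoted in the text, in millions of dollars so the arithmetic stays exact:

```python
# Back-of-the-envelope check of the CBO cost figures quoted above ($ millions).
ground_equipment = 1200        # ground equipment
site_and_silos = 1000          # site development, facilities, and silos
first_20_interceptors = 1300   # 20 interceptors, built in Arizona
operations = 100               # operations over the first five years

first_five_years = (ground_equipment + site_and_silos
                    + first_20_interceptors + operations)
print(first_five_years)        # 3600 -> the CBO's "$3.6 billion"

# Interceptor dollars would not be spent locally, so the most the host
# community could see is the total minus the interceptor buy:
local_spend = first_five_years - first_20_interceptors
print(local_spend)             # 2300 -> "at most about $2.3 billion"
```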

The GMD site studies provided detailed information about what kinds of jobs would be created by building a new site. While it varies from site to site, the estimate is that construction would generate 600 to 800 temporary jobs. A large fraction of those jobs, 15 to 50 percent, could be filled by workers from outside the region, depending on the skills of local residents.

After construction, the site would require an operations staff of 650 to 850 people. About 85 percent of the permanent staff jobs would be filled by workers from elsewhere, because these positions demand specialized expertise.

The facility would indirectly generate a larger number of jobs, mainly low-to-median wage service jobs spurred by the economic activity. During construction, estimates range from 1,800 to 2,300 indirect jobs, while after the facility is completed, an estimated 300 to 400 indirect jobs would remain.

How does that compare to other types of investment?

Investing in wind projects would be a good bet—and both Michigan and New York are among the top 20 states for wind energy potential. As I noted a few years ago, a 2008 study by the National Renewable Energy Laboratory, which looked at the economic impact of building wind turbines in Colorado, estimated that developing 1,000 megawatts of wind-generated power would create 1,700 full-time equivalent jobs (including engineering and manufacturing jobs), and operation and maintenance would provide 300 permanent jobs in rural areas. In a 2013 report, Lawrence Livermore National Laboratory calculated an average cost of building wind power to be $1,940 per kilowatt (and this cost is dropping). So these wind industry jobs would cost an initial outlay of around $2 billion, comparable to the investment in a third GMD site, and would continue to provide a return on investment.
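That wind comparison rests on one multiplication; here is a quick sketch using exactly the figures quoted (1,000 MW of capacity at the 2013 cost of $1,940 per installed kilowatt):

```python
# Check the wind-cost arithmetic quoted above.
capacity_mw = 1000        # 1,000 MW of wind, as in the NREL study
cost_per_kw = 1940        # dollars per installed kilowatt (2013 figure)

total_cost_dollars = capacity_mw * 1000 * cost_per_kw  # convert MW to kW
print(total_cost_dollars / 1e9)  # 1.94 -- roughly the $2 billion cited
```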

For roughly the same amount of money, Hemlock Semiconductor, in Saginaw County, Michigan, created 1,000 new jobs, spending $2.5 billion over five years on manufacturing facilities that produce materials for solar panels.

Building a third GMD missile defense site isn’t the result of a considered study of priorities to strengthen U.S. security, nor is it a sensible next step to improve strategic missile defense capabilities. It is symptomatic of a broader problem with strategic missile defense: Congress is providing neither adequate oversight nor the necessary skepticism.

Regardless, we expect Congress to continue to push for a new site anyway once a preferred site is selected. However, if Congress has an extra few billion dollars available for one of these locations, it is fair to ask that it be spent in a way that provides economic security for the chosen community and a much better return on investment.

Congress is Trying to Protect Federal Scientists Because President Trump Isn’t

UCS Blog - The Equation (text only) -

Today members of the Senate, led by Senator Bill Nelson, introduced a bill to strengthen scientific integrity in federal decision making. If ever there was a time that such a bill is needed, it is now.

Today, members of Congress introduce a bill to strengthen scientific integrity at federal agencies and enhance protections for government scientists. Photo: USDA

The Trump administration has already revealed its disrespect for the use of science in federal decision-making. From instating sweeping gag orders on federal scientists right out of the gate, to across-the-board hiring freezes and disruptive holds on grants and contracts, early indications suggest that this administration is not likely to be a leader in championing scientific integrity in government decision-making.

Moreover, the administration’s pick to lead the EPA, Scott Pruitt, has expressed limited understanding and respect for the EPA’s scientific integrity policy, noting in his confirmation hearing, “I expect to learn more about EPA’s scientific integrity policies.” In the face of such abuses, a move to strengthen scientific integrity at federal agencies is certainly welcome.

A bill to strengthen federal scientific integrity

Aimed “to protect scientific integrity in federal research and policymaking,” the bill requires federal agencies that fund or conduct science to adopt and implement scientific integrity policies, an idea initially introduced by the Obama administration in 2009. Specifically, the bill compels science agencies to develop scientific integrity policies that include specific provisions to enhance scientific integrity.

Importantly, the bill reinforces key elements of some federal agencies’ scientific integrity policies. It includes provisions requiring agencies to develop procedures that allow scientists to review and ensure the accuracy of public-facing materials that rely on their work, such as reports, press releases, and factsheets. This provision could help safeguard against political interference that might come from political appointees or public affairs staff that edit scientific documents before they are released. This type of political interference happened in several instances under the George W. Bush administration. Julie MacDonald, for example, a political appointee at the Department of the Interior, edited the scientific content of a document that served as evidence for listing the sage grouse under the Endangered Species Act.

A safeguard against such abuse could prove useful under a Trump administration, which has already suggested that it will emphasize uncertainty on climate science on NOAA websites and appears to be keeping a tight control on agencies’ scientific communications. The provision could be made even stronger by granting scientists the right to approve the scientific content of the public-facing materials that rely on their work.

Preventing political tampering

Another provision of the bill requires agencies to develop procedures that “identify, evaluate the merits of, and address instances in which the scientific process or the integrity of scientific and technological information may be compromised.” This is an important inclusion since to date, not all scientific integrity policies at federal agencies have detailed procedures for assessing the validity of and addressing allegations of scientific integrity abuses.

This lack of clarity in current agency policies has had damaging impacts on scientists who raise, or are accused of, scientific integrity violations. A scientist at Los Alamos National Laboratory, for example, appeared to have lost his job over publishing a paper that the Department of Energy didn’t like. When a scientist at the US Department of Agriculture was accused of violating the scientific integrity policy, he was subjected to a long review process that may not have included an independent assessment of the claims. Thankfully, both the DOE and USDA have revised their scientific integrity policies to strengthen the allegation evaluation procedures.  A law requiring all science agencies to make allegation procedures clearer would improve evaluation of scientific integrity violations across the government and give federal scientists fairer assessments.

The bill also requires the National Academy of Public Administration to conduct a study of scientific integrity across the government. This is a great idea and one that was included in our recent policy recommendations to the Trump administration. An independent assessment of the effectiveness of scientific integrity policies would provide illuminating findings on how the relatively new policies and procedures could be further improved.

A positive step in uncertain times

To date, 24 federal agencies have developed scientific integrity policies. The policies vary in quality, but in general they afford federal scientists rights to communicate, and include provisions to safeguard against political interference in science-based decisions. The bill would strengthen these provisions by uniformly applying some basic protections across all science agencies. This raises the floor on scientific integrity in the government.

Kudos to all 26 senators co-sponsoring this welcome legislation. They include the ranking members of the Senate Environment and Public Works Committee (Sen. Carper), the Senate Energy and Natural Resources Committee (Sen. Cantwell), the Senate Health, Education, Labor, and Pensions Committee (Sen. Murray), the Senate Armed Services Committee (Sen. Reed), and of course, the Senate Commerce, Science, and Transportation Committee (Sen. Nelson).

Advancing Scientific Integrity Through Federal Advisory Committees

UCS Blog - The Equation (text only) -

Back in October, I provided a comment at a public meeting for a National Academies of Sciences, Engineering, and Medicine (NASEM) advisory committee that was set up to review the process to update the Dietary Guidelines for Americans. Their first charge was to write a report with recommendations on how the Dietary Guidelines Advisory Committee (DGAC) selection process could be improved to provide more transparency, minimize bias, and include committee members with a range of viewpoints.

After some time to assess the DGAC’s process and consider the public feedback they received, the committee released the report last Friday. It includes several important proposals that would be beneficial for the DGAC, and really all federal advisory committees (FACs), to employ. My assessment of the report will come later, but first, I want to talk a little bit more about the importance of FACs, generally.

Quick facts on FACs

FACs play an indispensable role in providing independent science advice for our government’s decision making. The government relies on this technical advice from scientists outside the government on everything from drug approvals to air pollution standards to appropriate pesticide use. There are over 1,000 advisory panels within the federal government, some of which offer technical scientific advice that may be used by agencies to inform key policy decisions. Some advisory committees are mandated by law, while others are created for ad hoc policy guidance. The Federal Advisory Committee Act requires that agencies take measures to ensure transparency and ample public participation, but how, and the degree to which, these requirements are implemented varies by agency.

In our most recent report, “Preserving Scientific Integrity in Federal Policymaking,” we discuss the opportunity to improve the way in which federal agencies obtain objective technical advice from advisory committees so that conflicts of interest are minimized and fully disclosed. Several studies have shown a positive association between authors’ financial conflicts of interest and recommendations that benefit those vested interests. Likewise, an individual on an advisory committee may choose to sideline the evidence and instead make recommendations that favor his or her special interest, especially if they stand to profit in some way. Federal advisory committees have been co-opted by industry for political reasons before, including when G.W. Bush administration officials pushed existing committee members out and replaced them with appointees in order to reject the prospect of stricter lead poisoning standards.

The DGAC plays the essential role of analyzing heaps of nutrition and epidemiological data and making recommendations to the U.S. Department of Agriculture (USDA) and the Department of Health and Human Services (HHS) to inform the Dietary Guidelines for Americans, which are released every five years. As a lover of food and a student of food policy, I rely on the DGAC to translate science into objective recommendations that will ultimately shape federal nutrition guidance and regulations spanning from school lunches to nutrition facts labels. UCS commended the DGAC on its 2015 report to HHS and USDA, most notably for the way in which it followed the science to recommend that Americans consume no more than 10 percent of daily calories from added sugars.

NASEM’s report challenges undue influence of science

The NASEM committee’s report identified five values upfront that would enhance the integrity of the DGAC selection process, which closely echo the core values we identified for ensuring scientific integrity in federal policymaking:

  • Enhance transparency
  • Promote diversity of expertise and experience
  • Support a deliberative process
  • Manage biases and conflicts of interest
  • Adopt state-of-the-art processes and methods

For the reasons I mentioned earlier, the fourth value could use strengthening to something more like “Minimize and manage biases and conflicts of interest,” to emphasize that conflicts should be avoided, if possible, to maximize objectivity.

Figure: NASEM

As for its concrete guidance, the NASEM committee suggested changes to HHS and USDA’s process (see figure at right), including that when the departments first solicit nominations for the DGAC, they should “employ a third party to review nominations for qualified candidates.” This would add a crucial layer of independent review into the process, especially if, as NASEM recommends, the third party is an “organization without a political, economic, or ideological identity,” and not necessarily an expert in nutrition or dietary guidance. The NASEM committee would also add a public comment period after the provisional committee is selected by the departments, allowing an opportunity for the public to weigh in on any potential biases or conflicts of interest of the proposed members. We strongly agree with NASEM’s assertion that “candid information from the public about proposed members is critical for a deliberative process.”

The report also recommended that the departments create and make public strict policies on how to identify and manage conflicts of interest and mandate that committee members sign a form that captures nonfinancial conflicts of interest and biases, since that is not currently covered by the required Office of Government Ethics form. Additionally, the committee elaborated on what “management” of conflicts of interest looks like in practice and had some helpful ideas like granting waivers in limited amounts (and making them public) depending on the type of conflict, asking that individuals sell stock or divest property to avoid conflicts, excluding members with conflicts from certain discussions and voting, or allowing for a review of potential conflicts of interest to be discussed at the beginning of each meeting. The committee also suggested that a statement be added to the final DGAC report to review how biases and conflicts of interest were managed throughout the advisory committee’s work.

Overall, the report managed to cover most of the recommendations I made in my public comment, but one thing that I hope the committee explores in its future deliberations is the prevention of undue influence from department leadership after the DGAC report has been submitted, since that is where the translation of science into policy is most critical. The DGAC is solely advisory and should not have a role in writing the final Dietary Guidelines report, but it would be appropriate for former DGAC members to have a role in peer review, making sure that the report language fairly considers the best available science and aligns with the DGAC’s recommendations. This last part of the process proved controversial in the most recent cycle: the DGAC recommended that environmental sustainability concerns be included in the Dietary Guidelines, because the overall body of evidence points to a dietary pattern higher in plant-based foods and lower in meat, but the final report omitted these important concerns.

NASEM should follow its own advice on conflicts of interest

In light of this report, NASEM should follow its own advice: it considers itself a purveyor of nonpartisan, objective guidance for policymakers, yet it has recently been scrutinized for conflicts of interest on its own panels. This past December, the New York Times reported that NASEM put together a committee of 13 scientists to make recommendations on regulation of the biotechnology industry and failed to disclose the clear conflicts of five of the committee members. In fact, a majority of committee members (7 out of 13) had conflicts, and the NASEM study director was applying for a job at a biotechnology organization while he was assembling his recommendations for committee members. As if that weren’t egregious enough, three of the committee members he recommended for the NASEM biotech panel sat on the board of the very organization at which he was seeking employment. This level of undisclosed conflict is completely inappropriate and should have been caught in the early stages of the committee selection process, not uncovered after the final report had already been released. NASEM should strive to “promote diversity of expertise and experience,” as the committee identified as a core value, rather than stack committees with individuals who have similar industry experience and connections.

Ode to independent science

Independent science at its core must be free from undue political or financial pressure. We of course acknowledge that all policy decisions are not made based on science alone, but in order to create the best possible government policies, the relied-upon science must be independent. We appreciate the work that this committee is putting into advising DGAC on how best to ensure the process facilitates truly objective science advice, because FACs are vulnerable to politicization or interference if not carefully managed. This report should be considered by all federal agencies and other entities, including NASEM itself, that seek to provide scientific advice to policymakers for the benefit of us all.

Restoring America’s Wetland Forest Legacy

Like many white, middle-class, suburban kids, I grew up with one foot in the forest. To me, that small woodlot, a green buffer along a half-polluted tributary, was a paradise unmatched by any other forest in the world. Unfortunately, like many other tracts of land across the United States, my childhood forest is gone—cleared for a housing development.

Wetlands, including wetland forests, are the “filters” of our natural system, combating pollution, removing excess nutrients, and securing fresh drinking water for surrounding and downstream communities. Photo: Dogwood Alliance.

Wetland forests offer massive economic benefits

Even small forests across the United States provide “ecosystem services”—benefits like clean water, clean air, carbon sequestration, hunting, fishing, and yes—recreation for children. Ecosystem services may sound like “lip service” to the natural world, but they’re not. New York City chose to spend $500 million to protect and preserve its upstream watershed (and the resulting water quality) to avoid the $3-5 billion price tag of a new water supply system on the Hudson River. Forests in the U.S. offset about 13% of our yearly carbon emissions. In 2002, southern forests supported over a million jobs in the recreation and tourism sectors, generating $19-76 billion in annual revenue. All of these services require healthy, standing forests across the landscape.

As our country continues to grow, we are increasing the pressures on our forests. We need clean air and clean water, but we also need wood products, food, and housing. As Research Manager at Dogwood Alliance, I work every day with other organizations and communities to improve the quality and quantity of southern forests. Much of my day-to-day is focused on coordinating and organizing a new initiative, the Wetland Forest Initiative, to conserve, restore, and improve southern wetland forests.

Cypress-tupelo forests, also known as bottomland hardwood forests, can contain trees that live for over a thousand years. Photo: Dogwood Alliance

Wetland forests are the best of both worlds. You can visit during a dry season to walk beside ancient trees, or explore during the wet season by kayaking in submerged habitat, teeming with aquatic invertebrates, migratory birds, fish, reptiles, and amphibians. “Wetland forest” describes so much of the American landscape—from forests edging creeks and the culturally treasured bayous; to coastally influenced forests, which somehow survive the onslaught of the ocean. Wetland forests span 35 million acres across 14 southern states, and provide twice the ecosystem services value of upland forests.

Taking action to save our wetlands

Yet, with a majority of wetland forests lost—cleared for agriculture, drained for commercial or residential development, even cut and converted to fast-growing commercial pine plantations—we are at a fork in the road. Will we allow our wetland forests to dwindle to less than one percent of their original range, like we did with longleaf pine? Or will we take action now to conserve these vital ecosystems, before it’s too late?

Wetland forests are home to many endemic species, found nowhere else on earth. This photo was taken during a flyover search for the swallow-tailed kite, a bird native to southern wetland forests. Photo: Maria Whitehead

The Wetland Forest Initiative is working to conserve, restore, and improve these habitats. In special places, we will work to protect the legacy of rare, threatened, and endangered species, ensuring that they will have habitat for decades to come. In places where wetland forests have been degraded by lack of management, changes in hydrology, or pollution, we will work with local groups and governments to restore the land to ecological function. Beyond the tree line, we will work with politicians and government agencies to ensure that landowners receive fair compensation for their restoration and conservation efforts. And perhaps most importantly, we will work with communities to educate them about the beauty and importance of what’s happening on the ground in their local wetland forests.

Although I never thought I would leave academia, I am happy to spend my working hours on a project that has the potential to impact 35 million acres across 14 states. Despite the differences in opinion that some of our member organizations may have, it is inspiring to see so many people from different walks of life (academic, community, environment, forestry, government, land management, landowners, social justice, and tribal) come together and create meaningful change. I am excited for the future of our southern wetland forests.

I encourage you to head over to the Wetland Forest Initiative website to learn more, endorse the platform, and get your organization or university involved.

Sam Davis is Research & Program Manager at Dogwood Alliance. A life-long treehugger, Sam earned a Ph.D. in Environmental Science in 2015 at Wright State University, and completed a postdoc at University of California Merced before leaving academia for greener forests. Sam is thrilled to be translating science into action with Dogwood Alliance. On the weekends, Sam enjoys hiking, home improvement, and gaming with friends and family.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Oregon’s Climate Check-Up Offers Serious Prognosis Without Preventative Action

Each January, I journey to my doctor’s office for my annual physical. She briefly reviews my medical history before conducting an examination, and we end our visit by discussing key risk factors and a plan to manage them.

Well, just in time for the start of the 2017 legislative session, Oregon received its periodic “climate physical.” The results are sobering, and the treatment plan involves further action to put the Beaver State on the path to a low-carbon, climate-resilient economy – a path to good “climate health.”

Oregon, like other states, is already experiencing climate change

The Third Oregon Climate Assessment Report by the Oregon Climate Change Research Institute (OCCRI) incorporates findings from recently published studies on climate science and impacts in Oregon.

Hotter and drier conditions caused by climate change contribute to increased wildfire risks and other key impacts in Oregon. Source: UCS

The legislatively mandated report reaffirms what scientists have been telling us. Oregon is already experiencing the impacts of climate change, and human activity has played a key role. It’s a stark contrast to statements by several of the Trump administration’s cabinet nominees.

According to the authors, global emissions of heat-trapping gases are largely responsible for the overall increase in average annual temperatures in the Pacific Northwest over the past century. (Yes, despite an unusually cold winter, the statewide average temperature for 2016 was still much warmer than average.) They found additional signs of human-caused global warming in the 2015 record-low snowpack, more acidic waters off the Oregon coast in 2013, and wildfire activity over the past three decades.

A future of more extremes in every region of the state

Oregonians will face more severe impacts in the future if we continue on our current global carbon emissions trajectory. As shown in the table below, annual temperatures could increase by an average of 8 degrees by the century’s end compared to the late 20th century.

Average temperatures will continue to rise in Oregon compared to the late 20th century under both low and high emissions pathways. Source: Oregon Climate Change Research Institute

Rising temperatures will mean a shrinking snowpack, earlier snowmelt, and diminished summer water supplies as well as increased wildfires and more acidic oceans that affect coastal ecosystems. Sea-level rise will lead to more coastal flooding and erosion. There also will likely be overall negative impacts to agriculture over time.

The 100+ page report provides detailed information and projections for each of these impacts. One of the most striking findings is that higher temperatures and a record-low snowpack despite normal precipitation levels – the conditions that led to the devastating 2015 snow drought – could become commonplace by mid-century.

Another key takeaway is that climate change will affect every region of Oregon. It will also disproportionately impact tribal communities, as well as low-income and rural residents and communities of color. The assessment divides the state into four regions, with snapshots of anticipated climate impacts over the rest of the century:

  • The Coast: Due to rising sea levels, thousands of homes and more than 100 miles of road face a greater risk of inundation. Warmer and more acidic oceans will affect near-shore fisheries and hatcheries, endangering the local shellfish economy and the workers who rely on that industry. Wildfires in coastal forests will likely become increasingly common as well.
  • The Willamette Valley: Heat waves will grow in frequency and intensity as temperatures continue to climb, increasing heat-related illnesses and deaths among the region’s residents. Studies project increasing summer water scarcity and growing wildfire risks that could significantly expand burn areas.
  • The Cascade Range: Precipitation will increasingly fall as rain instead of snow, affecting the ski industry and water supplies. At the same time, forests will likely become even more vulnerable to wildfire, insect infestations and disease. Increased risk of wildfire-related respiratory illnesses is a key health concern for Jackson County.
  • Eastern Oregon: As snowpack shrinks, water supply will be a concern, especially for residents in the John Day basin with no man-made water storage capacity. Drought is a key health risk for Wasco, Sherman, Gilliam, and Crook counties. The Blue Mountains will also likely experience higher tree mortality and wildfire activity.

Ambitious climate action is the prescription

The Third Oregon Climate Assessment Report includes good news for Oregonians. The worst climate impacts can be avoided through ambitious efforts to curb global carbon emissions.

The Beaver State has already taken significant steps to decarbonize its economy, yet it’s still not on track to meet its near-term 2020 emissions goal. Two key next steps for Oregon are ensuring that any transportation funding package helps reduce global warming emissions from the transportation sector, and putting a price on carbon. A carbon price is an important tool in the overall portfolio of critical policies for cutting heat-trapping pollution.

The Oregon legislature should show continued leadership by heeding the experts’ prognosis and taking further preventative climate action today to ensure its climate health tomorrow!

Standing Up to Pernicious New Attacks on Federal Climate Scientists

The time-tested climate denial strategy of attacking the reputations of prominent climate scientists in order to sow doubt about the evidence and risks of climate change is being trotted out again.

Exhibit A: The Daily Mail, a British tabloid, has published a screed by David Rose alleging serious scientific misconduct by Dr. Tom Karl, a leading climate scientist recently retired from the National Oceanic and Atmospheric Administration (NOAA).

A writer with a history of inaccurate reporting on climate science, Rose claims that Karl and coauthors deliberately used misleading global temperature data, side-stepped NOAA scientific integrity policies, and “rushed to publish” a 2015 paper in the prestigious journal Science in order to influence the climate negotiations held that year in Paris. His piece draws in part on a blog post by former NOAA scientist John Bates.

The Science paper is one of several recent studies refuting the notion that the rate of global warming had slowed down, or “paused”, in recent decades, an idea that opponents of climate policies have often used to justify inaction on reducing emissions. Karl and coauthors showed the apparent “pause” in warming was simply an artifact of how earlier studies had over time incorporated data on ocean surface temperatures from different sources (satellites, ships, buoys and so on); when temperature data sources and quality were properly taken into account, no slowdown was detectable.

Repeated and amplified through the climate denial echo-chamber, Rose’s allegations of misconduct have now been taken up by Rep. Lamar Smith (R-TX), Chairman of the House Science, Space, and Technology Committee. Smith, who has long used his perch to harass NOAA scientists, issued a press release reiterating these unsubstantiated claims and accusing Karl and colleagues of manipulating data for political purposes.

Along with other recent high-profile attacks on prominent climate scientists and science agencies, this may well be part of a larger political strategy to intimidate federal scientists; justify cuts in agency budgets, staffing, and missions; weaken support for US and international climate policies; and, most fundamentally, erode the public trust in science and evidence that is so central to a functioning democracy.

At its core, it is a very old strategy.

As the Irish essayist Jonathan Swift wrote in 1710, “Falsehood flies, and the Truth comes limping after it; so that when Men come to be undeceiv’d, it is too late; the Jest is over, and the Tale has had its Effect…”

But today, scientists are fighting back.

Rose’s claims have been quickly and forcefully rebutted:

  • Top experts on temperature record research have called attention to several errors in Rose’s piece and his failure to mention that multiple independent published analyses support and corroborate the corrected temperature data in the NOAA scientists’ findings.
  • To claims that Karl and colleagues violated NOAA guidelines on scientific integrity, Rear Admiral David Titley (Ret.), former chief operating officer at NOAA, points out that “[t]here is both a NOAA internal process on scientific integrity….and the opportunity to submit allegations of wrongdoing to the Department of Commerce Inspector General who if there is reasonable evidence to substantiate the allegation, would undertake an independent investigation.” Yet no allegations of violations of the NOAA scientific integrity policy were brought to the agency’s scientific integrity office regarding this research.
  • Jeremy Berg, editor of Science, firmly rejects the notion of a “rush to publish”: “The article by Karl et al. underwent handling and review for almost six months [longer than average for this journal]. Any suggestion that the review of this paper was ‘rushed’ is baseless and without merit. Science stands behind its handling of this paper, which underwent particularly rigorous peer review.”

Global land/ocean temperature records from NOAA, NASA, Berkeley Earth, Hadley/UEA, and Cowtan and Way. The old (pre-Karl et al. 2015) NOAA temperature record is only available through the end of 2014. Source: Hausfather et al. (2017), Assessing recent warming using instrumentally homogeneous sea surface temperature records, Science Advances.

Attacks on the reputations and research findings of federal climate scientists are a deplorable attempt to distract attention from the overwhelming evidence of climate change and the urgent need to deeply reduce carbon emissions from the burning of fossil fuels and other sources.

We can’t keep tabloids from publishing misinformation. But we can and must hold elected officials accountable for doing their jobs to protect science and evidence-based decision-making.

As former Congressman and Chair of the House Science Committee Sherry Boehlert (R-NY) puts it: “The current attacks should be received with extreme skepticism, given the enormous body of evidence supporting the conclusion that the climate is changing and poses a danger that needs to be addressed. And public officials have an obligation to follow the scientific consensus…”

Chairman Smith, it’s high time for you to follow suit.

What is Oil Used For? What the Super Bowl Commercial Didn’t Tell You…

A commercial during yesterday’s Super Bowl about oil may have given you pause.

Besides the sports car (about to go off-roading), the commercial was about things you probably don’t associate with oil: graffiti, makeup, prosthetics, a heart, and outer space.

Is oil really diversifying? Or is this ad just a marketing ploy?

Looking at data from the U.S. Energy Information Administration, it is pretty clear that oil and natural gas are still being used overwhelmingly for what they have always been used for—combustion, whether in vehicles or power plants.

The American Petroleum Institute (API) ran the commercial in question. API is the largest oil trade association in the United States; member companies include BP, ConocoPhillips, Chevron, ExxonMobil, and Shell. You may have heard of API for its role in a concerted campaign to spread denial about climate change. API merged with America’s Natural Gas Alliance last fall and now lobbies for both oil and natural gas interests, a merger that came about because major oil companies now hold large natural gas assets.

As a chemist, I know that many consumable products like asphalt, paint, and plastics have oil or natural gas as a precursor ingredient. And while these products have many positive impacts on society, they represent absolutely tiny fractions of the oil and gas industry’s output and should not be used to justify the bulk of its business. Over 90% of oil and gas is used for combustion, either in power plants or vehicles.

Let’s not discount the many benefits energy provides society

But while coal, oil, and natural gas have been our primary sources of energy for many decades, we will not rely on them in the future. We are moving to a world that gets most of our energy from clean, renewable resources like wind and solar. This is in large part because the cleanest sources of energy are becoming the cheapest. Our cars and trucks can plug into that clean grid for their future fueling needs.

There are many chemists exploring ways to make plastics and other products from non-petroleum resources such as plants. This is great work (and tough chemistry) that will lead to a more sustainable world. But if we are going to stop the worst effects of global warming and clean our air, we must remember the most obvious effects oil and natural gas are having on our communities and our world.

We have solutions

While oil may currently play a role in making paint, plastics, or rocket fuel, it doesn’t “gush art,” “pump life,” or “explore space”–that would be artists, doctors, and scientists. And it is artists painting a picture of environmental justice; doctors treating patients suffering from asthma; and scientists discovering clean energy solutions.

The Fate of the Clean Power Plan under President Trump

Shortly, we are likely to see and hear much more about what jurists, Congress, and the new Administration think about the Clean Power Plan, the cornerstone of our nation’s efforts to reduce carbon emissions. Regardless of how the court rules—and how Congress and President Trump respond—there’s no denying the reality of climate change or the many compelling reasons to double down on the clean energy transition already underway.

Imposing limits on carbon pollution would help the President deliver on two campaign promises—to create jobs and protect clean air.

Protesters rallied outside the US Court of Appeals for the District of Columbia Circuit early yesterday as judges prepared to hear arguments on the Clean Power Plan.

Accelerating the clean energy transition

Market trends are already driving a transition to cleaner energy. The costs of wind and solar energy are dropping dramatically, driving new renewable energy deployment that is outpacing all other new energy resources. This transition is delivering huge health and economic benefits to communities around the country.

The Clean Power Plan would lock in those gains and create a framework for continuous improvement, just as the Clean Air Act took on the pollution problems of previous eras (acid rain in the ’70s, soot and smog in the ’80s, and mercury earlier this century). While these pollutants still cause problems, sometimes concentrated in low-income or racially diverse neighborhoods, the CAA required significant pollution prevention measures to be taken. We need to do the same for carbon dioxide and other greenhouse gases.

How Tillerson and Pruitt view US Climate Action

As we wait for the DC Circuit Court of Appeals decision, expected in the near future, we’ll likely see confirmation votes for Rex Tillerson, the former CEO of the world’s largest fossil fuel company, and possibly Scott Pruitt, one of the state attorneys general who sued to have the Clean Power Plan overturned.

On left: Scott Pruitt. On Right: Rex Tillerson

As Secretary of State, Tillerson will be called upon by the foreign ministers of 190 countries to account for how the US plans to meet its commitments under the Paris Agreement. While additional policies to limit harmful global warming emissions beyond the CPP would still be needed to meet US international climate targets, the CPP is the down payment.

Tillerson has said he would like to see the US maintain a ‘seat at the table’ of the climate talks. If the Administration is casting aside cost-effective emission reducing actions like the CPP, he’ll find that seat more than a little warm.

As part of the EPA Administrator confirmation process, Scott Pruitt conceded that carbon is a pollutant subject to Clean Air Act regulation, indicating that the CPP has a strong legal foundation. The Clean Air Act itself, and subsequent elaborations through the 2007 Massachusetts v. EPA Supreme Court decision and the EPA’s 2009 Endangerment Finding, make this absolutely clear.

However, when asked if there was an EPA program or rule he supported, he could not or would not cite a single one—which doesn’t bode well for his leadership of the agency.

The Clean Power Plan is the Clean Air fight of this generation

I’ve had the privilege of working with clean air advocates for 20 years. I’ve heard the stories of how they successfully fought for laws to curb the acid rain that was killing lakes in the Northeast; measures to reduce the soot that settled on cars downwind of Midwest coal plants; tailpipe standards to reduce the smog choking our cities; and limits on the mercury that was contaminating fish in our streams.

The pattern is always the same: scientists study the problem and identify the causes; advocates petition EPA and Congress for action; and industry casts doubt about the science and fights the solutions with claims of economic collapse.

Ultimately, when all legal remedies are exhausted, industry complies at a cost far less than predicted and the promised health improvements from cleaner air are realized. My colleague Rachel Cleetus noted in her blog the benefits of EPA for real people and cited the finding that “over a 20-year period from 1990 to 2010 the Clean Air Act helped drive down total emissions of the six major air pollutants by more than 40 percent while GDP grew more than 64 percent.”

While we are far from having pristine air quality, the science-based process underlying the Clean Air Act ratchets down pollution limits as better information and new cost-effective pollution control technologies become available.

My career has largely been spent trying to get carbon pollution treated the same way as these other pollutants.  Carbon is the pollutant driving the most pressing environmental problem of our generation. Its impacts go beyond typical local and regional air pollution effects, like the aggravation of asthma and other respiratory diseases, to threaten the ‘regulator’ of the planet, the very climate that makes human existence possible.

Climate impacts demand a response

As global average temperatures rise, arctic ice melts, sea levels rise, heat waves become more frequent and longer-lasting, and extreme weather events intensify. Scientists and advocates began calling for action to reduce carbon emissions at least as far back as the early 1990s, hoping to prevent these events from coming to pass. We are now seeing these impacts as our reality; they grow more common with every passing day, leaving little room for doubt that our climate is changing.

The predicted impacts are coming to pass, and despite the doubt continuing to be peddled by the likes of Tillerson and Pruitt, scientists do know—with a great deal of certainty—that burning fossil fuels is the primary cause of those impacts, and they can predict, with ever-improving reliability, what a warmer world would look like.

And it’s not good; it’s not something we can simply “adapt to”; and it’s coming to pass faster than expected.

Both legally and morally, this Administration is compelled to act on clean air and climate. Many local and state governments are fully committed to continuing clean energy and climate progress because it’s good for public health and their local economies, and many businesses will continue to ramp up their clean energy investments because it’s good for their bottom line.

Throwing out the Clean Power Plan won’t bring back coal. Coal is increasingly uneconomic for a variety of reasons, including cheaper alternatives like natural gas and, increasingly, wind and solar. Those market conditions will exist with or without the CPP. That’s why the Trump Administration and Congress must do something real to help miners and coal-dependent communities instead of engaging in meaningless posturing around the CPP. The clean energy transition is good for our health and is one of the fastest-growing job creators. Now we need to make it work for all Americans.

The Clean Power Plan could also prevent us from becoming over-reliant on natural gas. A rush to gas would hit consumers the hardest, due to the price volatility that results from the boom-and-bust cycles of gas exploration. It may be hard for an Oklahoma oil company attorney like Mr. Pruitt to believe, but too much natural gas is bad for the economy and our health.

What’s your climate plan, President Trump?

So the real question is, regardless of how the court rules, what will this Administration do to tackle today’s air pollution crisis: the need to reduce the carbon pollution that is fueling global warming?

The Clean Power Plan did not come about on a whim, and it wasn’t rushed out the door as the Obama Administration was leaving. After decades of inaction by Congress, the EPA crafted the rule over a three-year period that included consultation with scientists, state officials, and power companies, as well as public hearings, and the agency reviewed millions of comments from citizens around the country. As with healthcare, this Administration has an obligation to replace the rule if it intends to repeal it.

Before Pruitt is confirmed, Senators and all Americans are entitled to know, if not the Clean Power Plan, then what? President Trump, how will your Administration address this huge environmental and public health problem?
