Combined UCS Blogs

Nuclear Plant Risk Studies: Then and Now

UCS Blog - All Things Nuclear (text only)

Nuclear plant risk studies (also called probabilistic risk assessments) examine postulated events (earthquakes, pipe ruptures, power losses, fires, and so on) and the array of safety components installed to prevent reactor core damage. Results from nuclear plant risk studies are used to prioritize inspection and testing resources: components with greater risk significance get more attention.

Nuclear plant risk studies are veritable forests of event trees and fault trees. Figure 1 illustrates a simple event tree. The initiating event (A) in this case could be something that reduces the amount of reactor cooling water, such as the rupture of a pipe connected to the reactor vessel. The reactor protection system (B) is designed to detect this situation and immediately shut down the reactor.

Fig. 1. (Source: Nuclear Regulatory Commission)

The event tree branches upward based on the odds of the reactor protection system successfully performing this action and downward for its failure to do so. Two emergency coolant pumps (C and D) can each provide makeup cooling water to the reactor vessel to replenish the lost inventory. Again, the event tree branches upward for the chances of the pumps successfully fulfilling this function and downward for failure.

Finally, post-accident heat removal (E) examines the chances that reactor core cooling can be sustained following the initial response. The column on the right describes the various paths that could be taken for the initiating event. It is assumed that the initiating event happens, so each path starts with A. Paths AE, ACE, and ACD result in reactor core damage. The letters added to the initiating event letter define what additional failure(s) led to reactor core damage. Path AB leads to another event tree, the Anticipated Transient Without Scram (ATWS) event tree, because the reactor protection system failed to cause the immediate shutdown of the reactor and additional mitigating systems are involved.

The overall risk is determined by the sum of the odds of the pathways leading to core damage. The overall risk is typically expressed something like 3.8×10⁻⁵ per reactor-year (3.8E-05 per reactor-year in scientific notation). I tend to take the reciprocal of these risk values. The 3.8E-05 per reactor-year risk, for example, becomes one reactor accident every 26,316 years—the bigger the number, the lower the risk.
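
To make the arithmetic concrete, here is a minimal sketch in Python of how such an event tree is quantified. Every number is an assumption invented for illustration (the NRC figure gives no values); the path structure follows the AE, ACE, and ACD core damage paths described above.

    # Illustrative event-tree quantification; all probabilities are
    # assumptions for this sketch, not values from the NRC figure.
    pA = 1e-3   # initiating event frequency, per reactor-year (assumed)
    pB = 1e-5   # chance the reactor protection system fails (assumed)
    pC = 1e-2   # chance emergency coolant pump C fails (assumed)
    pD = 1e-2   # chance emergency coolant pump D fails (assumed)
    pE = 1e-3   # chance post-accident heat removal fails (assumed)

    # The three core damage paths named above (path AB hands off to the
    # ATWS event tree and is not quantified here):
    f_AE  = pA * (1 - pB) * (1 - pC) * pE       # pumps work, heat removal fails
    f_ACE = pA * (1 - pB) * pC * (1 - pD) * pE  # C fails, D works, heat removal fails
    f_ACD = pA * (1 - pB) * pC * pD             # both pumps fail

    cdf = f_AE + f_ACE + f_ACD                  # overall core damage frequency
    print(f"Core damage frequency: {cdf:.1e} per reactor-year")
    print(f"About one accident every {1 / cdf:,.0f} reactor-years")
    print(f"Check: 1 / 3.8e-5 = {1 / 3.8e-5:,.0f} reactor-years")  # ~26,316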

Fault trees examine the reasons components like the emergency coolant pumps might fail to function. The reasons might include a faulty control switch, an inadequate power supply, failure of a valve in the pump’s suction pipe to open, and so on. The fault trees establish the chances of safety components successfully fulfilling their needed functions; those probabilities determine the likelihood of each event tree path branching upward for success or downward for failure.
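
A fault tree feeding the event tree above can be sketched the same way. Assuming the simplest case—the pump fails if any one of its independent basic events occurs (an OR gate)—the top-event probability is one minus the chance that none of the basic events occur. The basic events and their probabilities below are again invented for illustration.

    # Minimal fault-tree sketch for one emergency coolant pump; the
    # basic events and probabilities are illustrative assumptions.
    basic_events = {
        "faulty control switch":       3e-4,
        "inadequate power supply":     1e-3,
        "suction valve fails to open": 8e-4,
    }

    def or_gate(probabilities):
        """Top-event probability for an OR gate over independent basic
        events: 1 minus the probability that no basic event occurs."""
        p_none = 1.0
        for p in probabilities:
            p_none *= 1.0 - p
        return 1.0 - p_none

    p_pump_fails = or_gate(basic_events.values())
    print(f"Pump fails to function: {p_pump_fails:.1e} per demand")

A result like this would stand in for the assumed pump failure probabilities (pC and pD) in the event tree sketch above.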

Nuclear plant risk studies have been around a long time. For example, the Atomic Energy Commission (forerunner to today’s Nuclear Regulatory Commission and Department of Energy) completed WASH-740 in March 1957 (Fig. 2). I get a kick out of the “Theoretically Possible but Highly Improbable” phrase in its subtitle. Despite major accidents being labeled “Highly Improbable,” the AEC did not release this report publicly until after it was leaked to UCS in 1973, which then made it available. One of the first acts by the newly created Nuclear Regulatory Commission (NRC) in January 1975 was to publicly issue an update to WASH-740. WASH-1400, also called NUREG-75/014 and the Rasmussen Report, was benignly titled “Reactor Safety Study: An Assessment of Accident Risks in U.S. Commercial Nuclear Power Plants.”

Fig. 2. (Source: Atomic Energy Commission)

Nuclear plant risk studies can also be used to evaluate the significance of actual events and conditions. For example, if emergency coolant pump A were discovered to have been broken for six months, analysts can change the chances of this pump successfully fulfilling its safety function to zero and calculate how much the broken component increased the risk of reactor core damage. The risk studies would determine the chances of initiating events occurring during the six months emergency coolant pump A was disabled and the chances that backups or alternates to emergency coolant pump A stepped in to perform that safety function. The NRC uses nuclear plant risk studies to determine when to send a special inspection team to a site following an event or discovery and to characterize the severity level (i.e., green, white, yellow, or red) of violations identified by its inspectors.
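
Continuing the illustrative sketch above, evaluating a discovered condition amounts to re-running the event tree with the broken component’s failure probability set to one and comparing against the baseline over the exposure period. (In the Figure 1 lettering the broken pump corresponds to pump C; all numbers remain assumptions.)

    # Conditional risk sketch: the same illustrative event tree as
    # before, wrapped in a function so one input can be overridden.
    def core_damage_frequency(pA=1e-3, pB=1e-5, pC=1e-2, pD=1e-2, pE=1e-3):
        f_AE  = pA * (1 - pB) * (1 - pC) * pE
        f_ACE = pA * (1 - pB) * pC * (1 - pD) * pE
        f_ACD = pA * (1 - pB) * pC * pD
        return f_AE + f_ACE + f_ACD

    baseline = core_damage_frequency()
    degraded = core_damage_frequency(pC=1.0)  # pump known to be broken

    # Incremental core damage probability over the six months (half a
    # reactor-year) the pump was out of service:
    delta = (degraded - baseline) * 0.5
    print(f"Baseline: {baseline:.1e}, degraded: {degraded:.1e} per reactor-year")
    print(f"Added core damage probability over six months: {delta:.1e}")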

Nuclear Plant Risk Studies: Then

In June 1982, the NRC released NUREG/CR-2497, “Precursors to Potential Severe Core Damage Accidents: 1969-1979, A Status Report,” which reported on the core damage risk from 52 significant events during that 11-year period. The events included the March 1979 meltdown of Three Mile Island Unit 2 (TMI-2), which had a core damage risk of 100%. The effort screened 19,400 licensee event reports submitted to the AEC/NRC over that period, culled out 529 events for detailed review, identified 169 accident precursors, and found 52 of them to be significant from a risk perspective. The TMI-2 event topped the list, with the March 1975 fire at Browns Ferry placing second.

The nuclear industry independently evaluated the 52 significant events reported in NUREG/CR-2497. The industry’s analyses also found the TMI-2 event to have a 100% risk of core damage, but disagreed with all the other NRC risk calculations. Of the top ten significant events, the industry’s calculated risk averaged only 11.8% of the risk calculated by the NRC. In fact, if the TMI-2 meltdown is excluded, the “closest” match was for the 1974 loss of offsite power event at Haddam Neck (CT). The industry’s calculated risk for this event was less than 7% of the NRC’s calculated risk. It goes without saying (but not without typing) that the industry never, ever calculated a risk to be greater than the NRC’s calculation. The industry calculated the risk from the Browns Ferry fire to be less than 1 percent of the risk determined by the NRC—in other words, the NRC’s risk was “only” about 100 times higher than the industry’s risk for this event.

Fig. 3. Based on figures from the June 1982 NRC report. (Source: Union of Concerned Scientists)

Bridging the Risk Gap?

The risk gap from that era can be readily attributed to the immaturity of the risk models and the paucity of data. In the decades since these early risk studies, the risk models have become more sophisticated and the volume of operating experience has grown exponentially.

For example, the NRC issued Generic Letter 88-20, “Individual Plant Examination for Severe Accident Vulnerabilities.” In response, owners developed plant-specific risk studies. The NRC issued documents like NUREG/CR-2815, “Probabilistic Safety Analysis Procedures Guide,” to convey its expectations for risk models. And the NRC issued a suite of guidance documents like Regulatory Guide 1.174, “An Approach for Using Probabilistic Risk Assessment in Risk-Informed Decisions on Plant-Specific Changes to the Licensing Basis.” This is but a tiny sampling of the many documents issued by the NRC about how to conduct nuclear plant risk studies—guidance that simply was not available when the early risk studies were performed.

Complementing the maturation of nuclear plant risk studies is the massive expansion of available data on component performance and human reliability. Event trees begin with initiating events—the NRC has extensively sliced and diced initiating event frequencies. Fault trees focus on performance at the component and system level, so the NRC has collected and published extensive operating experience on component performance and system reliability. And the NRC compiled data on reactor operating times to be able to develop failure rates from the component and system data.

Given the sophistication of current risk models compared to the first-generation risk studies and the fuller libraries of operating reactor information, you would probably think that the gap between risks calculated by industry and the NRC has narrowed significantly.

Except for being absolutely wrong, you would be entirely right.

Nuclear Plant Risk Studies: Now

Since 2000, the NRC has used nuclear plant risk studies to establish the significance of violations of regulatory requirements, with the results determining whether a green, white, yellow, or red finding gets issued. UCS examined ten of the yellow and red findings determined by the NRC since 2000. The “closest” match between NRC and industry risk assessments was for the 2005 violation at Palo Verde (AZ), where workers routinely emptied water from the suction pipes for emergency core cooling pumps. The industry’s calculated risk for that event was 50% (half) of the NRC’s calculated risk—meaning the NRC viewed the risk as double what the industry calculated. And that was the closest that the risk viewpoints came. Of these ten significant violations, the industry’s calculated risk averaged only 12.7% of the risk calculated by the NRC. In other words, the risk gap narrowed only a smidgen over the decades.

Fig. 4. Ratios for events after 2000. (Source: Union of Concerned Scientists)

Risk-Deformed Regulation?

For decades, the NRC has consistently calculated nuclear plant risks to be about 10 times greater than the risks calculated by industry. Nuclear plant risk studies are analytical tools whose results inform safety decision-making. Speedometers, thermometers, and scales are also analytical tools whose results inform safety decision-making. But a speedometer that reads one-tenth of the speed recorded by a traffic cop’s radar gun, a thermometer that shows a child to have a temperature one-tenth of her actual temperature, or a scale that measures one-tenth of the actual amount of chemical to be mixed into a prescription pill is an unreliable tool that could not continue to be used to make responsible safety decisions.

Yet the NRC and the nuclear industry continue to use risk studies that clearly have significantly different scales.

On May 6, 1975, NRC Technical Advisor Stephen H. Hanauer wrote a memo to Guy A. Arlotto, the NRC’s Assistant Director for Safety and Materials Protection Standards. The second paragraph of this two-paragraph memo expressed Dr. Hanauer’s candid view of nuclear plant risk studies: “You can make probabilistic numbers prove anything, by which I mean that probabilistic numbers ‘prove’ nothing.”

Oddly enough, the chronic risk gap has proven the late Dr. Hanauer totally correct in his assessment of the value of nuclear plant risk studies. When risk models permit users to derive results that don’t reside in the same zip code, let alone the same ballpark, the results prove nothing.

The NRC must close the risk gap, or jettison the process that proves nothing about risks.

START from the Beginning: 25 Years of US-Russian Nuclear Weapons Reductions

UCS Blog - All Things Nuclear (text only)

For the past 25 years, a series of treaties have allowed the US and Russia to greatly reduce their nuclear arsenals—from well over 10,000 each to fewer than 2,000 deployed long-range weapons each.  These Strategic Arms Reduction Treaties (START) have enhanced US security by reducing the nuclear threat, providing valuable information about Russia’s nuclear arsenal, and improving predictability and stability in the US-Russia strategic relationship.

US and Russian team members shake hands before a Strategic Arms Reduction Treaty inspection visit in 2009. START established an in-depth verification regime, including boots-on-the-ground inspections that provided unprecedented levels of data exchange and transparency. Photo: U.S. Air Force/Christopher Hubenthal

Twenty-five years ago, US policy-makers of both parties recognized the benefits of the first START agreement: on October 1, 1992, the Senate voted overwhelmingly—93 to 6—in favor of ratifying START I.

The end of START?

With increased tensions between the US and Russia and an expanded range of security threats for the US to worry about, this longstanding foundation is now more valuable than ever.

The most recent agreement—New START—will expire in early February 2021, but can be extended for another five years if the US and Russian presidents agree to do so. In a January 28 phone call with President Trump, Russian President Putin reportedly raised the possibility of extending the treaty. But instead of being extended, or even maintained, the START framework is now in danger of being abandoned.

President Trump has called New START “one-sided” and “a bad deal,” and has even suggested the US might withdraw from the treaty. His advisors are clearly opposed to doing so. Secretary of State Rex Tillerson expressed support for New START in his confirmation hearing. Secretary of Defense James Mattis, while recently stating that the administration is currently reviewing the treaty “to determine whether it’s a good idea,” has previously also expressed support, as have the head of US Strategic Command and other military officials.

Withdrawal seems unlikely, but unless Mattis and other military officials push the president hard, so does an extension. Worse, even if Trump is not re-elected and the incoming president is more supportive of the treaty, there will be little time for a new administration, taking office in late January 2021, to do an assessment and sign on to an extension before the deadline. While UCS and other treaty supporters will urge the incoming administration to act quickly, if the Trump administration does not extend the treaty, it is quite possible that New START—and the security benefits it provides—will lapse.

The Beginning: The Basics and Benefits of START I

Today, the overwhelming bipartisan support for a treaty cutting US nuclear weapons demonstrated by the START I ratification vote seems unbelievable. At the time, however, both Democrats and Republicans in Congress, as well as the first President Bush, recognized the importance of the historic agreement, the first to require an actual reduction, rather than simply a limitation, in the number of US and Russian strategic nuclear weapons.

By the end of the Cold War, the US had about 23,000 nuclear warheads in its arsenal, and the Soviet Union had roughly 40,000. These numbers included about 12,000 US and 11,000 Soviet deployed strategic warheads—those mounted on long-range missiles and bombers. The treaty limited each country to 1,600 strategic missiles and bombers and 6,000 warheads, and established procedures for verifying these limits.

The limits on missiles and bombers, in addition to limits on the warheads themselves, were significant because START required the verifiable destruction of any excess delivery vehicles, which gave each side confidence that the reductions could not be quickly or easily reversed. To do this, the treaty established a robust verification regime with an unprecedented level of intrusiveness, including on-site inspections and exchanges of data about missile telemetry.

Though the groundwork for START I was laid during the Reagan administration, ratification and implementation took place during the first President Bush’s term. The treaty was one among several measures taken by the elder Bush that reduced the US nuclear stockpile by nearly 50 percent during his time in office.

START I entered into force in 1994 and had a 15-year lifetime; it required the US and Russia to complete reductions by 2001, and maintain those reductions until 2009. However, both countries actually continued reductions after reaching the START I limits. By the end of the Bush I administration, the US had already reduced its arsenal to just over 7,000 deployed strategic warheads. By the time the treaty expired, this number had fallen to roughly 3,900.

The Legacy of START I

Building on the success of START I, the US and Russia negotiated a follow-on treaty—START II—that required further cuts in deployed strategic weapons. These reductions were to be carried out in two steps, but when fully implemented would limit each country to 3,500 deployed strategic warheads, with no more than 1,750 of these on submarine-launched ballistic missiles.

Phase II also required the complete elimination of multiple independently targetable re-entry vehicles (MIRVs) on intercontinental ballistic missiles. This marked a major step forward, because MIRVs were a particularly destabilizing configuration. Since just one incoming warhead could destroy all the warheads on a MIRVed land-based missile, MIRVs create pressure to “use them or lose them”—an incentive to strike first in a crisis. Otherwise, a country risked losing its ability to use those missiles to retaliate in the case of a first strike against it.

While both sides ratified START II, it was a long and contentious process, and entry into force was complicated by provisions attached by both the US Senate and Russian Duma. The US withdrawal from the Anti-Ballistic Missile (ABM) treaty in 2002 was the kiss of death for START II. The ABM treaty had strictly limited missile defenses. Removing this limit created a situation in which either side might feel it had to deploy more and more weapons to be sure it could overcome the other’s defense. But the George W. Bush administration was now committed to building a larger-scale defense, regardless of Russia’s vocal opposition and clear statements that doing so would undermine arms control progress.

Russia responded by announcing its withdrawal from START II, finally ending efforts to bring the treaty into force. A proposed START III treaty, which would have called for further reductions to 2,000 to 2,500 warheads on each side, never materialized; negotiations had been planned to begin after entry into force of START II.

After the failure of START II, the US and Russia negotiated the Strategic Offensive Reductions Treaty (SORT, often called the “Moscow Treaty”). SORT required each party to reduce to 1,700 to 2,200 deployed strategic warheads, but was a much less formal treaty than START. It did not include the same kind of extensive verification regime and, in fact, did not even define what was considered a “strategic warhead,” instead leaving each party to decide for itself what it would count. This meant that although SORT did encourage further progress to lower numbers of weapons, overall it did not provide the same kind of benefits for the US as START had.

New START

Recognizing the deficiencies of the minimal SORT agreement, the Obama administration made negotiation of New START an early priority, and the treaty was ratified in 2010.

New START limits each party to 1,550 deployed strategic nuclear warheads by February 2018. The treaty also limits the number of deployed intercontinental ballistic missiles, submarine-launched ballistic missiles, and long-range bombers equipped to carry nuclear weapons to no more than 700 on each side. Altogether, no more than 800 deployed and non-deployed missiles and bombers are allowed for each side.

In reality, each country will deploy somewhat more than 1,550 warheads—probably around 1,800 each—because of a change in the way New START counts warheads carried by long-range bombers. START I assigned a number of warheads to each bomber based on its capabilities. New START simply counts each long-range bomber as a single warhead, regardless of the actual number it does or could carry. The less stringent limits on bombers are possible because bombers are considered less destabilizing than missiles. The bombers’ detectability and long flight times—measured in hours vs. the roughly thirty minutes it takes for a missile to fly between the United States and Russia—mean that neither side is likely to use them to launch a first strike.

Both the United States and Russia have been moving toward compliance with the New START limits, and as of July 1, 2017—when the most recent official exchange of data took place—both are under the limit for deployed strategic delivery vehicles and close to meeting the limit for deployed and non-deployed strategic delivery vehicles. The data show that the United States is currently slightly under the limit for deployed strategic warheads, at 1,411, while Russia, with 1,765, still has some cuts to make to reach this limit.

Even in the increasingly partisan atmosphere of the 2000s, New START gained support from a wide range of senators, as well as military leaders and national security experts. The treaty passed in the Senate with a vote of 71 to 26; thirteen Republicans joined all Democratic senators in voting in favor. While this is significantly closer than the START I vote, as then-Senator John F. Kerry noted at the time, “in today’s Senate, 70 votes is yesterday’s 95.”

And the treaty continues to have strong support—including from Air Force General John Hyten, commander of US Strategic Command, which is responsible for all US nuclear forces. In Congressional testimony earlier this year, Hyten called himself “a big supporter” of New START and said that “when it comes to nuclear weapons and nuclear capabilities, that bilateral, verifiable arms control agreements are essential to our ability to provide an effective deterrent.” Another Air Force general, Paul Selva, vice chair of the Joint Chiefs of Staff, agreed, saying in the same hearing that when New START was ratified in 2010, “the Joint Chiefs reviewed the components of the treaty—and endorsed it. It is a bilateral, verifiable agreement that gives us some degree of predictability on what our potential adversaries look like.”

The military understands the benefits of New START. That President Trump has the power to withdraw from the treaty despite support from those who are most directly affected by it is, as he would say, “SAD.”

That the US president fails to understand the value of US-Russian nuclear weapon treaties that have helped to maintain stability for more than two decades is a travesty.

North Korea’s Next Test?

UCS Blog - All Things Nuclear (text only)

North Korean Foreign Minister Ri Yong Ho warned reporters in New York that his country may place a live nuclear warhead on one of its missiles, launch it, and then detonate the bomb in the open air.

It would not be the first time a country conducted such a test. The Soviet Union tried and failed in 1956. The United States was successful in 1962. But perhaps the most relevant historical precedent is the Chinese test in 1966.

 

An excerpt from 东方巨响, a documentary film on the history of China’s nuclear weapons program produced by China’s People’s Liberation Army and released in 1999.

 

China’s Choice

At the time China was nearly as isolated as North Korea is today. The Soviet Union was no longer an ally but an adversary, massing military forces along China’s northern border. The United States kept the People’s Republic out of the United Nations and encircled its eastern coast with military bases in Japan, South Korea, the Republic of China on Taiwan, the Philippines, Australia and New Zealand. Despite relentless Chinese propaganda proclaiming invincible revolutionary strength, China’s leaders felt extraordinarily insecure in the face of mounting Soviet and US pressure.

China set off its first nuclear explosion in October of 1964 and proved it could deliver a militarily useful nuclear weapon with a bomber less than a year later. But the Chinese leadership still felt a need to demonstrate it could launch a nuclear-armed missile and detonate it near a target hundreds of kilometers away. Only then could Chinese leaders feel confident they had introduced the possibility of nuclear retaliation into the minds of US and Soviet officials considering a first strike. Chinese Marshal Nie Rongzhen, who led China’s nuclear weapons program and directed the test, summed up Chinese thinking in his memoir.

Mating an atomic bomb to a missile and conducting a real swords and spears test required facing very great risks. If the missile exploded at the launch site, if it fell in the middle of its flight or if it strayed out of the target area there would be unthinkable consequences. But I was deeply confident in our scientists, in our engineers and in our comrades working at the bases, who all possessed a spirit of high responsibility. Our research and design work was thorough and the medium-range missile we developed was reliable, with a highly successful launch rate. But more than that, in order to show our missiles were genuinely a weapon of great power that could be used in war we had to conduct this test of them together.

North Korea’s Choice

It is impossible to know if the individuals leading North Korea’s nuclear weapons program have the same degree of confidence in their technology and their personnel.  But it is not hard to believe they feel the same urgent need to prove North Korea has a useable nuclear weapon, especially in the face of continuing US doubts. China’s expansive land mass allowed its leaders to conduct their test in a way that only put their own people at risk. But tiny North Korea must send its nuclear-armed missile out into the Pacific Ocean on a trajectory that would fly over Japan. If a failed North Korean test were to impact Japan it could precipitate a large-scale war in North-East Asia that could kill a million people on the first day.

Hopefully, avoiding that horrible outcome is the top priority of the North Koreans contemplating the test and the Americans considering responses. Kim and his cadres might feel less inclined to risk the test if they were convinced President Trump and his national security team were already genuinely worried about the possibility of North Korean nuclear retaliation. Unfortunately, that’s an assurance Washington is unlikely to give Pyongyang. It still hasn’t given it to Beijing. US unwillingness to take the option of a first strike off the table, combined with demonstrations of resolve like the provocative flight of B-1 bombers out of Guam and F-15 fighters out of Okinawa, could tip North Korean scales in favor of conducting the test.

Critical Differences

Chairman Mao didn’t worship nuclear weapons. He famously disparaged the atomic bomb as a paper tiger. Mao believed nuclear weapons were too destructive to use in a war. Their only value was in vitiating nuclear threats against China with the fear of potential retaliation. Does Kim Jong-un think about nuclear weapons the same way? We don’t know, because we don’t talk to the North Koreans enough to understand their point of view or trust anything they say.

China went on to develop a very limited nuclear force calibrated to maintain a credible possibility of nuclear retaliation. The United States government not only never panicked, it found a way to develop a viable relationship with the nuclear-armed communist giant. By the time China first tested an ICBM capable of reaching the United States, reforms within China made it appear even less threatening. Profound US discomfort with China’s nuclear force remains, but the two sides have managed to not only avoid a war but to develop robust and mutually beneficial ties.

North Korea may seem too small, its culture too parochial, to make dialog and cooperation as appealing to the United States as Nixon’s opening to China in 1972—just six years after China’s daring nuclear-armed missile test. It is hard for the nation of 24 million with a GDP the size of Jackson, Mississippi’s, to command the same respect as China’s 1.3 billion. Perhaps the North Korean leadership sees nuclear weapons as a great equalizer: a viable means to force the United States to sign a peace treaty, and, as one North Korean student recently told a US reporter, “leave us alone.”

The US Choice

Ri told the United Nations that the “ultimate goal” of his country’s nuclear weapons program was to “establish a balance of power with the United States.” It is worth exploring what that means, and bilateral dialog is the only way to do that.

There is no indication North Korea will agree to denuclearize unless the United States agrees to join them. The US must decide whether the risks of continuing to rely solely on pressuring North Korea, at the cost of Pyongyang’s ever more provocative demonstrations of its capability to harm the United States, are more likely to yield an acceptable outcome than the risks of engaging the North Koreans in a discussion of what might be required to make their nuclear weapons program less threatening to the United States and its allies. The most immediate choice is whether continuing to introduce ambiguity about pre-emptive US military action is worth provoking the test flight of a nuclear-armed missile over Japan.

In the Chinese case the United States came to tolerate its nuclear weapons program in the context of broader shifts in the international security environment that encouraged a bilateral rapprochement, even though the fundamental security problem – Chinese reunification and the status of the Republic of China on Taiwan – remained unresolved. The initial impetus for reestablishing relations was a shared concern about a mutual adversary, the Soviet Union. But the relationship managed to outlive the Soviet Union’s collapse. Tensions within the US-China security relationship have slowly intensified in the post-Cold War period and the United States is still unwilling to accept its vulnerability to Chinese nuclear retaliation. Yet both sides, for the time being, do not seem overly concerned about the risk of a nuclear confrontation.

Despite their volatility, Donald Trump and Kim Jong-un could find the basis for a US-North Korean rapprochement in their shared concern about an accidental nuclear war, or the outbreak of a conventional confrontation that would cause great harm to both nations. Talking about stopping a risky test of a nuclear-armed missile that would fly over Japan is a good place to start.

China is urging both sides to come to the table.

 

Will Scott Pruitt Tap Polluter-Friendly Scientists for Key Advisory Panel?

UCS Blog - The Equation (text only)


A third of the Environmental Protection Agency’s Science Advisory Board, an influential panel that reviews the science the agency uses in formulating safeguards, could be replaced by climate science-denying, polluter-friendly nominees when current members’ terms expire at the end of this month.

The board, which has been in existence for nearly 40 years, is traditionally populated by bona fide scientists from academia, government, and industry who volunteer to serve three-year terms. This time around, as first reported by E&E News, at least a dozen of the 132 candidates vying for one of the 15 open seats reject mainstream climate science. But that’s not all. There are at least 10 other equally inappropriate candidates on the list, and not all of them are scientists, despite the fact that it’s supposed to be a panel of science advisers.

Among the 12 climate science deniers are Weather Channel co-founder Joseph D’Aleo, who wrongly claims global warming is due to natural oceanic, solar, and volcanic cycles; and former Peabody Energy science director Craig Idso, now chairman of his family’s Center for the Study of Carbon Dioxide and Global Change, who insists “there is no compelling reason to believe that the rise in [average earth] temperature was caused by the rise in carbon dioxide.” D’Aleo, Idso, and six of the other climate-fact-challenged candidates are affiliated with the fossil fuel industry-funded Heartland Institute, which has a long history of misrepresenting science.

The other 10 unsuitable candidates consistently side with industry when it comes to protecting the public from toxic hazards, regardless of the scientific evidence, and falsely accuse the EPA of being unscientific to try to undermine its credibility.

Soot makes you live longer

One of the 10, toxicologist Michael Honeycutt, failed to secure a seat on the EPA’s seven-member Clean Air Scientific Advisory Committee when he was nominated for one last fall—with good reason. Over the last decade, Honeycutt, who heads the toxicology division of the Texas Commission on Environmental Quality, rolled back the state’s relatively weak protections for 45 toxic chemicals, including arsenic, benzene, formaldehyde, and hexavalent chromium, the carcinogen that made Erin Brockovich a household name.

Honeycutt also has attacked EPA rules for ground-level ozone (smog), which aggravates lung diseases, and particulate matter (PM) (soot), which has been linked to lung cancer, cardiovascular damage, reproductive problems, and premature death. In October 2014, Honeycutt argued that there would be “little to no public health benefit from lowering the current [ozone] standard” because “most people spend more than 90 percent of their time indoors” and “systems such as air conditioning remove it from indoor air.” And despite the overwhelming scientific evidence directly linking fine soot particles to premature death, Honeycutt testified before Congress in June 2012 that “some studies even suggest PM makes you live longer.”

Better living through chemistry

Another industry-friendly nominee, Kimberly White, is senior director of chemical products at the American Chemistry Council (ACC), the country’s largest chemical manufacturing trade association. Representing the interests of 155 corporate members, including chemical companies Dow, DuPont, and Olin; pharmaceutical firms Bayer, Eli Lilly, and Merck; and petrochemical conglomerates BP, ExxonMobil, and Shell, the ACC has delayed, weakened, and blocked science-based health, environmental, and workplace protections at the state, national, and even international levels.

For example, the ACC has lobbied against establishing federal rules on silica dust exposure and disclosing the chemicals used in hydraulic fracturing. It has been instrumental in limiting community access to information about local chemical plants. And it has played a key role in quashing government efforts to regulate bisphenol A (BPA), an endocrine-disrupting chemical used in plastics and can linings; flame retardants, which have been linked to birth defects and cancer; and formaldehyde, a known carcinogen. White downplayed formaldehyde’s risks in a September 2016 blog on the ACC website.

The ACC also lobbies to weaken existing environmental safeguards. In written testimony for a House Science, Space and Technology Committee hearing last February, for example, White charged that the EPA uses irrelevant or outdated data and procedures when drafting new regulations.

Who needs a cleaner environment?

Finally, three of the pro-polluter candidates are economists with a distinct corporate tilt: Richard Belzer, whose clients include the American Chemistry Council and ExxonMobil Biomedical Sciences; Tony Cox, whose clients include the American Petroleum Institute, Chemical Manufacturers Association, and Monsanto; and John D. Graham, dean of Indiana University’s School of Public and Environmental Affairs, who is currently doing contract work for the Alliance of Automobile Manufacturers on fuel economy standards and the libertarian Searle Freedom Trust on regulatory “reform.” All three emphasize the cost to industry to reduce pollution, discount scientific evidence of the risk of exposure, and ignore the benefits of a cleaner environment.

Perhaps the best known is Graham, who ran the Office of Management and Budget’s (OMB) Office of Information and Regulatory Affairs (OIRA) for five years during the George W. Bush administration. His appointment to that position was hotly contested because in his previous job, directing the Harvard Center for Risk Analysis, he routinely understated the dangers of products manufactured by the center’s corporate sponsors by using questionable cost-benefit analyses.

As predicted, Graham applied that same simplistic, industry-friendly calculus at OIRA, which oversees all government rulemaking, and at the tail end of his tenure in 2006, he unsuccessfully attempted to standardize risk assessments across all federal agencies. Public interest groups and the scientific community, spearheaded by the American Association for the Advancement of Science, came out in full force against the idea, and a National Research Council (NRC) committee unanimously rejected it as “fundamentally flawed.”

“Economists like Graham are frustrated because the EPA has been conservative about risk,” said Center for Progressive Reform co-founder Rena Steinzor, who wrote a stinging indictment of Graham’s government-wide proposal in a May 2006 issue of Inside EPA’s Risk Policy Report. “The EPA gives more margin to safety. That drives economists crazy. They think it leads to over-protection. But there are not many examples of chemicals that turn out to be less harmful than we thought.”

Foxes advising the foxes in the henhouse?

Putting climate science deniers and industry apologists on the EPA Science Advisory Board (SAB) would not only undercut the panel’s legitimacy, it also would provide cover for the corporate shills now in key positions at the agency, starting with Administrator Scott Pruitt, who has the final say on who is selected, and Nancy Beck, a deputy assistant administrator who most recently worked for the American Chemistry Council, and before that, for Graham at OMB.

“The Science Advisory Board has been providing independent advice to the EPA for decades, ensuring that the agency uses the best science to protect public health and the environment,” said Genna Reed, a policy analyst at the Union of Concerned Scientists. “SAB members have always been eminent scientists who are committed to the often-challenging public service of working through complex scientific topics to help guide EPA decision-making. They are the EPA’s scientific compass. The agency’s mission to safeguard our air and water will be further compromised if Administrator Pruitt winds up selecting these unacceptable candidates.”

Get involved! Submit a comment to EPA by Thursday!

You can submit comments about the EPA Scientific Advisory Board nominees by email to Designated Federal Officer Thomas Carpenter no later than close of business on Thursday, September 28, at carpenter.thomas@epa.gov. (Note that public comments are subject to release under the Freedom of Information Act.)

Tell the EPA that the following candidates are unacceptable for the Science Advisory Board:

Climate-science-denier nominees: Edwin Berry, Alan Carlin, Joseph D’Aleo, Kevin Dayaratna, Paul Driessen, Gordon Fulks, Craig Idso, Richard Keen, David Legates, Anthony Lupo, David Stevenson, and H. Leighton Steward.

Pro-polluter nominees: Richard Belzer, James Bus, Samuel Cohen, Tony Cox, James Enstrom, John D. Graham, Michael Honeycutt, Walt Hufford, James Klaunig and Kimberly White.

Who Not to Pick for the EPA’s Science Advisory Board

UCS Blog - The Equation (text only)

In its effort to fill fifteen positions on the esteemed EPA Science Advisory Board (SAB), the EPA has posted a list of 132 nominees. The SAB is a group of over forty scientists, experts in a range of disciplines, who provide peer review and expert advice on EPA issue areas.

While many of the nominees are highly qualified and distinguished in their fields, there are a handful of individuals who are extremely concerning due to their direct financial conflicts, their lack of experience, and/or their historical opposition to the work of the EPA in advancing its mission to protect public health and the environment.

The SAB was established by the Environmental Research, Development, and Demonstration Authorization Act of 1978 and operates as a federal advisory committee under the Federal Advisory Committee Act of 1972. Note that board members should be experts in their fields, with the training and experience to evaluate EPA-relevant scientific and technical matters. Source: U.S. GPO

Many of these concerning individuals were nominated by the Heartland Institute—an organization that has actively worked to sow doubt about climate change science—and have the seal of approval of Trump EPA transition team member and Heartland staffer Steve Milloy. When interviewed about some of the names on the nominee list, Milloy said that he is glad that EPA administrator Scott Pruitt is in office since he’ll be brave enough to reconstitute the SAB. A “thumbs up” from Milloy is an immediate red flag for me.

My colleague Andrew Rosenberg categorized questionable political appointees into three distinct buckets: the conflicted, the opposed, and the unqualified. The same can be said of nominees for the SAB. You don’t have to dig too deep to find individuals who may appear to be qualified on paper but have a track record of undermining the work of the EPA and advancing policies that benefit special interests over the general public. Appointing these individuals to the SAB would be in direct opposition to the critical work of the SAB itself and to the EPA’s mission.

Take Dr. Michael Honeycutt, lead toxicologist at the Texas Commission on Environmental Quality (TCEQ), for example. Industry representatives, including at the American Chemistry Council, ExxonMobil, and the Texas Oil and Gas Association, launched a campaign to get Honeycutt appointed to the EPA’s Clean Air Scientific Advisory Committee (CASAC) in 2016, which fortunately was unsuccessful. Now Honeycutt’s name is on the list for the SAB.

He co-authored an article in 2015 that argued that available science did not support the EPA’s assertion that tighter ozone standards would provide significant public health benefits. In criticizing the scientific studies used by the EPA, Honeycutt has cherrypicked studies to exaggerate uncertainty about the risks of ozone pollution, including making hay of the argument that ozone pollution isn’t a huge issue because “most people spend more than 90 percent of their time indoors,” which has been picked up and spouted off by climate deniers like Michael Fumento.

Honeycutt has also served on the steering committee of the Alliance for Risk Assessment (ARA), along with President Trump’s nominee to head the Office of Chemical Safety and Pollution Prevention, Michael Dourson. The ARA was created by Toxicology Excellence for Risk Assessment (TERA), an organization founded by Dourson that does research for industry and maintains a database of risk assessments.

According to its website, about a third of TERA’s funding comes from the private sector, including the American Chemistry Council and Coca-Cola. Rena Steinzor, professor at the University of Maryland School of Law, has accused TERA of “whitewashing the work of industry.” The TCEQ awarded TERA at least $700,000 in contracts between 2010 and 2014. As a steering committee member, Honeycutt oversaw ARA scientific reviews of TCEQ work. While Honeycutt claims that he recused himself from those projects, the quagmire of ties between TCEQ, ARA, and TERA is hard to dispute, especially when you consider that during those same years, the TCEQ loosened two-thirds of the already-weak protections for the 45 chemicals it chose to reassess between 2007 and 2014. In 2013, the TCEQ paid $1.6 million to another industry-friendly consulting firm, Gradient, to review the EPA’s science on ozone.

Honeycutt has spent his career at TCEQ politicizing the EPA and actively working to obstruct science used to inform important standards at the agency, so it seems out of character for him to want so badly to be a member of an EPA science advisory committee. Unless, of course, he is interested in the platform or the ability to provide formal advice to his personal friend, Michael Dourson.

What does Honeycutt have in common with fellow nominee Dr. John Graham? Under Graham’s leadership in January 2006, the White House Office of Management and Budget (OMB) released a proposed Risk Assessment Bulletin which would have covered any scientific or technical document assessing human health or environmental risks.

OMB asked the National Academy of Sciences’ National Research Council (NRC) to conduct an independent review of the document. Its study gave the OMB a failing grade, calling the guidance a “fundamentally flawed” document which, if implemented, would have a high potential for negative impacts on the practice of risk assessment in the federal government. Among the reasons for their conclusions was that the bulletin oversimplified the degree of uncertainty that agencies must factor into all of their evaluations of risk. This idea for standardized risk assessment is of interest to regulatory reform advocates like Graham and has made its way into the dangerous Regulatory Accountability Act in Congress and into the new toxic substances rules under the Frank Lautenberg Chemical Safety for the 21st Century Act that Graham’s protégé and former ACC staffer, Nancy Beck, is now crafting from her position as Deputy Assistant Administrator of the EPA.

Before his stint at OMB, Graham led the Harvard Center for Risk Analysis, which notably skewed risk analyses in favor of industry: costs saved by not regulating versus lives saved by regulating. In one case, Graham’s OMB rejected a National Highway Traffic Safety Administration rule that would reduce the toll of vehicle rollovers by requiring that automakers install tire pressure warning systems. Graham made this decision despite a direct conflict of interest: his Harvard think tank was funded by General Motors Corp., Ford Motor Co., Volvo Car Corp. and the Alliance of Automobile Manufacturers.

Another individual the SAB should steer clear of is Dr. Richard Belzer, an agricultural economist and, like Graham, a cost-benefit-analysis enthusiast, who worked for the OMB’s Office of Information and Regulatory Affairs (OIRA) from 1988 to 1998. In 2000, Belzer criticized the SAB’s role in peer reviewing the EPA’s evaluation of the costs and benefits of the Clean Air Act. Belzer and his co-author called the SAB’s reviews “ineffective” because, in their opinion, they couldn’t force the agency to change the direction of policy.

Belzer appears to misunderstand the purpose of the SAB, which is simply to advise the agency on its science. The EPA has the discretion to heed that advice and apply it to policies. SAB members are not decision-makers; they are esteemed scientists whose expertise is best suited to evaluate scientific considerations, not political ones. In 2010, Belzer participated in a panel on “The EPA’s Ambitious Regulatory Agenda” sponsored by the American Enterprise Institute, the description of which includes the erroneous statement: “all major EPA decisions are contentious.” According to his bio, his clients include ExxonMobil and the American Chemistry Council. And speaking of the American Chemistry Council…

Kimberly White, senior director of chemical products and technology at the ACC, is among those nominated to serve on the SAB. She was summoned by House Science Committee Chairman Lamar Smith to testify at the hearing called “Making EPA Great Again” earlier this year, where she spoke about the need to improve the SAB’s transparency and peer review methods and accused the EPA of being too involved in the SAB’s peer review process: “conversations that are happening in that peer review get stymied by [the] EPA’s input during the peer-review process so it’s not as independent as it should be.”

She also agreed when one member of Congress suggested that the SAB was not truly balanced and that there should be a devil’s advocate on the committee. Perhaps Dr. White wants to fill that very role. The problem with that, however, is that the American Chemistry Council and her previous employer, the American Petroleum Institute, are organizations that actively work to spread disinformation about a range of scientific topics to thwart the EPA’s work to keep us safe. Dr. White has criticized an EPA assessment on formaldehyde, for example, claiming it wasn’t inclusive enough of the science. Formaldehyde is a known carcinogen, and thanks in large part to the ACC, the EPA’s emissions standard for wood products, set to be enforced in December, has been delayed at least four months.

Who Pruitt appoints to the fifteen open positions will be a test of whether he is going to continue seeking exclusive counsel from polluters. There are a handful of qualified scientists who have only served one term and can easily be reappointed for a second, which is common practice for the board. For the sake of continuity, it would behoove Pruitt to keep those experts on. For the other positions, it would be in the agency’s best interest for Pruitt to choose a balanced roster of new members from the dozens of well-qualified scientists on the list, rather than stack the committee with folks who have spent their careers working to undermine the mission of the EPA and weaken policies that are supposed to keep us safe.

All members of the public can submit comments encouraging the EPA to appoint independent and qualified scientists as advisors. You have until Thursday, September 28th at 11:59pm to email your comment to Thomas Carpenter, the Designated Federal Officer of the SAB, at carpenter.thomas@epa.gov.

 

 

Illinois is Expanding Solar Access to Low-Income Communities—But It Didn’t Happen Without a Fight

UCS Blog - The Equation (text only)

Installing solar panels in Pennsylvania. Photo: used with permission from publicsource.org

When the Future Energy Jobs Act (FEJA) passed the Illinois General Assembly and was signed by Governor Rauner in early December last year, a key component of the legislation was to expand solar access for low-income communities. To get a feeling for how the legislation came about, I caught up with Naomi Davis, president and founder of the Chicago-based non-profit Blacks in Green (BIG). She has been on the front lines of developing this innovative program and is excited to finally see it coming together.

Illinois Solar for All

The Illinois Solar for All Program, a key piece of FEJA, provides funding to train and employ residents of low-income and economically disadvantaged communities, residents returning from the criminal justice system, and foster care graduates, in the solar installation industry. It’s a comprehensive solar deployment and job training program that will open access to the solar economy for thousands of Illinois residents.

For Naomi Davis, who has been advocating for renewable energy in a variety of platforms since BIG’s founding 10 years ago, Solar for All is a dream come true.

“[Solar For All] means the realization of a fundamental aim of BIG, which is to build an earned income business model for our non-profit,” Davis says. “We are launching BIG SOLAR in partnership with Millennium Solar and SunSwarm and creating a social enterprise for education and outreach, household subscriptions, workforce training and placement, design, installation, and maintenance of systems – residential, commercial, industrial, and are also exploring the development of a light solar pv assembly facility in West Woodlawn.”

The Solar for All program is a solar deployment and job training initiative under FEJA.

The path to solar

The path to solar for all hasn’t been easy. “Not talked about is the sausage-making chaos of building a market almost from scratch, and the incredibly detailed and exhaustive examination of details and scenarios required,” admits Davis. She shares the camaraderie created when “folks who never talk to each other are huddled over time to understand the roles of the other and how to create economic harmony. That tiny organizations like BIG have to carry an incredible weight to stay at that table and ensure the interests of our constituents are represented.”

Although the legislation passed with many having a hand in its success, she highlights that communities of color are the unsung heroes of this legislation. Her organization’s affiliation and membership with the Chicago Environmental Justice Network was pivotal in having their needs considered. Among the organizations in the network is the Little Village Environmental Justice Organization (LVEJO).

Juliana Pino, Policy Director for LVEJO, made sure the direction and content of the Future Energy Jobs Act took into consideration the needs of their community. It’s through their work, Davis says, that many of the benefits to communities of color will now be realized.

Solar growth benefits communities

According to the Low Income Solar Policy Guide, the growth of solar in the United States provides a significant opportunity to address some of the greatest challenges faced by lower-income communities: the high cost of housing, unemployment, and pollution. Solar can provide long-term financial relief to families struggling with high and unpredictable energy costs, living-wage employment opportunities in an industry adding jobs at a rate of 20 percent per year, and a source of clean, local energy sited in communities that have been disproportionately impacted by fossil fuel power generation.

According to Davis, Chris Williams, owner of Millennium Solar Electric, should be funded through this training. Davis says Williams is a third-generation African American IBEW electrician, founder of the now-reviving South Suburban Renewable Energy Association, and a go-to ComEd solar youth educator. Training and education are key.

Still, the work is hardly over. In fact, it’s just begun.

“As with any industry poised for enormous market share – in this case, energy – strategic tech training is essential,” says Davis. “Not just African Americans historically discriminated against, but also coal region towns desperately need the re-education this legislation can provide. Market forces are already finding cheaper sources than coal and without public dollars. Coal towns across Illinois and around the country all need what Solar for All provides – a better way forward.”

Community partnerships

Under the Illinois Solar for All Program, developers of community solar projects need to identify partnerships with community stakeholders to determine location, development, and participation in the projects. Communities will play a pivotal role in this program, and continuing to build partnerships is critical to its success.

Thanks to the Illinois Solar for All Program, Illinois is poised to bring more solar power to homes, communities, places of faith, and schools in every part of the state.


Mesothelioma Awareness Day: Our Past Must Dictate the Future

UCS Blog - The Equation (text only)

It shouldn’t come as a surprise that asbestos isn’t good for you. The mineral is a known carcinogen and has been tied to thousands of deaths from mesothelioma, asbestosis, and other asbestos-related diseases. On average, close to 3,000 people each year in the United States are diagnosed with mesothelioma. And for those unfortunate enough to be diagnosed with the incredibly rare disease, the outlook is often poor: patients are usually given a grim prognosis of somewhere between 12 and 21 months.

Asbestos-related diseases are rarely quick to present themselves, often taking decades before symptoms finally show. When you breathe in or accidentally ingest the invisible fibers, they enter the lungs and may lodge themselves deep into the lung lining, known as the mesothelium. The area becomes irritated and over the years tumors begin to form. Mesothelioma is often difficult to diagnose, which means the resulting cancer is caught later and treatment options are more limited.

Breaking down barriers

Armed with that kind of information, one would assume it’d be a slam dunk to phase out asbestos use in the United States. Unfortunately, that isn’t the case. Last year, roughly 340 tons of raw asbestos were imported into the US, primarily for use in the chlor-alkali industry. Some types of asbestos-containing materials can also be imported. The Environmental Protection Agency tried to ban asbestos use nearly three decades ago, but many of the rules established by the agency were overturned in a court decision two years later. Today there’s hope things could change in the coming years, including renewed interest from the EPA.

In 2016, Congress approved the Frank R. Lautenberg Chemical Safety for the 21st Century Act, amending the 40-year-old Toxic Substances Control Act (TSCA) and giving the EPA more power to regulate dangerous chemicals as they are introduced, in an effort to more effectively remove those posing an unreasonable risk to public health. Chemicals deemed to pose an unreasonable risk during the evaluation process will be eliminated based on safety standards, as opposed to the risk-benefit balancing standard used under the previous TSCA requirements. What this means is that under the old TSCA, finding an unreasonable risk would require a cost-benefit analysis, and any restrictions would have to be the least burdensome way of addressing the risk. Under the Lautenberg Act, the “least burdensome” requirement is removed, though the EPA still needs to take the costs of regulatory actions and feasible alternatives into consideration.

The amendment also requires the agency to perform ongoing evaluations of chemicals to determine their risk to public health. In December, asbestos was included on a list of ten priority chemicals slated for evaluation and a scoping document for the mineral was issued in June. Problem formulation documents for each of the first ten chemicals are expected in December.

Drowning in red tape

Despite what the Lautenberg Act has done to unshackle the EPA and allow it to regulate chemicals properly, the White House and Congress have taken actions that seem counterproductive. For example, in January, President Donald Trump signed an executive order known as the “2-for-1 Order,” forcing agencies to remove two existing rules for every new one they create. The risk here is that agencies like the EPA will have to pick which rules to keep, creating a new series of public health concerns. When it comes to new hazards, the agency may be slower to react because of the new budget variable thrown into the mix. While the order could help the agency identify rules that overlap others, it creates the risk of money taking precedence over public health.

In addition, the Senate’s recently introduced Regulatory Accountability Act, known in some circles as the “License to Kill” bill, poses a similar set of issues. If passed, the RAA could resurrect much of the red tape that the Lautenberg Act removed. Once again, it would become difficult to regulate or ban chemicals in the future, despite the dangers they may pose. For example, the EPA would have to prove that a full asbestos ban is the best option available to the agency compared with any other more cost-effective option. The bill also allows anyone to challenge these decisions, which could delay a potential ruling for years or even halt the process entirely.

The EPA is also constrained by the people who have been appointed to several high-level positions within the agency itself. Administrator Scott Pruitt, as Oklahoma’s attorney general, sued the EPA 14 times, challenging rules he believed overstepped the agency’s boundaries. Deputy Assistant Administrator Nancy Beck, previously with the American Chemistry Council, lobbied for years against the very rules she has sworn to protect today. In 2009, Beck was criticized in a House report for attempting to undermine and create uncertainty about the EPA’s chemical evaluations while serving in the Office of Management and Budget during the Bush administration. The latest person nominated for an EPA position is Mike Dourson, who has, at times, proposed much less protective standards for chemicals than those in use by the federal government.

Where we stand now 

This Mesothelioma Awareness Day, we find ourselves one step closer to seeing asbestos banned in the US. Today, while we honor those who’ve lost their struggle against this disease, we also show support for those still fighting mesothelioma and refusing to give in.

The EPA has, once again, taken the first steps toward a potential ban, but until that day comes, raising awareness remains a never-ending battle. Mesothelioma is a misunderstood disease, and asbestos isn’t something most people think about at work or at home, which is why educating others is so important. Mesothelioma is largely avoidable, but remaining vigilant to prevent exposure is paramount.

Asbestos exposure isn’t something that will come to a screeching halt overnight. Hundreds of thousands of homes, buildings, and schools still harbor the mineral and that is likely to be the case for years to come. But stopping the flow of raw and imported asbestos into the US is a great first step to combating the issue at large.

About the author: Charles MacGregor is a health advocate specializing in education and awareness initiatives regarding mesothelioma and asbestos exposure. To follow along with the Mesothelioma Cancer Alliance and participate in a MAD Twitter chat on September 26, find them at @CancerAlliance

Rebuilding Puerto Rico’s Devastated Electricity System

UCS Blog - The Equation (text only) -

Photo: endi.com

Over the last few days, I’ve been glued to social media, the phone, and ham radio-like apps trying to find out more about the fate of family members in the catastrophic situation in my native Puerto Rico following Hurricane María. (Fortunately, I was able to confirm on Friday that everyone in my immediate family is accounted for and safe).

My family is among the few lucky ones. My childhood home is a cement suburban dwelling built on well-drained hilly soils, some eight kilometers from the coast, and well outside flood zones. But many of my 3.4 million co-nationals in Puerto Rico have not been so lucky, and are experiencing, as I write this, catastrophic flooding. Further, tens of thousands have been without electricity since Hurricane Irma downed many of the distribution lines. In addition, more than 170,000 people have been affected in the nearby US Virgin Islands and Dominica, Caribbean islands that have also suffered catastrophic damage.

In Levittown alone—the largest suburban community in Puerto Rico, on the north coast—hundreds had to be evacuated on short notice in the early hours of Thursday when the gates of the Lago La Plata reservoir were opened and the alarm sirens failed to warn the population. The next day, a truly dramatic emergency evacuation operation followed as the Guajataca Dam in the northwest failed and 70,000 people were urged to leave the area. At least ten deaths have been confirmed so far.

The government of the Commonwealth has mounted a commendable response, but has been hampered in large part by the lack of power and communications facilities, which are inoperable at the moment except for those persons, agencies, and telephone companies that have power generators and the gas to keep them running. This has been one of the main impediments both to Puerto Ricans abroad communicating with loved ones and to the Rosselló administration’s efforts to establish communications and coordination with the many towns that remain unaccounted for.

Chronic underinvestment and neglect of energy infrastructure increases human vulnerability to extreme weather

Why has Puerto Rico’s energy infrastructure been rendered so vulnerable in recent weeks? The ferocity of Irma and María would stretch the capacity of even well-funded and well-maintained energy production and distribution systems. In Florida—where the power grid had received billions in upgrades over the last decade—Irma left two-thirds of the population without power (though the state was able to bounce back within weeks).

But years of severe infrastructure underinvestment by the Puerto Rico Electric Power Authority (PREPA) have led to a fragile system that collapsed completely under these two hurricanes. Irma’s indirect hit damaged distribution lines but not production; María’s eye made landfall on the southeast and exited through the central north, placing it right on the path of four of the high-capacity plants that burn heavy fuel and diesel oil. These plants are also located close to, or within, flood zones.

The reconstruction of Puerto Rico’s power infrastructure is a monumental task, and one critical to guaranteeing the well-being of Puerto Ricans. More than 3.4 million US citizens are now in a life-threatening situation, and getting electricity up and running in the near term is critically important because it can support rescue and recovery efforts.

Wherever possible, these immediate efforts should align with a broader rebuilding mission that points Puerto Rico toward a more economically robust and climate-resilient future, not repairs that repeat the mistakes of the past. Rebuilding must also address the climate and extreme weather vulnerability Puerto Rico is so brutally experiencing right now.

There is also a great need to ease the high cost of energy in Puerto Rico: electricity prices for all sectors (residential, commercial, and industrial) are much higher in Puerto Rico than in the mainland United States. Reliance on imported fossil fuels for generation is one driver of the high cost: in 2016, nearly half of energy production came from petroleum, nearly one-third from natural gas, and 17 percent from coal. Only 2 percent came from renewables.

While there is quite a bit of clean energy momentum in the United States, that impetus is not being transferred to Puerto Rico. There are many reasons for that, including lack of support from PREPA. But Puerto Rico has strong solar and wind energy resource potential, and renewable energy has been proposed as a way to help PREPA pare down its $9 billion debt, reduce reliance on fossil fuels and exposure to fuel price volatility, lower costs to consumers, and contribute to an economic recovery for the Commonwealth.

This unprecedented catastrophe affecting millions of US citizens requires the intervention of the federal government

To ensure a safe and just economic recovery for Puerto Rico, Congress and the administration need to commit resources to help the territory recover. President Trump has declared Puerto Rico a disaster zone, and FEMA director Brock Long will visit the island on Monday. The priority right now is to save lives and restore basic services. To aid these efforts, Congress and the Trump administration should:

  • Direct the Department of Defense to provide helicopters and other emergency and rescue resources to Puerto Rico.
  • Provide an emergency spending package to the US territory.
  • Increase the FEMA funding level for debris removal and emergency protective measures in Puerto Rico.
  • Temporarily suspend the Jones Act. The Jones Act, which mandates that all vessels carrying cargo into the US and its territories be US Merchant Marine vessels, significantly increases the cost of importing goods into the island.

Once the state of emergency ends, Governor Rosselló needs to be very vocal that Puerto Rico’s energy infrastructure reconstruction should help put the Puerto Rican people and economy on a path to prosperity and resilience from climate impacts. The 2017 hurricane season is not over yet, and the situation in Puerto Rico right now is catastrophic. Decisions about energy infrastructure will be made in the coming days, weeks, and months. Those decisions need to take into account the short- as well as the long-term needs of the Puerto Rican population and help make Puerto Rico more resilient to the massive climate and weather extreme dislocations that we are facing.

Want to help?


Science Triumphs Over Disinformation in Initial Flame Retardant Victory

UCS Blog - The Equation (text only) -

In a stunning victory for consumer safety and a powerful display of the ability of independent science to spur policy change, the Consumer Product Safety Commission (CPSC) voted this week to ban a class of additive, non-polymeric organohalogen flame retardants (OFRs) present in many consumer products. Last week, I was one of many individuals who testified before the CPSC, urging the body to grant a petition to ban this class of chemicals from four categories of consumer products: mattresses, children’s products, furniture, and electronic casings.

Of the 31 individuals who testified last week, only two advised the CPSC not to ban OFRs: representatives from the American Chemistry Council (ACC) and the Information Technology Industry Council. As Commissioner Marietta Robinson pointed out during the hearing, the only comments opposing the ban “represent those with a financial interest in continuing to have these potentially toxic, and some of them definitively, toxic, chemicals in our environment.” She also noted that the presentations by those opposed to the petition were not transparent and used materials relating to chemicals irrelevant to the petition—a stark contrast to the numerous scientists and scholars whose heavily footnoted statements provided evidence supporting the arguments of the well-bounded petition.

Scientific information trumps corporate disinformation

Commissioner Robert Adler, who submitted the motion to grant the petition, compared the chemical industry’s talking points against banning OFRs to the tobacco industry’s denial of the health impacts of smoking. His statement read, “if we took the tobacco industry’s word on cigarette safety, we would still be waiting. Similarly, we have waited for years for our friends the chemical industry to provide us with credible evidence that there are safe OFRs. I have little doubt that we will still be waiting for many years, to no avail.” Sadly, he’s probably right.

We have seen this trend time and time again. Whether it was the tobacco industry, the asbestos industry, the sugar industry, the PCB industry, the agrochemical industry, the pharmaceutical industry, or the oil and gas industry, corporate bad actors have known about the risks of their products and have chosen not to act to protect the public for years, sometimes decades. Not only do they deny that there is harm, but they actively push for policies that allow them to conceal the truth for even longer. As Oxford University’s Henry Shue wrote about fossil fuel companies like Exxon in a recent Climatic Change article, such “companies knowingly violated the most basic principle of ‘do no harm.’” It is unethical and unacceptable that the public is not afforded the information we deserve on the harms of products we are exposed to every day in the air we breathe, the water we drink, the food we eat, and everything in between.

A 2008 EPA literature review on polybrominated diphenyl ethers, one type of OFR, found that 80 percent of total exposure to the chemical by the general population is through ingestion and absorption of house dust containing these chemicals. (Photo: Flickr/Tracy Ducasse)

Case in point: ACC’s statement after the CPSC’s vote stuck to its talking points, pivoting from whether OFRs are safe to whether they reduce fire risk. During the hearing, the ACC representative argued that the petition was overly broad and that there was insufficient data on each OFR to ban them as a class. However, when asked by Commissioners for evidence that certain OFRs did not cause harm, he was unable to point to a specific chemical or cite relevant research. At a certain point, there is no place to pivot when the facts are stacked against you.

Dust is something I never gave much thought to growing up. If anything, “dusting” was always my favorite chore when faced with the options of vacuuming or washing the dishes. I never really gave much thought to what that elusive substance was composed of. I certainly wouldn’t have guessed that within those seemingly innocuous dust bunnies hiding behind bookshelves were a mix of chemicals that could impact my health. Dusting has taken on new meaning for me since conducting research on flame retardants.

For decades now, consumers have been left powerless and at the whim of manufacturers who have decided for us what chemicals go into our homes and end up in our dust.

The result? Most Americans have at least one type of flame retardant present in our blood, young children have higher levels than their mothers, and children of color and those from low income communities bear disproportionately high levels of these chemicals in addition to a host of other chemical burdens.

Shue writes,

To leave our descendants a livable world is not an act of kindness, generosity, or benevolence…it is merely the honoring of a basic general, negative responsibility not to allow our own pursuits to undercut the pre-conditions for decent societies in the future.

This ban is long overdue. Moving away from these chemicals toward safer alternatives is a win for all, this generation and the next.

Product safety is not a political issue

During the vote, Commissioner Adler said that he holds strong to the belief that “product safety is not a partisan issue and should never be politicized” after one of the two Republican Commissioners suggested that granting this petition on a party-line vote would turn the issue into a political football. Commissioner Robinson defended Adler, stating that she was “absolutely flummoxed” and had “absolutely no clue what the science of this petition and these flame retardants has to do with whether you’re a Democrat or Republican nor what it has to do with my term being potentially up.” The granting of a petition rooted in rigorous science is not a political action. However, obstructing this science-based rulemaking process would be.

While the CPSC has voted to begin the process of rulemaking to ban OFRs under the Federal Hazardous Substances Act and to convene a Chronic Hazard Advisory Panel, the Commission will be shifting its composition as Marietta Robinson’s term ends in September. It is possible that this scientific issue could become politicized once President Trump nominates a Republican to join the CPSC and take back the majority. In fact, Chairwoman Buerkle even suggested that the ban be overruled once the Republicans regain the majority. President Trump intends to nominate corporate lawyer Dana Baiocco, who has defended companies facing charges over the safety and misleading advertising of consumer and industrial products and medical devices.

We urge the Commission to continue the progress begun during yesterday’s vote to educate the public about the risks of OFRs and to create the policy that will ban these chemicals in consumer products for good. Let’s let science, not politics, have the final word. Our children will thank us someday.

Puffins, Politics, and Joyful Doggedness in Maine

UCS Blog - The Equation (text only) -

Puffins were nearly extinct in Maine in the early 1900s, hunted for their eggs and meat. Their re-introduction to Eastern Egg Rock in Maine in the 1970s became the world's first successful restoration of a seabird to an island where humans killed it off. Photo: Derrick Jackson

Eastern Egg Rock, Maine — Under bejeweled blackness, the lacy string of the Milky Way was gloriously sliced by the International Space Station, the brightest object in the sky. Matthew Dickey, a 21-year-old wildlife and fisheries senior at Texas A&M, grabbed a powerful bird scope and was able to find the space station before it went over the horizon. He shouted: “I think I can make out the shape of the cylinder!”

The space station gone, Dickey and four other young bird researchers settled back down around a campfire fueled with wood from old bird blinds that had been blown out of their misery by a recent storm.

They were alone six miles out to sea on a treeless six-acre jumble of boulders and bramble.

44 years of Project Puffin

On this seemingly inconspicuous speck in Maine waters, a man once as young as they are, Steve Kress, began restoring puffins. He was part of the world’s first successful effort to restore a seabird to an island where it had been killed off by human activity. The experiment began in the spring of 1973 by bringing 10-day-old chicks down from Newfoundland, feeding them to fledging size in the fall, and hoping that after two or three years out at sea, they would remember Maine and not Canada, where decades of management have maintained a population of about 500,000 pairs.

Tonight it was a celebratory fire, flickering off faces with crescent smiles. Besides Dickey, there was team supervisor Laura Brazier, a 26-year-old science and biology graduate of Loyola University in Maryland who holds a master’s degree in wildlife conservation from the University of Dublin in Ireland. There was Alyssa Eby, 24, an environmental biology graduate of the University of Manitoba; Jessie Tutterow, 31, a biology graduate of Guilford College; and Alicia Aztorga-Ornelas, 29, a biology graduate of the Universidad Autonoma de Baja California, Mexico.

In the two days prior, their routine count of burrows with breeding pairs of puffins surpassed the all-time record. The previous mark was 150, set last year. During my four-night stay with them in late July, the count rose from 147 to 157. The summer would end with 173 pairs.

“We did it. We are awesome. You guys are awesome,” Brazier said. “Puffins are cool enough. To know we set a new record and we’re part of puffin history is incredible.”

As the fire roared on, celebration became contemplation. As full of themselves as they had a right to be, they know their record is fragile. Where hunting by coastal dwellers for eggs and meat had left no more than four puffins in Maine by 1902, Kress and 600 interns over the 44 years of Project Puffin have nursed the numbers back to 1,300 pairs on three islands. The techniques used in the project—including the translocation of chicks and the use of decoys, mirrors, and broadcast bird sounds to make birds think they had company—have helped save about 50 other species of birds from Maine to Japan and China. (I have the distinct pleasure of being Kress’s co-author on the story of his quest, “Project Puffin: The Improbable Quest to Bring a Beloved Seabird Back to Egg Rock,” published in 2015 by Yale University Press.)

Interns (left to right) Alyssa Eby, Matthew Dickey, and Alicia Aztorga-Ornelas, with Eastern Egg Rock supervisor Laura Brazier, hold an adult puffin they banded. Also on the team but not pictured is Jessie Tutterow.

In the crosshairs of American politics

But in the last decade, the Atlantic puffin, which breeds in an arc from Maine and Canada over to Iceland, Scandinavia, and the United Kingdom, has become a signal species for fisheries management and climate change.

On the positive side, Maine puffins are bringing their chicks native fish, such as haddock and Acadian redfish, that have rebounded under strict US federal rules. On the negative side, the last decade has also brought the warmest waters ever recorded in the Gulf of Maine. A study published in April by researchers from the National Oceanic and Atmospheric Administration (NOAA) predicts that several current key species of fish “may not remain in these waters under continued warming.” Last month, researchers from the University of Maine, the Gulf of Maine Research Institute, NOAA, and others published a study in the journal Elementa finding that longer summers in the Gulf of Maine may have major implications for everything from marine life below the surface to the fueling of hurricanes in the sky.

For puffins, there is already significant evidence that in the warmest years, the puffin’s preferred cold-water prey, like herring and hake, are forced farther out to sea, while some of the fish that come up from the mid-Atlantic, such as butterfish, are too big and oval for small puffin chicks to eat. The new fish volatility is such that while puffins thrived last year on tiny Eastern Egg Rock, their counterparts could not find fish off the biggest puffin island in the Gulf of Maine, Canadian-administered Machias Seal Island. Last year brought a near-total breeding failure among its 5,500 pairs of puffins.

The Atlantic puffin, from Maine to the United Kingdom, has rapidly become a signal bird for climate change via the fish the parents attempt to bring to chicks. The Gulf of Maine is among the fastest-warming waters in the world, and as a result, more puffins are bringing in more southerly species such as the butterfish pictured here. Butterfish are too large and oval for chicks to eat, leading to starvation. Photo: Derrick Jackson

In the European part of the Atlantic puffin’s range, warmer water displacing prey, overfishing, and pollution have hammered breeding success. According to an article this year in the journal Conservation Letters, co-authored by Andy Rosenberg, the director of the Center for Science and Democracy at the Union of Concerned Scientists and a former regional fisheries director for the National Oceanic and Atmospheric Administration, the north Atlantic realm of the puffin is one of the most over-exploited fisheries in the world, as evidenced by the crash of several fisheries, most notably cod.

On the Norwegian island of Rost, for instance, the 1.5 million breeding pairs of puffins of four decades ago were down to 289,000 by 2015. A key reason appears to be voracious mackerel moving northward, gobbling up the puffins’ herring. Even though there are an estimated 9.5 million to 11.6 million puffins on the other side of the Atlantic for now, BirdLife International two years ago raised the extinction threat for puffins from “least concern” to “vulnerable.”

Much of that was on the minds of the Egg Rock interns, because the very puffins they were counting are in the crosshairs of American politics.

Incessant attacks on environmental accomplishments

Puffins are on land only four months a year to breed, so Kress and his team a few years ago put geolocators on some birds to see where they migrate during their eight months at sea. Two years ago, the team announced that in the fall and early winter, many Maine puffins go north to the mouth of the St. Lawrence River. In late winter and early spring, they come south to forage in fish-rich deep water far south of Cape Cod. That area of ocean is so untouched by human plunder that the corals in the deep are as colorful as any on a Caribbean reef.

The Obama administration was impressed enough to designate the area as the Northeast Canyons and Seamounts National Marine Monument, protected from commercial exploitation. While vast areas of the Pacific Ocean under US jurisdiction earned monument status under Presidents Obama and George W. Bush, the canyons are the first US waters in the Atlantic to be so protected.

Yet President Trump, as part of his incessant attack on his predecessor’s environmental accomplishments, ordered Interior Secretary Ryan Zinke to review Obama’s monument designations for possible reversal. Even though the Canyons and Seamounts account for a tiny fraction of New England’s heavily depleted waters, the fishing lobby bitterly opposed monument status. This week, the Washington Post reported that Zinke has recommended that the Canyons and Seamounts be opened to commercial fishing.

The researchers on Egg Rock mused around the fire over the concerted attempt, led by the Republican Party and often aided by Democrats in top fossil-fuel production states, to roll back environmental protections for everything from coral to coal ash and broadly discredit science in everything from seabird protections to renewable energy. Some of the divisions of NOAA that are directly involved in studying waters like the Gulf of Maine are targeted for massive budget cuts by the Trump administration.

Maine’s puffins are direct beneficiaries of strict federal fishing management since the 1970s. In recent years, puffins have supplemented their traditional diet of herring and hake with species that have rebounded in the Gulf of Maine, such as the haddock pictured here. Photo: Derrick Jackson

Fighting against a stacked deck

“It’s funny how in the business world and the stock market, no one questions the numbers and facts,” said Brazier, who marched in April’s March for Science in Washington, DC. “They’re taken as facts and then people use them to decide what to do. But now it’s ok to question science.”

“I think it’s because if you can deny science, you can deny what needs to be done,” Eby said. “It’s too hard for a lot of people in rich countries to get their heads around the fact that if we’re going to deal with climate change, we’re going to have to change the way we live and the way we use energy. That’s so hard, a lot of people would rather find ways to skip the science and live in their world without thinking about the consequences.”

Tutterow, who hails from North Carolina, where the General Assembly in 2012 famously banned state use of a 100-year-projection of a 39-inch sea-level rise, added, “If I was offered a state or federal job, I’d take it. I’d like to believe there’s a lot of career professionals who work hard to get the job done. But it used to be the main thing you worried about was red tape. Now you have to worry about censorship.”

Dickey said simply, “Sometimes it feels like the deck is stacked against us. But we just have to keep working as hard as we can until someone realizes we’re just trying to deliver facts to help the world.”

Puffins in Maine breed in burrows that wind crazily underneath boulders that rim their islands. That tests the ability of interns to reach for chicks to band for future study. Photo: Derrick Jackson

Joyful doggedness

The stacked deck is unfair, given the joyful doggedness displayed by this crew. On two days, I followed them around the perimeter of Egg Rock as they wrenched their bodies to “grub” under the boulders, contorting until they could reach an arm into the darkness to pull out puffin chicks to band for research.

The simple act of banding has revealed the puffin’s extremely high site fidelity: birds come back to the same island and burrow year after year despite migrating hundreds of miles away. One Project Puffin bird was in the running for oldest-known puffin in the world, making it to 35 before disappearing in 2013. A Norwegian puffin made it to 41 before being found dead.

On other Atlantic puffin islands, the birds can nest in shallower cavities of rocks and mounds in grassy cliffs within wrist and elbow reach. Researchers on those islands are able to band scores of puffin chicks and adults.

But the massive size of the jagged boulders on Eastern Egg Rock makes it so difficult to grub that the summer record was only 14. On my visit, the crew went from 9 to 17 chicks, with Brazier constantly saying, “Oh no, we’re not giving up. We got this. The next crew’s going to have to work hard to beat us.”

No face was brighter than Aztorga-Ornelas’ when she took an adult puffin they banded and lowered it between her legs like a basketball player making an underhanded free throw. She lifted up the bird and let it go to fly back to the ocean to get more fish for its chicks. “I’ll never forget that for the rest of my life,” she said.

On another day, with the same enthusiasm displayed for puffins, they grubbed for another member of the auk family, the black guillemot. At one point, they caught four chicks in separate burrows within seconds of each other. They gleefully posed with birds for photographs.

“I wish people could feel why I’m in this,” Tutterow said. She talked about a prior wolf study project in Minnesota. “We tracked in the snow what we thought was one wolf,” she said. “Then, at a junction, what we thought was one single wolf, the tracks split into five different sets of tracks. Your jaw drops at the ability of these animals to perfectly follow each other to disguise the pack.”

Eastern Egg Rock went from the 1880s to 1977 with no resident puffins. This year, the number of breeding pairs hit a record 173. Where there were two to four birds left in the entire state of Maine in 1902, there are 1,300 pairs today. Photo: Derrick Jackson

Getting it right

My jaw dropped at how bird science is making world travelers out of this crew beyond Egg Rock. Brazier has worked with African penguins in South Africa, loggerhead turtles in Greece, snowshoe hares in the Yukon, and this fall is headed to Midway Atoll for habitat restoration in key grounds for albatross.

Eby has worked with foxes in Churchill, Manitoba; oystercatchers, murres, auklets, gulls, and petrels in Alaska; and ducks in Nebraska. Besides wolves, Tutterow has helped manage tropicbirds and shearwaters in the Bahamas, honeybees and freshwater fish in North Carolina, loons in the Adirondacks, and wolves in Minnesota. Aztorga-Ornelas has worked with oystercatchers and auklets on Mexican islands and Dickey has helped restore bobwhite, quail, deer, and wild turkey habitat in Texas.

Brazier said a huge reason she helped rehabilitate injured endangered African penguins in South Africa was because of her experience tending to them in college at the Maryland Zoo. “I actually didn’t get the section of the zoo I applied for,” she said. “I got the African penguin exhibit and when all these little fellas were running around my feet, it was the best day of my life.”

Though he is the youngest of the crew, Dickey said his quail and bobwhite work gave him self-sufficiency beyond his years. “My boss lived two miles away and my tractor had a flat four times. It was on me to fix it and I figured it out, even though it was hotter than hell every day, sometimes 110.”

Tutterow, the oldest, originally earned a bachelor’s degree in nursing at Appalachian State University, but found far more satisfaction in an outdoor career. Among her fondest childhood memories was her parents allowing her to wander the local woods to read at a favorite spot on a rock by a creek. “You can build a lifestyle around any amount of income, but you cannot build happiness into every lifestyle,” she said. “Working with these animals, I’m building happiness for them and me.”

No myopic set of politics and denial of science should ever get in the way of this level of career happiness. Aztorga-Ornelas and I, despite her limited English and my stunted Spanish, shared a dozen “Wows!” sitting together in a bird blind, watching puffins zoom ashore with fish.

Eby said, “It’s strange for me. We just came out of a conservative government in Canada (under former Prime Minister Stephen Harper) where they stopped lake research for acid rain, fisheries, and climate change and government scientists did not feel the freedom to speak out. And now that we’re getting more freedom, I’m here. I hope the US can get it right soon.”

 

What’s My State Doing About Solar and Wind? New Rainbow Graphic Lets You Know

UCS Blog - The Equation (text only) -

[With costs dropping and scale climbing, wind and solar have been going great guns in recent years. Shannon Wojcik, one of the Stanford University Schneider Sustainable Energy Fellows we’ve been lucky enough to have had with us this summer, worked to capture that movement for your state and its 49 partners. Here’s Shannon’s graphic, and her thoughts about it.]

Do you ever wonder how much energy those rooftop solar panels in your state are contributing to renewable energy in our country? How about the wind turbines you see off the highway?

Our new “rainbow mountain” graphic lets you see your state’s piece of solar and wind’s quickly growing contribution to the US electricity mix. It shows how much of our electricity has come from wind and solar each month for the last 16 years. Just click on your state in the graph’s legend or roll your mouse over the graphic to see what’s been happening where you live.

[Interactive Tableau dashboard (“Dashboard 1”): monthly wind and solar share of US electricity generation, by state.]

At first glance, this graphic looks like a disorderly rainbow mountain range. Keep staring though (try not to be mesmerized by the colors) and you can start to see patterns.

The peaks in the mountain range are methodical, and so are the dips. The peaks, where the most electricity is supplied by wind and solar, come in spring, when demand (the denominator) is lower due to moderate temperatures and generation (the numerator) is high due to windy and sunny days. The crevasses, in July and August, happen because demand for electricity is high at those times thanks to air conditioning, increasing the overall load on the US grid—and driving up our calculation’s denominator. If you were to look just at monthly generation of wind and solar, this variation would be smaller.
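To make that seasonal pattern concrete, here is a minimal sketch of the share calculation in Python; all the figures below are invented for illustration (the real graphic is built from EIA data):

# Minimal sketch of the wind-and-solar share calculation described above.
# All figures are invented for illustration; the real graphic uses EIA data.
monthly_generation_gwh = {"wind": 22_000, "solar": 6_000}  # the numerator
monthly_demand_gwh = 310_000                               # the denominator

share = 100 * sum(monthly_generation_gwh.values()) / monthly_demand_gwh
print(f"Wind + solar supplied {share:.1f}% of US electricity this month")

# Spring peaks: generation rises while demand falls, so the share climbs.
# July/August dips: air conditioning drives demand up, so the share falls
# even if wind and solar output holds steady.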

Another, much more obvious thing about the mountains is that they’re getting taller. In fact, we passed a notable milestone in March of 2017, when, for the first time, wind and solar supplied 10% of total US electricity demand over the month. In 2012, solar and wind had reached only 4.6% of total US generation, so the recent peak represents more than a doubling in just five years.

That’s momentum.

Climbers and crawlers

Being able to see the different states lets you see where the action is on wind and solar—which are the climbers and which are the crawlers.

You know the saying about how everything is bigger in Texas? Well, it certainly holds true here. Texas is the bedrock of this mountain range, never supplying less than 14% of the country’s wind and solar electricity after 2001, and supplying as much as 35% in some months. Texas hosts the most wind generation of any state and doesn’t seem to be in danger of losing that title anytime soon.

California is another crucial state in this mountain range, and has been from the beginning. A trendsetter, California was building solar and wind farms years before other states; in 2001, it supplied up to 75% of all the wind and solar electricity in the US. California is still the second-largest supplier of wind and solar.

Other notable states that are building this solar and wind mountain are Oklahoma, Iowa, Kansas, Illinois, Minnesota, Colorado, North Dakota, Arizona, and North Carolina. Most of these states are rising up due to wind, but Arizona and North Carolina, along with California, are leading with solar.

Not all states with strong solar and wind performances by some metrics show up here. South Dakota is #2 for wind as a fraction of its own generation, though on this graphic it’s barely visible.

What does this mean?

This graphic shows that the momentum of solar and wind growth in the United States is undeniable. It can be seen on rooftops, in windy valleys and on windy plains, and even in states where coal has been king. All 50 states are involved as well, as every state generates electricity with wind and solar.

There are many ways for your state to increase its overall percentage. It can either decrease its denominator with energy efficiency or increase its numerator with wind and solar installations.

Not satisfied with where your state shows up on this graph? Check out what more your state can do.

Free Lunches in New York City Public Schools Are a Win for Kids—and Technology

UCS Blog - The Equation (text only) -

Photo: USDA

It’s so good to share good news.

This month, the New York City Public Schools announced that, starting with the current school year, all students can receive free lunch with no questions asked. That means less stigma for kids facing food insecurity, less worrying for families, and less paperwork for school districts. And it might surprise you to learn that at the heart of this victory—carried across the finish line by a group of dedicated advocates—is a fairly common application of technology.

The underlying policy at play here is called the “Community Eligibility Provision,” or CEP. It was authorized with the Healthy, Hunger-Free Kids Act of 2010 to help schools and local educational agencies with a high percentage of low-income students. As a colleague wrote on this blog in 2016, CEP helps school systems (like New York City Public Schools) reduce paperwork and poverty stigma while making sure that free and reduced-price meals are available to all kids who might need them. Instead of asking each family to fill out an application, CEP allows schools to determine student eligibility through household participation in programs like SNAP (the Supplemental Nutrition Assistance Program, commonly referred to as food stamps) and TANF (the Temporary Assistance for Needy Families program). If over 40 percent of students are deemed eligible, schools receive additional federal reimbursement dollars to cover free meals for students beyond those who qualify—ensuring that even kids whose families are not enrolled in federal assistance programs can still get meals if they need them.

So how is New York City able to cover free meals for all students?

Here’s the math answer: the CEP multiplier is 1.6, which means that if 50 percent of students at School X are eligible for free meals, School X can actually serve free meals to (50 percent) * (1.6) = 80 percent of students using federal reimbursement dollars. If New York City Public Schools are now receiving federal reimbursement for 100 percent of students, it would mean they have demonstrated that at least (100 percent) / (1.6) = 62.5 percent of students are eligible through CEP.
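If you want to play with that arithmetic yourself, here is a minimal sketch; the 1.6 multiplier and the 62.5 percent threshold come straight from the paragraph above, while the function names are my own:

CEP_MULTIPLIER = 1.6  # federal multiplier applied to the identified-student percentage

def coverage_pct(identified_pct):
    """Share of students whose free meals are federally reimbursed (capped at 100)."""
    return min(identified_pct * CEP_MULTIPLIER, 100.0)

def identified_needed(target_coverage_pct):
    """Identified-student percentage needed to reach a target coverage level."""
    return target_coverage_pct / CEP_MULTIPLIER

print(coverage_pct(50))        # 80.0  -- the School X example
print(identified_needed(100))  # 62.5  -- New York City's threshold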

Which brings us to the real-world answer: New York is able to cover free meals for all students because it got smart about its use of technology to better reflect true student need. The New York Department of Education website describes the new data matching engine it has developed to identify eligible students:

“This new matching system provides a more efficient and accurate process for matching students across a range of forms that families already complete. This new matching process yielded an increase in the number of students directly certified – or matched to another government program – and increased the direct certification rate, allowing the City to qualify for the highest level of reimbursement in the federal CEP program. The number of families living in poverty has not increased; the changes to the matching process allow the City to better identify families.”
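The city hasn’t published the engine’s internals, but in principle, direct certification is a matching problem: link student records to households already enrolled in assistance programs. Here is a toy sketch, with every identifier and field name invented for illustration (a real system would use far more robust record linkage across names, addresses, and birth dates):

# Toy sketch of direct-certification matching; all data below is invented.
snap_households = {"H-1001", "H-2002", "H-3003"}  # household IDs on SNAP rolls
tanf_households = {"H-2002", "H-4004"}            # household IDs on TANF rolls

students = [
    {"name": "Ana",  "household_id": "H-1001"},
    {"name": "Ben",  "household_id": "H-5005"},
    {"name": "Cara", "household_id": "H-4004"},
]

# A student is directly certified if their household appears on either roll.
assistance_households = snap_households | tanf_households
certified = [s for s in students if s["household_id"] in assistance_households]

rate = 100 * len(certified) / len(students)
print(f"Directly certified: {rate:.1f}% of students")  # 66.7% in this toy data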

Why the technology matters

I know what you’re thinking. It’s awesome that all kids in New York City Public Schools can eat for free! But why make such a big deal about this technology? It doesn’t seem like rocket science.

Bingo.

New York City Public Schools is not using a particle accelerator to improve data matching among students. They haven’t even used a 3-D printer. The data integration and management systems they’re employing, while complex, are actually fairly commonplace. It’s the same sort of technology banks use to combine databases of credit scores and application information to make credit offers, and that Netflix uses to deduce that because you watched Good Burger, you might like Cool Runnings. (Hypothetically speaking.)

Yet when it comes to the use of technology in the administration of nutrition assistance programs, we have fallen remarkably behind. The transition from actual paper food stamps to electronic benefit cards officially concluded in 2004, nearly fifty years after the introduction of the first major credit card. Even now, some states (looking at you, Wyoming!) require SNAP applications to be faxed, mailed, or returned in person.

To be clear, I’m not claiming technology is a silver bullet. For one, implementing new technology often comes with a price tag—and a steep learning curve. (Just ask Kentucky.) In particular, the use of data matching raises ethical concerns related to privacy and security, and these are not to be overlooked. But in many cases, these are arguments to improve, rather than disregard, the technology and the policies that guide its use. Because when our public assistance programs fall behind, so do the people who rely on them, and so does our ability to deliver maximum public benefit with increasingly limited resources. It is critical (and just plain sensible) to use the tools at our disposal to help realize the potential of current technological systems to enhance the strength and efficiency of the federal safety net. 

Carrying the momentum in the 2018 farm bill

Keep an eye on this issue. There is reason to suspect that the advancement of technology in public assistance programs will be addressed in the 2018 farm bill, and even reason to hope for a bipartisan effort. In fact, I’ll take the opportunity to quote Glenn Thompson, chairman of the House Agriculture Nutrition Subcommittee, who opened a June hearing on SNAP technology and modernization with this sentiment: “We need to get the policy right. As we approach the upcoming farm bill, it is critical we understand opportunities to amend and improve the program to properly account for the changes that come with our evolving, technological world.”

Bringing Down the House: A Hostile Takeover of Science-Based Policymaking by Trump Appointees

UCS Blog - The Equation (text only) -

The Trump administration is slowly filling positions below the cabinet level in the “mission agencies” of the federal government (e.g., EPA, NOAA, Interior, and DOE), whose job it is to implement a specific set of statutory mandates. The appointed individuals are leading day-to-day decision-making on policies from public health and safety to environmental protection to critical applied science programs. In other words, the decisions these appointees make will affect everyone in the country.

The job of the agencies and their political leadership is to represent the public interest. It is not to serve the private interests of particular industries and companies, or even to push political viewpoints, but to implement legislative mandates in the interest of the American public. After all, who else but government can do this? Our laws call for the water and air to be clean, our workers and communities to be safe, our environment to be healthy and our science to be robust and fundamental to better policy and decision-making. That is what mission agencies are tasked to do.

So, what have we seen so far? To be sure, the administration has nominated and appointed some qualified individuals with good experience and few apparent conflicts of interest. But unfortunately, that is not the norm. In my mind, most of the key appointments with responsibility for science-based policymaking fall into three categories:

  • The conflicted: Individuals who have spent a significant part of their careers lobbying the agencies they are now appointed to lead to obtain more favorable policies to benefit specific industries or companies—and who will likely do so again once they leave the government. These individuals have a conflict of interest because of these connections. Despite President Trump’s call to “drain the swamp,” these appointees are well-adapted and key species in that very swamp (sorry, my ecologist background showing through).
  • The opposed: Individuals who have spent much of their careers arguing against the very mission of the agencies they now lead. This group is not entirely separate from the first, because often they made those arguments on behalf of corporate clients pushing for less accountability to or oversight from the American public. But further, they have opposed the very role played by the federal agencies they are appointed to serve. While they may have conflicts of interest as in (1), they also have an expressed anti-agency agenda that strongly suggests they will work to undermine the agency’s mission.
  • The unqualified: Individuals who are wholly unqualified because they lack the experience, training, or credentials requisite for the job. Again, these appointees may also have conflicts of interest and political agendas antithetical to the missions of the agencies, but they also have no real place leading a complex organization that requires specific expertise.

With more than 4,000 possible political appointments to federal agencies, I of course cannot cover them all. In fact, scanning the list of the roughly 600 appointments requiring Senate confirmation, fewer than one-third have even been nominated for Senate action. But here is a disturbing set of nominees and appointments that undermine science-based policymaking.

The conflicted

William Wehrum is a lawyer and lobbyist nominated to lead the EPA Office of Air and Radiation (OAR). He previously worked at EPA during the G.W. Bush administration; UCS opposed his nomination then. Mr. Wehrum’s corporate clients include Koch Industries, the American Fuel and Petrochemical Manufacturers, and others in the auto and petrochemical industries. He has been a vocal spokesperson against addressing climate change under the Clean Air Act, which would be part of his responsibility as OAR director. While he has advocated for devolving more authority to the states for addressing air pollution generally, he also opposed granting California a waiver under the Clean Air Act to regulate greenhouse gas emissions from vehicles. Mr. Wehrum has also been directly involved, both as a lobbyist for industry and during his previous stint at EPA, in efforts to subvert the science concerning mercury pollution from power plants and restrictions on industrial emissions, as well as lead, soot, and regional haze regulations.

Dr. Michael Dourson has been nominated to be EPA Assistant Administrator for Chemical Safety and Pollution Prevention. He is well known by the chemical industry, having spent years working as a toxicologist for hire for industries from tobacco to pesticides and other chemicals. Dr. Dourson has argued that the pesticide chlorpyrifos is safe despite a large body of science to the contrary. He has advocated for the continued use of a toxic industrial chemical called TCE, which the EPA determined was carcinogenic to humans by all routes of exposure. [TCE was the chemical linked to leukemia in children in the 1998 film “A Civil Action.”] When asked about his controversial chemical risk assessment company, TERA, receiving funding from chemical companies, Dourson responded: “Jesus hung out with prostitutes and tax collectors. He had dinner with them.”

Dr. Nancy Beck, appointed to the position of EPA Deputy Assistant Administrator, now leads the agency’s effort to implement the Lautenberg Chemical Safety Act, which was signed into law last year. Dr. Beck was previously senior staff with the American Chemistry Council, the trade organization that worked very hard for years to weaken the rules protecting the public from toxic chemicals. The result? The new rules from the EPA are far weaker than those developed by the professional staff at the agency and remarkably similar to the position the industry favored, while dismissing the positions of other members of the public and other organizations including UCS. Previously, Dr. Beck worked in the G.W. Bush Administration at the Office of Management and Budget. During that part of her career Dr. Beck was called out by the U.S. House Science and Technology Committee for attempting to undermine EPA’s assessment of toxic chemicals and her draft guidance on chemical safety evaluations was called “fundamentally flawed” by the National Academy of Sciences.

Lest you think the conflicted are all at EPA, consider David Zatezalo, nominated to be Assistant Secretary of Labor for Mine Safety and Health. He was formerly the chairman of Rhino Resources, a Kentucky coal company that received two letters from the Mine Safety and Health Administration for patterns of violations. Subsequently, a miner was killed when a wall collapsed, and the company was fined.

David Bernhardt has been confirmed as the Deputy Secretary of Interior. He was DOI Solicitor under the George W. Bush administration. In 2008, weeks before leaving office, Bernhardt shifted controversial political appointees who had ignored or suppressed science into senior civil service posts. While at his law firm Brownstein Hyatt Farber Schreck, he represented energy and mining interests and lobbied for California’s Westlands Water District. His position in the firm—he was a partner—and the firm’s financial relationship with Cadiz Inc. (which is involved in a controversial plan to pump groundwater in the Mojave desert and sell it in southern California) has led to one group calling him a “walking conflict of interest.” Bernhardt also represented Alaska in its failed 2014 suit to force the Interior department to allow exploratory drilling at the Arctic National Wildlife Refuge.

The opposed

Susan Combs has been nominated to be the Assistant Secretary of Interior for Policy, Management, and Budget. She was previously Texas’s agriculture commissioner and then the state’s comptroller, and in those roles she often fought with the U.S. Fish and Wildlife Service over Endangered Species Act issues. Notably, she has a history of meddling in science-based policy issues like species protections. She has been deeply engaged in battling for property rights and against public interest protections; she once called proposed Endangered Species Act listings “incoming Scud missiles” aimed at the Texas economy. Of course, protecting endangered species, biodiversity, and public lands is a major responsibility of the Department of Interior.

Daniel Simmons has been nominated to be the Principal Deputy Assistant Secretary of the Office of Energy Efficiency and Renewable Energy, which fosters development of renewable and energy-efficient technologies. He was previously vice president at the Institute for Energy Research, a conservative organization that promotes fossil fuel use, opposed the Paris Climate Accord, and opposes support for renewable energy sources such as wind and solar. He also worked for the American Legislative Exchange Council (ALEC) as director of its natural resources task force. ALEC is widely known for advocating against energy efficiency measures.

The unqualified

Sam Clovis, the nominee for Undersecretary of Agriculture for Research, Education and Economics—effectively the department’s chief scientist—is not a scientist or an economist, nor does he have expertise in any scientific discipline relevant to his proposed position at USDA, like food science, nutrition, weed science, agronomy, or entomology. Despite this lack of qualifications, he does deny the evidence of a changing climate. He was a talk radio host with a horrendous record of racist, homophobic, and other bigoted views, which should be disqualifying in themselves.

Albert Kelly has been appointed a senior advisor to EPA Administrator Scott Pruitt and the Chair of the Superfund Task Force. He is an Oklahoma banker with no experience with Superfund or environmental issues, but he was a major donor to Mr. Pruitt’s political campaigns. So far the task force has focused on “increasing efficiencies” in the Superfund program.

Over at NASA, the nominee for Administrator is Rep. James Bridenstine (R-OK). While he certainly has government and public policy experience (a plus), he does not have a science background, a management background, or experience with the space program. He has called aggressively for NASA to focus on space exploration and returning to the moon rather than its earth science mission. In addition, he has been a strong advocate for privatizing some of the agency’s work. He has questioned the science on climate change and accused the Obama administration of a “gross misallocation of funds” for spending on climate research.

Michael Kratsios is the Deputy Chief Technology Officer and de facto head of the Office of Science and Technology Policy in the White House. He is a former aide to Silicon Valley executive Peter Thiel and holds an AB in politics from Princeton with a focus on Hellenic studies. He previously worked in investment banking and with a hedge fund. How this experience qualifies him to be deputy chief technology officer is beyond me.

Can we have science-based policies?

This is by no means a full list of egregious nominees for positions that will have a big impact on our daily lives. So, the question remains, is science-based policy making a thing of the past? Will the conflicted, the opposed, and the unqualified be the pattern for the future?

Fortunately, we can and should fight back. We as scientists, concerned members of the public, and activists can call on our elected officials to oppose these nominees. Those already in place can be held to account by Congress, the courts, and, yes, the court of public opinion. Handing over the fundamental job of protecting the public to champions of regulated industries and political ideologues is wrong for all of us. After all, if industry could be relied on to protect the public from health and environmental harms, regulatory controls would be superfluous.

We can’t just wring our hands and wish things didn’t go this way. Conflicted, opposed and unqualified they may be, but they are now in public service. Let’s hold them to account.

How Freight Impacts Communities Across California

UCS Blog - The Equation (text only) -

Photo: Luis Castilla

Today, UCS and the California Cleaner Freight Coalition (CCFC) released a video highlighting the impacts of freight across California. This video – and longer cuts of individual interviews here – touch on the many communities across California affected by freight.

Freight is a big industry in California. Nearly 40 percent of cargo containers entering and leaving the United States pass through California ports. California is also the largest agricultural producer among the states, supplying nearly one-fifth of the country’s dairy, one-third of its vegetables, and two-thirds of its fruits and nuts.

Truck traffic on I-5 heading north towards the Central Valley near Castaic, CA.

Farm in Shafter, CA.

This means California is home to many ports, rail yards, warehouses, distribution centers, farms, and dairies – all of which are serviced by many trucks. Despite the latest (2010) engine standards and significant financial investments by the state and local air districts, air quality in California remains among the worst in the United States, due in large part to truck emissions.

The most polluted cities in the United States. Source: American Lung Association, State of the Air 2016.

Communities impacted by freight are often burdened by other sources of pollution

In the Central Valley, a trash incinerator opposed by community groups is nonetheless classified by the state as a source of renewable energy. Biomass power plants emit significant amounts of particulate matter. Oil drilling operations contribute to air pollution and to water contamination of unknown extent.

Dairies in the Valley contribute not only to methane emissions but also to other health hazards, including particulate matter (from reactions of ammonia in excrement with nitrogen oxides (NOx) from cars and trucks), smog/ozone (from reactions of NOx with volatile organic compounds produced by decomposing animal feed), and contamination of aquifers. And just as real estate prices drove dairies from the Inland Empire to the Central Valley, warehouses and distribution centers are following suit, despite being 150 miles from the Ports of Los Angeles and Long Beach.

Silage (animal feed) pile near Shafter, CA.

Two views of a large Ross Distribution Center in Shafter, CA (it measures over 1 mile around the building and 2 miles around the entire lot).

In the Los Angeles region, not only are roadways and the two ports major concerns for communities, but so are oil refineries and over 1,000 active oil drilling sites.

Most of these urban oil sites are within a few football fields of homes, schools, churches, and hospitals. Despite all the “green” accolades bestowed on California, it is the third-largest oil producer in the United States, after Texas and North Dakota.

Pumpjacks in California can be found next to farms, hospitals, and even In-N-Out.

So what’s the solution?

For trucks, we need stronger engine standards for combustion vehicles, commitments to and incentives for zero-emission vehicles, and roll-out of battery charging stations and hydrogen fueling stations with electricity and hydrogen from renewable energy.

Just last week, the California legislature passed bills (1) integrating zero-emission trucks into state-owned fleets and (2) allocating $895 million in cap-and-trade revenue to cleaner heavy-duty vehicles. The California Cleaner Freight Coalition is working on a range of solutions at the state and local levels, and UCS is proud to be a member of this coalition. Watch and share the video!


Tax Credits and Rebates for Electric Cars Benefit US Drivers and Automakers

UCS Blog - The Equation (text only) -

Leadership on vehicle electrification is critical to tackling climate change, protecting consumers from volatile oil prices, maintaining the competitiveness of US automakers, and creating 21st century manufacturing jobs. However, electric vehicles (EVs) currently cost more to manufacture than comparably sized gasoline-powered vehicles, which can mean higher prices and slower adoption.  One important policy solution to help accelerate the rate of EV sales is to offer purchase incentives to potential EV buyers, as discussed in a new policy brief “Accelerating U.S. Leadership in Electric Vehicles” that I co-authored with my UCS colleague Josh Goldman.

Incentives, such as tax credits and rebates, encourage EV sales while automakers scale up manufacturing and the technology improves. Much of the additional cost of making an EV is due to the battery; scaling up EV manufacturing, along with improved and novel battery technology, will reduce battery costs and make EVs more cost-competitive.

Modern EVs have been on the market for only seven years, yet in that time we have seen impressive reductions in the cost to produce automotive battery packs. Early EV battery packs were estimated to cost over $750/kWh of storage capacity. Battery costs have since fallen to around $200/kWh, with further reductions predicted by industry analysts. Once battery costs reach the range of $125-$150/kWh, EVs are projected to reach price parity with conventional vehicles.

As battery costs continue to decline, the cost difference between EVs and conventional gasoline vehicles will shrink, although the exact date at which EVs achieve cost parity (around $125-150 per kWh) depends on the rate of EV sales and other factors. References for data sources are available online.
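To make the per-kWh figures concrete, here is a back-of-the-envelope sketch in Python. The 60 kWh pack size is our own assumption for illustration (roughly the size of a long-range EV pack); the dollar-per-kWh figures are the ones cited above.

    # Battery pack cost at the $/kWh levels cited above, for a
    # hypothetical 60 kWh pack (the pack size is an assumption,
    # not a figure from the policy brief).
    PACK_KWH = 60

    for cost_per_kwh in (750, 200, 150, 125):
        print(f"${cost_per_kwh}/kWh -> ${PACK_KWH * cost_per_kwh:,} per pack")

At $750/kWh the battery alone would cost $45,000; at $125-150/kWh it falls to $7,500-9,000, which is roughly where the projected price parity with conventional vehicles kicks in.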

It may make sense to reduce broadly-available incentives after EVs become more price competitive, but removing them too soon would stall U.S. leadership in a critical technology.

The US federal income tax credit, in particular, is a vital investment in the transition to electric vehicles. The credit is worth up to $7,500 per EV, based on the size of the battery. Most battery-electric vehicles and long-range plug-in hybrids qualify for the full value. However, the credit begins to phase out for a manufacturer once it sells 200,000 electric vehicles in the US.

Market leaders General Motors, Nissan, and Tesla had already passed 100,000 cumulative EV sales as of mid-2017. General Motors and Tesla will likely hit the phase-out first, probably before the end of 2018, especially if their new, more affordable long-range EVs (the Chevy Bolt EV and Tesla Model 3) sell well. The phase-out thus has the perverse effect of penalizing some of the leaders in EVs, notably EVs coming off assembly lines in the US (including all Tesla, General Motors, and Nissan EVs sold in the US), while other manufacturers, like Honda, would have incentives available for years to come.

The federal EV income tax credit phases out for a manufacturer’s EV models once they exceed 200,000 sales. General Motors and Tesla are on pace to hit the sales cap within 18 months, and Nissan is not far behind.
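For readers curious about the mechanics of that phase-out, the sketch below models the statutory wind-down as we understand it: once the phase-out period begins, the credit drops to 50 percent for two quarters and 25 percent for two more before disappearing. The function and its quarter indexing are our own illustration, not language from the tax code.

    # Sketch of the federal EV tax credit phase-out after a manufacturer
    # passes 200,000 US sales (schedule as we understand IRC Section 30D:
    # 50% of the credit for two quarters, 25% for two more, then zero).
    FULL_CREDIT = 7500  # maximum per-vehicle credit, in dollars

    def credit_value(quarters_into_phaseout):
        """quarters_into_phaseout: 0 = first quarter of the phase-out
        period; negative values mean the cap has not yet been hit."""
        if quarters_into_phaseout < 0:
            return FULL_CREDIT          # full credit: $7,500
        if quarters_into_phaseout < 2:
            return FULL_CREDIT * 0.50   # first two quarters: $3,750
        if quarters_into_phaseout < 4:
            return FULL_CREDIT * 0.25   # next two quarters: $1,875
        return 0                        # credit fully expired

    for q in range(-1, 5):
        print(f"quarter {q:+d}: ${credit_value(q):,.0f}")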

State incentives are also important to accelerate the switch from gasoline to electricity for our driving. The largest program, California’s Clean Vehicle Rebate Project, has helped over 200,000 buyers make the change to electric drive. And other states have also stepped up to support the transition to cleaner cars. For example, Josh Goldman blogged recently about Oregon’s newly enacted EV rebate program.

Increasingly, studies predict that sales of EVs will overtake sales of gasoline cars in the next 10-20 years. However, it is still important to support the nascent EV industry, both to increase the number of EVs on the road now and to support the US automakers that are leading this vital transition.

Purchase incentives for plug-in EVs have been a critical policy tool, accelerating the manufacture and adoption of EVs and making them accessible to car buyers. These investments in EV technologies are helping automakers transition to new technologies and enabling Americans to drive cleaner and cheaper.

In particular, the federal EV tax credit is essential: it is important not only for US drivers but also for US manufacturers. With a number of countries announcing bold EV efforts (such as France, China, and India), letting the tax credit expire for leading US EV manufacturers could be a costly mistake.

Now is not the time to end a policy that works. Instead, the federal government should extend the credit to ensure continued progress, build upon success, and keep the United States in the lead with 21st century automotive technology.

Will Republican Mayors Crack the Party’s Wall of Climate Denial?

UCS Blog - The Equation (text only) -

Hurricane Irma approaches landfall in southern Florida, September 10, 2017. Photo: NASA

If any set of Republicans cracks the party’s wall of denial on climate change, it will be those most responsible for dealing with its ravages: mayors. That was reaffirmed during Hurricane Irma when Miami’s Republican mayor, Tomas Regalado, told the Miami Herald:

“This is the time to talk about climate change. This is the time that the president and the EPA and whoever makes decisions needs to talk about climate change. If this isn’t climate change, I don’t know what is. This is a truly, truly poster child for what is to come.”

Irma offered Regalado a crescendo for a message he and other Republican mayors in Florida have repeatedly sent to party leaders. During the 2016 Republican presidential primaries, he and fellow Republican James Cason, who this year retired as mayor of Coral Gables, penned an op-ed in the Miami Herald challenging Republican candidates to understand that the rising seas of climate change will make South Florida “unrecognizable.”

Describing themselves as “staunch” Republicans who otherwise are suspicious of federal regulations, Regalado and Cason wrote that “we shouldn’t waste time” debating the science. Rather, they said it was time to debate how to respond.

“We can debate ways to develop clean energy, how to put a price on carbon and how to protect coastline communities from flooding and storms,” they wrote. “We can debate ways to grow the economy and create new jobs while protecting lives and property from climate change.”

In an interview last year with National Public Radio, Cason said climate change affects everything from low-lying schools and hospitals to the owners of $5 million homes who “see their property values go down because they can no longer get a boat out. When they start flooding, whenever that is, when do they stop paying taxes?”

Republican mayors do not have the luxury of the 30,000-foot denial practiced elsewhere in the party: Florida Governor Rick Scott stands accused of banning “climate change” from official state communications, Environmental Protection Agency Administrator Scott Pruitt claims it is “insensitive” to connect climate change to Hurricane Irma, and Arizona Senator John McCain made news simply by being a lonely Republican calling for “common-sense measures” on climate change.

Rather, Republican mayors say they fret along with their coastal Democratic counterparts about chronic high-tide inundation and the rising intensity and restoration costs of storms.

In Coral Gables, the new mayor, Republican Raul Valdes-Fauli, is a former tax attorney for oil companies. During his campaign, he said climate change “is not just a feel-good issue, it is a very vital issue for Coral Gables.”

In Palm Beach County, Steve Abrams, a Republican who was mayor and is currently a commissioner, last year told the British Guardian newspaper, “We don’t have the luxury at the local level to engage in these lofty policy debates. I have been in knee-deep water in many parts of my district during King Tide.”

In other parts of the nation, far from hurricane zones, other Republican mayors have begun to speak out. Carmel, Indiana, Mayor James Brainard vociferously opposed President Trump’s withdrawal from the Paris climate agreement. In June, he told National Public Radio that the Midwest is “at risk for all sorts of bad things,” particularly “the frequency and intensity of storms” and “the evolution of new pests in the fields outside our metro areas.”

Brainard said his city is taking many steps to fight climate change: making the city more walkable, switching streetlights to LEDs, adding parkland, and requiring all fleet vehicles to be hybrids or run on alternative fuel. He said that, at least in one-on-one conversations, he stops critics of green investment in their tracks by pointing to the city’s dramatically lower electricity costs.

He told NPR that by ignoring the economic benefits of going green, the Trump administration has “missed a political opportunity to expand their base. They’re speaking to a very small portion of the Republican base, and it’s a big missed opportunity for them.”

And in America’s largest city governed by a Republican mayor, San Diego’s Kevin Faulconer continues to set the pace for many of his Democratic big-city counterparts with his city’s plan for all-renewable energy by 2035. Despite some local criticism that his administration is not moving fast enough on bike lanes, tree canopies, and giving communities more say in where their electricity comes from, Faulconer remains a standard-bearer for Republicans who see that wildfires and rising seas are tied to fossil fuels. Last month, he told the prestigious Commonwealth Club in San Francisco:

“It’s time for today’s California Republicans to stop ignoring climate change. If we opt out of the conversation, we’re only going to get extreme one-party solutions. We should be proud to offer our own plans to preserve our environment—plans that don’t plunder the middle class.”

The Republican Party may not yet be proud of the likes of Faulconer and Regalado. But based on the battering of the East Coast and Gulf Coast by hurricanes over the last decade and a half, the blistering heat of the West, the increasing intensity of storms in the Midwest, and the northward spread of crop pests and of diseases carried by mosquitoes and ticks, the party does not have long to opt in to the conversation.

California’s 100% Clean Energy Bill Faces Setback—But Progress Continues

UCS Blog - The Equation (text only) -

California’s Capitol Building. Photo: Henri Sivonen/CC BY (Flickr)

The California Legislature failed to bring Senate Bill 100 (De León) to a full vote on Friday. Had SB 100 passed and been signed into law, it would have accelerated the state’s primary renewable energy program, known as the Renewables Portfolio Standard (RPS), by raising the current requirement from 50 to 60 percent by 2030. It also would have set an ambitious new policy for all electricity produced in the state to come from zero-carbon resources by 2045.

Since Friday was the deadline to move bills in the regular 2017 legislative session, the bill is stalled but not dead. In fact, Assembly member Chris Holden, chair of the committee in which the bill stalled, has said the issue will be revisited in 2018.

Let’s take stock of where we are today: in 2016, California received about 25% of its electricity from eligible renewables. Another 19% came from a combination of nuclear and large hydropower, zero-carbon resources that would be eligible under SB 100. Statewide, we are already on track to exceed the current RPS requirement of 50% by 2030. In the past several years California has made great strides to maintain its position as a worldwide clean energy leader, and the policies already in place ensure that the momentum will continue.

I am disappointed, but not discouraged. I spent a good bit of time working on SB 100 this year, and to me the fact that we couldn’t pass it in one year is not cause for despair. As I’ve said before, setting a goal to completely decarbonize California’s electricity sector by 2045 is bold and aspirational, and it should not be a surprise that a big new energy policy will take multiple legislative sessions to hammer out some of the details.

I am also encouraged that conversations at the end of the year were not about whether a zero-carbon electricity grid is the right path for California’s future but rather what that path should look like. I look forward to continuing the discussion and negotiation in January when the legislature returns. Reducing carbon emissions and air pollution by transitioning away from fossil fuels is one of the most important actions our country and world must take to avoid the worst consequences of climate change. While California’s share of global emissions is relatively small, transitioning completely away from fossil fuel-based electricity for the world’s sixth-largest economy would break new, important ground for other states and countries to follow. 2018 should be an exciting year.

What the Northeast Could Build With a Transportation Cap and Invest Program

UCS Blog - The Equation (text only) -

While the Northeast region struggles to make significant progress in reducing pollution from transportation, our neighbors and allies in California and Canada are investing billions of dollars in clean mobility solutions thanks to their successful implementation of a cap and invest program covering transportation emissions.

Today California finalized its plan to invest more than $2 billion over the coming year in initiatives designed to reduce oil use and pollution from transportation. These investments will make it easier for California residents to purchase an electric vehicle, or to save money by trading in an old gas-guzzling car for an efficient conventional vehicle or hybrid. They will improve public transportation services, both in California’s big cities and in its small towns and rural counties. They will provide more affordable housing in communities near public transportation. And they will create jobs, reduce emissions, and save consumers money.

Meanwhile, our neighbors in Ontario and Quebec are projected to spend $2.1 billion and $1.9 billion, respectively, on clean transportation programs by 2020.

These jurisdictions are making investments on a far greater scale than anything currently happening in any state in the Northeast. They are able to do so because unlike the Northeast, California, Ontario, and Quebec have enacted a comprehensive climate policy that establishes enforceable limits on pollution from transportation, holds polluters accountable for their emissions, and provides a dedicated funding source for clean transportation investments.

This policy, known as “cap and trade” but more accurately called “cap and invest,” is run through the increasingly misnamed “Western” Climate Initiative (WCI), an international carbon market that now limits emissions in a region covering over 60 million people in the United States and Canada.

Cap and invest is not new to the Northeast. Under the Regional Greenhouse Gas Initiative (or RGGI), the Northeast established the first market-based limit on pollution from power plants, and used the funds generated by the program to invest in efficiency and clean energy. Thanks in part to this policy, Northeast states have dramatically reduced pollution from electricity. Unfortunately, the Northeast states have yet to take the next logical step and enact a similar policy to limit emissions from transportation, which is now the largest source of pollution in the region.

As a result, Northeast states are missing out on an opportunity to make investments that will reduce pollution, save consumers money, increase economic growth, create jobs, improve public health, and reduce our use of oil. If the Northeast had a program similar to WCI covering transportation pollution, it could raise up to $4.7 billion every year for clean transportation initiatives in the Northeast.

Here are some of the things that we could build in the Northeast with a cap and invest program:

Better transit

Unlike diesel and natural gas vehicles, electric trucks and buses, like the BYD articulated bus pictured here, produce no hazardous exhaust emissions.

At a time when we need to be making transformative investments in public transportation, the agencies tasked with maintaining and expanding our transit systems are chronically short of funds. While public transit use is near an all-time high, a variety of factors, including inflation and increasing fuel efficiency, are reducing real gas tax revenues. Limited transportation funding has led to several well-publicized transit failures in New York City, Boston, New Jersey, and other parts of the Northeast.

Potential annual state revenues at $14.75 per ton, in millions of dollars:

State            Total      Transit (48%)   Sustainable          Clean Vehicles
                                            Communities (26%)    (26%)
Connecticut       246.33       118.24           64.04               64.04
Delaware           67.85        32.57           17.64               17.64
D.C.               17.70         8.50            4.60                4.60
Maine             143.08        68.68           37.20               37.20
Maryland          452.83       217.36          117.73              117.73
Massachusetts     469.05       225.14          121.95              121.95
New Hampshire     109.15        52.39           28.38               28.38
New Jersey        954.33       458.08          248.12              248.12
New York        1,181.48       567.11          307.18              307.18
Pennsylvania      983.83       472.24          255.79              255.79
Rhode Island       66.38        31.86           17.26               17.26
Vermont            53.10        25.49           13.81               13.81
Total           4,745.08     2,277.64        1,233.72            1,233.72
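The arithmetic behind the table is simple; below is a minimal Python sketch of the 48/26/26 split, using Connecticut’s total as an example. The shares come from the table header, and last digits may differ from the table by a cent because of rounding conventions.

    # Allocation split from the table header: 48% transit, 26% sustainable
    # communities, 26% clean vehicles. Amounts in millions of dollars.
    SPLITS = {"Transit": 0.48, "Sustainable Communities": 0.26,
              "Clean Vehicles": 0.26}

    def allocate(total_millions):
        """Split a state's projected annual revenue by program share."""
        return {name: round(total_millions * share, 2)
                for name, share in SPLITS.items()}

    print(allocate(246.33))  # Connecticut's projected total from the table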

Almost half of the transportation funding from California’s program will go towards improving public transportation services in the state. The long list of programs and projects that will be funded (at least in part) from California’s climate program includes transit expansions in every major metro region, high speed rail, bus service improvements in dozens of small towns and rural counties, replacement of diesel buses with electric buses, and programs to provide low or reduced fares for low income residents and college students.

Clean vehicles

Both California and most Northeast states offer rebates to make electric vehicles more affordable for drivers, but California’s programs are larger, more comprehensive, and more specifically targeted at moderate- and low-income drivers. For example, low-income drivers who trade in a gas guzzler for an electric vehicle can qualify for a rebate of up to $14,000 through the state’s Enhanced Fleet Modernization Program.

California is also expanding its efforts to provide vehicle financing assistance to help residents who lack the credit to purchase or lease clean vehicles. These investments have helped California achieve electric vehicle sales six times higher than the Northeast’s.

California also provides rebates of up to $110,000 for businesses that replace diesel buses and trucks with zero-emission vehicles, which can dramatically improve air quality in low-income communities. Finally, California is using funds from its climate program to build electric car-sharing networks in Los Angeles and Sacramento.

Sustainable communities

People want to live in communities with access to multiple transportation choices, if they can afford it. But rising demand for, and limited supply of, transportation-accessible housing is contributing to a housing affordability crisis in every major metropolitan area in the Northeast. Five of the eight metro areas with the highest monthly rents in the United States are in the Northeast; the other three are in California.

High housing costs have enormous implications for racial and economic equity. The cost of housing also has a significant impact on climate emissions: as families find themselves unable to afford communities with strong transportation choices, they are forced to relocate to communities with cheaper rent but higher fuel consumption.

California has spent over $700 million to date from its climate program on affordable housing and sustainable community programs. The largest of these is the Affordable Housing and Sustainable Communities (AHSC) program, which provides grants for affordable housing and for bike and pedestrian infrastructure projects that reduce global warming emissions. In the most recent year for which data are available, AHSC-funded projects created 2,427 affordable housing units near transit that will reduce emissions by over 800,000 metric tons.

Pollution from transportation is the largest source of emissions in the Northeast, responsible for over 40 percent of the region’s total. Solving this problem will require bold new policies to transition our transportation system away from gas-guzzling automobiles toward electric vehicles, transit, and sustainable communities. Cap and invest is a policy model that has proven effective, both in the Northeast under RGGI and as a strategy for reducing transportation emissions in California, Ontario, and Quebec. We encourage the Northeast states to consider adopting this model as a key component of our strategy to promote clean transportation in the region.

We Visualized the US Nuclear Arsenal. It’s Not Pretty.

UCS Blog - The Equation (text only) -

International security experts often refer to the twin goals of military policy: to minimize the risk of war and to minimize the damage should war start.

Because nuclear weapons are so destructive, the goal must be to eliminate—and not just minimize—the risk of nuclear war, which will require eliminating nuclear weapons.

Until then, it is essential that nations with nuclear weapons minimize both the risk and consequences of a nuclear war.

Numbers matter.

The consequences are directly related to the number of weapons used, which is limited by the number of weapons a nation has. Depending on the targets, the use of even a small number of weapons can result in horrific consequences.

For example, climate scientists using the latest climate models find that if India and Pakistan each used 50 of their weapons against the other’s cities, fires would inject so much soot into the atmosphere that the global climate would be affected for a decade. The decreased sunlight and lower temperatures would result in lower agricultural productivity and could lead to the starvation of over 1 billion people. This would be in addition to the people directly killed by the weapons.

Policy-makers and military officials often refer to the US nuclear arsenal as “our deterrent,” as if it were some sort of disembodied force rather than actual weapons. A “deterrent” is obviously a good thing, whereas “nuclear weapons” are more problematic. So, let’s take a look at what this “deterrent” actually consists of.

We have a new web graphic that displays all the US nuclear weapons. It provides a step-by-step visualization of the weapons the US deploys on land-based missiles in underground silos, on submarines, and on aircraft. All of these 1,740 weapons are ready for use.

But that is not all. The graphic then adds in the weapons the US keeps in storage for potential future use.

It all comes to a whopping 4,600 nuclear weapons.

Take a look—you can find more detail about the arsenal on the final page by hovering over each dot.
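For readers who want to experiment with the numbers themselves, here is a minimal matplotlib sketch that draws one dot per weapon, deployed versus stored, using the counts cited above. It is our own illustration, not the UCS graphic; the grid layout and colors are arbitrary choices.

    # One dot per US nuclear weapon: 1,740 deployed (red), the remainder
    # of the 4,600-weapon total stored (gray). Layout choices are arbitrary.
    import matplotlib.pyplot as plt

    DEPLOYED, TOTAL = 1740, 4600
    COLS = 100  # dots per row

    xs = [i % COLS for i in range(TOTAL)]
    ys = [i // COLS for i in range(TOTAL)]
    colors = ["crimson" if i < DEPLOYED else "gray" for i in range(TOTAL)]

    plt.figure(figsize=(10, 5))
    plt.scatter(xs, ys, s=4, c=colors, marker="s")
    plt.gca().invert_yaxis()  # fill the grid from the top down
    plt.axis("off")
    plt.title("US nuclear weapons: deployed (red) vs. stored (gray)")
    plt.show()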

Policy matters, too.

Policy is what determines the risk of nuclear war.

As the graphic notes, the US keeps its land-based missiles on hair-trigger alert. Why? To allow the option of launching all these weapons in response to warning of an incoming attack from Russia. The warning is based on data from US satellites and ground-based radars, which are processed by computers.

No problem there, right? Wrong—not surprisingly, there have been false alarms in the past. Even more troubling, it takes only some 25 minutes for a missile to travel between Russia and the US. By the time the data has been analyzed, the president has about 10 minutes to decide whether or not to launch US missiles.

Which brings us to the next issue highlighted by the graphic: the president has the sole authority to use the 1,740 deployed nuclear weapons—meaning he or she can order an attack of any kind without the input of anyone else. Unless there is reason to think the president is incapacitated (e.g., drunk), the military is obligated to follow orders and launch.

Finally, it turns out that US nuclear weapons are not just a “deterrent” to dissuade other countries from using nuclear weapons first because the US could respond in kind. US policy also allows the first use of nuclear weapons against another country.

The US could reduce the risk of nuclear war by changing these three policies—by removing its land-based missiles from hair-trigger alert and eliminating launch-on-warning options from its war plans; by requiring the involvement of other people in any decision to use nuclear weapons; and by adopting a no-first-use policy. To reduce the potential consequences of war, it would need to dramatically reduce its arsenal.

And taking these steps would still leave the US with a strong “deterrent.”

Consumer Product Safety Commission Takes On Flame Retardants

UCS Blog - The Equation (text only) -

In 2014, Earthjustice and the Consumer Federation of America, on behalf of a broad coalition of health, consumer, science, and firefighter organizations, petitioned the Consumer Product Safety Commission (CPSC) to ban a class of flame retardants, additive organohalogen flame retardants, as hazardous substances in children’s products, furniture, mattresses, and electronic casings.

Graphic: Consumer Federation of America

The CPSC is considering whether or not to grant the petition and held a public hearing yesterday, at which I testified regarding the risks of flame retardants in households and the way in which flame retardant manufacturers and their lead trade association, the American Chemistry Council (ACC), have fought hard to keep these hazardous products on the market. The ACC is not a stranger to using the disinformation playbook. As we documented in our 2015 report, Bad Chemistry, the ACC has wielded influence to delay and quash important safeguards on a long list of chemicals, has funded science to exaggerate the chemicals’ effectiveness at lowering fire risk, and has employed innocuous-sounding front groups to do its dirty work without disclosing its relationship. A 2012 Chicago Tribune series did an excellent job of bringing much of the trade association’s activities to light.

The Commission will vote next week on whether to grant the petition and begin developing a proposed rule to ban these chemicals. We hope the Commission will heed the recommendations of a long list of scientists and public health and legal experts who agree that the CPSC has the legal authority and the scientific backing to ban these chemicals.

My testimony is below.

Good afternoon, I would like to thank Chairwoman Buerkle and the CPSC Commissioners for the opportunity to testify before you today on this important issue. My name is Genna Reed. I am the science and policy analyst at the Center for Science and Democracy at the Union of Concerned Scientists. With more than 500,000 members and supporters across the country, we are a national, nonpartisan, non-profit group, dedicated to improving public policy through rigorous and independent science. The Center for Science and Democracy at UCS advocates for improved transparency and integrity in our democratic institutions, especially those making science-based public policy decisions.

The Union of Concerned Scientists stands with other members of the scientific community in supporting this petition calling upon the Consumer Product Safety Commission (CPSC) to declare organohalogen flame retardants (OFRs) as a hazardous class of chemicals and to ban their use in children’s products, furniture, mattresses and the casings surrounding electronics. The scientific evidence laid out in the petition supports this regulatory change. The CPSC has the authority to protect the public from toxic substances that “may cause substantial personal injury or substantial illness.”

Since the Center’s inception, we have worked to protect scientific integrity within the federal government and called attention to instances of special interests mischaracterizing science to advocate for specific policy goals. The work of the chemical industry and its trade association, the American Chemistry Council, to sow doubt about the science on chemicals’ health impacts, including those of flame retardants, is an egregious example of this inappropriate behavior.

The companies that manufacture OFRs have put significant time and money into distorting the scientific truth about these chemicals. As a 2012 Chicago Tribune investigative series noted, the chemical industry “has twisted research results, ignored findings that run counter to its aims and passed off biased, industry-funded reports as rigorous science.” In one case, manufacturers of flame retardants repeatedly pointed to a decades-old government study, arguing the results showed a 15-fold increase in time to escape fires when flame retardants were present. The lead author of the study, however, said industry officials “grossly distorted” the results and that “industry has used this study in ways that are improper and untruthful,” as the amount of flame retardant used in the tests was much greater than would be found in most consumer items. The American Chemistry Council has further misrepresented the science behind flame retardants by creating an entire website to spread misleading ideas about flame retardants as safe and effective, even though research has consistently shown their limited effectiveness. In doing so, the American Chemistry Council and its member companies have promoted the prevalent use of OFRs at the expense of public health.

Looking at these chemicals through a strictly objective lens illustrates the need for CPSC’s swift action. Toxicity and exposure data support the assessment of organohalogen flame retardants as a class of chemicals under the Federal Hazardous Substances Act (FHSA). Properties shared by OFRs include their semivolatility and their ability to migrate from consumer products into house dust, and exposure has been associated with a range of health impacts including reproductive impairment, neurological effects, endocrine disruption, genotoxicity, cancer, and immune disorders. As a class, there is an adequate body of evidence supporting the conclusion that these chemicals have the “capacity to cause personal illness” and therefore meet the definition of “toxic” under FHSA. Perhaps most egregiously, biomonitoring data have revealed that communities of color and low-income communities are disproportionately exposed to flame retardant chemicals and bear higher levels of them, adding to the cumulative chemical burden these communities already experience, from increased fine particulate matter from power plants or refineries in their neighborhoods to higher levels of contaminants in their drinking water.

I’ve seen firsthand the persistence of the earliest form of flame retardants, polychlorinated biphenyls (PCBs), which still plague the sediment and water of the Hackensack Meadowlands just a couple of miles from where I grew up in New Jersey. One of my first jobs was in the chemistry division of the Meadowlands Environmental Research Institute, where I spent my days extracting PCBs and organochlorine pesticides from the soil and sediment of the Meadowlands and analyzing the data. Despite being banned in 1977, these chemicals are still found in dangerously high amounts in industrial hotspots across the country, and they continue to bioaccumulate in a range of species. The PCB ban happened decades ago, and we are still managing the damaging impacts of the chemicals’ prevalence. The next generation of these chemicals, organohalogen flame retardants, is inside our own homes in a range of products, thanks in large part to the disinformation campaign sown by special interests. The fact remains that the science does not support their continued use.

Seeing firsthand the persistence of PCBs in my local environment inspired me to use my scientific training to work to design or improve policies that minimize public health and environmental risks to prevent future scenarios of chemicals overburdening ecosystems and households. That is why I’m here today to ask the CPSC to act with urgency to grant this petition and further regulate OFRs to protect our children and future generations.

Thank you.
