Combined UCS Blogs

Lost in Space? The Zuma Satellite

UCS Blog - All Things Nuclear (text only) -

Many people awaited last Sunday’s Falcon 9 launch from Cape Canaveral of a highly classified US payload. The launch had been delayed for weeks, speculation as to the satellite’s purpose was rampant, and successfully delivering national security satellites to orbit is an important part of SpaceX’s business.

The launch, however, remains shrouded in mystery.

Shortly after the launch, Bloomberg reported that the satellite was lost, citing US Strategic Command’s statement that it was not tracking any objects. The Wall Street Journal reported that Congress was being briefed on a failure, attributed to the satellite not separating from the final stage, so that both were deorbited together.

A Verge story notes that neither SpaceX (the launcher) nor Northrop Grumman (the contractor that built the satellite) declared the mission a success after launch. SpaceX’s president said that the Falcon 9 “did everything correctly” and that the company did not have a failure requiring investigation. Northrop Grumman stated that it does not comment on classified missions. Northrop Grumman provided the equipment that connects the satellite to the final rocket stage and that is eventually meant to separate them, so SpaceX’s claim that nothing went wrong on its end could still be consistent with an overall failed mission.

What could the Zuma satellite be?

The Zuma satellite (USA 280) is curious. It’s a classified satellite, so there’s no public description of its purpose. Satellite watchers usually pick up clues about a classified satellite’s purpose from who made it and what orbit it is placed in. For example, spy satellites that image the ground in visible light often use sun-synchronous orbits (close to a polar orbit) so that they see the earth at a constant sun angle, which is helpful in detecting changes. Signals intelligence satellites tend to be at around 63 degrees inclination (the angle the orbit makes with respect to the equator).

Because there was no pre-launch announcement of orbital parameters, and the Space Track catalog does not provide them (it never does for such classified missions), we don’t know exactly what orbit it was meant to reach. But the approximate inclination can be inferred from where the hazard zones for its launch were located.

Marco Langbroek created this image of the Zuma launch hazard zone (in red in Fig. 1) for his blog:

Fig. 1 (Source: Marco Langbroek)

This indicates that the satellite was launched into an orbit inclined around 50 degrees to the equator, similar to the International Space Station (ISS). Not many satellites use low earth orbits with 50-degree inclinations, except for satellites that were deployed from the space station and so end up there. (See for yourself by sorting the satellites in the UCS Satellite Database.)
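As a rough illustration of the geometry involved (a back-of-the-envelope sketch, not part of Langbroek’s analysis): for a direct-injection launch, the inclination follows from the launch site’s latitude and the launch azimuth via cos(i) = cos(latitude) × sin(azimuth). The azimuth below is an assumed value, chosen only to show that a northeasterly launch from Cape Canaveral (latitude about 28.5 degrees north) lands near a 50-degree inclination.

```python
import math

def inclination_from_azimuth(lat_deg, azimuth_deg):
    """Approximate inclination for a direct-injection launch.

    Uses the spherical-geometry relation cos(i) = cos(lat) * sin(azimuth),
    ignoring Earth's rotation and any dogleg maneuvers.
    """
    lat = math.radians(lat_deg)
    az = math.radians(azimuth_deg)
    return math.degrees(math.acos(math.cos(lat) * math.sin(az)))

# Cape Canaveral sits near 28.5 deg N; an assumed azimuth of 47 degrees
# (measured from north) gives an inclination close to 50 degrees.
print(inclination_from_azimuth(28.5, 47.0))  # ~50
```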

One other recent classified satellite, USA 276, was launched into that type of orbit, and in a similar direction as Zuma. That satellite went not only into the same orbital inclination as the ISS but also into the same orbital plane. It was subsequently observed by the amateur observing community making a close approach to the ISS while docking operations were under way at the station. Marco has a fascinating analysis of it in The Space Review.

What happened to it?

Zuma (USA 280) is still listed as a payload on orbit by the US space surveillance system (Fig. 2), as of this writing (January 12). So something made it into orbit and went around at least once. The object is listed as a payload and not as launch debris, indicating it is the satellite.

Fig. 2 (Source: Screen capture from Space Track)

Marco’s blog also reports a sighting of the re-entry of an object that squares with the predicted time for the (intentional) de-orbit of the Falcon 9’s final stage, so that stage appears to no longer be in space. This is consistent with the successful placement of the satellite into orbit and the disposal of the last stage. (That’s good space “hygiene.”)

So there are a few possibilities:

  1. The Zuma satellite failed to separate from the final stage and returned to earth along with it, so no satellite is in orbit. If this is the case, the Space Track catalog will eventually be updated and USA 280 will be removed. But this seems unlikely, since the satellite was still catalogued as being in orbit four days after launch.
  2. The satellite is in orbit. Indications this is the case would be that it remains in the catalog, and that amateur observers on the ground get a view of it. These observers use binoculars and telescopes to see satellites in reflected sunlight, and they are quite skilled at hunting satellites. However, they won’t get a chance to weigh in for a couple of weeks as the satellite won’t be optically visible in the regions of the northern hemisphere where most of them are. It’s possible that in the interim, the satellite will maneuver to another orbit, so finding it after a couple of weeks will be difficult.

Whether the satellite is functioning as intended would be difficult to tell, at least at first. If satellite watchers manage to see it and determine its orbital parameters over a period of time, they may be able to tell whether it performs any maneuvers. An on-orbit maneuver is a positive sign that the satellite is at least alive, although it doesn’t say whether the satellite is performing as designed. The lack of such maneuvers, especially if the satellite is in a relatively low orbit and would ordinarily need to compensate for atmospheric drag, can indicate that it is not functioning (a rough sketch of that kind of check appears after this list). Radars should be able to track the satellite, so presumably countries with space surveillance-capable radars, such as China and Russia, already know quite a bit more about this.

  3. While there is some precedent for using a launch failure as a cover story for a stealthy satellite (Misty), it’s hard to keep a satellite reliably hidden. (Note that the US has much more invested in space surveillance than other actors, so this would be even more difficult for countries other than the US.)

There are things that you can do to make it harder to see a satellite. You can minimize its radar reflectivity so that Russian and Chinese radars would have a harder time seeing it. You can minimize how reflective it is in the sunlight so that ground-based optical observers would have a hard time seeing it, too. Or you might make the satellite’s orbit unpredictable by maneuvering, so trackers must perform a time-consuming search for it each time they want to see it.

You’d probably need to do all these things at the same time to have hope of being stealthy for a significant period of time, and these techniques put a lot of constraints on the satellite itself. And one cannot credibly hope to stay stealthy indefinitely.
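Returning to the maneuver question raised above: here is a minimal sketch of the kind of check observers could run, assuming they have a series of orbit determinations. Atmospheric drag can only shrink the semi-major axis, so a significant increase between determinations points to a thruster firing. The element values below are invented for illustration.

```python
# Flag likely maneuvers from a series of (epoch_day, semi_major_axis_km) fits.
# Drag only lowers the semi-major axis, so a meaningful increase suggests the
# satellite maneuvered. The track values here are made up for illustration.

def flag_maneuvers(track, tolerance_km=0.5):
    """Return epochs where the semi-major axis grew by more than the tolerance."""
    return [t1 for (t0, a0), (t1, a1) in zip(track, track[1:]) if a1 - a0 > tolerance_km]

track = [(0, 6778.0), (10, 6777.6), (20, 6777.1), (30, 6779.4), (40, 6779.0)]
print(flag_maneuvers(track))  # [30] -- a jump consistent with a reboost
```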

What’s curious about Zuma is that the bits of information don’t yet add up to a coherent story. There’s more information to come which may help—stay tuned!

The Trump Administration’s Dangerous New Nuclear Policy

UCS Blog - All Things Nuclear (text only) -

Last night the Huffington Post released a draft version of the Trump administration’s Nuclear Posture Review, a deeply dangerous document that makes nuclear war more likely. UCS has a press statement on the draft, and below is a compilation of some additional quick thoughts on the draft, with more to come.

+ + + + +

The Trident II D5 missile

The Trump NPR calls for a new, low-yield warhead for the Trident submarine-launched missile. The NPR premises the need for that warhead on the idea that the following systems will not be able to penetrate enemy air defenses to attack enemy targets:

  1. US dual-capable aircraft—including the new F-35A stealth fighter—armed with gravity bombs, including the new, high-precision, low-yield B61-12;
  2. The dual-capable aircraft of allied countries in Europe that currently host US nuclear weapons;
  3. US B-2 stealth bombers armed with gravity bombs, including the new B61-12;
  4. US B-52 bombers armed with air-launched cruise missiles and the future long-range standoff (LRSO) cruise missile; and
  5. the future B-21 “Raider” stealth bomber armed with gravity bombs and cruise missiles.


If that is the case, why are we spending hundreds of billions of dollars to deploy new stealthy nuclear-capable fighter aircraft and bombers, new gravity bombs, and new cruise missiles? The NPR calls for an unrealistic spending spree that is not justified by security needs.

+ + + + +

The Trump NPR significantly reduces the threshold for the use of nuclear weapons by explicitly listing a wide array of non-nuclear attacks on the United States that could constitute grounds for a US nuclear response, including attacks on civilians, infrastructure, nuclear forces, command and control, and early warning systems.

This is a dramatic change from the Obama administration’s nuclear policy, which explicitly sought to limit the roles and purposes of US nuclear weapons. It also reverses the trend of every administration since the end of the Cold War, Republican and Democratic alike. The Obama NPR set as a goal declaring that the sole purpose of US nuclear weapons is to deter a nuclear attack on the United States, its military forces, and its allies. It wanted to make nuclear war less likely. This document explicitly rejects that goal and in doing so makes nuclear use more likely.

The NPR also calls for tighter integration of nuclear and conventional forces. That deliberately blurs the line between the two and eliminates a clear nuclear firebreak.

+ + + + +

The Trump NPR reverses the plan to retire the B83, a gravity bomb with a massive 1.2 megaton yield—by far the largest in the current US stockpile. Because the military had little use for this Cold War behemoth, the Obama administration had pledged to retire the B83 as soon as confidence was gained in the new B61-12 bomb, as a way to build support in Congress for the new B61. This document says it will keep the B83 “until a suitable replacement is identified.” That could be the B61-12, but there is no commitment to it.

+ + + + +

This document returns to the tired and inaccurate concept of “gaps” in US capabilities that ostensibly require new weapons systems to fill. President Kennedy campaigned on the idea of a “missile gap” when in fact it was the United States that had many more missiles than the Soviet Union. The document points to a “gap” in low-yield options that supposedly drives the need for new systems. But there is no “gap.” The US has multiple systems on multiple platforms able to deliver low-yield weapons.

+ + + + +

The document argues that Russia and China are modernizing their nuclear arsenals, and the United States is not. That is utter hogwash. The United States has been modernizing its forces consistently for the last several decades, but it has done so without building new systems. It has upgraded and improved the systems it already has. For example, a decade ago the United States still had submarines armed with Trident C4 missiles, which were not very accurate. Now, not only does every submarine carry the D5 missile, accurate enough to attack hardened targets, but those missiles are being updated, with newly built motors and improved guidance systems making them even more accurate. The W76-1 warheads on those missiles have also been improved, further increasing the ability to hold hardened targets at risk. And that system comprises the bulk of the US nuclear stockpile.

It’s also important to recognize that China’s nuclear arsenal remains tiny in comparison to the US arsenal. The United States has more than 1,500 strategic warheads on three types of delivery systems. China has far fewer than 100 warheads on missiles capable of reaching the United States, and the warheads are not even mated to the missiles. They are fully de-alerted. There is zero comparison to US forces.

NAACP’s MLK Day initiative makes solar more accessible

UCS Blog - The Equation (text only) -


The sun shines on everyone, and the benefits of solar energy can too. Consider the synergies of community-supporting solar and how this model can spread. Solar can create jobs, clean the air, and replace fossil fuels. As Dr. Martin Luther King said: “We refuse to believe that the bank of justice is bankrupt. We refuse to believe that there are insufficient funds in the great vaults of opportunity of this nation.”

Smiles in the sunshine. Credit: Solar Energy Industries Association

The NAACP is launching a civil rights economic and environmental justice initiative to connect 30+ communities of color and low-income communities across the nation with solar energy infrastructure for homes and community centers, as well as skills training for solar jobs, all supported by strengthened solar equity policies. Partners supporting this national initiative include GRID Alternatives, Solar Energy Industries Association, Sunrun, United Methodist Women, Vote Solar, and others. The Solar Equity Initiative will advance the aims of multiple NAACP civil rights initiatives: Environmental and Climate Justice, Economic Development, Labor, Education, Health, and Criminal Justice.

Installing solar on community buildings will lower energy bills and strengthen the budgets of those service providers. Any non-profit can take this up, and the funds raised can be tax-deductible. Profit-minded owners of commercial buildings do this with tax credits; churches can do it with tax-deductible donations.

The NAACP is kicking off this initiative at the Jenesse Center in Los Angeles, where the installation will provide that service organization with estimated lifetime financial savings of $48,825. These savings will enable Jenesse to infuse more funds into its life-saving services. Similar environmental and economic savings will be replicated in the other installations under this initiative, and in other communities that combine social equity and clean energy.

Communities of color and low-income communities are disproportionately impacted by pollution-emitting power plants, which affects health, education, and incomes. Environmental justice can be served by using community-based solar to replace the fossil fuel burned at old power plants, and by removing those plants entirely.

The NAACP has taken this direct action as part of its Environmental and Climate Justice Program. Toolkits, links to local efforts, full-length films, videos, and other resources are all available from the NAACP.

UCS is developing the science and tools to make the direct replacement of power plants with solar, efficiency, and storage or demand response a common practice. We have been inspired by the early work of Elaine Krieger and by real cases in California where power plants are to be closed, replaced, or never built as solar plus storage fills the need. We have watched with hope as the solar solutions for Puerto Rico start to take shape, and as legislation in Illinois paves a path to solar for low-income households.

To close with more words from Dr. King:

“Now is the time to make real the promises of democracy. Now is the time to rise from the dark and desolate valley of segregation to the sunlit path of racial justice. Now is the time to open the doors of opportunity to all of God’s children. Now is the time to lift our nation from the quicksands of racial injustice to the solid rock of brotherhood.”



Latest EPA Automaker Reports Show Compliance with and Success of Standards

UCS Blog - The Equation (text only) -

Today, EPA released its annual reports on new passenger vehicles. One report (Trends) highlights the historical trend in fuel economy for cars and trucks over time, while the other report (Compliance) discusses the progress of manufacturers towards meeting global warming emissions regulations now under attack by industry and this administration.

Fuel economy of the fleet has once again improved, from 24.6 miles per gallon (mpg) in 2015 to 24.7 mpg in 2016. Thanks to strong standards, every type of vehicle (car, truck, SUV) has gotten more efficient; however, consumers are choosing to purchase more SUVs, which is diminishing the level of improvement we need to see to reduce global warming emissions in line with our long-term climate goals.

Taken together, the key findings from both reports are clear:  1) every type of vehicle is getting more efficient, driven by strong standards, and that’s great news for consumers; 2) despite a meager overall improvement in fuel economy, manufacturers continue to comply with the standards; and 3) there’s still a huge opportunity for future fuel economy improvements, as manufacturers continue to bring newly redesigned vehicles to market.

All types of vehicles are getting more efficient

Increasing sales of SUVs are making it more difficult to achieve our climate goals, but strong standards pushing all vehicle classes to be more efficient continue to be key to reducing our climate impacts.

The Trends report shows clearly that the regulations are doing what they were intended to do—every single class of vehicle is getting more efficient, including the fast-growing SUV segment. In fact, every class of vehicles except vans/minivans achieved record levels of fuel economy in 2016. This is critical both to provide consumers with fuel-efficient choices no matter what type of vehicle they might be interested in, and to diminish the negative climate impacts of a more truck-centric vehicle mix.

The class of car-based SUVs that are so popular right now (including the Honda CR-V and Nissan Rogue) actually showed the greatest year-over-year improvement.  This is not surprising—Ford CEO Jim Hackett acknowledged that fuel economy is one of the major reasons why crossover sales are doing so well.

Some automakers claim that selling more SUVs means consumers don’t care about fuel economy, but the numbers tell a different story. Consumers continue to show that fuel economy is important, particularly when it comes to SUVs—the Consumer Federation of America showed that SUVs with a marked improvement in fuel economy (10 percent or better) outsold their competitors.

Automakers are complying with the standards

All large-volume manufacturers are entering the 2017 compliance year with a massive bank of credits to draw upon to aid with compliance during a lull in product turnover.

As I’ve reported in years past, the industry as a whole has been ahead of the regulatory targets—this means that manufacturers have built up a bank of overcompliance credits, which many of them are now drawing upon. Some in the media may seize on this and say it means the automakers are not complying with the rules—however, that ignores both the way the rules work and how vehicles are planned.

Manufacturers are measured on compliance over a 5-year period because that is the typical product cycle of a single vehicle. Once every five years (give or take), a vehicle undergoes a “redesign” in which major changes occur—this includes body shape and major crash-safety structural elements as well as the size and efficiency of the engines, which set the performance characteristics and, importantly, fuel economy. Around the middle of a product cycle, a vehicle receives a “refresh,” which may bring cosmetic alterations and perhaps minor changes to the powertrain (like a new transmission, or an additional engine carried over from another vehicle built on the same platform), but the fuel economy and emissions of a vehicle are largely constant over its five-year lifetime.

This means that manufacturers need to use a credit bank to compensate for the fact that a vehicle largely doesn’t improve much over the course of its lifetime—a vehicle will typically earn credits early on for overcompliance when the technology is new, and that overcompliance can then be used to compensate for any shortfalls that occur as the vehicle “ages” before its next major update.
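Here is a stylized version of that arithmetic (the numbers are invented, not drawn from the Compliance report): a model overcomplies in its redesign year, then draws down the surplus as the standard tightens while its own emissions stay flat.

```python
# Stylized credit-bank arithmetic for one vehicle model over a five-year cycle.
# Emissions targets are in grams CO2 per mile; credits are simplified to
# (standard - actual) * annual sales. All numbers are illustrative assumptions.

standard = [250, 244, 238, 232, 226]   # target tightens each model year
actual = [235, 235, 235, 235, 235]     # vehicle is unchanged between redesigns
sales = 100_000                        # assumed constant annual sales

bank = 0
for year, (std, act) in enumerate(zip(standard, actual), start=1):
    credits = (std - act) * sales      # positive = overcompliance, negative = shortfall
    bank += credits
    print(f"Year {year}: credits {credits:+,} -> bank {bank:,}")
# The early surplus covers the later shortfalls, so the model stays in compliance.
```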

In the first few years of the regulation, from 2009 to 2014, manufacturers turned over vehicles at an accelerated pace to introduce new technologies, but turnover has declined for 2015 through 2017. This will correct itself for 2018 through 2020, when these older vehicles are all redesigned.

Today, the fleet is older than usual, so while in a couple years there will be a large opportunity to add new technologies, the Compliance report shows manufacturers are dipping into their credit banks today as planned to compensate for the age of the vehicles.  And because of the early turnover in the first few years of the regulations, the industry was well-prepared by banking hundreds of millions of tons of credits, more than enough to help ensure compliance for years to come.

Manufacturers are investing in efficiency at different rates

Consumers are some of the biggest beneficiaries from these rules, having saved well over $50 BILLION since new standards went into effect thanks to rules designed to make every vehicle type more efficient over time.  And that will be even more important as these more efficient options make their way to the secondary market.  But not all manufacturers are investing equally in providing their consumers more efficient choices.

The Trends report shows that in terms of overall fuel efficiency, Mazda is at the head of the pack.  While some of this is related to its somewhat car-heavy fleet, it continues to focus on improving its conventional gas-powered engines, and deploying these engines broadly across all vehicles.  And they aren’t resting on their laurels, either, having announced the next generation of their engines, bringing diesel-like efficiency to a gas-powered engine.

Unfortunately, Toyota continues to fall behind the rest of the pack, seeing absolutely no improvement in fuel economy compared to last year, which fell short of the year prior—in 2013, Toyota had the 3rd most efficient fleet; for 2016, they have now dropped to 9th, ahead of only Mercedes and the Detroit Three.  While many associate Toyota with efficiency thanks to its Prius family of hybrids, this fall from grace is because Toyota has not made similar investments to improve its trucks and SUVs.  In fact, its Tundra pick-up and 4Runner SUV have been using the same engines since 2010 and 2009, respectively, with the 4Runner one of just three vehicles being sold today still using an outdated 5-speed automatic transmission!

The Compliance report makes clear that no major manufacturer is in danger of falling out of compliance (as I noted at the start), even if some of them are relying more heavily upon their credit banks. But manufacturers like Hyundai and Honda are much better positioned than most, not just because they have such massive banks of credits, but because they have continued to deploy steady improvements across their entire fleets instead of banking on a single green “halo” vehicle like the Toyota Prius.

Manufacturers have a wide range of technologies available to reduce fuel use and emissions, but many “off the shelf” technologies have still not been widely deployed.

The technology assessments in the Trends report indicate clearly that while manufacturers are making progress introducing and improving technologies for conventional vehicles, they have on the whole been slow to deploy those technologies across the fleet. This is why we continue to emphasize that manufacturers can keep complying with the regulations well into the future through continued advancement of conventional gasoline-powered vehicles.

Leaders show industry’s capabilities, while laggards exemplify industry’s past

Last month, we released a report documenting the auto industry’s well-established history of fighting automotive regulations. For better and worse, today’s Trends and Compliance reports encapsulate both where the industry could be headed and the historical pull towards resisting that change.

The indicators I’ve laid out above all show that the standards are achievable and important for both consumers and the climate. Every class of vehicles is getting more efficient, and many in the industry continue to invest in that progress, driven by these standards.  And, because SUVs and trucks represent a growing share of the market, these standards remain as important as ever to ensure continued fleetwide efficiency improvements—the fleet mix shift acts as a drag on achieving our climate goals, so weakening the standards could set us backwards, as occurred in the 1990s.

At the same time, manufacturers are trying to seize upon misinformation about how the standards work and their ability to comply to weaken the rules.  It’s critical that they stop this nonsense so we can continue the progress already set forth.

The Trends and Compliance reports released today indicate that automakers are well on a path to comply with regulations that will nearly double the efficiency of the passenger vehicle fleet by 2025—so instead of fighting it, let’s focus on achieving it and then figuring out what lies beyond so we can continue to meet our climate goals.

Japan’s Role in the North Korea Nuclear Crisis

UCS Blog - All Things Nuclear (text only) -

Japanese Prime Minister Yukio Hatoyama (second from left) consults with US President Barack Obama during a 2010 summit on nuclear security.

During a recent trip to Japan I had the opportunity to discuss Japan’s role in the current North Korean nuclear crisis with Yukio Hatoyama, a former prime minister. He led the Democratic Party of Japan (DPJ) to victory in September 2009, becoming the only Japanese politician to defeat the ruling Liberal Democratic Party (LDP) at the polls since the end of the Second World War.

The DPJ campaigned on wresting political and economic power away from an unelected bureaucracy and returning it to Japan’s elected representatives. Mr. Hatoyama’s perceived inability to deliver on that promise led to a loss of public support and his resignation as leader of the DPJ in June 2010. His party held on to power until it was defeated in December 2012 by a chastened LDP led by the current prime minister, Shinzo Abe.

Hatoyama is concerned about Abe’s approach to the North Korea nuclear crisis. He believes the current Japanese prime minister is providing unwise and provocative encouragement to US President Donald Trump’s threats to launch a pre-emptive military attack. Hatoyama is not alone in that assessment. Most of the Japanese I spoke with during my stay in Japan feel their government should be encouraging dialogue rather than cheerleading for pre-emptive US strikes that could ignite a wider war and invite North Korean retaliation against US military bases in Japan.

Yukio Hatoyama comes from a storied political family, one of the wealthiest in the country. His father, Iichirō, served as foreign minister from 1976 to 1977 under Prime Minister Takeo Fukuda. His grandfather, Ichirō, served three terms as prime minister from December 1954 through December 1956.

Although he retired from electoral politics in 2012, Mr. Hatoyama continues to promote what he believes may be his most important political legacy: the creation of an East Asian regional institution comparable to the European Union. His controversial efforts to advance the idea during his term in office troubled US Japan hands, who worried an Asian version of the EU would undermine the US-Japan relationship, especially since Hatoyama believes greater Japanese cooperation with China is an essential prerequisite for success.

UCS came to know Mr. Hatoyama through colleagues in the Japanese nuclear disarmament community. They were encouraged by his strong support for President Obama’s effort to reduce the role of nuclear weapons in US national security policy, including US security policy in Asia. Together with our non-governmental counterparts in Japan, UCS continues to work with Japanese legislators, the broad majority of whom, from all political parties, support responsible nuclear reductions.

We hope to bring more of their voices to the US debate about US nuclear weapons policy as President Trump’s Nuclear Posture Review unfolds later this year.

Our interview with Mr. Hatoyama was conducted in his Tokyo office on November 21, 2017. An audio file of the interview is available upon request.


UCS: Today we have the honor of speaking with Yukio Hatoyama, the former Prime Minister of Japan and the current Director of the East Asia Community Institute. Mr. Prime Minister thank you for taking the time to speak with us today.

I suppose we should start with the question of North Korea. How do you think about the way the United States and Japan are responding to what North Korea is doing?

Hatoyama: In regards to the North Korean development of nuclear missiles of course it is a reality that this is indeed a threat and in that sense countries around the world should be cooperating together and it may be necessary also to impose sanctions as is being done now. However, the final purpose for these sanctions should always be how to bring North Korea to the dialogue table.

Unfortunately, in Japan Prime Minister Abe has said that the time now for dialogue has finished, but I believe this is incorrect.

And, of course, when we consider why it is that North Korea has gone ahead to develop its missiles and nuclear weapons as well, we need to recognize the fact that while there is a ceasefire agreement in place between the United States and North Korea, the war is not yet over, it’s still just in a state of ceasefire.

When we think about how North Korea is looking to create its own situation as well, it also sees the United States’ nuclear weapons and missiles – that are being maintained – being possessed – as well. And this is also leading it to seek its own nuclear and missile development program.

If we consider that North Korea is looking at its possession of these weapons as a tool for dialogue I think this really shows even more how the fact that dialogue now is more necessary than ever.

UCS: So, you think they are using it to start a dialogue with the United States?

Hatoyama: Yes, I do think so. And I believe it is necessary for us to recognize the fact that while North Korea knows that if they were to launch a nuclear weapon or missile towards the United States their own country, in turn, would be obliterated. They are aware of this. And, therefore, I don’t believe it’s likely they would actually make such an attack.

Therefore, I think instead we should understand their actions as looking at a way to try and seek negotiations with the United States which would allow them to have a more equal position between the two countries.

UCS: One of the things that members of Congress and the critics of the Trump administration’s policy towards North Korea have been discussing is the possibility of an accidental war… because of the rhetoric about the time for dialog being over… sending a signal to North Korea that military action is what happens when the time for dialogue is over.

Do you think Prime Minister Abe’s repeating that phrase about dialogue – the time for dialogue being over – is increasing the risk of an accidental miscalculation that could lead to a war with North Korea?

Hatoyama: Of course, from the part of President Trump, looking at how he mentioned having to consider all possibilities, including attacking through use of force. That is something which perhaps as a president should be considered.

However, this use of force cannot be the first option. That cannot be what is first gone to, whether it includes accidental use or not. Of course, if there were to be an accidental use of weapons by the United States on North Korea, North Korea would retaliate, in turn, against Seoul, against South Korea and against Japan. Of course, this would not be in the interest… not be good for Japan.

Now that Prime Minister Abe is repeatedly saying that the time for dialogue is over, the more he says this – the more he repeats this – the more the risk is increased as well. And this is also not in the interest of Japan.

UCS: A related issue in the United States is China’s role in this whole problem. A lot of American officials and the American media are highly critical of China because they don’t think they’re doing enough. What do you think about that?

Hatoyama: I believe that rather than looking at…criticizing China in terms of its role … or what role it is or is not playing … the fundamental issue at stake here is an issue between the United States and North Korea. China, Japan and South Korea are therefore not central players in this but have the role of looking at how they can cooperate together between these countries to create the conditions and space for negotiations between the United States and North Korea as the two key players in this issue.

Of course, China and these other countries they themselves do not desire a war to break out. While some may be criticizing China for being too generous or too kind towards North Korea, rather we should be looking at how to have more cooperation between China, Japan and South Korea in order to bring the United States and North Korea to the negotiations table.

UCS: Well the main issue is that people in the United States want them to cut off oil and food. Do you think that’s a good idea?

Hatoyama: I believe that cooperation in the direction of sanctions is to an extent necessary. However, we also need to recognize that if North Korea is pushed too far into a corner then it’s unclear what actions they might take, and what means they might take to do this.

When we also consider Japan’s history as having been on the receiving side of economic sanctions – which actually contributed to Japan’s path towards waging the wrong war in the past century as well, this is something that we need to learn from history and recognize that strict sanctions can… well, do not necessarily always lead to positive results. They can actually lead to such negative results as well.

China is saying it will to an extent cooperate as part of the international community on the increase or strengthening of sanctions. We also need to make sure that this is not done in order to, well, let the people of North Korea completely starve. On the contrary, we need to look at what the purpose of this is.

UCS: Well I know our time here today is limited so I have just one final related question, and we’ll just keep the focus on North Korea. And that is the domestic political aspects of the North Korea question in Japan. I was invited to listen from the gallery to Prime Minister Abe’s speech to the Diet last week. North Korea seemed to be a prominent part of the speech. He conveyed the idea that this was an important issue in the last election. Was it? And do you think there is anything that the opposition, in Japan, can do to sort of change the Japanese view of the North Korea question.

Hatoyama: Unfortunately, in the recent election Prime Minister Abe was re-elected by bringing this idea of the threat of North Korea to the fore, and saying this is why we need a stable government in place. This was used to convince the people to vote in favor for him in this past election.

I believe that whether it’s President Trump or any American president, the policy of Japan, which is now being put forward by Prime Minister Abe, following the United States administration fully in its policies is not going to be the way to resolve any kind of issue including the issue of North Korea as well.

When we look at the policy…or Prime Minister Abe stating that the time for dialogue is over.. we’re merely following US policy in regard to North Korea. This is not the way to be able to resolve this issue. Rather, Japan needs to be looking at how it can play a role in bringing the United States and North Korea to the negotiations table, and aim in this direction. This is the direction in which the government should be aiming and the opposition parties should also be pushing the government towards this and encouraging this as well.

The Science of Sovereignty: Two Cases Show How the Future of Voting Rights Depends on the Integrity of Data

UCS Blog - The Equation (text only) -

This week, two major court cases concerning the right to an equal and effective vote revealed how crucial scientific integrity in the courts is going to be if voting rights are to be secured for all Americans. On Tuesday, a federal court threw out North Carolina’s Congressional districting plan as an unconstitutional partisan gerrymander, relying on extensive empirical models and statistical evidence that demonstrated both discriminatory intent and effect. On Wednesday morning, the Supreme Court of the United States heard oral arguments regarding Ohio’s “use-it-or-lose-it” voter list purging process, during which considerable time was dedicated to issues of data integrity and availability. Both cases illustrate the growing importance of our ability to measure equal justice under law, and the degree to which claims of voting rights violations are based on quantitative arguments.

Measuring intent and effect in gerrymandering

The North Carolina decision handed down Tuesday included an extended discussion of the courts’ “obligation to keep pace with the technological and methodological advances so it can effectively fulfill its constitutional role to police ever-more sophisticated modes of discrimination.” In its 205-page opinion, the court reprimanded the defendants for arguing that claims should be dismissed simply because they “rely on new, sophisticated empirical methods that derive from academic research.”

The opinion explicitly relied on such methods to establish the intent of North Carolina’s legislative leaders to discriminate against voters of the opposing party. The court did this through a combination of computer simulations, which showed the near impossibility of the adopted plan being chosen without an intent to discriminate given the traits of that map compared to alternatives, and data visualizations that experts say illustrate the “signature” gerrymandering effect of partisan vote shares being non-linearly distributed across districts.

In finding a discriminatory impact, the court relied on both simulations and statistical tests of partisan asymmetry to demonstrate the near certainty that the governing party engineered itself at least one additional seat, and likely several additional seats, through the adopted plan.
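One widely cited partisan-asymmetry measure in such cases is the efficiency gap, which compares the two parties’ “wasted” votes (all votes for the losing candidate, plus the winner’s votes beyond a bare majority). The sketch below uses invented district tallies, not North Carolina’s actual results, and is only meant to show how the metric is computed.

```python
def efficiency_gap(districts):
    """Efficiency gap for a list of (party_a_votes, party_b_votes) per district.

    Wasted votes are all votes for the loser plus the winner's votes beyond a
    bare majority. The gap is (wasted_A - wasted_B) / total votes; a large
    positive value means the map disadvantages party A.
    """
    wasted_a = wasted_b = total = 0
    for a, b in districts:
        votes = a + b
        total += votes
        majority = votes / 2
        if a > b:
            wasted_a += a - majority
            wasted_b += b
        else:
            wasted_a += a
            wasted_b += b - majority
    return (wasted_a - wasted_b) / total

# Hypothetical four-district map: party A packed into one district and cracked
# across the rest, yielding a large gap in party B's favor.
plan = [(85_000, 15_000), (45_000, 55_000), (44_000, 56_000), (43_000, 57_000)]
print(f"{efficiency_gap(plan):.1%}")  # ~33.5%
```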

Does a non-response “count” as a signal of any relevance?

In the Ohio voter list purge case, Justices repeatedly asked about available data from both sides that would provide an empirical context for the legal arguments. In particular, Justice Sotomayor asked about estimates of how many of those purged from voter lists had actually changed residence out of their districts, as the state assumes. Similarly, Justice Breyer stated that they were looking at “an empirical question” and inquired about the availability of “numbers, or surveys” of residential movement within the state, as well as estimates of what percentage of residents typically throw away mailed notifications, so as to get some grasp of what it means when a voter does not respond to a notification.

Indeed, a crucial challenge to the defense revolved around just what information was obtained from the notifications. Justice Alito pushed Plaintiff’s counsel to assess whether nothing additional was learned, such that the removal was based purely on non-voting, which is prohibited.  Alito suggested that the state learned something, non-response, from the unreturned notifications, but counsel countered that no information about residency was obtained.

Neither side claimed it could provide accurate numbers of the total disenfranchised, even though the arguments critically turn on the extent of discrimination taking place and on how those who do not return notifications should be classified in order to estimate what percentage of non-respondents had actually moved. The plaintiffs’ argument rested partly on the claim that, because the percentage of Americans who move every year is low relative to the share of non-respondent voters in Ohio who were purged, the Ohio process necessarily produces false positives and must “vastly over purge” voters from registration lists.
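A back-of-the-envelope version of that argument, with rates invented purely to show the logic (they are not Ohio figures): if far fewer voters move out of their districts each year than fail to return a mailed notice, then most non-respondents cannot have moved, and a purge keyed to non-response must sweep in many eligible voters.

```python
# Back-of-the-envelope bound on false positives in a non-response-based purge.
# The rates below are assumptions for illustration, not data from the Ohio case.

registered = 1_000_000     # voters sent confirmation notices
move_rate = 0.04           # share who actually moved out of their district
no_response_rate = 0.60    # share who never return the notice

movers = registered * move_rate
non_respondents = registered * no_response_rate

# Even if every mover were among the non-respondents (the most charitable case
# for the purge), the remainder are eligible voters flagged for removal.
min_false_positives = non_respondents - movers
print(f"At least {min_false_positives:,.0f} of {non_respondents:,.0f} "
      f"non-respondents ({min_false_positives / non_respondents:.0%}) did not move.")
```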

An arms race: the technology of discrimination v the technology of empowerment

The amount of time spent on such questions in the Ohio arguments reaffirms the extent to which the identification and measurement of voter behavior is going to be central to voting rights challenges as we move forward. Further, estimating the impact of administrative procedures and eligibility requirements, while statistically difficult, is going to be of even greater importance, because we need to untangle compound effects in order to assess their performance.

Together, this week’s cases show that scientists, the courts, and the public need to advocate for greater scientific integrity, not only in the domain of legislative policymaking, but throughout the policy making process, including litigation, where improved methods and models can “provide a new understanding of how to give effect to our long-established governing principles.”

Trump Political Appointees Interfere in Scientific Grants Process Take Two: The Department of Interior

UCS Blog - The Equation (text only) -

Photo: Gage Skidmore/CC BY-SA 2.0 (Flickr)

The Department of the Interior (DOI) has directed political appointees to begin reviewing discretionary grants to make sure that they align with the priorities of the Trump administration. The discretionary grants include any grant worth $50,000 or more that is intended to be distributed to “a non-profit organization that can legally advance advocacy” or “an institution of higher education.” The memo detailing the directive was sent by Scott J. Cameron, Principal Deputy Assistant Secretary for Policy, Management, and Budget.

The directive is being strictly enforced. Cameron’s memo notes that “Instances circumventing the Secretarial priorities or the review process will cause greater scrutiny and will result in slowing down the approval process for all awards.” The sentence is not only bolded, it’s italicized as well. To explain why the new grants process was needed, Interior Spokeswoman Heather Swift said that “the new guidance continued the responsible stewardship of tax dollars.”

Remember when this happened at the EPA?

The Environmental Protection Agency (EPA) instituted essentially the same grant review process last year. While it is not uncommon for political appointees to get involved in the grants process, their involvement is generally limited to broadening solicitations for grant proposals. The US federal government refers to these solicitations as “Funding Opportunity Announcements,” or FOAs. These FOAs include information about what type of work the agency expects and whether an applicant would be eligible for funding. Thus, an FOA is extremely important for both the government and the applicant because it lays out the agency’s priorities for the funding, which also serves as a guideline for an applicant’s proposal. Political appointees have generally broadened FOAs in the past so that they are more inclusive, not restricted to an administration’s priorities as is happening with DOI’s new process.

Agencies have grant review systems already in place

DOI already had a grant review system in place that worked just fine before this new process came along, which raises the question: why change it? Under the existing system, discretionary grants are reviewed by independent experts who assess grant proposals using a uniform rating or scoring system established by the awarding agency. The proposals are evaluated based on criteria specific to the grant—for some programmatic grants these criteria are dictated by statutory authority (e.g., grants in the brownfields program at the EPA). Therefore, as former Secretary of the Interior David J. Hayes noted, “Subjugating Congress’s priorities to 10 of the Secretary’s own priorities is arrogant, impractical and, in some cases, likely illegal.”

Based on expert criteria, or those set out by statutes, a panel of experts will assign a score to each reviewed proposal and then meet to discuss the merits of each. The proposals that receive higher scores are deemed more competitive relative to those with lower scores. Depending on the amount of funding available for a grant program, the panel will recommend a percentage of the top scoring grants to be funded.

A list of grants recommended for funding is then sent for review to the head of the program, who may or may not be a political appointee. The amount of information the appointee receives about the recommendations varies: sometimes abstracts of the proposals, sometimes just a list of the institutions or researchers recommended for funding. In common practice, however, the head of a program who receives this list generally agrees with the experts’ recommendations. Christine Todd Whitman, EPA administrator under President George W. Bush, chimed in on this issue when it happened at EPA: “We didn’t do a political screening on every grant, because many of them were based on science, and political appointees don’t have that kind of background.”

Will DOI use this new process to delete science?

In the case of EPA political appointees reviewing grants, scientifically defensible language was removed from many descriptions of grant projects, and some grants were rescinded that were already recommended by a panel of experts. It was clear that the new process was specifically set up to undermine science and scientific experts at EPA, especially those working on climate change related issues.

DOI doesn’t have a good track record of supporting science lately, having halted two important studies by the National Academies of Sciences and completely scrapped climate change work from its new strategic plan. It remains to be seen whether the new grant review process will result in scientifically defensible language being deleted from grant descriptions at DOI, or whether the agency will rescind important scientific work as was done at EPA. However, the scientific community will be watching for such attacks on science, and we’ll fight back against them if this administration continues this appalling tactic.

Cap-and-Invest: A Key Tool to Help Oregon Fight Climate Change

UCS Blog - The Equation (text only) -

With the Trump administration undermining federal action to address climate change, states like Oregon are stepping up to protect the planet for future generations.

For example, after President Trump announced that he will withdraw the U.S. from the Paris Climate Agreement, Oregon joined the U.S. Climate Alliance, a bipartisan coalition of states committed to the goal of reducing global warming pollution consistent with the Agreement. In joining the Alliance last year, Governor Kate Brown said, “it is our moral obligation to fulfill the goals of the Paris Agreement. Oregon will continue to make meaningful strides, with the rest of the world, to ensure our communities and economies adapt to meet the challenge of climate change.”

Fortunately, Oregon lawmakers and Governor Brown have the opportunity this year to take a huge step in Oregon’s fight against climate change. Last year legislators debated the Clean Energy Jobs bill, which would establish a “cap-and-invest” program that cuts global warming emissions by requiring polluters to pay for pollution, and then use the proceeds to invest in clean energy solutions. A cap-and-invest program would be a major new tool in Oregon’s climate-change-fighting toolbox, and the Clean Energy Jobs bill is now poised to be the single-biggest issue before Oregon lawmakers in 2018.

Pricing Pollution and Funding Solutions  

Oregon is already experiencing the impacts of climate change, including higher temperatures, a warmer and more acidic ocean, increased wildfire activity, and reduced snowpack. Putting a price on global warming emissions through a cap-and-invest program helps integrate the risks of climate change into the cost of doing business: it forces the costs of climate impacts and the value of low-carbon technologies to be better reflected in the decisions companies make about what to produce and how to produce it, and the decisions consumers make about what to buy. This leads to fewer emissions that heat our atmosphere.

In addition, because major polluters are required to pay for their pollution, a cap-and-invest policy also generates significant revenue. These proceeds can be used to fund investments that help reduce climate change emissions, like making renewable energy more affordable, improving the energy efficiency of homes and other buildings, increasing transportation options, and expanding clean energy job training programs.

The Clean Energy Jobs bill is modeled on similar programs in other states and Canadian provinces, which have seen impressive success in making clean energy investments. For example, through the end of 2017, California’s legislature appropriated more than $4.7 billion in proceeds from the sale of cap-and-invest pollution permits for investments in a range of useful programs, from supporting rooftop solar panels to water efficiency projects. The Regional Greenhouse Gas Initiative, a cap-and-invest program to reduce carbon dioxide emissions from power plants in nine Northeastern and Mid-Atlantic states, has invested heavily in energy efficiency programs, helping 141,000 households and 5,700 businesses with investments in 2015 that will return $1.3 billion in energy bill savings. Oregon has the chance to build on these successful examples to create a program that reduces pollution and meets the unique needs of Oregonians, including its rural residents.

A Cap-and-Invest Program Will Complement Other Oregon Policies

While Oregon has already taken considerable steps to rein in global warming pollution from electricity production and cars and trucks, the state is not yet on track to meet its pollution reduction goals. Passing a cap-and-invest program is the single-biggest step state lawmakers can take to get Oregon on track.

Fortunately, a cap-and-invest program would nicely complement existing policies in Oregon to reduce global warming pollution, such as the Clean Fuels Program (CFP) and Renewable Portfolio Standard (RPS). The CFP requires a 10 percent reduction in the carbon intensity of transportation fuels sold in Oregon by 2025 (compared to a 2015 baseline). The program creates a dependable market for cleaner fuels, which facilitates steady investment into research, development, and deployment of low-carbon fuels that are necessary to decarbonize the transportation sector in coming decades.

Meanwhile, the RPS requires that by 2050 half of electricity sold in Oregon is supplied by renewable sources, like wind and solar. Similar to the CFP, this policy creates a dependable market for renewable technologies, which is critical for facilitating investment in clean energy solutions necessary to decarbonize the electricity sector.

A cap-and-invest program would complement these policies by providing greater assurance that Oregon will meet its targets for reducing global warming emissions statewide. That is because the central design feature of a cap-and-invest program is a limited pool of pollution permits (i.e., the “cap”), which shrinks each year to ensure that emissions are staying in line with emission reduction targets. In addition, the revenue from the cap-and-invest program is important for helping overcome market barriers for clean technologies that performance standards—such as the CFP and RPS, but also others like energy efficiency standards—cannot solve on their own.
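A minimal sketch of those cap mechanics, with invented figures rather than numbers from the Clean Energy Jobs bill: the pool of permits shrinks by a set percentage each year, and the proceeds from auctioning the permits are what a cap-and-invest program reinvests.

```python
# Minimal cap-and-invest sketch: the permit pool shrinks each year, and
# auctioning the permits raises revenue for clean energy investments.
# All figures are illustrative assumptions, not provisions of the Oregon bill.

cap = 50_000_000        # metric tons CO2e covered in the first year
annual_decline = 0.03   # cap shrinks 3% per year
permit_price = 15.00    # assumed auction price, dollars per ton

for year in range(1, 6):
    revenue = cap * permit_price
    print(f"Year {year}: cap {cap:,.0f} tons, auction revenue ${revenue:,.0f}")
    cap *= 1 - annual_decline
```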

Finally, the programs complement each other because compliance with CFP or RPS eases compliance with the cap-and-invest program. This reduces the price of pollution permits, reducing compliance costs for all sources covered by the cap-and-invest program.

Clean Energy Jobs Bill is “Fully Baked”

The Oregon legislature has considered various forms of legislation like the Clean Energy Jobs bill for more than a decade, with a sustained push to develop a cap-and-invest policy since 2016. This past fall Senator Dembrow and Representative Helm chaired a work group process to make further refinements to the program’s design. While the Union of Concerned Scientists may not end up agreeing with every detail in the House and Senate bills that will be introduced in February, it is abundantly clear that legislators’ sustained engagement on this topic, along with extensive stakeholder input, has produced a thorough and well-vetted program design. In the parlance of the legislature, the policy proposal is “fully baked” and ready to be passed into law.

The daunting consequences of a changing climate require a swift response from governments around the world. In 2018 Oregon lawmakers have the chance to accomplish something big to maintain the Beaver State’s commitment to seriously addressing this crisis. Let’s hope—and work—to see a cap-and-invest program passed this year.

Department of Energy Coal Bailout Rejected: Told to Get the Facts First

UCS Blog - The Equation (text only) -


The Trump Administration got a punch in the nose trying to overthrow energy markets and deprive consumers of the savings created by lower cost, competitive energy supplies. All five Federal Energy Regulatory Commissioners, four of them appointed by Donald Trump, rejected an open-ended bailout proposed by Energy Secretary Rick Perry.

Not the end, a beginning

Hurricane Irma restoration in Fort Lauderdale, FL, on Sept. 11, 2017. Credit: Florida Power & Light

The unanimous Commission decision starts a fresh look at what can threaten the power grid, and at what is already being done, this time within the legal system that applies to energy markets. Going forward, FERC will look for a common understanding of resilience, and will put its questions to the organizations that operate the power pools through independent wholesale markets, not to monopoly utilities.

The Commission poses more than 25 questions to the independent grid operators that serve two thirds of the US population (New England; New York; the PJM region stretching from northern Illinois across the Ohio Valley to the Mid-Atlantic states; as well as the California ISO, the Midcontinent ISO, and the Southwest Power Pool, which have operated with very high levels of renewable energy). Not included in this inquiry are the power pools in the Pacific Northwest, Texas, or the mountain states and southeast region of the US, areas where grid operators have not formed an independent system operator subject to Commission market rules.

Time for questions and answers

The Commission poses questions about existing practices, and market-based services that already support resilience. There are questions about how the grid operators assess the resilience of the grids they operate. The Commission asks if there have been studies about resilience in the face of threats, what sorts of threats have been studied, and how these kinds of events are selected. The grid operators are asked to explain their criteria for events, the basic methods for understanding threatening events, and useful preparations or practices for withstanding such events.

This assignment for the grid operators continues, asking what attributes of the power system contribute to resilience, and what design standards are in place, as well as how different generation types perform in drought and other extreme weather, physical attack, or cyber threat. Then the public and UCS can comment on the grid operators’ answers.

What will we say?

First, FERC is right to start with the power pool operators, as these were set up to make the electricity supply better, cheaper, and faster. Grid reliability can be improved, and coal plants can be retired. Let’s gather up-to-date information. Unlike the Trump Administration’s proposal to rush through a multi-billion dollar cronyism scheme, this is something worth discussing. We depend on electricity, the grid has weaknesses and signs of aging, and the threats to reliability are real. So yes, let’s consider how we can improve what we have built. Better to look for needs first, and then ask what is available to fill those needs.

Wind and Solar Offer Resilience

Image: Mike Jacobs

Grid support from offshore wind is real in New England. We will urge the grid operators to look West, and to the Plains, where higher levels of wind and solar have provided new ways to make the grid resilient.

Using renewable energy brings the opportunity to forecast the impacts of weather, in more complex and subtle ways. The California ISO was first to implement wind forecasting, beginning in 2004.  The regional grid operators of Texas, New York ISO, and the Midcontinent ISO implemented wind forecasting in 2008 and PJM did so in 2009.

Look now at the Southwest Power Pool (serving the western Plains from North Dakota to northern Texas), where wind has reached 50% of the energy supply in some hours. There, grid operators have learned to adjust the voltage controls when wind is forecast to be a major supply on the wires. In Colorado, Xcel has learned that there are some predictable aspects to the variability of wind, and that operations can be improved by using some simple logic to reduce costs. The California ISO recently learned it can use solar farms for reliability services. The technology is already in place to meet the need, but the contracts need to be written to gain access to that flexibility.

Water can be a vulnerability

We will urge the Commission to look to the state regulators’ association (NARUC), which resolved to support water-smart energy choices. UCS has emphasized the impacts of drought on conventional power plants. Knowing that fossil and nuclear plants depend on cooling water to run in hot weather, UCS quantified the water withdrawals needed by older generators and identified technologies that need little or no water. Renewable energy and energy efficiency came out at the top of that analysis.

Storms can strike suddenly. Credit: Shane Lear

Wires are the backbone of the Grid

Transmission lines are key to the reliability of the grid. Most large-area power outages are traced to transmission failures, while local outages are due to problems with wires at the street level. Storms can cause widespread outages, but so can the frailty of transmission. Transmission is burdened by the expense of building in redundancy for reliability. Look to the role of energy storage to free up more of what is already built, part of a “smart” or modern grid that can diagnose and respond to disturbances.

To be continued

FERC Commissioner Rich Glick wrote in a concurring opinion that more work is needed to clarify resilience issues and the role of new supply technologies:

“The Department’s own staff Grid Study concluded that changes in the generation mix, including the retirement of coal and nuclear generators, have not diminished the grid’s reliability or otherwise posed a significant and immediate threat to the resilience of the electric grid. To the contrary, the addition of a diverse array of generation resources, including natural gas, solar, wind, and geothermal, as well as maturing technologies, such as energy storage, distributed generation, and demand response, have in many respects contributed to the resilience of the bulk power system.”



One Year of Attacking Science: How the Trump Administration Measures Up

UCS Blog - The Equation (text only) -

Having collected evidence on the multiple ways the Trump administration has attacked science over the past year, we can see clearly that this administration doesn’t just want to hide or ignore the facts. It is attempting to decimate the scientific process.

At the six-month mark of this administration holding office, we documented 44 attacks on science in our report, Sidelining Science Since Day One—that number has now jumped to 64. The implications are frightening.

An open letter to the Trump administration

Prior to the Trump administration taking office, more than 5,500 scientists signed an open letter asking then president-elect Trump and the 115th Congress to ensure that science continues to play a strong role in protecting public health and well-being. There are now more than 6,000 signatories on the letter.

These scientists called on the Trump administration and Congress to take several actions to strengthen the role that science plays in policy making. So how has the Trump administration measured up against the demands of thousands of scientists?

Appointing unqualified agency heads

The scientific community called upon the Trump administration and Congress to ensure that science-based agencies be led by officials with a strong track record of respecting science as a critical component of decision making. Yet the Trump administration has often chosen leaders for science-based agencies who are unqualified, conflicted, and/or openly hostile to the mission of their agency.

For example, Kathleen Hartnett-White was just re-nominated to lead the White House’s Council on Environmental Quality (CEQ), the office in charge of overseeing the National Environmental Policy Act (NEPA). Hartnett-White’s anti-science views on climate change, air pollution and health, clean and renewable energy, and the role of science in public policy suggest she would do little but harm our environment and public health. She is unfit for the position.

Other government leaders now in charge of overseeing the regulation of industry have come directly from those industries – a clear conflict of interest. Dr. Brenda Fitzgerald, the director of the Centers for Disease Control and Prevention (CDC), has financial interests that cannot be separated from the agency’s work on cancer detection and opioid addiction treatment. As a result, the director of the CDC has had to recuse herself from discussions and decisions regarding this work, even though these are important issues the US needs to address. Without leadership on these issues, will the CDC be able to properly address these health challenges?

Some leaders also have been openly hostile to scientists. For example, Administrator Scott Pruitt of the Environmental Protection Agency (EPA) was recently entangled in a decision to hire Definers Public Affairs to handle the agency’s press coverage. However, there is evidence that Definers was also involved in targeting EPA staff who have expressed personal views not in line with those of the Trump administration by submitting Freedom of Information Act (FOIA) requests for their emails. This sends a chilling signal to staff to not speak out against any wrongdoing within the EPA.

Dismantling science-based policy-making

Scientists also called upon the Trump administration and Congress to ensure that our nation’s bedrock public health and environmental laws—such as the Clean Air Act and the Endangered Species Act—retain a strong scientific foundation, and that agencies are able to freely collect and draw upon scientific data to effectively carry out statutory responsibilities established by these laws. They should also safeguard the independence of those outside the government who provide scientific advice.

Yet, we’ve not seen the Trump administration or Congress protect the scientific backbone of these bedrock laws. The 115th Congress introduced 63 separate pieces of legislation to undermine the Endangered Species Act. Additionally, we have seen Congress attempt to include harmful anti-science policy riders in negotiations around the federal budget, all aimed at gutting the Clean Air Act, the Endangered Species Act, and more.

President Trump also began the process of withdrawing the United States from the Paris Agreement, which will have lasting effects on the planet’s atmosphere and air quality. Furthermore, this administration has made it more difficult for the public to access information and data that science-based agencies provide.

There are multiple other examples of how the Trump administration and the 115th Congress have undermined, and continue to undermine, science…

Targeting government scientists

The scientific community called upon the Trump administration and Congress to allow federal agency scientists the freedom and responsibility to:

  • conduct their work without political or private-sector interference
  • candidly communicate their findings to Congress, the public, and their scientific peers
  • publish their work and participate meaningfully in the scientific community
  • disclose misrepresentation, censorship, and other abuses of science
  • ensure that scientific and technical information coming from the government is accurate

Since the Trump administration has taken office, we have seen federal scientists attacked. Federal scientists have been censored. They have been reassigned to undertake tasks not affiliated with their expertise. They have been prevented from attending conferences.

At the EPA, it is possible that some scientists may have been targeted for their personal views of the Trump administration. Additionally, the Trump administration is using various strategies to hollow out agencies by diminishing their scientific workforce.

Perhaps the most devastating impact of all, however, is that these actions create a hostile work environment for agency scientists that stokes fear, results in self-censorship, lowers staff morale, and sends a chilling message to scientists across the country that their work is not valued.

Slashing science funding

Lastly, the scientific community asked President Trump and Congress to provide adequate resources to enable scientists to conduct research in the public interest and effectively and transparently carry out their agencies’ missions.

Instead the Trump administration has proposed to cut a number of key research programs and slash substantial amounts of funding from science-based agencies. The administration has even withheld funding from research programs illegally. Congress also has attacked the funding of graduate students.

But when it comes to science funding, the true measure of the administration and Congress will come when a final decision on FY18 funding is made over the coming weeks, and a decision on FY19 funding is made over the coming months.

Scientists are keeping their word

The letter that scientists penned to President-elect Trump and the 115th Congress was an assurance that the scientific community would hold them accountable for the misuse of science in policy-making decisions.

Scientists have kept their word and are not taking these attacks on the scientific process lightly. They have successfully pushed back against and defeated anti-science nominees like Sam Clovis and Michael Dourson. Graduate students mobilized like we’ve never seen before to defeat a provision in the new tax reform law that would have prevented poor and middle-class Americans from pursuing scientific careers. And scientists are taking to the streets in unprecedented numbers to march and let this administration know that they’re not scared to use the powers bestowed upon them in this country to fight back.

Science has saved many lives and provided our society with extraordinary benefits. We cannot afford to let this administration undermine it.

Clean Energy in 2018: Here’s What to Expect

UCS Blog - The Equation (text only) -

While the year 2017 is one I don’t mind seeing in the rear view mirror (and I’ve got colleagues that would agree), in the field of clean energy we made a whole lot o’ progress. A new year, if I’ve done my math right, means 12 more months to move the ball forward on clean energy. Here are a few things I’ll be keeping my eyes on as we traverse the length of 2018.

Solar – Sour or soar?

For solar power, February should be when we see the official tally for 2017 progress—how much, where, what price, what sizes. We already know, though, that last year was down from the record-breaking 2016.

One factor is that very record-breaking-ness: Developers and customers pushed to get lots of solar installed before an important investment tax credit was scheduled to expire. The tax credit got extended, but momentum carried many solar projects to completion in 2016. Not a bad thing, certainly, but it left 2017 as a bit of a cooling-off period.

Another really important factor: Right now, US solar companies don’t know what kind of environment they’re going to be operating in, which is something I’ll be watching this very month. We’ve got a president who seems gung-ho about tariffs (in this case, on solar cell/module imports) even when they don’t make sense, and a decision due from him on January 26 about whether to slap tariffs on.

In the meantime, we’re seeing, in the words of Solar Energy Industries Association CEO Abigail Ross Hopper, “what happens when policy uncertainty becomes a disruptive factor…” What we get: Shaky investors, as they try to read the tea leaves from an unstable teacup, and higher prices (temporary though they may be), as solar companies scramble to bring in what they can before any tariff hammer falls.

What that adds up to is a lower 2017 total, and a 2018 with potentially even fewer new solar installations, even if wisdom prevails on the tariff issue.

Still, the solar totals are, and could remain, impressive. The third quarter of 2017 saw more solar come online than in all of 2011, and the 2017 totals will show that it was the second-best year ever. And 2018 could bring more new solar than any year other than 2016 and 2017, which would still be enough new solar to power more than a million and a half typical US homes.

And then there’s the broader, and longer-term picture: Solar costs have kept impressively dropping as scale has ramped up, and solar has gotten harder and harder to ignore (even for smaller projects). Global totals for new solar are projected to have ended up much higher in 2017 than in 2016, and to be in that same high range in 2018.

Solar’s (mostly) steady climb (Source: GTM-SEIA)

Wind – Sigh or fly?

Wind power is another exciting technology to watch in 2018, including because the viewing can take you from coast to coast, and from the Great Plains to the waters off our shores. And, like solar, this technology has both headwinds and tailwinds.

  • For wind (and solar), the big holiday almost-spoiler was the federal tax bill. The House and Senate versions each had provisions that would have undercut incentives, even for projects already up and running with those incentives in their calculations. That particular crisis was averted—a testament to the strong bipartisan support that wind has earned (and the volume of opposition to wrong-headedness).
  • Another big factor for land-based wind is the planned phase-out of one of those key incentives, the production tax credit (PTC). Wind projects that began construction before last year can get 100% of the tax credit. In 2017, 2018, and 2019, though, that credit is falling to 80%, 60%, and 40%, before going away entirely for projects started in 2020 and beyond. That gives a strong incentive to get projects built now.
  • Wind power also has going for it the incredible cost reductions it has achieved in recent years: In some parts of the country, it’s the lowest-cost thing out there. A utility in Colorado, for example, had projected last year that it could save its customers hundreds of millions of dollars over the next few decades by replacing some of its coal plants with renewable energy. When they asked for bids, though, the wind projects came in at even lower prices than they had assumed (more info here; subscription required).

Lots of tailwinds, and the numbers are bearing that out: The amount of wind capacity under construction as of the third quarter of 2017 was about as strong as ever (and all over the map, in a good way). And the fourth quarters are always the strongest for actually turning projects on. So I’ll be looking for those results in the next few weeks.

Wind power on the move (Source: AWEA)

Wind in the water – Time to buy

And then there’s offshore wind. Since the launch in late 2016 of the first offshore wind project in the country (and the hemisphere), off Rhode Island, this has been a technology on the move. And 2018 looks to bring more progress:

  • Utilities in Massachusetts will be deciding on the offshore wind proposals submitted last month by several high-powered collaborations. The timeline calls for the winners to be selected later this year, meaning 200 to 800 megawatts of offshore wind could be on the way, the first phase of the state’s planned 1,600 megawatts by 2030.
  • New York Governor Andrew Cuomo just announced the next step toward realizing the state’s 2030 goal of 2,400 megawatts of offshore wind: a call for the state to solicit bids for at least 800 megawatts of offshore wind power over this year and next, enough to power some 400,000 Empire State households.
  • With the election last November, New Jersey now has a governor (Phil Murphy) with an even bolder 2030 goal than New York’s or Massachusetts’s: 3,500 megawatts. The Garden State certainly has the coastline, and the wind resource, to support ambitious plans.
  • Meanwhile, Maryland has two projects that were awarded state support last year and that add up to more than 360 megawatts, Connecticut is exploring a requirement that might result in a couple hundred megawatts of offshore wind, and several other East Coast states also have offshore wind areas already leased out. Don’t lose sight of the Great Lakes or the West Coast, either.

For a happy 2018

There’s plenty more to watch beyond wind and solar, of course. Other renewable energy technologies stand ready to contribute. Energy storage is growing (and getting cheaper) more quickly than just about anybody imagined. Energy efficiency keeps making things better, smarter, cheaper. And we could talk about electric vehicle progress for hours (or you could just look here).

States are figuring out, too, how to modernize electricity grids and policies—and upgrade transmission lines—to help us get off large old centralized power plants and take advantage of renewable energy in all its forms. All worth monitoring, and helping move along.

And we need to keep an eye on fossil fuels, too, to make sure bad decisions don’t get made to try to prop up failing coal plants via federal policy (even though the latest such proposal just failed), or at the state level. To make sure our EPA works for us in cutting pollution from power plants, not for polluters.

And we need to make sure we don’t hamstring our economies with an overcommitment to natural gas.

But for energy progress in 2018—for our economy, for jobs, for our environment, for our kids—clean energy is what I’ll be watching.


Trump Administration Rescinds Fracking Rule for Public Lands: A Blow to Public Protection

UCS Blog - The Equation (text only) -

Photo: Tim Evanson/Wikimedia Commons

At the end of last month, we saw yet another casualty in the Trump administration’s war on science-based policy. The administration announced it would rescind a 2015 rule at the Bureau of Land Management (BLM) to address risks associated with unconventional oil and gas development on public land. Though not unexpected, this move was disheartening. The 2015 BLM fracking rule was an important step in protecting people from many unchecked risks associated with hydraulic fracturing. To understand what we lost, let me quickly review the national setting back in 2015 when the rule was issued.

The BLM fracking rule, rescinded by the Trump Administration, was the only major federal rule governing risks associated with unconventional oil and gas development.

A Wild West of oil and gas drilling

Around 2008, the oil and gas industry refined methods of hydraulic fracturing combined with horizontal drilling that made the process of shale oil and gas drilling more economically viable. This led to a rapid expansion of oil and gas development across the country. Many of the new places seeing oil and gas development had never experienced drilling in close proximity before, and now it was happening around them—in some cases, literally in their backyards. The nation was caught off guard. Notably, the scientific community had to play catch-up to assess risks to air and water and from occupational chemical exposure. Reports began to surface from communities complaining of drinking water contamination or poor air quality causing health concerns. But scientific studies on these subjects were limited.

One major barrier to understanding the risks associated with the new boom in fracking was a lack of transparency from industry. Researchers studying air quality around facilities were only able to take measurements from facility fencelines without closer access to emission sources on site. Researchers studying water quality were forced to analyze samples without any baseline measurements of what water quality was like before drilling started. And physicians were forced to make medical decisions about patients without knowledge of the chemicals they were exposed to because companies refused to disclose chemicals used in fracking fluid.

As a result, the public was left in the dark. The federal government failed to take a large role in regulating this Wild West of oil and gas drilling, leaving regulation largely up to states—many of which were ill-equipped to manage a new and complex industry now ever-present within their borders. In some ways the federal government was limited in what it could do. The infamous “Halliburton Loophole” exempts hydraulic fracturing from key provisions of the Safe Drinking Water Act, preventing the EPA from using that law to protect people from fracking-related risks to drinking water.

The BLM fracking rule: the federal government steps up

But one way the federal government could take action (or at least so they thought) was by creating a rule dictating how unconventional oil and gas drilling could happen on public lands. Thus, the BLM fracking rule was developed. While modest in scope, the BLM rule was an important step toward greater regulation and disclosure of an industry unaccustomed to playing under such constraints. The rule set standards for well construction, wastewater management, and chemical disclosure. The greater chemical disclosure requirement was huge. It meant companies couldn’t continue to hide behind trade secret claims and the public, scientists, landowners, medical personnel, workers, and first responders would now have greater access to information about chemicals involved and risks to health.

The rule’s requirements about well construction and wastewater management could also have gone a long way toward minimizing risks of groundwater contamination—perhaps the issue of greatest concern for communities based on several high-profile cases of large-scale drinking water contamination. This is an area where we also saw a high degree of industry misconduct. There were cases of dumping wastewater and many accidental spills and leaks. In other cases, water was contaminated by faulty well-casings or natural and manmade fissures in bedrock that allowed fracking chemicals or naturally occurring hazardous chemicals to reach water sources. All the while, industry often hid behind their narrow definition of fracking, despite a public understanding and concern about water contamination throughout the process—from initial drilling, to oil and gas production, to transportation from the site. Setting standards, even if only for public lands, could have encouraged companies to adopt such protocols industry-wide, guarding against future water contamination.

The rule also restricted where drilling could happen. It required companies to avoid endangered species habitats. This was important for species like the sage grouse which live in prime areas for development and need quiet spaces for mating. (Side note: This is actually one of three hits on the sage grouse by the Trump Administration. Earlier this year it reopened state management plans that were designed to ensure the species could stay off of the endangered species list. And last week, the administration rescinded an order for the BLM to prioritize oil and gas permit granting outside of sensitive sage grouse territory.)

Back to square one: leaving us in the dark

Though the rule was stayed by a court in 2016 and thus wasn’t in effect when the Trump Administration made this move, it did send a signal to industry that companies would soon have to accept a more transparent and safety-focused business model. Without such a rule on the books, we are now back to square one when it comes to federal protection of people against the risks of fracking. And that leaves us all in the dark.

New NOAA Report Shows 2017 Was the Costliest Year on Record for US Disasters

UCS Blog - The Equation (text only) -

Today NOAA’s National Centers for Environmental Information released its yearly report on “billion-dollar weather and climate disasters” that affected the US in 2017. Not surprisingly, the numbers were staggering.

The costs of 2017 disasters, at $306.2 billion cumulatively, set a new US record that blew past previous totals. These events also resulted in 362 fatalities. (This number reflects only official tallies to date—news reports indicate the true death toll from Hurricane Maria in Puerto Rico could be as high as 1,052, over 16 times the current official figure.)

A disaster-filled year

2017 was truly an historic and unprecedented year of disastrous extremes—from hurricanes and wildfires to drought and flooding. Hurricanes Harvey, Maria, and Irma propelled 2017 to become the costliest hurricane season on record at $265 billion, and California’s terrible wildfires made it the costliest wildfire year as well, at $18 billion.

There were 16 disasters in the US in 2017 whose costs each topped a billion dollars, a record number of events for a single year, tying the mark set in 2011. These events affected every region of the country either directly or indirectly (see map).

2017 global insured losses also record-breaking

NOAA’s report today echoes reports from the insurance industry, which has also cited 2017 as a record-breaking year for costly disasters globally.

Last week, Munich Re, one of the world’s leading reinsurers, stated that the costs to the insurance industry from Harvey, Irma, and Maria and other 2017 disasters are expected to reach $135 billion globally, the highest ever, with the US share dominating at 50 percent of these costs. Overall economic losses, including uninsured losses, will amount to $330 billion, the second highest ever for a single year. They also tagged 2017 as the most expensive hurricane season ever.

Last month, Swiss Re released a similar report stating that global insured losses for 2017 were $136 billion, and total economic losses are estimated to be $306 billion.

Globally, the human impact of these disasters has been particularly tragic with the total toll from 2017 disasters estimated at more than 11,000 people dead or missing. An estimated 2,700 people lost their lives in the flooding in South Asia alone.

Climate change is contributing to rising disaster costs

Disaster costs are rising both because of climate related factors and because growing development is putting more people and property (often very expensive property) in harm’s way. The NOAA report does not specifically parse out the relative contribution of these factors to growing costs.

Yet, we know that climate change is exacerbating the risks of many types of disasters, including heat waves, droughts, wildfires, and flooding worsened by heavy rainfall events and sea level rise. As my colleague Brenda Ekwurzel pointed out, this year’s abnormal and catastrophic hurricane season bore the fingerprints of climate change.

NOAA also released data today on the US climate in 2017. It shows that the continental US had its third warmest year on record and, for the third consecutive year, every state across the contiguous US and Alaska was warmer than average. Five states had their warmest year on record. Precipitation was also above average for the year.

Two recent attribution studies done in the wake of Hurricane Harvey show that climate change worsened the extreme rainfall associated with that storm.

A recent report published as a special supplement to the Bulletin of the American Meteorological Society (BAMS), Explaining Extreme Events in 2016 from a Climate Perspective, highlights several extremes in 2016 that were exacerbated by climate change. They include the record global heat, extreme heat over Asia, and unusually warm waters in the Bering Sea. The report contains 27 peer-reviewed analyses of extreme weather across five continents and two oceans during 2016, based on the research of 116 scientists from 18 countries.

As the science of attribution advances, we can expect more and more research of this type to explain whether and how human-made global warming influenced specific extreme events.

We’ve got to do more to limit the toll from disasters

There’s a lot we can and must do to limit the economic costs and human toll from disasters. First and foremost, we must do more to prepare and protect communities ahead of time by investing in risk reduction and disaster preparedness and by ensuring that our federal, state and local policies are guided by the best available science.

Earlier this year in the midst of the hurricane season, my colleagues and I offered some thoughts on what we were seeing as key priorities.

Federal agencies tasked with disaster preparedness and response need to be appropriately staffed and receive adequate budgets to do their jobs well. As just one example, the money we invest in satellites and other early warning systems is vital to limiting the human and economic toll of disasters. Accurate flood risk maps and storm surge maps are key to helping communities and emergency responders understand risks and take appropriate action.

We also need robust policies to support building back stronger after disasters, including a strong federal flood risk management standard, a robust flood insurance program, and funding for investments in flood mitigation.

Our nation’s recovery from 2017’s disasters will take a long time and we’ve got to pay attention to the needs of communities long after the hurricanes and wildfires have dropped out of the headlines. The people who are hit hardest and have the most difficulty recovering from disasters are often in low-income, minority, and disadvantaged communities. We’ve got to pay specific attention to their needs. The people of Puerto Rico, for example, many of whom still don’t have power, are suffering from the long-term effects of living in siege-like conditions.

Finally, limiting the carbon emissions that are fueling climate change is absolutely critical to containing the risks and costs of runaway climate change.

Climate change is a threat to human health and to the economy

Today’s NOAA report and recent reports from the insurance industry reinforce the fact that climate change is a threat to our health, and also a threat to our economy. The American people—and people around the world—are already experiencing these harmful impacts.

Forward-thinking businesses understand the economic implications too. In a press release accompanying the Munich Re report, Torsten Jeworrek, Munich Re Board member responsible for global reinsurance business, was quoted saying: “For me, a key point is that some of the catastrophic events, such as the series of three extremely damaging hurricanes, or the very severe flooding in South Asia after extraordinarily heavy monsoon rains, are giving us a foretaste of what is to come.”

Yet we have a Congress and an administration that continue to deny scientific realities and are failing to take the necessary and urgent steps needed to deal with the threat of climate change. What a shameful abdication of responsibility in the face of one of the most urgent issues we face.

An Ounce of Prevention…is Worth a Kiloton of Cure

UCS Blog - All Things Nuclear (text only) -

As part of its ongoing online training system, the Centers for Disease Control and Prevention (CDC) has scheduled a webinar later this month titled “Public Health Response to a Nuclear Detonation.”

The description of the webinar on the CDC website says: “While a nuclear detonation is unlikely, it would have devastating results and there would be limited time to take critical protection steps. Despite the fear surrounding such an event, planning and preparation can lessen deaths and illness.”

(Source: CTBTO)

On the one hand

This makes some sense. With global stockpiles of more than 15,000 nuclear weapons in the hands of nine countries around the world, thinking through the consequences of their use is the responsible thing for the CDC to do instead of pretending the world will make it through another few decades without someone detonating a nuclear weapon.

Nuclear use is a particular concern now given the flare-up of tension between North Korea and the United States and the bombastic threats by Kim Jong-un and President Trump (not to mention their recent boasts about their “nuclear buttons”).

Perhaps even more likely is a nuclear war by accident. The United States keeps hundreds of missile-based nuclear warheads on hair-trigger alert with the option of launching them very quickly if early warning sensors report a Russian attack. Russia is believed to do the same. But technical and human mistakes over the past decades have led to a surprising number of cases in which one or the other country thought an attack was underway and started the process to launch a nuclear retaliation. How long until one of those mistakes doesn’t get caught in time?

The use of nuclear weapons could have horrific results. Many US and Russian warheads have explosive yields 20 to 40 times larger than those of the warheads that destroyed Hiroshima and Nagasaki in 1945.

Because North Korean missiles are not very accurate, North Korea would need to aim its nuclear weapons at large targets, namely big cities. While the United States does not intentionally target cities, many of its warheads are aimed at military or industrial targets that are in or near major population centers. The same is true of Russian warheads aimed at targets in the United States.

In addition, a nuclear detonation could have world-wide consequences. Studies have shown that even a relatively limited nuclear exchange between India and Pakistan for example, could eject so much soot into the atmosphere that there would be significant global cooling for a decade. This “limited” nuclear winter could lead to widespread starvation and disease.

So, on the other hand…

A key message of the CDC briefing will hopefully be that the role public health professionals can play following a nuclear attack is relatively small, and the only real option is to prevent the use of nuclear weapons in the first place. This is a case where an ounce of prevention is worth much more than a kiloton of cure.

Given that reality, there are several steps the United States should take to reduce the risk of nuclear use, including:

  1. Pursuing diplomacy with North Korea, with the immediate goal of reducing tensions and the risk of military attacks, and a longer term goal of reducing Pyongyang’s nuclear arsenal. Secretary of State Rex Tillerson has made clear repeatedly that he would like to do this. President Trump should get out of his way and let him.
  2. Eliminating the option of launching nuclear weapons on warning of an attack and taking all missiles off hair-trigger alert.
  3. Changing US policy so that the only purpose of US nuclear weapons is to deter, and if necessary respond to, the use of nuclear weapons by other countries. Under this policy, the United States would pledge to not use nuclear weapons first.
  4. Scaling back the $1.2 trillion plan to rebuild the entire US nuclear arsenal over the next 30 years.
  5. Starting negotiations on deeper nuclear cuts with Russia and taking steps toward a verifiable agreement among nuclear-armed states to eliminate their nuclear arsenals.

At the CDC, as Elsewhere Throughout the Government, Words Have Consequences

UCS Blog - The Equation (text only) -

It does not matter who pulls the semantic shroud over the Centers for Disease Control and Prevention. When it comes to matters of science and health, any level of silence at the CDC is a declaration that saving lives is secondary to politics.

According to a recent Washington Post story, higher-ups banned from budget requests the words: “diversity,” “entitlement,” “fetus,” “transgender,” “vulnerable,” even “evidence-based,” and “science-based.” CDC Director Brenda Fitzgerald claimed that no words were banned, “period.” But at a minimum, multiple sources confirm that meetings were held on how to craft budget requests so as not to trigger opposition from conservatives in Congress.

Evidence-based declines

Whether ordered or voluntary, the evidence of such changes is clear: many of the above words have already been disappearing during President Trump’s first year, according to Science Magazine. Use of the words “diversity” and “vulnerable” is down a combined 68 percent compared to President Obama’s last budget. Use of the phrase “evidence-based” is down 70 percent.

That latter fact hardly seems to be a coincidence given a President with a documented casual relationship with the truth who, according to the Washington Post fact-checking department, has made 1,950 false or misleading claims in his first year in the White House.

The declines in the use of this terminology are consistent with other scientific erasures in the first year of Trump. An analysis this fall by National Public Radio found a major drop in the number of grants awarded by the National Science Foundation containing the phrase “climate change.” Only 302 NSF grants contained the phrase last year, compared with the annual average of 630 during the Obama administration—a 52 percent decline.

NPR quoted Katharine Hayhoe, director of the Climate Science Center at Texas Tech University, as saying, “In the scientific community, we’re very cautious people. We tend to be quite averse to notoriety and conflict, so I absolutely have seen self-censorship among my colleagues.”

Real-life consequences

The obvious question, of course, is whether shying away from diversity and vulnerable populations such as transgender people in budget requests, or shrinking from assuring that studies are evidence-based, will result in failures to monitor disparities and effectively protect Americans’ health.

For instance, take the issue of racial health disparities. It would be tragic if the Trump administration allows a reversal of the progress that has come through decades of dedication from the career scientists and medical and public health experts at the CDC and its parent Department of Health and Human Services who have trudged on regardless of which party controls the White House or Congress.

A good example is the case of black men and women between the ages of 45 and 54, one of the most historically vulnerable groups, who have long died from chronic diseases such as heart disease, stroke and cancer at far higher rates than the national average.

In 1980, according to CDC data, the death rate for black men and women in that age group was 1,480 per 100,000 people and 768 per 100,000 people, respectively. Both rates stood more than twice as high as those for their white male and white female counterparts (699 per 100,000 and 373 per 100,000, respectively).

Even during the Reagan years, in an administration frequently hostile to civil rights and friendly with apartheid South Africa, then-HHS Secretary Margaret Heckler saw fit to address these yawning racial gaps head on. In a landmark 1985 task force report on “Black and Minority Health,” she wrote that the disparities were an “affront to both our ideals and to the ongoing genius of American medicine.” She said it was time to “decipher the message inherent in that disparity.”

That report called for a dramatic increase in health studies to help devise effective, evidence-based interventions for specific racial groups. Unlike the murky controversy of the moment, the 1985 report’s language made it clear that “diversity” was a critical word. Under a section titled, “Implications of Diversity,” the report said: “This diversity among populations is reflected in language difficulties, in cultural practices and beliefs with respect to illness and health, in differences in their birth rates, in differences in the afflictions which kill them.”

Years of progress, but more work ahead

The efforts during the Reagan years set the stage for dramatic progress, even though there is plenty more work still to do.

The death rate for black men aged 45 to 54 dropped 15 percent in the 1980s during the Republican administrations of Reagan and George H.W. Bush. It dropped another 19 percent in the 1990s, and 19 percent again in the 2000s, mostly under the two terms of Democrat Bill Clinton and the two terms of Republican George W. Bush. Finally, under the two Democratic terms of Obama, the rate dropped yet another 18 percent.

The result is a current death rate for these black men of 678 per 100,000, less than half the 1980 rate. The death rate for black women in the same age group is down 42 percent from 1980.
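The arithmetic behind those figures is easy to check. Below is a minimal sketch, assuming only the numbers quoted in this post (not the underlying CDC tables), that compounds the decade-by-decade declines for black men in this age group and lands within a rounding error of the current rate of 678 per 100,000:

    # Rough check of the figures cited above (rates per 100,000 black men aged 45-54).
    # The inputs come from this post, not directly from CDC source data.
    rate_1980 = 1480
    decade_declines = [0.15, 0.19, 0.19, 0.18]   # 1980s, 1990s, 2000s, Obama years

    rate = rate_1980
    for decline in decade_declines:
        rate *= (1 - decline)                    # compound the decade-by-decade drops

    print(round(rate))            # ~677, consistent with the current 678 per 100,000
    print(rate < rate_1980 / 2)   # True: less than half the 1980 rate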

The dramatic progress, and the obvious work left to do, is precisely why the Trump administration must not turn its back on these kinds of evidence-based accomplishments—or the forthright use of language that helped achieve them.

Besides, semantic silence has been tried before and has failed.

In one telling episode during George W. Bush’s first term, for example, HHS tried to eliminate the words “inequality” and “disparities” from a national report on health disparities.

A strong early draft had said: “Inequalities in health care that affect some racial, ethnic, socioeconomic, and geographical subpopulations in the United States ultimately affect every American. From a societal perspective, we aspire to equality of opportunities for all our citizens. Persistent disparities in health care are inconsistent with our American values.”

That draft also said, “The personal cost of disparities can lead to significant morbidity, disability, and lost productivity.”

The final report in late 2003 erased the above, replacing it with this far more cheerful message: “The overall health of Americans has improved dramatically over the last century. Just in the last decade, the United States has seen significant reductions in infant mortality, record-high rates of childhood vaccinations, declines in substance abuse, lower death rates from coronary disease, and promising new treatments for cancer.”

The firestorm that erupted over the two versions forced then-HHS Secretary Tommy Thompson to publish the stronger draft online in early 2004. As if to second its importance, a National Academies report that year said “widespread, reliable, and consistent data” by race and ethnicity are “critical to documenting the nature of disparities in health care and developing strategies to eliminate disparities.”

This is no time to stop developing strategies. Remaining disparities are every bit as urgent as when Margaret Heckler’s task force report said that people of color “have not benefitted fully or equitably from the fruits of science or from those systems responsible for translating and using health sciences technology.”

We can see from examples such as these that words can have serious consequences for Americans’ health. After all, you cannot determine what needs to be done without the language to speak about it.

Data Integrity and Voting Rights: Will the Supreme Court Protect the Right to Not Vote?

UCS Blog - The Equation (text only) -

The first major voting rights case of the year comes before the Supreme Court Wednesday, when Justices hear arguments over the state of Ohio’s “supplemental process” for removing people from voter registration lists. The case is important procedurally and politically. While only a handful of states currently use a procedure as strict as Ohio’s, if the Court upholds it, the practice will likely be adopted by more states. Politically, the issue of voter list management has become a partisan one, with Republicans claiming that threats of voter fraud necessitate more stringent cleaning of voter rolls, while Democrats argue that such tactics serve to suppress voter participation. The heart of the matter concerns the scientific integrity of eligible voter data.

The legal issues with Ohio’s process

If a registered voter in Ohio does not vote in a two-year period, a local election board sends them a notice asking them to verify their eligibility; they are removed from the rolls if they fail to respond and do not vote in the next four years. Plaintiffs argue that such a voter purge would have disenfranchised over 7,000 Ohio voters in the November 2016 election, had it not been for the Sixth Circuit Court’s determination that the supplemental process “constitutes perhaps the plainest possible example of a process” that violates the National Voter Registration Act’s prohibition on removing registrants for failing to vote: the process assumes non-voters have moved, sends them verification notices, and removes them if they never respond.

The defendant, Ohio Secretary of State Jon Husted, has responded by arguing that the failure to respond, rather than the notification, is the trigger that “breaks the prohibited link between nonvoting and removal.” Further, defendants make a textual argument regarding a sub-section requirement of the National Voter Registration Act of 1993, that registrants who have not voted and do not respond to a notice “shall be removed” from the voter list, though that subsection also prohibits removal solely for failure to vote. Finally, defendants argue that prohibiting Ohio’s specific trigger would be an infringement on federalism and the right of states to determine their own triggers within the boundaries of federal law.

Looking for clarity on federal protections

The Supreme Court’s analysis will be consequential in shaping those federal boundaries, and by extension, the administrative policies and technologies that are adopted by states prior to the 2018 elections. Unfortunately, federal priorities have been confounded by the controversial reversal of the Department of Justice’s position on this case. Last August, federal attorneys under new Attorney General Jeff Sessions claimed that, as a result of the “change in Administrations,” they now support Ohio’s process, in contrast to Obama-era officials who sided with plaintiffs. This highly unusual about-face only muddies the legal waters as the Supreme Court prepares to wade in.

Ultimately, the Supreme Court needs to clarify the protections afforded by the federal government under the Elections Clause, which gives Congress authority to “make or alter” state election regulations, a provision that the Framers included specifically with the intent of keeping states from manipulating Congressional elections so as to subvert the power of the federal government. In this regard, defendants are correct to point out that this is a question of federalism.

Integrity and access aren’t conflicting goals

The National Voter Registration Act of 1993 and the Help America Vote Act of 2002 were both crafted by Congress with the explicit intent of increasing voter registration and participation in the electoral process. The state of Ohio claims that these regulations have “dueling purposes” of facilitating access to the ballot, but also ensuring the integrity of the process, which in their view necessitates regularly cleaning and purging voter lists.

While the value of one’s vote can be unconstitutionally diminished either by denying it to eligible voters, or by contaminating votes cast with ineligible votes, the Supreme Court needs to recognize that these purposes are not necessarily “dueling” or in conflict. To the extent that there are administrative policies and technologies that can improve both access and integrity, the Court should be suspicious of any regulation that sacrifices equal access to the vote when a less burdensome procedure is available that protects electoral integrity as well or better than the proposed policy.

In this context, there are several regulatory and technological innovations that are demonstrably superior to Ohio’s supplemental process. For one, it is possible to use existing statewide data in a manner that more effectively identifies residential changes and eligibility than Ohio’s notification process. Algorithms are being developed that couple multiple databases from state records and could more accurately identify what Ohio claims it is looking for, residential changes, without accidentally removing eligible voters from the lists.
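To make the idea concrete, here is a minimal, hypothetical sketch of that approach; the field names, match key, and records are invented for illustration and are not the actual algorithms or state databases referenced above. The point is simply that removal candidates are identified from affirmative evidence of a move, not from non-voting alone:

    # Hypothetical illustration: cross-check a registration list against an
    # address-change database and flag only registrants with affirmative
    # evidence of an out-of-state move. All field names and records are invented.

    def match_key(record):
        """Build a simple key from name and date of birth for record linkage."""
        return (record["last_name"].strip().lower(),
                record["first_name"].strip().lower(),
                record["dob"])

    def flag_possible_movers(voter_rolls, address_changes):
        """Return voter IDs whose newest address lies outside their state of registration."""
        moves = {match_key(r): r["new_state"] for r in address_changes}
        return [v["voter_id"]
                for v in voter_rolls
                if moves.get(match_key(v)) not in (None, v["state"])]

    rolls = [
        {"voter_id": 101, "last_name": "Doe", "first_name": "Jane",
         "dob": "1970-01-01", "state": "OH"},
        {"voter_id": 102, "last_name": "Roe", "first_name": "Rick",
         "dob": "1985-06-15", "state": "OH"},
    ]
    changes = [
        {"last_name": "Doe", "first_name": "Jane", "dob": "1970-01-01", "new_state": "PA"},
    ]

    print(flag_possible_movers(rolls, changes))   # [101]; the non-mover stays on the rolls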

Studies have also indicated that Automatic Voter Registration (AVR), that is, requiring eligible residents to opt out of voter lists rather than opt in, can both substantially increase political participation and provide a more accurate database for verifying voter eligibility. Employing such technologies not only provides a more secure and effective means of protection against voter fraud, to the extent that it exists; it also protects against external hacking and violations of integrity in a way that many current techniques do not. The Supreme Court needs to take questions of data management and integrity into account if it is to uphold its obligation to apply strict scrutiny to violations of our constitutionally protected right to a free, fair, and equal vote.

New Update of the UCS Satellite Database

UCS Blog - All Things Nuclear (text only) -

A newly updated version of the UCS satellite database was released this fall, including launches through the end of the summer. The total number of operating satellites is now 1,738.

The changes to this version of the database include: the addition of 321 satellites, the deletion of 35 satellites, and, as always, the addition of and corrections to some satellite data.

The number of active satellites has historically grown modestly over time, since the newly-launched satellites are balanced by those that are de-orbited or have become inactive. But this quarter, 321 new satellites were added to the database, a record by far. While sheer numbers are growing, it’s also important to keep in perspective that much of this growth is in small satellites. For example, nearly half of the new satellites were Planet Labs’ Doves, part of a constellation of small satellites designed to provide constant, timely imagery of the earth’s surface. Dozens of others were Spire’s Lemur small satellites, providing commercial weather and ship-tracking services. Also launched were 20 new Iridium NEXT low earth orbit communications satellites. These satellites, at 860 kg launch mass, weigh as much as 86 Lemur satellites or 215 Doves.

In any case, it seems clear that the growth in numbers of satellites won’t be as slow as it used to be, and may accelerate quite a bit in the future. In 2016, commercial companies filed for a U.S. Federal Communications Commission license for 8,731 non-geostationary communications satellites, including 4,425 for SpaceX, nearly 3,000 for Boeing, and 720 for OneWeb.

3 Questions Worth Answering in the Wake of Winter Storm Grayson

UCS Blog - The Equation (text only) -

Yesterday in Massachusetts we were asking ourselves questions that have rarely, if ever, needed asking.

What happens when half-frozen seawater suddenly floods onto roadways? Can something the consistency of a milkshake and 3 feet deep be plowed? There’s a large dumpster floating down the street… What depth of water is sufficient to do that? What happens if some of this water freezes in place before it retreats (as I write this, the temps have plummeted to 12 degrees F and dropping)? Will those cars now filled with seawater in the snow-emergency parking lot run again? What if the water freezes inside them over the weekend, can that punch out doors?

The stories are countless. In Salem, MA, my mother watched out her window as fire and rescue workers hauled someone to safety on a raft through at times waist-deep water.

A rescue underway on normally busy route 107 behind my parents’ house in Salem. (Note the “flood zone” sign and mentally add exclamation points.) Credit: Fred Biebesheimer.

My colleagues and I think about coastal flooding a lot, but the footage from yesterday had our brains buzzing with new unknowns and threats never considered.

I’ve been keeping an eye on social media, the news, and hearing from friends and family, and these three questions emerged for me as needing to be asked and answered.

Why was this flooding so much worse than forecast?

In the lead-up to yesterday’s storm, dubbed “Grayson” by the Weather Channel, the coastal flooding forecasts shifted from minor to moderate, from moderate to major. Coastal residents monitoring this would have been concerned, but not nearly enough. Even as the storm was getting underway, the flooding forecasts greatly understated what actually played out on the ground.

In the end, severe flooding struck multiple areas of the coast of Massachusetts from the North Shore to the Cape, with chest-high water in some locations, emergency boat rescues, and damage that we’re just beginning to take stock of. People were caught off guard, greatly increasing the risk to public safety and the damage to property.  For example, below is a shot of the Gloucester High School parking lot where residents are instructed to park their cars in a snow emergency. Ouch.

Look closely! These are cars submerged in record coastal flooding in #Gloucester Massachusetts! People were asked to get their cars off the streets and park at the local high school. Then the football field turned into a lake! @fox5dc

— Sue Palka FOX 5 DC (@suepalkafox5dc) January 5, 2018

Why didn’t we see this coming?

The reasons given by local meteorologists for the surprising severity are the astronomical high tide (Monday was a full moon) that coincided with the storm’s path, and strong onshore winds creating significant storm surge and damaging waves. The tide itself is no mystery, and our ability to forecast storm surge is pretty good. The wind speeds and snowfall totals were mostly as forecast. So where was the gap between our forecasting methods and tools and this storm’s true coastal flood potential? And how do we close it? I have asked the National Weather Service Boston Office, and will report back.

Was this flooding made worse by climate change?

Boston’s Mayor Marty Walsh declared “If anyone wants to question global warming, just see where the flood zones are. Some of those zones did not flood 30 years ago.” And he did so while the storm was still lashing the city. The days of tiptoeing around this question are clearly over.

So, what can and can’t be said here on sound scientific footing? As with any storm, a lot of factors were responsible for yesterday’s impacts: wind speed and direction, and the resulting storm surge and wave height. And there are two ways that climate change plays a role in the impact of storms like this.

On the one hand, it can influence a storm itself – causing it to form faster, become stronger, etc., so that when it strikes, it has greater potential for doing damage. Tracing this to climate change is harder to do, but the science is catching up. As climate scientist and colleague, Rachel Licker, pointed out this week “According to the American Meteorological Society’s new report, science is now not only able to detect a climate change signal in individual extreme events, science is now able to determine whether climate change caused by humans was essential in the development of an extreme event. In other words, science is now at the point where it is able to tell us whether certain extreme events would or would not have happened without climate change.”

We don’t know the answer the day after, however. Such research takes time. But I expect we’ll hear more about the detection of climate change fingerprints on this storm in the months to come. See my colleague Brenda Ekwurzel’s blog for more on this specific topic.

The other way climate change plays a role in the impact of storms is clear and can be discussed more definitively today: today’s storms have higher water levels to “work with” due to sea level rise. In Boston, water levels have risen ~5 inches just since the blizzard of ’78. (This upward trend is also responsible for the increased tidal flooding along Boston’s waterfront.) So ANY storm that hits our coasts today is working with water that is higher and closer to our cities, buildings, homes, and infrastructure, than when we first put them there.

It was interesting to note that the tide height associated with this storm topped the Blizzard of ’78 by hundredths of an inch. In its defense, the Blizzard of ’78 was working with an ocean that was 5 inches “shorter.” If that exact storm happened today, the flooding would be worse than it was in 1978 given this additional water. And importantly, the damages would likely be worse as well, given the additional people, property, and stuff we’ve put along our coasts since that time.
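As a rough illustration of that comparison, here is a minimal sketch using only the approximate figures in this post (about 5 inches of local sea level rise since 1978, and a storm tide that edged past the 1978 mark by a sliver); the 0.05-inch margin is an assumed stand-in for “hundredths of an inch”:

    # Rough comparison using the approximate figures cited in this post.
    sea_level_rise_since_1978 = 5.0   # inches of local sea level rise, Boston area
    margin_over_1978_record = 0.05    # inches; assumed stand-in for "hundredths of an inch"

    # If the exact 1978 storm recurred on today's higher ocean, its water level
    # would sit roughly 5 inches above its 1978 mark, i.e. well above Grayson's.
    repeat_of_1978_vs_grayson = sea_level_rise_since_1978 - margin_over_1978_record
    print(f"A repeat of the '78 storm would top Grayson's water level by ~{repeat_of_1978_vs_grayson:.1f} inches")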

But speaking of comparisons: we released an analysis in 2017 that identified areas along the entire US coast that would flood on a chronic basis, just with normal tidal fluctuations. By 2060, with a high rate of sea level rise, the general area of Boston that flooded yesterday would flood at least 26 times per year, irrespective of storms or rainfall. (Add storms and rainfall and the frequency rises.) That’s a little over 40 years from now, well within the lifetime of the buildings and infrastructure we’ve built and continue to build in these areas. With a more moderate rate of sea level rise, it would flood chronically a couple of decades later.

Boston’s storm-flooded area becomes its tidally flooded area later this century.

This sunny-day flooding—the kind seen today at places like Long Wharf during extreme high tides—wouldn’t have the destructive waves of yesterday’s storm. It would, however, put large areas under inches and potentially feet of seawater; it would be unaffected by the construction of a major harbor storm barrier; and it would preclude business as usual along some of the busiest and highest-value parts of Boston’s waterfront. Go to this link to view your own coastal community.

Importantly, our analysis also shows that a lower rate of sea level rise, associated with adherence to the Paris Climate Agreement, could greatly reduce this flooding.
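For readers curious about the mechanics behind projections like these, here is a minimal sketch of the underlying threshold-exceedance idea: add a sea level rise projection to predicted high tides and count how often a local flood threshold is topped. The flood threshold, tide statistics, and rise values below are hypothetical stand-ins, not the data or method of the UCS analysis.

import random

FLOOD_THRESHOLD_FT = 12.5   # hypothetical local flood stage, feet above the tidal datum
SLR_MODERATE_FT = 1.2       # hypothetical moderate sea level rise by 2060, feet
SLR_HIGH_FT = 2.0           # hypothetical high sea level rise by 2060, feet

random.seed(0)
# Stand-in for a year of predicted daily higher high tides (feet above the datum)
predicted_high_tides = [random.gauss(10.5, 0.8) for _ in range(365)]

def flood_days(tides, slr, threshold=FLOOD_THRESHOLD_FT):
    """Count days when the predicted high tide plus sea level rise tops the flood threshold."""
    return sum(1 for tide in tides if tide + slr > threshold)

print("Chronic flood days at today's sea level:", flood_days(predicted_high_tides, 0.0))
print("Chronic flood days with moderate rise:  ", flood_days(predicted_high_tides, SLR_MODERATE_FT))
print("Chronic flood days with high rise:      ", flood_days(predicted_high_tides, SLR_HIGH_FT))

Even in this toy version, a modest additional rise pushes routine high tides over the flood threshold far more often, which is why the projected frequency of chronic flooding jumps so sharply between scenarios.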

What are some responsible takeaways?

We’ll be taking stock of this storm for some time to come.

Boston, a city with a strong and growing commitment to coastal climate preparedness and resilience, an unsurpassed local expert community, and uniquely engaged business and philanthropic sectors, can emerge as an even stronger national leader in the wake of this storm.

Massachusetts, with its growing if patchy commitment to the same, can recognize its mounting exposure to coastal flooding and get much more serious on this front. Republican MA Governor Charlie Baker’s Executive Order on state government adaptation efforts shows that sensible, bipartisan action is possible. Passage of the bill to establish a comprehensive adaptation management action plan (CAMP) would codify this Order and represent a serious commitment toward tackling our climate risks.

The important takeaway for Boston, Gloucester, Scituate, Barnstable, Salem, and on down the line—as well as for places in other states that dodged this bullet, this time—is not simply how do we prepare for storms like this. It’s how do we prepare for a future—and, to a certain extent, a present—where storms have the potential to be more destructive, and where no storm is needed for transformative flooding to occur. In Massachusetts, we can do that, and the sooner we start, the less costly and disruptive it will be.

My kids’ favorite playground during the storm. Credit: Caroline Maloney.

As President Trump Speaks to the Farm Bureau, Both Are Betraying Farmers

UCS Blog - The Equation (text only) -

Photo courtesy raymondclarkeimages/flickr

On Monday, the US president will take time out from his regular schedule to talk about agriculture. He’s scheduled to deliver a speech in Nashville at the annual convention of the American Farm Bureau Federation (aka “Farm Bureau”). The organization’s president, Zippy Duvall, called this a proud moment, and I’m not surprised. The Farm Bureau and Mr. Trump have a lot in common: they claim to serve farmers but in reality, they’re not doing much to improve most farmers’ lives and prospects.

The Farm Bureau bills itself as the nation’s largest nonprofit farmers’ organization and “the unified national voice of agriculture.” It is a powerful force in Washington, DC, spending millions of dollars each year lobbying Congress, and setting its sights on policy issues that range well beyond agriculture. After a 2016 presidential campaign that highlighted the plight of farming communities, Duvall cheered rural America’s role in electing Donald Trump. And now, the Trump administration is eager to tout its accomplishments for farmers and attempt to cement the support of this critical constituency.

The farmers Trump forgot

The problem is, despite all his populist rhetoric, President Trump and his agriculture secretary, Sonny Perdue, have delivered far more for agribusiness—the deep-pocketed corporations that buy, process, and trade farm commodities—than for the average farmer and farm worker. And while Secretary Perdue (a big agribiz guy from way back) trumpeted his department’s 2017 accomplishments in a self-congratulatory year-end press release, my assessment of the Trump administration’s actual contributions to the well-being of most farmers and their communities so far is quite different.

In his first 100 days, the president proposed steep cuts to the US Department of Agriculture’s budget, which would impact technical assistance to farmers as well as funding to improve rural water systems, and, potentially, food assistance programs that serve low-income rural residents. His hardline immigration rhetoric and increased deportation actions have led to a farmworker shortage that has affected farms from California to Michigan. He threatened early on to walk away from the North American Free Trade Agreement, which American farmers like because it has expanded markets for their grains, meat, and dairy products. And although President Trump later committed to “renegotiate” the pact with Canada and Mexico, his blustering, bullying tactics (disconcerting even to his own negotiators) may blow up the deal anyway. Many farmers feel betrayed.

Things haven’t gotten better from there.

In October, the Trump USDA rolled back the Farmer Fair Practices Rules, which the previous administration put in place to give poultry and livestock farmers more power in marketing contracts with meat processing companies, and to make it easier for contract farmers to sue those companies. The rollback means that farmers lose their recently-gained protection from exploitation by the consolidated corporate giants who control and monopolize nearly every step of the meat and poultry production chain. The Farm Bureau approved.

And then there’s the Tax Cuts and Jobs Act recently passed by the president’s party in Congress and signed into law. Among the provisions pushed by the White House (and the Farm Bureau) was all-out repeal of the estate tax, which the president said would “protect millions of small businesses and the American farmer” from disaster. With the nonpartisan Tax Policy Center estimating that only about 80 small business and small farm estates nationwide would face any estate tax in 2017, PolitiFact labeled the president’s statement a “pants-on-fire” claim. (Ultimately, the bill doubled the existing estate tax exemption to $11 million per person.)

In the tax bill as a whole, some observers see more downsides than benefits for all but the richest farmers, and analysis by one national agricultural accounting firm indicates that the benefits to farmers will be temporary. Still, the Farm Bureau applauded the final bill, including its imaginary estate tax benefit for farmers.

“Farm Bureau” ≠ “Farmers”

And of course the Farm Bureau applauded it. Because the agribusiness CEOs and investors that will reap benefits from big tax cuts and other Trump administration policies are the organization’s real constituency. That doesn’t mean the president’s audience in Nashville next week won’t include many honest-to-goodness farmers and ranchers. There will be many, but their actual interests are not served by the Farm Bureau’s federal policy priorities.

Moreover, as actual farmers, they make up a fraction of the Farm Bureau’s claimed membership numbers. Here’s how the organization describes its membership in legal filings:

The American Farm Bureau Federation (AFBF), a not-for-profit, voluntary general farm organization, was founded to protect, promote, and represent the business, economic, social, and educational interests of American farmers and ranchers. AFBF has member organizations in all 50 states and Puerto Rico, representing about 6 million member families.

But the Farm Bureau’s own website simultaneously acknowledges that there are about 2.2 million farms in the United States today. Now, math isn’t my strong suit, but I’d say it’s impossible for most Farm Bureau “members” to be farm families. Instead, it appears that the organization’s membership figure has been vastly inflated by…wait for it…

Insurance customers. That’s right, many state farm bureau affiliates are heavily invested in, or actually operate, insurance companies. And many, many of the insurance customers that the Farm Bureau then automatically claims as “members” have little or no connection to farming.

Moreover, investments in insurance companies have made at least a few state Farm Bureau affiliates spectacularly wealthy. For example, IRS documents from 2013 show that the Iowa Farm Bureau had investment income topping $46 million and total assets exceeding $1 billion that year.  So it’s easy to see why the Farm Bureau would promote big business interests while opposing programs and legislation that would benefit the majority of farmers.

Farmers deserve better champions

Still, many farmers get their information about the public policies that affect them from the Farm Bureau. And the farmers in Nashville will surely be seeking help—from the Farm Bureau and from the president—to succeed in a perilous farm economy that may become a full-blown farm crisis; to cope with the flood, drought, and wildfire disasters that are becoming increasingly common; and to pass down thriving farms to the next generation.

Mr. Trump seems to think he can drop in on the Farm Bureau’s annual shindig and tell them how much he cares about farmers (“believe me”). Perhaps he’ll deliver one of his infamous rally speeches, and maybe the assembled crowd of Farm Bureau leaders in Nashville will eat it up. Or maybe they will realize that it’s all talk, and the administration doesn’t really care about the interests of the average American farm family. As for the president and his team, I suspect they are mistaken to equate the Farm Bureau with “farmers” (and by extension, rural voters), but that remains to be seen.

Ultimately, farmers need more than speeches and slogans. They need real investments in their communities, in research and technical assistance, and in the long-term viability of farming. But it’s unlikely they’ll get it from the Farm Bureau, or the Trump administration.

Anti-Science Nominee Kathleen Hartnett-White Faces Renewed Scrutiny for Top Environmental Post

UCS Blog - The Equation (text only) -

Hartnett-White's extreme views and lack of understanding of basic scientific issues were on full display at her confirmation hearing before the Senate Environment and Public Works Committee.

Kathleen Hartnett-White was an ill-advised nomination to lead the White House Council on Environmental Quality (CEQ) from the get-go.

Perhaps the Trump administration assumed her nomination would fly under the radar. CEQ is not exactly a high-profile entity; you don’t hear or read much about it. Nor is there much public discussion about the National Environmental Policy Act (NEPA) that the CEQ oversees, even though NEPA is considered the Magna Carta of Environmental Law—so admired it has been replicated by nations around the world. (For more, here’s some quick background on CEQ and NEPA).

Serving industry interests over the public interest

Be assured, however, that regulated industries (especially the petrochemical industry) are well aware of NEPA. Some may be counting on Hartnett-White to weaken its environmental and public health protections. One commenter even noted that Hartnett-White’s nomination is a “game changer” in terms of CEQ ensuring agency enforcement and implementation of NEPA; the commenter mused that “It seems Trump has a very different role in mind, and CEQ is being lined up as a streamlining agency to make sure permitting is happening more quickly…. Maybe the idea is that CEQ will be pushing agencies to get things done quicker and not get bogged down in broader NEPA reviews.”

Along with other Trump appointees—like EPA Administrator Scott Pruitt, Interior Secretary Ryan Zinke, and Energy Secretary Rick Perry—Hartnett-White, if confirmed, seems destined to serve industry interests over the public interest. That should worry all of us.

Facing intense scrutiny

Hartnett-White’s nomination hasn’t escaped scrutiny in either mainstream or social media. It got a lot of press (here, here, here, here). More than 300 scientists sent a letter to the Senate opposing her nomination “because one thing more dangerous than climate change is lying.” (More here.) And she did herself no favors at her confirmation hearing before the Senate Environment and Public Works Committee, where her extreme views and lack of understanding of basic scientific issues were embarrassing and on full display (see for yourself).

Her written responses to EPW Committee questions for the record also reiterated some of her anti-science views on climate change, particulate air pollution, mercury and air toxics, and the Clean Air Act. And then of course there is the fact that she actually plagiarized some responses from the answers submitted by other Cabinet nominees.

Despite her embarrassingly poor performance at the hearing and her historically extreme views, Hartnett-White was narrowly voted out of the EPW Committee on a strict party-line basis on November 29, 2017. But her nomination did not make it to the Senate floor in 2017.

Dangerous, outside the mainstream, and unfit to lead CEQ

Although CEQ is relatively small and unknown, it plays a critical role in our nation’s public health and environmental protection, especially through NEPA. And Hartnett-White’s views on climate science, air pollution and health, clean and renewable energy, and the role of science in public policy are dangerous and outside the mainstream. Our country needs and deserves a more qualified candidate to lead CEQ.

Kathleen Hartnett-White vs. Science

These few snippets of her views and prior statements make their own case—and you can see more here.

Hartnett-White: “Ambient PM [particulate matter, a.k.a. soot] levels in the United States today are low and I do not believe that PM at these levels pose a health hazard. There is considerable uncertainty in the scientific literature about whether exposure to PM actually causes adverse health outcomes and, if it does, at what concentration effects may occur.” (Written submission to EPW questions for the record, 2017: Question 34, page 13.)

Science: Numerous scientific studies document adverse health effects of particulate matter (e.g., here and here). A 2017 Harvard study could find no evidence of a safe level of exposure to smog or particulate matter.

Hartnett-White: “Carbon dioxide is not a pollutant, and carbon is certainly not a poison. Carbon is the chemical basis of all life on earth. Our bones and blood are made out of carbon. A natural, trace gas in the Earth’s atmosphere, invisible and odorless, carbon dioxide does not contaminate the air as genuine pollutants can do. Ambient CO2 has zero health impacts. This falsely maligned natural gas is better known as the “gas of life” because it is a necessary nutrient for plant growth — the food base of life on the planet earth.” (Op-Ed in Austin American-Statesman, 2016)

Science: Credible scientists and institutions recognize CO2 as the most important and dangerous driver of climate change; it remains in the atmosphere for decades and, as a climate pollutant, is associated with a host of health impacts. See here, here, here, and here.

Hartnett-White: “IPCC science claims of 95 percent certainty that human activity is causing climate calamity are more like the dogmatic claims of ideologues and clerics than scientific conclusions.” (June 2014 TPPF policy document, titled “Fossil Fuels: The Moral Case”).

“There’s a real dark side in the kind of paganism, the secular elites of religion now being evidently global warming.” (From 2016 interview on the TRP Show, The Right Perspective)

Science: IPCC’s 5th Assessment Report (AR5) is the result of the collaboration of over 800 scientists from 80 countries and the assessment of over 30,000 scientific papers.

Hartnett-White: “I am not at all persuaded by the IPCC science that we are standing on some precipice…. We’re not standing on a cliff from which we are about to fall off.” She also called the scientific conclusions from United Nations panels “not validated and politically corrupt.” (Washington Post, 2017)

Science: IPCC, 5th Assessment Report (AR5), Summary for Policy Makers (SPM), pages 8-16:  “Continued emission of greenhouse gases will cause further warming and long-lasting changes in all components of the climate system, increasing the likelihood of severe, pervasive and irreversible impacts for people and ecosystems.”

Public health experts also sound the alarm. The Lancet Countdown on health and climate change “…exposes the urgency for a response as environmental changes cause damaging effects on health worldwide now.”

Hartnett-White: “[The IPCC] never really takes on an explanation of how the other variables in climate affect climate. […] It never takes on the Sun.” (Speaking at Ars Technica, 2016)

Science: IPCC SPM, page 8, shows the negligible contribution of “natural forcings” compared with “anthropogenic forcings.” See also NASA data, Figures 1 and 2, and National Academy of Sciences, page 9.

Hartnett-White: “Most green energy policies undermine human progress.” (from her book Fueling Freedom, co-authored with Heritage Foundation fellow Stephen Moore).

Science: See, for example: Renewable Resources: The Impact of Green Energy on the Economy; The Economics of Renewable Energy: Falling Costs and Rising Employment; more here and here.

Hartnett-White: “Government by popularly elected representatives on the one hand and the government by federal administrators swearing by the authority of science, on the other hand, are contradictory notions. I would call the latter, moreover, an acutely dangerous notion. Regrettably, in the modern United States these two incompatible policy-making models clash often, and with dire results. Elected officials trying to carry out their public duties—e.g., maximizing access to clean, affordable energy—meet stubborn opposition from federal mandarins brandishing their scientific credentials.”  (Texas Public Policy Foundation, 5/18/2012)

Science: To the benefit of public health, federal scientists have “brandished their scientific credentials” on a host of toxic substances (e.g., lead, arsenic, silica), not to mention on HIV, bird flu, food safety, etc. (Add your favorite to the list here.) This recent op-ed by former EPA Administrator Ruckelshaus is also worth a read.

A second bite at the apple?

There may be hope yet. Senate rules require that all unconfirmed nominations still pending at the end of the first session be sent back to the White House—unless there is unanimous consent that they be held over without re-nomination. If a nomination is sent back, the confirmation process begins all over again. And, thanks to Senator Tom Carper of Delaware, the Hartnett-White nomination has gone back to the White House. This means President Trump would need to re-nominate her—or someone else—and there may be another EPW hearing and vote before any nominee goes before the full Senate for confirmation. (Read about this here.)

This provides several opportunities for a re-think. Hartnett-White could spare herself further embarrassment and withdraw from consideration, or the administration could withdraw her nomination and instead put forward a more qualified candidate for this important post.

If Hartnett-White is re-nominated, EPW Committee members who previously voted for her could put party politics aside and NOT send her nomination to the Senate floor. And, if all else fails, the full Senate could look honestly at her record, her anti-science views, her coterie of ardent benefactors among polluting industries, and her own deep conflicts of interest, along with her questionable support of, and at times outright hostility toward, our fundamental environmental laws. And then the Senate itself could do the right thing and reject her nomination.

Time to speak up. Again.

We’ve seen how our collective voices can make a difference. Public pressure and scrutiny have already forced other unqualified candidates to withdraw, including Sam Clovis, who was nominated to be chief scientist at the US Department of Agriculture, and Michael Dourson, who was nominated to be Assistant Administrator for Toxic Substances at the EPA.

We need to exercise our voices (and writing skills) yet again to oppose this egregious nomination.

Call your senators today at 866-580-8532 and urge them to oppose the nomination of Kathleen Hartnett-White. You can prepare with more information and talking points to help you have as effective a call as possible.

If the White House does nominate her once again, call your Senators and ask them to oppose her confirmation. You can also help fight this nomination by penning a letter to the editor or an op-ed for your local paper (some helpful hints here), and speaking out on social media.

We’ve done it before, and we can do it again. But time is of the essence; the time to act is now.

