Combined UCS Blogs

Did My Tea Leaves Reveal the Supreme Court’s Upcoming Gerrymandering Ruling?

UCS Blog - The Equation (text only) -

This morning, I stirred my green tea vigorously to see if it would reveal the Supreme Court’s opinion on two partisan gerrymandering cases that are soon to be released. The tea spilled, I scalded my lap, and I wondered why any Decent American Patriot would sip tea while the nation awaits a decision of such historic significance. I then made a cup of coffee and resolved to give up fortune telling. So I won’t try to predict where the Court will come down on the constitutionality of partisan gerrymandering. However, I will offer some guideposts to help interested parties (see what I did there) understand the significance of the decision when it comes.

1. Is there a real decision?

It is always possible that SCOTUS will decide to have the cases re-argued next session if there is serious fragmentation of opinion about what constitutional principles, if any, should govern partisan gerrymandering. Of course, that did not stop the Court from issuing a fragmented opinion in Vieth v. Jubelirer, the 2004 decision that left state legislatures free to gerrymander without restraint in the 2011 redistricting cycle. Or the Court could decide that plaintiffs in the first Wisconsin case, Gill, lack standing because they were not harmed within a gerrymandered district. That outcome could have serious implications, and could depend on who writes the majority opinion.

2. Who writes the opinion?

While all eyes have been on Justice Kennedy as the decisive swing vote in these cases, Chief Justice Roberts is the only justice who has not yet written a majority opinion from this session, which makes it more likely that Roberts will be the author. The possibility of a Roberts opinion has led to speculation at Election Law Blog and other sites about the possibility that the Court will take a narrow, district-level approach, focusing on arguments such as those offered by Republican plaintiffs in the Maryland case, Benisek.

As Gill counsel Nick Stephanopoulos has already pointed out, this would be a misguided approach for SCOTUS to take if the goal is to conservatively reduce the number of applicable cases and thus restrain court intervention. Moreover, the logic of state-imposed harm on all voters of the targeted party is inescapable and would inevitably make its way back into legal arguments. As Justice Kennedy has acknowledged, it is the state that imposes the inequity, and the harm is a state-level one: it is the number of seats denied the opposition party, out of all the seats in the statewide districting plan, that causes targeted voters (who voted for the opposition party) to suffer vote dilution.

An opinion that does the work that Kennedy and the liberals require, but is narrow enough for Roberts to be on board, will likely require more than a demonstration of intent to discriminate.  Harm will have to be demonstrated empirically, with clear evidence that the relationship between party vote and seat shares has been intentionally manipulated to punish voters who favor the opposition party.  And that takes us back to some of the fundamental scientific questions that gave rise to these cases in the first place.

3. What kind of rights are we talking about? Equal Protection? Free Speech and Association?

One of the most interesting aspects of these cases from the perspective of constitutional theory resides in the variety of ways that plaintiffs and lower courts have linked the harm of gerrymandering to constitutional protections. Traditionally, gerrymandering cases have relied on equal protection arguments, specifically the 14th Amendment, to protect voters from districting plans that don’t treat them equally. Alternatively, Justice Kennedy in particular, and the Court more generally, have been more receptive to “free speech” arguments as of late, especially in campaign finance and other election law cases, so this has become a more popular strategy.

The basic claim behind this strategy is that a vote cast is a form of expressive association, such that diluting or suppressing the value of that act violates the 1st Amendment.  There is considerable disagreement over the extent to which such claims are still implicitly dependent on the equal protection provided by the 14th Amendment, so it is certain that the Court’s response to these claims will shape future litigation and legislation.

4. Will the Court rely on a single metric to determine harm?

Almost certainly not, but the Court could set parameters and narrow the bounds of applicable cases by emphasizing that in the two cases in question, all of the empirical measures relied on by lower courts converged. That is, in the worst cases of gerrymandering, it doesn’t matter which metric is used: measures of partisan bias, the efficiency gap, and the mean-median gap will all show that a plan gives an asymmetric advantage to the voters of one party over another.

At the same time, the majority decision, or concurring opinions, could provide more support to some metrics over others. The efficiency gap is among the newer kids on the block and should receive a good deal of attention, but the model of partisan asymmetry was developed over 20 years ago and is still dominant in the field. Of greater interest to those following the election science is the degree to which the Court considers the constitutional implications of these different measures, which are significant. Specifically, as litigation and legislation move forward, such arguments will be relevant for clarifying just what the constitution demands of our electoral systems, and how we can distinguish its bugs from its features.
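For readers curious how two of these metrics are actually computed, here is a minimal Python sketch. The function names and the toy two-party vote totals are my own illustration, not anything drawn from the actual litigation or the expert reports:

```python
import statistics

def efficiency_gap(districts):
    """districts: list of (party_a_votes, party_b_votes) tuples, one per district.
    A positive result means more Party A votes are wasted (the plan favors B)."""
    wasted_a = wasted_b = total = 0.0
    for a, b in districts:
        total += a + b
        needed = (a + b) / 2.0      # votes needed to win the district
        if a > b:
            wasted_a += a - needed  # the winner's surplus votes are wasted
            wasted_b += b           # every losing vote is wasted
        else:
            wasted_b += b - needed
            wasted_a += a
    return (wasted_a - wasted_b) / total

def mean_median_gap(districts):
    """Party A's median district vote share minus its mean share.
    A negative value suggests Party A's support is distributed inefficiently."""
    shares = [a / (a + b) for a, b in districts]
    return statistics.median(shares) - statistics.mean(shares)

# A toy "pack and crack" plan: Party A is packed into one 90-10 district and
# cracked across three 45-55 districts, winning the statewide vote 225 to 175
# but taking only 1 of 4 seats.
plan = [(90, 10), (45, 55), (45, 55), (45, 55)]
print(efficiency_gap(plan))   # 0.375: a large net share of wasted A votes
print(mean_median_gap(plan))  # -0.1125: A's median share trails its mean
```

On this classic packed-and-cracked plan, both measures point in the same direction, Party A disadvantaged, which is exactly the convergence the worst gerrymanders exhibit.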

5. How much is too much?

Again, it would be surprising for the Court to establish an empirical metric of “x percent.” Rather, a workable, manageable threshold would have to reflect both what is constitutionally required and respect for judicial restraint.

This is why Maryland seems like an especially important case, in that a decision overturning that state’s Democratic gerrymander (the governing party manufactured an extra seat when it was already dominant) would provide a rather clear guideline, a one-seat principle. That is, if it can be shown, through whatever metrics, that an adopted plan effectively and reliably denies the opposition party’s voters at least one seat, the minimum required for vote dilution to occur, there would be grounds for overturning the plan.

If the Court can provide lower courts with such guidance on how much inequality is too much, that is as much as we can ask for. For the current situation is clearly too much, in the opinion of experts and citizens alike.

A Great Day for Offshore Wind: Massachusetts, Rhode Island, New Jersey All Go Big


Photo: Derrick Z. Jackson

Offshore wind power is a powerful, plentiful resource, but that doesn’t mean that it’s been a slam dunk in terms of getting it into the US electricity mix. Movement forward on offshore wind in three different states, though, made yesterday a day to celebrate.

1. Massachusetts says yes to 800 megawatts

The state we’d been watching this week was Massachusetts. Yesterday was to be the date for an announcement about which offshore wind project or projects had been selected for the first phase of a 1600 megawatt commitment from the state based on a 2016 energy law.

And the day didn’t disappoint. While the law required at least a 400 megawatt first tranche, the state announced that an 800 megawatt proposal from Vineyard Wind was the winner of this round. The larger project was a pleasant surprise, and likely brought with it considerably lower pricing.

That amount of power (as our handy new offshore wind calculator shows) will generate electricity equal to the consumption of more than 400,000 typical Massachusetts households. It will also, given the electricity mix and what that offshore wind power might displace, reduce carbon emissions by the amount emitted by almost 200,000 cars.
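As a sanity check on that household figure, here is a rough back-of-the-envelope sketch. The ~45% capacity factor and ~7,200 kWh of annual household use are my own assumptions for illustration, not the calculator’s actual inputs:

```python
CAPACITY_MW = 800
CAPACITY_FACTOR = 0.45          # assumed for modern offshore turbines
HOURS_PER_YEAR = 8760
HOUSEHOLD_KWH_PER_YEAR = 7200   # assumed typical MA household (~600 kWh/month)

# Expected annual generation, then how many households that covers
annual_mwh = CAPACITY_MW * CAPACITY_FACTOR * HOURS_PER_YEAR
households = annual_mwh * 1000 / HOUSEHOLD_KWH_PER_YEAR
print(f"{annual_mwh:,.0f} MWh/yr ≈ {households:,.0f} households")
# → roughly 3.15 million MWh, or about 438,000 households
```

Under those assumptions the estimate lands just above 400,000 households, consistent with the figure in the announcement.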

All that requires actually getting the wind farm built and the turbines spinning. But yesterday’s step was an important one.

2. Rhode Island goes for 400 megawatts

Another pleasant surprise from yesterday was the announcement that Rhode Island had taken advantage of the same bid process and selected a 400 megawatt project of its own.

While the announcement was a surprise, Rhode Island’s commitment to offshore wind isn’t. The new project-to-be, from Rhode Island-based developer Deepwater Wind, will build on the state’s (and Deepwater’s) experience with the first-in-the-nation 30 megawatt Block Island Wind Project. And it fits within Gov. Gina Raimondo’s recent call for 1,000 megawatts of renewable energy for the Ocean State by 2020.

Rhode Island has already shown it knows how to get offshore wind done. While the next project will be in federal, not state, waters, that experience is likely to count for something in the race to get the next steel in the water.

3. New Jersey grabs a piece of the limelight

Not to be outdone, New Jersey also used yesterday to move offshore wind forward. Gov. Phil Murphy signed into law a 3,500 megawatt state goal that the legislature had recently passed. That’s the largest state commitment to date, and the latest in the crescendoing drumbeat of state action on offshore wind.

And the first tranche of Garden State action may be even larger than what Massachusetts and Rhode Island just moved forward on. Just after coming into office, Gov. Murphy ordered the state’s public utility commission to carry out a solicitation for 1,100 megawatts of offshore wind.

Offshore wind means jobs (Credit: Derrick Z. Jackson).

While megawatts may be the stuff of headlines, each of those projects and commitments is about a lot more—jobs in the near term, and air quality improvements, carbon reductions, careers, and more once the projects are up and running.

What’s next?

All that is particularly true as even more states get into the act. So where should we look next for leadership on offshore wind?

Connecticut could be poised to join its neighbors as it makes decisions about proposals for meeting its own renewable energy needs. The bids included proposals from Vineyard Wind and Deepwater Wind, plus Bay State Wind, the other entity vying for Massachusetts’ and Rhode Island’s attention.

It’s also unlikely that New York is going to stay quiet, given its new offshore wind master plan, a 96 megawatt project planned for off Long Island’s South Fork (also being developed by Deepwater), the record-breaking lease sale off New York City in late 2016, and federal moves to evaluate more potential sites in the New York Bight.

Or we could be hearing more from Maryland, with two projects making their way forward with state support. Or Virginia, with a pilot 12 megawatt project. Or Delaware, or North Carolina, or…

Lots of future to watch—and make happen—even as we celebrate the immediate past. Because, given our need for clean energy and good jobs, and given the incredible potential of offshore wind, we’ll be wanting a lot more days like yesterday.


Photo by Derrick Z. Jackson

EPA Extends Comment Deadline, Schedules Hearing on Science Proposal After Pretty Much Everyone Complains


The EPA today extended the comment deadline to August 16 on its proposal to restrict the types of science that can be used in EPA decisions after pretty much everyone—from the American Home Builders Association to the American Geophysical Union—complained that a thirty-day comment period was grossly insufficient for a rule with such potential wide-ranging consequences. The EPA also scheduled a public hearing to be held in Washington, DC on July 17.

The EPA’s proposal would prevent the EPA from using many public health studies when making decisions. Scientists now have more time to comment on the potential harm that this proposal would have on public health and the environment.

The move gives scientists the ability to develop more sophisticated comments and ensure that their peers have the opportunity to detail how the rule would impact their own public health research and its use in EPA decisions—and to submit for the record specific studies that could be set aside. It is important for scientists to explain how and why specific communities would be harmed by excluding legitimate, peer-reviewed public health research from consideration by EPA.

In just three short weeks, nearly 100,000 comments were submitted.

From the beginning of the comment period, scientific organizations repeatedly and pointedly repudiated the EPA’s claim that the new rule is consistent with scientific transparency standards. The EPA heard from both industry and the science community that the short comment period on such a vague and badly written rule was wholly inadequate and possibly even in violation of the Clean Air Act and other statutes. Now scientists will have a few more weeks to fully detail the impact that such a fatally flawed rule would have on public health and the environment.

UCS and its partners have produced a guide for scientists and organizations on filing an effective public comment on this rule, and will be encouraging people to provide testimony at the July 17 hearing.

Experts ask Exxon and Chevron to Stop Climate Deception and to Act to Protect Human Health



Next week, I’ll be joined inside the ExxonMobil and Chevron annual meetings by scientists, environmental justice advocates, and UCS colleagues—all of us representing shareholders concerned about climate change. These meetings are the one time every year that corporate CEOs, board members, and top management have to face their investors—and thus a rare opportunity for us to spotlight the impact of corporate decisions made behind closed doors. Those of us attending the meetings will have plenty of questions about both companies’ failure to plan for a carbon-constrained world, their outsize responsibility for global warming impacts, and their lobbying against climate action.

But why should a handful of us in Dallas, Texas, and San Ramon, California have all the fun? (Indeed, US Senator Sheldon Whitehouse raised serious questions about the assumptions underlying ExxonMobil’s recent climate risk report in a floor speech this week).

We wanted to know what questions other experts would ask if they had two minutes in front of the decision-makers of ExxonMobil and Chevron and the people they care most about—their shareholders, employees, retirees, financial analysts, and the business media.

Three early career scientists stepped up to the virtual mic:

  • Benjamin Franta is a PhD student in history at Stanford University. His research focuses on the history of climate science and the American petroleum industry. He has a PhD in applied physics from Harvard University and is a former research fellow at the Belfer Center for Science and International Affairs at the Harvard Kennedy School of Government.
  • Ploy Achakulwisut is a Postdoctoral Scientist at the George Washington University Milken Institute School of Public Health. She has a PhD in Atmospheric Science from Harvard University.
  • Leehi Yona is a graduate of Dartmouth College and the Yale School of Forestry and Environmental Studies. She is an incoming PhD candidate in environment and resources at the Stanford School of Earth, Energy, and Environmental Sciences. Leehi, who was named Canada’s Top Environmentalist Under 25, is a coauthor of “The role of college and university faculty in the fossil fuel divestment movement,” published this week in Elementa.

Here are their questions. What are yours? Share them on Facebook and Twitter using #ExxonAGM or #ChevronAGM. We’ll be highlighting some of your responses during ExxonMobil’s and Chevron’s annual meetings next week.

Dr. Benjamin Franta: Will Chevron renounce its history of deception by leaving the American Petroleum Institute?

Chevron and other major oil companies are members of the American Petroleum Institute (API), the industry’s largest trade association. The catch? For decades, the API has promoted disinformation about global warming, despite its extensive knowledge of the problem. That fraudulent behavior is owned by Chevron too—a connection that may haunt the company as fossil fuel litigation grows.

Consider this. In 1959—almost sixty years ago—the API and other heads of the oil industry were warned by the physicist Edward Teller that their products would cause global warming and sea level rise. Even at that time, the basic chain of cause and effect was clear: fossil fuels would contaminate the atmosphere with greenhouse gases, global warming would result, and the results for humanity would be serious. The API even commissioned its own private study on the problem in 1968, which confirmed the warning. And in 1980, its secret CO2 and Climate Task Force (which included members from across the oil industry) was informed that fossil fuels would cause global warming by the 2000s and “globally catastrophic effects” by the 2060s.

So the American Petroleum Institute (and its members) knew about the harms of its products for decades. What did it do with this knowledge? First, for years, it kept mum. Then, when states around the world made plans to reduce fossil fuel use, it led the charge on denial and disinformation. It helped lead the Global Climate Coalition, the ironically named industry alliance that denied climate science, deceived the public, and blocked or neutered climate policies like the Kyoto Protocol (Chevron was also a member). And it cultivated a crop of economists-for-hire, who for decades provided misleading reports used to convince Congress, the nation, and the world that it couldn’t afford to reduce fossil fuel use. (Trump’s Paris pullout, involving the same economists, is simply the latest in this long-running strategy of deception and delay.)

The American Petroleum Institute knew about the dangers of its products. Then it lied about them. That’s called fraud. And Chevron participated in that fraud. As lawsuits target the company, will it renounce this history of deception by leaving the American Petroleum Institute?

Dr. Ploy Achakulwisut: How will ExxonMobil and Chevron incorporate the co-benefits of reducing air pollution into their planning for a carbon-constrained world?

Each year, 6.1 million lives are lost prematurely due to air pollution. A significant contributor to this global public health crisis is fossil fuel combustion. Decades of research have revealed that exposure to air pollution is associated with a wide range of adverse human health impacts, including asthma, cancer, heart disease, stroke, and premature birth. There is also emerging evidence that pollution from coal combustion and motor vehicles can cause development delays, reduced IQ, and autism in children. In planning for a low-carbon future, fossil fuel companies should also take into account the societal and economic costs of air pollution resulting from reliance on their products.

What will ExxonMobil and Chevron do to ensure that in planning for a 2° C scenario, they recognize that faster emissions reductions have immediate co-benefits in reducing the burden of human death and disease from air pollution?

Leehi Yona: How do ExxonMobil and Chevron plan to rectify their misinformation and deception, and stop trying to influence academia?

I just received my Master of Environmental Science from Yale University this week and will be pursuing a PhD at Stanford this fall. As a climate researcher and a young person, I am concerned about the influence of the fossil fuel industry within academia. I have seen the ways in which institutions have been influenced by fossil fuel interests, including ExxonMobil and Chevron. Fossil fuel companies hide behind their relationships with prestigious university research departments in an attempt to improve their image by affiliation, knowing that industry-funded studies are more likely to produce industry-favored results. This funding often comes with strings attached, or isn’t properly disclosed.

Time and time again, you have deliberately misinformed and deceived the public on climate change. You have used your power and resources to attempt to influence academic institutions. How do you plan to rectify this reckless behavior?


Next Wednesday, UCS will be asking ExxonMobil and Chevron about their inadequate climate risk reports and their ongoing climate deception—and we’ll use the hashtags #ExxonAGM and #ChevronAGM to highlight some of your questions during the companies’ annual meetings.

Many thanks to my colleague Ortal Ullman for their help gathering expert input for this blog. 

Between Two Terns: A Conversation on Endangered Species and Social Justice


Pictured: The interior least tern (Sterna antillarum), a federally protected endangered species. USFWS

Endangered Species Day was introduced as a resolution by Congress in 2006 to encourage “the people of the United States to become educated about, and aware of, threats to species, success stories in species recovery, and the opportunity to promote species conservation worldwide.” This year, Endangered Species Day (May 18) began with a devastating school shooting. It really had me questioning how appropriate it would be to emphasize the importance of wildlife conservation while so many in the world and our nation seem to place little value on human lives. At a time when human rights are being enthusiastically attacked by the Trump administration, however, it has become necessary to think critically about how our nation promotes policies that undermine public protections and the way this affects vulnerable communities. Basically, I realized that there are connections between our wildlife conservation policies…and the social disparities built therein.

Hear me out. The connection is not necessarily obvious at surface level, I understand. But social justice is at the core of environmentalism. Conservation works to ensure the preservation of cultures, heritage, and livelihoods. The spaces we often deem devoid of “nature” or “environment” are not as readily included in conservation conversations, often at the risk of alienating entire communities and ecosystems. From pristine lands to over-burdened industrial areas, environment is all around us.

I had a conversation with Lia Cheek, fellow woman of color and colleague at the Endangered Species Coalition, to further explore the relationship between endangered species protections and social justice.

Defining environment

Charise: Why do you think the way we view the environment is important for conservation and how is this tied to social justice?

Lia: We look at nature as something to use up, something that exists to serve our needs. We look at it without emotion, without acknowledgement of the life it holds and its right to existence. Even the words we use to describe it, “Nature,” “the natural world,” are inanimate.

Charise:  I like how you emphasized the idea of Nature with a big N. When we view it that way, it tends to be exclusionary of underrepresented groups – and that spills over into environmental regulations and even the research questions that are asked. We see this especially with policies and processes that are based solely on economic considerations, with very little regard for both science and community input.

There is also a tendency to forget that “environment” includes built environments, urban areas. Loss of biodiversity affects us all. And we’ve seen the benefits of conservation in urban areas: greater accessibility to green spaces improves mental health and well-being, markedly increases perceived safety, means cleaner air to breathe, and protects and restores terrestrial and aquatic species. The assumption that city-dwellers (especially those who aren’t as socially privileged) do not care about or benefit from species biodiversity in their communities, that they do not notice when the trees are cut down and the birds stop singing, is unfounded. Social justice is the fair treatment of others. We should not put the needs of wildlife above those of humans; rather, we should treat both fairly, and consider more than just our wallets and convenience. It is unjust to distribute resources unfairly, and it is unfair to expect those being treated unjustly to consider conservation their top priority.

Lia: Sure! This is part of the same thread. The way we currently manage wildlife and natural areas feels a lot like colonialism. It’s all about control, isn’t it? Controlling the populations of animals that we find inconvenient, like predators, and boosting the populations of species that we gain an economic benefit from. That same mindset is built into our other government institutions, which are built around increasing profit and subduing inconveniences, and these goals can often mean stepping all over people’s rights; case in point, the battle at Standing Rock and the Keystone pipeline. It’s a very ego- and self-driven model that is in the fabric of the way our country is run. The question then becomes: who is this system of benefits really for, and how do we make our institutions expand the circle of who benefits from this policy of profit to include folks who have been marginalized?

Wildlife and social justice

Charise: How is wildlife conservation, specifically, a social justice issue?

Lia: The underlying decision is the same: using differences to “other” a community or another life, rather than recognizing the similarities. When you “take” an animal without awareness of or respect for its right to existence, without acknowledgement that it has a purpose, a desire, a meaningful existence beyond fulfilling your intended use for it, or without understanding that it experiences moments of joy and understands what family is just as you do, that is the same act of “othering” that creates space for injustice and the violation of human rights when they become inconvenient. The refusal to recognize another life as similar to one’s own is the choice that is at the heart of both colonialism and extinction.

When we think about what it means for a species to go extinct, to cease to exist in any form or feather, memory or song, forever, this knowledge can manifest such a deep sadness in us that we try to turn away from it to protect ourselves. We push away the instinctual pain that comes with the knowledge that we’ve lost a species to extinction, or the pain and fear we feel when we hear about the injustices committed against African Americans by the institutions we are a part of, or the empathy we might feel with immigrant families being torn apart while we stand by and watch. We can choose to close our eyes to the painful and frightening, but when we do, we are also closing our eyes to the humanity of others and to the connection we have to life on earth. And this is important because we make this choice every day: when we choose to stand up and speak out about an injustice or sit quietly and watch it play out; when we choose to open that email asking for our help, or delete it. It’s something about ourselves that we all need to be aware of and watch carefully.

Charise: Yes, beautifully put. I would add that the right to existence is what makes this a justice issue, not just for wildlife, but for people. Through diversity of life, we can exercise our human rights to food, health, and culture. If certain people are not given access to this right, that is unjust. On the flipside, if certain groups are not provided with the basic freedoms afforded others based on race, income, religion, or otherwise, we cannot expect conservation efforts to succeed. We can’t say we’re dedicated to conservation when there are still people being eradicated through the country’s prison pipeline, gun violence, and toxic pollution, with little input on solutions.

Conservation requires conversations

Species conservation is necessary for the protection of wildlife, a valuable natural resource. With so many attempts to dismantle science-based environmental regulations, we are putting more than our natural resources at risk. But we can change the narrative of who gets to benefit from “nature.” We can push for more consideration of traditional ecological knowledge (TEK) in scientific research and policy decisions. Instead of stifling community members or excluding them from discussions outright, we have to listen to and incorporate the problems and solutions they have already identified. Addressing the inherent biases in our institutions from an intersectional perspective is the first step in serving vulnerable communities justly. You can start by joining the conversation. If you’d like to learn more about how our Science Network members engage in their communities around justice-based issues, check out our Science for Justice blog series.


Is PREPA Ignoring Its Commitments and the Benefits of Renewable Energy?


Satellite images of Puerto Rico at night, before and after Hurricane Maria: Puerto Rico in the dark after Maria.

It is hard to believe that Walter Higgins, executive director of the Puerto Rico Electric Power Authority (PREPA), told the Puerto Rico Senate last week that renewable energy is “totally unknown” territory for his agency. Does this mean that PREPA has completely ignored the clean energy commitment it took on back in 2010?

Moreover, why is Higgins promoting the use of coal at a time when coal-fired electricity generation is less and less competitive compared with renewable energy? And after the tragedy of Hurricane Maria last year, how can PREPA disregard the decisive role that solar power and energy storage have played in the island’s recovery, and prefer to depend on imported coal?

Here are four points I find essential for Higgins to understand about the past and future of energy in Puerto Rico and beyond.

1. In 2010, the island committed to generating at least 20% of its electricity from clean energy by 2035

Under this requirement (the Renewable Electricity Standard), since 2015 PREPA has had to meet a series of interim targets, including generating at least 12% of its electricity from clean energy between 2015 and 2019. Today, only 2% of the island’s electricity comes from renewable energy, while more than 90% comes from burning fossil fuels such as oil, natural gas, and coal.

Given Higgins’s statement before the Senate, could PREPA’s executive director be so badly informed that he is unaware of a requirement his agency has been obligated to meet since 2015?

2. The cost of solar power has dropped dramatically in recent years

The cost of solar power has fallen dramatically, by more than 70% since 2010. It is baffling that Higgins and his technical advisor from the Environmental Division, María Mercado, claim that it is more cost-effective to cling to nearly obsolete models of electricity production, such as centralized generation from large coal-fired power plants.

Today, electricity generation in Puerto Rico depends on imported fossil fuels such as oil, natural gas, and coal. This heavy dependence exposes the electric system to the risk of sharp global fluctuations in fuel prices. On top of that, Puerto Ricans pay more per kilowatt-hour than almost anyone else in the United States.

El alto costo que la quema de combustibles fósiles tiene en la salud, bolsillo, y medio ambiente de los puertorriqueños son bien sabidos. La quema de estos en la centrales Palo Seco, Cambalache, y Aguirre emiten 50 por ciento más dióxido de carbono por megavatio-hora que el gas natural, y producen casi tanto dióxido de sulfuro y óxidos de nitrógenos (dos contaminantes peligrosos para la salud) como el carbón.  En Cataño, donde se encuentra la central de Palo Seco (la tercera más grande en la isla), el riesgo de ataques de asma se eleva cuanto más cerca resida la población de las fuentes de contaminación.

La energía solar y la energía eólica en cambio no requieren de importar ni pagar por el uso del viento o el sol, ya que estos son recursos limpios, locales y cortesía de la naturaleza.

3. La energía solar y el almacenamiento energético han apoyado a los puertorriqueños luego de María

Adicional a la trágica pérdida de vidas y los innumerables daños materiales, la llegada de Maríadejó a la mayoría de los 3.4 millones de habitantes de la isla totalmente a oscuras. Hospitales, refugios y otros lugares clasificados como infraestructura crítica padecieron la carencia de electricidad, lo cual causó que por ejemplo muchos doctores hayan tenido que realizar procedimientos en la oscuridad. Aún 6 meses luego de la tormenta, cerca de 200.000 familias y negociospermanecían sin electricidad.

Los páneles solaresy los sistemas de almacenamiento energéticose convirtieron en un soporte a través del cual los puertorriqueños pudieron realizar actividades como cargar sus celulares, recibir terapia respiratoria con equipos especializados y tener los medios para organizarse y ayudarse los unos a los otros.

Es inconcebible que Higgins y María Mercado desconozcan ahora el papel fundamental que estas tecnologías han jugado en la recuperación de partes de la isla, y que como si fuera poco promuevan la quema de carbón para la generación de electricidad. La energía solar y el almacenamiento energético son vitales como medidas de resiliencia energética. Así mismo, luego de los efectos devastadores de María y en un tiempo en donde las tormentas se hacen cada vez más frecuentes y fuertes debido al cambio climático, todos tenemos la responsabilidad de disminuir las emisiones de gases de efecto invernadero.

4. Estar bien informado, la clave en torno a la palabra “autoridad”

Para ser la autoridad en un tema específico se requiere tener conocimiento del mismo. La conviene entonces a la PREPA, o Autoridadde Energía Eléctrica, hacer un alto en el camino e informarse mejor sobre los compromisos que debe cumplir (como es el requisito de energía limpia), y realizar un análisis detallado de los beneficios y costos de seguir con modelos insostenibles como el uso de carbón y otros combustibles fósiles en lugar de integrar energías renovables. Por ejemplo, el Instituto Nacional de Energía y Sostenibilidad Isleña (INESI)cuenta con expertos energéticos que podrian colaborar en dicho análisis.

Con seguridad, un análisis sólido mostrará que los beneficios de la transición a energías renovables son mucho más rentables que los costos de seguir atados a prácticas del antiguo milenio como la quema de combustibles fósiles.


A New Tailwind for Clean Energy, 15 Miles Offshore

UCS Blog - The Equation (text only) -

Photo: Erika Spanger-Siegfried/UCS

Massachusetts has a deep and bipartisan commitment to clean energy. State leaders and the public at large recognize that clean energy is not only an environmental imperative but a key economic strategy for a small state that relies mostly on brainpower and technology for prosperity. Republican and Democratic governors and the Massachusetts legislature have implemented a wide array of policies to make Massachusetts a national leader in this transition to clean energy.

But Massachusetts has had one disadvantage when compared to many other states—it does not have the wide open, windy land mass to build large wind farms, or available land for massive solar arrays, or the mountainous areas for large hydroelectric installations.

But just a few miles offshore is a resource that has been referred to as the “Saudi Arabia of wind energy”—the Atlantic Ocean. And after some false starts, Massachusetts and its neighbor Rhode Island have just taken a big step closer to taking advantage of plentiful and steady Atlantic ocean winds, jump-starting a whole new industry in the United States: offshore wind.

What Massachusetts and Rhode Island just did

Today, Massachusetts approved a bid by a company known as Vineyard Wind to build an 800 megawatt wind farm in a wide-open ocean tract more than 15 miles offshore. This wind farm is phase 1 of a larger plan to build a total of 1,600 megawatts of offshore wind, enough to power about one-third of the homes in Massachusetts and meet about ten percent of MA energy demand, according to the Massachusetts Clean Energy Center. This bidding process was mandated by a law backed by UCS and signed by Governor Baker in 2016. The law requires Massachusetts utility companies to conduct a competitive bidding process and thereafter enter into long-term contracts with offshore wind companies to purchase the power they generate. The long-term contracts provide a guaranteed revenue source for these projects, which makes it possible for them to obtain financing for the sizable upfront capital costs, which could exceed $1 billion.

In addition to this, Rhode Island announced today its intent to enter into a long-term contract with another bidder, Deepwater Wind, for an additional 400 megawatts of wind energy to be built in an adjacent area.

The benefits of these projects are enormous. At the full 1,600 MW build-out, these offshore wind projects will reduce Massachusetts greenhouse gas emissions by 2.4 million tons per year, roughly a fifteen percent reduction in emissions from electricity consumption, according to the Massachusetts Clean Energy Center. This large-scale generation will also help Massachusetts replace its aging power plants, such as the Pilgrim Nuclear Power Station, which is scheduled to close in 2019, with clean renewable energy. These projects will also help ensure that Massachusetts does not continue along its current path of over-relying on natural gas, which now accounts for about two-thirds of MA electricity consumption.
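Those two figures imply a rough baseline worth making explicit (a back-of-the-envelope consistency check; the baseline is inferred, not an official Clean Energy Center number):

```python
# Rough consistency check of the figures quoted above.
# 2.4 million tons avoided per year is said to be ~15% of emissions
# from Massachusetts electricity consumption.
reduction_tons = 2.4e6    # tons of CO2 avoided per year at full build-out
reduction_share = 0.15    # ~15% of electricity-sector emissions

implied_baseline = reduction_tons / reduction_share
print(f"Implied annual emissions from MA electricity use: "
      f"{implied_baseline / 1e6:.0f} million tons")  # about 16 million tons

# Per-megawatt figure for the 1,600 MW build-out:
print(f"Avoided CO2 per MW: {reduction_tons / 1600:,.0f} tons/year")  # 1,500
```

In other words, the quoted numbers are consistent with an electricity sector emitting on the order of 16 million tons of CO2 per year.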

This offshore wind industry will also be a colossal job creator. The Clean Energy Center estimates that 1,600 MW of offshore wind will generate approximately 7,000-10,000 construction jobs over the next 10 years, and once the turbines are built, hundreds more jobs thereafter in operations and maintenance. It is estimated that the "ripple effects" of this additional employment will add between $1.4 and $2.1 billion to the economy as the workers employed by these projects spend money on other goods and services.

Note that these are direct jobs from construction and operation. These projections do not include the potential that turbine manufacturers will locate in Massachusetts or Rhode Island to manufacture components of the offshore wind arrays. Yet the cost of shipping giant wind turbines manufactured elsewhere is so high that it seems inevitable that some components will be manufactured close by. And there are attractive sites for this enterprise, such as the Brayton Point power plant site, which once housed a coal-burning plant but is now a vacant, industrially zoned property on Buzzards Bay with a direct water connection to the offshore wind areas. If just 25% of the components are manufactured locally, this could add thousands more good-paying jobs.

What about cost?

Massachusetts has not released the estimated cost of the accepted bid, so the price will not be known until a contract is negotiated and submitted to the Department of Public Utilities for approval. However, the contract price is likely to reflect the remarkable worldwide decline of offshore wind costs due to technology innovation and economies of scale.

To put costs in perspective, one of the first US offshore wind contracts was for the Cape Wind project in Nantucket Sound (more on that below). The Cape Wind developer planned to use 3.6 megawatt turbines, and the price of the power would start at 17 cents per kilowatt-hour and escalate by 3% every year for 15 years. This price was well above the market price for power generally, and well above the price of onshore wind.
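To see what that escalator compounds to (an illustrative sketch from the quoted terms, not figures from the contract itself):

```python
# Cape Wind's contracted price path: 17 cents/kWh in year 1,
# rising 3% in each subsequent year of the 15-year term.
start_price = 0.17   # $/kWh in year 1
escalator = 0.03     # 3% annual escalation

prices = [start_price * (1 + escalator) ** year for year in range(15)]

print(f"Year 1:  {prices[0] * 100:.1f} cents/kWh")    # 17.0
print(f"Year 15: {prices[-1] * 100:.1f} cents/kWh")   # ~25.7
print(f"15-year average: {sum(prices) / len(prices) * 100:.1f} cents/kWh")  # ~21.1
```

So the contract price would have averaged roughly 21 cents per kilowatt-hour over its term, ending near 26 cents, which is why the 7-8 cent European contracts cited below mark such a dramatic decline.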

But in recent years, offshore wind projects in Europe have utilized much bigger and more efficient turbines, as large as 8.8 MW, and companies such as General Electric are developing turbines as large as 12 MW. As a result, new projects in Europe are entering into contracts for as low as 7-8 cents per kilowatt hour.

While we are not likely to see prices that low, as we lack the supply chain and trained workforce that Europe has developed over the last twenty years, it is highly likely that this project will take advantage of these much larger turbines and achieve significant economies of scale and price reduction as a result.

What’s next?

The next step is for Vineyard Wind to negotiate long term power purchase contracts with Massachusetts utilities, and for Deepwater Wind to negotiate with Rhode Island, processes that will likely take a number of months. In Massachusetts, the contract will then be submitted to the Department of Public Utilities, which will hold a public process and ultimately determine whether the contract is cost-effective and meets other statutory criteria. While this is occurring, the project developers will need to secure the federal, state and local permits needed to construct the wind turbines, transmission lines, and other equipment. It is hoped that this process can go forward expeditiously, so that the projects can take the necessary steps to qualify for at least a portion of federal tax incentives that will phase out by 2020.

But here is the best part: on top of the 1,200 MW approved by Massachusetts and Rhode Island, other states in the region have similar plans. New York is planning on 800 MW of project solicitations over this year and next, and is aiming for 2400 megawatts in total, and New Jersey has just announced plans for 3500 MW.

As one writer has observed, “commitments from northeastern states total 7,500 MW at a minimum, with more expected to follow. That’s enough critical mass to attract numerous bidders and create the foundation for an industry here in the U.S.”

A personal note

While I am always excited when states advance clean energy, this step forward is particularly sweet for me personally. Before I took the helm at UCS, I worked in the administration of Massachusetts Governor Deval Patrick, and under his leadership we laid the groundwork for offshore wind by building a wind test blade center in Boston and a marine terminal in New Bedford, passing legislation to authorize long term contracts for offshore wind energy, and working with Rhode Island and the federal government to designate appropriate offshore sites.

But Massachusetts’ first project—the Cape Wind project in Nantucket Sound—crashed and burned, primarily because of the unrelenting and well-financed litigation brought by well-to-do homeowners on Nantucket Sound who did not want to see wind turbines five miles offshore. When the Cape Wind project died, I feared it might be a very long time before another viable project would come along.

So, I am particularly heartened that Massachusetts and Rhode Island have kept at it, and that technology has improved to allow much larger, more cost-effective projects to be built farther offshore. Massachusetts and Rhode Island, all of the Northeast, and the entire country will benefit in ways we can barely foresee from this big, bold, and exciting new clean energy industry.


Testimony Reveals the Real Controversy over Census Data and Voting Rights

UCS Blog - The Equation (text only) -

On Friday, May 18, the acting head of the Department of Justice’s Civil Rights Division repeatedly refused to answer questions about his role in the Justice Department’s December 2017 request to the Department of Commerce to add a new citizenship question to the Decennial Census.

John M. Gore, who refused to show up to a May 8 hearing, claimed that he would not “make any statements today beyond those in the Department’s letter (requesting the question) or other publicly available information.” Gore claimed that his silence was required by “longstanding department policy” against discussing litigation outside of court, referring to the four lawsuits that have been filed against his department on behalf of numerous states and voting rights organizations. These lawsuits seek to prevent the addition of the citizenship question, given its anticipated negative impact on the quality of the Census enumeration, which only takes place once every ten years.

Representative Elijah Cummings (D-MD) lost his patience early on, shouting at Gore: “I asked you did you talk to your boss! You mean you’re going to tell me that you can’t answer a question as to whether you talked to your boss who we pay?” At one point, Representative Carolyn Maloney (D-NY) moved to subpoena Gore to answer these questions, but a Republican motion to block the subpoena passed on a party-line vote of 22 to 15.

In addition to entertaining several questions about hypothetical registration and voter fraud (which has been demonstrated, in court, to be nearly non-existent), Gore did at least acknowledge the actual scientific controversy at the heart of the Justice Department’s justification for the question. On the one hand, he acknowledged that the enforcement of the Voting Rights Act, passed in 1965, has never depended on the use of Census citizenship enumeration data directly. Indeed, he even acknowledged that there has NEVER been a public challenge brought under the VRA that was dropped due to inadequate data on racial voting patterns.

On the other hand, Gore did reference a private litigation case in Texas where the party was unable to move forward due to a lack of adequate data from the American Community Survey (ACS), which provides population estimates of the Citizen Voting Age Population (CVAP). However, he did not reference the case in his submitted testimony. Nevertheless, the claim is clear: census enumeration data is required for the VRA because in small, sparsely populated districts, such as rural school districts, the margin of error associated with ACS population estimates may be too large to support statistically valid inferences about those populations.

Survey samples like the ACS draw distributions of individuals that are as random as possible, which yields estimates of population characteristics (e.g., a district is 49% eligible African-American voters) with a margin of error that accounts for sampling inaccuracies (say, plus or minus 3% for a sample of about 1,000, so that the actual value is nearly always between 46% and 52%). In a census, we attempt to count the entire population, but any uncounted individuals (undercounts) distort the data, leaving us with a less accurate picture of the actual universe of individuals. Samples, by their nature, may have less precision, but census counts, especially among hard-to-reach populations, can be less accurate, and it is more difficult to correct for undercounts.
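The margin-of-error numbers above follow from the standard formula for an estimated proportion, z * sqrt(p(1-p)/n) (a textbook calculation offered here for illustration, not one drawn from the testimony):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for an estimated proportion p from a simple
    random sample of size n (z = 1.96 for 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

# A district estimated at 49% eligible African-American voters,
# based on roughly 1,000 sampled respondents:
print(f"+/- {margin_of_error(0.49, 1000):.1%}")  # about +/- 3.1%, i.e. ~46% to ~52%

# The same estimate from only 100 respondents -- the sparsely
# populated rural-district problem:
print(f"+/- {margin_of_error(0.49, 100):.1%}")   # about +/- 9.8%
```

This is the crux of the small-district concern: as the sample shrinks, the margin of error can grow large enough to swamp the very difference a court needs to detect.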

This point was driven home by Justin Levitt, Loyola Law School professor and former attorney at the same division of the Department of Justice where Gore serves, who did show up on May 8 when he was invited to testify before the government oversight committee. Levitt’s testimony included an assessment of actual cases, public and private, where he too found one Texas case, Fabela v. City of Farmers Branch, Texas, where ACS data was not itself up to the task of validating a VRA violation claim. However, Levitt demonstrated that complementary, well-tested and judicially accepted techniques were used to assess the claims. More importantly, Levitt describes how the goal of VRA population data

is not to definitively predict the precise vote count in a future election based on ironclad certainty about an individual’s voting preferences based on her race or ethnicity, and her propensity to register or turn out to vote for a particular candidate. Instead, the purpose of the analysis is to determine whether past voting behaviors generally indicate that racial or language minority communities would vote similarly most of the time, and whether they would be likely presented with effective equitable electoral opportunity more often than not.

And here we get to the heart of the question: the evidence that we need a citizenship question on the Census short form, which is sparse, must be weighed against the evidence about the negative impact that adding such a question might have on the accuracy of the Census count. That evidence is compelling. Indeed, we know from Census analysis that Latino populations are already undercounted, so the addition of a question that would further reduce response rates among legal immigrant residents will create artificially low population estimates of VRA-protected groups, making it more difficult to identify and remedy VRA violations.

The addition of a citizenship question is far more likely to inhibit the successful trial of VRA cases, by increasing the inaccuracy of the Census, than it is to improve the assessment of VRA claims due to greater precision. Even Thomas Brunell, once a candidate to direct this Census, recently acknowledged that the administration is not making a scientific, but “a political decision.”  For these reasons, and for all of the reasons provided by all of the past Census directors, the scientists, the civil rights advocates, and all those who have dedicated their lives to advancing both science and sovereignty in our democracy, we have a mutual obligation to protect the scientific integrity of the Census. You can do your part by urging Congress to adopt the 2020 Census Improving Data and Enhanced Accuracy (IDEA) Act, legislation that would protect the accuracy of the 2020 census and ensure that any proposed changes to the count are properly studied, researched, and tested.

The Senate Should Oppose the New Low-Yield Trident Warhead

UCS Blog - All Things Nuclear (text only) -

This week, the Senate Armed Services Committee will take its turn to mark up the FY 2019 National Defense Authorization Act (NDAA). This also gives it an opportunity to weigh in on the Trump administration’s proposal for a new, lower-yield warhead for the Trident D5 submarine-launched ballistic missile (SLBM), funding for which is included in the bill.

The new warhead, designated the W76-2, would reportedly have a yield of 6.5 kilotons and would replace some of the W76 warheads currently carried on Trident missiles, which have a yield of 100 kilotons.

The NDAA as it is now written would authorize $88 million in spending for the new warhead: $65 million from the Department of Energy’s National Nuclear Security Administration’s budget and $23 million in Department of Defense funds. The House Armed Services Committee earlier this month voted along party lines to reject an amendment that would have eliminated funding for the program from its version of the bill.

Despite the administration’s rhetoric about the need to strengthen deterrence, there is no good reason to develop a new warhead. As the head of the US Strategic Command, General Hyten, said himself in Congressional testimony earlier this year, “I have everything I need today to deter Russia from doing anything against the United States of America.” Worse, as many experts have pointed out, the new warhead could cause confusion for Russia and potentially increase the chances of miscalculation leading to an escalating nuclear exchange. Former Secretary of Defense William Perry has called such low-yield weapons “a gateway to a nuclear catastrophe.”

Opposition to this new program may be stronger in the Senate than in the House. It is certainly ripe for debate, given the dangers it presents and the questionable rationale the administration has put forward for it. To help make the case, more than twenty NGOs sent a letter to Senate Majority Leader Mitch McConnell that lays out the arguments against a new lower-yield Trident warhead.

It is unlikely that the Senate, in its current configuration, will stop the program, but it is important at the very least to ask the relevant questions about why we need such a weapon (we don’t) and how it would really affect US security (by decreasing it).

If You Can’t Censor It, Bury It: DOI Tries to Make a Stark New Study on Rising Seas Invisible

UCS Blog - The Equation (text only) -

Cape Lookout National Seashore, North Carolina. Photo: NPS

A new National Park Service (NPS) report is unequivocal that human-caused climate change has significantly increased the rate of sea level rise that is putting coastal sites at risk. But the study is difficult to find on the web and the report’s lead author, Maria Caffrey of the University of Colorado, says she had to fight to keep many scientific statements about climate change in the final version.

The report, Sea Level Rise and Storm Surge Projections for the National Park Service, was published late on Friday, May 18, with no official announcement or accompanying press release – indeed, with no easy way to find it unless you know where to look (hint: it’s here…tell your friends). The report has been several years in the making, and was delayed for several weeks after a draft showing edits removing mentions of human-driven climate change emerged and was reported by Reveal. In the wake of these revelations, Department of Interior (DOI) Secretary Ryan Zinke was questioned about the changes by House Democrats Chellie Pingree (Maine) and Betty McCollum (Minnesota) in a House Appropriations subcommittee soon after the controversy broke in April. Asked about the report by Pingree, Zinke responded: “If it’s a scientific report, I’m not going to change a comma.”

Since then, the references to human-caused climate change and climate attribution that had been proposed for deletion have been restored. What we now have in the public domain, at last, is a hugely important and detailed analysis of how projected future sea levels and storm surges may affect 118 US national parks. The findings are quite dramatic.

Dozens of US parks at risk from flooding and inundation

The report identifies dozens of famous and iconic sites including Virginia’s Historic Jamestowne and Assateague Island, Big Thicket National Preserve in Texas, the Florida Everglades and Jean Lafitte National Historic Park in New Orleans, as especially vulnerable. Several of the sites at risk were also identified by the Union of Concerned Scientists (UCS) in its 2014 report “Landmarks at Risk”, which built on previous NPS climate impacts research. Nationally, the new analysis shows that the highest average rate of sea level change by 2100 is projected for the National Capital Region, which puts sites on the Potomac River, and in and around the National Mall at risk.

Simulation of flooding from a category three hurricane striking Theodore Roosevelt Island, Washington DC. Credit: NPS

The highest total sea level rise by the end of the century is expected on the coastline of the Outer Banks, threatening Wright Brothers National Memorial, Fort Raleigh, and Cape Hatteras, and the broader Southeast Region is expected to see the highest storm surges in the future. National parks on Caribbean and Pacific islands are at risk too, including in Puerto Rico and the US territories of Guam, American Samoa, and the US Virgin Islands.

Parks must plan for worse storms & floods

Using Intergovernmental Panel on Climate Change (IPCC) sea level rise scenarios and National Oceanic and Atmospheric Administration (NOAA) data, the report also looks at how increased rates of sea level rise will interact with increasing hurricane intensity to worsen storm surges. When Hurricane Sandy hit the East Coast in 2012, its storm surge caused widespread flooding throughout the region. That storm surge rode in on seas about 12 inches higher than in the pre-industrial period, due primarily to warming oceans and melting land ice. Further analysis found that sea level rise added $2 billion to the damages from Hurricane Sandy in New York City. According to the NPS, Hurricane Sandy caused in excess of $370 million in damage to national parks. The costs of 2017’s hurricanes Harvey, Irma, and Maria to America’s parks have not yet been fully tallied, but they will be large.

The authors of the new report recommend that because of the likely intensification of hurricanes, park managers should base planning on impacts likely from storms at least one storm category higher than any storm that has previously hit their particular park unit. According to the report “When this change in storm intensity (and therefore, storm surge) is combined with sea level rise, we expect to see increased coastal flooding, the permanent loss of land across much of the United States coastline, and in some locations, a much shorter return interval of flooding”. A suite of detailed storm surge maps for 54 sites has been posted on the NPS Coastal Adaptation page on Flickr.

Flood projection for a category 3 hurricane at high tide, Boston Harbor Islands, Massachusetts. Credit: NPS

A win for science and scientific integrity. This time.

The new NPS sea level rise analysis and storm surge maps represent a huge leap forward in terms of the tools that park managers, especially in some of the more remote locations, have available to them to assess the vulnerability of sites, and prioritize planning for resilience. It builds on a growing body of policy- and management-relevant climate science that the NPS’s Climate Change Response Program has been developing over the last decade. This work continues to keep the US at the cutting edge of international efforts to understand and manage climate impacts on cultural and natural heritage, and protected areas. It’s a pity that the DOI seems to be doing everything it can to make this report invisible, and that some of the climate scientists involved had to fight so hard to maintain the scientific integrity of their work.

After the study was published, report author Maria Caffrey told journalist Elizabeth Shogren that the fight will have been “worth it if we can uphold the truth and ensure that scientific integrity of other scientists won’t be challenged so easily in the future…”. For the sake of our treasured national parks, and the dedicated staff who look after them, let’s all say Amen to that.

Did EPA Consult With The Chemical Industry While Working To Suppress A Scientific Study On PFAS?

UCS Blog - The Equation (text only) -

Today, members of the House Committee on Energy and Commerce sent a letter to EPA requesting more information about a meeting with an industry trade group, the American Chemistry Council (ACC), attended by Richard Yamada, the Deputy Assistant Administrator for the Office of Research and Development.

The letter and subsequent reporting (paywalled) are based on additional documents obtained by the Union of Concerned Scientists through a Freedom of Information Act request last month. EPA subsequently took those documents down, an action similar to what happened with some of our other public records requests.

POLITICO reports:

Top House Democrats are raising concerns about a meeting between one of EPA Administrator Scott Pruitt’s top aides and representatives of the chemicals industry one day after a White House official raised alarm about a study of contaminants that has been stalled for months.

The American Chemistry Council represents companies that could face more expensive cleanup requirements if the HHS study were finalized, and the trade group appears to have had the ear of a top EPA official when it was being discussed internally, the House Democrats said.

A meeting titled “ACC Cross-Agency PFAS Effort” appears on the Jan. 31 calendar for Richard Yamada, EPA’s deputy assistant administrator for research and development. The calendar was obtained by the Union of Concerned Scientists under the Freedom of Information Act and cited by the Democrats in their letter to Pruitt Monday. One day earlier, Yamada and other EPA officials had received an email from the White House seeking to delay publication of the health study poised for release by HHS that would have increased warnings about certain PFAS chemicals.

A former staffer for the anti-science chairman of the House Committee on Science, Space, and Technology, Yamada attended a meeting with the ACC to discuss EPA’s cross-agency efforts to address PFAS. As we chronicled in 2015, the ACC has a history of obstructing stronger science-based public health protections from harmful chemicals and has frequently used tobacco industry tactics to pressure policymakers. An ACC spokesman confirmed the meeting to POLITICO but said that the suppressed PFAS study (also discovered by a UCS public records request) was not discussed.

The meeting, which occurred on January 31, was held the day after the now infamous “public relations nightmare” email was sent by an unnamed White House staffer.

The letter from members of the House Energy and Commerce Committee is the latest in a string of oversight letters related to the potential suppression by the White House and EPA of a key health assessment that is being conducted by the Agency for Toxic Substances and Disease Registry. Late last week, Representatives Brendan F. Boyle and Brian K. Fitzpatrick led another bipartisan letter demanding the release of the ATSDR study on the human health effects of PFAS chemicals.

Tomorrow, EPA is convening a national summit to discuss PFAS and the issues that states and communities are facing around the country. Unsurprisingly, one of the scheduled speakers is Jessica Bowman, an ACC attorney, who is talking first thing in the morning. Until a story ran in The Intercept, EPA had failed to invite any community organizations or members to attend. After the reporting, however, EPA invited Andrea Amico, founder of Testing for Pease.

It remains unclear whether press will be able to attend, and according to the summit website, it appears as though the public can only view parts of the meeting online. Hopefully though, the agency will use tomorrow’s meeting as an opportunity to commit vital resources and concrete next steps to help remove these toxic chemicals from our environment.

Shareholders Not Playing Games at Big Oil Annual General Meetings

UCS Blog - The Equation (text only) -

Major fossil fuel producers are holding their annual general meetings (AGMs) this month amid mounting pressure from investors, increasing risks of legal liability for climate damages, and heightened scrutiny of their lobbying and public policy advocacy. BP and Royal Dutch Shell host their AGMs this week; ExxonMobil and Chevron will follow next week.

If shareholder meetings were classic game shows, and investors were keeping score, fossil fuel companies would be coming up short.

Investors demand Truth or Consequences about a 2°C world

Investors with a combined total of more than $10 trillion in assets under management are demanding that major oil and gas companies demonstrate support for the Paris Climate Agreement. In an open letter published in the Financial Times, fund managers including Aberdeen Standard Investments, BNP Paribas Asset Management, Fidelity International, and HSBC Global Asset Management Ltd. urged companies to be more transparent about how they are planning for a world in which global temperature increase is kept well below 2 degrees Celsius (2°C).

Specifically, the signatories called on oil and gas companies to make tangible commitments to reduce their carbon emissions substantially, consider the impact of emissions from the use of their products, and clarify how their investments are consistent with a 2°C world. (Check out this new report by Carbon Tracker and my recent blog highlighting these and other unanswered questions in ExxonMobil’s and Chevron’s reports on their plans for a 2°C world).

At tomorrow’s AGM, Shell faces a shareholder resolution by the Dutch organization Follow This calling on the company to set and publish targets that are aligned with the Paris Climate Agreement’s well below 2°C goal. Similar resolutions have received small but growing support from Shell shareholders over the past two years.

The United Kingdom responsible investment charity ShareAction notes that Shell’s announced ambition to reduce its carbon footprint is not a target, is not aligned with the goals of the Paris Climate Agreement, and would allow the company’s absolute emissions to increase. ShareAction therefore encourages investors who publicly support the Paris Climate Agreement to vote in favor of the Follow This resolution.

Although BP shareholders did not vote on any climate-related proposals this year, global warming was nonetheless a hot topic at the company’s AGM. As I observed in a Twitter thread about BP’s annual report, the company is wisely investing in renewables like wind and solar—but it is also upping natural gas production, and its strategy does not align with the Paris Climate Agreement’s well below 2°C goal or meet the expectations set out in the investor open letter.

(Legal) Jeopardy: Climate liability lawsuits gain momentum

One thing BP failed to mention in its annual report was the rising tide of climate liability litigation. (King County, Washington, filed the most recent lawsuit against BP and other major fossil fuel producers, seeking to hold them accountable for “knowingly contributing to climate disruptions” and putting residents “at greater risk of floods, landslides, ocean acidification, sea level rise, and other impacts.”)

BP’s omission is particularly glaring in light of the recommendations issued last year by the Task Force on Climate-Related Financial Disclosures (TCFD) calling for consistent, comparable, and timely disclosures of climate-related risks and opportunities in public financial filings. In stark contrast, several of BP’s peers—including Shell, ConocoPhillips, and Peabody Energy—explicitly mentioned lawsuits filed by US municipalities as a shareholder risk. Peabody Energy did so despite a court ruling (appealed by the plaintiffs) that the company is shielded from liability by its bankruptcy filing. Shell may be sued in the Netherlands if it fails to align its business model with the Paris Climate Agreement.

This week, a federal judge in California will hear oral arguments in the fossil fuel companies’ motion to dismiss the lawsuit brought by San Francisco and Oakland over sea level rise brought on by climate change. Communities across the country and around the world that are struggling with enormous costs of climate damages and adaptation will be closely watching the ruling in this case.

Let’s [Not] Make a Deal with industry groups on climate change

There have been significant advances in fossil fuel company transparency about climate lobbying since last year’s AGMs. ConocoPhillips has expanded its disclosures of lobbying and other public policy advocacy following dialogues led by Walden Asset Management, and in response to shareholder resolutions that won substantial support in recent years—as well as recommendations in UCS’s Climate Accountability Scorecard. Valuable updates to the company’s website include an explanation of board and senior management oversight of lobbying, details on lobbying priorities and grassroots lobbying, and easily accessible information on lobbying expenditures.

This move by ConocoPhillips followed a report by BHP Billiton Limited on the material differences between the company’s positions on climate and energy policy and the advocacy positions on climate and energy policy taken by industry associations to which it belongs. BHP took action based on its review, severing ties with the World Coal Association. ConocoPhillips, BP, and Shell have all left the American Legislative Exchange Council (ALEC) in recent years, with Shell explicitly citing ALEC’s stance on climate science as the reason for its departure.

However, inconsistency between major oil and gas companies’ stated positions on climate change and those taken by their trade and industry groups remains a serious concern for investors. BP, Chevron, ConocoPhillips, ExxonMobil, and Shell all maintain leadership positions in trade associations and other industry groups that spread disinformation on climate change.

  • All five companies are represented on the board of the American Petroleum Institute (API)—notorious for its 1998 memo outlining a roadmap for climate deception, claiming that “victory will be achieved when…average citizens ‘understand’ (recognize) uncertainties in climate science.” API was warned about global warming as early as 1959.

  • BP, ConocoPhillips, ExxonMobil, and Shell are represented on the board of the National Association of Manufacturers (NAM), and Chevron is a NAM member. The NAM website is virtually silent on climate change, downplaying the issue even as NAM’s Manufacturers’ Accountability Project (MAP) attacks climate scientists and communities aiming to hold fossil fuel companies accountable for climate damages attributable to their business.

Following last week’s ConocoPhillips AGM, I received a vague response to a question I posed about what the company is doing to ensure that the public policy positions of NAM, API, and the US Chamber of Commerce are aligned with its own and environmentally responsible. To be a leader on transparency, ConocoPhillips ought to provide examples of what the company considers a “reasonable compromise” with industry groups on climate change—and explain how such dealmaking squares with its own climate policy priorities.

Shareholders are raising similar questions with BP and Shell about their leadership roles in NAM, API, and the Western States Petroleum Association (WSPA).

And next week, Chevron and ExxonMobil shareholders will vote on proposals for annual reporting on direct and indirect lobbying and grassroots lobbying communications. The filers of the ExxonMobil resolution, led by the United Steelworkers of America, are urging shareholders to vote yes. The proponents highlight a “Trade Association Blind Spot,” pointing out that ExxonMobil does not disclose its trade association memberships, trade association payments, or the portions of those payments used for lobbying. Arguing in favor of the resolution, the 26 co-sponsors “remain concerned that inadequate state lobbying and trade association disclosure by ExxonMobil presents significant risks to the corporation’s reputation.”

I always hated The Gong Show, but it would be tremendously satisfying to see the public and investors gong Big Oil CEOs off the stage if they continue their pathetic performance on climate change—failure to plan for a carbon-constrained world, failure to disclose climate risk, failure to renounce climate deception.


Closing North Korea’s Nuclear Test Site

UCS Blog - All Things Nuclear (text only) -

Of the surprising announcements North Korea has made in recent weeks, one of the most surprising was its statement that it would not only end nuclear tests but shut down its nuclear test site with international observers watching.

What should we make of this?

Pyongyang said it would allow journalists from the United States, Russia, Britain, and South Korea to watch the destruction of the tunnels at Punggye-ri sometime in the coming week (May 23-25). These tunnels dug into the mountain are where North Korea conducts its nuclear tests. US intelligence says that North Korea is already dismantling the test site, and satellite photos of the site (here and here) confirm that a number of facilities at the site have already been torn down.

Punggye-ri Test Site (Source: Google Earth)

If North Korean leader Kim Jong-un is serious about limiting and perhaps eventually eliminating his nuclear and missile capabilities in return for economic engagement with the outside world, the question is how he demonstrates that seriousness. Publicly shutting down his test site is a meaningful step in the right direction and an interesting way to try to send that message.

It’s true that shutting down the Punggye-ri test site does not prevent North Korea from ever testing again. If negotiations fail or situations change in the future, it could decide to tunnel at a different site and build the required infrastructure needed to test. But it’s a meaningful and pretty dramatic action nonetheless.

For one thing, while part of the current test site is no longer usable because some tunnels collapsed after previous tests, experts agree that a couple of tunnels at the site remain usable. They also agree that disabling the facilities would take time to reverse—perhaps months or longer.

This reminds me of North Korea’s decision in 2008 to disable its nuclear reactor at Yongbyon by blowing up the cooling tower and letting foreign reporters film the event. This was at a time when negotiations with the United States seemed to be moving ahead. A few years later, after negotiations had stalled, Pyongyang built a new cooling system and was able to restart the reactor. But disabling the reactor was still a meaningful action, since it kept the reactor from operating for several years.

What’s next?

North Korea’s statements last week raised the possibility that Kim was walking back his various offers. Yet Kim’s criticism was focused on statements by John Bolton and others about the need for the North to denuclearize as an early step of negotiations. This is an approach Pyongyang has consistently rejected, calling instead for a step-by-step process that helps build the trust needed for additional steps.

President Trump’s subsequent statement disavowing this so-called “Libyan model” of disarmament seemed intended to help repair the situation, but his later statement that appeared to threaten destruction of North Korea if talks failed could have exactly the opposite effect and lead Kim to cancel or delay the talks. In the meantime, China has urged Pyongyang to continue with the talks.

So whether or not the summit will proceed as planned remains uncertain. An important indicator will be whether North Korea goes ahead with destroying tunnels at its test site this week.

Wait—Offshore Wind Offers HOW Much Power? Use This Calculator…

UCS Blog - The Equation (text only) -

Credit: Derrick Z. Jackson

Almost every week is bringing news about another step forward somewhere in the country for America’s newest renewable energy, offshore wind. Increasingly, the news is about advances for specific projects off our shores.

But when we hear about an offshore wind project of a certain size—X hundred megawatts—what does that mean? What does it mean in terms of our electricity needs, for example, or our need to cut pollution, or our potential to do more?

A simple new calculator from the Union of Concerned Scientists can help you size up each offshore wind project.

What Would an Offshore Wind Project Mean?

[Interactive offshore wind calculator embedded here in the original post.]

The inputs

Here’s the deal: When you hear about a proposed offshore wind farm, the project size is likely to be expressed in terms of megawatts—its nominal capacity, or maximum power output, based on the rated output of each wind turbine at its design wind speed.

Credit: J. Rogers

How many turbines that proposed project will involve depends on the capacity of each individual turbine (also expressed in megawatts). That math isn’t complicated.

How much electricity an offshore wind project’ll generate is a little more complicated, depending mostly on where the turbines will be—what kind of wind resource the turbines will have access to. That’ll vary by state, and even within a given coastal area.

But with a few simplifying assumptions and estimates, you can get ballpark figures for what the project will mean in terms of the energy generation/production, the benefits it will provide such as avoided carbon emissions, and the area it will occupy.

The outputs

What the new tool offers when you put in those few inputs (state, project size, turbine size) can tell you something about what we’re likely to get out of the project you’re assessing.

Energy equivalent – The electricity expected from a project can be thought of in terms of the number of household equivalents it could power. Not actual households, since it takes a mix to make sure we’ve got power ‘round the clock, but how the energy produced matches up with the amount of electricity a typical household uses.

Average household electricity use varies by region and state, based on things like the climate and state energy efficiency efforts. So a given amount will go further in some places than in others.

Pollution reduction – And then there are the air quality benefits of projects. Megawatt-hours of offshore wind generation will displace megawatt-hours of generation from land-based power plants in the region. What an offshore wind electron displaces depends on what’s “on the margin” at a given moment—usually the next-cheapest power source that doesn’t get turned on because offshore wind is doing its thing instead.

If those displaced sources are coal, oil, or natural gas power plants, which will often be the case, the offshore wind power will help us avoid the pollution that those plants would otherwise emit. Avoiding that pollution brings important health and environmental benefits.

This simple calculator focuses on carbon dioxide. And it puts the result in terms of number of car equivalents—what that CO2 pollution reduction would be like in terms of the carbon pollution that comes from a typical car in a typical year, given the US auto fleet and American driving habits.

Leases and lessees – some done, more to come (Source: BOEM 2018)

Lease area potential – In general, the areas most ready for offshore wind projects are in the existing federal leases on the US Outer Continental Shelf off our nation’s East Coast. The federal government, using robust public stakeholder processes (as in Massachusetts), identified various offshore wind lease areas. It auctioned off the leases, and a range of companies won the rights to put up turbines in those areas. There are more than a dozen such leases so far, from North Carolina up to Massachusetts. (And more are on the way.)

Given that, you can think about a project in terms of how much of that state’s existing lease area it’s likely to take up, and how much room it leaves for more offshore wind power.

Using the calculator

To ground all this in (projected) reality, here’s an example for you to try: Let’s imagine a 400 megawatt wind farm off Massachusetts (and at this point in the process that doesn’t require much imagination), and imagine 8-megawatt wind turbines. So:

  1. Click on Massachusetts on the calculator’s map.
  2. Use the sliders or right-left arrows to get to 400 megawatts for the project size.
  3. Pick 8 megawatts for the turbine size.
  4. Check out the results.
    • For number of turbines, you get 50.
    • For number of households whose total energy consumption would match what the project would produce, you’d get something like 230,000.
    • The avoided CO2 pollution would be equivalent to taking some 90,000 cars off the road.
  5. Check out how much—or how little—of the existing Massachusetts lease areas a project like that would use up: 6%.
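If you’d rather see the arithmetic than click through, here is a rough sketch of what the calculator is doing for that Massachusetts example. The constants are assumptions on my part—the capacity factor and turbine density come from the technical notes at the bottom of this post, while the household use, emission rate, and per-car figure are typical published EIA/EPA values, not necessarily the tool’s exact inputs.

```python
# Rough re-creation of the calculator's arithmetic for a 400 MW
# Massachusetts project with 8 MW turbines. All constants below are
# illustrative assumptions, not the tool's exact inputs.
import math

project_mw = 400          # proposed project size
turbine_mw = 8            # rating of each individual turbine
capacity_factor = 0.475   # NREL midpoint for Massachusetts waters

# Number of turbines: simple division, rounded up
turbines = math.ceil(project_mw / turbine_mw)

# Expected annual generation, in megawatt-hours (8,760 hours/year)
generation_mwh = project_mw * capacity_factor * 8760

# Household equivalents (assume ~600 kWh/month for a MA household)
household_mwh_per_year = 0.6 * 12
households = generation_mwh / household_mwh_per_year

# Avoided CO2 in typical-car equivalents (assumed regional average
# rate of ~0.24 t CO2/MWh displaced; EPA's ~4.6 t CO2 per car-year)
avoided_t_co2 = generation_mwh * 0.24
car_equivalents = avoided_t_co2 / 4.6

# Project footprint at NREL's 3 megawatts per square kilometer
footprint_km2 = project_mw / 3

print(f"{turbines} turbines, ~{households:,.0f} households, "
      f"~{car_equivalents:,.0f} cars, ~{footprint_km2:.0f} km^2")
```

With those assumed constants, the sketch lands close to the calculator’s rounded answers: 50 turbines, roughly 230,000 households, on the order of 90,000 cars, and about 133 square kilometers of ocean.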

At the bottom of this post are details about the calculator and calculations.

The scale of things to come: Offshore wind blades, and a sample of the people behind it all (Credit: Derrick Z. Jackson).

More results

Other results from new offshore wind are equally important, but harder to quantify simply at this early stage in the technology’s history in this country. Those include employment and ecosystem effects.

Jobs – A big reason for offshore wind power’s popularity right now is its tremendous potential for job creation, in manufacturing, project development, installation, maintenance, finance. Think welders, pipefitters, electricians, boat crews, and a whole lot more.

And the vision is not just jobs, but careers, as single projects pave the way for multiple tranches and, eventually, a whole US offshore wind industry—one big enough to sustain not just individual projects but all the soup-to-nuts pieces that go along with them.

In Europe, the offshore wind industry is 75,000 workers strong. Estimates for US jobs draw on assumptions about how big the American market will get, and how quickly, and what that means for how many of the jobs end up here, instead of overseas. A 2015 US Department of Energy study found that going to 22,000 megawatts by 2030 could mean 80,000 American jobs by that year. A study for various Northeastern states looked at 4,000 to 8,000 megawatts of offshore wind development in the region, and projected full-time equivalent jobs in a given year of up to 36,000.

Proceed, but with caution (Credit: Derrick Z. Jackson).

Ecosystems – The results of an offshore wind farm in terms of our offshore ecosystems depend on the care taken in planning, siting, installing, operating—and, eventually, decommissioning—of the project. Offshore wind’s potential to cut carbon pollution can help reduce the impacts of climate change—including important ones for our oceans and marine ecosystems. But additional activity and infrastructure in the marine environment can have direct impacts that need careful consideration.

One concern is marine mammals, and particularly, on the Eastern seaboard, the critically endangered North Atlantic right whale. Project developers have to be careful to not add to the right whale’s troubles.

For fish, once a project is in place, the bases for the offshore wind towers can be problematic for some species, and a boon for others, as they can act as artificial reefs and change the environment.

Where jobs and fish come together is in the fishing fleet. Results, positive and negative, will depend on things like any limitations on boat travel in the project area during construction, and any boost to fish stocks from the project once it’s installed. While commercial fishers may view projects differently from how recreational ones do, at least some fishers are finding the US’s first offshore wind farm, off Rhode Island’s Block Island, to be a plus (and there’s this upbeat assessment from the University of Delaware and the American Wind Energy Association).

Results in terms of jobs, careers, and our marine environment will be important to keep an eye on.

Technology and people (Credit: Derrick Z. Jackson)

Calculate on

In the meantime, there’s plenty we can know about with greater certainty. With the help of this simple calculator, the next time you hear of an X megawatt offshore wind project destined for a shore near you, you can let it be more than a single number. Look at what it means in terms of energy to be generated, pollution to be avoided, and lease area implicated.

To be clear: an offshore wind calculator is no substitute for the detailed wind monitoring, engineering calcs, environmental assessments, and much more that go into project proposals, investment decisions, and approval processes.

But this one just might help give some more depth for contemplating project announcements as the offshore wind industry takes off in the country. Because, beautiful as offshore wind farms seem to many of us, they’re a lot more than just a bunch of graceful kinetic sculptures.

The technical stuff

  • States – The calculator includes the eight (as of this writing) states for which the US government’s Bureau of Ocean Energy Management (BOEM) has auctioned off leases. South Carolina is working toward joining that club. Projects can also happen in state waters, as with the Icebreaker project planned for Lake Erie waters off Cleveland. The West Coast also has terrific resources, and even the Gulf Coast may get into the act at some point.

    The power on the seas (Source: NREL/Musial et al. 2016)

  • Capacity factors – To calculate electricity production, the calculator uses midpoint capacity factors from the different zones in NREL’s latest offshore wind resource potential assessment (Musial et al. 2016): Delaware (42.5%), Maryland (42.5%), Massachusetts (47.5%), New Jersey (45%), New York (45%), North Carolina (42.5%), Rhode Island (47.5%), and Virginia (42.5%).
  • Household equivalents – The calculator uses the latest figures from the US Energy Information Administration on average monthly electricity use by residential customers in the chosen state. Figures are rounded to the nearest thousand.
  • Avoided CO2 emissions – The calculator uses the average CO2 emission rate for each region, as calculated by the US EPA, and the car pollution figure from EPA’s own equivalencies calculator. Figures are rounded to the nearest thousand.
  • Project areas – Project footprint calculations are based on NREL’s assumption of 3 megawatts per square kilometer (Musial et al. 2016).
  • Lease areas – The lease area calculations for each state are based on the figures from BOEM here. For the two leases in the shared Rhode Island/Massachusetts offshore wind area, the calculator credits those amounts fully to each state; that is, it considers them to be Rhode Island’s when considering a Rhode Island project, and Massachusetts’s when looking at Massachusetts.
Photo by Derrick Z. Jackson

Now Is the Time To Halt the EPA’s Restrictions on Science

UCS Blog - The Equation (text only) -

If you have been following the news, I am sure you know by now that the EPA is proposing to restrict the science it will consider when developing new or revised health and safety protections. It may seem like a Washington game, but this proposed rule has huge implications for all of us.

For scientists, it means that much of your work may be dismissed out of hand from informing policy, because you must adhere to research ethics policies that restrict the release of private data, or because you can’t and shouldn’t sacrifice intellectual property rights at the whim of the EPA. For industry, it creates greater uncertainty around the always thorny issues concerning confidential business information. And, most importantly, for all of us, the proposal means that policies that protect our health and safety will not be based on the best available science because of inappropriate political interference.

So what can YOU do to fight back? Well, for all the political manipulation that we have been documenting at the EPA, the agency must still adhere to the law when making or changing regulations. That means the EPA must make a proposal public, accept public comments from all who wish to submit them, evaluate and respond to those comments, and then decide on the final version of the rule. And all of these actions are subject to challenge in federal court.

That means YOU can submit a comment into the public record that the EPA is obligated to consider. And now is the time! For this proposal, the comment period is only 30 days—and it’s already more than half over. It closes at the end of May (though requests have been made to extend it, so far with no response from the EPA).

How do I make a comment?

The proposed rule is complicated and somewhat confusing. It is misnamed as an action to “strengthen transparency” in the rulemaking process, but it does no such thing. To have an impact, however, your comment needs to be specific and detailed, not just a broad statement about the rule.

To help you better understand the proposed rule, we have produced a guide for commenters. The guide highlights topics for which the EPA is specifically requesting input and some of the issues you may want to consider in making your comment. It also gives you the links for submitting a comment and some suggestions for how to have the most impact.

I want to encourage scientists to include in their comments examples of specific important scientific studies and evidence that are likely to be excluded if this rule is implemented. For example, the proposal says that studies will only be considered if all raw data, computer code, models, and other materials in the study are fully publicly available.

On its face, that precludes using studies where personal confidential information is part of the “raw” data. Most Institutional Review Boards require that researchers maintain confidentiality for human subjects data. Are there studies you have been involved in, or rely on in your research, that would be excluded a priori because of this restriction?

One of the reasons it is important to cite specific studies in the record is that the public record will be important in any future legal action. Also, our political leaders are usually not fully familiar with the scientific process. They need specific examples to inform their own views. How will your work as a scientist be affected? How will community members be affected if certain public health and safety protections are not enacted based on good science?

A week of collective action

A coalition of groups including 500 Women Scientists, EarthJustice, and the Public Comment Project are joining forces to mobilize as many public comments as possible during the week of May 20-26.  This coordinated action—the National Week of Public Comments on EPA’s “Restricting Science” Policy—is part of the overall effort of Science Rising, which is working to defend science and its crucial role in public policy and our democracy more broadly. You can participate by sending in your comment and letting us know that you did.

This is still our government, our democracy, and our voices need to be heard.

A Response to Roberts and Payne

UCS Blog - All Things Nuclear (text only) -

A recent letter by Bradley Roberts and Keith Payne responds to a Japanese press account of a blog post that discussed Japanese Vice Foreign Minister Takeo Akiba’s 25 February 2009 presentation to a US congressional commission on US nuclear weapons policy. Reports of Mr. Akiba’s presentation created some controversy in the Japanese Diet, since he may have made statements that contradict the spirit, if not the letter, of a long-standing Diet resolution. That resolution, adopted decades ago and reaffirmed many times since, prohibits any transportation of US nuclear weapons into Japanese territory.

The 1969 US-Japan agreement granting the United States “standby retention and activation” of nuclear weapons storage sites on US military bases in Okinawa.

Roberts and Payne mistakenly claim the document on which the post was based does not exist, despite the fact that it was published on the website of a non-governmental Japanese arms control expert more than a month before their letter appeared in the Japan Times.

The document exists.

Roberts and Payne also claimed that because the Japanese participants were “off-the-record” no records were kept. This too is incorrect. There may be no transcript of Mr. Akiba’s presentation, but an April 10 reply by the cabinet to questions from Rep. Seiji Osaka confirmed that the Foreign Ministry kept records on the proceedings of the US commission where representatives of the ministry were present. The same reply was repeated in a document issued on April 13 by the Security Treaty Division of the North American Bureau of the Ministry of Foreign Affairs. The United States Institute of Peace (USIP) also archived documents that describe the discussions between the commissioners and the Japanese officials.

Records were kept.

Meetings are often held “off the record” to allow public officials to express their personal opinions. Rep. Osaka asked the Abe government whether the Foreign Ministry officials who participated in the proceedings of the US commission were acting in a personal or an official capacity. The April 10 reply by the cabinet confirmed that all of the Japanese officials who participated in the proceedings were acting in an official capacity under the direction of Foreign Minister Nakasone.

The three-page document Akiba presented to the US commission is therefore an official record of the Japanese government’s views on the role of US nuclear weapons in the defense of Japan. So are any oral statements Akiba and the other Japanese officials gave to the commission.

Some of those oral statements were recorded in hand-written notes on the margins of the document. Those notes contain an abbreviated rendition of a conversation between Akiba and James Schlesinger in which the Japanese official gives a favorable response to Schlesinger’s question about building nuclear weapons storage facilities in Okinawa. Roberts and Payne recall the conversation. They note that Akiba “clearly set out the three non-nuclear principles,” which the Japanese official does in the hand-written notes on his conversation with Schlesinger. Yet Roberts and Payne neglected to mention that Mr. Akiba also said that “some quarters talk about revising the third principle,” which would be necessary if the United States were to bring nuclear weapons into Japan or prepare to store them in Okinawa.

The language in the hand-written notes makes it difficult to assess whether Mr. Akiba is among those who want to revise the third principle. But his favorable response to Schlesinger’s proposal to construct nuclear weapons storage sites in Okinawa deserves more careful scrutiny.

Notes are often incomplete and sometimes inaccurate. Memories, especially of a conversation that took place nine years ago, can be faulty. One way to help clarify this matter is for the United States Institute of Peace (USIP) to release the Foreign Ministry from its promise of confidentiality and encourage the ministry to respond to Diet requests for access to its records. USIP should also grant the Diet access to all materials on the proceedings of the commission it may hold in its archives. Greater transparency, from both sides, is the best way to set the record straight.

Here’s Why Seas Are Rising. Somebody Remind the Wall Street Journal.

UCS Blog - The Equation (text only) -

Ice sheets on land in Greenland and Antarctica are melting, adding water to the world's oceans. Photo: NASA

On May 15, the Wall Street Journal published a commentary by Fred Singer which argued that rising sea levels are unrelated to global warming, that they won’t be much of a problem, and that there’s little we can do about them. Singer, whose history of disingenuous attacks on science on behalf of the tobacco, fossil fuel and other industries goes back nearly 50 years, is wrong on all counts.

Singer acknowledges that “sea levels are in fact rising at an accelerating rate,” but then argues that “the cause of the trend is a puzzle.” Perhaps Singer is puzzled as to the causes, but science is crystal clear about this. Worse, we know that without strong policy to limit CO2 emissions, the rising water will continue to accelerate, inundating all the coastal cities of the world.

Fundamentally, there are three reasons why the ocean is rising at an accelerating rate:

  1. Adding heat to things causes them to change temperature (1st Law of Thermodynamics)
  2. Seawater volume increases with temperature (thermal expansion)
  3. Adding a volume of water to the oceans from melting land ice causes them to increase in height (conservation of water)

All three of these principles (conservation of energy and mass, and the thermal expansion of water) are bedrock principles of physics which have been established for centuries and can easily be verified by direct observation.

The effect of CO2 on the absorption of radiation has been understood for 160 years.

The effect of rising CO2 on the energy budget of the Earth is directly measured in the laboratory, from towers, from balloons and aircraft, and from satellites. We measure precisely how much extra heat is absorbed globally by CO2 because of burning carbon, all the time. Adding heat to the world causes it to warm up, for precisely the same reason that adding heat to a pot of water on the stove causes the temperature of the water to increase.

When water warms up, it expands

The precise increase in seawater volume with temperature is easy to measure and extremely well known. Nearly all the resulting change in heat content (more than 90%) is in the oceans, where temperatures are measured at all depths by thousands of autonomous instruments floating at different depths. Oceanographers know the three-dimensional temperature and density of the oceans worldwide to amazing precision from these floating sensors. Since 1992, we have also tracked rising sea levels everywhere on Earth by measuring the height of the ocean surface from space using satellite radar altimeters. The expansion of the warming seas measured by the floats is completely consistent with the rising surface of the water measured by the altimeters.

As the world warms, ice sheets on land in Greenland and Antarctica are melting, adding water to the oceans.

Just as we directly measure the effect of CO2 on heat and the effect of that heat on ocean temperatures and sea level, we also have satellite measurements of the volume and mass of the great ice sheets. The height of the ice is measured by radar and the mass is measured by the gravitational pull of the ice itself. These data show precisely how much water from the ice sheets in both Greenland and Antarctica is added to the oceans each year. The total rise in sea level is completely consistent with the additions from land ice and ocean expansion, all of which are precisely measured all over the Earth and to the bottom of the oceans.

Sea levels are rising faster and faster because every bit of coal, oil, and gas we burn adds to the CO2 in the atmosphere, absorbing more of the Earth’s radiant heat and contributing more to the thermal expansion of seawater and the loss of land ice. This is not a mystery. It’s extremely well understood and documented by millions of direct measurements.

Without strong policy, coastal cities will be inundated and abandoned

The oceans will continue to rise faster and faster unless the world implements very strong policy to quickly reduce and eventually eliminate the burning of fossil fuels. Depending on how quickly these policies are put in place, seas will rise between one and eight feet by 2100, according to a 2017 report from the federal government, released under the Trump Administration. Without strong policy, coastal cities in the US and around the world will be inundated and abandoned.

Rising oceans are but one devastating consequence of inexorable global warming caused by burning fossil fuels. Luckily, it’s not too late to prevent the damage to the world and our economy. Nearly all the world’s nations have agreed to limit warming by cutting emissions. Maybe somebody should tell Fred Singer.


Five Things You Should Know About EPA’s Proposed Giant Step Backward on the Safety of Chemical Facilities

UCS Blog - The Equation (text only) -

Members of the Kentucky National Guard receive a brief on extracting the mock injured and wounded during the early stages of their external evaluation at Muscatatuck Urban Training Center in Butlerville, Ind., May 23. The purpose of the exercises and evaluation is to prepare the Kentucky Guard’s chemical, biological, radiological, and nuclear (CBRN) teams to respond to such attacks and disasters. Photo: Spc. David Bolton, Public Affairs Specialist, 133rd Mobile Public Affairs Detachment, Kentucky Army National Guard/CC BY 2.0 (Flickr)

As one of his first acts in office, EPA Administrator Scott Pruitt decided to put on hold the implementation of new regulations to improve the safety of chemical facilities around the country. Those regulations, finalized in 2017, called for consideration of safer technologies, better information for the communities and first responders who are on the front lines of accidents and other incidents, better planning for accidents and disasters, and improvements in response capabilities, including coordination and practice sessions with local first responders. These changes were made to update the so-called Risk Management Plan rule, last significantly modified in 1996.

Now, the EPA has proposed a new rule, modifying the 2017 regulations without ever implementing them. The new proposal, soon to be published in the Federal Register and open for a 60-day public comment period, basically rescinds all the new requirements, with a few minor exceptions, and takes us back to 1996 at best. Pruitt’s EPA justifies the rollback by the $88 million it estimates industry will save by not having to do these things. Rolling back these critical protections, in the wake of a devastating hurricane season that demonstrated the need for increased planning at these chemical facilities, and after 43 reported incidents at chemical facilities since the rule was first delayed, demonstrates a lack of leadership and commitment to public health at the EPA.

The short summary is that Pruitt’s EPA has eliminated or weakened every provision of the rule, stripping protection from fenceline communities and workers. The justification is a possible $88 million savings in compliance costs, bought at the expense of immense public health and safety benefits to communities, benefits the proposal never calculated.

When the public comment period opens, the EPA will hold exactly one public hearing to receive input in addition to written comments. That hearing will be at EPA headquarters in DC, not in any of the communities, like Houston, TX, or Wilmington, DE, that live with the risks of chemical facilities, and it is frankly out of reach, in terms of cost, for most grassroots or local organizations. That’s a shame. It also makes the written comments submitted to the EPA all the more important, as the delay of the previous rule, and certainly this new proposal if it is finalized, are being challenged in court, including by the Union of Concerned Scientists.

So here are five things you should note as you consider commenting on the new EPA proposal.

  • The 2017 rule required chemical facilities to evaluate and consider safer technology and alternatives, defined by the EPA itself as “a variety of risk reduction or risk management strategies that work toward making a facility and its chemical processes as safe as possible.” It seems reasonable that facilities everywhere should consider these to reduce risks to workers, communities, and first responders. The idea is to reduce risks with safer alternatives before an accident or disaster takes place: preventive medicine for chemical facilities, so to speak. The new proposal completely eliminates this requirement for facilities to look at preventive, safer alternatives. The justification for the rollback was the cost to industry, without any consideration of benefits to the public or to the mission of the EPA (to serve the public interest).
  • Prior to the new rules set in 2017, it was nearly impossible to get much information about what chemicals were being held at a facility in a timely and regularly updateable way. To obtain any information, you had to prove you lived in the neighborhood around the facility and go to a special EPA reading room when it was open—if it was available, you were not allowed to use a copier, computer or scanner and you couldn’t take anything away. The 2017 rules eased these restrictions somewhat by allowing communities to ask for information and requiring companies to be forthcoming in a timely way. The new proposal eliminates that option. It goes back to a system where the public, including first responders, have little or no information in case of a chemical disaster or emergency chemical release in their neighborhood.
  • Prior to 1996, chemical facilities could leave most of the response capability for accidents and disasters up to the local government, with the cost borne by local taxpayers, not the company. That burden was only partially shifted in 2017, with greater participation and coordination requirements put on companies to work with local governments and groups. The new proposed rule again takes a step back and weakens those requirements, though there would be some requirement for joint exercises to practice responding to an accident every few years. And the EPA proposes eliminating the requirement to report on the results of those exercises to improve performance.
  • Under the 2017 rules, when an accident occurred, an incident analysis would be required along with an analysis of the causes of the incident. Now Pruitt’s EPA is eliminating that requirement to analyze and report on accidents and their causes and make that information available to the community.
  • And, in 2017 the rules required the industry to hire third-party independent auditors to evaluate compliance with the rules and to investigate problems. The EPA is now proposing to eliminate that requirement and continue to allow companies to audit themselves.

Should you submit a comment? Yes! Because this proposal makes all of us less safe. It is simply unacceptable that we cannot do a better job of preventing and responding to the thousands of chemical accidents that occur every year in this country.

Sonny Perdue’s USDA Is in Bed with Big Pork. That’s Really Bad for Everyone Else.

UCS Blog - The Equation (text only) -

North Carolina hog barns with waste lagoons. Photo courtesy Waterkeeper Alliance Inc./Flickr

In his first year running the US Department of Agriculture, Secretary Sonny Perdue has displayed a curious tendency to say things he really shouldn’t. The most recent example is his striking off-the-cuff comment about a big court judgment won by neighbors of a massive hog farm and its stinking cesspools in North Carolina. Perdue told reporters he was not familiar with the case, in which a US District Court jury leveled a landmark $50 million verdict against Murphy-Brown LLC, a subsidiary of pork giant Smithfield Foods. But that didn’t stop him from calling the jury’s decision “despicable.”

Secretary Perdue’s alignment with big corporate interests over the public interest has been clear for a while. But his knee-jerk reaction to this case, along with related pending actions at his USDA, suggests that he is willing to throw workers, farmers, rural residents, consumers, and clean air and water overboard to protect Big Pork’s bottom line.

“Nuisance” is putting it mildly

When the jury in the Murphy-Brown case (a so-called “nuisance” suit filed on behalf of a group of 10 neighbors) handed down its decision on April 26, fear surely rippled through the pork industry. Led by Iowa, North Carolina, and Minnesota, annual US pig production exceeded 110 million animals in 2014, with the total national swine herd that year valued at $9.5 billion. In 2018, the industry is forecast to produce even more pigs—an estimated 134 million. The vast majority of those animals will be raised in CAFOs (confined animal feeding operations), which generate huge quantities of concentrated manure waste. In North Carolina alone, hog and poultry CAFOs produce 15,000 Olympic-size pools’ worth of waste each year.

In that state, there’s a long line of angry CAFO neighbors awaiting their chance to demand justice for the harm these operations cause. More than 500 plaintiffs have filed 26 lawsuits alleging damage from Murphy-Brown’s operations. The company’s practice of holding liquified manure in open pits and spraying the excess on nearby fields, common in the CAFO industry for decades, leaves a reeking stench over nearby communities. Residents, most of them working class and black, complain of health problems—which researchers have shown can include nausea and respiratory problems such as asthma—along with reductions in property values and quality of life from the CAFOs that built up around them. If juries in the other North Carolina cases (and in CAFO lawsuits elsewhere, like one filed this week by Iowa residents against that state) decide in favor of plaintiffs, it could be a watershed moment for environmental justice—and may force the industry to change.

In the weeks since the North Carolina jury’s bombshell announcement, the judge in the case has bowed to a state law that caps punitive damage awards, reducing the $50 million award to a mere $3.25 million. Still not exactly small potatoes, but the reduction must have prompted sighs of relief from the board rooms of Murphy-Brown, parent company Smithfield Foods, and WH Group, the Chinese company that owns Smithfield and is the world’s largest pork company.

And there’s more for giant pork companies to smile about. In addition to state laws that have long enabled the pork industry to operate profitably at the expense of its neighbors and continue to protect it from major consequences, Big Pork appears to have the Trump administration on its side.

Perdue backs Big Pork over farmers…

Two regulatory actions initiated by the USDA in its first year under Secretary Perdue show how it has favored the big corporations that process and sell US pork at the expense of small farmers and workers in the industry. First, last fall the department announced it would withdraw the Farmer Fair Practices Rules, Obama-era rules that would have made it easier for livestock and poultry farmers to sue meat processing companies with which they have contracts and to protect farmers from unfair and predatory corporate practices. In response, a group of farmer plaintiffs and the Organization for Competitive Markets filed suit in December, calling the rules’ cancellation arbitrary and capricious, a gift to the industry, and a failure to protect small farmers.

Speaking to reporters as part of a farm tour in Ohio last month, Perdue suggested farmers are on their own:

There are farmers there, some of which will not survive because other people do it better. That’s the American capitalistic society. The best producers thrive and provide, and the others find another industry where they can thrive.

That’s a startling statement from a guy who claims to serve the interest of farmers—Perdue calls them the USDA’s “customers,” and they still largely support the Trump administration (though their support is slipping).

…and workers and food safety, too

In a related action, the USDA in January proposed a rule it claims will “modernize” swine slaughter. In fact, the rule would reduce the number of trained government food inspectors in pork processing plants and allow plants to operate at higher speeds (something the administration has also tried in poultry plants). These changes would likely increase rates of worker injury and incidents of meat contamination, and the proposed rule faces broad opposition from food safety, labor, and animal welfare groups. More than 83,500 people wrote to the USDA about it during a public comment period that closed May 2, and dozens of members of Congress have also entered the fray. In their letter to Secretary Perdue, 63 members of the House of Representatives (including several from leading pork states) cited the danger the hog slaughter rule poses to workers and urged the secretary to withdraw it.

As various lawsuits wind their way through the courts and the swine slaughter rule proceeds through the regulatory process, we’ll see whether Secretary Perdue’s USDA backs down or continues to back Big Pork. Meanwhile, the perception of the Trump administration’s coziness with the industry is peaking in a weird way: in the online video game Bacon Defender, players navigate an animated high speed pork plant—complete with falling poop emojis and oddly Trump-like voice effects—armed only with a mustard-shooting hot dog. “Even a novice Bacon Defender player quickly learns that at higher speeds feces can contaminate your food more easily,” say the game’s creators.

I wish I had a sad poop emoji for that.


High Energy Arc Faults and the Nuclear Plant Fire Protection IOU

UCS Blog - All Things Nuclear (text only) -

Last year, we posted a commentary and an update about a high energy arc fault (HEAF) event that occurred at the Turkey Point nuclear plant in Florida. The update included color photographs obtained from the Nuclear Regulatory Commission (NRC) via a Freedom of Information Act request showing the damage wrought by the explosion and ensuing fire. Neither the HEAF event nor its extensive damage surprised the NRC—the agency had been researching this fire hazard for several years. Yet while the NRC has long known about this hazard, its resolution remains unknown. Meanwhile, Americans are protected from this hazard by an IOU. The sooner this IOU is paid off, the sooner the Americans in jeopardy will be really and truly protected.

What is a HEAF?

The Nuclear Energy Agency (NEA), which has coordinated international HEAF research efforts for several years, defines HEAF this way: “An arc is a very intense abnormal discharge of electrons between two electrodes that are carrying an electrical current. Since arcing is not usually a desirable occurrence, it is described as an arcing fault.”

Nuclear power plants generate electricity and use electricity to power in-plant equipment. The electricity flows through cables or metal bars, called buses. An arc occurs when electricity jumps off the intended pathway to a nearby metal cabinet or tray.

Electricity is provided at different voltages or energy levels for different needs. Home and office receptacles provide 120-volt current. Nuclear power plants commonly have higher voltage electrical circuits carrying 480-volt, 4,160-volt, and 6,900-volt current for motors of different sizes. And while main generators at nuclear plants typically produce electricity at 22,000 volts, onsite transformers step up the voltage to 345,000 volts or higher for more efficient flow along the transmission lines of the offsite power grid.

How is the Risk from HEAF Events Managed?

Consistent with the overall defense-in-depth approach to nuclear safety, HEAF events are managed by measures intended to prevent their occurrence backed by additional measures intended to minimize consequences should they occur.

Preventative measures include restrictions on handling of electrical cables during installation. Limits on how much cables can be bent and twisted, and on forces applied when cables are pulled through wall penetrations seek to keep cable insulation intact as a barrier against arcs. Other preventative measures seek to limit the duration of the arc through detection of the fault and automatic opening of a breaker to stop the flow of electrical current through the cables (essentially turning the arc off).

Mitigative measures include establishing zones of influence (ZOI) around energized equipment that control the amount of damage resulting from a HEAF event. Figure 1 illustrates this concept using an electrical cabinet as the example. Electrical cabinets are metal boxes containing breakers, relays, and other electrical control devices. Current fire protection regulatory requirements impose a 3-foot ZOI around electrical cabinets and an 18-inch ZOI above them. Anything within the cabinet and associated ZOI is assumed to be damaged by the energy released during a HEAF event. Sufficient equipment must be located outside the affected cabinet and its ZOI to survive the event and adequately cool the reactor core to prevent meltdown.

Fig. 1 (Source: Nuclear Regulatory Commission)
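The ZOI assumption reduces to a simple geometric test. The sketch below encodes the 3-foot lateral and 18-inch overhead zones described above; the function and its interface are my own illustration, not an NRC or industry tool:

```python
# Illustrative encoding of the zone-of-influence (ZOI) assumption: anything
# within 3 feet beside, or 18 inches above, an electrical cabinet is assumed
# destroyed by a HEAF event. This is a simplified sketch, not a regulatory tool.

LATERAL_ZOI_FT = 3.0   # assumed damage radius beside the cabinet
OVERHEAD_ZOI_FT = 1.5  # 18-inch assumed damage zone above the cabinet

def assumed_damaged(distance_ft: float, overhead: bool = False) -> bool:
    """True if a component this far from the cabinet is assumed lost."""
    limit = OVERHEAD_ZOI_FT if overhead else LATERAL_ZOI_FT
    return distance_ft <= limit

# A pump motor 2 ft beside the cabinet is written off; one 4 ft away is
# assumed to survive and remain available to cool the reactor core.
```

The Phase I test results described below show why this simple assumption can fail: some arcs damage equipment well outside these distances.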

Even with these preventative and mitigative measures, NEA recognized the hazard that HEAF events pose when it wrote in a May 2017 report: “The electrical disturbance initiating the HEAF often causes loss of essential electrical power and the physical damage and products of combustion provide significant challenges to the operators and fire brigade members handling the emergency. It is clear that HEAFs present one of the most risk significant and challenging fire scenarios that a [nuclear power plant] will face.”

What is the Problem with HEAF Risk Management?

Actual HEAF events have shown that the preventative and mitigative measures intended to manage the hazard have shortcomings and weaknesses. For example, arcs have sometimes remained energized far longer than assumed, enabling the errant electricity to wreak more havoc.

Additionally, HEAF events have damaged components far outside the assumed zones of influence, such as in the Turkey Point event from March 2017. In other words, the HEAF hazard is larger than its defenses.

How is the HEAF Risk Management Problem Being Resolved?

On March 11, 2011, an earthquake offshore of Japan and the tsunami it spawned led to the meltdown of three reactors at the Fukushima Daiichi nuclear plant. That earthquake also caused a HEAF event at the Onagawa nuclear plant in Japan. The ground motion from the earthquake prevented an electrical circuit breaker from opening to limit the duration of the arc. The HEAF event damaged equipment and started a fire (Fig. 2). Because the fire brigade could not enter the room due to heat and smoke, the fire blazed for seven hours until it had consumed all available fuel. As an NRC fire protection engineer commented in April 2018, “If Fukushima wasn’t occurring, this is probably what would have been in the news headlines.” Onagawa was bad. Fukushima was just worse.

Fig. 2 (Source: Nuclear Regulatory Commission)

Research initiated in Japan following the Onagawa HEAF event sought to define the factors affecting the severity of such events. Because the problem was not confined to nuclear power plants in Japan, other countries collaborated with the Japanese researchers in pursuit of a better understanding of, and better protection against, HEAF events.

The NRC participated in a series of 26 tests conducted between 2014 and 2016 using different types of electrical panels, bus bar materials, arc durations, electrical current voltages, and other factors. The results from the tests enabled the NRC to take two steps.

First, the NRC entered HEAF events into the agency’s generic issues program in August 2017. In a related second step, the NRC formally made the owners of all operating US nuclear power plants aware of this testing program and its results via an information notice also issued in August 2017. The NRC has additionally shared its HEAF information with plant owners during the past three Regulatory Information Conferences and several other public meetings and workshops.

The NRC plans a second series of tests to more fully define the conditions that contribute to the severity of HEAF events.

How Are HEAF Events Tested?

Test 23 during the Phase I program subjected a 480-volt electrical cabinet with aluminum bus material to an arc lasting 7.196 seconds. Figure 3 shows the electrical cabinet with its panel doors opened prior to the test. A pointer on the left side of the picture shows the location where the arc was intentionally caused.

Fig. 3 (Source: Nuclear Energy Agency)

To induce an arc for the test, a wire was wrapped around all three phases of the 480-volt alternating current connectors within one of the cabinet’s panels as shown in Figure 4. On the right edge of the picture is a handswitch used to connect or disconnect electrical power flowing into the cabinet via these buses to in-plant electrical loads.

Fig. 4 (Source: Nuclear Energy Agency)

Instrumentation racks and cameras were positioned around the cabinet being tested. The racks included instruments measuring the temperature and pressure radiating from the cabinet during the HEAF event. High-speed, high definition cameras recorded the progression of the event while infrared cameras captured its thermal signature. A ventilation hood positioned over the cabinet connected to a duct with an exhaust fan conducted smoke away from the area to help the cameras see what was happening. More importantly, the ventilation duct contained instruments measuring the heat energy and byproducts released during the event.

Fig. 5 (Source: Nuclear Regulatory Commission)

What Are the HEAF Test Results?

For a DVD containing reports on the HEAF testing conducted between 2014 and 2016, as well as videos from the 26 tests conducted during that period, send an email with your name and address to the NRC. Much of the information in this commentary comes from materials on the DVD the NRC mailed me in response to my request.

Test 4 in the Phase I Program subjected a 480-volt electrical cabinet with aluminum bus material to an arc lasting only 0.009 seconds (i.e., 9 milliseconds). The short duration arc had consequences so minimal that a viewer who blinks at the wrong time misses the event entirely in the video. This HEAF event did not damage components within the electrical cabinet, let alone any components outside the 3-foot zone of influence around it.

Test 3 in the Phase I Program subjected a 480-volt electrical cabinet with copper bus material to an arc lasting 8.138 seconds. The longer duration arc produced greater consequences than in Test 4. But the video shows that the consequences are largely confined to the cabinet and zone of influence.

Test 23 in the Phase I Program subjected a 480-volt electrical cabinet with aluminum bus material to an arc lasting 7.196 seconds. The voltage level and arc duration for Test 23 were essentially identical to that for Test 3, but the consequences were significantly different. Aluminum behaved differently than copper during the HEAF event, essentially fueling the explosion and ensuing fire. As a result, the damage within the cabinet, zone of influence, and even beyond the 3-foot zone of influence was much greater. For example, some of the instruments on the rack positioned just outside the 3-foot zone of influence were vaporized.

Until debris from the event obscured the lens of a camera positioned many feet outside the 3-foot zone of influence, a side view of the Test 23 HEAF event showed it was a bigger and badder event than the HEAF event in Test 3 and the HEAF event in Test 4.

Figure 6 shows the electrical cabinet with its panel doors open after Test 23. The cabinet clearly looks different from its pre-test appearance (see Figure 4). But this view does not tell the entire story.

Fig. 6 (Source: Nuclear Energy Agency)

Figure 7 shows the left side of the electrical cabinet after Test 23. The rear upper left corner of the cabinet is missing. It was burned and/or blown away by the HEAF event. The cabinet is made of metal, not wood, plastic, or ice. The missing cabinet corner is compelling testimony to the energy released during HEAF events.

Fig. 7 (Source: Nuclear Energy Agency)

Tests 3, 4 and 23 all featured electrical cabinets supplied with 480-volt power.

Tests 4 and 23 each featured aluminum bus material. Test 4 had negligible consequences while Test 23 had significant consequences, attesting to the role played by arc duration. The arc lasted 0.009 seconds in Test 4 while it lasted 7.196 seconds in Test 23.

Tests 3 and 23 featured arcs lasting approximately 8 seconds. Test 23 caused substantially greater damage within the electrical cabinet and beyond the 3-foot zone of influence due to the presence of aluminum rather than copper materials.

How Vulnerable Are US Nuclear Plants to HEAF Events?

The Phase I series of tests revealed that HEAF events depend on the voltage level, the conducting material (i.e., copper, iron, or aluminum), and the arc duration. The higher the voltage, the greater the amount of aluminum, and the longer the arc duration, the greater the consequences from HEAF events.

The NRC received results in 2017 from an industry survey of US nuclear plants. The survey showed that the plants have electrical circuits with voltage levels of 480, 4160, 6900, and 22000 volts. The survey also showed that while some plants did not have electrical circuits with components plated with aluminum, many did.

As to arc durations, actual HEAF events at US plants have involved arcs lasting longer than the 8 seconds used in Tests 3 and 23. The May 2000 event at Diablo Canyon lasted 11 seconds. The March 2010 event at HB Robinson lasted 8 to 12 seconds. And the June 2011 event at Fort Calhoun lasted 42 seconds and likely would have lasted even longer had operators not intervened by manually opening an electrical breaker to end the event.

So, many US nuclear plants have all the ingredients necessary for really nasty HEAF events.

What Might the Fixes Entail?

The testing program results to date suggest a tiered approach to solving the HEAF problem. Once the key factors (i.e., combinations of voltage levels, materials, and arc durations) are definitively established, they can be used to screen out configurations within the plant where a HEAF event cannot compromise safety margins. For example, a high voltage electrical cabinet with aluminum bus material and suspect arc duration limiters might need no remedies if it is located sufficiently far away from safety components that its HEAF vaporization carries only economic rather than safety implications. Similarly, configurations with voltage levels and materials that remain bounded by current assumptions, like the 3-foot zone of influence, would require no remedies.

When a configuration cannot be screened out, the remedy might vary. In some cases, it might involve providing more reliable, quick-acting fault detection and isolation systems that limit the duration of the arc. In other cases, replacing aluminum buses with copper or iron buses might be a suitable remedy. And the fix might be simply installing a protective wall between an electrical cabinet and safety equipment.
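The tiered approach described above can be sketched as a screening function. Because the controlling thresholds have not yet been definitively established, every numeric cutoff below is a hypothetical placeholder of my own, not a value from the testing program:

```python
# Sketch of a tiered HEAF screening decision, following the approach described
# in the text. All numeric thresholds are hypothetical placeholders; the real
# cutoffs await completion of the NRC/NEA testing program.

def screen_configuration(voltage_v: int, bus_material: str,
                         arc_duration_s: float,
                         distance_to_safety_ft: float) -> str:
    """Return an illustrative disposition for one plant configuration."""
    # Low-voltage copper configurations with fast fault isolation stay
    # within the current 3-foot ZOI assumption (hypothetical criterion).
    if bus_material == "copper" and voltage_v <= 480 and arc_duration_s <= 2.0:
        return "screened out: bounded by existing ZOI assumptions"
    # A severe HEAF far from safety equipment is an economic problem,
    # not a safety problem (hypothetical standoff distance).
    if distance_to_safety_ft > 20.0:
        return "screened out: no safety components within reach"
    # Otherwise a remedy is needed: faster fault detection and isolation,
    # copper or iron buses in place of aluminum, or a protective wall.
    return "remedy required"

print(screen_configuration(6900, "aluminum", 8.0, 5.0))  # remedy required
```

The point of such screening is to spend remediation dollars only where a HEAF event could actually defeat the plant's safety margins.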

Further HEAF testing will expand knowledge of the problem, thus more fully informing the decisions about effective solutions.

UCS Perspective

It has been known for many years now that HEAF events could cause wider damage than currently assumed in designing and applying fire protection measures. As a result, a fire could damage primary safety systems and their backups—the very outcome the fire protection regulatory requirements are intended to prevent.

This is normally the time and spot where I chastise the NRC for dragging its feet in resolving this known safety hazard. But while years have passed since the HEAF hazard flag was first raised, the NRC’s feet have been busy. For while it was known that HEAF events could cause greater damage than previously realized, it was not known what factors played what roles in determining the severity of HEAF events and the damage they inflict. The NRC joined regulatory counterparts worldwide in efforts designed to fill in these information gaps. That knowledge was vitally needed to ensure that a real fix, rather than an ineffective band-aid fix, was applied.

That research took time to plan and conduct. And further research is needed to fully define the problem to find its proper solution. In the meantime, the NRC has been very forthcoming with plant owners and the public about its concerns and associated learnings to date.

While the NRC’s efforts to better understand HEAF events may be justified, it’s worth remembering that the agency’s intentions and plans are little more than IOUs to the millions of Americans living close to vulnerable nuclear plants. IOUs provide zero protection. The NRC needs to wrap up its studies ASAP and turn the IOUs into genuine protection.

