UCS Blog - The Equation

The Department of Interior Does Not Care What You Think About Endangered Species

Photo: NCinDC/CC BY-ND 2.0 (Flickr)

The Department of the Interior simultaneously announced three deeply flawed proposals that would radically transform how the Endangered Species Act functions and gave the public just 60 days to provide feedback. Yesterday, without providing any reasoning, the department denied a request from UCS to extend the comment period. That means you have six more days to file a comment (Rule 1, Rule 2, Rule 3). This guide from UCS can help you craft an effective comment on one or all of these rules.

The way the government uses science to manage endangered species like the greater sage grouse may change significantly for the worse under a series of new proposals. Photo: USFWS

Public comments are more than just a formality. As part of the official record, they can be used by organizations that wish to challenge the government’s interpretation of a law in court. They are sometimes used by members of Congress conducting oversight of the work of federal agencies. And they can guide future administrations in how they carry out environmental and public health laws.

In a letter requesting a comment period extension, I noted the following:

These proposals could profoundly change the implementation of the Endangered Species Act, and the public, including the scientific community, needs sufficient time to better evaluate the impacts of the proposed rule in conjunction with the other two administrative proposals to provide comprehensive and meaningful feedback on it…

Given the critical and comprehensive nature of this proposal, the current timeframe is wholly inadequate and will not allow for thorough public input on these proposed rules and their impact on FWS’s ability to fulfill its mission to conserve, protect and enhance fish, wildlife and plants and their habitats for the continuing benefit of the American people. 

When the EPA tried a similar stunt with its proposal to restrict the use of public health science in its work, the agency ultimately agreed to extend the public comment period by more than two months. By this measure, then, the Department of the Interior is even worse than Scott Pruitt’s EPA.

Still, we must work with the limited democratic tools we have left. Earlier this year, 1,500 scientists signed a letter that helped inform public understanding of the threats that Congress and the Trump administration pose to endangered species. Keep that momentum going by submitting a public comment by September 24, 2018. You can also sign on to a more general comment developed by UCS before the deadline.

Photo: NCinDC/CC BY-ND 2.0 (Flickr)

Mass. Gas Explosions: What Can We Do About Home Fossil Fuels?

The calls and texts from my kids’ school started coming in at 5:11 p.m. last Thursday: “Evacuate campus buildings immediately.” Some of the messages included mention of a gas leak. The northern Massachusetts headlines about gas leaks, fires, and explosions were scary, and this was my own family potentially in harm’s way.

After events like that, it’s easy to imagine wanting to be done with fossil fuels. Not just because of their climate, environmental, and public health impacts, but also because of the problems, even rare ones, that can arise from having those fuels right where we live.

But where might a fossil fuel reduction plan start on the home front? Here are a few ideas.

Getting beyond fossils

Photo: Pixabay/Magnascan

Even with all the thinking I do about moving away from fossil fuels and toward clean energy, those messages from school brought an immediacy to the need for transition that I hadn’t felt before. When my family got past the emergency stage, I found that last week’s events were prompting me to rethink the role that fossil fuels play in my own life, knocking out of me the remnants of my that’s-the-way-it-is-because-that’s-the-way-it’s-always-been mindset.

The first part of addressing our home fossils is understanding where they are (in fuel form in this case—not, say, as plastic plates, polyester quick-dry clothing, or vinyl siding). The second part is understanding the options for dealing with them.

In my house, that first part comes down to space heating, water heating, cooking, and transportation. No small list. For the second part, though, the catalogue of options is definitely up to the challenge.

Here’s a look at the opportunities in each of those areas, and where my own journey stands.

Cutting fossil use in space heating

Our winters are cold, and in New England, space heating accounts for 60% of household energy use. While natural gas is the fuel for more than half, homes in these parts also use fuel oil in a big way; in Massachusetts, heating oil accounts for close to 30%.

Moving from oil to gas cuts down on carbon pollution, using high-efficiency gas furnaces and boilers takes it to the next level, and insulating homes better can cut down on any fuel. But none of those results in ditching on-site fossils altogether.

Fortunately, heat pumps do. The two options are ground-source/geothermal, which take advantage of the constant temperature underground, and air-source, which miraculously harvest heat from even really cold air (and have gotten good in recent years at handling frigid northern temperatures). And they both do it with electricity as the only input.



I haven’t gotten to that stage yet. After becoming a homeowner a while back, I upgraded my heating equipment to the highest-efficiency units I could find. But heat pumps weren’t really on my radar screen. So there’s room for progress there.

Cutting fossil use in water heating

The next big category for fossil use in our houses is water heating; it accounts for 16% of home energy use in Massachusetts. Efficiency is an opportunity here, too, but again, only a partial solution if it’s a natural gas- or oil-fired unit.

The options for fossil freedom lie in electricity and the sun. For our home, I put in a solar water heating system, with a backup to boost it as needed. That booster is gas-fired, but could have been electric.

Another option is electric with, again, heat pumps to the rescue. Like their space-heating brethren, heat-pump water heaters draw heat from their surroundings. In this case, that adds up to water getting heated two to three times as efficiently as it would with a conventional electric (resistance) water heater.
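If you like seeing the arithmetic, here’s a rough back-of-envelope sketch of what that two-to-three-times efficiency can mean. The numbers are mine and purely illustrative (a 50-gallon tank, a 65-degree temperature rise, and an assumed heat-pump coefficient of performance of 2.5), not specs from any particular unit:

  # Back-of-envelope comparison, with assumed numbers: heating a 50-gallon tank
  # of water from 55 F to 120 F with a resistance element (COP ~1) versus a
  # heat-pump water heater (assumed COP of 2.5).

  GALLON_TO_KG = 3.785        # one gallon of water is about 3.785 kg
  SPECIFIC_HEAT = 4186        # joules per kg per degree C for water
  KWH_PER_JOULE = 1 / 3.6e6   # 3.6 million joules in a kilowatt-hour

  def kwh_of_heat(gallons, delta_t_f):
      """Thermal energy (kWh) needed to raise `gallons` of water by `delta_t_f` degrees F."""
      delta_t_c = delta_t_f * 5 / 9
      return gallons * GALLON_TO_KG * SPECIFIC_HEAT * delta_t_c * KWH_PER_JOULE

  heat_needed = kwh_of_heat(50, 120 - 55)   # roughly 8 kWh of heat

  resistance_kwh = heat_needed / 1.0        # COP ~1: every bit of heat comes from the wire
  heat_pump_kwh = heat_needed / 2.5         # assumed COP of 2.5: most heat comes from the air

  print(f"Heat needed: {heat_needed:.1f} kWh")
  print(f"Resistance water heater: ~{resistance_kwh:.1f} kWh of electricity")
  print(f"Heat-pump water heater:  ~{heat_pump_kwh:.1f} kWh of electricity")

Same hot shower, a fraction of the electricity, because most of the heat is borrowed from the surrounding air rather than generated from scratch.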

My solar heat as of Monday. No sense wasting all that sunshine.

Cutting fossil use in cooking

Most of the time these days, the choice for frying eggs or roasting potatoes is between gas and electricity. Gas devotees like its responsiveness (though electric ranges often have the edge in performance).

When we updated our old kitchen a few years back, we switched from gas to electric, but not to a standard model. Instead of a resistance (glowing coil) kind, we went with an induction cooktop—electric, efficient, and really responsive (and able to boil water in no time flat).

Cutting fossil use in transportation

Our tour of the house in search of fossil fuels shouldn’t ignore the garage, and the gasoline in it. The obvious solution is electric vehicles, an option that’s so much more real than it was when I drove EVs back in the 1990s. Add in walking, biking, and electric buses, and you’re cruising without carbon (onsite).

My ride, when I’m not on my bike or a train, is efficient (a 2001 first-gen hybrid, still, at 196,000 miles, getting 45 miles to the gallon), but still gas-powered. My wife’s, though, is pure electric—not a gas gallon in sight.

Home fossil use beyond the home

There’s actually one more entry for this list: electricity. This might seem like an odd thing to discuss when we’re talking about fossil fuels in the home, but let’s face it: A lot of the approaches above involve switching to a plug, and we’re not necessarily interested in just exporting our fossil problem with the out-of-sight-out-of-mind approach.

Fortunately, there are options here, too. One is to make your own fossil-free power, particularly with solar electric (photovoltaic) panels. If you’ve got the wherewithal and the roof, for example, you can look into putting up a solar array (and maybe even adding batteries). Or you can see about joining with neighbors in a community solar system.

A more broadly available fossil fix is to buy green power. If your utility gives you the option, you can choose a fossil-free mix in place of whatever default electric mix might otherwise supply you. Or you can buy an equivalent amount of renewable energy credits (RECs) to green up your power supply.

We went solar two years ago, and we generate enough to cover all of our home electricity use and a portion of the car’s. For the rest of our usage, my utility had been offering a REC option, and I’d been a loyal customer until that program went away; I’ll be looking for a successor option.

Continue the journey

Home sweet lower-fossil-fuel home (Photo: J. Rogers)

Not all of these opportunities are available to all of us (think renters, for example). Money, too, is a consideration, and not all the options above are cheap (though some can actually save you money). But times like these call for do-what-you-can and beyond-the-wallet thinking. (Especially since someone will have to pay for the investments needed, after events like these, to improve safety even without nixing the fossil fuels.)

Last Thursday, many of us in my area were lucky. My boys and their schoolmates evacuated and waited it out in a field. We picked up our kids, and adopted for the night a couple of extra kids who were more affected by the gas fires. Our town is supplied by a different gas network than the gas-fire communities, and we have a different power company, so we didn’t lose power when service got cut off in affected communities.

But fossil fuels are certainly a part of our lives as much as they were for those who got hit by last week’s events: We have natural gas in our home, gasoline in our garage, and neighbors who heat with oil. So the journey continues.

When it comes to fossil fuel use in our homes, we’ve got options, and plenty of reasons to exercise them. Fossil fuels’ days are numbered. Accelerating that phase-out is in our hands… and looking better all the time.

What’s for Dinner? A Preview of the People, Process, and Politics Updating Federal Dietary Guidelines

Photo: grobery/CC BY SA 2.0 (Flickr)

Months behind schedule, two federal departments have officially kicked off the process for writing the 2020-2025 iteration of the Dietary Guidelines for Americans. Updated and reissued every five years, these guidelines are the nation’s most comprehensive and authoritative set of nutrition recommendations. And although the process is meant to be science-based and support population health—and has historically done so, with some notable exceptions—there are plenty of reasons to believe that the Trump administration is preparing to pitch a few curveballs.

First, a little background: The two agencies responsible for issuing the guidelines are the US Department of Agriculture (USDA) and Department of Health and Human Services (HHS). Earlier this month, the agencies released a call for nominations to the advisory committee that will review current nutrition science and write recommendations for the new guidelines. For the first time, the guidelines will include recommendations for maternal nutrition and for infants and toddlers through 24 months—meaning we may see a larger advisory committee and some extra work put into developing these recommendations from scratch.

And that won’t be the only change since the last cycle. There was a bitter political battle over the 2015-2020 Dietary Guidelines, in which the advisory committee made mention of environmental sustainability, noting that plant-based diets that include plenty of foods like fruits, vegetables, and whole grains are good for both our health and the future of our food supply. These recommendations were ultimately omitted, and the episode culminated in Congress writing new legislation to limit the scope of the guidelines and mandate a so-called critical review of their scientific integrity. The full impact of this anti-science legislation, which was tacked onto a 2016 appropriations bill (despite strong opposition from public health and nutrition groups), will be brought to bear during the coming months.

All that said, there’s one thing that’s likely to remain the same: the industries that wielded influence over the 2015-2020 Guidelines haven’t gone anywhere. On the contrary, they may be emboldened by an administration that has repeatedly given preference to corporate interests, sidelining science and sacrificing the public good in the process.

The People: What will become of the Scientific Advisory Committee in the Trump era?

Typically, the first major step in developing new Dietary Guidelines is to identify the group of nutrition and health experts who will form the Dietary Guidelines Advisory Committee (or DGAC). These nominees will be well-known in their fields, and will each bring more than a decade of experience as medical or nutrition researchers, academics, and practitioners. Members of the DGAC serve on the committee for two years, after which they submit a final scientific report to the USDA and HHS with their recommendations.

This part of the process is happening in real-time. The 30-day call for nominations is now open and will close on October 6. (Read more about the criteria for nominees here.)

Photo: USDA

But the negligence the Trump administration has shown in maintaining existing scientific advisory committees is concerning, to say the least. An analysis by my colleagues here at the Union of Concerned Scientists shows that, during the administration’s first year in office, federal science advisory committees met less frequently than in any other year since 1997, when the government began tracking this data. A majority of the committees are meeting less than their charters require, and committee membership has also decreased—with some agencies disbanding entire advisory committees altogether.

Furthermore, what happens after the public submits nominations to the DGAC happens largely behind closed doors. Nominations will be reviewed by USDA and HHS program staff, and the slate of chosen nominees will be evaluated and vetted internally. Formal recommendations for the committee will then be reviewed and approved by the USDA and HHS secretaries. Per their most recent communication, the agencies hope to announce the 2020-2025 DGAC by early next year.

If you’re thinking that the committee selection lacks a certain element of transparency, you’re not the only one.

In one of two reports released last year examining the Dietary Guidelines process (the result of the aforementioned legislation, passed as a rider to the 2016 appropriations bill), the National Academy of Medicine recommended that the public have the opportunity to review the provisional committee for bias and conflicts of interest before it’s approved.

It’s worth repeating that the selection of committees in recent DGA cycles has successfully brought a wealth of knowledge and expertise to the process—resulting, for the most part, in strong evidence-based recommendations. But in an administration where the “D” in USDA has come to stand for DowDuPont, concerns about undue influence on the committee selection may be well warranted. (See “The Politics” below.)

The Process: More to do, and twice as fast

After the advisory committee is appointed, the committee begins to review the current body of nutritional science to generate its recommendations. The recommendations are based on a “preponderance of scientific evidence,” which means they consider a variety of research and study designs. (Though randomized controlled trials are typically the gold standard in science, this type of study is incredibly difficult to do with diet.)

The committee won’t review everything—there are certain topics that are selected each cycle, based on what new evidence has emerged and what issues are of greatest concern to public health. And here’s the first place you’ll see the 2020-2025 DGAs break from tradition: rather than identifying topics of interest after the committee is selected, USDA and HHS have developed a list of topics first, soliciting public comments in the process. You can read their list here.

There are some glaring absences in the topic list, including fruits, vegetables, and whole grains—some of the staples of what we consider a healthy diet. This may just mean that the committee won’t be revisiting these topics, and will instead default to existing recommendations—but the lack of clarity here is disconcerting. A brief note at the end of the topic list, perhaps meant to explain the omissions, has left public health and nutrition groups scratching their heads: “Some topics are not included above because they are addressed in existing evidence-based Federal guidance. In an effort to avoid duplication with other Federal efforts, it is expected that these topics will be reflected in the 2020-2025 Dietary Guidelines by referencing the existing guidance. Thus, these topics do not require a review of the evidence by the 2020 Dietary Guidelines Advisory Committee.”

Photo: USDA

Meanwhile, the topics that have been explicitly named include added sugars; beverages, such as dairy, sugar-sweetened beverages, and alcohol; the relationship between certain diets (think: Mediterranean Diet, vegetarian, etc.) and chronic disease; and different dietary patterns across life stages, including infancy and toddlerhood through 24 months. What didn’t make the cut? Any mention of red meat or processed meats—which have been linked to certain types of cancer and other health risks. The agencies (predictably) sidestepped this issue, making reference only to types of dietary fats.

If this sounds like a lot to sort through, it will be. And the tentative timeline that the agencies have proposed is ambitious. After the committee is announced in early 2019, it will have just over one year to deliberate before releasing its scientific report. During that time, the committee will hold approximately five public meetings (last cycle, there were seven) and offer an extended period of open public comment. After the DGAC scientific report is released, the public will also have one final opportunity to comment.

But if there’s anything we learned from the last DGA cycle, it’s that what can happen during that gap—between the release of the DGAC scientific report and the issuance of the DGAs—is critical, and it isn’t always clear. Enter “The Politics.”

The Politics: When money talks

What happened during the 2015-2020 DGA cycle?

The DGAC advisory report, submitted in February 2015, included recommendations for plant-based diets that supported both human health and environmental sustainability—an unprecedented move. Per the report: “A diet higher in plant-based foods, such as vegetables, fruits, whole grains, legumes, nuts, and seeds, and lower in calories and animal-based foods is more health promoting and is associated with less environmental impact than is the current U.S. diet.”

But eight months later, the writing was on the proverbial wall, in the form of a blog written by former USDA Secretary Vilsack and HHS Secretary Burwell: sustainability, they declared, was outside the scope of the DGAs and would not be included.

Two months after that, the 2016 appropriations bill was passed, requiring that any revisions to the Dietary Guidelines for Americans be limited in scope to nutritional and dietary information.

By all appearances, the key concern seemed to be that science-based sustainability recommendations were outside the scope of the DGAs. But you don’t have to read too far between the lines to see that many were more concerned about sales—as in, sales of foods that aren’t central to a plant-based diet. Like, for example, meat and dairy.

At a Congressional hearing on the matter, Rep. Mike Conaway, current chair of the House Agriculture Committee, put it this way: “[the inclusion of sustainability] could result in misguided recommendations that could have ill effects on consumer habits and agricultural production.”

Rep. Glenn Thompson, current chair of the House Agriculture Subcommittee on Nutrition, put a finer point on his interests: “What can we do to remove policies that hinder milk consumption, and to promote policies that could enhance milk consumption?”

It’s hardly a stretch to imagine that what happened during the 2015-2020 DGA cycle—and to the advisory committee’s recommendations that were seemingly lost in translation—was a direct product of industry influence.

And though efforts to communicate the science behind more sustainable, plant-based diets have been all but stymied, there is still plenty at stake for industry groups in the 2020-2025 DGA cycle. Expect to see some of the usual suspects make an appearance, including the meat industry, dairy industry, and sugar-sweetened beverage associations, as well as formula companies, which will have a vested interest in shaping the new recommendations for infants and toddlers. (This may be happening in real-time, too. Just this spring, Gerber announced it would join its parent company, Nestle, at its headquarters in Rosslyn, Virginia—just a stone’s throw from the capital.)

As this process unfolds, the Union of Concerned Scientists will be there—watchdogging and waiting. Stay tuned to learn more about how you can help us stand up for science and make the 2020-2025 Dietary Guidelines for Americans the strongest, most health-promoting edition yet.

Photo: grobery/CC BY SA 2.0 (Flickr)

Even More Than 100% Clean: California’s Audacious Net-Zero Carbon Challenge

Governor Brown signing SB 100 into law. Photo: Governor's Office

At the end of a summer marked by dramatically destructive natural disasters, including massive fires throughout the western U.S., killer heat waves, fires, and floods in Asia and Europe, and now Hurricane Florence bearing down on the Carolinas, California is offering a ray of hope for a planet facing increasingly terrible impacts from global warming. Governor Jerry Brown has convened an international climate summit in San Francisco that demonstrates just how many jurisdictions across the nation and around the world, along with businesses and industries, religious groups, climate justice advocates, and a lot of scientists, among many others, are working hard for climate action.

Brown began the week by demonstrating that California is not resting on its impressive climate action laurels but significantly increasing its commitment to reducing emissions. I was lucky enough to attend the ceremony where Governor Brown signed SB 100, a bill that had taken its legislative author, State Senator Kevin de León (whose legislative tenure has been distinguished by successfully championing historic clean energy and climate action), a grueling two years to pass the state legislature.  SB 100 commits California to 60% renewable energy by 2030 (up from the current 50% requirement) and a goal of fully 100% clean electricity by 2045.  While we have much work to do to achieve this goal, we are now committed to a path toward a fully decarbonized electricity system.  I was proud to represent UCS’s incredible staff who made uniquely valuable contributions to this coalition effort to get the bill passed.

In a remarkable piece by NPR, UCS energy analyst Laura Wisland talked about the many challenges that remain for California to achieve this goal, but it is clearly something we can do. Thanks to over 15 years of previous renewable electricity and energy efficiency policies, electricity emissions now account for a relatively low 16% of California’s greenhouse gas (GHG) inventory. But the last emissions reductions will be the hardest to achieve. We will need to grapple with how to reduce the vast amount of natural gas used to generate electricity and make room for cleaner, carbon-free sources of energy. We also face a big challenge in cutting transportation and industrial emissions to meet the state’s ambitious 2030 GHG reduction limits. But in the last twelve years California has shown that it can succeed, ahead of schedule, in meeting carbon reduction goals while growing the state economy from the eighth to the fifth largest in the world, a feat that belies alarmists who say that reducing emissions will damage the economy.

A surprise announcement of huge ambition

Coming at the end of Governor Brown’s remarkable 8-year tenure, the bill signing ceremony this week contained a surprise. With no fanfare or previous signaling of his intentions, Governor Brown included an additional action in this week’s bill signing: a new executive order, B-55-18, that creates an economy-wide carbon neutrality goal for California by 2045. He is directing the state to strive for net zero carbon emissions in less than 30 years. This must be done by combining zero-emission technologies that power the electric grid, transportation, homes, buildings, and industries with other practices and technologies that sequester carbon, or take it out of the atmosphere.

This will require an extraordinary effort that will affect every Californian. The state will not only have to meet its ambitious new 100% clean electricity goal on top of its very ambitious 2030 statewide GHG reduction limit, but also halt nearly all remaining emissions in a mere 15 years. Let’s take a moment to appreciate that goal.

For California to achieve net zero carbon emissions, it will require a staggering change to some of the basic elements driving our economy. California will need to eliminate the single biggest tranche of carbon emissions: those from the vehicles that transport people and goods throughout the state. This means electric vehicles powered by our clean electric grid for nearly everyone. It will require carbon-free fuels for industrial processes, hugely advanced efficiency in buildings and appliances, and likely the development of truly reliable ways of storing carbon in plants, soils, and geologic formations. As yet, no one has developed a real plan for how we could get to net zero emissions, and to do this successfully is a very tall order. David Roberts at Vox wrote this early analysis of the ins and outs of what could happen, including a warning that implementing net zero could lean on measures and policies that are more symbolic than substantive.

But here’s the thing. Once the Governor of the nation’s most populous state, a serious and credible leader with a remarkably successful tenure over eight years, has made carbon neutrality the goal, a lot of people may start to take it seriously enough to figure out how to meet it. What Brown has done is to challenge us to think about what it would really take to get carbon emissions down fast enough and thoroughly enough to reduce the risks we are seeing multiply so rapidly, and to help ensure a viable future, within three decades. It is a huge undertaking.

A vision for the future, both audacious and necessary

A few cynics may argue that this is a non-binding announcement to boost the Governor’s visibility for the Global Climate Action Summit he is hosting in San Francisco this week, but I believe that this order could have tremendous value. Executive orders in California are part of state policy, even if they do not have the force of law. And whatever mix of reasons Brown had for doing this, he is sending a strong science-based message to the world: namely, that we need to reduce global warming pollution much further and faster than we previously thought to avoid the worst impacts of climate change. It will be up to the next governor and the legislature to carry out Brown’s order. Luckily, California has good examples of turning executive orders into law.

So here is my unsolicited advice to California’s next Governor and the Legislature starting in 2019: 1) do the hard work of ensuring we get to 100% clean energy by 2045; 2) start now on fully implementing new regulations and laws that will rapidly take the carbon out of our transportation and industrial sectors to ensure we meet our 2030 goal of reducing GHGs to 40% below 1990 levels; 3) get our best minds in science and technology to work together to produce an economically sound blueprint that would get us to net zero by 2045; and 4) start to implement the next generation of policies that would get us to net zero.

California has shown that a world-class economy can grow while rapidly reducing its carbon emissions. While we can’t guarantee that further decarbonization will go as smoothly, the ongoing, accelerating, costly, and deadly destabilization of our climate is much too urgent a matter to approach timidly because of our fears. We owe a great deal to the leadership of people like State Senator Kevin de León and Governor Jerry Brown, who have shown what is possible. Neither will be serving as leaders in Sacramento after this year. It is now up to us to take their bold leadership and ensure that we succeed in making their visions real, and help provide examples and lessons to the nation and the world on how it can be done.

The EPA Can’t Stop Polluters When the Trump Administration Cuts Enforcement Staff

The primary task of the US Environmental Protection Agency is to protect public health and the environment. To do so, the agency must ensure that everyone, whether in the private sector or in government, complies with our nation’s laws and regulations. These safeguards are in place to protect health and safety for everyone, anywhere in the country. Their enforcement is also a matter of fairness—all entities that might adversely impact our health and environment are supposed to follow the rules. So it is particularly disturbing that the EPA Office of Enforcement and Compliance Assurance (OECA) has taken a major hit in staffing over the past 19 months of the Trump administration.

Here at UCS, we filed a Freedom of Information Act request to help us identify changes in the number of EPA staff working on enforcement and compliance. It took a while to get the answer, but the overall results are even worse than we suspected: In EPA headquarters, at least 73 OECA staff left the office and only 4 were hired between the start of the Trump administration and late July 2018. Among those 73 departures were 17 environmental protection specialists and at least 10 scientists or engineers. The scant hires include Assistant Administrator Susan Bodine, a former lobbyist, and Deputy Assistant Administrator Patrick Traylor, a lawyer who previously defended the Koch brothers, among other industry clients.

EPA has also lost further enforcement staff (not included in the OECA list) at the regional level. Region 5, for example, lost five employees in its enforcement support section, including three investigators, while Region 7 lost several employees in its enforcement coordination office.

Those departure numbers are BIG. They mean that many fewer people are out there ensuring that pollution is monitored and that polluters are living up to their responsibilities under the law. In addition to reductions in staff focused on pollution prevention, they also mean reductions among staff who work on environmental cleanup, such as at Superfund sites.

There is also a critically low number of criminal investigators working for the EPA. Even though the law requires the agency to have at least 200 “special agents,” there are only 140 on staff, according to Public Employees for Environmental Responsibility.

This goes along with big reductions in EPA staff across all offices, as reported in the Washington Post over this past weekend. According to the Post, at least 1,600 staff have left EPA since January 2017, with fewer than 400 new employees hired. UCS’s own data shows that at least 670 of the losses have been at the 10 regional offices (with just 73 hires to offset them). Notably, when EPA has been hiring, it generally hasn’t been hiring scientists. According to a December 2017 New York Times piece, the administrator’s office was the only unit to have more hires than departures that year, adding 73 new employees while losing only 53 staff.

My colleague Kathleen Rest and I warned about the dangers of such staff attrition at the beginning of the year. In both those articles, there were warnings from former EPA staff, aligning with our own government experience, that cutting off new hiring sends all the wrong signals to young professionals about opportunities to spend part of their careers in public service—while also threatening the capacity of our federal agencies to address current and future risks. From the statistics we obtained, even the number of student trainees in the regional offices has been slashed, with only five hired (all in Region 5, the upper Midwest) but 48 lost from the other regional offices.

Our survey of federal scientists confirmed that morale is desperately low, and that many offices no longer feel they have the staff to do their jobs.

These staff reductions fly in the face of Congressional action that appropriated funds for the EPA to maintain these programs. Indeed, it seems that around the country, the Trump Administration has gone ahead and made cutbacks not only in enforcement, but also in areas such as the Chesapeake Bay Program, the Great Lakes Program, the Gulf of Mexico program, and others—despite the fact that Congress provided the funding for them.

What does all this mean? It means the Environmental Protection Agency has taken a step backwards on protecting health and the environment for workers, communities, families, and for you. New agency directions call for changing the priorities for enforcement and for dropping a special focus on oil and gas extraction and concentrated animal feeding operations, because the agency says issues with these industries have been largely resolved. Seriously. That will be news to neighbors of drilling sites, pipelines, and animal waste disposal sites.

And the leadership of the EPA wants to turn away from enforcement overall, toward encouraging compliance through voluntary measures and “compliance assistance.” Pardon our skepticism here.

And so it goes with what seems to be an ongoing industry takeover of our premier public health agency. First they roll back regulations, then they roll back enforcement so there are fewer consequences for those who put the public’s health at risk, and then they reduce the professional staff so new rules can’t be put back in place.

It is time to stand up and say STOP. Enough is enough. We need the EPA. And that means we need it to be a vibrant, well-staffed, professional agency. Not a political punching bag or a piñata of goodies for regulated industry.


Photo: EPA

Mr. President, More Than 3,000 Deaths Is Not an “Incredible, Unsung Success”

Photo: Juan Declet-Barreto

Last year, I thought throwing rolls of paper towels at victims of Hurricane María in Puerto Rico was the lowest that President Trump could go in disrespecting and failing the people of Puerto Rico in the midst of the climatic catastrophe that was personal to me and my family on the island.

But this morning he went even lower with his tweets denying the death toll from Hurricane María in Puerto Rico, adding insult to injury in an enormous disaster exacerbated by a failure to prepare and to help the island recover. The day before that, in his characteristic self-congratulatory tone, he touted his administration’s handling of Hurricane María as an “incredible, unsung success.” The dead don’t sing, Mr. President. The aftermath of the storm left Puerto Rico without power for months, unleashed a humanitarian crisis for more than 3 million US citizens, and was responsible for more than 3,000 deaths.

Once again, Mr. Trump has shown callous disregard for human life, minimizing the toll of human suffering during and after the hurricane. The President’s falsehoods are rebutted by scientific reports presenting evidence that the lack of electricity to power hospitals and medical equipment and to refrigerate insulin, combined with a collapsed public health system and inadequate protocols for ascribing deaths to post-hurricane conditions, contributed to the deaths of the more than 3,000 Puerto Ricans estimated to have lost their lives because of Hurricane María (I previously reported on that here).

Besides being an affront to human dignity, the President’s statements and tweets also show a callous disregard for the truth and for the importance of accurate information in our democracy. Although this administration has had a pattern of attacking or ignoring science since day one, this is one of the most extreme examples: we can’t expect to be able to solve problems if our leaders choose to deny facts and attack the evidence.

Sobering fact: Just two days ago, a whopping nine simultaneous tropical storms were shown in a satellite image composite. As my colleague Kristy Dahl recently reported, we already know that the expected storm surge from storms like Hurricane Florence will be amplified because of sea-level rise, that our atmosphere can hold more moisture, and that the potential for extreme rainfall during hurricanes is increased. It is also expected that coastal, rural, and low-income communities in the Carolinas and Virginia will be among the hardest hit by Florence.

I watch with worry as Florence continues to barrel toward the US southeastern coast and Typhoon Mangkhut in the Pacific threatens 10 million people in the Philippines. I wonder how the President’s tweeted falsehoods will impact our federal agencies’ capacity to respond to what is likely to be a disastrous situation for millions of Americans. María taught us many things, among them that when the President uses Twitter to minimize disasters, ignore science, and disparage people in harm’s way, it effectively lowers the urgency with which federal agencies respond to such disasters.

As a new hurricane season threatens the US, we need to be able to trust that the government is willing and able to acknowledge the facts and act to protect us. President Trump’s offensive and false tweets today undermine public trust.

 

Juan Declet-Barreto

The Price of Large-Scale Solar Keeps Dropping

Photo: NREL

The latest annual report on large-scale solar in the U.S. shows that prices continue to drop. Solar keeps becoming more irresistible.

The report, from Lawrence Berkeley National Laboratory (LBNL) and the US Department of Energy’s Solar Energy Technologies Office, is the sixth annual release about the progress of “utility-scale” solar. For these purposes, they generally define “utility-scale” as at least 5 megawatts (three orders of magnitude larger than a typical residential rooftop solar system). And “solar” means mostly photovoltaic (PV), not concentrating solar power (CSP), since PV is where most of the action is these days.

Here’s what the spread of large-scale solar looks like:

Source: Bolinger and Seel, LBNL, 2018

In all, 33 states had solar in the 5-MW-and-up range in 2017—four more than had it at the end of 2016. [For a cool look at how that map has changed over time, 2010 to 2017, check out this LBNL graphic on PV additions.]

Watch for falling prices

Fueling—and being fueled by—that growth are the reductions in costs for large-scale projects. Here’s a look at power purchase agreements (PPAs), long-term agreements for selling/buying power from particular projects, over the last dozen years:

Source: Bolinger and Seel, LBNL, 2018

And here’s a zoom-in on the last few years, broken out by region:

Source: Bolinger and Seel, LBNL, 2018

While those graphs show single, “levelized” prices, PPAs are long-term agreements, and what happens over the terms of the agreements is worth considering. One of the great things about solar and other fuel-free electricity options is that developers can have a really good long-term perspective on future costs: no fuel = no fuel-induced cost variability. That means they can offer steady prices out as far as the customer eye can see.

And, says LBNL, solar developers have indeed done that:

Roughly two-thirds of the contracts in the PPA sample feature pricing that does not escalate in nominal dollars over the life of the contract—which means that pricing actually declines over time in real dollar terms.

Imagine that: cheaper over time. Trying that with a natural gas power plant would be a good way to end up on the losing side of the contract—or to never get the project financed in the first place.
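To put rough numbers on that, here’s a quick sketch. The $40-per-MWh starting price and 2% inflation rate are my own illustrative assumptions, not figures from the LBNL report:

  # Illustrative only: a fixed nominal PPA price, deflated to today's dollars.
  # The $40/MWh price and 2% inflation rate are assumptions, not report data.

  NOMINAL_PRICE = 40.0   # $/MWh, fixed for the life of the contract
  INFLATION = 0.02       # assumed average annual inflation

  for year in (0, 5, 10, 15, 20):
      real_price = NOMINAL_PRICE / (1 + INFLATION) ** year
      print(f"Year {year:2d}: ${NOMINAL_PRICE:.2f}/MWh nominal = ${real_price:.2f}/MWh in today's dollars")

At 2% inflation, a flat nominal price ends up roughly a third lower in real terms by year 20.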

Here’s what that fuel-free solar steadiness can get you over time, in real terms:

Source: Bolinger and Seel, LBNL, 2018

What’s behind the PPA prices

So where might those PPA price trends be coming from? Here are some of the factors to consider:

Equipment costs. Solar equipment costs less than it used to—a lot less. PPAs are expressed in cost per unit of electricity (dollars per megawatt-hour, or MWh, say), but solar panels are sold based on cost per unit of capacity ($ per watt). And that particular measure for project prices as a whole also shows impressive progress. Prices dropped 15% just from 2016 to 2017, and were down 60% from 2010 levels.

Source: Bolinger and Seel, LBNL, 2018
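For anyone wondering how a capacity-based price ($ per watt) relates to an energy-based price like a PPA ($ per MWh), here’s a very rough bridge between the two. It ignores financing, operations and maintenance, degradation, and the tax credit, and the inputs (a $1.00-per-watt project cost, 25% capacity factor, 25-year life) are my own assumptions rather than anything from the report, so treat it as an intuition-builder, not an LCOE calculation:

  # A very rough bridge from $/W (capacity) to $/MWh (energy), ignoring financing,
  # O&M, degradation, and tax credits. All inputs are assumptions for illustration.

  COST_PER_WATT = 1.00      # assumed installed cost, $ per watt of capacity
  CAPACITY_FACTOR = 0.25    # assumed average fraction of full output over the year
  LIFETIME_YEARS = 25       # assumed project life
  HOURS_PER_YEAR = 8760

  # Lifetime energy from one watt of capacity, in MWh
  lifetime_mwh = 1.0 * CAPACITY_FACTOR * HOURS_PER_YEAR * LIFETIME_YEARS / 1e6

  rough_cost_per_mwh = COST_PER_WATT / lifetime_mwh

  print(f"Lifetime output per watt: {lifetime_mwh * 1e6:,.0f} Wh")
  print(f"Very rough cost: ${rough_cost_per_mwh:.0f}/MWh")

Real-world PPA prices come in higher than that simple number because projects also have to cover financing, operations, land, and margins, but the capacity factor is what links the two metrics.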

The federal investment tax credit (30%) is a factor in how cheap solar is, and has helped propel the incredible increases in scale that have helped bring down costs. But since that ITC has been in the picture over that whole period, it’s not directly a factor in the price drop.

Project economies of scale. Bigger projects should be cheaper, right? Surprisingly, LBNL’s analysis suggests that, even if projects are getting larger (which isn’t clear from the data), economies of scale aren’t a big factor, once you get above a certain size. Permitting and other challenges at the larger scale, they suggest, “may outweigh any benefits from economies of scale in terms of the effect on the PPA price.”

Solar resource. Having more of the solar happen in sunnier places would explain the price drop—more sun means more electrons per solar panel—but sunnier climes are not where large-scale solar’s growth has taken it. While a lot of the growth has been in California and the Southwest, LBNL says, “large-scale PV projects have been increasingly deployed in less-sunny areas as well.” In fact:

In 2017, for the first time in the history of the U.S. market, the rest of the country (outside of California and the Southwest) accounted for the lion’s share—70%—of all new utility-scale PV capacity additions.

The Southeast, though late to the solar party, has embraced it in a big way, and accounted for 40% of new large-scale solar in 2017. Texas solar was another 17%.

But Idaho and Oregon were also notable, and Michigan was one of the four new states (along with Mississippi, Missouri, and Oklahoma) in the large-scale solar club. (And, as a former resident of the great state of Michigan, I can attest that the skies aren’t always blue there—even if it actually has more solar potential than you might think.)

Capacity factors. More sun isn’t the only way to get more electrons. Projects these days are increasingly likely to use solar trackers, which let the solar panels tilt to face the sun directly over the course of the day; 80% of the new capacity in 2017 used tracking, says LBNL. Thanks to those trackers, capacity factors have remained steady in recent years even with the growth in less-sunny locales.
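For a refresher on what that metric means, capacity factor is just actual generation divided by what a project would produce running flat-out all year. A tiny sketch, with made-up numbers:

  # Capacity factor = actual generation / (nameplate capacity x hours in the period).
  # The 100 MW plant and 219,000 MWh of annual output below are made-up numbers.

  def capacity_factor(generation_mwh, nameplate_mw, hours=8760):
      """Fraction of the period's maximum possible output that was actually generated."""
      return generation_mwh / (nameplate_mw * hours)

  cf = capacity_factor(generation_mwh=219_000, nameplate_mw=100)
  print(f"Capacity factor: {cf:.0%}")   # 25%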

What to watch for

This report looks at large-scale solar’s progress through the early part of 2018. But here are a few things to consider as we travel through the rest of 2018, and beyond:

  • The Trump solar tariffs, which could be expected to raise costs for solar developers, wouldn’t have kicked in in time to show up in this analysis (though anticipation of presidential action did stir things up even before the tariff hammer came down). Whether that signal will clearly show in later data will depend on how much solar product got into the U.S. ahead of the tariffs. Some changes in China’s solar policies are likely to depress panel prices, too.
  • The wholesale value of large-scale solar declines as more solar comes online in a given region (a lot of solar in the middle of the day means each MWh isn’t worth as much). That’s mostly an issue only in California at this point, but something to watch as other states get up to high levels of solar penetration.
  • The investment tax credit, because of a 2015 extension and some favorable IRS guidance, will be available to most projects that get installed by 2023 (even with a scheduled phase-down). Even then it’ll drop down to 10% for large-scale projects, not go away completely.
  • Then there’s energy storage. While the new report doesn’t focus on the solar+storage approach, that second graphic above handily points out the contracts that include batteries. And the authors note that adding batteries doesn’t knock things completely out of whack (“The incremental cost of storage does not seem prohibitive.”).

And, if my math is correct, having 33 states with large-scale solar leaves 17 without. So another thing to watch is who’s next, and where else growth will happen.

Many of the missing states are in the Great Plains, where the wind resource means customers have another fabulous renewable energy option to draw on. But solar makes a great complement to wind. And the wind-related tax credit is phasing out more quickly than the solar ITC, meaning the relative economics will shift in solar’s favor.

Meanwhile, play around with the visualizations connected with the new release (available at the bottom of the report’s landing page), on solar capacity, generation, prices, and more, and revel in solar’s progress.

Large-scale solar is an increasingly important piece of how we’re decarbonizing our economy, and the information in this new report is a solid testament to that piece of the clean energy revolution.

Photo: NREL

Zombie Truck Theater: A House Science Committee Hearing

The issue of glider trucks, new truck bodies with old polluting engines, has come up in Congress yet again.  This time, it moves over to the House Science Committee, a place where Chairman Lamar Smith tends to hang science (and sometimes scientists) out to dry.

If the Science Committee was, well, different, this hearing would be an opportunity to examine the scientific facts underlying the issue of allowing unlimited glider trucks on our nation’s roads, facts which clearly show that these vehicles are dangerous to public health.  Instead, I expect it to focus on the false narrative that political leadership at EPA, glider manufacturer Fitzgerald, and more recently, Steve Milloy, founder of the climate and science-skeptic blog Junk Science, have been putting forward – that glider trucks help small businesses and are no more polluting than new, more expensive, trucks.

The witness list shows that this hearing is just meant to be legislative theater for the Chairman. The Republicans have invited the trade association of the independent truckers (OOIDA, basically the only mainstream industry group that has always supported the glider rule repeal) and Dr. Richard Belzer, an economist for hire who was paid by Fitzgerald to write a “straw Regulatory Impact Analysis” that was submitted to the agency; he will undoubtedly parrot their talking points in the hearing. The final witness for the majority is Linda Tsang from the Congressional Research Service, which I like to call the library for Congress: a non-partisan research and analysis arm for Congress (let’s hear it for librarians!). This is an interesting pick, as she has not written anything publicly available on gliders, so it’s unclear what her specific expertise will be. The minority (Democrats) were allowed to invite one witness: Dr. Paul Miller, the Deputy Director and Chief Scientist of the Northeast States for Coordinated Air Use Management (NESCAUM), a coordinating body for air quality regulators in the northeast. Note that Paul is the only scientist who was asked to testify at this hearing.

What is this hearing about?

Good question, and it’s one we have been asking ourselves since we first heard about the hearing.  We have a couple of hypotheses:

  1. Chairman Smith has routinely used his position to give a stage to industry interests and fringe perspectives that align with his, and now this administration’s deregulatory agenda. Fitzgerald is just the latest actor to somehow curry favor and use the Committee to relitigate environmental protections.
  2. This hearing is really about undermining the science done at EPA and gives the Republicans a stage to question EPA’s methodical testing of glider trucks (please note, however, that no witnesses from EPA were invited to testify).
  3. All of the above.

Isn’t this a regulatory issue? Why is Congress getting involved at this point?

Congress has been playing in the glider vehicle space for a little while, but things have really heated up recently. One reason for this is a recent letter led by Rep. Bill Posey (R-FL and member of the Science Committee), who reiterated many of the same talking points OOIDA and Fitzgerald have used in pressing for an exemption to environmental protections for these dirty trucks. Another is that Steve Milloy, an industry shill and longtime opponent of regulation, has been combing over emails sent between agency officials and outside parties about EPA’s testing of glider vehicles last fall, attempting to make mountains out of molehills in his quest for deregulation.

If you’ve read previous UCS blogs on gliders, you may remember that when then-Administrator Pruitt began the process of repealing the rule that limits glider truck production at the behest of Fitzgerald, there was a (now discredited and withdrawn) “study” done by not-scientists at Tennessee Technological University (TTU) that was bought and paid for by Fitzgerald. EPA documented the issues with the TTU study and also did its own study of the emissions from in-use glider trucks (glider trucks that have been on the road a while). EPA doesn’t have tractor trailers just sitting around to test, nor does it have the budget to buy a bunch of them, so when it needs tractors to test, it typically borrows them while it puts them through their paces. This time, Volvo helped the agency procure some gliders to test, and there is a mad conspiracy theory out there that Volvo influenced the results because it helped find the trucks for EPA to test.

Several Congressmen have latched onto this story line and sent letters to the Office of the Inspector General (OIG) asking them to open an investigation into the procurement of the trucks (the OIG recently said they would start an audit, not a full investigation).  In addition, Chairman Smith has now sent a couple of letters requesting specific correspondence between EPA and Volvo and calling into question the “scientific integrity and validity” of the EPA, despite the fact that a top Trump appointee has already notified Chairman Smith that he doesn’t see any untoward influence in the study and that it was conducted according to standard lab practices. Furthermore, as we have already pointed out, the EPA study merely confirms the obvious: these trucks pollute like crazy.

Attacking an empty chair

Tomorrow, I expect we will see some reprise of the Congressional letters to the agency play out. Unfortunately, each Science Committee member will be the center of their own one-man show, since they are seeking no input from the agency itself. If the committee were interested in actual oversight and upholding its constitutional role, the hearing would focus on the merits of the testing and allow the agency an opportunity to detail the methodology and rigor of the testing protocol, which shows how deadly glider trucks are. Instead, this will be another chance for Science Committee members to showboat and delegitimize the critically important and lifesaving science done by career staff at EPA. Unfortunately, conducting “oversight” without the agency present isn’t a new play for the Science Committee.

In his quest to find fire where there is no smoke, Chairman Smith will once again discover that no bogeyman exists. Despite his efforts to reanimate this issue, citizens, scientists, lawmakers, and businesses know zombie trucks should not be operating on our roads and polluting our air. The science is stating the obvious here…if only the Science Committee were interested in listening to it.

Public Domain

The Hidden Dangers of Hurricane Florence: Catastrophic Storm Surge and Inland Flooding Threaten Rural and Low-Income Communities

The National Guard carrying out evacuations.

English version > 

Over the past several days, Hurricane Florence has rapidly intensified. It remains on a direct track toward North Carolina as a Category 4 storm. This storm is particularly dangerous given the forecast of heavy, persistent rains that threaten not only coastal areas but also inland communities.

A coastal emergency exacerbated by inland flooding in the Carolinas

This hurricane season was forecast to be below normal or near normal. Any complacency that idea may have generated has quickly evaporated. Two days ago, the National Hurricane Center was monitoring at least three storms in the Atlantic Ocean and issued additional advisories for the Pacific. It takes only one major hurricane making landfall, however, to turn this into a terrible hurricane season; on the other side of the world, mega-typhoon Mangkhut threatens the Philippines, Taiwan, and China after passing over Guam.

Coastal states in the Southeast and mid-Atlantic regions of the United States are preparing for Hurricane Florence. Evacuation orders are in place for more than a million people in the Carolinas, Virginia, and Maryland.

The Navy took the precaution of moving its ships out to sea from the naval base in Norfolk. Duke Energy is preparing for impacts to the electric grid in the Carolinas and is readying emergency crews to restore power once the storm passes. In a press release, Duke Energy warned of likely power outages lasting days or even weeks. The company has said the impacts could exceed those of Hurricane Matthew, which knocked out power for 1.5 million Duke customers and cost $125 million in repairs.

The most terrifying aspects of this storm are the storm surge and heavy rains that will accompany it. Forecasts indicate the storm could slow down, creating a multi-day extreme precipitation event similar to what Houston residents lived through during Hurricane Harvey last year, and from which they still have not recovered.

The latest advisory from the National Hurricane Center indicates that if the peak storm surge coincides with high tide, areas from Cape Fear to Cape Lookout, including the Neuse and Pamlico rivers, could see surge of 1.8 to 3.6 meters. Coastal areas of North Carolina and Virginia, including low-lying coastal areas, could see surge of 0.6 to 2.4 meters.

The forecast is also alarming because it contemplates the possibility of an additional 15 to 20 inches of rain. Some areas of the Carolinas and Virginia are forecast to receive nearly 30 inches of rain through Saturday. Depending on the storm’s track, places as far away as West Virginia could feel the impacts of rain and flooding in the coming days.

Unfortunately, over the past few weeks much of the Southeast and mid-Atlantic region, including North Carolina, Virginia, and Washington, DC, has experienced higher-than-normal rainfall, which has left soils saturated with water. With the amount of rain forecast, inland communities in these states are very likely to suffer catastrophic flooding.

Potential impacts of the storm

A storm of this magnitude will undoubtedly cause great damage. Let us hope that the early warnings and the preparations underway prevent the loss of life.

Preliminary analyses from CoreLogic show that nearly 759,000 homes in the Carolinas and Virginia, with reconstruction costs valued at more than $170 billion, lie directly in the path of Hurricane Florence if it makes landfall as a Category 4 storm.

En comunidades rurales de Carolina del Norte, experiencias previas con tormentas muestra que la inundación puede causar que los desechos de las granjas de cerdos se desborden, contaminando ríos y arroyos. También puede causar que los estanques de cenizas de carbón (residuos de las plantas de carbón) derramen contaminantes tóxicos (¡y en el pasado lo han hecho!). Las plantas de tratamiento de aguas residuales pueden desbordarse, contaminando los reservorios de agua subterráneos de los cuales dependen muchas comunidades rurales para recibir agua potable.

Como trágicamente ocurrió durante los huracanes Irma y María del año pasado, sino son trasladados a áreas seguras, la pérdida de energía eléctrica, especialmente durante largos periodos, podría ser fatal para pacientes en hospitales y personas con condiciones médicas.

Las prisiones también tienen que ser evacuadas por su seguridad. Es preocupante ver reportajes que, hasta ahora, Carolina del Sur ha decidido no evacuar la cárcel en el condado de Jasper.  

Disaster preparedness requires advance planning

The emergency response to Hurricane Florence is not a spontaneous, one-day effort. Emergency managers, planners, and utility managers are drawing on extensive experience with previous storms as they prepare for this one. Hurricanes Floyd and Matthew left bitter lessons.

Getting people out of harm's way is the top priority. That is why the highest-risk areas received early evacuation orders. These warnings should be taken very seriously and obeyed.

The storm will test the resilience of critical infrastructure such as roads, bridges, power lines and substations, wastewater treatment plants, stormwater drainage, hospitals, airports, and much more. Smart infrastructure investments will prove their worth in the days ahead; where those investments fell short, the weaknesses will be exposed.

Officials are not waiting for the storm to make landfall or for its track to become clear before acting: they are erring on the side of caution and are already protecting, as best they can, people and critical services. (Incidentally, that is a lesson worth extending to how we think about preparing for the growing risks of climate change.)

What will happen to rural, island, and low-income communities?

The true test of our disaster response systems is not how quickly electric service is restored or air traffic resumes, but how well isolated or marginalized communities are supported in the response.

Disasters expose socioeconomic inequities. For some, fleeing to safety is an impossible expense: hotels cost money, fuel costs money, even access to a car does. Taking days off because of flooded roads or closed schools can mean losing a job. And for those whose wages barely cover basic expenses, buying flood insurance to protect a home or belongings is a luxury.

Low-income communities and communities of color are more likely to live near toxic waste sites, such as coal ash ponds and landfills. And rural communities tend to depend on groundwater supplies.

Island communities, including those along the Atlantic shore of North Carolina and the coastal areas of South Carolina, are on the front lines of this storm. We hope their residents heed the evacuation orders. In some cases they will have to travel long distances to get out of harm's way, given the large area the storm will affect, which creates an additional burden for those with few resources. Rural and island communities could be cut off for many days if their bridges are damaged or their access roads flooded.

The hard road to recovery

Given the forecast, we can expect Florence's impacts to be severe. We hope the communities in the storm's path come through it without loss of life.

But experience teaches us that the return to normal takes time, and it will last long after Florence stops being a headline or a trending topic on Twitter. Communities in Houston and Puerto Rico are still working to return to normal after last year's disastrous hurricane season.

And this emergency raises a series of questions about this country's disaster response system:

  • As a nation, will we use this opportunity to rebuild in a resilient way and take seriously the impacts of sea level rise, the growing intensity of Atlantic storms, and the increase in extreme rainfall events driven in part by climate change?
  • Will Congress and this administration provide adequate funding not only for immediate recovery efforts, but also for long-term resilient recovery, including voluntary home buyouts and relocation out of high-risk areas?
  • With the September 30 deadline for the federal budget approaching, will the budgets of the Federal Emergency Management Agency (FEMA) and the Department of Housing and Urban Development (HUD), so necessary for disaster preparedness and recovery, be protected?
  • Will Congress protect NOAA's budget and mandate so that it continues producing the science we need to predict and prepare for these storms?
  • Will the Environmental Protection Agency (EPA), so compromised under the Trump administration, be able to do its job and quickly identify and remediate toxic contamination from the storm? Or will it put communities at risk, as we saw after Harvey?
  • Will Congress and the states allocate resources so that low-income and otherwise marginalized communities can recover from the storm's aftermath?

The answers to these questions will show whether we have the resolve to face the reality and risks of climate change and extreme weather in a resilient and equitable way, or whether we will choose to ignore that reality and respond to these catastrophes as if they were isolated, one-off events whose burden falls disproportionately on those least able to bear it.

For now, our thoughts are with the millions of people in the path of this storm and with the first responders working so hard to protect them. May they all be safe.

For more information on the local hazards from #Florence, follow the @NWS offices on Twitter: @NWSCharlestonSC @NWSMoreheadCity @NWSRaleigh @NWSWilmingtonNC @NWSColumbia @NWSGSP @NWSWakefieldVA @NWSBlacksburg @NWS_BaltWash @NWS @NWSCharlestonWV pic.twitter.com/LNh3TGvCcd

— National Hurricane Center (@NHC_Atlantic) September 11, 2018

Hurricane Florence Threatens East Coast Electricity Infrastructure

Photo: NASA

Hurricane Florence is bearing down on the Mid-Atlantic, and by every measure it's poised to be an extremely dangerous event: lashing winds, storm surge reaching 9 to 13 feet, and inland flooding from 20 to 30 inches of rain, possibly even 40 inches in select locations. All this will be occurring in an area that has been experiencing above-average precipitation, meaning saturated soils less able to absorb incoming water and trees that are more likely to fall.

Evacuations have been ordered in Virginia and the Carolinas; Virginia, Maryland, Washington, D.C., and North and South Carolina have declared states of emergency; and the Navy has sent dozens of ships out of Norfolk to better weather the storm.

But not all people have the means to leave, and many more are bracing for the water and wind to come—as well as the inevitable power outages that will follow.

That’s because energy infrastructure is significantly at risk in an event such as this. Storm surge has the potential to inundate coastal assets like power plants and substations, and inland flooding from extreme precipitation threatens to submerge many more. In addition, heavy winds can topple trees and take down wires and poles.

These outages could be severe, triggering another and separate disaster long after the skies have cleared. Across the region people should heed warnings and be prepared for widespread, long-lasting blackouts, and stock up on food, water, medicines, and fuel.

Here are some electricity-related things to keep an eye on as the storm approaches, plus updates as needed as we learn more.

Power assets at risk

The U.S. Energy Information Administration provides data on energy infrastructure alongside real-time storm information. Here, coal and nuclear plants in the region are displayed, as of Wednesday morning. See more here.

Power outages can occur due to disruptions at any point in the system, from power plants, to transmission and distribution lines, to the many critical substations enabling power flow in between.

This map from the Energy Information Administration (EIA) displays energy infrastructure and real-time storm information. Here's a clip from Wednesday morning, showing nuclear and coal plants in the region. The map has multiple data layers that can be toggled.

There are 7 nuclear reactors operating in South Carolina and 5 in North Carolina, plus 4 more in Virginia. There are an additional 32 coal generators in the Carolinas alone, plus many natural gas, biomass, and solar facilities, and even one large-scale wind farm.

One plant of immediate concern is Duke Energy’s Brunswick Nuclear Power Station, situated along the North Carolina coast and presently right in the line of the storm. Brunswick is a large, two-reactor nuclear plant. In the aftermath of Fukushima, the Nuclear Regulatory Commission (NRC) conducted a review, and the review at Brunswick found hundreds of missing, degraded, or unverifiable flood seals. The follow-up report is not publicly available, though the company states it has since installed more safety equipment at the plant. What’s more, the NRC recently concluded that reevaluated flood hazards exceed the plant’s current design basis; the stated near-term remedy is to install metal “cliff edge barriers” at targeted locations prior to hurricane landfall.

Nuclear plants must be shut down at least 2 hours before winds of 73-plus miles per hour are expected. EIA tracks nuclear outages here.

We have some further insight into the potential exposure of electricity infrastructure to storm surge in this area: three years ago, my colleagues and I conducted an analysis of this issue at five sites along the East and Gulf coasts, including Charleston and the South Carolina Lowcountry, and Norfolk and southeastern Virginia. Our analysis focused on power plants and substations in particular, recognizing that many of these long-lived pieces of infrastructure, which are centrally important to the power grid, are already exposed to flood risks and will face increasing levels of exposure as seas rise.

In our analysis of the South Carolina Lowcountry, 54 major substations (representing more than 27 percent of the total) and seven power plants in the mapped region could be exposed to flooding from a major storm today, including nearly 90 percent of those (14 out of 16) in Charleston.

These maps are illustrative, intended to show possible points of exposure to a major storm, not a prediction of what will actually occur.

In our analysis of southeastern Virginia, 57 major substations (representing 43 percent of the total) and four power plants in the mapped region could be exposed to flooding from a major storm today, including more than 80 percent of the substations in Norfolk and Hampton.

Critically, although much of this infrastructure is fixed in location, utilities can take proactive measures to reduce vulnerability to exposure such as by elevating assets or equipping them with flood-protection safeguards. Even de-energizing a substation before it floods can make a big difference in repair times, given the comparative calamity of equipment that gets flooded while electricity is still coursing through.

But as we know from past storms, outages are not only related to infrastructure on the ground; they’re commonly caused by downed wires and poles from things like falling trees and flying debris. Vegetation management, select undergrounding, use of smart grid equipment, and grid hardening can all help to limit the scale and duration of outages.

Finally, when utilities anticipate storm damage, they call on others to help. For example, Duke Energy, bracing for outages worse than those sustained from 2016’s Hurricane Matthew, has called in trucks and crews from outside the region to assist with restoration. The addition of crews can substantially speed repairs, and mutual aid is an increasingly important factor in successful outage response.

Outages and communications

But despite planning, investments, and proactive work, it’s still prudent to expect that outages will occur.

And as we’ve seen time after devastating time, the loss of electricity can cascade into a disaster all its own. Given its pivotal role in society—from supporting emergency services throughout disaster recovery and response, to enabling the most basic to the most advanced of our everyday needs—the loss of electricity can be crippling.

As a result, as power outages mount, communication with customers is key. In particular, emergency responders and other critical services must be kept aware not only of the status of their own facilities, but also of the status of the facilities and populations most vulnerable should an outage occur. Such tightly coupled communications can help prevent loss of life.

Utilities should further strive to provide robust communications to general customers, including estimates of when power will be restored. In a recent review of Florida’s electric utility hurricane preparedness and restoration actions, the state’s Public Service Commission found that communication issues were a “notable source” of customer dissatisfaction and frustration in the aftermath of Hurricane Irma. So many decisions hinge on whether the power is on and when it will come back that clear communication is essential.

Tracking the storm

As Hurricane Florence churns closer, the likelihood of a severe event is growing ever higher.

We will be following the storm, and its aftermath. One key will be to ensure that, across utilities and co-ops, data are recorded so lessons can be learned and policies and investments improved down the line. Some important issues to track include the following (a rough sketch of what such a record might look like follows the list):

  • Where outages occur, and for how long they endure
  • Who gets power back first, and who gets it last
  • Whether critical services and vulnerable populations are left without
  • Which parts of the grid performed well, and which did not (including plants, fuels, pipelines, substations, wires, and poles)
  • Whether recent resilience and storm-hardening investments—including microgrids—have led to improved outcomes
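To make that kind of record-keeping concrete, here is a minimal sketch of what a per-outage record capturing the items above might look like. This is purely illustrative Python; the field names are hypothetical and do not reflect any utility's actual reporting schema.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class OutageRecord:
        """Hypothetical per-outage log entry covering the questions listed above."""
        grid_component: str              # e.g. "substation", "distribution line", "plant"
        county: str
        customers_out: int
        started: datetime
        restored: Optional[datetime]     # None while the outage is still active
        cause: str                       # e.g. "flooded substation", "downed pole"
        serves_critical_load: bool       # hospitals, shelters, water treatment, etc.
        hardened_asset: bool             # did recent resilience investments cover this asset?

        def duration_hours(self) -> Optional[float]:
            """Hours without power, once service has been restored."""
            if self.restored is None:
                return None
            return (self.restored - self.started).total_seconds() / 3600.0

Collected consistently across utilities and co-ops, records like this would make it possible to answer who lost power, for how long, and whether storm-hardening investments paid off.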

But first and foremost as the storm draws near, we hope for the safety of those on the ground and the many brave workers striving to mitigate damage and loss.

Photo: NASA. Maps: U.S. Energy Information Administration; UCS analysis.

The Hidden Dangers of Hurricane Florence: Catastrophic Storm Surge and Inland Flooding Threaten Rural and Low-Income Communities

The North Carolina National Guard prepares for Hurricane Florence

Over the last few days, we have watched with deepening dismay as the forecast for Hurricane Florence has turned increasingly grim. This rapidly intensifying hurricane is now on a trajectory to come ashore somewhere along the southeast coast, likely in North Carolina, potentially as a Category 4 storm. What heightens the risks of this storm is the forecast of days of lingering heavy rain, threatening not just coastal but also inland areas.

A coastal emergency compounded by inland flooding

This was projected to be a below-normal or near-normal hurricane season—but any complacency that projection may have engendered has evaporated very quickly. Yesterday, the National Hurricane Center (NHC) was tracking no fewer than three storms in the Atlantic, and there are additional advisories in the Pacific. And it just takes one major landfalling hurricane to make it a terrible season.

(On the other side of the world, super-typhoon Mangkhut is threatening the Philippines, Taiwan and China, after passing over Guam.)

Coastal states in the Southeast and Mid-Atlantic are clearly taking Hurricane Florence very seriously. As of now, there are emergency evacuation orders for well over a million people across North and South Carolina, Virginia and Maryland.

The Navy has moved ships from Naval Station Norfolk out to sea to ride out the storm more safely.

Duke Energy is gearing up for major impacts to the power system in North and South Carolina and getting emergency crews in place to restore power after the storm passes. In a news release it warned of widespread outages in North and South Carolina, potentially lasting days or weeks. The company said that impacts could exceed those of Hurricane Matthew, which caused 1.5 million Duke customers to lose power and cost $125 million in repairs.

What makes this storm especially scary is the huge storm surge and major rainfall that is predicted to accompany it. Forecasts show that the storm might stall, creating a multi-day extreme precipitation event similar to what residents of Houston experienced in the wake of Hurricane Harvey last year and are still struggling to recover from.

The latest advisory from the National Hurricane Center indicates that if the peak of the storm surge coincides with high tide, areas from Cape Fear to Cape Lookout, including the Neuse and Pamlico rivers, could see surge as high as 6 to 12 feet! Other parts of coastal North Carolina and Virginia, including low-lying coastal areas, could see 2 to 8 feet of storm surge.

Alarmingly, the forecast also indicates the potential for 15 to 20 inches of rain from the storm, with some areas of North Carolina, South Carolina and Virginia expected to experience as much as 30 inches through Saturday! Depending on the track of the storm, places as far away as West Virginia could also see heavy rain and flash flooding in the days to come.

Unfortunately, much of the southeast and mid-Atlantic, including North Carolina, Virginia and the Washington DC area have experienced above-normal rainfall over the past weeks, so the ground is already saturated. With more rainfall coming, catastrophic flooding—including in inland areas—is very likely.

Potential impacts from the storm

A storm of this magnitude will undoubtedly cause great harm. Hopefully, with the advance warning and preparation underway, loss of life will be avoided.

Early analysis from CoreLogic shows that nearly 759,000 homes across North and South Carolina and Virginia, with a reconstruction value of over $170 billion, lie in the path of the storm surge from Hurricane Florence, were it to come ashore as a category 4 storm.

In rural communities in North Carolina, experience from previous storms shows that flooding could cause waste lagoons from hog farms to overflow, contaminating rivers and streams. Coal ash ponds can (and do) also leak toxic contaminants. And wastewater treatment facilities could be overwhelmed by floodwaters.

Sewage and waste can also contaminate groundwater, affecting the well water that many rural communities depend on for their drinking water supplies.

Loss of power, especially for long periods of time, can be life-threatening for patients in hospitals and others with medical conditions, if they are not quickly moved to safety, as we witnessed so tragically after hurricanes Irma and Maria hit last year.

Those incarcerated in prisons also must be evacuated for their safety—it is troubling to see news reports that, as of now, South Carolina has chosen not to evacuate a prison in Jasper County.

Disaster preparedness requires advance planning

The emergency response to Hurricane Florence hasn’t just been conjured up in the last few days; emergency managers, planners and utility managers are using hard-won experience from previous disasters to prepare for this storm. Hurricanes Floyd and Matthew taught some bitter lessons.

Getting people out of harm’s way is job #1. Hence the mandatory evacuation orders for some of the highest-risk areas, issued well in advance of landfall. These are warnings that people should take seriously and obey.

The storm is also going to test the resilience of critical infrastructure like roads, bridges, power lines and substations, sewage treatment plants, storm water drainage, hospitals, airports and more. Smart investments made well ahead of time will pay off in the days to come. And fatal weaknesses will be exposed where those investments fall short.

Responsible officials are not waiting for the storm to hit or for its exact path to be clear: they are acting out of an abundance of caution and making sure people and critical services are protected as best they can. (Incidentally, that’s a lesson well worth extending to how we think about preparing for the growing risks of climate change.)

How will rural, island and low-income communities fare?

The true test of our disaster response doesn’t just lie in how quickly the lights come back on or flights are restored in major economic hubs, but in how well isolated or marginalized communities fare in the aftermath of storms.

Disasters lay bare the socioeconomic inequities in our society. For some, fleeing to safety is prohibitively expensive—they may not have money for hotels or gas or even a car. Taking time off from work because of impassable roads or closed schools could mean losing a job. For those who can barely make ends meet, buying flood insurance to protect their homes or belongings can seem a luxury.

Low-income communities and communities of color are more likely to live near toxic waste sites, like coal ash ponds and landfills. Rural communities are more likely to depend on well water.

Island communities, including those along the Outer Banks of North Carolina and coastal South Carolina, are on the frontlines of this storm. Hopefully their residents are heeding evacuation orders. In some cases, they may have to travel long distances to really get out of harm’s way, given the wide swath of destruction this storm is likely to cut—which creates an additional burden for those who may not have the resources. Rural and island communities could be cut off for days if their bridges are washed out or their few access roads are flooded.

So as this storm bears down, let’s remember the people of Princeville and Roxboro, the residents of the Gullah/Geechee Nation, those from Nags Head and Kitty Hawk, and from Tybee Island and Kiawah Island, and many other small communities like them that may not make the headlines.

A long road ahead to recovery

Looking ahead, given the terrifying forecast, unfortunately we can expect to see major impacts from this storm. Hopefully, communities in its path will be able to ride out this storm without loss of life.

But experience shows that recovery will take a long time, well after Hurricane Florence drops out of the headlines and is no longer trending on Twitter. Communities in Houston and Puerto Rico are still struggling to recover from last year’s catastrophic hurricane season.

And then there are a whole set of additional questions regarding our nation’s response to these types of disasters:

  • Will the nation use this as an opportunity to build back in a more resilient way that takes into account the impacts of climate-driven sea level rise, as well as the increasing intensity of powerful Atlantic storms and the increase in heavy rainfall events fueled in part by climate change?
  • Will Congress and the administration adequately fund not just the immediate recovery efforts, but long-term resilient rebuilding, as well as voluntary home buyouts and relocation from high-risk areas?
  • Will the Federal Emergency Management Agency (FEMA) and the Department of Housing and Urban Development (HUD) budgets, so necessary for disaster preparedness and recovery, be protected as the September 30 deadline for the federal budget approaches?
  • Will Congress protect NOAA’s funding and mandate so it can continue to provide the science we need to anticipate and prepare for these storms?
  • Will the Environmental Protection Agency (EPA), so compromised under the Trump administration, do its job to quickly identify and remediate toxic pollution in the aftermath of this storm—or will it put communities at risk as we saw after Harvey?
  • Will Congress and states make targeted resources available for low-income and otherwise marginalized communities, both ahead of and in the aftermath of the storm?

The answers to those questions will provide an important indication of whether we have the resolve to truly take on the long-term challenge of dealing with growing risks of extreme weather and climate disasters in a robust and equitable way—or whether we will just default to responding to these as one-off catastrophes whose burden falls disproportionately on those who can least bear it.

For now, our thoughts are with the many millions of people in the path of this storm and the first responders who are working so hard to protect them. May they all be safe.

For more information on the local hazards from #Florence, follow the @NWS offices on Twitter: @NWSCharlestonSC @NWSMoreheadCity @NWSRaleigh @NWSWilmingtonNC @NWSColumbia @NWSGSP @NWSWakefieldVA @NWSBlacksburg @NWS_BaltWash @NWS @NWSCharlestonWV pic.twitter.com/LNh3TGvCcd

— National Hurricane Center (@NHC_Atlantic) September 11, 2018

Global Climate Summit: Thank Goodness for Clean Energy State Champs

Photo: Andy Dingle/Wikimedia Commons

This week, California is hosting a Global Climate Action Summit. The summit is intended to “bring leaders and people together from around the world to take ambition to the next level” and “celebrate the extraordinary achievements of states, regions, cities, companies, investors and citizens with respect to climate action.”

It couldn’t happen at a better time or a better place. The Trump administration is busy swinging a wrecking ball at the pillars of climate progress in the United States, including the Clean Power Plan, our nation’s first ever limits on carbon dioxide emissions from power plants, and the fuel economy/tailpipe emissions standards that cut carbon pollution from cars and trucks. And his administration is hatching a scheme to bail out aging coal plants that increasingly can’t compete against renewables or natural gas.

Given these actions at the federal level, the world community, which joined us in signing the historic Paris Agreement, can reasonably question our national commitment to combating climate change. And that’s why it is so important that the United States is hosting this summit, and can present many success stories to show that Donald Trump does not speak for this country when it comes to addressing climate change. There is so much to be proud of in the private sector, cities and towns, universities, and others, but I will focus on the particularly encouraging success at the state level. Here are some of the major state success stories that should be highlighted at the summit, along with the areas the summit should focus on to build upon and expand that success.

California—the gold standard

Today, Governor Brown signed an extremely ambitious and inspiring new law: a mandate of 100% carbon-free energy by 2045. This is a breathtaking standard for any state to adopt, and it is particularly transformative given California’s size; were it a nation, it would be the world’s fifth largest economy. This goal, if achieved, would put California on track for net-zero emissions by mid-century, the level of reduction that scientists across the globe have indicated is necessary to meet the goals of the Paris Agreement and prevent runaway climate change impacts. Moreover, California has solid policies in place that give it a major head start toward meeting this goal, including renewable energy standards, a low carbon fuel standard, a cap-and-invest program, and many others.

The wind miracle in the Texas panhandle

Texas is ranked number one in wind energy generation in the United States; it generates more wind energy than all but five countries. Wind supplies about 15% of Texas’ electricity, enough to power over 6 million homes, with thousands more megawatts under construction. Texas’ success derives from its strong, steady winds and large open spaces, and from the foresight of state leaders to invest over $7 billion in transmission infrastructure to connect these open areas to population centers. Wind energy is so plentiful and cheap in Texas that some customers even get their electricity for free at night.

Sunny solar in North Carolina

In 2017, North Carolina was ranked second in the US for installed solar capacity by the Solar Energy Industries Association. Solar currently powers over 500,000 homes in the state, about 5% of its electricity generation, and that amount is projected to double in the next five years. North Carolina has made this progress with state incentives and renewable portfolio requirements, investments by utilities, and solar energy purchases by major firms operating in the state, such as Apple and Ikea.

Offshore wind in Massachusetts

In the 19th century, Massachusetts was the world’s maritime leader in the whaling industry. In the 21st century, it is on its way to becoming the national leader in a new maritime industry: offshore wind. To take advantage of the strong, steady winds over the Atlantic Ocean, Massachusetts enacted a far-sighted law authorizing utility companies to purchase, after competitive bidding, approximately 1,600 megawatts of offshore wind, enough to power about a third of the homes in the state and supply about ten percent of its electricity. And Massachusetts has followed this up by signing a twenty-year contract for the first 800 MW project at a surprisingly low cost (a levelized 6.5 cents per kWh), with more projects to follow.

What more should states do on clean energy?

The astonishing success of renewable energy in these states and many others (New York, South Dakota, Washington, and Iowa, for example), coupled with gains in energy efficiency and switching from coal to gas, is helping to drive down emissions from power plants dramatically, to the point where nationally we are about 28% below 2005 levels. This is solid progress, but much more needs to be done. Many states have renewable energy standards that require utilities to purchase certain percentages of renewable power; many of these targets can easily be ratcheted upward as the falling costs of wind, solar, and energy storage make much higher levels of renewables cost-effective. In addition, states can do much more to modernize their electric grids and build out transmission lines to make sure that renewable power is used whenever the sun is shining and the wind is blowing, and to invest in energy storage to bank that renewable energy for when they are not.

Transportation is the Next Frontier

The picture is very different for emissions from the transportation sector, which is now the largest source of emissions in the US. While there have been some increases in the efficiency of cars and trucks, in large part due to rules issued under President Obama, these gains have been mostly offset by increased vehicle miles traveled and shifting consumer preferences toward SUVs and trucks, owing to sustained low gasoline prices.

It is here that states particularly need to step up. Twelve states have adopted California’s greenhouse gas emission standards for gas-powered cars, and nine have adopted “zero emission vehicle standards” that require higher sales of electric vehicles through 2025; other states should join these groups, as Colorado has indicated it intends to do.

And states can do much more to incentivize electric cars, buses, and trucks, and to enjoy dramatically cleaner air and less carbon pollution. One of the biggest barriers now is the higher up-front cost of electric vehicles. While falling battery costs are expected to bring electric vehicles into cost parity with gasoline-powered vehicles by the mid-2020s, we are not there yet. In the interim, states should help offset this higher cost with rebates, focusing particularly on EV customers of low and moderate income. States can also build out charging station networks, and should focus particularly on making EVs convenient for those who don’t have a garage and can’t easily charge up overnight at home. States can also direct electric utilities, which they regulate extensively, to offer EV-related services to customers, such as installing charging stations in homes and apartments, or offering discounted rates for charging at off-peak hours. Finally, states can lead by example by purchasing electric cars, buses, and trucks for their own fleets.

A funding source will be needed to pay for the transition from gasoline-powered transportation to electricity, and here additional experimentation is needed. One promising example is a “cap and invest” program that is in place in California and is being considered by Northeast and Mid-Atlantic states. A cap-and-invest system would establish an overall cap on transportation emissions, require fuel distributors to purchase “allowances” for the right to sell transportation fuels, and use the funds to invest in cleaner forms of transportation.

Conclusion

We have a lot to be proud of when it comes to clean energy advances, the Trump administration notwithstanding. However, the federal rollback is so extensive, and time is so swiftly running out, that we will need states, and other key stakeholders such as cities, universities, businesses and others to step up the pace. The Global Summit will be a good way to mark progress, but its more important role is to stimulate ambition and jump start the next round of policies.

Photo: Andy Dingle/Wikimedia Commons

Why We Need to Humanize Chemists, and All Scientists

Silver Microscope Photo: Alexandra Gelle

Manifesto of a passionate chemistry PhD student, tired of having to fight prejudices when introducing herself.

Why humanizing scientists and their research is essential

Science has shaped our society and everyday life, and yet the public and many policymakers neglect, discredit, and underfund research and scientists because of negative perceptions of the field. Over the last few years, public trust in scientists has been challenged. According to recent studies by Fiske and Dupree, the public sees scientists as competent but rates them as less warm than doctors or nurses. Yet scientists need to be able to communicate their research effectively and engage with the public and policymakers to ensure that the decisions that affect all of us are based on evidence.

Graphics and tables are not enough to establish a relationship between scientists and society. The public needs emotional connections with scientists, and scientists need the public’s trust to be able to disseminate reliable and pertinent research. In addition, although technology now provides wide access to information, fake and sensationalized news is also more accessible and can damage scientists’ image. This is why restoring the public’s trust in scientists and science is crucial.

What chemists can do for you

Have you ever wondered what medicine would be like without the molecules that have been carefully designed by chemists? How would engineers conceive of laptops and cellphones without the development of batteries and electrochemistry?

When introducing myself as a PhD student in chemistry, I often see fear, rejection, or incomprehension in people’s eyes. I have always thought chemistry was fascinating, entertaining, and useful. Unfortunately in my experience, some of the public seems to be reluctant and suspicious when speaking about chemistry. Chemists are commonly pictured as environmental destroyers, eager for explosions, who are disconnected from the impacts of their laboratories and experiments. However, reality is quite the opposite.

It would be a lie to say that fire and explosions are not part of every chemist’s life. However, chemists are pursuing a nobler goal: helping people by improving their health and quality of life, and preserving the environment. Chemists’ ultimate objective is to better understand the behavior of molecules and to use elements available on Earth to develop high-performance materials, new drugs, and more sustainable processes. Yet one of the most widely shared examples of chemistry in media outlets is the environmental and health damage caused by the misuse of scientific knowledge, such as chemical bombs.

While the public’s frustration and confusion are understandable, chemists should not be blamed for their discoveries; instead, they should work diligently for their ethical and just applications. Chemistry, and science more generally, is key to our lives, yet the public often overlooks its importance. And the work of scientists is meaningless if it is not shared.

Why I decided to study chemistry


Chemists study reactions intending to develop new molecules or to enhance the efficiency of chemical processes. My PhD projects focus on the latter, in the field of catalysis. Building new molecules requires breaking and eventually forming bonds between atoms. Therefore, chemical reactions are often energy-intensive and generate large amounts of waste. In catalysis, chemical reactions can be sped up upon the addition of a substance, called a catalyst, which increases the efficiency of a chemical transformation. Moreover, catalysts can often be recycled and reused in other reactions.
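To give a feel for why lowering the energy barrier matters so much, here is a rough illustration (not drawn from the author's research) using the standard Arrhenius equation, k = A·exp(-Ea/RT). The activation energies, pre-exponential factor, and temperature below are made-up numbers chosen only to show the scale of the effect a catalyst can have.

    import math

    R = 8.314  # gas constant, J/(mol*K)

    def rate_constant(prefactor: float, activation_energy_j_per_mol: float, temp_k: float) -> float:
        """Arrhenius equation: k = A * exp(-Ea / (R*T))."""
        return prefactor * math.exp(-activation_energy_j_per_mol / (R * temp_k))

    # Hypothetical example: a catalyst lowers the activation energy
    # from 100 kJ/mol to 70 kJ/mol at room temperature (298 K).
    A = 1e13  # assumed pre-exponential factor, 1/s
    T = 298.0
    k_uncatalyzed = rate_constant(A, 100e3, T)
    k_catalyzed = rate_constant(A, 70e3, T)

    print(f"Speed-up from the catalyst: {k_catalyzed / k_uncatalyzed:.1e}x")
    # A 30 kJ/mol drop in the barrier speeds the reaction up by roughly a factor of 10^5 at 298 K.

The same transformation happens either way; the catalyst simply offers a lower-energy pathway, which is also why it can be recovered and reused.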

My PhD focuses on the use of sunlight as an energy source and silver as a catalyst to promote widely used reactions. Catalysts that can be activated by sunlight are called photocatalysts and fall within the field of Green Chemistry, a field that aims to reduce the ecological footprint of chemical industries by developing more environmentally friendly reaction conditions and reducing chemical waste.

I always appreciate sharing my research and can do that more effectively when scientists and the public respect each other and work to ensure science is used for evidence-based policymaking, for knowledge-sharing, and for justice. Next time you see a chemist, or any other scientist, let’s talk about how we can learn from one another and be stronger together. How about we chat over a cup of caffeine (C8H10N4O2) extracted by dihydrogen monoxide (H2O) or a glass of ethanol (C2H6O)?

 

Originally from France, Alexandra Gellé moved to Montréal, QC, Canada to start her undergraduate degree in Chemistry in 2013. She is now a PhD student and is passionate about science communication and outreach. Alexandra is also the president of Pint of Science Canada, an international festival promoting science through speaker series in bars.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Illa Maru, http://scientific-illustrations.com

Hurricane Florence: Four Things You Should Know That Your Meteorologist is Truly Too Busy to Tell You

Hurricane Florence is currently making its way as a Category 4 storm toward the southeast coast and is expected to make landfall sometime on Thursday, most likely in North Carolina. Our hearts are with those who are looking at the storm’s predicted path and wondering what this means for their homes, families, and communities. As millions of residents in the storm’s path make preparations to stay safe, our hearts are also with the thousands of people who have faced similar risks in Texas, Florida, and Puerto Rico in the past year. If you are in the Carolinas, please do take care to heed local warnings and evacuation orders–and know that we are all hoping for your safety.

Florence, like any hurricane, is a fearsome storm. But the direction and northward extent of Florence’s path make it unusual, and the atmospheric and oceanic conditions in which Florence is brewing are contributing to the storm’s outsized strength for its location. With that in mind, let’s take a look at some of the climate dynamics that make Florence stand out amid our historical knowledge of Atlantic hurricanes.

ONE: Florence’s path is unusual—in a way that’s similar to Sandy’s

Atlantic hurricanes tend to develop off the coast of Africa, then move in a north/northwest direction. By the time they reach the position Florence was in a couple of days ago, they tend to take a hard right turn toward the north/northeast, staying well away from the US. In fact, as reported by Brian McNoldy and the Washington Post, of the nearly 80 recorded storms that passed within 200 nautical miles of Florence’s position on Friday, none made landfall on the US coast.

Florence’s path, however, has been blocked by a ridge of high pressure in the atmosphere, which is preventing the storm from moving northward and keeping it on a westward trajectory toward the coast instead.

A ridge of high pressure along the northeast coast of North America, shown here in orange, has prevented Hurricane Florence from making the typical northward turn of most hurricanes.

Six years ago, when Sandy slammed into the coast of New Jersey, a “blocking ridge” over the eastern half of northern North America prevented Sandy from moving north. Never before had we seen a hurricane take such a perpendicular path toward the Mid-Atlantic coastline. One important difference between the paths of Sandy and Florence, however, is that during Sandy, the blocking ridge also prevented a low-pressure storm system coming from the west from moving north, so the two storms collided (hence the “Superstorm Sandy” moniker).

TWO: Major hurricanes this far north are rare

The Southeast US is no stranger to hurricanes. The Carolinas have experienced dozens of hurricanes since modern record-keeping began in 1851. The vast majority of these hurricanes have been Category 1 storms; together the Carolinas have only been hit three times by a Category 4 storm or above. The last time North Carolina was hit by a Category 4 storm was over 60 years ago.

Why is this? Hurricanes require a supply of fuel in the form of warm sea surface temperatures. Historically, as storms moved northward they did so closer to the central Atlantic and they encountered progressively cooler temperatures and weakened. Not so with Florence. While temperatures off the coast of Africa, where most Atlantic hurricanes develop, are running cooler than average right now, Florence’s path, determined largely by the blocking ridge, has taken it westward into a wide swath of the Atlantic where temperatures are running 2-3 °C above normal. Because of that ridge, even as Florence’s latitude increases, it’s projected to stay within a zone of warm temperatures that will allow Florence to stay strong and indeed strengthen as it churns its way toward the coast.

Over the next few days, Hurricane Florence will encounter abnormally warm sea surface temperatures, which will enable it to remain strong as it churns toward the coast.

We have seen the effects of warmer than average temperatures on hurricanes in the recent past. Last summer, for example, Hurricane Harvey passed over Gulf of Mexico waters that were 2.7 – 7.2 °F above average before slamming into the coast and dropping unprecedented amounts of rain on the Houston area. We know that warmer temperatures help to fuel hurricanes and that such conditions are more likely to occur in a warming world.
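Because this post quotes ocean warmth in two different units (2-3 °C above normal for Florence's path, 2.7-7.2 °F above average for Harvey), here is a minimal sketch of how such anomalies convert between the scales. A temperature difference only scales by the 9/5 factor; the 32-degree offset applies to absolute temperatures, not to anomalies.

    def c_anomaly_to_f(delta_c: float) -> float:
        """Convert a temperature *difference* (anomaly) from Celsius to Fahrenheit."""
        return delta_c * 9.0 / 5.0

    # Florence: 2-3 deg C above normal is roughly 3.6-5.4 deg F above normal.
    print(c_anomaly_to_f(2.0), c_anomaly_to_f(3.0))  # 3.6 5.4
    # Harvey: the reported 2.7-7.2 deg F range corresponds to about 1.5-4.0 deg C.
    print(7.2 * 5.0 / 9.0)  # 4.0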

THREE: The expected storm surge will be amplified by higher average sea levels

The National Hurricane Center is expected to issue storm surge warnings tomorrow, but residents are already being cautioned that Florence’s storm surge could be life-threatening.

Storm surge is driven by several factors, but its primary driver is wind. As Florence makes its way over more than 1,000 miles of ocean, its winds push surface water toward the coast. That water piles up and creates a surge. The stronger the winds and the farther the storm travels, the bigger the surge. While some storms, like Harvey, cause most of their flooding through intense rainfall, for others, like Sandy, storm surge is the primary cause of flooding.

The last time a Category 4 hurricane made landfall in North Carolina was in 1954. Since then, sea level along the coast of the Carolinas has risen roughly 8 inches. That rise is already playing out in the form of increasingly frequent high tide flooding in the region. Charleston, for example, has experienced more than a quadrupling in the number of high tide flooding events just since the 1970s. And when it comes to storm surge, higher sea levels make for larger, farther-reaching surges.

Given that Florence is moving relatively slowly and is predicted to stall over the Southeast or Mid-Atlantic, the storm will likely remain along the coast during at least one high tide cycle. The timing of landfall relative to high tide remains to be seen, but today's higher sea level along the Carolina coast has the potential to add height to the storm surge, allowing it to reach farther inland than a storm like Florence would have reached historically.

FOUR: In a warmer world, the atmosphere can hold more moisture and there is increased potential for extreme rainfall during storms

Like the rest of the US, since the late 1950s, the Southeast has experienced a dramatic increase in the percentage of rain that falls during the heaviest events. This trend has been linked to human-caused climate change because warmer air holds more moisture. And in the case of Hurricane Harvey, human-caused warming was found to have made the storm’s record-breaking rainfall three times more likely and 15% more intense.
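The link between warmer air and heavier rain can be made roughly quantitative: the amount of water vapor air can hold rises by about 6-7 percent for every degree Celsius of warming (the Clausius-Clapeyron relationship). The sketch below uses Bolton's standard empirical formula for saturation vapor pressure; the 28 °C starting temperature is simply an assumed value typical of warm hurricane-season air, not a figure from this post.

    import math

    def saturation_vapor_pressure_hpa(temp_c: float) -> float:
        """Bolton (1980) approximation for saturation vapor pressure over water, in hPa."""
        return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

    t = 28.0  # assumed warm-air temperature in deg C, typical of hurricane conditions
    increase = saturation_vapor_pressure_hpa(t + 1.0) / saturation_vapor_pressure_hpa(t) - 1.0
    print(f"~{increase * 100:.1f}% more water-vapor capacity per 1 deg C of warming")
    # Prints roughly 6%, in line with the ~6-7% per deg C Clausius-Clapeyron scaling.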

With Florence’s path through very warm waters, we can expect a lot of moisture with this storm. Current forecasts are predicting 10 to 15 inches of rain for much of North Carolina and Virginia in the next few days.

Parts of North Carolina and Virginia could see 10 to 15 inches of rain in the coming days.

Just as when Hurricane Matthew hit the region in 2016, the extreme precipitation expected during Hurricane Florence will be falling on already saturated ground. Stream gauges in inland areas of North Carolina and Virginia, where Florence could stall, are recording streamflow–or the flow of water in streams, which is primarily driven by rainfall amounts–“much above normal,” which calls into question just how much more rainfall they’ll be able to accommodate.

The combination of saturated soil and rivers, heavy rainfall, and elevated sea levels due to long-term sea level rise and storm surge could make it very difficult for floodwaters to drain after the storm has passed.

As I wrote this, Governor McMaster of South Carolina was telling residents of his state that they should expect more wind than with Hurricane Hugo and more rain than with Hurricane Matthew. As South Carolinian listeners would know, these storms each caused grave damage through their respective mechanisms. In that press conference, the Governor also ordered the mandatory evacuation of the state’s entire coastline. This means that over 1 million people will be fleeing the coast in that state, and more in North Carolina, Virginia and elsewhere. The threat to the coast is the obvious priority as this week gets underway. Later, the challenge of managing the impacts to North Carolina’s interior regions may need to take center stage, as the stalled storm deluges large areas.

People talk about the “calm before the storm” and, if we do things well, there will indeed be an eerie quiet along much of the southeastern United States later this week. In the meantime, as we scramble, hunker down, and prepare to ride out the latest in a terrible spate of hurricanes, we also hope that, unlike with Katrina, Sandy, Harvey, and Maria, we don’t surface to find our communities fundamentally scarred by yet another brutal storm.

We’ll be updating this blog post as conditions continue to evolve.

Photo: NASA. Maps: Climate Reanalyzer; National Hurricane Center.

Puerto Rican Scientists and the Communities They Serve: “Resistance is Resilience”

Photo: Juan Declet-Barreto

We are coming up on the one-year anniversary of the devastation caused by Hurricane María in Puerto Rico. As part of the Puerto Rican diaspora in the United States, and like thousands of my compatriots abroad, I spent a frustrating, depressing, and maddening year watching the fiscal and climatic catastrophe unfold from afar, and collaborating with others in the diaspora and other sectors of American society to send emergency aid, advocate for immediate federal action, and make myself useful in any way I could for Puerto Rico and the US Virgin Islands.

So it was especially rewarding for me to return last week to Puerto Rico for the first time since the hurricane. In my homeland, I was able to witness not only the incredible resilience of our people, but also their refusal to sit idly by and wallow in the misery left behind by María. Here’s what I saw.

As I drove through the north and northeastern towns of San Juan, Naguabo, and my beloved Luquillo, I talked to people who told me stories of how, in the absence of a coherent or timely federal and local government response, neighbors banded together to care and feed each other, to remove debris from roadways, and to make treacherous trips to the nearby El Yunque rainforest to open up municipal water supply valves.

I was particularly impressed by the Coalición Pro Corredor Ecológico del Noreste, a local coalition of residents and scientists protecting coastal beaches and wetlands that serve as egg-laying grounds for the beautiful and endangered tinglar (leatherback turtle). The corredor provides several valuable ecological services, as it is an effective barrier against storm surge and coastal erosion, and its wetlands, beaches, coral reefs, bioluminescent lagoon, and forests are evidence of its great biodiversity. As Cristóbal Jiménez, the president of the coalition, told me, they considered not holding the annual Festival del Tinglar in 2018 due to the devastation caused the year before, thinking at first that people would be too overwhelmed to attend. But as soon as they started planning for it, the community turned out in record numbers to hold their festival and continue the defense of local flora and fauna and the valuable ecological and cultural services the corredor provides.

This, to me, is testament to the potential for communities to build resilience by banding together.

The coalition’s story is a great example of scientist-community collaborations built on decades of experience. And it’s a great example of the type of partnerships to advocate for a climate-resilient future that have developed in the post-María period.

In my time on the island, I was able to get a broader look at how scientist-community partnerships are organizing to construct and demand a climate-resilient and equitable reconstruction of the island’s infrastructure. UCS joined the leadership of Ciencia Puerto Rico (CienciaPR) and the American Association for the Advancement of Science – Caribbean Division (AAAS-CD) in a conference titled “Ciencia en Acción: Política Pública Puertorriqueña Apoyada por Evidencia” (Science in Action: Puerto Rican Public Policy Supported by Evidence). CienciaPR and AAAS-CD are scientific societies front and center in making scientists’ voices heard in decision-making around public policy in Puerto Rico, and the event was the kick-off for the Puerto Rico Science Policy Action Network (PR-SPAN).

While the UCS Science Network and I were invited to add our own experiences in science policy advocacy, I was humbled to learn more of the long-standing and deep commitment of boricua* experts to elevating their communities’ needs, but also saddened at how most of their expert recommendations have been sidelined or otherwise ignored for decades.

For example, renewable energy experts from the National Institute for Island Energy and Sustainability (INESI, in Spanish) have long been strong proponents of developing local energy sources like solar and wind to facilitate the transition from expensive, global warming-producing, and climate-vulnerable fossil fuel-burning electric infrastructure.

Dr. Elvira Cuevas, a terrestrial ecosystems ecologist at the University of Puerto Rico, reminded the audience of the urgency of taking action, and that building climate resilience is both our obligation and right: “If we want a Puerto Rico that is truly resilient, we cannot leave it in the hands of the universities and [other] organizations. Each and every one of us is responsible for demanding our rights.”

Marine scientist Dr. Aurelio Mercado from the University of Puerto Rico told us of the long history of the Puerto Rican government ignoring scientists’ warnings about climate change and dismissing the need for hurricane preparedness. He recalled how science was sidelined in the decades leading up to Hurricanes Hugo (1989) and Irma and María (2017), with local government officials—decades before Hugo—dismissing the need to prepare for what they called “hypothetical hurricanes” of categories 3, 4, or 5. We are at risk of repeating that history as scientists warn that the San Juan international airport could be underwater within the next decade or so, warnings that so far remain unheeded.

But perhaps it was Dr. Braulio Quintero, urban ecologist and co-founder of the scientific non-profit ISER Caribe, who best described the Puerto Rican population’s response to the government’s so-called recovery plan: “Resilience requires resistance; resistance is resilience”. Dr. Quintero is referring to the community- and science-driven mobilization of large swaths of Puerto Rican society against the anti-democratic impositions of fossil-fuel interests and the fiscal control board appointed by President Obama through the Puerto Rico Oversight, Management, and Stability Act of 2016 (PROMESA).

PROMESA’s fiscal austerity measures, together with the Trump and Rosselló administrations’ commitment to fossil fuel interests, will keep Puerto Rico on the path of fiscal and climatic vulnerability that hit rock bottom so catastrophically after María.

So it’s not hard to see two divergent visions for the future. The first, largely imposed on the Puerto Rican population by the Trump and Rosselló administrations without taking into account the climatic, fiscal, social, and economic challenges facing Puerto Rico and the Caribbean, insists on continued reliance on climate-changing fossil fuels for electricity production and fails to plan for climate impacts like increased temperatures, sea level rise, and more frequent and destructive hurricanes. The second, actively proposed and sought by Puerto Rican civil society, grassroots organizations and collectives, and scientific societies and advocates, demands a diversified and decarbonized power sector and a climate-resilient, equitable recovery that prioritizes the needs of the Puerto Rican population.

In light of the crossroads that Puerto Rico finds itself in, I go back to the question I have asked before: Are Puerto Ricans willing to allow a repeat of the errors of the past that put us on the path to fiscal and climate ruin, or will Puerto Rican society actively demand and work together towards developing an energy, housing, and economic infrastructure that responds to our present and future needs under a changing climate?

For Puerto Rican scientists and the communities they serve, the answer is clear: they are using community-driven science to demonstrate impacts and propose resilient solutions for the benefit of all, not just a few narrow—ahem, ahem, fossil-fuel—interests. UCS is proud to stand together with Puerto Rico and all climate-vulnerable communities to turn resistance into resilience.

 

*Boricua is the ancestral demonym for Puerto Ricans, from Borikén or Borinquen, given by the Taíno native peoples to the island later baptized “Puerto Rico” by the Spanish colonizers.

Photo: Juan Declet-Barreto

Community Choice Aggregation Puts Communities in Control of Their Electricity

Rebecca Behrens, 2018 UCS Schneider Fellow

Keep your eyes and ears open for Community Choice Aggregation, already a major player for consumer energy choice in California and spreading rapidly. In the post below, 2018 UCS Schneider Fellow Rebecca Behrens explains how CCAs work, where CCAs are forming, and what you should be on the look-out for as more communities get involved.

It’s late summer, which means ice cream season is coming to an end. A coworker and I have made it a habit of exploring the (many) ice cream shops around our office each week, and for something as simple as ice cream, it’s amazing how many choices we have. I can choose what ice cream I want based on price, proximity, flavor, or even the company’s business practices.

This got me thinking: if I have so many choices for something as simple as ice cream, what about bigger choices in my life—like where my electricity comes from? Like most of the US, I’m served by one utility. If I don’t like the way they’re sourcing electricity or setting rates, I have limited options.

But that story has been changing, in part due to the growth of Community Choice Aggregation (“CCA”). CCAs offer an alternative to traditional utilities and are designed to give communities a voice in where their electricity comes from. In California, many CCAs are striving to provide their customers with more renewable energy at lower costs than traditional utilities. Let’s break down the what, when, where, how, and why of CCAs.

What are CCAs?

Community Choice Aggregation allows local governments to purchase electricity on behalf of their residents, aggregating the electricity needs of everyone in the community to increase purchasing power.

The investor-owned utility (“utility” or “IOU”) that used to supply and deliver electricity is still there, but it plays a different role. Now, the utility is just in charge of delivering the electricity through its transmission and distribution lines (the utility still owns and maintains the “poles and wires”) and billing customers. This partnership distinguishes a CCA from a municipally-owned utility, which takes over both electricity procurement and electricity delivery (aka the poles and wires).

CCAs are in charge of procuring electricity while the utilities are in charge of delivering the electricity to you. (Source: Cal-CCA)

When and where have CCAs formed?

So far, CCAs are allowed in seven states: Massachusetts, Rhode Island, New Jersey, New York, Ohio, Illinois and California. Within a state, the decision to form a CCA is up to the community and local government. California has seen the most recent growth in CCAs, so I’ll be using it as an example here, but know that CCA formation and growth looks a bit different in each state.

Most of the seven states that allow Community Choice Aggregation passed bills legalizing CCAs in the early 2000s: California passed AB 117 in 2002. However, it wasn’t until years later, in 2010, that the first CCA in California launched in Marin County.

Since 2010, the number of CCAs in California has grown significantly. In 2016, there were five CCAs serving 915,000 customers. In 2017, there were nine CCAs. By the end of 2018, there will be 20 CCAs, serving over 2.5 million customers. And more local governments are considering the option.

The regions CCAs serve in California as of September 2018. Because CCAs are growing quickly in California, this map changes quickly, too. (Source: Cal-CCA)

Even if no more CCAs launch after 2018, CCAs are expected to serve 16% of the electrical load in California in 2020. But it’s highly likely that more CCAs will launch in the coming years, which could put this number at over 50% in 2020.

How do CCAs work?

In California, once the local government votes to form a CCA, a nonprofit agency is formed to carry out its duties. The agency goes through a rigorous planning process, and once the CCA is ready to launch, it lines up its customers.

And who are those customers? Anybody who wants to be. CCAs are “opt-out” in California, and in most other states, meaning that the default is for customers to be automatically served by the CCA. Customers have 60 days to opt out for free and are notified about the change four times before this deadline. After 60 days, customers can opt out for a fee that accounts for the power the CCA has already bought on their behalf.
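To make the timeline concrete, here is a minimal sketch in Python of the opt-out rules just described; the fee amount is hypothetical, since actual charges vary by CCA.

```python
# Minimal sketch of the CCA opt-out rules described above.
# The fee amount is hypothetical; real charges vary by CCA.
from datetime import date, timedelta

FREE_WINDOW = timedelta(days=60)   # customers can opt out for free for 60 days
LATE_FEE = 5.00                    # hypothetical fee once the window closes

def opt_out_cost(enrolled_on: date, opt_out_on: date) -> float:
    """Return the cost of opting out of the CCA on a given date."""
    if opt_out_on - enrolled_on <= FREE_WINDOW:
        return 0.0
    return LATE_FEE

print(opt_out_cost(date(2018, 6, 1), date(2018, 7, 15)))  # 0.0 (within 60 days)
print(opt_out_cost(date(2018, 6, 1), date(2018, 9, 15)))  # 5.0 (after the window)
```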

And that’s it! Customers are now served by the CCA. In California, if customers were receiving discounts because of particular circumstances, they will automatically continue receiving those discounts. This includes California Alternative Rates for Energy (“CARE”), Family Electric Rate Assistance Program (“FERA”) and Medical Baseline customers. Customers with rooftop solar systems who are on a net energy metering program are automatically enrolled to continue.

In terms of electricity service, as a CCA customer, nothing else changes. Your lights stay on, your TV still works, and your freezer stays cold.

The biggest difference is that the existence of CCAs allows customers to have more of a choice in the type of electricity they receive. Not only can customers choose between being served by the utility or the CCA, but if they are unhappy with the electricity options or rates their CCA offers, they can provide feedback to the CCA at its board meetings, which allow for public participation in California.

CCA communities can also benefit from the reinvestment of CCA profits, given that CCAs are nonprofits. CCAs can offer additional programs beyond what the utility offers. These could look like free energy efficiency audits, rebates for electric car charging stations, incentives for low-income customers to install solar, or really any program that helps customers better manage their electricity usage.

In some cases, customers could lose access to programs run by their utility by joining a CCA, although in California, most utility programming is still available to CCA customers. In any case, it’s smart to reach out to your local CCA and ask if you’ll still be eligible for programs you rely on.

Why do CCAs matter?

In California, every CCA (so far) has chosen to provide customers with more renewable energy than the competing utility and has done so at lower rates. However, how much new renewable energy CCAs are contributing to the grid varies a lot from community to community.

The devil is in the details here: A CCA that uses mostly short-term contracts to buy renewable energy or renewable energy credits (“RECs”) is likely buying from projects that already exist. Electricity purchases from existing renewable energy projects do not increase the supply of clean electricity on the grid, and customers that used to consume electricity from those renewable projects may now be consuming electricity from a dirtier source. This is called resource shuffling. On the other hand, a CCA that uses long-term contracts is helping new renewable projects develop, which means that more clean power is being added to the grid.
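To illustrate the accounting behind resource shuffling, here is a minimal sketch with made-up numbers: buying RECs from existing projects changes who gets to claim the clean energy, not how much of it the grid produces, while a long-term contract that finances a new project actually adds supply.

```python
# Minimal sketch, with hypothetical numbers, of why REC purchases from
# existing projects "shuffle" clean energy rather than add it.

existing_renewables_mwh = 1_000_000   # clean MWh already generated on the grid

# Short-term REC purchase: the CCA now claims 300,000 MWh that other customers
# used to claim. Total clean generation on the grid is unchanged.
cca_claim_mwh = 300_000
other_customers_claim_mwh = existing_renewables_mwh - cca_claim_mwh
print(cca_claim_mwh + other_customers_claim_mwh)   # 1000000 -> nothing new added

# Long-term contract that finances a new solar project: new clean MWh come online.
new_project_mwh = 200_000
print(existing_renewables_mwh + new_project_mwh)   # 1200000 -> supply actually grows
```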

If you live in an area served by a CCA, it’s up to you to make sure your CCA is sourcing electricity in a way you support and providing programming you can use. Here are some questions you can ask to see how well a CCA is doing:

  1. Is the CCA providing more renewable energy than the competing utility, and are they sourcing their renewable energy from long-term contracts for energy and RECs? By buying “bundled” renewable energy through long-term contracts, CCAs can more directly support the development of additional renewable energy projects and add more clean electricity to the grid.
  2. Is the CCA making use of local resources and supporting the local community? Having a sustainable workforce policy and hiring locally and from unions can help bring the broader benefits of renewable energy to a community.
  3. Is the CCA leveraging grants and their revenue to provide programs designed to help customers reduce or better control their energy use? More renewable energy is just one piece of the puzzle; we need a host of solutions for a clean energy transition. Programs that invest in electric vehicle infrastructure and energy efficiency are equally important.
  4. Is the CCA proactively reaching out to its community? Programming needs to be accessible, useful and reach all members of the community—especially those that historically have not received the full benefits of energy programming and renewable energy.

CCAs have the potential to empower (and quite literally power) communities. But it’s up to residents to hold their CCAs accountable and ask them to provide equitable and fair climate solutions. By staying engaged and informed, you can make sure your CCA is providing your community with the best options.

CCAs are a growing movement in California, but they aren’t the only way consumers are making choices about their electricity. While not every utility or state offers choices in electricity sourcing, it is worth seeing if yours does. You may even be surprised by what your options are: at home in Vermont, I can choose to buy Cow Power through my utility! What sets CCAs apart from other choices is their ability to localize decision making and let communities invest in what is best for themselves, which has made them a powerful new player at the table.

Photo: Zbynek Burival

Department of Energy Walks Into a Fight About Subsidies

Offshore wind gets started where policy supports it. Photo: M. Jacobs

There is a fight over power plant costs that could threaten grid reliability, and it’s not as simple as the fight you have been hearing about. It wraps together three issues, each of which could cost billions of dollars. By throwing them together, policymakers are jeopardizing the very grid reliability they say they are trying to protect. The three subjects in this fight are:

  1. Long-standing state policies for utility-owned generation in Kentucky, Ohio, Virginia, and West Virginia have been challenged as uneconomic;
  2. Renewable energy supports enacted by states are under attack;
  3. The federal government is pushing contradictory treatment for old coal plants.
A mess this big takes time

Presently, new political appointees in key agencies have tossed their respective agencies into a manufactured crisis that casts doubt on the basic means for paying power plants to keep the lights on. This uncertainty is a train wreck of unacknowledged and uncoordinated policies, verging on playing chicken with grid investments. In a hasty decision that invalidated the existing rules for reliability payments, a three-person majority at the Federal Energy Regulatory Commission, all appointed by President Trump, has made the continued operation of coal and nuclear plants less certain and new investment riskier. Meanwhile, DOE proposals to override the market and overpay coal plant owners threaten market investments.

Taxes on CO2 are a good idea for sorting out subsidies.

The owners of coal and nuclear plants opened this battle in 2013-2014 by arguing that the markets were paying too little and that the fault lay in state policies supporting the gradual adoption of renewable energy, despite all evidence that cheap natural gas had lowered prices across all U.S. energy markets. Soon, states began to rescue nuclear plants with additional payments and the fighting widened. Economists predicted that subsidies would lead to more subsidies, though that is already a fair description of U.S. energy policy. The Trump administration soon proposed subsidies for coal plants, and a national debate broke out.

No one expects markets to function when subsidies keep uneconomic plants online and force the supply to be greater than demand. While the arguments to straighten this out will continue at the federal agencies and in the courts, here’s an explanation that should get you up to speed on how the economics and regulation that are meant to provide grid reliability are complicated by old policies colliding with market prices driven down by innovation.

The focus is on FERC. What is FERC?

The Federal Energy Regulatory Commission (FERC) is center stage for this drama. For over 20 years, FERC has championed competition between power plants as the best way to determine how much should be paid to plant owners. The fundamental role of FERC is to ensure that rates for buyers and sellers of energy are just and reasonable. FERC was created in the 1930s, after financial manipulation by an interstate electric company demonstrated the need for a federal system to regulate in conjunction with the long-standing state authority over power plant construction and electric company service to consumers.

FERC’s role in electricity markets addresses the interstate commerce of power plants once they are built. With considerable reliance on competition to sort out winners and losers, as well as set prices, FERC looks to ensure open access to the transmission system and the administration of fair markets. This assignment has been accepted in much of the U.S. by independent system operators, with names like ISO-New England, Southwest Power Pool, California ISO and PJM. In addition to markets, these organizations are key to maintaining the reliability of the power system.

Role of grid operators getting into politics

PJM and the other grid operators are utilities regulated by FERC. Unlike most utilities, the grid operators own no power plants or wires. Instead, they have rule-making and stakeholder processes where policies are made that shape competition. These stakeholder and governance processes are not perfect. Where a grid operator covers multiple states, tensions sharpen: the grid operators in New England and PJM have entered a dramatic policy battle pitting state policies against the grid operators’ perception that certain power plants receive economic subsidies. In these rule-making and stakeholder processes, PJM accepted the idea that state policies are subsidies.

PJM’s functions for reliability and adequacy of the power supply involve consumers and utilities in 13 states and the District of Columbia. All grid operators create a demand forecast and a projection of needed future electricity supply. This is key to signaling the need for new investment in power plants or alternatives, which helps ensure reliability. PJM’s approach to ensuring adequate supply also addresses the challenges related to power plant utilization and revenues from energy sales that vary by hour and season. PJM calls this the Reliability Pricing Model, or RPM. It operates through a series of auctions that are expected to determine which existing plants remain operating in future years or close, and which new plants will be built.

Take a deep breath – we are diving in deep

There is so much investment in our electricity supply that it is unrealistic to think any fuel or power plant is entirely free of subsidies. PJM got into trouble by trying to pick sides while pretending that it wasn’t doing so. In practice, the folks with subsidies from “the old days” are unhappy that there are new subsidies. What might have been a principled stand by PJM about the new subsidies and their impacts on a market instead has to contend with many layers of subsidies and protections. We debated specific fuel subsidies and tax breaks, only to discover that the very basics of old utility monopolies would be put on the table by FERC.

Since RPM pays for capacity that can produce energy (or reduce demand) separately from how many days or hours a plant actually runs, the debates over retiring coal plants, maintaining nuclear plants, and how to recognize subsidies all focus on the RPM market. (In the midst of these debates, many observers say all the tweaking and adjustments PJM makes prove the RPM is not actually a market… but that is another debate.) PJM started the debate over state actions in 2016, when legislatures in Illinois and New Jersey took steps to provide nuclear plants with additional revenues. This, along with earlier action by the Ohio Public Utilities Commission and the West Virginia Public Service Commission to protect coal plant owners from losing money in the energy market, led PJM to the position that state policies supporting existing plants could be suppressing the RPM auction prices. At this stage, PJM is saying it has a problem with every state it serves (except Kentucky, but that may not last), as each has either a renewable portfolio standard in its laws, nuclear support in its laws, or a recent regulatory decision bailing out a coal plant.

The auction clearing prices are applied to all generators in the auction, so PJM says it is keenly interested in preventing out-of-market revenues from supplementing the auction prices bid by generators, since those revenues hide the generators’ true costs and suppress auction prices. However, there is a spectacular hidden exception to this pursuit of accuracy and fairness in auction bids and results. (Sorry, Kentucky.)

“Guaranteed revenues” sounds like a subsidy

PJM has long accepted the presence in its markets of a category of old plants (mostly coal) that receive state support through consumer bills and are protected from competition. These old plants are a legacy of monopoly utilities that have their costs repaid through state-approved rates paid in consumer electric bills. This remnant allows old generation that is owned by utilities and still paid through cost-recovery rules to automatically succeed in the capacity auction (PJM rules call this “a mechanism to guarantee that the resource will clear in the Base Residual Auction”). The effect of this provision in PJM’s rules is that it allows state-supported generation to bid low in the auction while receiving out-of-market revenues from state-sponsored payments made by consumers.

A rough estimate of the old plants protected in Kentucky, Ohio, Virginia and West Virginia is approximately 40,000 MW. Another measure is that over 100 power plants in PJM bid zero in the most recent capacity auction.
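To see why those zero bids matter, here is a minimal sketch in Python of a uniform-price capacity auction with entirely hypothetical plants and numbers. It is not PJM’s actual RPM clearing algorithm; it only illustrates the basic mechanic that the last accepted offer sets the price paid to every cleared resource, so low bids backed by out-of-market revenue pull the clearing price down for everyone.

```python
# Minimal sketch of a uniform-price capacity auction (hypothetical numbers,
# not PJM's actual RPM rules). Offers are accepted from cheapest to most
# expensive until demand is met; the last accepted offer sets the price
# paid to every cleared resource.

def clearing_price(offers, demand_mw):
    """offers: list of (capacity_mw, offer_price_per_mw_day) tuples."""
    accepted_mw = 0
    for capacity_mw, price in sorted(offers, key=lambda o: o[1]):
        accepted_mw += capacity_mw
        if accepted_mw >= demand_mw:
            return price
    raise ValueError("not enough capacity offered to meet demand")

# Hypothetical market-based offers: (MW, $/MW-day).
market_offers = [(5000, 90), (5000, 120), (5000, 150), (5000, 200)]

# The same auction with 5,000 MW of cost-of-service plants bidding zero,
# because state-approved rates already guarantee their revenue.
with_guaranteed_plants = [(5000, 0)] + market_offers

demand = 15000  # MW of capacity the grid operator needs to procure

print(clearing_price(market_offers, demand))           # 150 $/MW-day
print(clearing_price(with_guaranteed_plants, demand))  # 120 $/MW-day, suppressed
```

Because every cleared generator is paid that single clearing price, the suppression hits the revenue of plants that bid their true costs, which is exactly the price-suppression complaint that runs through this fight.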

In its stakeholder process, PJM pushed to decide what kinds of subsidies it would tolerate and which it needed to “correct” or “adjust” so that the RPM auction would have correct prices. This all got out of hand when PJM requested permission from FERC to adjust bid prices for nuclear and renewable generation that receive out-of-market payments. The industry was not prepared for what happened next. The proposed “minimum offer price rules” were rejected in a split decision that declared PJM had not gone far enough to root out all out-of-market payments to generators of all kinds. The three Trump-appointed commissioners voting to reject the PJM proposal also found PJM could not rely on existing rules, as those would result in rates that are not just and reasonable.

FERC makes a big splash

FERC instructed PJM to make several changes not proposed by any party to the case, and to do so quickly. In effect, FERC ordered PJM to reshape a market that distributes $6–10 billion a year, maintains reliability, and determines which coal plants close. Acknowledging this would be difficult, FERC nonetheless ordered that it be done in 90 days. (FERC has since granted an extension of six weeks.)

Does anyone know what happens next?

As of late August, PJM’s discussions with stakeholders have not been promising. There is no clarity on what counts as a subsidy. Is a municipal- or cooperative-owned electric plant “subsidized”? Consumer-owned utilities see no overlap between their business model and the issues in this debate. If the U.S. Department of Energy orders payments to uneconomic old coal plants to keep them open, is that a subsidy that should be “corrected”? PJM has said yes: it intends to subject any DOE-directed support for coal or nuclear plants to the bid-price re-setting, to protect the PJM auction from interference by subsidized bids. PJM has said things like “if the reason is national defense, then the payments should be made from a nation-wide fund.” And on August 15, PJM told stakeholders that “Out-of-market payments from any federal program adopted [after 3/21/16] will be subject to [adjustment through] Minimum Offer Price Rules, unless there is a clear statement of congressional intent indicating otherwise in the law creating the subsidy.”

In other words, PJM still believes that as the regulated utility responsible for the process, they should decide which subsidies are OK and which are not. PJM wants to stick with their plan:

  • State payments from state laws establishing renewable portfolio standards are bad,
  • State payments for old coal plants that are paid for in rates are OK, because that’s the way we have always done it, and
  • New federal subsidies are bad, but old federal subsidies are OK.

FERC, the regulator, has said all the subsidies are bad. And of course DOE has said once, and will soon say again, that a new subsidy is good.

Now you are up to speed.

Photo: EarthCareNM

Amazon Deforestation in Brazil: What Does it Mean When There’s no Change?

Photo: Brazilian things/Wikimedia Commons

I was recently invited by the editors of the journal Tropical Conservation Science to write an update of a 2013 article on deforestation in the Brazilian Amazon that I had published with Sarah Roquemore and Estrellita Fitzhugh. They asked me to review how deforestation has changed over the past five years. The most notable result, as you can see from the graph in the just-published article (open-access), is that overall it hasn’t changed. And that’s actually quite surprising.

During the late 90s and early 2000s the deforestation rate in the Brazilian Amazon averaged about 20,000 square kilometers per year, driven by the rapid expansion of cattle pasture and the commercial soybean industry. Then, starting around 2005, it began to drop rapidly, falling by 70% in just half a dozen years. This dramatic drop cut Brazil’s national global warming emissions very substantially, in addition to having important benefits for biodiversity and for the people of the Amazon basin.

Since then – essentially no net change. There have been small fluctuations up and down in the annual measurements of deforestation (up in three years and down in three years, to be specific) but it remains at basically the same level. In 2017 the annual loss of Amazon forest was 6,947 km2; that compares to 6,418 km2 in 2011.

Why is this surprising? Because in the same period, Brazilian politics has been incredibly chaotic. To cite the most striking developments during this turbulent period: one President has been impeached and removed from office; an ex-President (during whose administration the decrease in deforestation was achieved) has been jailed and prevented from running again; and politicians across the political spectrum have been implicated in the corruption scandal known as “Lava Jato,” or Car Wash. Not to mention a major economic depression, the passage of legislation weakening Brazil’s Forest Code, and the indictment of the world’s largest meatpacking company, JBS S.A., on charges relating both to deforestation and to selling tainted meat.

Why then, did deforestation remain essentially the same?

While there are many factors involved, the lack of change does seem to reflect the institutionalization of the reasons that caused deforestation to drop in the earlier period. These include regulations (and prosecutions) limiting the sale of beef and soy from deforested areas; increased transparency concerning who is deforesting and to whom they’re selling their beef and soy; improvements in efficiency which allowed farmers and ranchers to raise output without clearing more land; and underlying these, the development of a political movement, led by Brazilian NGOs, that made deforestation an important issue in national politics.

If the lack of change in deforestation is interesting, so is the way that the international media have covered it. My co-author Dora Chi and I reviewed news stories on Amazon deforestation (using Lexis-Nexis; our search found 134 print articles from 2013 through 2017) and discovered a common theme: the idea that although deforestation had fallen in earlier years, now it had gone back up. As our review showed, even though this interpretation isn’t borne out by the data, it was nonetheless quite frequently used in the media narratives about deforestation.

Perhaps this misinterpretation simply reflects a common journalistic tendency to write “on the one hand… but on the other hand…” stories. Or maybe it’s that you can’t get a story into print if it says there’s nothing new. It may also reflect our tendency to present data such as deforestation rates as percentages, without realizing how misleading percentages can be when they use different denominators. A quick example – if my income dropped by 50% last year, then turned around and increased by 50% this year, am I now back to where I was two years ago? No – I’m actually still 25% below that level.
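For the skeptical, the arithmetic of that income example works out as follows (a minimal sketch in Python):

```python
# The percentage-change trap: a 50% drop followed by a 50% rise does not
# return you to the starting point, because the rise applies to a smaller base.
income = 100.0
after_drop = income * (1 - 0.50)         # 50.0 after the 50% drop
after_rebound = after_drop * (1 + 0.50)  # 75.0 after a 50% rise on the smaller base
print(after_rebound)                     # 75.0, still 25% below the starting level
```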

So, both the lack of change in the data and the miscommunication of its stability in the media are notable phenomena. But there’s a third (non-)event worth noting, and that’s the fact that deforestation hasn’t dropped to zero, as it would have if the earlier trend had continued. This is a major failure in terms of its effect on climate change and efforts to rein in global emissions. It shows that Brazil’s political turbulence has had important consequences for the global environment.

Photo: Brazilian things/Wikimedia Commons

One Year after Maria, Puerto Rico Deserves a Solid, Resilient, and Healthy Power System

Lights in San Juan. Puerto Rico Photo: Paula García

Hurricane Maria, one of the most extreme climate events to devastate the island of Puerto Rico (PR), left tragic statistics in its wake: thousands of people killed, material damages of more than $90 billion from which many people are still struggling to recover, hundreds of animals (abandoned, lost, and hurt) that are still looking for a home—and the largest power outage in US history, one that for a large swath of the population lasted for months.

While the lights have come back on for the majority of Puerto Ricans, the hurricane and the destruction it caused shone a spotlight on an electric power system that was on the edge of collapse and that today demands urgent investment. Today’s decisions about investment and management will define whether the system can survive, recover, and be resilient for the long term.

Earlier this month, Ciencia PR, the Caribbean division of the American Association for the Advancement of Science (AAAS-CD), and the Union of Concerned Scientists organized a “Science in Action” symposium, in which one of the questions explored focused on that very issue: What can we do to make sure that the island’s power system emerges solid, resilient, and healthy?

Here I share some of the key issues that emerged in a panel discussion between Lionel Orama of the National Institute of Island Energy and Sustainability (INESI, in Spanish), Agustín Carbó Lugo of ClimaTHINK, former commissioner of the Puerto Rican Energy Commission (CEPR), and me.

Science, innovation and energy panel. From left to right: Lionel Orama, Agustín Carbó and Paula García

A critical moment for the power sector
  • Maria was the straw that broke the camel’s back of an electricity sector that was already on the verge of collapse. The Puerto Rico Electric Power Authority (PREPA, or AEE in Spanish) was already in an extreme fiscal crisis, with a debt of $9 billion. This contributed to PREPA underinvesting in infrastructure and in the maintenance of facilities and equipment. When it hit, the hurricane knocked down 80% of the power poles and all of the transmission lines, leaving the island’s 3.4 million inhabitants in the dark.
  • For years, PREPA has clung to the use of fossil fuels, forcing Puerto Ricans to depend on fuel imports, exposing them to swings in fuel prices, and subjecting them to the financial stress associated with operating power plants dependent on oil, coal and natural gas. The lack of an energy mix that was diversified, decentralized, and free of the dependence on imported fossil fuels has prolonged even more the recovery of—and confidence in—the energy services provided by PREPA.
  • The privatization of PREPA is adding to the fiscal and operational uncertainty. At the beginning of the year, Governor Ricardo Rosselló signed into law the privatization of PREPA. This privatization will define who generates the electricity, from what sources, and at what prices; so far there’s total uncertainty about the answers to these questions and the impact they’ll have on island residents.
  • Likewise, for years the island lacked an oversight entity to ensure transparency and the optimal functioning of PREPA, until the CEPR was created in 2014. Despite its importance, the CEPR’s work has been threatened by a new law recently signed by Gov. Rosselló.
The transformation that Puerto Ricans deserve
  • The voices of the scientific community and civil society need to be reflected in the development of the utility’s “integrated resource plan” (IRP). Having them at the table is key for making sure that decisions made are informed by solid technical analyses that respond to the needs of the communities. INESI is one of the organizations contributing to this effort.
  • A solid system needs to consider diversification and resilience. It’s crucial to reduce dependence on fossil fuel imports, diversify generation to incorporate local sources of energy (like solar and wind), upgrade electrical distribution systems, and integrate microgrids and energy storage systems to increase confidence in the grid and meet critical needs (at health centers, in emergency shelters, and for water pumping systems, for example). Reducing energy consumption through energy efficiency programs is also crucial. All of this should be guided by principles of transparency and affordability.
  • A healthy system should benefit us all. Emissions of heat-trapping gases (like carbon dioxide and methane) from power plants based on fossil fuels (like oil, coal and natural gas) just worsen the effects of climate change, like hurricanes, floods, and droughts that become ever more devastating. Burning fossil fuels also emits a number of air pollutants (like sulfur dioxide, nitrogen oxides and particulate matter) that can have big impacts on our health. It’s vital that we make the transition to clean energy as quickly as possible.
Energy, climate, and health: An equation that affects us all

My visit to the Isla del Encanto affected me deeply. Interacting with some of the island’s experts on energy, environment, and health reconfirmed for me that these variables are intrinsically linked at the local level. I return to Boston inspired by all of the work led by the symposium’s organizers and participants, and motivated to collaborate with them on these themes, which have an impact not just locally but globally.

While climate change affects us all, some communities are more vulnerable to the bad decisions that others have taken for them. Hopefully the power that comes from working together can help us to have an increasingly strong voice for urgent action to address climate change, for the sake of our fellow humans, for our fellow living beings, and for our one planet, Earth.

 

Grace. Brought from a shelter in PR to one in MA for adoption.

*NOTE: As the beginning of this post mentions, the hurricane left hundreds of animals (abandoned, lost, and hurt) in need of homes. The island’s shelters have limited capacity (both physical and financial) and need volunteers that can take animals to shelters in different parts of the US. For those who travel to Puerto Rico and are interested in helping out, All Sato Rescue can fill you in. I brought home Grace, an adorable puppy that will soon be up for adoption via Buddy Dog.

This blog is available in Spanish here.

 

Audrey Eyring Paula García

Will Happer, a Climate Science Denier, Joins the White House

Photo: Gage Skidmore/Flickr

News broke Tuesday that Dr. William Happer, a skeptic of climate science and professor emeritus of physics at Princeton University, has joined the National Security Council, directing an emerging technologies portfolio. The scope of his responsibilities and the power he will wield remain unclear, as the position appears to be newly created. However, Dr. Happer’s public condemnation of the scientific consensus around climate change (a field in which he is not an expert) is cause for serious concern, especially given the National Security Council’s role in setting high-level foreign policy and the growing threat climate change poses to our nation’s security. Yet again, the White House has elevated an individual who denies the science around climate change to a position of power; individuals such as Happer are not the exception in this administration, but the rule.

Dr. Happer is known for his important contributions to the field of modern atomic physics. He has been on the faculty of Princeton University since 1980, during which time he has also been active in government service, having, for example, served as the Department of Energy’s director of energy research during the George H.W. Bush administration.

However, Dr. Happer is also known for his dismissal of the scientific consensus around climate change, making scientifically unfounded statements that more carbon dioxide will be a net benefit to society. He has also questioned the science underpinning the Paris Agreement (a worldwide commitment to reduce global warming emissions and limit the increase in global temperature to well below 2 degrees Celsius) and recommended withdrawing from the Agreement. Such statements caught the attention of the Trump administration, which considered Dr. Happer for the post of Director of the White House Office of Science and Technology Policy (OSTP), a post Dr. Kelvin Droegemeier was nominated to fill. Instead, the administration named him to his new role directing the emerging technologies portfolio.

In his new position, Dr. Happer is now within earshot of the President and his closest climate and energy advisors. While it is unclear what his specific responsibilities will be, we will be watching to see whether he advocates for technologies that help reduce carbon emissions in line with climate goals and that help make the world a safer place. The White House must specify soon what exactly Dr. Happer will be working on. We will be watching to ensure that, if his role drifts into climate-related issues such as policies around our climate, the ocean, the Arctic and Antarctic, and Earth observations, he relies on and accurately conveys the latest science, represented by the expertise of the many excellent agency scientists and scientific resources available to him.

Photo: Gage Skidmore/Flickr
