UCS Blog – The Equation

Will Chevron Show Leadership in Climate Solutions? Notes From the 2018 Shareholders’ Meeting

Photo: ArtBrom/Flickr

Last week, I joined the Union of Concerned Scientists at the Chevron shareholders’ meeting in San Ramon, CA. We were there to ask why Chevron’s leadership and shareholders have not pushed for more meaningful action to meet global emissions targets that would keep climate warming well below 2 degrees Celsius.

The security to get into Chevron headquarters in San Ramon was tight – more stringent than typical airport security. In addition to multiple checks of our passes and a walk through metal detectors, we were only able to bring in paper and pen, and our papers were shuffled through and inspected on the way in. Once seated, we listened to presentations by the company’s Chair and CEO and by shareholders advocating proposals on environmental, social, and governance issues. During this time, shareholders followed the Board’s recommendation to reject proposals to “transition to a low carbon business model” and improve lobbying disclosures, among other things.

During much of the meeting, I was scribbling down notes and adapting my prepared statement based on what I was hearing. I also spent some time studying this infographic provided in the Chevron Climate Change Resilience report (data from the IEA 2015 World Balance and Final Consumption report):

This diagram highlights the flow of energy – the width of the bars reflects the relative size of production/consumption – in our current fossil-fuel-focused energy system. It lets you trace each energy source to the different areas of the economy that consume it. One remarkable aspect of this data, pointed out in the Climate Change Resilience report, is that “about 25% of global oil consumption is used in personal vehicles” (to see this, follow the bar from “oil” to “transport” and then to “passenger”). This means that every day we drive our personal vehicles, we are making choices about fossil fuel emissions that add up to something very significant. I was struck by this statistic because it underscores something I frequently address in my public talks about climate change: personal, individual action is one piece of the puzzle in solving the climate problem. But there are other pieces of the puzzle – government leadership and corporate accountability – which I address below.
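For readers who want to see the arithmetic behind reading such a flow diagram, here is a minimal sketch in Python. The branch fractions are illustrative assumptions chosen only so that the oil-to-transport-to-passenger path lands near the 25% figure; they are not the IEA values behind Chevron’s infographic.

# Toy model of reading an energy-flow (Sankey) diagram: each branch is a
# fraction of its parent flow, and a path's overall share is the product
# of the fractions along it. The numbers below are illustrative
# assumptions, not the IEA data in Chevron's report.

flows = {
    "oil": {
        "transport": 0.56,              # share of oil used by transport (assumed)
        "industry/other": 0.44,
    },
    "transport": {
        "passenger": 0.45,              # share of transport oil used by passenger vehicles (assumed)
        "freight/aviation/marine": 0.55,
    },
}

def path_share(path, flows):
    """Multiply branch fractions along a path, e.g. oil -> transport -> passenger."""
    share = 1.0
    for parent, child in zip(path, path[1:]):
        share *= flows[parent][child]
    return share

share = path_share(["oil", "transport", "passenger"], flows)
print(f"{share:.0%} of oil ends up in passenger vehicles")  # ~25%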

At the end of the scheduled shareholder proposals, it was time for the lottery of Q&A. Each of us who had a question or statement had to get a numbered ticket; tickets were pulled randomly, and there was no guarantee that all questions would be heard. In total, about a dozen people asked questions or made statements to the Chairman. Almost all were on three topics: climate change, human rights, and an ongoing lawsuit with the people of Ecuador over a decades-old environmental disaster.

Here was my statement and question when my number was called:

Good morning Mr. Chairman, members of the Board, and Stakeholders. Your recent Climate Change Resilience report was a step toward responding to investor demands that you disclose your plans for operating in a world where global temperature increase is kept well below two degrees Celsius. However, your company emphasizes potential conflicts rather than synergies between climate solutions and other societal goals and dismisses a rapid transformation of our energy system as “unlikely.”

I am a scientist here in Northern California. One area of my research focuses on the impact of rising carbon dioxide concentrations on the changing chemistry of the ocean. I collaborate with businesses along the coast that are deeply concerned about the impacts of rising carbon dioxide on their financial future. Specifically, rising carbon dioxide concentrations threaten a key part of California’s history, culture, and economy – sustainable harvests of food from the sea. As a scientist, I understand the grave risks we face without deep reductions in emissions and know that swift action is precisely what is needed to avoid the worst effects of climate change.

You stated this morning, and you describe in the Climate Resilience Report, that a first principle that guides your views on climate change is that “reducing greenhouse gas emissions is a global issue that requires global engagement and action”. Yet, in this report you bet against our ability to tackle meaningful energy transformation. When will Chevron show greater ambition to keep global warming below 2 degrees C?

In his answer, Chair and CEO Michael Wirth was respectful and thanked me for my work in the scientific community. He explained that the company simply “meets the demands of energy used by people around the world,” and that it does “look at low carbon scenarios” as part of its business plan. However, Mr. Wirth argued that global policies are needed – ones that would require government intervention – and that it isn’t the role of individual companies to make decisions on this matter. This was an interesting answer because it spelled out something that Chevron doesn’t say directly in its public report: the company isn’t planning on taking leadership on climate change until governments lead the way. That is hard to imagine, since fossil fuel companies spend millions every year lobbying our government to support policies that promote the use of oil and gas.

Why does this matter – and why would a climate scientist attend a Chevron shareholders’ meeting? I pondered this quite a bit when I was asked to join the UCS team for the meeting that day. For me, the decision came down to three things. First, I am asking Chevron to use the best available science to make decisions for our future. Was I being an ‘advocate’? Yes – I am advocating for the use of science in decision making. Second, I have made a commitment to not just communicate with those who already agree with me. We need to put ourselves in situations where we work to find common ground and shared values with people in many different communities. Finally, as I’ve discussed above, I think individual responsibility is one aspect of the problem – people need to feel emboldened to make their own decisions that place our planet on a better path. But individuals can’t solve this problem alone: corporate accountability is important here. We need to ask more of corporations that contribute significantly to our greenhouse gas burden. If they contribute significantly to the problem, they should contribute significantly to the solution.

 

Dr. Tessa Hill is a Professor and Chancellor’s Fellow at the University of California, Davis, in the Department of Earth & Planetary Sciences. She is resident at the UC Davis Bodega Marine Laboratory, a research station on the Northern California coast, and is part of the Bodega Ocean Acidification Research (BOAR) group there, which aims to understand the impact of ocean acidification on marine species. Tessa leads an industry-academic partnership to understand the consequences of ocean acidification for shellfish farmers. She is a Fellow of the California Academy of Sciences, an AAAS Leshner Public Engagement Fellow, and a recipient of the Presidential Early Career Award for Scientists & Engineers (PECASE).

Our Latest Automaker Rankings: What the Industry Needs to Do to Keep Moving Forward

Every few years, UCS takes a look at the auto industry’s emission reduction progress as part of our Automaker Rankings series of reports. This year’s analysis, based on model year (MY) 2017 vehicles, shows that the industry has once again reached its lowest levels yet in both smog-forming and global warming emissions from new vehicles, despite the fact that many off-the-shelf technologies are deployed in less than one-third of all new vehicles. Unfortunately, this record-setting progress also shows signs of slowing, with Ford and Hyundai-Kia making no progress on reducing global warming emissions, and Toyota actually moving backwards.

At the same time, the industry has spearheaded an effort to re-litigate the fuel economy and emissions standards set through 2025, and this report comes out while a proposal from the current administration responding to that request – one that would completely halt the industry’s progress at 2020 levels – sits awaiting public release. So while this year’s Automaker Rankings highlights some of the progress made by leaders in the industry on the technology front, it’s also critical that on the political front these companies stand up to the administration to ensure the rest of the industry continues to move forward on reducing emissions.

The technology to meet future standards is out there

One of my key takeaways from this report is that while standards have in many cases accelerated the deployment of new technologies, some of the most cost-effective strategies to reduce emissions are still sitting on the shelf. The industry’s progress to date is barely a glimpse of where gasoline-powered vehicles could be in the future, as shown in the figure below.

While vehicle standards have led to significant growth in a number of technologies, even many of the most cost-effective technologies to lower emissions have been deployed in only a small fraction of the fleet, leaving plenty of room for further reductions.

On top of this, many of the deployed technologies, like advanced transmissions, still offer significant room for incremental progress. We’re also seeing novel developments in other technologies like start-stop, where 2018 marks the first deployments of higher-voltage (48V) systems that enable complementary technologies such as electric boost, further pushing out the horizon for combustion engine improvements. For this and many other reasons, it’s baffling to see the industry assert that meeting 2025 vehicle standards requires widespread vehicle electrification.

No more Greenest Automaker

Of course, electric vehicles are one of the reasons for a key difference in this year’s report: we are now including the results of all automakers, not just the largest companies that sell vehicles of all sizes and types. A lot of the development of technologies that could pave the way to a lower-emissions future is coming from some of the smallest manufacturers, whether that’s Tesla’s all-electric fleet or Mazda’s SkyActiv-X spark-assisted compression ignition engine, which looks to bring diesel-like operation to a gasoline engine. Ignoring this leadership from smaller automakers would mean ignoring some of the most forward-looking technology deployment in the industry.

Additionally, it’s important to recognize that this report is limited to the emissions of the vehicles sold by manufacturers—it does not consider other aspects of operations that also affect the sustainability and “greenness” of a company, whether related to water use at its facilities, renewable power sourcing, or other aspects of the manufacture and distribution of a manufacturer’s fleet.

Considering these two central limitations, we have decided to no longer award a “Greenest Automaker.” It’s important to recognize the wide difference between the fleet emissions of Honda, which has again asserted its leadership by providing the lowest-emission fleet among full-line automakers, and Fiat Chrysler, which finds itself producing a fleet better only than those of McLaren, Ferrari, and Aston Martin—automakers that produce only exotic sports cars meant more for a track than a highway—but that is only part of the story.

The gap between leaders and laggards is huge and pervades all vehicle classes

One of the reasons we have previously excluded small manufacturers is that they provide a narrow spectrum of vehicles—and it’s been a historic complaint from companies like Ford that they should get a pass because people want big trucks. But one of the key findings from this year is that the Detroit Three fall to the bottom of the pack not because they sell big trucks, but because in virtually every class of vehicle they sell, their cars and trucks emit more than the rest of the industry. And the reverse is true for a company like Honda.

Honda is the manufacturer with the lowest emissions because it invests broadly in improvements across its entire fleet. Similarly, the Detroit Three don’t perform poorly because they sell a lot of trucks—they perform poorly because their vehicles emit more than the industry average in most classes of vehicle.

The only company whose ranking is significantly affected by the mix of vehicles it sells is Toyota—but that was an intentional decision on its part. Toyota chose to boost production of its least efficient vehicles, trucks and SUVs, while bypassing investment in improving those vehicles. If it wants to catapult back to the top of the pack, it will need more than the Prius to look like a leader—it’s about providing consumers lower-emission choices across the entire spectrum of vehicles sold.

A path forward

With every Automaker Rankings, we try to provide the industry with a path forward. And the truth is, the engineers at these companies have been working their butts off to provide a bright future for the industry…should they choose to embrace it.

Manufacturers have made a number of pronouncements about the vehicles planned over the next five years that could easily keep emissions levels on the path envisioned under the 2025 standards now on the books. And we have tried to highlight the role these vehicles can play in creating a more sustainable transportation future.

But too many within the industry have been looking to ignore their role in getting to this low-emissions future, so the question remains: will the industry accelerate toward a cleaner future by following its engineers, or continue to deploy its lobbyists to slam on the brakes?

It’s Time to Implement Stronger Autonomous Vehicle Testing Standards

Photo: Grendelkhan/Wikimedia Commons

The widespread introduction of autonomous vehicles could bring many benefits – advocates argue they will reduce traffic, the burden of driving, and, should the cars be electrified, emissions. They could also improve access for children, the elderly, and people with disabilities – but the most important benefit is improved safety.

U.S. road fatalities increased 5.6 percent from 2015 to 2016 – a disturbing trend, and the largest increase in the last decade. Proponents of self-driving technology will tell you that these cars will slash those numbers significantly because the human driver is taken out of the equation. According to the National Highway Traffic Safety Administration, there were 5,987 pedestrian fatalities in 2016 – the highest number since 1990 – and 847 bicyclist fatalities, the highest since 1991. And although fatalities from distraction and drowsiness went down 2.2 and 3.5 percent, respectively, they were offset by increases in other reckless behaviors: speeding-related fatalities rose 4 percent, alcohol-impaired fatalities 1.7 percent, and unbelted fatalities 4.6 percent.

Autonomous vehicles are being tested in several states and provinces, such as California, Pennsylvania, and Ontario. The graphic below shows the status of autonomous vehicle testing laws across the country – 25 of 50 states have passed laws overseeing testing. Uber and Waymo have taken the lead in testing – Waymo has logged over 5 million miles, and Uber, although far behind, has logged a significant 2 million miles itself. California has been working with testing companies under a regulatory framework, while states like Arizona have given companies free rein to test the vehicles on public roads, with a backup human in the driver’s seat to compensate for any failures in the software. But what happens if the driver gets distracted and loses focus? Or when the autonomous system doesn’t have a sufficient way of warning the driver that they need to take over?

Current Status of State Laws on Self-Driving Cars
Source: Brookings Institution and the National Conference of State Legislatures.

The NTSB presents its findings

According to a preliminary report released by the National Transportation Safety Board (NTSB), that is exactly what happened when an Uber self-driving platform controlling a Volvo XC90 struck and killed a pedestrian walking a bicycle across the street in Tempe, Arizona on March 18. The initial reaction of the chief of the Tempe police on March 19, after viewing the vehicle’s own video of the event, was that Uber was likely ‘not at fault’ for the incident. After a more thorough investigation, however, the NTSB report states that the Uber system “registered…observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path.” The Volvo XC90 has its own factory emergency braking system, but it was disabled whenever the Uber self-driving system was controlling the vehicle, to “reduce the potential for erratic behavior.” Emergency braking could have prevented the crash or reduced its severity: the self-driving system determined 1.3 seconds before impact that an emergency braking maneuver was needed, but with the Volvo system disabled, the car relied entirely on the human operator to brake. The driver appeared to have been distracted by the computer interface and did not see the pedestrian in the street. By the time the driver looked up, saw her, and pressed the brake, it was too late.

View of the self-driving system data playback at about 1.3 seconds before impact.
Source: National Transportation Safety Board.

Safety advocates across the spectrum have cautioned lawmakers about the rapid pace of testing, saying that it is too soon to have these vehicles on public roadways interacting with pedestrians and bicyclists. Moreover, reports suggest that Uber’s self-driving system was struggling to navigate public streets, with drivers needing to intervene and take control from the automated system once every 13 miles, compared to more than 5,000 miles between interventions for the Waymo systems being tested in California. Real-world testing on public roads is clearly needed to improve self-driving technology, but it should only happen once public safety can be assured.

Congress is pushing federal legislation too quickly

This fatal crash is a stark reminder of the risks involved in racing to bring automated driving technology to market without adequate oversight. Senator John Thune, the Republican Chairman of the Senate Committee on Commerce, Science, and Transportation, remarked that the tragedy underscores the need for Congress to “update rules, direct manufacturers to address safety requirements, and enhance technical expertise of regulators.” Senator Gary Peters also chimed in, saying that “Congress must move quickly to enhance oversight of self-driving vehicles by updating federal safety rules and ensuring regulators have the right tools and resources to oversee the safe testing and deployment of these emerging technologies.”

Yet while state and local governments grapple with responses to this tragedy, the architects of the Senate self-driving bill are renewing their push to get it through Congress. The Detroit News reported that Peters and Thune are still attempting to win support from reluctant senators. The bipartisan duo is also weighing whether to attach the measure to another bill with better prospects for a full vote or to pass it as a standalone bill.

This push concerns us, as we question whether the AV START Act is the right vehicle to meet those aims. The bill would allow hundreds of thousands more autonomous vehicles on our roads with lax oversight, and would pre-empt the good work that state and local governments are doing to regulate AV testing in their jurisdictions.

Safety of all users of the road must be the top priority

In our policy brief “Maximizing the Benefits of Self-Driving Vehicles,” UCS advocates that “rigorous testing and regulatory oversight of vehicle programming are essential to ensure that self-driving vehicles protect both their occupants and those outside the vehicle.” In October 2017, UCS expressed its concerns about the lack of scientifically based safeguards in the Senate’s AV START bill. Already, cities and states are discussing how to regulate AVs more strictly. Pittsburgh Mayor Bill Peduto planned to ask representatives from the AV industry to agree to a 25-mph limit on city roads. “Pittsburgh should have a very strong voice in whatever Pennsylvania should decide to do,” Peduto told reporters Tuesday. “These are our streets. They belong to the people of the city of Pittsburgh and the people of the city of Pittsburgh should be able to have certain criteria that shows them that safety is being taken first.” However, the city has limited authority to regulate vehicles on its streets. California is taking a different tack: its Public Utilities Commission recently released guidelines that will allow AVs to pick up passengers – as long as the company has held an autonomous vehicle testing permit from the DMV for at least 90 days, agrees not to charge for the ride, and files regular reports on the number of miles its self-driving vehicles travel, the rides they complete, and the disabled passengers they serve.

Uber and other companies will have to reassess their procedures for AV road testing and states will have to re-evaluate how freely they allow self-driving cars to be tested on their roads. Furthermore, municipal governments need to be at the table working with companies to develop robust safety standards. We need to ensure at all levels of government that adequate, sound safeguards are implemented, so that autonomous vehicles can truly achieve the safety benefits they are expected to have.


Why NASA’s Carbon Monitoring System Matters

Future funding for NASA’s Carbon Monitoring System (CMS) was recently cancelled, leaving the important program in jeopardy unless Congress takes action soon.

Why is the Carbon Monitoring System important?

The CMS complements other NASA carbon monitoring work by taking Earth science observations and doing research with them that improves understanding of how carbon flows through our land, biosphere, waterbodies, ocean, and atmosphere. As a result, the CMS funds work that allows for better management of these natural resources and better understanding of how they will respond to global changes, such as climate and land use change.

The research that comes out of the CMS has real world benefits for communities across the United States, as one of its objectives is to meet stakeholder needs across a range of scales. For example, NASA’s Carbon Monitoring System recently funded work that provided fine-resolution information on a vast swath of forest stocks in interior Alaska. This project made use of NASA airborne LiDAR technology to provide information on forest resources within the state that were previously understudied due to their remote location. The region covered in the analysis accounts for 15% of US forested land.

Other CMS projects include research to:

  • Improve measurements of carbon storage in the U.S. Corn Belt, which stands to enhance measurements of agricultural productivity and management practices.
  • Advance understanding of the Great Lakes, and how carbon is stored in them.
  • Improve understanding of how land use decisions and changes in climate such as extreme weather events affect nutrient cycling and water quality in the Gulf of Mexico.

Through these projects and others, CMS is helping stitch together observations of carbon sources and sinks to produce a high-resolution representation of how carbon flows through our planet.

What Congress can do to ensure the survival of the Carbon Monitoring System 

In May, the House Commerce, Justice, and Science subcommittee approved language that would restore funding for the CMS in a bill that funds NASA in 2019. The amendment provides $10 million to the program, now under a slightly different name (the Climate Monitoring System), with funds that would “… help develop the capabilities necessary for monitoring, reporting, and verification of biogeochemical processes to better understand the major factors driving short and long-term climate change.” Given that this is virtually identical to the mission of the current CMS, it seems clear the intent of the amendment is to continue funding the existing program.

As the Senate now turns to mark up the FY19 Commerce, Justice, and Science appropriations bill, it will have the opportunity to ensure that funding for this important initiative continues. An investment in the CMS would ensure that NASA has the funds to study its carbon measurements and make them usable for decision makers, resource managers, and communities across the country.

 

Debriefing the EPA’s Science Advisory Board Meeting

I spent most of Thursday and Friday this week at the EPA’s Science Advisory Board (SAB) meeting in Washington, DC, as the 44 members gathered to discuss EPA’s regulatory agenda and hear updates from EPA programs on lead, per- and polyfluoroalkyl substances (PFAS), and the Integrated Risk Information System (IRIS). As I explained earlier this week, it was the first meeting for the 18 members who were appointed after Administrator Pruitt issued his directive barring EPA-funded scientists from serving on the committee. Much of the meeting turned out to be an exercise in reaching consensus among a group of over 40 people on a select few decisions (you can follow my Twitter thread for more detail here), but some important things came out of the two half-days.

Overwhelming public support for more review of scientific underpinning of EPA regulations

My colleague, Dr. Dave Cooke, during the public comment period.

During the public comment period, 21 speakers provided comments on the EPA SAB workgroup’s memos on the EPA’s regulatory agenda. Public comment periods at these meetings don’t usually last a full hour, but scientists and experts in person and on the phone clearly wanted to express their support for the EPA SAB’s review of several rulemakings in process that would effectively roll back science-based regulations on vehicle and power plant emissions. Many of the oral comments echoed the concerns that the SAB workgroup raised in its assessment of glider vehicles: namely, that EPA had not undertaken an assessment of the emissions impacts of the rule and should do so, and that the technical information relied upon in the proposal was at odds with EPA’s own tests and has since been withdrawn by the body that conducted the research. Additionally, and notably, nine of the oral commenters (including myself) from different fields were in strong support of SAB review of the “Strengthening Transparency in Regulatory Science” proposed rule.

Consensus from SAB on general lack of analysis supporting EPA’s regulatory decisions

After the comments finished up, the SAB moved on to discuss the recommendations of the workgroup (here and here) and to reach consensus on the advice that would be contained in a letter to the administrator coming out of this meeting. At first, some of the newer members advocated that the SAB postpone review of several of the regulatory actions the workgroup had flagged as meriting review until more information was provided by the EPA. In fact, SAB member Dr. Christopher Frey told Politico on Thursday, “Basically they just didn’t provide us with any answers. That kind of put us in a position where all we can really do is say EPA has not identified the science or any plan to review it, and clearly there are science issues that are in the proposed rule.” Luckily, after much conversation, there was an acknowledgement, even from the newest members, that it is better to agree to review and find out later that the scope of the review can be narrowed than to simply kick the can down the road and hope for better information from EPA. Thus, the full SAB was able to vote in favor of recommending to Administrator Pruitt that it review all five deregulatory actions identified by the workgroup as requiring scientific review.

Agreement that Pruitt’s restricted science proposal warrants review

Then the committee moved on to the question of whether to review the EPA’s April proposed rulemaking on transparency in regulatory science. From the outset, all members seemed to agree with the workgroup’s recommendation that it merited review. Dr. Honeycutt even justified the need for SAB review by pointing to the sheer number of questions (27) that the EPA posed in such a short proposed rule. Stanley Young was the only member to voice support for the rule itself, arguing that “mischief has been done” in the past with “studies hiding behind data,” calling out the Six Cities study as an example. This is a common talking point used by Administrator Pruitt and others when discussing so-called transparency; however, it is easily countered: after all the controversy around the study, the Health Effects Institute reanalyzed the data and confirmed the study’s findings. The members were broadly supportive of SAB review of this policy and ultimately voted unanimously in favor of recommending that Pruitt charge them with that duty.

Calls for delay of SAB review come from Pruitt-appointed members

The sentence proposed by committee member, Dr. Steven Hamburg, on the restricted science proposal. It will likely be edited during the drafting of the SAB’s letter to Administrator Pruitt.

The perhaps more contentious piece of this conversation happened on Friday, when the SAB had time to consider exactly what it would ask of Administrator Pruitt regarding this rule. The question became: would the board just ask to review the rule, or would it also request that the agency defer all action on the rule (that is, moving any further in the rulemaking process) until the SAB’s review was complete? Interestingly, only new members disagreed with asking for agency deferral, and Dr. Kimberly White of the American Chemistry Council commented that she thought SAB review shouldn’t begin until the rule reaches the final rule stage. It’s important to note that the American Chemistry Council has lobbied on similar legislation in Congress (the HONEST Act). Not only is it supportive of the rule, but its member companies stand to gain financially from a rule that would limit the agency’s ability to use independent science to set strong standards on chemicals that are environmental contaminants. Thus, Dr. White’s interest in delaying SAB review until it’s too late is right in line with her employer’s agenda of getting this rule finalized and implemented as soon as possible.

Another reason for delaying SAB review of regulatory actions is to wait until the makeup of the committee changes again this fall. Some 15 committee members’ terms end at the end of September 2018, and while 8 of them have served only one term and could be reappointed, it is likely that Pruitt will instead appoint all new members. The final 11 members appointed under the Obama administration have terms expiring in September 2019, 8 of whom have served only one term. By the end of 2019, it is possible that Pruitt could have a hand-selected SAB, and so far, Pruitt appointees appear more interested in delaying SAB review and allowing EPA actions to get farther into the rulemaking process before the SAB weighs in. But the SAB’s role is to be involved in EPA’s deliberative process and to provide advice early enough in the rulemaking process that it can actually influence the science considered by the agency. Advice received after a rule is already finalized is useless.

The ball will soon be in Pruitt’s court

At the very end of the meeting, the SAB agreed that Dr. Honeycutt and the Designated Federal Officer would draft a letter to Administrator Pruitt that would be sent to members for comment. This letter will then go to the Administrator’s office, and once there, there is no requirement that he respond within any window of time and no mandate that he follow the SAB’s advice. He could agree with the SAB and charge it with reviewing EPA actions within a matter of months; he could do the same but stretch the reviews over the course of a year (by which time several of the rules under review might already be finalized); he could disagree with the recommendations and give the board no charge, or a different charge altogether; or he could ignore the letter completely. It’s hard to foresee what he will do: while he seems uninterested in scientific backing for his deregulatory agenda and never responded to the SAB letter sent in September 2017, he took quite an interest in EPA’s federal advisory committees when he issued a directive in October that changed the composition of many of them.

It would be in the public’s best interest for the SAB to have the opportunity to review these EPA regulatory actions, so that there is at least some public record of scientific input and peer review feeding into a rulemaking process that has entirely lacked both at the EPA under the Trump administration. Administrator Pruitt should listen to his advisors on these issues and charge them with immediate review of these potentially destructive policies. Otherwise, the message he’ll be sending is that he can’t handle the truth: the best available science will not support his deregulatory agenda.

 

Who’s Interested in the Trump Coal Bailout?

Old coal plant. Photo: San Juan Citizens Alliance/EcoFlight

Changing technology – from low-cost wind and solar, to pollution controls added to coal plants, to fracking for natural gas – has created a new debate about where we should get our electricity. The debate reached new levels with an order today from the White House to protect coal and nuclear plants from competition, even where the plants voluntarily agreed to participate in competitive markets.

This isn’t the first time the debate over grid reliability has included the Department of Defense (DOD), and despite what the White House says, the regional grid operators are still maintaining reliability.

The DOD is the largest single buyer of energy on the planet. It knows a bit about fuel security, and it spends a lot of money on day-to-day energy needs. Yet in today’s announcements and justifications claiming that national defense depends on old coal and nuclear plants, there is no recognition of what the DOD has been saying and doing.

Grid support from offshore wind is real in New England. Photo: M. Jacobs

James Mattis, Trump’s Secretary of Defense, is credited with the energy strategy for war fighters to “unleash us from the tether of fuel.” In this regard, there is an ongoing effort to build renewable energy and energy-storage facilities on US military bases. Because almost every power outage in the US is caused by problems with the wires, the DOD is interested in power supplies that are located on base and not dependent on fuel deliveries. Wind and solar have been welcomed by the DOD.

The DOD has also warned against “not appropriately valuing the continuing contribution of renewable resources in meeting requirements.” This was in the context of a debate at PJM (the grid operator for 13 states in the Mid-Atlantic and Ohio Valley) over the need for more fuel assurance in winter weather.

Meanwhile, the operators of US grids are not interested in the Trump Administration’s interference with energy markets. Comments filed at FERC by the federally-regulated independent grid system operators were uniformly hostile when Secretary Perry ordered a rulemaking last year to support plants with a 90-day fuel supply. This is true for grid operators with few or no coal plants running on their systems. It is also true of PJM, which put out a statement that it is uninterested in today’s recycling of the Administration’s coal plant proposal.

Earlier today PJM, the grid operator with one of the least-supportive environments for wind and solar, said that it is doing just fine without the current proposal:

“Our analysis of the recently announced planned deactivations of certain nuclear plants has determined that there is no immediate threat to system reliability,” the operator stated. “Markets have helped to establish a reliable grid with historically low prices. Any federal intervention in the market to order customers to buy electricity from specific power plants would be damaging to the markets and therefore costly to consumers.”

(Credit: Julia Pyper and Jeff St. John at Greentech Media.)

 

Given that the organizations most relevant to this issue do not see it as helping, the question has to be, “Why are we going through this again?” There isn’t even a claim of some long-term benefit to justify raising costs for all consumers, trashing markets and the decisions made to invest in them, and polluting our air, land, and water with coal ash, waste heat, and carbon emissions. This proposal does not make America safer, does not advance the technologies we need, and does not provide a path to cost reductions, a cleaner environment, or even a better-supplied military.

Once we get the Administration’s ideas for implementation (yes, they’re not ready with that), we can compare them to last year’s proposed rule, which would have had disastrous impacts on human health, cost consumers billions, and hurt competitive energy markets – all while doing nothing to improve, and possibly even impeding, grid reliability and resiliency.

Stay tuned.

The difference between 4,645 and 64 deceased in the aftermath of Hurricane María is… science

Over the last few decades, we have seen the Puerto Rican populace’s vulnerability to extreme weather hazards increase as the built environment and social services infrastructure decay, Puerto Ricans and their families flee at an increasing tempo to the United States, and hurricanes in the Caribbean grow in frequency and intensity. Growing up in Puerto Rico, I lived through one hurricane (Hugo, 1989) and a few tropical storms, but nothing compared in ferocity and devastation to Hurricanes Irma and María.

Given the destruction and flooding from these two hurricanes, the ineptitude of the Puerto Rican government in handling the situation, and the unwillingness of the Trump administration to adequately assist the citizens of its territory, it was hard to believe that the death count had reached only 64 fatalities, as Puerto Rican Governor Ricardo Rosselló’s Department of Public Safety claims. In fact, Puerto Rican society widely mocked the government’s numbers and suspected either incompetence or a deliberate undercount to minimize the magnitude of the human toll.

A new study offers evidence of what Puerto Rican communities suspected. Independent researchers at Harvard University estimated that at least 4,645 people lost their lives in the aftermath of Hurricane María, with one-third of those deaths attributed to interrupted or delayed health care. The researchers used household surveys to calculate an all-cause mortality rate after the hurricane, and compared this rate to official 2016 (i.e., pre-María) mortality rates to estimate excess deaths from the hurricane. That number is supported by the researchers’ methods and data, but it is also symbolic, as it is the central estimate of a range spanning 793 to 8,498 deaths. For perspective, and to underscore that Puerto Ricans experienced a real catastrophe (contra Trump’s false assertions, which I debunked here), 4,645 is more than the number killed in the terrible calamities of 9/11 and Hurricane Katrina.
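To make the study’s method concrete, here is a minimal sketch of the excess-mortality arithmetic in Python. The input rates, population, and period are rough, illustrative approximations rather than exact published values, and the real analysis involves survey weighting and bias adjustments that this sketch omits.

# Minimal sketch of survey-based excess-death estimation: excess deaths =
# (post-disaster mortality rate - baseline rate) x population, scaled to
# the observation period. Inputs below are illustrative approximations.

def excess_deaths(post_rate, baseline_rate, population, period_days):
    """Rates are annualized deaths per 1,000 people; returns excess deaths
    over the observation period."""
    excess_rate = (post_rate - baseline_rate) / 1000.0   # per person per year
    return excess_rate * population * (period_days / 365.0)

post_maria_rate = 14.3     # annualized rate from household survey (approximate)
baseline_2016_rate = 8.8   # official pre-Maria annualized rate (approximate)
population = 3_300_000     # rough population of Puerto Rico
days = 102                 # Sept 20 - Dec 31, 2017

print(round(excess_deaths(post_maria_rate, baseline_2016_rate, population, days)))
# -> roughly 5,000; the study's published central estimate (4,645) also
#    reflects adjustments (e.g., for households that could not report
#    their own deaths) that this sketch leaves out.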

Behind the statistical estimate of 4,645 are real people

Who were these thousands of people who lost their lives after the hurricane? We will likely never know for sure, but among them was Gaspar Cruz Agosto, a 73-year-old Puerto Rican man who was scheduled for surgery before the hurricane but could not be operated on because the hospital lost power after María. Mr. Cruz Agosto died two weeks after the hurricane because the hospital could not provide him with the critical care he needed. This sad case does not appear to be isolated, as Puerto Rico’s independent Center for Investigative Journalism (CPI in Spanish) estimates that 60 percent of María-related fatalities occurred in health care or retirement home facilities.

The difference between the official count and the estimate is vast—the estimate is, in fact, more than 70 times the official figure. What can be the cause of the enormous discrepancy? Well, it’s clear that the answer is…science. The large undercount appears to be due to established protocols that require a medical doctor to annotate a death certificate linking the clinical cause of death to the disaster event. As CPI explains, the attending physician in these cases is seldom the physician certifying the death of a patient. This means there was typically no contextual information included in a death certificate—information like lack of electricity, transportation services, or medicines; interrupted health care; dietary changes; temperature increases; or stress caused by the disaster. If we add to that the chaotic conditions after the hurricane and the lack of communication among public health agencies, hospitals, and funeral homes, it becomes clear that obtaining an accurate count of fatalities was a very difficult task.

But the disaster conditions and the inadequacy of death certificate protocols in Puerto Rico do not excuse the Rosselló administration’s attempt to discourage at least two prominent Democratic senators from asking the Department of Homeland Security to ensure an accurate count of all storm-related deaths. The Puerto Rican government’s lobbyist who called Democratic congressional offices suggested that focusing on the death toll would negatively impact the image of Governor Rosselló, showing more concern for public relations fallout than for the well-being of our people. Didn’t we just see a similar disregard for human health and concern about a “public relations nightmare” in the Trump administration’s blocking of a study on hazardous chemicals on military bases in the U.S.?

Arguably, the lack of attention and resources given to Puerto Rico by the Trump administration also had a role in increasing the death count, as the President’s disparaging and dismissive tweets about Puerto Ricans and the disaster likely sent the message to all levels of the federal government that neither he nor his agencies should be very concerned about the plight of Puerto Ricans.

The public has a right to know the facts about natural disasters and their aftermath, and neither the Rosselló nor the Trump administration has been honest with us about this. There is no way to overstate the severe public health crisis still unfolding in Puerto Rico nearly nine months after María. As we have seen in the botched attempts at restoring the electrical grid in anticipation of the next hurricane season (just a few days away from starting!), neither social nor economic justice has been prioritized. What is being prioritized by the Puerto Rican government are juicy contracts for unqualified (but well-connected to the Trump administration) contractors and for government agency executives tasked with dismantling public schools, the social safety net, and labor protections. What is being prioritized is violent police repression to silence civil resistance to austerity measures by tear-gassing children and other non-violent demonstrators.

Latinas lead the way towards a recovery in Puerto Rico

But there is hope. Multiple grassroots and national advocacy organizations are leading the way toward an equitable recovery for Puerto Rico—and Latinas are at the forefront. I recently had a chance to see their work in action at a summit of Latina and Latino environmental professionals. The compañeras at the Fundación Fondo Acceso a la Justicia are providing legal assistance to appeal denied FEMA aid requests—a complicated and very cumbersome process. Local Sierra Club activists in Puerto Rico are providing solar panels and helping build the skills of local community leaders who can create strong and resilient neighborhoods for when the next hurricane hits. Latinas with Oxfam America have helped convene grassroots groups in Puerto Rico with FEMA officials so that the federal agency can better understand the language and cultural barriers that prevent people from accessing aid. And Latina scientists from CienciaPR and other scientific organizations are convening a workshop in Puerto Rico this fall to educate Puerto Rican and Puerto Rico-focused scientists on how to engage in the pressing science-policy debates and decision making that are vital to safeguarding our health, environment, and democracy.

We need to address climate change with the tools and knowledge produced by science. We need to do so with special attention to the most vulnerable populations, be they in the Caribbean, the Gulf Coast, or in inland cities or rural areas. If we do not, we will see more of these deadly impacts as climate change continues to fuel more intense and destructive hurricane seasons.

What the Failed House Farm Bill Got Wrong About SNAP and Work

The House of Representatives voted down a farm bill last Friday. It was a bill that lived and died by its insistence on subjecting participants in the Supplemental Nutrition Assistance Program (SNAP, or food stamps) to a slew of unnecessary and misguided work requirements. Had it passed and been signed into law, the bill would have effectively reduced or eliminated benefits for millions of people. And though it promised to channel the resulting “savings” into state-administered job training programs, this proposal, too, was deeply flawed and betrayed serious misperceptions about the populations that participate in SNAP.

While this version of the farm bill failed (good riddance), chances are we haven’t seen the last of proposals to achieve so-called welfare reform through farm policy. So this seems like a good time to assess what the House bill’s sponsors got wrong about the people in their own districts who rely on SNAP—and what Congress should do to achieve a farm bill that really works for these communities.

Who are SNAP participants?

We’ve shown that rural communities rely on SNAP at higher rates than their urban counterparts and derive substantial economic benefit from the program, but they are often overlooked in federal policy discussions about nutrition assistance programs—allowing policymakers who represent these communities to repeatedly make decisions that aren’t in their best interest. (See: Kentucky Representative Hal Rogers.) Amid the calls for work requirements that promote “self-sufficiency” and discourage a “lifestyle of dependency,” it is particularly important that we continue to push for policies grounded not in ideology but in evidence. To this end, we used the publicly available 2016 USDA Quality Control data and a linked geographic indicator to gain a better understanding of SNAP use in urban (metropolitan) and rural (nonmetropolitan) areas nationwide.

In many ways, the demographics of SNAP use in urban and rural areas differ little, and they support what we already know about the populations that use the program. Among all SNAP participants nationwide, about 40 percent are children, while roughly 10 percent are elderly. Meanwhile, able-bodied adults without dependents, or ABAWDs—the population at the center of the highly contested work requirements—make up only eight percent of all SNAP participants. (You read that correctly. Eight percent.) And among that population, research indicates that one-quarter work while receiving SNAP, and about three-quarters work during the year before or after receiving benefits. This means the House leadership willingly, even enthusiastically, jeopardized its chance at passing a farm bill in order to target fewer than eight percent of SNAP participants—the vast majority of whom are actively participating in the labor force.

In rural communities, more SNAP participants live with disabilities

But the USDA dataset does point to one key difference between urban and rural SNAP participants: a greater percentage of those in rural areas are living with disabilities.

In rural areas, SNAP users with disabilities make up 11.3 percent of participants, compared to 9.5 percent in urban areas. Though the difference may seem slight, think of it this way: if the proportion of SNAP users with disabilities in urban areas matched that of rural areas, it would mean an additional 70,000 participants. The disparity shouldn’t come as a surprise, given that rural rates of disability are themselves higher; according to the Centers for Disease Control and Prevention, residents of rural areas tend to be older, poorer, and sicker than their urban counterparts.

Work requirements won’t end persistent poverty—least of all with untested and underfunded job training programs

All of this should trigger some realizations for those on the House and Senate agriculture committees who will draft and campaign for subsequent versions of the farm bill. First, additional work requirements would apply to a small fraction of their constituencies while delivering a particularly devastating blow to the participants they do touch. Second, the same set of social, economic, and demographic factors that contribute to higher disability rates are likely among the factors that continue to drive unemployment and underemployment in rural areas.

Broadly speaking, these are but a few of the symptoms of the same disease: persistent poverty. And the bottom line is that no amount of job training will counteract a lack of well-paying jobs, least of all while we’re punching holes in the federal safety net.

The consolation prize of last week’s failed farm bill was supposed to be the promise of delivering people from poverty by providing Employment and Training (E&T) opportunities for “anyone who wants one.” This proposal, too, was deeply flawed: it lacked empirical evidence showing that E&T models would be effective and scalable, and grossly underfunded the program, offering what would be equivalent to just $30 per month per eligible SNAP participant. Even if a lack of employable skills were the primary factor driving SNAP use—and it’s not, by a long shot—the House bill’s job training solutions would still be feckless and paper-thin.

What should Congress do instead?

Addressing some of the major root causes of poverty and food insecurity is no easy task—particularly given the wide variation among rural communities across the country. But using the farm bill to make investments in local economies would be a good place to start. The House bill failed to invest in a number of proposed policies and programs with demonstrated success in supporting farmers, bolstering local and regional food systems, and making nutritious foods more affordable and readily available to communities. These programs include the Farmers Market and Local Food Promotion Program and the Value-Added Producer Grant program, both of which are contained in the bipartisan Local FARMS Act; the Beginning Farmer and Rancher Development Program; the Food Insecurity Nutrition Incentive Program; and the Healthy Food Financing Initiative. These programs have received broad support from families, farmers, and food producers around the country who know their communities best and see a better way forward.

The Senate will release its own draft farm bill in the coming weeks, and the House is expected to hold another vote in late June—meaning there are plenty of opportunities to tell your elected officials what you want to see (and definitely don’t want to see) in the next farm bill. Visit our website for all things farm bill, including policy updates, easy ways to reach out to your Senators and Representatives, and helpful talking points around SNAP and the Local FARMS Act. Let’s keep working toward a food system we can be proud of.

New House Bill Cuts Critical Climate Research. The Senate Could Stop It

We are keeping close track of the National Oceanic and Atmospheric Administration (NOAA) budget for fiscal year 2019 because President Trump’s budget proposal, released in February, put much of NOAA’s life-saving research on the chopping block. The U.S. House Commerce, Justice, and Science Appropriations subcommittee recently passed a bill with numbers that we can compare to the president’s proposal (Figure 1)—and not in a good way.

President FY 2019 Federal Budget Request compared with House Subcommittee Appropriations Bill

Figure 1. Comparison of funding levels in the president’s proposal and the recent U.S. House subcommittee bill for FY 2019 with those enacted in FY 2017. FY 2018 was funded with a continuing resolution that was largely similar to FY 2017. Figure based on data provided by the NOAA 2019 budget summary and the U.S. House of Representatives subcommittee report.

House slashes satellite program budget

The House subcommittee bill makes drastic cuts to the National Environmental Satellite, Data, and Information Service (NESDIS) budget – far deeper than the president’s request and nearly 89 percent lower than FY 2017. This is no time to reduce funding for NESDIS, as Americans rely on its data and instrumentation to help understand and prepare for extreme precipitation and wind, both of which are becoming more intense due to climate change. For example, the GOES-R team (a NESDIS research program) is currently addressing issues in the cooling system of the Advanced Baseline Imager instrument, which was recently launched into space on the GOES-17 (also called GOES-West) satellite. The instrument advances capabilities for detailed information on rain, clouds, and wind.

There are also five other instruments on GOES-17 that have been working in space since the March 2018 launch to help researchers and decision-makers understand global climate impacts. These include cutting-edge lightning mapping capabilities, which enable forecasters to see rapid increases in lightning that often signal a storm may become even more dangerous, identify areas prone to lightning-sparked wildfires, and issue earlier flash flood warnings, giving people more time to get to safety. Thank you, NESDIS!

GOES-17 lightning mapper May 2018

Figure 2: Image from satellite GOES-17’s lightning mapper, which detected storms with lightning passing over Kansas and other states in May 2018. Source: NOAA NESDIS

U.S. House scalpel excises climate research program budget, restores other programs

The U.S. House bill keeps in line with the president’s FY 2019 budget request by taking a 38 percent bite out of NOAA’s Climate Research Program, as compared to FY 2017. This includes zeroing out competitive research grants and a 32 percent cut to regional climate data and information. The House FY 2019 bill also, unfortunately, includes a 5 percent cut to the National Sea Grant College Program, relative to FY 2017.

On the bright side, the House bill does maintain the president’s requested 22 percent increase in funding for laboratories and cooperative institutes. Research funded by the NOAA Climate Program Office includes a study of ways to incorporate near-real-time satellite information; one outcome of this study would be earlier warnings for the southern Great Plains of the seasonal drought onset associated with the La Niña phase in the Pacific Ocean. Farmers, for example, would then have a better chance to prepare and minimize losses through improved advance warning of pending drought conditions. Other studies funded by the Climate Program Office suggest the Tennessee Valley tends to be drier during the second winter in a row with La Niña conditions. NOAA’s National Integrated Drought Information System programs also contribute funding to these cited studies.

Because constituents weighed in with their elected House members, cuts to parts of NOAA’s Office of Oceanic and Atmospheric Research budget were not as severe as in the president’s request. And while the president requested a 52 percent cut to the Ocean, Coastal, and Great Lakes Research program, the U.S. House bill instead increased its budget by 14 percent.
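For readers who want to check the math, here is a minimal sketch of how these year-over-year percentages are computed from enacted and proposed funding levels; the dollar figures in the example are placeholders, not actual appropriations lines:

```python
# Sketch of how the percent changes cited above are computed relative to
# FY 2017 enacted levels. Dollar figures below are placeholders, not
# actual appropriations lines.

def percent_change(enacted: float, proposed: float) -> float:
    """Percent change of a proposed funding level versus the enacted baseline."""
    return (proposed - enacted) / enacted * 100.0

# Example: a hypothetical $100M program cut to $62M is a 38 percent cut.
print(f"{percent_change(100.0, 62.0):+.0f}%")  # -38%
```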

Senate Markup for FY 2019 Budget

We can see there’s still work to be done. America deserves—and can have—the best science and technology to help us understand and prepare for a rapidly changing climate. Now is the time to weigh in with Senators who are grappling with FY 2019 appropriations and share your stories of the many benefits NOAA’s satellite programs and Climate Program Office bring to your region.

 


Now That Xcel Won’t Get Its Nuclear Bill, What’s Next?

Earlier this month the Xcel Nuclear Plant Costs Bill (SF3504/HF3708) passed the Senate but failed in the Minnesota House. The bill would have created a system for approving Xcel Energy’s nuclear plant repair costs that circumvented the normal Minnesota Public Utilities Commission (MN PUC) process and left ratepayers to shoulder potentially excessive costs of keeping Xcel’s nuclear plants running.

Xcel’s two nuclear power plants are a key component of its goal to reduce carbon emissions company-wide by 60 percent (below 2005 levels) by 2030. Just last week, Xcel announced it has already cut carbon emissions by 35 percent, and is on track to achieve its 2030 goal according to its newly released Corporate Responsibility Report.

Xcel is preparing to file its next resource plan with the MN PUC in 2019, and the company is currently talking to stakeholders about its vision to reduce carbon emissions 80 percent by 2030 for the region. This vision depends on Xcel continuing to run its two nuclear power plants through the end of their license periods in the early 2030s.

So now that the Xcel nuclear bill didn’t pass, what’s next, and what does this all mean for Minnesota’s clean energy future?

Trying to keep Xcel’s nuclear fleet in the black

Xcel’s nuclear fleet is struggling to stay profitable in the face of cheaper alternatives (like renewable energy and natural gas) and looming upkeep costs. Xcel estimates it will need at least $1.4 billion in repairs over the next 17 years for its Monticello and Prairie Island nuclear plants. To gain certainty that it could recover those costs from ratepayers, Xcel introduced legislation that would have allowed the company to get upfront approval from the PUC for its future nuclear expenses, instead of approval after those investments have been made (as the process works today).

This is a bad deal for ratepayers because the legislation dilutes the PUC’s authority, and attempts to bypass the PUC’s current process for reviewing costs to determine if they’re prudent. That’s why UCS opposed the bill: it was an attempt to avoid the existing regulatory review process and shift financial risk from Xcel’s shareholders to ratepayers. This is not the first legislative attempt to dilute the power of the MN PUC.

Maintaining the current process for approving costs is important

Xcel is due to file its next Integrated Resource Plan (IRP), also known as its 15-year business plan, in February 2019. The IRP process allows for a comparison of electricity options to make sure consumers are getting the most bang for their ratepayer bucks. It is where Xcel will detail how it plans to generate and supply power to its customers over the next 15 years, including any expected expenses to keep its nuclear plants up and running.

A successful IRP includes an evaluation of existing resources; a robust economic analysis of supply-side and demand-side options under a range of scenarios and assumptions, including future environmental costs and fuel prices; opportunities for stakeholder engagement; adequate reporting requirements; and a robust set of criteria on which to base approval or denial of utility plans to spend ratepayer dollars.

It’s important to keep the current process because it protects ratepayers from excessive charges. Separating out the nuclear plant upkeep costs means they are never compared against other options that could maintain a reliable and affordable energy supply at lower cost to ratepayers. The legislation would have pre-approved these costs, meaning any cost overruns due to mismanagement by Xcel would have been automatically passed on to ratepayers. To protect Minnesota consumers, it’s important to keep the robust IRP process and maintain the PUC’s authority to scrutinize Xcel’s expenditures.

With Xcel’s nukes in jeopardy, what does this mean for its carbon reduction goals?

In 2016, the MN PUC approved Xcel Energy’s 15-year resource plan, which prioritized renewable development. Xcel Energy has led the charge in the powerful wave of utility announcements to cut carbon emissions and invest in clean energy. Xcel has stated that its nuclear plants play a critical role in achieving the company’s clean energy targets, and its commitment to reduce carbon emissions.

However, the fate of Xcel’s nuclear plants is unknown. Markets are constantly evolving and it’s Xcel’s job to adapt and remain profitable under changing circumstances. With adequate time to prepare, there are a variety of carbon-free resources that can step in (solar, wind, storage) if Xcel is committed to its carbon goals and undertakes thoughtful and robust planning to make sure its investments are smart ones.

Regardless of the near term, Xcel’s nuclear plants are currently scheduled to close in the early 2030s—and we need a plan for a reliable, affordable, and low-carbon energy future for Minnesota without these nuclear plants.

MN’s clean energy future

Minnesotans want clean, affordable energy, and the PUC is critical for ensuring their interests are reflected in Xcel’s future energy investments.

Xcel Energy attempted to gain security for its shareholders ahead of its 2019 Integrated Resource Plan (IRP). Minnesota ratepayers deserve the intensive planning of the IRP process, which ensures billions of dollars of repairs are made in the most prudent and efficient manner.

We expect public engagement around the IRP to occur in the summer and fall of 2019. We look forward to working with Xcel Energy and other stakeholders to develop a comprehensive IRP that puts Xcel’s clean energy goals front and center while protecting Minnesota consumers.

Creative Commons/Mulad (Flickr)

The US Military, Resilient Energy, and the Zombie Apocalypse

Photo: Mass Communication Specialist 1st Class (SW) James Kimber, U.S. Navy

The US military and its supporters understand the importance of resilient energy. With or without zombies.

Just last month, the House of Representatives Appropriations Committee encouraged the Department of Defense (DoD) to “prioritize funding for energy-related projects, including renewable energy projects, to mitigate risk to mission-critical assets and promote energy security and efficiency at military installations” in the report accompanying the 2019 Military Construction and Veterans Affairs bill. The committee highlighted how renewable energy and smart technology investments can “shield mission-critical operations from disruptions to the power grid.”

Because, as a recent analysis from my colleagues at the Union of Concerned Scientists showed, US military bases aren’t just on the front lines of homeland defense. They’re also, in a lot of cases, on the front lines of climate impacts. Rising seas and storm surges don’t stop for checkpoints and can threaten the energy supplies that military missions depend on.

And, though climate security is a significant concern to the DoD, it turns out that there are lots of reasons why the military is a big fan of “resilient energy” based on advanced and renewable energy options like solar, wind, microgrids, and energy storage.

Wilson Rickerson (left) and Michael Wu (right)

To understand how the pieces all fit together for our armed forces, I checked in with Wilson Rickerson and Michael Wu, cofounders of Converge Strategies, LLC, a resilience and advanced energy consulting company, about the intersection between the US military, clean energy, and resilience. And about how the Zombie Apocalypse comes into play.

John Rogers: So, Wilson, what’s the issue—why are we talking about clean/resilient energy in the context of defense and preparedness?

Wilson Rickerson: Energy is critical to every component of military operations and capability. It’s the fuel that powers our aircraft, vehicles, and ships. The natural gas that heats nearly 300,000 buildings. The electricity that keeps our forces globally networked.

In recent decades, we’ve developed numerous capabilities that rely on uninterrupted access to electricity, in particular. For example, the US Department of Defense (DoD) operates the Global Positioning System relied on by more than three billion users worldwide. A network of satellite ground stations and other control facilities ensures that GPS is always available, and each of these has critical energy requirements.

DoD is almost completely reliant on the civilian electric grid to meet these requirements. That fact has driven investments in energy resilience projects and technologies on DoD bases.

JR: How big is the scope—the reach—of DoD energy activities, including the renewable energy pieces?

WR: The DoD is the largest institutional consumer of energy in the world. In 2016, the DoD’s energy bill was $12.4 billion, comprising 57% of the energy spending of the entire federal government.

As of 2016, the DoD has more than 1,600 active renewable energy projects spread across more than 500 installations. Most of these are small-scale projects that don’t provide a resilience benefit, but DoD is increasingly focused on integrating clean energy and energy storage into resilient energy systems.
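As a rough cross-check of those spending figures, the implied total federal energy bill follows from one line of arithmetic (my own back-of-the-envelope sketch, not an official statistic):

```python
# If DoD's $12.4B energy bill is 57% of all federal energy spending,
# the implied federal total is about $21.8B.
dod_energy_bill_b = 12.4  # $ billions (2016, from the interview above)
dod_share = 0.57          # DoD share of federal energy spending

print(f"Implied federal energy total: ~${dod_energy_bill_b / dod_share:.1f}B")
```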

JR: Mike, how long are the gaps the military needs to prepare for?

Michael Wu: DoD can handle short-term outages relatively well. There are proven systems and solutions in place that address the more common three- to five-day outages. But as the threats to infrastructure become larger (stronger storms and natural disasters, for example), it’s important to prepare for longer-term outages.

Current approaches to energy resilience rely heavily on diesel generators, which typically have limited fuel storage onsite. As we have seen in recent large storms, the diesel fuel supply chain can be severely disrupted during outages. So bases need to become more self-sufficient.

Experts on grid security believe that our electric grid faces unprecedented threats of long-term disruption. More extreme weather events, cyber and physical attacks, and other “black sky” threats imperil the critical infrastructure our military relies on to remain operationally capable, and that our society relies on to function.

DoD is investing in several efforts to strengthen its ability to maintain critical functions during disruptions. For example, the US Army issued a policy last year requiring all installations to keep adequate energy storage to maintain critical operations for 14 days. And the US Air Force has a goal that all mission-critical functions will have assured access to a reliable supply of energy at all times within the next 20 years.
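To get a feel for what a 14-day requirement implies, here is a back-of-the-envelope sizing sketch; the critical load and usable-storage fraction are invented assumptions, and a real installation would also count on on-site generation rather than batteries alone:

```python
# Back-of-the-envelope sizing for "14 days of critical operations" from
# storage alone. The load and usable fraction are assumed values; real
# installations would also rely on on-site generation.
CRITICAL_LOAD_KW = 500   # assumed average critical load of an installation
DAYS_REQUIRED = 14       # per the Army policy cited above
USABLE_FRACTION = 0.9    # assumed usable share of nameplate storage

energy_needed_kwh = CRITICAL_LOAD_KW * 24 * DAYS_REQUIRED
nameplate_kwh = energy_needed_kwh / USABLE_FRACTION
print(f"Energy required: {energy_needed_kwh:,.0f} kWh")
print(f"Nameplate storage: {nameplate_kwh:,.0f} kWh")
```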

JR: Can you give some examples of where renewables and microgrids have been brought into the picture?

MW: Sure. The Navy partnered with Georgia Power to build a 30 MW solar array at Naval Submarine Base Kings Bay, which became operational in 2016. Under the agreement, the Navy granted Georgia Power the land to construct the solar array and receives the legal and technical right to the power during a grid outage.

Otis Air National Guard Base in Massachusetts is creating an advanced microgrid powered by renewable energy that will power critical military intelligence facilities. The project will integrate wind power, an advanced battery storage system, and microgrid controls, and is the result of a unique partnership between the DoD and the Commonwealth of Massachusetts, which provided substantial grant funding.

Kings Bay solar (Photo: NAVFAC/Flickr)

Marine Corps Air Station Miramar broke ground late last year on a new microgrid project that incorporates natural gas generated by a nearby landfill, and several solar photovoltaic projects. Through this project, MCAS Miramar can power its entire flight operations, even if the local grid is disrupted, and the project will lower the installation’s electricity bills and generate revenue by providing services to the civilian electric grid.

The Hawaiian Electric Company and the US Army are currently constructing a 50 MW power plant at Schofield Barracks on the Island of Oahu. The power plant will be capable of running on biofuels or conventional fuels, and will complement the increasing levels of solar and wind power on Oahu’s electric grid. The Schofield plant will power the civilian electric grid during normal operations, but will be capable of consolidating to power only nearby Army “assets” during emergency events.

Photo: U.S. Air Force photo/Scott Dehainaut

JR: That’s quite a mix of states. So people shouldn’t think of these resilient power efforts as a red state or blue state thing?

WR: Definitely not. DoD and the Military Services are investing in energy resilience because it is critical to military operations, not for political purposes. DoD’s mission is to provide the military forces needed to deter war and to protect the security of our country, and its investments and priorities should be viewed in that context.

The military also spends more than $400 billion in payroll and contracts across all 50 states, with military spending accounting for more than 5% of state GDP in some states (e.g., Alabama, Alaska, Virginia).

Red and blue states both have a strong economic interest in supporting the success of in-state military installations—and today a major focus for the military is advanced energy resilience.

JR: How do civilians benefit from the military work in this area?

MW: There’s a long heritage of military-civilian technology crossover. GPS, microwaves, and the internet are just a few examples of technologies developed for military use that have changed our entire society.

Energy can be a similar success story. Everyone has critical energy requirements—hospitals, fire and police stations, schools, and businesses. The technologies, planning approaches, and business models that the military is investing in can also be applied to meet those critical requirements in the commercial and civilian sectors, and vice versa.

There are also significant opportunities for civilian governments to partner with military installations on joint and mutually reinforcing resilience planning. Military base resilience depends heavily on the resilience of surrounding communities and infrastructure, but there isn’t yet a standard playbook on how to align military and civilian resilience efforts.

JR: How does climate change factor into the equation? Or: How does our military see it?

James Mattis (Source: DOD)

WR: DoD has recognized climate change is a national security threat multiplier and an accelerant of instability around the globe. Secretary of Defense James Mattis has called climate change a “challenge that requires a broader, whole-of-government response.” He joined a long list of defense, intelligence, and national security leaders that acknowledge the unprecedented international and homeland security risks of the changing climate.

DoD faces challenges to its infrastructure from more frequent and severe storms and sea level rise, while resource scarcity and humanitarian crises will destabilize the global security environment.

However, it’s important to note that DoD’s investments in clean energy are not primarily motivated by reducing greenhouse gas emissions—they are to strengthen military and operational capability.

Why zombies matter

JR: So, bring this home for our audience, and tackle the question on everybody’s mind: Are we ready for the Zombie Apocalypse?

WR: We spend a lot of time thinking about existential threats to the power system: things like cyberattack, electromagnetic pulse weapons, and massive earthquakes. Although they each pose different types of risks that require different types of hardening, there are some common-sense, no-regrets things we can be doing across the energy industry that are hazard neutral.

In order to provoke “hazard-neutral” thinking—thinking that seeks to identify “no regrets” strategies that address multiple hazards—we sometimes find ourselves posing the scenario of a hypothetical zombie apocalypse. This is actually something the Pentagon has done in the past as well.

There is an interesting ranking of states’ ability to survive a zombie apocalypse by Estately.com. It takes into consideration factors such as population density, percentage of gun ownership, etc. Arrestingly, some of the states with the largest concentrations of critical military infrastructure (e.g., Virginia) are also some of the most vulnerable to zombie attack.

In a number of zombie movies, survivors attempt to run to the military for safety. In reality, many installations will not currently be up and running during a large-scale outage—whether triggered by zombies or otherwise.

Our goal is to make sure that our critical national security missions can still be completed even on really bad days.

JR: Got it. So with or without zombies, resilient power is something that our military is serious about. Thanks, Wilson and Mike.

WR: Happy to help.

Thanks to my colleague Paula Garcia for help with this interview.

Hurricane Season 2018 Begins: Will it be Different From Last Year’s?

Three hurricanes forming in the Atlantic in 2017. Photo: NASA Earth Observatory

As we brace for the start of yet another hurricane season on June 1st, I can’t help but compare last year’s hurricane season outlook from the National Weather Service with the one for this year. The first thing that strikes me is that, even though they look very similar and the number of predicted hurricanes is the same, the probability of an above-normal season is 10 percentage points lower this year than last.

The 2017 outlook predicted a 35 percent chance of a near-normal season, while this year’s is 40 percent. Lower chance of above-normal and higher chance of near-normal? I think we can all say YEAH to that!

I mentioned last year that since the mid-1970s the number of hurricanes reaching categories 4 and 5 in strength has roughly doubled. It may be unsurprising then that 2017 set some records. With 17 named storms, including 10 hurricanes and six major hurricanes (those reaching categories 3-5), 2017 was the most active season since 2005 and the seventh most active since record-keeping began in 1851. And September 2017 was the most active month on record for Atlantic hurricanes. Hurricanes Harvey, Irma, and Maria are among the five costliest storms of all time. The season was terribly destructive and definitely not “normal.”

We know that it only takes one hurricane making landfall to make a season destructive, and we should always be prepared. But there is nothing wrong with looking at this year’s prediction and hoping for a better season!

Sea surface temperatures are near normal – a good thing

Current North Atlantic sea surface temperature (SST) anomaly pattern looks like opposite of May SST patterns associated with active Atlantic #hurricane seasons historically. pic.twitter.com/1eLt541Q9y

— Philip Klotzbach (@philklotzbach) May 29, 2018

What a difference a year makes! According to NOAA, this year’s slightly less severe outlook is mainly due to temperatures in the tropical Atlantic Ocean and Caribbean Sea. Last year in early May, sea surface temperatures in that area were running consistently above average. But not this year – temperatures are running below average. In addition, there is the possibility for a weak El Niño to develop, and this phenomenon tends to suppress hurricane activity in the Atlantic.

Why does that matter? While many factors affect hurricane formation, the main driver is the sea surface temperature. It is a known fact that hurricanes “feed” on warm waters. Surface ocean temperatures above about 79°F (26°C) are one of the key factors that strengthen hurricane development when overall conditions – high humidity, warm and moist air above the ocean, and relatively constant winds at different altitudes – are conducive for their formation and growth. So yes, lower sea surface temperatures bode well for a potentially weaker season.
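To make that recipe concrete, here is a toy screening function, not a forecast model; only the roughly 79°F (26°C) sea surface temperature threshold comes from the paragraph above, and the humidity and wind-shear cutoffs are invented placeholders:

```python
# Toy illustration of the rule-of-thumb conditions described above.
# Only the ~26°C SST threshold comes from the text; the other cutoffs
# are invented placeholders, not meteorological standards.

def conducive_to_hurricanes(sst_c: float, mid_level_rh_pct: float,
                            wind_shear_kt: float) -> bool:
    """Rough screen for an environment favorable to hurricane development."""
    warm_enough = sst_c >= 26.0            # ~79°F threshold cited above
    moist_enough = mid_level_rh_pct >= 50  # placeholder humidity cutoff
    low_shear = wind_shear_kt <= 20        # placeholder for steady winds aloft
    return warm_enough and moist_enough and low_shear

print(conducive_to_hurricanes(27.5, 65, 10))  # True: warm, moist, low shear
print(conducive_to_hurricanes(25.0, 65, 10))  # False: water too cool
```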

The climate change connection

Although we are not off the hook for a bad season, we hope that this year will be nothing like 2017, and certainly wish that Puerto Rico and other locations hit hard by last year’s hurricanes are spared this year.

We know that global warming is not making things any easier when it comes to hurricanes. As mentioned before, warm waters are the fuel for hurricanes, and have the potential to increase hurricane power. But it is not just the wind and strength: the amount of rain that hurricanes bring can also be increased by global warming. The potential moisture content (water vapor) of the atmosphere increases by roughly 7 percent with every 1°C (1.8°F) of warming, and we are currently 2.0°F (1.1°C) warmer than we were in the late 1800s. That means more water can fall when it rains. It is therefore not surprising that we are seeing not only more powerful hurricanes, but ones bringing more rain too.
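A minimal sketch of that moisture scaling, using Bolton’s widely cited 1980 approximation for saturation vapor pressure (my choice of formula; the post itself does not specify one):

```python
import math

# Saturation vapor pressure over water (hPa) via Bolton's 1980
# approximation; temperature in degrees Celsius.
def saturation_vapor_pressure_hpa(temp_c: float) -> float:
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

# Compare a baseline temperature with one 1.1°C warmer, matching the
# warming since the late 1800s cited above.
base_c, warmer_c = 25.0, 26.1
ratio = saturation_vapor_pressure_hpa(warmer_c) / saturation_vapor_pressure_hpa(base_c)
print(f"~{ratio - 1:.0%} more potential water vapor")  # roughly 7%
```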

Hurricane Harvey is a prime example: it set rainfall records in the Houston region, and that was not by chance. A study published a few months after Harvey hit found that human-caused global warming made the record rainfall roughly three times more likely and 15 percent more intense. Another study found that higher ocean heat content and sea surface temperatures make hurricanes such as Harvey “more intense, bigger, and longer lasting and greatly increase their flooding rains.” As the authors put it, “Harvey could not have produced so much rain without human‐induced climate change.”

Water, water everywhere

I cannot close without mentioning another type of event that often happens during the same season as hurricanes, and that is extreme precipitation – heavy rainfall unrelated to hurricanes or tropical storms – that can bring intense flooding. The increase in moisture content in a warmer atmosphere and the potential for more intense rain events applies whether there is a hurricane or not.

The historic town of Ellicott City, Maryland, was devastated by flooding brought by extreme precipitation just this past weekend. This is the second time in less than two years. We know there are several factors affecting flooding risk in addition to the rain amount itself, such as the configuration of the storm and patterns of land use, but the fact remains that extreme rain events are on the rise. This is something we must come to terms with – and plan for.

Devastating floods in both rural and urban areas are nothing new, but the human and economic toll is increasing as more floodplain and coastal areas are developed. We must act at the federal, state, and local level to rally the resources, policies, and coordination needed to respond adequately to the magnitude and severity of floods not only today, but into the future.

In addition, we must support and act on policies and measures at all levels – federal, state, and local – to reduce coal, oil, and natural gas emissions that cause global warming. Our actions in the next few decades will be crucial in determining how much more weather patterns will change and how destructive they may become.


EPA Science Advisory Board’s First 2018 Meeting: What to Expect

Photo: Tony Webster/CC BY-SA 2.0 (Flickr)

This Thursday and Friday, the EPA’s independent advisory body, the Science Advisory Board (SAB), will be meeting in person for the first time since Administrator Scott Pruitt announced his sweeping advisory committee directive last fall. I, for one, am thrilled that the EPA’s scientific sounding board is active and meeting in person at a time when the agency can use all the scientific counsel it can get. However, it is important to understand that since Administrator Pruitt joined the agency, the context for science advice at the EPA has greatly changed.

Several important things have happened since the last time the SAB met:

  • Administrator Pruitt’s directive banning EPA-grant-funded scientists from serving on the agency’s advisory committees meant that six committee members were dismissed for that reason, six others were not renewed for a second term (which had been common practice), and the 17 new members joining the SAB include individuals who have questioned mainstream science, are funded by industry, or have actively opposed the very mission of the EPA.
  • Former chair Peter Thorne’s term ended, and he has been replaced by Michael Honeycutt, the head toxicologist of the Texas Commission on Environmental Quality, who has actively sought weaker standards for a variety of environmental contaminants in his state and has even claimed that air pollution can make you live longer.
  • Administrator Pruitt has not answered the SAB’s September letter asking him to join the SAB during a meeting.
  • The EPA issued a proposed rule in April, “Strengthening Transparency in Regulatory Science,” that would effectively restrict the agency’s ability to use the best available science as it designs critical environmental and health protections. This will not only affect the science used to support EPA’s safeguards, but will limit the way in which the SAB will be able to review the scientific basis of those rules.

Typically these in-person meetings provide the committee members a time to discuss ongoing projects, charges from the administrator, or additional issues they might want to raise as the agency’s peer-review mechanism. The agenda for this meeting includes time to discuss the recommendations of a workgroup that was tasked with looking at the Spring and Fall 2017 regulatory agendas and figuring out what EPA regulatory actions merit review from the SAB on their scientific or technical merits. It turns out that since the agency has attempted to roll back several agency policies that would require scientific grounding (including the Clean Power Plan and the Glider Vehicle Rule), the SAB wants to weigh in. During the meeting, the full committee will have a chance to figure out what to cover and how they will do this.

It is imperative that the SAB strongly urge the administrator not to move forward with the agency’s restricted science proposal or its deregulatory measures until the SAB has had ample time to review those actions and the science supporting them and to provide objective advice on next steps. I will be asking this of the committee during my comment tomorrow. You can read my full written statement here.

The SAB is an invaluable advisory body that should be actively working to ensure that EPA’s science is unassailable. And thanks to the transparency measures of the Federal Advisory Committee Act, the public will be able to hold the SAB to its charge and its conflict-of-interest policies to guarantee that its science advice is pure and untainted by political or ideological motivations, so that the EPA has the best available scientific information as a baseline for its decisions. Pruitt isn’t legally obligated to follow every piece of SAB advice, but we’ll know when he fails to—and you can bet that we’ll be demanding justification when his actions are in direct opposition to his agency’s mission of protecting the environment and public health.

 

Weathering the Storm: Building Community Resilience in Environmental Justice Communities

Art by Micah Bazant

In 2015, It Takes Roots convened a delegation of climate justice leaders to participate in mobilizations at the COP21 in Paris and proclaimed “It Takes Roots to Weather the Storm.” When I first heard this statement, I was struck by the vivid imagery it evoked. I envisioned a tree with roots that, despite a powerful rainstorm, swirled, connected, and clenched with fortitude into the depths of its rich soil. I imagined branches growing and the emergence of leaves bearing fresh fruit.

I see these roots as representing the cooperative networks, social fabric, and human relationships that ground us firmly in the soil of our diverse communities. In the face of climate change, how do our community roots support neighborhoods — not only to withstand immediate disruption, but to thrive, sustain our cultures, and provide for future generations?

As a grassroots, environmental justice organization, the Asian Pacific Environmental Network (APEN) is addressing climate change through base building, civic engagement, and policy advocacy. The communities we organize, low-income Asian American immigrant and refugee communities in California, are uniquely vulnerable to the impacts of climate change. Therefore, our approach to resilience bridges mitigation and adaptation, with the aim of simultaneously addressing the risks from climate change alongside the inequalities embedded in our current systems that marginalize low-income communities of color.

APEN members and organizers in the East Bay

Emergency response must reach communities in their language

Since the 1980s, Richmond has been a home to many Southeast Asian refugees who were uprooted from their homelands by the Vietnam War. Our members live on the fence line of the Chevron Refinery and suffer from contaminated air, soil, and water due to their close proximity to industrial sites and toxic hazards. A major chemical explosion in March 1991 at the Chevron Refinery revealed Contra Costa County’s inadequate emergency response system, as monolingual residents were poorly informed of emergency safety procedures. In response to this, the Laotian Organizing Project launched and won a historic campaign that pushed the health department to implement a multilingual emergency phone-alert system.

This campaign is a lesson about the importance of accessible and targeted early warning systems to alert residents of predicted extreme weather events. This is particularly important for immigrant and refugee communities with limited English proficiency as well as communities living in proximity to industrial facilities, where coastal flooding and other climate disasters could exacerbate toxic releases and air pollution.

Housing justice is climate justice

In addition to organizing in Richmond, APEN works with low-income Chinese immigrants in Oakland. Oakland’s Chinatown, like many immigrant communities, is a historic neighborhood offering essential services like health clinics, schools, and grocery stores in culturally and linguistically relevant ways. These institutions not only preserve Chinese traditions and practices, but keep immigrant families deeply rooted in a thriving, culturally rich community.

The growing crisis of housing unaffordability and homelessness is closely connected to climate vulnerability. Rising housing costs and displacement threaten to tear apart the social fabric of communities like Chinatown, making it more difficult to ensure that our communities have accessible emergency resources like health care, evacuation shelters, and transportation during a climate disaster. For this reason, our climate justice activism centers strategies like renter protection ordinances and anti-displacement provisions in statewide policies.

Community microgrids promote energy democracy

Low-income communities have a higher energy burden, and thus are more vulnerable to fluctuating energy prices and increased energy needs due to climate change. Power outages can leave the lights out when electricity is needed most, particularly for those who rely on medical equipment and families with young children. In light of these impacts, we are pushing to prioritize critical facilities that serve our communities for emerging clean energy technologies like energy efficiency, solar, and storage.

Recently, APEN proposed a community microgrid project in Chinatown to strengthen a local school and health clinic’s ability to serve as emergency support facilities and offer services to the linguistically isolated families in the community. The accompanying economic savings and community ownership from these investments can root community organizations and institutions that contribute to the social fabric of the neighborhood.

In his encyclical on the environment, Pope Francis notes that “We are not faced with two separate crises, one environmental and the other social, but rather one complex crisis which is both social and environmental.” In order to address this intersectional crisis, then, scientists must acknowledge the underlying social inequities faced by disadvantaged communities and approach climate solutions through a lens of community development, public health, and social justice. As part of the UCS Science Network Mentor Program, I am working on a project that analyzes climate vulnerability tools that integrate climate impacts and socioeconomic factors. Leading with values like trust, empowerment, and cooperation, researchers can equitably partner with grassroots advocates to advance our knowledge about community resilience. Centering these principles in our collective work will support meaningful policy and pave the way towards deeper systemic change.

 

Amee Raval is a Policy and Research Associate at the Asian Pacific Environmental Network (APEN), an environmental justice organization that empowers Asian American immigrant and refugee communities across California through grassroots organizing, civic engagement, and policy advocacy. Through her role at APEN, she offers an environmental justice and health equity lens to climate and energy policy in California. She previously worked with the Natural Resources Defense Council on research and advocacy focused on the environmental and occupational health impacts of extreme heat and rising temperatures due to climate change on vulnerable communities. Amee has an MS in Environmental Health Sciences from UC Berkeley School of Public Health. @APEN4EJ

 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

 

The ABCs of Sidelining Science by the Trump Administration

Photo: KE4/CC BY-NC-SA 2.0 (Flickr)

As the school year comes to a close, I took a look at what lessons the Trump administration has taught us about science. It’s a harsh lesson for our children and families, adding up to harms that will touch all of our lives. As someone who is immersed in watchdogging this administration, I was surprised how many new things I learned about how the Administration is dismantling public health and safety protections, increasing security threats, and attempting to undermine the role of government in serving all the people.

Here are the ABCs of Sidelining Science:

A

is for ATSDR, the Agency for Toxic Substances and Disease Registry, whose report on water pollution was suppressed. Or the AGRICULTURE DEPARTMENT, where Secretary Perdue has betrayed the trust of farmers and the public. Or ATTACKS ON SCIENCE, which occur at an alarming rate all across the Administration.

B

is for BERYLLIUM, which puts workers at risk—but the Administration has halted regulations to protect them. Or Dr. Nancy BECK, an appointee to lead efforts on regulating toxic chemicals despite massive conflicts of interest. Or  David BERNHARDT, who similarly has conflicts of interest in his job at Interior.  Or BLACK LUNG disease, which has afflicted miners for decades—but protections from coal dust were rolled back by the Administration.

C

is for the Clean Air Science Advisory Committee, CASAC, now led by Dr. Tony Cox, a consultant to polluting industry for many years. Or CHLORPYRIFOS, a pesticide that impacts brain development in children, impacts endangered species, and is widely used on fruits and vegetables. EPA Administrator Pruitt rejected the science on this pesticide and refuses to restrict its use.  Or CONFLICTS OF INTEREST, which are rife in this administration, from the President to agency heads and political staff. Instead of draining the swamp, the President has brought conflicted lobbyists and business people into the agencies directly. Or CHEMICAL DISASTERS, which unfortunately are still an all too regular occurrence. But the EPA, in a recent proposed rule, wants to roll back all preventive measures at chemical facilities in order to reduce costs for industry, while ignoring impacts on the public. And of course, CLIMATE CHANGE, which is roundly ignored by administration officials and actions including communications to the public.

D

is for DATA SUPPRESSION, as webpages have been deleted or studies and grants cancelled. Or DOURSON, the nominee to serve at a senior position in the EPA who represented industry and dismissed the impact of toxics. His nomination was defeated due to widespread outrage. Or the DISINFORMATION PLAYBOOK, a set of tactics that some bad actors starting with the tobacco industry have developed, which now seem to be part and parcel of the Administration’s strategy to sideline science.

E

is for ENVIRONMENTAL JUSTICE, the disproportionate public health and environmental impacts on low income communities and communities of color. The Trump Administration has sidelined environmental justice efforts as well as science. Or ESA, the Endangered Species Act, which Congress and the Administration are trying to weaken, particularly on public lands, to make room for oil and gas development. Or the ENVIRONMENTAL PROTECTION Agency, which is mired in the scandals of Scott Pruitt and at the center of efforts to roll back public health protections. The agency’s science budget was on the chopping block by the President but was saved by Congress once the impact of the cuts became clearer.

F

is for Freedom Of Information Act (FOIA), which has revealed that the EPA and other agencies are only listening to industry at the expense of the public. Or Fuel Efficiency as the EPA seeks to roll back standards for cars that were previously agreed to by the automakers and have enormous benefits. Or FEDERAL SCIENTISTS, vilified or ignored by this administration, but whose work does so much good for the country!

G

is for GUIDANCE MEMO, which is the way that agencies tell the public how they interpret their congressional mandates. The Justice Department has declared they won’t use this guidance in enforcing the rules, causing great confusion and opening up a huge gap for those with deep pockets in industry to avoid compliance. Or GUN VIOLENCE RESEARCH, which was banned from study at the CDC until recently even though guns are the cause of more deaths than auto accidents in recent years.

H

is for Hazardous Air Pollutants, which are likely to dramatically increase under new legal guidance put in place by the EPA, which was prepared by a former industry lobbyist now leading the Office of Air, William Wehrum.

I

is for INDEPENDENT SCIENCE, as the EPA has tried to pack its board of external science advisors with those with ties to industry and exclude academics. Or INTERIOR Department, where senior career staff such as Dr. Joel Clement were arbitrarily and capriciously reassigned to inappropriate jobs. Or IMMIGRATION policy of this Administration, which of course impacts science and scientists among all the other reasons it harms our country. Or IRAN and the dangerous act of withdrawing from the agreement to prevent the production of nuclear weapons.

K

is for North KOREA, which our Global Security Program tracks so carefully to debunk the myths. And L is for Public LANDS that the Interior Department is rapidly opening to private oil and gas development.

M

is for MACT, the Maximum Achievable Control Technology standards for cancer-causing pollutants that political appointees at the EPA say impose too high a cost on the industries that release these chemicals into our air.

N

is for National Ambient Air Quality Standards, NAAQS, which are supposed to be set on public health criteria only—but the EPA is seeking to add economic and social factors to these limits. Or the need for federal agency professionals to make a NOTE FOR THE RECORD because some agencies are refusing to keep a written record of meetings and decision-making to avoid public scrutiny.

O

is for Ozone, a major public health pollutant that the EPA is re-evaluating and may increase allowable levels. Or the Office of Information and Regulatory Affairs, OIRA, in the White House, which seems to be abdicating its responsibility to ensure regulations are based on a fair process with a clear analysis. Or the Office of Government Ethics, OGE, which should be addressing the conflicts of interest that are rife across this government from the White House on down.

P

is for PFAS or PFOA, chemicals that have been used in many products such as Teflon and in fire suppressing foam used by the military, which now contaminate water supplies across the country and particularly on military bases affecting military families. Or PUBLIC HEALTH, the primary mission of the EPA, which is too often ignored in this Administration.

Q

is for a Question on citizenship that the Justice Department wants to add to the Census even though there is substantial evidence it will result in misestimating our population, with wide-ranging effects on the distribution of government spending as well as elections.

R

is for RESTRICTED SCIENCE that can be used in making public health and safety regulations as proposed by the EPA. Or REGULATORY ROLLBACK, which is a hallmark of the Trump Administration but harms the health and safety of us all. Or the EPA’s RISK MANAGEMENT PLAN for chemical facilities that Pruitt has proposed should not contain any measures to prevent accidents.

S

is for the EPA’s Science Advisory Board, SAB, which now is excluding academic scientists with grants in favor of scientists employed by industry. Or SUPERFUND, which is intended to clean up toxic waste sites but seems to be dragging even though it is supposedly an EPA priority. Or SEA LEVEL RISE, one of the most immediate and costly impacts of our changing climate—even while the President denies climate change is occurring.

T

is for the Toxic Substances Control Act, TSCA, now being implemented by a former chemical industry lobbyist. Or TIP RULES from the Department of Labor, which ignored the analysis that showed workers would be hurt if employers were allowed to control the distribution of tip revenue.

U

is for the URBAN ECONOMY, which will be badly impacted by the proposed changes to the Supplemental Nutrition Assistance Program (SNAP) in the Farm Bill if it passes.

V

is for VOTING RIGHTS, under attack in this Administration in a way that threatens our democracy. Or VITTER, a particularly egregious judicial nominee who seems to deny fundamental scientific evidence on a regular basis.

W

is for WORD BAN at the Centers for Disease Control to remove words that are politically unpopular in some circles. Or WORKER SAFETY, which should be continually improving but has been undermined by the Trump Administration.

X

is for XENOPHOBIC divisions in our society, stoked by the President and harmful to all of us.

Y

is for Dr. Richard YAMADA at the EPA Office of Research and Development, who is leading the charge to restrict independent science at the agency.

Z

is for Department of Interior Secretary ZINKE, who is undermining the historic mission of the agency by shrinking National Monuments, opening public lands, and sidelining science and scientists across the Department.

It’s time to stand up for science and people

Now you know your ABCs. My colleague Shreya Durvasula called this “the world’s saddest children’s book.” My goal is not to depress you, but to remind everyone what’s at stake and why we need to fight back against these harmful actions.

For each of these issues (and unfortunately many more), the Union of Concerned Scientists has sought to explain why these actions matter, and how we can come together to fight back so we, once again, can work toward having a government that is by and for the people. So whether you are most concerned with Global Security or Climate Change or Public Health or all of the above, it is time to do something! Go to the UCS Action Center and see how you can get involved.

Former ExxonMobil Engineer Says Oil and Gas Companies Can and Should Plan to Be Part of the Solution to Climate Change

Former ExxonMobil executive Bill Hafker

At first glance, I don’t have a lot in common with William (Bill) Hafker. He is an environmental engineer who spent 36 years working for ExxonMobil. I am not an engineer, and I’ve spent nearly 30 years holding companies like ExxonMobil accountable to consumers, shareholders, and the public at large. So I was intrigued when I learned that Bill believes that oil and gas companies should be increasingly active and transparent in identifying and committing to meaningful climate action, and should incorporate climate planning into their traditional business planning.

Bill shared with me his op-ed on CNBC, as well as a paper he will present at next month’s annual conference of the Air & Waste Management Association. Ahead of the ExxonMobil and Chevron annual meetings tomorrow, I had a chance to talk with him about his proposal for credible two degree Celsius (2°C) climate planning in the petroleum sector.

Here are some highlights of our conversation.

Is it fundamentally impossible for an oil and gas company to rise to the climate challenge and become a lower-carbon energy company?

The industry was primarily kerosene when it started, but then we didn’t need kerosene anymore. Thanks to the demand for gasoline and diesel to fuel cars, companies survived and prospered despite the loss of the kerosene market. And now oil companies are becoming, in large part, gas companies. And if in the future, electricity by hydro, solar or nuclear becomes the thing, these companies can do that. They’re technically savvy and financially capable of becoming fully integrated lower-carbon energy companies. What is probably impossible is for oil and gas companies to transform without the support and collaboration of investors, governments, non-governmental organizations, and other industrial sectors.

Why is it important for major oil and gas companies to make and disclose 2°C plans?

It is important that these companies prepare and annually release credible 2°C climate plans for at least two reasons. First, their operations, but more significantly, the use of their products, contribute half of global carbon emissions, and so their business decisions play a vital part in efforts to reduce greenhouse gas emissions.

Second, all stakeholders—including investors, regulators, and consumers—need such a plan in order to make informed decisions about the company’s environmental and business performance. In the United States in particular, now that the Trump Administration has expressed its intent to leave the Paris Climate Agreement, it is critical that companies, especially those that voiced their disagreement with that decision, outline how they intend to proactively help progress the agreement’s goals.

Credible 2°C climate plans that describe greenhouse gas emissions reduction targets, with specific actions and projections of results over time, deliver that information.

How would investors use credible climate plans published by oil and gas companies?

If all the fossil fuel companies produced credible climate plans containing the same type of information, then investors would be able to compare them on an apples to apples basis. Investors could analyze companies’ projections for carbon emissions, proposed actions to reduce emissions, and, importantly, the time frame over which companies propose to take those actions.

Oil and gas companies plan their business for decades into the future. A business plan could show that Company A set a goal of reducing emissions by X percent by 2025, has money in place for identified projects and new purchases of solar, and plans to get out of higher-carbon-intensity sources of fuels like oil sands. Company A should also project out to a more distant date, perhaps 20 years, so that along the way investors could assess the company’s progress against its reduction projections. Company B, on the other hand, might set a less ambitious 2025 goal or be only halfway there by then, not have identified any steps to close the gap, and keep saying the same things. Assuming both companies demonstrate solid financial performance, I would expect investors to choose Company A because it has shown it can deliver environmentally as well as financially.
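Here is a hypothetical sketch of what such an apples-to-apples screen might look like in code; the companies, fields, and the on-track rule are all invented for illustration:

```python
from dataclasses import dataclass

# Invented structure for comparing disclosed climate plans side by side.
@dataclass
class ClimatePlan:
    company: str
    target_cut_pct_2025: float  # pledged emissions cut by 2025, percent
    progress_pct: float         # cut delivered so far, percent
    funded_projects: bool       # money committed to identified projects

    def on_track(self) -> bool:
        # Crude screen: funding in place and at least half the pledge delivered.
        return self.funded_projects and self.progress_pct >= self.target_cut_pct_2025 / 2

plans = [
    ClimatePlan("Company A", target_cut_pct_2025=30, progress_pct=18, funded_projects=True),
    ClimatePlan("Company B", target_cut_pct_2025=10, progress_pct=2, funded_projects=False),
]
for plan in plans:
    print(plan.company, "on track" if plan.on_track() else "lagging")
```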

What are the key ingredients of a credible 2°C plan for an oil and gas company—and what are the biggest gaps in the 2°C reports you’ve seen so far?

The key ingredients are the same whether for an oil and gas company or any other company or entity such as a municipality, state, etc. The critical starting point is a complete and accurate inventory of operational greenhouse gas emissions and a transparent discussion of emissions from the use of sold products. Following from that should be the establishment of a greenhouse gas reduction target calibrated to quantify the company’s contribution to the Paris Climate Agreement goals. The plan must then describe specific actions to be taken to achieve the target, including projected emissions reductions those actions should deliver over time. Lastly, the plan must assess, as best it can, the financial implications of its execution in order to demonstrate to investors that the company can deliver environmentally sound performance while remaining financially viable.

The biggest gaps in the reports are the failure to set targets and specific plans for achieving them, as well as the omission of projections that demonstrate how the planned actions deliver on the goals over time. This information is needed to assess the credibility of what to date have been largely qualitative statements of intent to contribute to reductions in greenhouse gas emissions. It would also greatly facilitate comparisons between companies on their environmental and financial outlooks, allowing for more informed purchasing and investment decisions.

What does it mean to be a “fully integrated energy company”? What are some obstacles to oil and gas companies transforming themselves into fully integrated energy companies?

A fully integrated energy company is one whose energy supply chain reflects a mix of energy products, such as solar, hydro, biofuels, etc. Such a company would be well-positioned to deliver on Paris Climate Agreement-focused emissions reduction targets—not only those related to company operations and purchased power, but also those related to use of company products.

One potential obstacle to concerted action in this direction by companies and their investors is a concern that it won’t provide any real greenhouse gas reductions. The fear is that as long as there is no reduction in the demand for fossil fuels through actions by the key sectors that use them—such as transportation and power generation—someone will supply them. As long as they are being supplied, they will be used, and the resulting greenhouse gas emissions will still end up in the environment. Government engagement through incentives and/or mandates addressing the demand sectors, for instance through tighter vehicle fuel efficiency standards, will be an important factor impacting success.

Another potential obstacle is that a company that leads in shifting its business model from fossil fuels could be less profitable, at least in the short term while repositioning itself, and that investors will punish it for lower profitability, leaving it competitively disadvantaged. Somehow the investor community must share the burden of transforming our energy system, and find ways to reward companies for becoming fully integrated and doing their part to achieve the goal of keeping global temperature increase below 2°C.

What role do you think research and development can play in helping to address the climate change problem, and why do you believe that timing is important?

The demand for energy is projected to increase substantially in the years ahead as population and overall world living standards increase. In fact, the availability of affordable energy is a primary contributor to those improving living standards. The dilemma is that fossil fuels are often chosen as the traditional, easiest, or most cost-effective approach to meet these increased demands. What is needed is research focused on how to supply affordable low-carbon energy while also reducing fossil fuel demand. Cooperative research and development efforts involving energy/fuel suppliers and energy/fuel users would be an important way to engage the expertise of both actors for win-win solutions.

It is also important to ensure that research and development efforts are targeted for near-, mid-, and long-term solutions. Advances with a small impact now are as important as large breakthrough successes decades down the road. Credible research and development efforts—and credible climate plans that incorporate them—need to show a mix of near- and mid-term efforts that likely have a lower risk and lower reward, and higher-risk, higher-reward long-term efforts. It is not credible to set a target for 2035 and suggest that it will be reached solely through long-term research and development that is not anticipated to deliver a large reduction in greenhouse gas emissions until that year.

What motivated you to publish your op-ed and paper, and what are you hoping to accomplish?

From as far back as I can recall, I had a very strong passion for the environment and the need for people to protect it. As a high school freshman, I started SOAP (Students Organized Against Pollution) and never stopped feeling that protecting the environment is a vital undertaking for the well-being of humankind.

Early last year two events that I considered very important occurred: President Trump’s announcement that the United States would withdraw from the Paris Climate Agreement, and the passage of a climate risk reporting shareholder resolution at ExxonMobil with significant support from major financial institutions (after years of similar resolutions failing). I felt that it was a critical time for major U.S. corporations to live up to their claims of being good corporate citizens by proactively filling the void that the Trump administration may create, especially since several of them had voiced support for the Paris Climate Agreement.

I wrote my op-ed because it was what I felt I could do in retirement—drawing on my unique experience of working to improve environmental performance for 36 years inside the oil and gas industry—to initiate a discussion of a framework for credible climate planning among the affected stakeholders (i.e. the companies, all industries, governments and regulators, investment firms and individual shareholders, non-governmental organizations, etc.). I did not feel I’d be adding anything to the climate debate if I merely expressed an opinion on what needed to be done without offering a description of how that could be accomplished. That’s why I prepared a paper describing eleven elements of a “Framework for Credible 2°C Climate Planning”, without prescribing how exactly to operationalize each element. It is my hope that this paper can form the basis for engaging all stakeholders in a discussion of what is needed in corporate climate planning to satisfy them and lead to the climate risk reduction desired.

William Hafker

More Great News for Clean Air and Public Transit

DC Circulator bus

Transit buses are community resources. They help pedestrians get around on rainy days, hot days, and cold days. They help subway riders get home when the trains stop running late at night. They help cyclists get through parts of town that aren’t bike friendly. They help crowds of people get to sporting events. They reduce the number of cars on the road in space-limited downtowns. They provide regular transportation for people who aren’t able to afford a car, people who choose not to have a car, and people who aren’t able to drive a car.

To meet air quality and climate goals, we need widespread electrification of all types of vehicles. Buses are the people’s electric vehicle.

And we’re seeing more and more electric buses hit the road. Recent news shows how state and local governments are bringing zero-emission battery and fuel cell technology to the masses.

Major investments in California

California State Transportation Agency (CalSTA)

CalSTA recently announced awards for 285 zero-emission buses (in addition to major rail projects) that will be deployed across the state over the next five years. To my knowledge, this is the largest single investment in zero-emission buses in the United States to date. Communities from San Diego to Redding will benefit from these new buses.

A couple of noteworthy awards: funding will provide dozens of buses for new express routes along the highly congested US 101 corridor on the San Francisco Peninsula. Funding will also provide several electric coach buses operated by transit agencies in northern and southern California – yes, electric coach buses exist! These coach buses would also be great fits for companies like Google, Facebook, and Apple that provide transportation for their employees.

Funding for the 285 zero-emission buses comes from the state’s cap and trade revenue and the state’s fuel tax, the latter of which increased in 2017 by 12 cents per gallon of gasoline with the passage of Senate Bill 1.

California Legislature

The state legislature has authority to annually allocate 40 percent of cap and trade revenues that aren’t subject to continuous appropriations. The 2018 state budget provides $180 million of this cap and trade revenue for clean heavy-duty vehicle incentives, of which at least $35 million was specified for zero-emission transit buses. Budgets recently proposed for 2019 by the state Senate and Assembly indicate a similar commitment ($160 million and $150 million, respectively) for heavy-duty vehicle incentives next year.

California Air Resources Board (CARB)

CARB not only manages vehicle incentive funding allocated by the legislature, but it also directly oversees settlement money to offset the pollution from the Volkswagen #dieselgate scandal. CARB’s proposed plan for this funding could direct up to $65 million for zero-emission transit buses.

Adding up the existing zero-emission buses on the road (100+), the buses on order (340+), and the buses supported by the funding sources above, I estimate at least 1,000 zero-emission buses will be on the road in California within the next five years, roughly 10 percent of all transit buses in the state. This means that many transit agencies are getting well ahead of the milestones proposed by the California Air Resources Board for transitioning to zero-emission fleets by 2040.
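
For the curious, here is the rough arithmetic behind that estimate, sketched in Python. The funded-bus count and the statewide fleet size are my own ballpark assumptions, not official figures:

    # Rough tally behind the 1,000-bus estimate (illustrative only;
    # "funded" and "statewide_fleet" are assumptions, not official data).
    on_road = 100      # existing zero-emission buses ("100+")
    on_order = 340     # buses already on order ("340+")
    calsta = 285       # CalSTA award described above
    funded = 300       # assumed buses from cap-and-trade and VW funds

    total = on_road + on_order + calsta + funded
    statewide_fleet = 10_000   # assumed approximate size of CA transit fleet
    print(total, f"{total / statewide_fleet:.0%}")   # ~1,025 buses, ~10%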

Strong commitments from transit agencies

Leadership on zero-emission buses is also coming directly from transit agencies and people asking transit agencies to take action. I’ve highlighted the work of King County Metro (Seattle-area) and Los Angeles Metro in previous blogs, but transit agencies large and small across the country are beginning to adopt electric buses. Here are some of the most recent examples of leadership we’re seeing from transit agencies.

San Francisco Muni

Muni, the transit agency serving San Francisco, recently adopted a resolution committing to an all zero-emission bus fleet by 2035. Muni has nearly 600 diesel and diesel-hybrid buses; counting its 400 trolley buses as well, it is the second-largest bus operator in California, behind LA Metro.

Santa Monica Big Blue Bus

Another city leading the way in California is Santa Monica, which recently reaffirmed its 2016 commitment to transitioning its 200-bus fleet to zero emissions by 2030. The city was awarded funding from Senate Bill 1 and cap and trade revenue (see above) for its first 10 electric buses. Santa Monica has said that this fall’s order of natural gas buses will be its last. That is a remarkable statement, and one I expect we’ll be hearing more and more from transit agencies.

It’s not just California

Washington, DC recently announced that 14 battery electric buses have joined its fleet. And New York City’s Metropolitan Transportation Authority (MTA), which operates the largest bus fleet in the country with more than 4,000 buses, casually made a huge announcement in its new bus plan: it will transition its entire fleet to zero-emission buses. A timeline for MTA’s transition hasn’t been specified yet.

The work ahead

Commitments to fleet transitions are the first step in getting zero-emission buses on the road. Then comes laying out a plan for acquiring the new buses, becoming familiar with the technology, and ultimately integrating the vehicles in significant numbers. A lot of thought, planning, and attention to detail are needed in between.

The transition to zero-emission fleets will require problem solving and teamwork across all aspects of the transit industry including route planners, bus makers, bus purchasers, facility managers, finance departments, mechanics, state and federal grant agencies, and public officials.

As with any new technology, there is a learning curve for the industry to overcome in the early years of adoption, such as figuring out the range an electric bus will actually get on specific routes in specific weather, because it won’t be the same as the range on the window sticker. Fortunately, there isn’t any aspect of this learning curve I’ve seen that can’t be overcome.
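
To make that concrete, here is a minimal sketch of the kind of route-by-route range estimate an agency might run. The nominal range and the derating factors are assumptions for illustration, not manufacturer data:

    # Illustrative planning-range estimate for an electric bus.
    nominal_range_miles = 150          # assumed "window sticker" range

    deratings = {
        "cold-weather heating": 0.75,  # cabin heat draws on the battery
        "hilly route": 0.90,
        "stop-and-go service": 0.95,
    }

    effective = nominal_range_miles
    for reason, factor in deratings.items():
        effective *= factor            # compound each penalty

    print(f"Planning range: {effective:.0f} miles")  # ~96 of 150 nominal miles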

There are also myths that must be overcome with any new technology, such as doubts about electric buses’ ability to climb hills. The bus maker Proterra recently debunked this one by driving its bus up the canyons and mountain passes of every major ski resort near Salt Lake City. Another myth concerns operating in cold weather; debunking that one is the Worcester Regional Transit Authority in central Massachusetts, which has been operating battery electric buses since 2013.

One piece of the transition to zero-emission buses that must be in place is hydrogen fueling and electric vehicle charging infrastructure. Utility companies must be given the green light to develop this infrastructure. From everything I’ve seen, electricians are more than ready for the job opportunities to build it out.

The magnitude of change needed to improve air quality and reduce climate change can be overwhelming, but I take great comfort in the fact that we have the technology to overcome these challenges. China already has an estimated 386,000 electric buses on the road, more than five times the number of transit buses of all types in the United States. It can be done here too, and there’s no time to wait.

Photo: DC Circulator

Did My Tea Leaves Reveal the Supreme Court’s Upcoming Gerrymandering Ruling?

This morning, I stirred my green tea vigorously to see if it would reveal the Supreme Court’s opinion on two partisan gerrymandering cases that are soon to be released. The tea spilled, I scalded my lap, and I wondered why any Decent American Patriot would sip tea while the nation awaits a decision of such historic significance. I then made a cup of coffee and resolved to give up fortune telling. So I won’t try to predict where the Court will come down on the constitutionality of partisan gerrymandering. However, I will offer some guideposts to help interested parties (see what I did there) understand the significance of the decision when it comes.

1. Is there a real decision?

It is always possible that SCOTUS will decide to re-argue the cases next session if there is serious fragmentation of opinion about what constitutional principles, if any, should govern partisan gerrymandering. Of course, that did not stop the Court from issuing a fragmented opinion in Vieth v. Jubelirer, the decision that unleashed state legislatures to gerrymander without restraint in 2011. Or the justices could decide that plaintiffs in the first Wisconsin case, Gill, do not have standing because they were not harmed within a gerrymandered district. That outcome could have serious implications, and could depend on who writes the majority opinion.

2. Who writes the opinion?

While all eyes have been on Justice Kennedy as the decisive swing vote in these cases, Chief Justice Roberts is the only justice who has not yet written a majority opinion from this session, which makes it more likely that Roberts will be the author. The possibility of a Roberts opinion has led to speculation at Election Law Blog and other sites about the possibility that the Court will take a narrow, district-level approach, focusing on arguments such as those offered by Republican plaintiffs in the Maryland case, Benisek.

As Gill counsel Nick Stephanopoulos has already pointed out, this would be a misguided approach for SCOTUS to take if the goal is to conservatively reduce the number of applicable cases and thus restrain court intervention. Moreover, the logic of state-imposed harm on all voters of the targeted party is inescapable and would inevitably make its way back into legal arguments. As Justice Kennedy has acknowledged, it is the state that imposes the inequity, and the harm is state-level: it is the number of seats denied the opposition party across the statewide districting plan that causes targeted voters (those who voted for the opposition party) to suffer vote dilution.

An opinion that does the work that Kennedy and the liberals require, but is narrow enough for Roberts to be on board, will likely require more than a demonstration of intent to discriminate.  Harm will have to be demonstrated empirically, with clear evidence that the relationship between party vote and seat shares has been intentionally manipulated to punish voters who favor the opposition party.  And that takes us back to some of the fundamental scientific questions that gave rise to these cases in the first place.

3. What kind of rights are we talking about? Equal Protection? Free Speech and Association?

One of the most interesting aspects of these cases from the perspective of constitutional theory lies in the variety of ways that plaintiffs and lower courts have linked the harm of gerrymandering to constitutional protections. Traditionally, gerrymandering cases have used equal protection arguments, specifically the 14th Amendment, to protect voters from districting plans that don’t treat voters equally. More recently, Justice Kennedy specifically, and the Court more generally, have been receptive to “free speech” arguments, especially in campaign finance and other election law cases, so this has become a more popular strategy.

The basic claim behind this strategy is that a vote cast is a form of expressive association, such that diluting or suppressing the value of that act violates the 1st Amendment.  There is considerable disagreement over the extent to which such claims are still implicitly dependent on the equal protection provided by the 14th Amendment, so it is certain that the Court’s response to these claims will shape future litigation and legislation.

4. Will the Court rely on a single metric to determine harm?

Almost certainly not, but the Court could set parameters and narrow the bounds of applicable cases by emphasizing that in the two cases in question, all of the empirical measures relied on by lower courts converged. That is, in the worst cases of gerrymandering, it doesn’t matter which metric is used: measures of partisan bias, the efficiency gap, and the mean-median gap will all show that a plan gives an asymmetric advantage to the voters of one party over another.

At the same time, the majority decision, or concurring opinions, could provide more support to some metrics over others. The efficiency gap is among the newer kids on the block and should receive a good deal of attention, but the model of partisan asymmetry was developed over 20 years ago and is still dominant in the field. Of greater interest for those following election science is the degree to which the Court considers the constitutional implications of these different measures, which are significant. Specifically, as litigation and legislation move forward, such arguments will be relevant for clarifying just what the Constitution demands of our electoral systems, and how we can distinguish its bugs from its features.
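
For readers who want to see how two of these metrics work, here is a small Python sketch computed on a hypothetical five-district plan; the vote totals are invented for illustration, and the sketch assumes a two-party race with no exact ties:

    # Two common gerrymandering metrics on hypothetical district results.
    def efficiency_gap(districts):
        # (wasted A votes - wasted B votes) / total votes. A vote is
        # "wasted" if cast for the loser, or for the winner beyond the
        # 50%-plus-one needed to win. Assumes no exact ties.
        wasted_a = wasted_b = total = 0
        for a, b in districts:
            total += a + b
            needed = (a + b) // 2 + 1
            if a > b:
                wasted_a += a - needed
                wasted_b += b
            else:
                wasted_b += b - needed
                wasted_a += a
        return (wasted_a - wasted_b) / total

    def mean_median_gap(districts):
        # Mean minus median of party A's district vote shares; a large
        # gap suggests A's voters are packed into a few lopsided districts.
        shares = sorted(a / (a + b) for a, b in districts)
        n = len(shares)
        median = shares[n // 2] if n % 2 else (shares[n // 2 - 1] + shares[n // 2]) / 2
        return sum(shares) / n - median

    # Party A is packed into one district and cracked in the rest.
    plan = [(90, 10), (45, 55), (44, 56), (46, 54), (43, 57)]
    print(f"EG = {efficiency_gap(plan):.3f}, mean-median = {mean_median_gap(plan):.3f}")

On this invented plan, party A wins about 54 percent of the votes but only one of five seats, and both metrics flag the asymmetry.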

5. How much is too much?

Again, it would be surprising for the Court to establish an empirical threshold of “x percent.” Rather, a workable, manageable standard would be one that reflects what is constitutionally required while also respecting judicial restraint.

This is why Maryland seems like an especially important case: a decision overturning that state’s Democratic gerrymander (the governing party manufactured an extra seat when it was already the dominant party) would provide a rather clear guideline, a one-seat principle. That is, if it can be shown, through whatever metrics, that an adopted plan effectively and reliably denies the opposition party’s voters at least one seat (which is what would be required for vote dilution to occur), that showing would be grounds for overturning the plan.

If the Court can provide lower courts with such guidance on how much inequality is too much, that is as much as we can ask for. The current situation, in the opinion of experts and citizens alike, is clearly too much.

A Great Day for Offshore Wind: Massachusetts, Rhode Island, New Jersey All Go Big

Photo: Derrick Z. Jackson

Offshore wind power is a powerful, plentiful resource, but that doesn’t mean that it’s been a slam dunk in terms of getting it into the US electricity mix. Movement forward on offshore wind in three different states, though, made yesterday a day to celebrate.

1. Massachusetts says yes to 800 megawatts

The state we’d been watching this week was Massachusetts. Yesterday was to be the date for an announcement about which offshore wind project or projects had been selected for the first phase of a 1,600 megawatt commitment from the state based on a 2016 energy law.

And the day didn’t disappoint. While the law required at least a 400 megawatt first tranche, the state announced that an 800 megawatt proposal from Vineyard Wind was the winner of this round. The larger project likely brought with it appreciably lower pricing, and it was a pleasant surprise.

That amount of power (as our handy new offshore wind calculator shows) will generate electricity equal to the consumption of more than 400,000 typical Massachusetts households. It will also, given the electricity mix and what that offshore wind power might displace, reduce carbon emissions by the amount emitted by almost 200,000 cars.
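
As a back-of-the-envelope check, the household figure is easy to reproduce. The capacity factor and household consumption below are my own illustrative assumptions, not the calculator’s inputs:

    # Rough annual-output check for an 800 MW offshore wind farm.
    capacity_mw = 800
    capacity_factor = 0.45        # assumed for modern offshore turbines
    hours_per_year = 8760
    household_mwh_per_year = 7.2  # assumed ~600 kWh/month per household

    annual_mwh = capacity_mw * capacity_factor * hours_per_year
    households = annual_mwh / household_mwh_per_year
    print(f"{annual_mwh:,.0f} MWh/yr serves about {households:,.0f} households")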

All that requires actually getting the wind farm built and the turbines spinning. But yesterday’s step was an important one.

2. Rhode Island goes for 400 megawatts

Another pleasant surprise from yesterday was the announcement that Rhode Island had taken advantage of the same bid process and selected a 400 megawatt project of its own.

While the announcement was a surprise, Rhode Island’s commitment to offshore wind isn’t. The new project-to-be, from Rhode Island-based developer Deepwater Wind, will build on the state’s (and Deepwater’s) experience with the first-in-the-nation 30 megawatt Block Island Wind Project. And it fits within Gov. Gina Raimondo’s recent call for 1,000 megawatts of renewable energy for the Ocean State by 2020.

Rhode Island has already shown it knows how to get offshore wind done. While the next project will be in federal, not state, waters, that experience is likely to count for something in the race to get the next steel in the water.

3. New Jersey grabs a piece of the limelight

Not to be outdone, New Jersey also used yesterday to move offshore wind forward. Gov. Phil Murphy signed into law a 3,500 megawatt state goal that the legislature had recently passed. That’s the largest state commitment to date, and the latest in the crescendoing drumbeat of state action on offshore wind.

And the first tranche of Garden State action may be even larger than what Massachusetts and Rhode Island just moved forward on. Just after coming into office, Gov. Murphy ordered the state’s public utility commission to carry out a solicitation for 1,100 megawatts of offshore wind.

Offshore wind means jobs (Credit: Derrick Z. Jackson).

While megawatts may be the stuff of headlines, each of those projects and commitments is about a lot more—jobs in the near term, and air quality improvements, carbon reductions, careers, and more once the projects are up and running.

What’s next?

All that is particularly true as even more states get into the act. So where should we look next for leadership on offshore wind?

Connecticut could be poised to join its neighbors as it makes decisions about proposals for meeting its own renewable energy needs. The bids included proposals from Vineyard Wind and Deepwater Wind, plus Bay State Wind, the other entity vying for Massachusetts’ and Rhode Island’s attention.

It’s also unlikely that New York is going to stay quiet, given its new offshore wind master plan, a 96 megawatt project planned for off Long Island’s South Fork (also being developed by Deepwater), the record-breaking lease sale off New York City in late 2016, and federal moves to evaluate more potential sites in the New York Bight.

Or we could be hearing more from Maryland, with two projects making their way forward with state support. Or Virginia, with a pilot 12 megawatt project. Or Delaware, or North Carolina, or…

Lots of future to watch—and make happen—even as we celebrate the immediate past. Because, given our need for clean energy and good jobs, and given the incredible potential of offshore wind, we’ll be wanting a lot more days like yesterday.


Photo by Derrick Z. Jackson

EPA Extends Comment Deadline, Schedules Hearing on Science Proposal After Pretty Much Everyone Complains

The EPA today extended the comment deadline to August 16 on its proposal to restrict the types of science that can be used in EPA decisions after pretty much everyone—from the American Home Builders Association to the American Geophysical Union—complained that a thirty-day comment period was grossly insufficient for a rule with such potential wide-ranging consequences. The EPA also scheduled a public hearing to be held in Washington, DC on July 17.

The EPA’s proposal would prevent the EPA from using many public health studies when making decisions. Scientists now have more time to comment on the potential harm that this proposal would have on public health and the environment.

The move gives scientists the ability to develop more sophisticated comments and ensure that their peers have the opportunity to detail how the rule would impact their own public health research and its use in EPA decisions—and to submit for the record specific studies that could be set aside. It is important for scientists to explain how and why specific communities would be harmed by excluding legitimate, peer-reviewed public health research from consideration by EPA.

In just three short weeks, nearly 100,000 comments were submitted.

From the beginning of the comment period, scientific organizations repeatedly and pointedly repudiated the EPA’s claim that the new rule is consistent with scientific transparency standards. The EPA heard from both industry and the science community that the short comment period on such a vague and badly written rule was wholly inadequate and possibly even in violation of the Clean Air Act and other statutes. Now scientists will have a few more weeks to fully detail the impact that such a fatally flawed rule would have on public health and the environment.

UCS and its partners have produced a guide for scientists and organizations on filing an effective public comment on this rule, and will be encouraging people to provide testimony at the July 17 hearing.
