
The EPA Disbanded Its Office of Science Advisor. Here’s Why That Matters.


Late last week, the EPA announced its intention to get rid of its Office of the Science Advisor (OSA) and bury its functions deep within another agency office. This move will significantly diminish efforts to coordinate and standardize the way the EPA does science. It will leave the administrator with less access to scientific advice, both in normal times and in times of crisis. And it will make it easier for agency leaders to sideline and politicize science.

E&E News broke the story, and the New York Times, Bloomberg, CNN, and NPR soon followed. Then, on Monday, the lead lobbyist for Koch Industries took a job as the top political appointee inside the office that is now in charge of science advice and coordination at EPA.

What the office currently does

The National Academy of Sciences in 1992 recommended that the EPA create a science advisor office to “ensure that EPA policy decisions are informed by a clear understanding of relevant science.” From NAS:

It envisioned that the essential function of the science advisor would be to ensure that EPA policy decisions are informed by a clear understanding of relevant science. The panel recommended that the new science advisor advise the EPA administrator; implement a peer-review and quality-assurance program for all EPA science-based products; be a key player when EPA makes a policy decision, ensuring that the science and uncertainties relevant to a policy or regulatory issue are considered; play a key role in evaluating the professional activities of EPA scientists; reach out to the broader scientific community for information; and maintain an appropriate relationship with EPA’s [Science Advisory Board]. The panel suggested that the role of the new science advisor might be somewhat analogous to the role of the general counsel, who will not approve a document destined for an external audience until it is judged legally defensible.

The EPA is disbanding its Office of the Science Advisor. By any measure, this is not good news for the agency’s ability to protect public health and the environment. Cartoon: UCS/Jesse Springer

OSA staff are there to provide science advice to the administrator on the development of public protections and about how to respond to crises (including contamination from hurricanes, chemical disasters, and oil spills). They work to standardize scientific practices across different agency departments that develop and communicate science. They investigate allegations of political interference in science. They provide space to address contentious scientific issues. All of this requires them to have direct access to the administrator as well as sufficient authority and stature to influence other parts of EPA.

How it should work

The science advisor position is currently parallel to the head of the EPA Office of Research and Development (ORD). While the head of ORD has often served as science advisor, this has not always been the case. Administrator Lisa Jackson separated the two roles, which has distinct advantages, including a greater ability to guide scientific work across the agency and investigate violations of scientific integrity within ORD.

The best practice would be to make the science advisor more independent from ORD. Instead, the EPA is going in the other direction, burying the science advisor’s responsibilities deep within ORD. An ORD head is highly unlikely to allow information to surface from within the office that would put ORD’s own programs in jeopardy.

Why this change matters

The elimination of the science advisor’s office accelerates the decay of the role of science advice within the EPA administrator’s office. The administrator needs someone who will tell it to them straight—even if it’s not the information they want to hear. Unnecessary layers between the science advisor and EPA leadership can do long-term damage to the agency’s scientific capacity, but they’re especially harmful during crisis situations when the administrator needs independent scientific analysis fast.

The move further compromises the ability of EPA to standardize and improve scientific practices across the agency. EPA’s work on human subjects protection, data sharing, peer review standards, and other best scientific practices will be diminished as staff recommendations in these areas carry less weight.

Further, the agency’s scientific integrity policy (designed to prevent political pressures on scientists and scientific work) and its implementation will be compromised. It will be considerably more difficult for the scientific integrity officer to investigate allegations of political interference in science within ORD.

The EPA’s response

The EPA claims that it is simply streamlining its workforce, and that there will be no loss of access or authority since the current acting science advisor is also the head of ORD. But this is based on several faulty assumptions.

First, it suggests that the science advisor doesn’t need a staff. Second, it assumes that there will never be a need to separate the ORD head from the science advisor position. Third, it completely ignores that the OSA’s cross-organizational functions will become significantly less prominent and that its staff will be left out of important discussions because of their lower rank.

The context

Unfortunately, the current EPA leadership has consistently left agency scientific staff out of the conversation on important public health matters. For example, EPA scientists were not consulted when the agency decided not to ban the brain-damaging chemical chlorpyrifos (a decision that was later reversed in court). The proposal to restrict the types of science that EPA can use in decisionmaking was similarly hatched by political appointees without the meaningful involvement of agency scientists.

Under disgraced EPA Administrator Scott Pruitt, the ban on science advice from EPA grant recipients, the appointment of multiple industry insiders to high-level policy positions, the removal of climate change information from websites, and so much more have led to an understanding that science doesn’t count for much at the current EPA. Some were hopeful that Pruitt’s departure would bring changes.

In one of his first days on the job, Acting Administrator Andrew Wheeler met with agency staff and pledged to include agency scientists in the work of the agency. Yet this move suggests a lack of interest in unfiltered science advice and strong science standards, which can only make the agency less effective in protecting public health and the environment. The addition of a Koch Industries lobbyist as the liaison between the science office and the administrator’s office only amplifies the concern.

Lapsed Farm Bill Hurts Central Texas Farmers and Low-income Families

Photo: Sustainable Food Center

When you think of Texas, a thriving local food scene probably isn’t the first thing that comes to mind—but a visit to the SFC Farmers’ Market in downtown Austin might change that. The market draws large crowds every Saturday, and it plays a vitally important role in this city: linking small and midsize farmers across central Texas with customers—including those who shop using benefits from federal nutrition assistance programs—who are hungry for fresh produce and a sense of community.

But far from Austin, the federal law that gives markets like this one a leg up is in limbo. Congress has just allowed the last five-year farm bill to expire, having failed to pass a new one by the September 30 deadline. I spoke with Joy Casnovsky, the deputy director of the Sustainable Food Center (SFC) in Austin, to learn more about how federally funded programs in the farm bill have helped make a difference in communities throughout Texas, and what’s now at stake.

A Saturday at the SFC farmers’ market

“We get a lot of phone calls from people who want to know about the market, and we know people are coming from all over to get here,” says Casnovsky of the downtown market. “There are families, older people, younger people, people doing their exercise routine, folks with kids, folks without kids.” SFC, with a mission and history of helping the central Texas food system thrive, now supports two weekly farmers’ markets and community farm stands featuring more than 60 local farmers, artisans, and food producers.

“It’s a hub. And if we can get folks to come to the market and chat with us, it’s a great way for them to get more information about how to use their benefits.”

The “benefits” Casnovsky is referring to include nutrition assistance benefits from the Supplemental Nutrition Assistance Program (SNAP), the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC), and the Farmers Market Nutrition Program (WIC FMNP), which offers additional produce vouchers for WIC clients. (Think of it as WIC’s trusty farm-fresh sidekick.)

Photo: Sustainable Food Center

The signs all over the market send a clear message that shoppers who use these programs are welcome here. Not only can SNAP, WIC, and WIC FMNP benefits be used at the markets, but they can be doubled through SFC’s Double Dollar Program. The program itself, which shoppers can use each week, is pretty simple: show up, swap out up to $30 of your SNAP or WIC benefits for up to $60 in Double Dollars, and start spending them on fruits and vegetables from local farmers. (Followed by go home, eat up, and feel good.)
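For those who like to see the arithmetic, the matching rule fits in a few lines. This is a minimal sketch assuming the $30-per-week cap described above; the function name and structure are purely illustrative.

```python
# Toy sketch of the Double Dollars matching rule: benefits swapped at the
# market are doubled, with at most $30 of benefits eligible per week.
# (Illustrative only; the real program has its own rules and paperwork.)
def double_dollars(benefits_swapped: float, weekly_cap: float = 30.0) -> float:
    """Return the Double Dollars a shopper walks away with."""
    swapped = min(benefits_swapped, weekly_cap)  # at most $30 per week
    return 2 * swapped                           # doubled at the market

print(double_dollars(20.0))  # 40.0
print(double_dollars(50.0))  # 60.0 (capped)
```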

But what’s incredibly complicated about programs like Double Dollars? Getting Congress to pass the bill that funds them.

The real losers of the Farm Bill fight: Local farmers and families

The grant that made Double Dollars possible is part of the farm bill, an enormous piece of legislation that touches nearly everything in our food system, from farm to fork. The deadline for Congress to renew this legislation without disruption came and went on Sunday, and every additional day they delay in passing a new one is a day without funding for such grants.

Though the House and the Senate Agriculture Committees passed their respective versions in late June, the dramatic differences in the two have proven irreconcilable for the committee charged with drafting a final farm bill. Much of the tension has to do with ill-considered and onerous new work requirements in SNAP, which were proposed in the House bill despite overwhelming opposition from leading health and nutrition groups.

Some programs, like SNAP itself, will see continued funding in the absence of a new farm bill. Many smaller and related programs will not—including those providing critical support to small farmers and low-income families in communities across Texas.

SFC launched the Double Dollars program in 2012 with the support of a grant program called the Farmers Market Promotion Program (FMPP). It was the first of its kind in Texas. “When we launched this in 2012, we only did it at one market,” Casnovsky told me. “And then we expanded it to our three other markets. And then we expanded it to markets run by other associations. It went from being a pilot to being at 16 different locations.”

“Had we not gotten that grant, I can’t say how we would have launched at all. The FMPP grant gave us the catalyst we needed to start the program, and improve it and expand it over time. That initial investment helped us get additional funding from elsewhere, too.”

Similar initiatives have sprung up across Texas and all over the country in recent years. Double Up Food Bucks, a Michigan-born model that helps shoppers double the value of SNAP dollars spent at markets and grocery stores, began operating at 10 new Texas farmers markets just this spring—adding to its growing list of hundreds of markets served nationwide. Unlike Double Dollars, Double Up Food Bucks is supported by grants from the Food Insecurity Nutrition Incentive (FINI) program, a highly popular grant program introduced in the 2014 farm bill to incentivize fruit and vegetable purchases by low-income consumers. During its first five years, FINI helped more than 300,000 families put healthy food on the table, and supported more than 1,000 local farmers in the process. But this funding, too, is halted without a new farm bill—leaving many markets and the communities they support hanging in the balance.

What now?

With funding sources like FMPP, FINI, and others now stranded, we need to tell Congress that it’s past time to pass a new farm bill.

This isn’t just about a handful of programs—it’s about maintaining the progress we’ve made in building a stronger food system for all of us. Casnovsky, who is now working with the Sustainable Food Center to seek grant funding for their next local foods project, put it this way: “I see that affecting all of Texas. It may not be our organization going after the grant next time—it may be another city or another town going after it—but that affects our food system and infrastructure. We are one big network, and we’re starting to understand how our efforts can really lift all boats.”

Want your senators and representatives to get the message? We’ve made it easy. You can find phone numbers that will connect you directly to your elected officials (and smart talking points to back you up) right here on our website.


Despite Trump Roadblocks, Full Steam Ahead for Clean Energy Transition

California recently adopted legislation that puts the state on the path to source all of its electricity from solar, wind, and other clean energy technologies by 2045. Photo: Dennis Schroeder / NREL

With today’s public hearing on the EPA’s wretched and dangerous ‘plan’ for regulating power plant carbon emissions, the Trump administration is continuing its assault on clean energy, public health, and the climate. Whether withdrawing from the Paris Climate Agreement or bailing out uneconomic coal plants at the behest of his fossil fuel cronies, President Trump desperately wants to reverse progress on the transition to a low-carbon economy. But his schemes continue to flop thanks in large part to the ongoing actions of states, utilities, and corporations that are forging ahead with commitments to accelerate the adoption of wind, solar, and other clean energy technologies. And come this November, voters will have the opportunity to cast their support for clean energy as well.

Here’s a quick rundown of what’s been happening recently and what to look for come November 6.

States taking the lead

The biggest news-maker in recent weeks that exemplifies the undeniable shift toward a clean energy economy is the California legislature’s passage of SB 100. Signed into law by Governor Brown, the new law increases the state’s renewables portfolio standard (RPS) to 60% by 2030 and establishes a longer-term goal to reach 100% carbon-free electricity by 2045. California has long been a national leader in renewable energy deployment, but that’s a bold goal (Hawaii is currently the only other state with a similar ambition) that puts the world’s 5th largest economy on a path toward a fully decarbonized electricity system.

And if that wasn’t enough, at the SB100 signing ceremony, Gov. Brown unveiled an executive order that sets a goal of reaching carbon neutrality across the entire California economy—electric power, transportation, buildings, and industry—by 2045.

California isn’t the only state strengthening its clean energy commitments. Earlier this year, New Jersey and Connecticut raised their targets to 50% and 40%, respectively. And in early August, Massachusetts passed legislation that raises the state’s RPS to 35% by 2030 (up from 25%), advances energy storage, improves upon the nation’s leading energy efficiency efforts, and potentially doubles the requirement for offshore wind development to 3,200 megawatts by 2035. The first tranche of the offshore wind is already under contract, at a surprisingly low price that could accelerate the technology’s deployment even faster than many have anticipated.

Xcel Energy’s Comanche Coal Plant in Pueblo County, Colorado, will be retired and replaced with wind, solar, and energy storage.

More utilities turning away from coal

Many electric utilities across the country are also getting into the clean energy act by accelerating coal retirements and investing in renewable energy. For example, Xcel Energy recently obtained approval from the Colorado Public Utilities Commission to close 660 megawatts (MW) of coal capacity at its Comanche generating station and put in its place more than 1,800 MW of wind, solar, and energy storage by 2026. Dubbed the Colorado Clean Energy Plan, the $2.5 billion investment will result in renewable energy accounting for 55% of Xcel’s power supply and curb carbon emissions by some 60% in 2026 compared with 2005 levels. What’s even more remarkable about Xcel’s plan is that prices for the new renewable energy projects are so low that the plan will save ratepayers between $213 million and $374 million cumulatively.

The low cost of renewable energy was also a primary reason offered by Northern Indiana Public Service Company (NIPSCO) executives when they announced just last week that they will move up several coal plant retirements—amounting to 1,800 MW of capacity—and be entirely coal-free within the next decade. Consider that coal accounts for more than 70% of Indiana’s power generation mix today, and NIPSCO’s decision becomes even more jaw-dropping. In addition, while final decisions have not yet been made, NIPSCO’s own analysis suggests that the most cost-effective strategy for replacing its coal fleet is a mix of wind, solar, and battery storage.

Corporations go for 100% renewables or bust

Despite the great leadership that many states and utilities are showing, some experts argue that the rapid increase of renewable energy procurement by major corporations has been a main driver of the clean energy transition in the last couple of years. Indeed, more than 150 major global businesses have made pledges to source 100% of their power consumption from renewable energy. In just the past few weeks, apparel industry titan PVH Corporation (Calvin Klein, Tommy Hilfiger) joined the growing list, as did Facebook, which expects to achieve the goal by 2020. Rocky Mountain Institute estimates that nearly 3,900 MW of new renewable energy will be spurred by the direct corporate purchases announced so far this year.

Cast your vote to bring clean power to the people

Of course, voters have the potential to be a far more powerful driver of clean energy than any utility or corporation. And come November, voters in three states will be able to cast ballots directly in favor of greater clean energy investments. In Arizona and Nevada, there are ballot initiatives (Proposition 127 and Question 6, respectively) to increase each state’s RPS to 50% by 2030. In Washington state, voters will have their say on Initiative 1631, which would curb carbon emissions by placing a fee on the state’s largest polluters and then reinvesting the revenues in clean energy and climate resilience programs.

Voters in Arizona, Nevada, and Washington are not the only ones who can take control of their clean energy future this election. Across the country, candidates running for local, state, and congressional offices are taking a stand for clean energy and climate change solutions. Climate policy has become a hot topic in many of the 36 states with gubernatorial races, for example, with some candidates pledging to support strong renewable energy requirements and others committing to help meet US obligations under the Paris Climate Agreement.

Now is the time to educate yourself on candidate positions and cast your vote to keep clean energy momentum going strong into 2019 and beyond. To find more opportunities to get involved this upcoming election, check out sciencerising.org.

Photos: Dennis Schroeder/NREL; Xcel Energy

With Headlines Elsewhere, Administration Offers Just ONE Day For Public Input on Major Climate Rule

On Monday, the Environmental Protection Agency (EPA) will hold its only public hearing on the Affordable Clean Energy (ACE) rule, the Trump administration’s recently proposed Clean Power Plan (CPP) replacement.

The CPP was a landmark rule, setting the nation’s first-ever carbon standards for power plants. This proposed replacement would severely curtail that action, so much so that it could actually result in no emissions reductions at all.

Indeed, the ACE proposal stares the reality of climate change right in the face and presses its foot on the gas. It is a stunning abdication of agency mission, an overt rejection of underlying statute, a blatant favoring of special interests over the well-being of the public.

And Monday makes clear the agency’s contemptuous disregard for the people caught in the way, offering just one chance for the public to have their voices heard.

A single day to hear from those in panicked disbelief that their public health protector is turning its back on them right when they need it most.

A single day to hear from communities ravaged by wildfires, inundated by floodwaters, knocked down by record heat.

A single day to hear from every parent looking at their child, every child looking at their future, and all wondering what comes next.

But as much as the Trump administration’s EPA may try to quash the public voice, people will be in Chicago on Monday, and people will be speaking out. People care about their future—the question is whether the agency with a mission to protect public health and the environment cares, too.

An agency in conflict

The reason for just one public hearing is that the ACE proposal is not designed to protect people. It is about doing everything the agency can—up to and including abandoning mission and statute—to find ways to protect coal profits instead.

It is the jumbled outgrowth of an agency trying to resolve two countervailing constraints:

  1. An uncontested obligation to regulate carbon emissions from power plants, and
  2. An administration-driven push to bolster the coal sector in every way it can.

Inconveniently for the agency, coal plants continue to be the dominant source of power sector carbon dioxide emissions, even as their share of electricity generation has plummeted in favor of cleaner and cheaper renewables and natural gas.

Credit: US Energy Information Administration, Monthly Energy Review.

Also inconveniently for the agency, if it fails to issue a replacement regulation, then the CPP could enter into effect instead—a win for the public, but a problem for an administration fixated on wholesale regulatory rollback.

And thus the ACE proposal, a nonsensical plan that truly beggars belief, twisting an obligation to regulate carbon emissions into a standard that results in increased coal generation instead.

Registering public dissent

ACE erodes the safeguards erected to protect us—from laughably weak standards, to a jarring reordering of federal-state responsibilities and accountability, to a pernicious resurfacing of previously defeated regulatory work-arounds. All to the detriment of our health, our environment, our savings, our future.

On Monday, though coal bosses may be happy and polluters may feel relief, the record will reflect the public’s deafening, angry roar.

On Monday, the voices of parents, of community advocates, of scientists, of people, will be speaking up and speaking out for the better future we all deserve.

Credits: Audrey Eyring/UCS; US Energy Information Administration.

Environmental Justice Requires Electoral Reform: New Analysis

The Center for Science and Democracy has released a new analysis, Building a Healthier Democracy: The Link Between Voting Rights and Environmental Justice, which demonstrates the negative impact of restrictive election laws on voter turnout across Congressional districts (see the report and our impact maps here).

The study provides a snapshot of the conditions that need to be overcome in order to build a healthier democracy in the United States. Environmental pollution reduces voter turnout directly, as seen in the negative impact of poor air quality (as measured through the EPA’s Risk-Screening Environmental Indicators model), as well as indirectly, through the turnout dampening impact of poverty, regional job loss, and other forms of socioeconomic distress (measured using the Economic Innovation Group’s Community Distress Index).

Voting inequalities are further exacerbated by restrictive election laws, many of which have been put in place by the same organizations responsible for climate change disinformation campaigns and other pseudo-science. For example, the American Legislative Exchange Council (ALEC) was directly involved in the implementation of one of the nation’s worst Congressional gerrymanders in Michigan, and it has supported “proof of citizenship” and discriminatory photo identification requirements in many states. Discriminatory election laws further distort representation by keeping eligible voters out of the process. Our analysis shows that restrictive eligibility laws have a particularly suppressive impact on voter turnout, and that full participation of the electorate will require major electoral reforms.

Building a healthier democracy requires that we recognize the additional burden of restrictive election laws on communities already overburdened by economic and environmental distress. People cannot protect their communities when they are excluded from the process that generates electoral representation and public policy. Uncompetitive elections further reduce voter turnout, insulating political parties and candidates from public accountability. Representation in state legislatures and Congressional delegations is further distorted through racial and partisan gerrymandering that entrenches powerful interests at the expense of responsive government.

Environmental justice and voting rights communities can prioritize their common goal of empowerment by re-engineering U.S. elections through a series of evidence-based reforms. Removing the remaining barriers that keep eligible voters out of the voting process should be the first priority, as our analysis shows that the biggest institutional restraint on turnout comes from strict registration requirements. Automatic voter registration, which requires citizens to opt out rather than opt in, can also make voter registration lists more secure and keep voter records automatically up to date. In addition, pre-registration coupled with civics education, as well as early and extended voting periods, would help secure an open electoral process.

On the back end of the electoral process, where votes are converted into seats and legislative power, our analysis suggests that increasing electoral competition, and preventing state legislatures from engaging in racial and partisan gerrymandering, could significantly improve participation and the policymaking process. Currently, our single-seat electoral districts for Congress provide a big incentive to game the districting process in favor of parties over people. We show that gerrymandered states have significantly worse air quality than non-gerrymandered states. Even without gerrymandering, the geographic concentration of voters inevitably produces uncompetitive seats dominated by a single party. Moving to a more proportional allocation of legislative seats in states and in Congress would substantially improve competition over leadership and the responsiveness of the electoral system. If we cannot fix our democracy, it is unlikely that we will fix our climate.

Fortunately, change is already underway in some states. Oregon, California, and nearly a dozen other states have already implemented or adopted automatic voter registration. A surge of voter opposition to gerrymandering is coming to a head this November in states like Michigan, Missouri, and Utah, where reformers hope to join the increasing number of states that have taken districting out of the hands of state legislatures. Our analysis identifies reform movements happening across the country and provides interested citizens with important information about how to get involved. Reform starts with engagement, and that means voting this November.

Getting the Facts Right on Washington’s I-1631

This November, people in Washington have the chance to vote on I-1631, an initiative that would drive down carbon pollution and set the state on a forward course toward an economically robust and climate-resilient future.

I-1631 has impressive vision and far-reaching goals. It’s rigorous in design and thoughtfully advanced by a diverse set of stakeholders. It recognizes that fighting climate change doesn’t just mean reducing emissions; it also means investing in communities to ready them for climate impacts. The Union of Concerned Scientists proudly endorses I-1631.

But because this policy would flip the script on big polluters who have long been letting the public bear their costs, an aggressive opposition campaign has sprung up spouting desperate misdirections. Tellingly, the specter of accountability has rushed some of the world’s largest oil and gas companies into a defensive crouch, readily providing more than 99 percent of the opposition campaign’s multi-million dollar funds.

Yet while money can make noise, it can’t change the facts, and we’re here to shine a light on the truth. We’ve previously explored the specifics of the proposal; here, we debunk misleading opposition claims.

CLAIM: I-1631 will negatively impact the state’s economy.

FACT: I-1631 positions Washington to be a leader in the clean energy economy.

Climate action should not be assumed to be in conflict with economic growth. In fact, states that have enacted climate policies have outperformed others across a range of economic indicators. For example, California, which has set aggressive climate targets and keeps going back for more, has only increased its economic dominance over that same period.

I-1631 presents an opportunity for Washington to solidify its status as a leader in the burgeoning clean energy economy, incentivizing good jobs that are built to last. At the same time, it also recognizes that some jobs will be lost as the state shifts away from fossil fuels, and thus proactively plans for sustained workforce transition support.

But preparing for the future isn’t enough—climate change is already imposing real and significant costs on communities all across the state. I-1631 would confront these challenges head-on, spurring necessary investments in things like boosting wildfire resilience and forest health and preparing for sea level rise and impacts on fisheries.

I-1631 sends the right signals and spurs the right actions to set the state on an economically robust and climate-resilient forward course.

CLAIM: I-1631 excuses the state’s largest polluters.

FACT: I-1631 covers the vast majority of carbon emissions in the state.

I-1631 covers the overwhelming majority of carbon emissions in the state by targeting major polluters like oil refineries, industrial facilities, and utilities that have not yet transitioned to clean energy. The proposal does, however, include two specific exemptions on this front.

The first involves coal plants already planned for closure by 2025 and is not a choice—it’s required due to a pre-existing closure settlement agreement.

The second involves careful targeting of specific Washington industries that are energy intensive and in direct competition with businesses in places without a comparable carbon fee. These select exemptions recognize that businesses shifting out of state would not reduce emissions yet would harm the workers and communities left behind, and thus are proposed as a hedge—not a permanent pass—until a level playing field is achieved. This includes industries like pulp and paper mills, iron and steel mills, and agricultural work.

CLAIM: I-1631 lacks accountability.

FACT: I-1631 requires accountability and oversight throughout.

I-1631 was designed over the course of many years and many, many stakeholder conversations. The end result is a policy that specifically works to put people first, and keep people first. And that means building in accountability and oversight from the start.

Specifically, the initiative calls for a 15-person public oversight board that would ensure rigorous implementation of the proposal, as well as accountability and public involvement. The board would include a full-time, governor-appointed chair, as well as a mix of state agency heads, advisory panel co-chairs, and four at-large positions including a tribal representative and a representative of vulnerable populations in pollution and health action areas.

Three investment advisory panels—environmental and economic justice, clean water and healthy forests, and clean air and clean energy—would serve to inform the board on program development and implementation.

Finally, I-1631 would require the Department of Commerce to develop an “effectiveness report,” subject to board review and approval, every four years beginning in December 2022. The report, intended to inform recommendations on improved program implementation, would not only track progress in achieving carbon reduction goals, but also characterize impacts on employment and jobs as well as risks and recommendations for vulnerable populations and affected workers.

Throughout, I-1631 requires financial support for all non-state employee board and panel participants to ensure full participation and representation.

CLAIM: The costs of I-1631 will fall on the backs of workers and the poor.

FACT: I-1631 has specific provisions to facilitate workforce transition and to ensure those worst off receive the greatest support.

I-1631 recognizes that energy costs comprise a disproportionate share of bills for low-income households; therefore, the proposal specifically provides for a subset of funds to be used to ensure low-income populations are not left worse off because of the rule. This is to be achieved through investments that lower energy burdens, such as energy efficiency, weatherization, and transit projects, as well as through direct support like bill assistance.

The proposal also includes specific provisions to ensure that when the revenues get spent—whether for clean energy projects, community climate preparedness investments, or stormwater management initiatives—more than a third are to provide direct and meaningful benefits to pollution and health action areas, as well as a share formally supported by Indian tribes. This recognizes that the costs of pollution, of climate change, of transitions are not shared evenly, and that it will take concerted effort to ensure that all have access to clean air and clean water and climate-resilient communities.

But I-1631 isn’t just about protections, it’s also about opportunities—opportunities for good jobs that are built to last. The investments driven by I-1631 will power the vibrant and ever-growing clean energy economy, offering something of a “Green New Deal.” And as the state undertakes that shift away from fossil fuels, the proposal proactively supports the careful transition of workers affected along the way.

CLAIM: I-1631 is trying to trick people by calling itself a pollution “fee” instead of a “tax.”

FACT: I-1631 is specifically designed as a “fee” to ensure revenues are used for their intended purpose.

It’s no secret that I-1631 is a carbon pricing program, charging a set amount of money per set amount of emissions. This type of program is often referred to as a fee or a tax on carbon.

The critical difference lies in how revenues are used.

A central pillar of I-1631 is that collected revenues are deployed for a very specific set of initiatives. A “tax” would put revenues in a general fund, at risk of being redirected to any number of different government uses, whereas a “fee” ensures lasting control over how the revenues are spent.

Unlike the opposition, I-1631 faces the facts head-on

Here’s what we do know about I-1631:

  • It addresses a problem that people across Washington can simply no longer ignore: climate change is here, climate change is happening, and climate change demands action.

  • It faces the magnitude of the challenge head-on, charting a course toward an economically robust and climate-resilient future and then doing everything it can to catalyze the investments, the workforce, the science, and the action that will help get the state to that place.

  • It is supported by an incredibly broad and diverse coalition, with people representing the interests of labor, tribal nations, communities of color, environmental justice, health, faith, business, environment, and more all on board—and the movement is growing every day.

  • It anchors action in accountability and leads by looking out for those who for too long have been overlooked.

(As of the end of September, fossil fuel companies accounted for more than 99 percent of the opposition campaign’s donations, chipping in over $20 million to fight a fee on their pollution. Credit: Washington Public Disclosure Commission.)

Opponents of I-1631 are led by fossil fuel companies fighting to preserve a world where people pay and polluters profit.

Supporters of I-1631 are led by people fighting to realize a better and fairer future for Washington—for clean air, clean water, healthy forests, and the creation of jobs that are made to last.

Photos: John Westrock/Flickr; Washington Public Disclosure Commission.

A Stunningly Low Price for Offshore Wind: Massachusetts Moves Forward

Photo: Erika Spanger-Siegfried/UCS

When the winning bid for Massachusetts’s first request for offshore wind proposals was revealed recently, it was a whole lot lower than any of us had imagined. That matters now, and for years to come.

Pricing to win

The winning bid was from Vineyard Wind, a joint venture of Copenhagen Infrastructure Partners and Avangrid Renewables, and one of 27 bids from three bidders (the three entities holding offshore wind leases off Massachusetts). And that winning bid was really—impressively—low.

The 800 megawatt (MW) bid was broken into two 400 MW pieces, based on the construction timeline. The first tranche was priced at an initial $74 per megawatt-hour (MWh)—7.4 cents per kilowatt-hour (kWh)—and the next tranche at $65/MWh. Each contract goes out 20 years, and goes up by 2.5% per year.

Average those contracts out in nominal dollar terms, then bring the result back to present day so that we can understand it in our current reality, and, according to the independent evaluators, you end up at $64.97/MWh (2017$).
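To make that concrete, here’s a rough sketch of the levelization arithmetic. The evaluators’ exact discount rate, inflation assumption, and in-service dates aren’t public, so the 5% real discount rate, 2% inflation, and 2023/2024 start years below are assumptions; the point is the method, not an exact reproduction of the $64.97 figure.

```python
# Sketch of levelizing the Vineyard Wind contracts into 2017 dollars.
# Assumed inputs (not from the filing): 2% inflation, 5% real discount
# rate, and tranche start years of 2023 and 2024.
def levelized_real_price(start_price, start_year, years=20, escalation=0.025,
                         inflation=0.02, real_discount=0.05, base_year=2017):
    """Levelized contract price in base-year dollars per MWh."""
    pv_payments = pv_output = 0.0
    for t in range(years):
        nominal = start_price * (1 + escalation) ** t            # contract price in year t
        real = nominal / (1 + inflation) ** (start_year + t - base_year)
        weight = (1 + real_discount) ** -t                       # discount each year's output
        pv_payments += real * weight
        pv_output += weight
    return pv_payments / pv_output

# Two equal 400 MW tranches: $74/MWh starting ~2023 and $65/MWh starting ~2024.
blend = 0.5 * (levelized_real_price(74, 2023) + levelized_real_price(65, 2024))
print(f"Levelized blend: ${blend:.2f}/MWh (2017$)")  # lands in the low-to-mid $60s
```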

While expected prices for electricity from new power facilities in this region aren’t often public, that figure is lower than the headline-grabbing prices of around $80/MWh a few years ago for in-region wind and solar projects. And it’s cheaper than most other options for new electricity in New England, renewable or fossil. It looks impressive even in the context of options/prices available nationally.

And that price wasn’t just impressive; it caught us really off-guard. I had been expecting a price about twice as high, based on the prices and price trajectories in Europe, the most advanced market for offshore wind. And I wasn’t alone in being surprised: Reports on the announcement called the price “well below analyst expectations”, and quoted analysts saying it was “extremely low”, even “shocking”.

It’s a fine day when you can catch that many people by surprise in a good way.

What’s included

So what’s included in that price? In terms of inputs, some of the price-dropping pieces:

  • Investment tax credit (ITC). The federal ITC support for renewable energy projects is phasing out, but this project is aiming to move quickly enough to capture a portion of it.
  • Turbine size. Though the project developers haven’t specified what equipment they’ll be using, projects leading on price elsewhere in the world are using larger turbines and the economies of scale they bring. It’s a safe bet that this project will use turbines of at least 8 MW.
  • Project size. It’s notable that the project picked through the state bidding process is an 800 MW one, not one of the 200 or 400 MW proposals that were also on the table. That’s also important for economies of scale.

In terms of outputs, it’s important to consider that we’re not talking about just energy, but also renewable energy certificates (RECs), used for complying with renewable portfolio standards (RPSs) in Massachusetts and other states.

Not included in all the discussion of the bid is the remuneration that the Massachusetts utilities get for entering into long-term renewable energy contracts, a (somewhat controversial) recompense for what it means for them financially to be in deals like this. In this case, it’s 2.75% of payments made annually under the contract.

On the flip side, also missing from discussion of that low $64.97/MWh price is the range of pieces that Vineyard Wind committed to as part of the bid:

  • $10 million for an “Offshore Wind Industry Accelerator Fund”
  • $3 million for a “Marine Mammals and Wind Fund”
  • $2 million for offshore wind workforce development
  • $15 million ($1M/year for 15 years) for establishing a “Resiliency and Affordability Fund”, which will support solar+storage projects for resiliency and low-income ratepayer benefit in communities hosting pieces of the project

Overall, quite a package.

What it means for electricity bills

The next issue is what a bid like that means for our electricity bills. And there’s good news there, too.

In its filing to the Department of Public Utilities, the state’s Department of Energy Resources strongly encouraged approval of the deal and the related contracts, suggesting that they “provide a cost-effective source of reliable offshore wind energy for Massachusetts customers, meet the requirements of Section 83C [of the 2016 Energy Diversity Act], and are in the public interest.”

And they make a strong case, including in terms of the bill impacts from the contract price:

This total price [energy plus RECs] is materially below the levelized projected costs of buying the same amount of wholesale energy and RECs in the market, which is projected to be a total levelized price of 7.9 cents/KWh in 2017 dollars over the 20-year term of contract. Over the life of the contract, the 800 MW Vineyard Wind Project is projected to provide an average 1.4 cents/KWh of direct savings to ratepayers.

The indirect benefits, they say, include the reductions in wholesale prices that come from adding a zero-fuel resource like this onto the system, reduced RPS compliance costs (from increased REC supply), and “the benefit of price certainty through a fixed cost contract.”

And that all adds up:

Overall, the total direct and indirect benefits to Massachusetts ratepayers from the long-term contracts with Vineyard Wind are expected to be 3.5 cents/kWh, or $35.29/megawatt-hours on average over the term of the contract, with total net benefits of approximately $1.4 billion.

They project, on average, that these contracts “are expected to reduce customer’s monthly bills, all else being equal, approximately 0.1% to 1.5%.” (The spread in price impacts has to do with the customer type, the particular utility, and the fact that the wholesale electricity supply is just one piece of a customer’s electricity bill.)
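As a back-of-the-envelope illustration of how a per-kWh benefit becomes a fraction-of-a-percent change on a bill, here’s a quick sketch. The retail rate and the project’s share of a customer’s supply are assumed round numbers, not figures from the filing.

```python
# Rough sketch: translating the evaluators' ~3.5 cents/kWh benefit (per kWh
# of contract energy) into a percentage bill impact. The project share and
# retail rate below are illustrative assumptions.
benefit_per_kwh = 0.035   # $/kWh of contract energy (evaluators' estimate above)
project_share = 0.05      # assumed fraction of a customer's supply from the project
retail_rate = 0.21        # assumed all-in residential rate, $/kWh

bill_reduction = benefit_per_kwh * project_share / retail_rate
print(f"Illustrative bill reduction: {bill_reduction:.1%}")  # ~0.8%, within the 0.1-1.5% range
```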

Given that this might be the first large-scale project in the country, what’s really remarkable is the fact that it’s projected to save ratepayers money at all.

What it means for the future of offshore wind

The next step for this project is to go through the rest of its permitting process, so that all the pieces can be better understood, and concerns can be assessed/addressed. And then it would need to get built—no small task, certainly; they’re shooting for a 2019 start date for construction.

But the contract price announcement has already made a difference; as big as this project is, its potential impact is much bigger, in at least a couple of ways.

First, having that price benchmark out there is going to really excite a lot of decision makers, and the public at large (maybe even beyond those who read blogposts like this). The new reality this creates in people’s minds changes the offshore wind conversation from “maybe a bit expensive at first but worth trying” to “so worth doing.”

Second, the next project to serve Massachusetts will have to be even cheaper. The state’s 2016 Energy Diversity Act put in place a 1600 MW offshore wind requirement, and this is the first piece. It also requires that each subsequent project cost less than the previous tranche. This project has set a really high (low) bar for every one that follows.

And, if things go according to plan, there’ll be a lot more following: According to the US Department of Energy’s latest offshore wind market assessment, the “pipeline” of potential projects in this country is up to 25,464 MW off 13 states.

This latest pricing is a strong testament to the power of technology, innovation, markets, and policy to really move the needle on clean energy and economic progress. Kudos to everyone from our legislators on Beacon Hill to the entrepreneurs, investors, and visionaries who are making offshore wind happen.

Photos: Erika Spanger-Siegfried/UCS; Kim Hansen/Wikimedia Commons; Andy Dingle/Wikimedia Commons

What is the Responsible Science Policy Coalition? Here Are Some Clues

An interesting new group has popped up called the Responsible Science Policy Coalition (RSPC) that seems to have a significant interest in chemical safety policy. Are they legitimate? As Congress prepares for another hearing into the dangers of per- and polyfluoroalkyl substances, known as PFAS, it’s worth digging into who these folks might be.

PFAS have been in the news quite a bit recently because the White House was caught censoring a report from the Agency for Toxic Substances and Disease Registry (ATSDR) on the health effects of PFAS exposure. These chemicals are widely used in products ranging from nonstick cookware to water-repellant fabrics, and they are especially prevalent in the water at military bases due to their use in firefighting foam. Bipartisan outrage about the censorship was swift and sustained, and the report was released in June. A new UCS analysis shows that many military bases have potentially unsafe levels of PFAS in drinking water.

PFAS are used in firefighting foam. Some military bases have unsafe levels of PFAS in drinking water. Photo: DVIDSHUB/Flickr

In July, the Responsible Science Policy Coalition surfaced at a meeting of the Council of Western Attorneys General, where it declared itself “eager to help your state with your issues.” In its presentation to the attorneys general, the RSPC argued that there are “lots of problems with existing PFAS studies” and that these studies “don’t show the strength of association needed to support causation.”

The RSPC also submitted a comment on the ATSDR draft toxicology assessment that extensively detailed why, in their view, ATSDR’s scientific approach was sub-par.

Who is supporting the RSPC?

Where, then, did the Responsible Science Policy Coalition come from, and why does it care so much about PFAS? Here’s what we know. According to its PowerPoint presentation to the attorneys general, RSPC is a new coalition made up of 3M, Johnson Controls, and other, unnamed companies.

The inclusion of 3M is particularly notable because the company spent decades hiding the science about the dangers of PFAS. 3M used such chemicals in many widely used products, including Scotchgard and firefighting foam. According to the Intercept:

A lawsuit filed by Minnesota against 3M, the company that first developed and sold PFOS and PFOA, the two best-known PFAS compounds, has revealed that the company knew that these chemicals were accumulating in people’s blood for more than 40 years. …The company even had evidence back then of the compounds’ effects on the immune system…

The suit, which the Minnesota attorney general filed in 2010, charges that 3M polluted groundwater with PFAS compounds and “knew or should have known” that these chemicals harm human health and the environment, and “result in injury, destruction, and loss of natural resources of the State.” The complaint argues that 3M “acted with a deliberate disregard for the high risk of injury to the citizens and wildlife of Minnesota.” 3M settled the suit for $850 million in February, and the Minnesota Attorney General’s Office released a large set of documents…detailing what 3M knew about the chemicals’ harms.

The government can protect families from excessive exposure to PFAS, but only with access to independent scientific information. Photo: nicdalic/Flickr

And you thought all they made was Post-its and tape!

RSPC seems to be led by Jonathan Gledhill and James Votaw. Gledhill is the president of the Policy Navigation Group, whose list of past and present clients is dominated by industry groups. Votaw is a lawyer for Keller and Heckman; the address given for RSPC is the address of Keller and Heckman’s DC offices. Votaw has signed the three comments from RSPC on the ATSDR study (the first two were extension requests). Votaw’s practice concentrates on environmental and health and safety regulation, including chemicals and pesticides.

Keller and Heckman’s description of its chemicals practice is more circumspect, but its pesticides practice is described in part as “[helping] clients defend existing markets worldwide against governmental pressure and environmentalist activism.” The firm can help companies “defend against an EPA enforcement action” and “secure successful tolerance reassessments.”

How many groups like this are there?

A name like the Responsible Science Policy Coalition makes insinuations, of course—that everyone else is pulling numbers out of thin air and pursuing haphazard or irresponsible science policy, and that we really need some adults in the room. The same words are reused again and again in the names of these types of organizations, and “responsible” is no different. There’s the Citizens Alliance for Responsible Energy, the Coalition for Responsible Healthcare Reform, the Coalition for Responsible Regulation, and more.

Less charitably, groups like RSPC are known as front groups. Disguised by innocuous-sounding names and with a veneer of independence, they principally exist to create doubt and confusion about the state of the science to avoid regulation of the products their members create. Plenty of industries have them. The American Council on Science and Health has long conducted purportedly independent science that was in fact funded by corporate interests. The Groundwater Protection Council fights federal regulation of fracking. The Western States Petroleum Association, the top lobbyist for the oil industry in the western United States, was found in 2014 to be running at least sixteen different front groups in order to undermine forward-looking policies like California’s proposal to place transportation fuels under the state’s carbon cap.

Could the Responsible Science Policy Coalition meet its stated goal to “accelerate research and promote best practices and best available science in policy decisions”? Perhaps. But those looking to RSPC for advice should be wary of the fact that, so far, it seems to exist to encourage more relaxed regulation of PFAS chemicals, an outcome worth a lot of money to the organization’s key members.

When Will Autonomous Vehicles Be Safe Enough? An Interview with Professor Missy Cummings

Photo: Jaguar MENA

Autonomous vehicle (AV) supporters often tout safety as one of the most significant benefits of an AV-dominated transportation future. As explained in our policy brief Maximizing the Benefits of Self-Driving Vehicles:

While self-driving vehicles have the potential to reduce vehicle-related fatalities, this is not a guaranteed outcome. Vehicle computer systems must be made secure from hacking, and rigorous testing and regulatory oversight of vehicle programming are essential to ensure that self-driving vehicles protect both their occupants and those outside the vehicle.

Professor Mary “Missy” Cummings, former fighter pilot and current director of the Humans and Autonomy Lab at Duke University, is an expert on automated systems. Dr. Cummings has researched and written extensively on the interactions between humans and unmanned vehicles, regulation of AVs, and potential risks of driverless cars. I had the opportunity to speak with Dr. Cummings and ask her a few questions about current technological limitations to AV safety and how to use regulation to ensure safety for all Americans, whether they are driving, walking, or biking.

Below are some key points from the interview, as well as links to some of Dr. Cummings’ work on the topics mentioned.

Jeremy Martin (JM): Safety is one of the biggest arguments we hear for moving forward with autonomous vehicle development. The U.S. National Highway Traffic Safety Administration has tied 94% of crashes to a human choice or error, so safety seems like a good motivating factor. In reality, how far off are we really from having autonomous systems that are safer and better than human drivers? And are there specific software limitations that we need to improve before we remove humans from behind the wheel?

Dr. Mary “Missy” Cummings (MC): I think one of the fallacies in thinking about driverless cars is that, even with all of the decisions that have to be made by humans in designing software code, somehow they are going to be free from human error just because there’s not a human driving. Yes, we would all like to get the human driver out from behind the wheel, but that doesn’t completely remove humans from the equation. I have an eleven-year-old; I would like to see driverless cars in place in five years so she’s not driving. But, as an educator and as a person who works inside these systems, we’re just not there.

We are still very error prone in the development of the software. So, what I’d like to see in terms of safety is for us to develop a series of tests and certifications that make us comfortable that the cars are at least going to be somewhat safer than human drivers. If we could get a reliable 10% improvement over humans, I would be good with that. I think the real issue right now, given the nature of autonomous systems, is that we really do not know how to define safety for these vehicles yet.

JM: So you’re not optimistic about meeting your five-year target?

MC: No, but it’s not a discrete yes or no answer. The reality is that we’re going to see more and more improvement. For example, automatic emergency braking (AEB) is great, but it’s still actually a very new technology and there are still lots of issues that need to be addressed with it. AEB will get better over time. Lane detection and the car’s ability to see what’s happening and avoid accidents, as well as features like Toyota’s Guardian mode, will all get better over time.

When do I think that you will be able to use your cell phone to call a car, have it pick you up, jump in the backseat and have it take you to Vegas? We’re still a good 15-20 years from that.

JM: You mentioned that if AVs performed 10% better than human drivers, that’s a good place to start. Is that setting the bar too low? How do we set that threshold and then how do we raise the bar over time?

MC: I think we need to define that as a group of stakeholders and I actually don’t think we need a static set of standards like we’re used to.

With autonomous vehicles, it’s all software and not hardware. But we don’t certify drivers’ brains cell by cell; what we do is certify you by how you perform in an agreed-upon set of tests. We need to take that metaphor and apply it to driverless cars. We need to figure out how to do outcome-based testing that is flexible enough to adapt to new coding approaches.

So, a vision test, for example, in the early days of driverless cars should be a lot more stringent, because we have seen some deaths and we know that sensors like lidar and radar have serious limitations. But, as those get addressed, I would be open to having less stringent testing. It’s almost like graduated licensing. I think teenagers should have to go through a lot more testing than me at 50. Over time, you gain trust in a system because you see how it operates. Another issue is that cars can now do over-the-air software updates. So, do cars need to be tested when a new model comes out or when a new software upgrade comes out? I don’t claim to have all the answers, and I’ll tell you that nobody does right now.
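To illustrate that graduated-licensing idea, here’s a toy sketch of an outcome-based test whose pass threshold relaxes as observed, real-world experience accumulates. Every number and name here is hypothetical; it is not a proposed standard.

```python
# Toy sketch of outcome-based, graduated certification: judge performance
# on an agreed test, with a stricter bar for platforms with less observed
# real-world experience. All thresholds are hypothetical.
def vision_test_passes(detection_rate: float, fleet_miles: float) -> bool:
    """Pass/fail based on measured performance, not code inspection."""
    if fleet_miles < 1e6:     # new platform: strictest bar
        threshold = 0.999
    elif fleet_miles < 1e8:   # some track record: slightly relaxed
        threshold = 0.997
    else:                     # long, observed service history
        threshold = 0.995
    return detection_rate >= threshold

# A new model (or an over-the-air software update) could re-trigger the test:
print(vision_test_passes(detection_rate=0.998, fleet_miles=5e5))  # False
print(vision_test_passes(detection_rate=0.998, fleet_miles=2e7))  # True
```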

JM: One safety concern that emerges in discussions around AVs is cybersecurity. What are the cybersecurity threats we should be worried about?

MC: There are two cybersecurity threats that I’m concerned about. One is active hacking: somebody hacks into your system and takes it over or degrades it in some way. The other concern is that in the last year, there’s been a lot of research showing how the convolutional neural nets that power the vision systems for these cars can be passively hacked. By that I mean you don’t mess with the car’s system itself; you mess with the environment. You can read more about this, but, for example, you can modify a stop sign in a very small way and trick an algorithm into seeing a 45 mile per hour speed limit sign instead of a stop sign. That is a whole new threat to cybersecurity that is emerging in research settings and that, to my knowledge, no one in the companies is addressing. This is why, even though I’m not usually a huge fan of regulations, in this particular case I do think we need stronger regulatory action to make sure that we, both as a society and as an industry, are addressing what we know are going to be problems.
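
The kind of “passive hack” Cummings describes is known in the research literature as an adversarial example. As a rough, illustrative sketch only, and not the specific attacks referenced above, here is one standard technique, the fast gradient sign method, assuming a differentiable PyTorch image classifier:

```python
# Minimal sketch of the fast gradient sign method (FGSM): nudge each
# pixel slightly in the direction that most increases the classifier's
# loss, so a visually near-identical image gets mislabeled.
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, true_label, epsilon=0.01):
    """Return an adversarially perturbed copy of `image` (pixel values in [0, 1])."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), true_label)
    loss.backward()
    # Step each pixel by +/- epsilon along the sign of the loss gradient.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()
```

Physical-world variants of this idea, like the subtly modified stop sign Cummings mentions, apply comparable perturbations as printed patches or stickers that survive real-world lighting and viewing angles.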

JM: We hear a lot about level 3 and 4 automation, where a human backup driver needs to be alert and ready to take over for the car in certain situations, and after that fatal accident in Arizona we know what the consequences can be if a backup driver gets bored or distracted. What kinds of solutions are there for keeping drivers off their phones in AVs? Or are we just going to be in a lot of trouble until we get to level 5 automation and we no longer need backup drivers?

MC: I wrote a paper on boredom and autonomous systems, and I’ve come to the conclusion that it’s pretty hopeless. I say that because humans are just wired for activity in the brain. So, if we’re bored or we don’t perceive that there’s enough going on in our world, we will make ourselves busy. That’s why cellphones are so bad in cars, because they provide the stimulation that your brain desires. But even if I were to take the phones away from people, what you’ll see is that humans are terrible at vigilance. It’s almost painful for us to sit and wait for something bad to happen in the absence of any other stimuli. Almost every driver has had a case where they’ve been so wrapped up in their thoughts that they’ve missed an exit, for example. Perception is really linked to what you’re doing inside your head, so just because your eyes are on the road doesn’t mean you’re going to see everything that’s in front of you.

JM: What’s our best solution moving forward when it comes to safety regulations for autonomous vehicles? Is it just a matter of updating the standards that we currently have for human-driven vehicles or do we need a whole new regulatory framework?

MC: What we need is an entirely new regulatory framework, where an agency like NHTSA would oversee the proceedings. They would bring together stakeholders like all the manufacturers of the cars, the tier-one suppliers, the people who are doing the coding, as well as a smattering of academics who are in touch with the latest and greatest in related technologies such as machine learning and computer vision. But we don’t just need something new for driverless cars; we also need it for drones and even medical technology. I wrote a paper about moving forward in society with autonomous systems that have on-board reasoning. How are we going to think about certifying them in general?

The real issue here, not just with driverless cars, is that we have an administration that doesn’t like regulation, so we’re forced to work within the framework that we’ve got. Right now, NHTSA does have the authority to mandate testing and other interventions, but they’re not doing it. They don’t have anyone on staff who would understand how to set this up. There’s just a real lack of qualified artificial intelligence professionals working in and around the government. This is actually why I’m a big fan of public-private partnerships to bring these organizations together – let NHTSA kind of quarterback the situation but let the companies get in there with other experts and start solving some of these problems themselves.

 

Dr. Mary “Missy” Cummings is a professor in the Department of Mechanical Engineering and Materials Science at Duke University, and is the director of the Humans and Autonomy Laboratory and Duke Robotics. Her research interests include human-unmanned vehicle interaction, human-autonomous system collaboration, human-systems engineering, public policy implications of unmanned vehicles, and the ethical and social impact of technology.

Professor Cummings received her B.S. in Mathematics from the US Naval Academy in 1988, her M.S. in Space Systems Engineering from the Naval Postgraduate School in 1994, and her Ph.D. in Systems Engineering from the University of Virginia in 2004. She served as a naval officer and military pilot from 1988 to 1999, and was one of the Navy’s first female fighter pilots.

Photo: Jaguar MENA

PFAS Contamination at Military Sites Reveals a Need for Urgent Science-based Protections

A new UCS factsheet released today looks at PFAS contamination at military bases, revealing that many of the sites have these chemicals in their drinking water or groundwater at potentially unsafe levels. PFAS, or per- and polyfluoroalkyl substances, have been used in everything from Teflon pans to nonstick food packaging to water-repellent raingear for decades. Only recently has it been revealed to the general public that these compounds are seeping into our waterways and causing health problems in people who are exposed to them at elevated levels over time.

The contamination of drinking water and groundwater at military bases continues to be a problem because the firefighting foam used in training exercises and in operations contains PFAS. Living with this additional risk is a burden that servicemembers and their families should not have to bear. This is not just a story about a chemical class that is largely unregulated; it is a story about the people who are dealing with the ramifications of its widespread contamination every single day.

What we found

A draft toxicology report, released by ATSDR after emails obtained by UCS revealed that the White House had been suppressing the study, suggested that risk levels for PFAS are 7 to 10 times lower than the EPA’s current standards.

The report’s findings, suggesting that PFAS are potentially more hazardous than previously known, are particularly concerning because of these compounds’ persistence in the environment and widespread prevalence.

UCS mapped PFAS contamination of groundwater and drinking water at 131 active and formerly active US military sites across 37 states. We translated the ATSDR’s risk levels for PFOA and PFOS into comparable drinking water standards in parts per trillion using EPA’s own methods, and found that all but one of these sites exceeded the more conservative of those levels (a simplified version of that comparison is sketched after the list below).

  • At 87 of the sites—roughly two-thirds—PFAS concentrations were at least 100 times higher than the ATSDR risk level.
  • At 118 of the sites—more than 90 percent—PFAS concentrations were at least 10 times higher than the ATSDR risk level.
  • Over half of the 32 sites with direct drinking water contamination had PFAS concentrations that were at least 10 times higher than the ATSDR risk level.
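
For readers who want to follow the arithmetic, here is a minimal sketch of that kind of comparison in Python. It is illustrative only: the reference level below is a placeholder, not the actual ATSDR-derived value used in the factsheet.

```python
# Minimal sketch of an exceedance comparison (illustrative only).
# REFERENCE_PPT is a hypothetical drinking-water reference level in
# parts per trillion, not the actual ATSDR-derived number.
REFERENCE_PPT = 10.0

def exceedance_ratio(measured_ppt: float) -> float:
    """How many times above the reference level a measurement falls."""
    return measured_ppt / REFERENCE_PPT

# A site measuring 1,200 ppt would be 120 times the reference level,
# putting it in the "at least 100 times higher" group above.
print(exceedance_ratio(1200.0))  # 120.0
```
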
Urgent action needed

In the ATSDR’s scientific review of 14 PFAS, the association between exposure and negative health effects is clear. While there is absolutely a need for more research into some of these associations, and a lot more data to fill the gaps on the thousands of PFAS compounds that have not yet been studied, there is a compelling case for EPA to act urgently on this class of chemicals, and there is no shortage of ways to do so, both by enacting enforceable standards and by providing support to the states that have taken the lead on this issue.

Responding to high rates of contamination, communities have been on the front lines of getting action in their states. More and more states are setting standards for PFAS in drinking water and groundwater that are more stringent than the EPA’s health advisory, and places like Washington have banned the use of PFAS in firefighting foam and food packaging. And communities are organized and poised to get the changes they want.

Congress has also been hearing from constituents and taking action. There has been an encouraging flurry of bills introduced in both the House and Senate over the past year. An amendment to the 2018 National Defense Authorization Act secured by New Hampshire’s Sen. Jeanne Shaheen has provided $20 million in funding for ATSDR to conduct a nationwide health study on PFAS. Other measures are still pending. The House recently passed an amendment to the FAA Reauthorization Act of 2018 that would allow commercial aircraft manufacturers and commercial US airports to use non-fluorinated chemicals in firefighting foam starting in two years. The PFAS Registry Act (S. 2719, H.R. 5921) would direct the U.S. Department of Veterans Affairs to establish a registry to ensure that veterans possibly exposed to PFAS via firefighting foam on military installations get information about exposure and treatment. The PFAS Accountability Act (S. 3381) in the Senate and the PFAS Federal Facility Accountability Act in the House would encourage federal agencies to establish cooperative agreements with states on removal and remedial actions to address PFAS contamination from federal facilities, including military installations. The PFAS Detection Act (S. 3382) would require USGS to develop a standard to detect and test for PFAS in water and soil near releases, determine human exposure, and report the data to federal and state agencies and relevant elected officials.

Tomorrow, a Senate Homeland Security & Government Affairs subcommittee is holding a hearing on the “federal role in the toxic PFAS chemical crisis” at which EPA and DOD representatives will be testifying. It is critical that these agencies provide members of the public with clear answers on how they will be doing their jobs to protect all of us from further PFAS contamination. Take action with us to hold these agencies accountable today.

Action Needed to Address the US Military’s PFAS Contamination

There was dead silence at a community meeting last week in Portsmouth, New Hampshire, after Nancy Eaton spoke before a panel of top federal health officials planning a study of per- and polyfluoroalkyl substance (PFAS) contamination at the former Pease Air Force Base. She described how her husband David, who was healthy all his life, died quickly in 2012 at 63 from pancreatic cancer.

David Eaton served four decades in the Air National Guard based out of Pease and saw duty in Vietnam, the Persian Gulf, and Iraq. Nancy said David drank the base’s water and the coffee brewed with the same water every day, on top of being exposed to toxic chemicals as an airplane mechanic.

“He loved every second of the 40.7 years he proudly served our country,” Eaton told the panel of experts from the Agency for Toxic Substances and Disease Registry (ATSDR), a division of the Centers for Disease Control and Prevention. Unfortunately, Eaton said, “my husband and I never had the chance to retire together, take a couple of trips nor build our retirement home.”

She said she was not alone as a premature widow, noting that several of her husband’s comrades died of cancer and others have suffered tumors of the brain, lung, mouth and breast. She concluded by saying that her husband and others served their country without asking questions, “never thinking their lives would be cut short due to carcinogens on the job. Our families deserve answers as well as preventing this from happening again.”

Eaton’s sentiments were echoed by Andrea Amico, a co-founder of Testing for Pease, a group of parents whose children drank contaminated water at the site’s daycare center, and whose demands for a thorough investigation of PFAS harms have now resulted in establishing Pease as a key location in the first nationwide federal study of those harms.

A nationwide problem

Eaton’s and Amico’s pleas for answers are part of a growing chorus across the country. The latest scientific research suggests that this group of chemicals is more harmful to human health than previously recognized. PFAS, a group of chemicals common in many household products such as non-stick cookware and water-repellent carpeting, are linked to several cancers, liver damage, thyroid disease, asthma, and reproductive and fetal health disorders. A report from the Environmental Working Group says as many as 110 million Americans may be drinking PFAS-laced water. Nowhere, however, is the problem as acute as it appears to be on US military sites, where PFAS compounds have been heavily used for training in fire suppression and the chemicals have been routinely allowed to drain into groundwater.

According to a new report and interactive map from the Union of Concerned Scientists (UCS), the levels at Pease, a Superfund site, are 43,545 times above what the ATSDR considers safe, and some 30,000 people live within three miles of the base. Some 9,500 employees work in the 250 businesses in the current trade port that was once part of the base itself. The military has shut down the worst polluted drinking water well but residents remain concerned about the groundwater contamination. An Air Force official told the Portsmouth Herald in July that it could take up to a decade to resolve the issue even with state-of-the-art water filtration.

Worse yet, Pease is just one of more than 100 military sites with similar problems. The UCS study looked at 131 military sites and found that all but one had levels in excess of what the government now considers safe. The vast majority, some 87 bases, reported PFAS concentrations more than 100 times safe levels, and 10 of those military sites had concentrations 100,000 to 1 million times higher than the government’s recommended “safe” levels.

Trump Administration tried to suppress new findings

Concern has mounted over the past year about the danger posed by the PFAS group of chemicals. In May, UCS published emails obtained under the Freedom of Information Act that indicated that the Trump administration was suppressing a study reviewing the link between PFAS and disease in humans.

In a now-infamous January 30 email discussing the decision to delay publication of the PFAS report, Office of Management and Budget Associate Director James Herz relayed the concern of an aide in the White House Office of Intergovernmental Affairs that: “The public, media, and Congressional reaction to these numbers is going to be huge. The impact to EPA and [the Defense Department] is going to be extremely painful.” Herz fretted about “the potential public relations nightmare” it would be for the report to be released.

Thanks to pressure brought after the emails were exposed, the 850-page ATSDR study on the toxicity and prevalence of 14 PFAS compounds was released in June. The study sets new recommendations for safe exposure to PFAS compounds that are 7 to 10 times lower than the levels currently recommended by the EPA. Notably, though, this group of chemicals has so far managed to escape any enforceable limits because EPA has never officially listed them on its registry of toxic chemicals.

It is nothing short of outrageous that worries about a public relations headache almost won out over the actual public health nightmare: millions of Americans face greater risks from exposure to PFAS than previously recognized. The entire class of PFAS should be immediately registered on EPA’s list of toxic pollutants, and the Defense Department should ask Congress for enough resources to clean up the contamination.

Pressure mounts for action

Andrea Amico from Pease, age 36, said she worries every day about if and when health effects might show up in her husband, who drank Pease water at work for nine years and whose first two children drank the water in day care. She said close to 100 people have contacted her, worried that their serious illnesses are related to PFAS exposure. “We’ve heard from women having problems with fertility to the point where one woman told me she worked in an office where all the women had fertility problems,” she said.

Similar concerns are being echoed all around the country. This week, Amico is scheduled to testify before a Senate subcommittee alongside Arnie Leriche, a former EPA environmental engineer who lives near the former Wurtsmith Air Force Base in northern Michigan. PFAS compounds there were recorded at 73,636 times the level considered safe by ATSDR’s new recommendations.

Asked how it feels to be advocating for answers on pollution after investigating it at EPA for nearly four decades, Leriche said he felt “a lot of disappointment that the agency hasn’t been able to conduct the mission the way it should,” both amid the Trump administration’s current attempt to gut the clean air and water rules of the 1970s and the inconsistent focus and inadequate EPA funding over the years by both Democratic and Republican administrations. “But there’s a lot of career employees at EPA, engineers, Forest Service who are sticking their neck out, not going to let this happen quietly.”

Michigan, with its heavy industrial history, has a long, troubled relationship with PFAS. Clean water advocates were staggered this summer when the MLive Media Group, which represents many newspapers in mid-sized cities in the state, discovered through a Freedom of Information Act request that the state Department of Environmental Quality sat for nearly six years on a report warning of potential widespread PFAS contamination. It is the same DEQ that sat on the Flint Water Crisis.

The report, prepared by state environmental specialist Robert Delaney, found perfluoroalkyl levels in fish “on an order of magnitude higher than anything documented in the literature to date.” With contamination evident throughout the food web, from algae and zebra mussels to mink and bald eagles, the report indicated Michigan was suffering from “widespread contamination,” with little monitoring and “an endless list of things that could and possibly should be done. However, first, those in authority have to be convinced that there is a crisis.”

One of those trying to convince the state to care about PFAS, despite its horrific neglect of Flint, is Cody Angell. He has been a lead advocate for testing and remediation of industrial PFAS in communities north of Grand Rapids. Like the residents at Pease, Angell says Michigan has many anecdotal cases of cancer and residents want answers and action to address the problem.

“Every day, more and more people are realizing that government has failed us,” Angell said. “When you hear that government is more concerned about public relations than people, that is really like saying they are knowingly poisoning people. But as long as we can keep PFAS in the news, we’ll get results at some point.”

Back at Pease, keeping PFAS in the news is precisely Amico’s goal as well. At the meeting, she read a letter from Doris Brock, widow of Kendall Brock, who served 35 years at Pease and died last year of bladder and prostate cancer at the age of 67. Doris Brock said she keeps a list of 70 members of military families she knows were hit with organ cancers, and that 40 of them are now dead.

“We don’t want just studies,” Amico said. “We want medical monitoring and physicians have to know what these chemicals are so people can be treated properly.”

Ken Lauter, 69, who worked at Pease for 24 years in military and commercial aircraft maintenance and security, told the community meeting that he too has been battling cancer. His lymphoma was discovered when he went to the doctor for what he thought was a pinched nerve that limited use of his left arm.

He said that in the 1990s, people’s suspicions abounded about the water as taped signs with skulls and crossbones appeared above water fountains. Once, when an air tanker exploded, he said, he and other responders waded knee-deep in firefighting foam, a key source of PFAS.

“We were in (chemicals) up to our neck doing our jobs every day . . . I did my job. So did these other guys and this is the price we pay. Investigate. Please check it out for these people,” Lauter said.

It’s long past time for the government to act.

Puerto Rico: Maria’s Laboratory for Scientific Collaboration

CienciaPR’s education specialist, Elvin Estrada, trains educators at the Boys and Girls Club of Puerto Rico on how to use the Foldscope, a low-cost paper microscope, as part of CienciaPR’s Science in Service of Puerto Rico initiative. Each of the 500 students participating in the project will receive the instrument free of charge to observe the biological diversity in a terrestrial ecosystem that was impacted by Hurricane Maria. Photo courtesy of Mónica Feliú-Mójer.

Reposted with permission from STEM + Culture Chronicle, a publication of SACNAS – Advancing Chicanos/Hispanics & Native Americans in Science

When Hurricane Maria hit Puerto Rico on September 20, 2017, Ubaldo Córdova-Figueroa’s primary concern was for the safety of his students and research assistants. With communications shut down, it took over a month for the professor of chemical engineering at the University of Puerto Rico–Mayagüez to contact them all. “Having no access to my students or my research-lab members was very painful because I didn’t know what was going on with them. I just wanted to know that they were fine,” he says. Everyone was okay, but many grew anxious when research was interrupted for months. Córdova-Figueroa had to reassure them that it was okay to relax and wait for things to return to normal. It was, after all, a catastrophe.

Córdova-Figueroa says many scientists are concerned about their future in research at the university, which was facing a fiscal cliff before the hurricane. “They are afraid that they may not get the support they need to recover,” he says. But consensus is building that devastation from last year’s hurricanes could change the way science is approached in Puerto Rico. The post-hurricane conditions provide a unique environment to study. There is also an opportunity to develop local, non-scientific and scientific collaborations as well as attract outside collaborators to work together across disciplines. The results could impact resiliency and innovation both locally and globally.

Local collaborations

“When you lose energy as we did after Maria, not only does your grid go down but with it goes your health system, your communication, your transportation system, your food distribution system, your education system,” says Associate Professor of Social Sciences at the Mayagüez campus Cecilio Ortíz-García. “But none of those realms, in non-emergency times, talk to each other or understand each other. It’s time to establish a platform for cross-communication.”

The University only has a few pictures of the classrooms because most places were difficult to get through and some were forbidden because of fungus contamination.

Ortíz-García is on the steering committee of the National Institute of Island Energy and Sustainability (INESI) at the University of Puerto Rico. INESI promotes interdisciplinary collaboration on energy and sustainability problems and has a network of 70 resources across the university’s 11 campuses. In the wake of Hurricane Maria, it has been able to help establish collaboration at local, community, and municipal levels as well as with some of the stakeholders, says Professor of Social Sciences at the Mayagüez campus Marla Pérez-Lugo, who is also on the steering committee.

The absence of strong federal and central government involvement following Hurricane Maria has prompted organized innovation and resilience at the local level that was never expected, Ortíz-García says. The mayor of San Sebastián pulled together volunteers who were certified electricians, former power-utility employees, retirees, and others, like private construction contractors with heavy equipment. “They put those guys together and started electrifying neighborhoods on their own,” said Professor Ortíz-García.

Solving real-life problems

Ciencia Puerto Rico (CienciaPR) is a non-profit organization that promotes science communication, education, and research in Puerto Rico. It received a grant from the National Science Foundation to implement project-based science lessons on disaster-related topics. The middle school education program features lesson activities that are related to what’s happening in Puerto Rico and are culturally relevant.

The first lessons implemented included how to properly wash hands when clean water is scarce and understanding the effect of the storm on the terrestrial environment.

Educators at the Boys and Girls Club of Puerto Rico learn how to use the Foldscope.

Each child is given a paper microscope and asked to conduct a research project to answer a question they have about how the storms have affected the environment. At the end of June, the students will share their findings with the community.

The project is funded by a RAPID grant, which is awarded for one to two years to respond to emergency or one-off events. The Foundation has awarded about 40 grants associated with Hurricane Maria, according to their website. Most of them are RAPID grants and about 25 percent of them have been awarded to scientists in Puerto Rico.

RAPID grants associated with Hurricane Maria have required INESI to adapt its vision, says Professor Pérez-Lugo. INESI’s basic mission is to look at Puerto Rico from a local perspective to insert local knowledge into the policy process. But the flood of effort coming from outside universities has required them to attempt to identify and coordinate those doing research and relief work in Puerto Rico. INESI initially counted 20 universities conducting research, but other initiatives and projects involving energy and the electric system have been identified since. In some cases, there were three or four teams from the same university working in Puerto Rico that were unaware of the presence of the other teams. “So, these universities found out about their colleagues through us,” said Pérez-Lugo.

The workers and researchers tended to be concentrated in only a couple of municipalities, leaving many areas neglected. INESI coordinated their efforts to avoid fatigue, to avoid saturation in some areas, and to distribute aid in a more just and equal way, Pérez-Lugo says.

Updating approaches to disaster

Most classrooms at the University of Puerto Rico were filled with water, some with vegetation, and many with broken equipment.

According to Ortíz-García, INESI was founded prior to the arrival of Hurricane Maria in recognition of the flaws associated with the fragmented organization at the university. Like most universities, it is organized to accomplish the goals of teaching, research, and service, which is an organization best suited to the scientific processes of discovery, knowledge creation, and scientific inquiry. “But these are different times,” says Ortíz-García, “with problems that are not aligned with a fragmented, unidisciplinary approach.”

“But that’s an outdated approach because now we know that energy transitions are embedded in everything that society values, from water to health, to safety and security, and to food. So, multiple organizations will need to be involved to solve the problem and they need a common language to fix something.”

INESI has been working toward taking the University of Puerto Rico to the next level of university organization, with networks of interest and practice within and throughout interconnecting disciplines. “Instead of concentrating on a scientific development in one discipline, scientists need to concentrate on the effective design of solutions to issues that don’t belong to any discipline, like climate change,” says Ortíz-García.

Collaborative convergence platforms such as INESI can foster interdisciplinary dialogue and the generation of solutions for these issues. Now, inspired by the influx of representatives from other universities to Puerto Rico in the wake of Hurricane Maria, INESI wants to build a platform of platforms.

RISE Puerto Rico

A group representing an inter-university collaborative convergence platform will meet for a foundational catalyst workshop at the end of June. Twenty-seven people from ten universities have already accepted the invitation and will meet face to face for the first time.

“The platform that we’re looking to build here, we’ve already preliminarily named it RISE Puerto Rico, which stands for Resiliency through Innovations in Sustainable Energy,” Pérez-Lugo says.

Starting these dialogs now will go a long way, Ortíz-García says, in reorganizing academic environments toward finding the solutions necessary to fix these problems. “In addition, it can foster innovation in ways our own organizational structure could never, ever think of because you would have spin-off after spin-off of academic conversations not only with the scientists but also community and other stakeholders’ knowledge that is out there from leading these events themselves,” he says.

Córdova-Figueroa is optimistic about the research opportunities in Puerto Rico.

He would like to see many scientists from around the world take advantage of the myriad research opportunities available. “Come to Puerto Rico,” he says. “You will learn something great here.”

Dr. Kimber Price is a science communications graduate student at the University of California, Santa Cruz. Follow her on Twitter: @LowcountryPearl

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Understanding 1.5°C: The IPCC’s Forthcoming Special Report

Photo: IISD

The Intergovernmental Panel on Climate Change (IPCC), an international body that develops climate science assessments for decisionmakers that are policy-relevant but not policy-prescriptive, is currently compiling a Special Report that will provide information on what it would take to limit global warming to 1.5 degrees Celsius above pre-industrial levels. The report will also assess the climate impacts that could be avoided by keeping warming to this level, and the ways we can limit the worst impacts of climate change and adapt to the ones that are unavoidable. Report authors and government representatives will meet in Incheon, Republic of Korea from October 1-5 to review the report, with the report’s Summary for Policymakers due to be released on October 7 at 9 p.m. Eastern US time (10:00 a.m. on October 8 in Korea Standard Time). The report is slated to come out just as nations look toward revising the commitments they made to achieve the goals of the Paris Agreement.

The Paris Agreement is a worldwide commitment adopted in 2015 under the United Nations Framework Convention on Climate Change (UNFCCC) to reduce global warming emissions and limit the increase in global temperature to well below 2°C. More specifically, the Paris Agreement includes a goal of “Holding the increase in the global average temperature to well below 2°C above pre-industrial levels and pursuing efforts to limit the temperature increase to 1.5°C above pre-industrial levels, recognizing that this would significantly reduce the risks and impacts of climate change.” Small Island Developing States that are disproportionately vulnerable to global warming were pivotal in the inclusion of the 1.5°C goal.

It has been recognized that efforts beyond those spelled out in the commitments made to meet the goals of the Paris Agreement (the Nationally Determined Contributions) will be necessary to limit global warming to 1.5°C above pre-industrial levels. As a result, policymakers are interested in what it would take to achieve this goal, as well as the benefits and tradeoffs to consider as countries look to ramp up their commitments. (This report fulfills an invitation made by UNFCCC member countries, including the U.S., during the adoption of the Paris Agreement for “the Intergovernmental Panel on Climate Change to provide a special report in 2018 on the impacts of global warming of 1.5 °C above pre-industrial levels and related global greenhouse gas emission pathways.”) The report is thus intended to inform such deliberations and respective domestic and international climate policy.

Just preceding the invitation to the IPCC to produce the Special Report, UNFCCC member countries also decided, “to convene a facilitative dialogue among Parties in 2018 to take stock of the collective efforts of Parties in relation to progress towards the long-term goal referred to in Article 4, paragraph 1, of the Agreement and to inform the preparation of nationally determined contributions pursuant to Article 4, paragraph 8, of the Agreement.” The Special Report will be an important input into this global stock-take, which will be a prominent feature of the forthcoming UNFCCC Conference of Parties (COP24) in Poland in December of this year.

In addition to its Summary for Policymakers, the report will have five underlying chapters, as well as an introductory section, several break-out boxes, and frequently asked questions. The chapter titles will be as follows:

  • Chapter 1: Framing and context
  • Chapter 2: Mitigation pathways compatible with 1.5°C in the context of sustainable development
  • Chapter 3: Impacts of 1.5°C global warming on natural and human systems
  • Chapter 4: Strengthening and implementing the global response to the threat of climate change
  • Chapter 5: Sustainable development, poverty eradication and reducing inequalities

Among other topics, the report will provide information on the global warming emissions reductions required to keep global warming to 1.5°C relative to the emissions reductions necessary to limit warming to 2°C. And it will compare the different paths nations can take to achieve these emissions reductions, including the opportunities and challenges that meeting the 1.5°C goal will present from socio-economic, technological, institutional, and environmental perspectives. The report will also assess the impacts that could be avoided if warming is kept to 1.5°C instead of 2°C, as well as the emissions reduction options relevant to keeping warming to 1.5°C and the options available to prepare for projected impacts. Furthermore, authors have been tasked with considering how to limit warming to 1.5°C together with sustainable development and poverty eradication efforts, and the implications of pursuing this goal for ethics and equity.

The report is being prepared by experts from a diverse array of countries and institutions, including from the United States. The IPCC does not conduct new science. Instead, authors reviewed the best available, peer-reviewed literature relevant for each chapter, and employed set criteria to characterize the evidence relevant to key topics, including the level of confidence, agreement, and uncertainty in the evidence base (as an example, here is a link to the criteria employed in the IPCC’s Fifth Assessment Report).

The IPCC’s Special Report on 1.5°C Warming has undergone a lengthy review process, including an internal review, and multiple expert and government reviews. For example, 2000 experts registered to review the First Order Draft, along with 489 government reviewers from 61 countries. The review process provides a mechanism for engaging and incorporating input from a diverse and inclusive set of experts. Authors are required to consider and respond to each comment received – responses that are then made publicly available.

The Union of Concerned Scientists will be reviewing the released documents on October 8th and posting a series of blogs that will cover key aspects of the report. Although the report is not public yet, the latest scientific literature (e.g. that assessed in the recent U.S. Climate Science Special Report) has only underscored the urgent need for action to limit the heat-trapping emissions that are fueling climate change. This new IPCC Special Report will almost certainly make this point even clearer, as it will show the world what can be avoided if we ramp down emissions now. There is too much at stake for humanity to indulge in further delay or inaction.

The Coal Bailout Nobody is Talking About

Photo: Mike Poresky/CC BY (Flickr)

If you are reading this blog, chances are you are either an energy economist, a grid geek, or maybe my mother. Regardless, this administration seems intent on trying various coal bailouts. Hopefully, you’ve already read up on the high costs and low benefits of such bailouts, how the first attempt failed, and how they’re at it again. My latest research has uncovered that every month, millions of consumers are unwittingly bailing out coal-fired power plants, to the tune of over a billion dollars a year.

Merchant vs monopoly

Before we dig into the numbers, let’s talk about the two types of utilities that own coal power plants.

Merchant utilities primarily rely on revenues from the competitive power markets to make money. Monopoly utilities, on the other hand, own power plants and directly serve retail customers. They are household names in the areas they operate, probably because they send those households a monthly bill; those bills are how those utilities make money.

The first part of this new research looked at how these two types of utilities operated in four large competitive power markets. These markets were designed such that power plants that are cheap to operate should run more often than plants that are expensive to operate.

My analysis looked at market prices for energy where every coal plant is located and calculated how often a plant would operate based on market prices alone. I then compared that “expected” value with the power plant’s actual operational data.
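
As a rough sketch of that comparison (not the actual model, and with made-up numbers), the logic looks something like this:

```python
# A minimal sketch of comparing "expected" vs actual operation for one
# coal unit: assume the unit should run only in hours where the local
# market price covers its marginal cost. All numbers are illustrative.

def capacity_factors(hourly_prices, hourly_generation_mwh, marginal_cost, capacity_mw):
    """Return (expected, actual) capacity factors for one unit."""
    hours = len(hourly_prices)
    # Expected output: full capacity only in hours where price >= cost.
    expected_mwh = sum(capacity_mw for p in hourly_prices if p >= marginal_cost)
    actual_mwh = sum(hourly_generation_mwh)
    return expected_mwh / (capacity_mw * hours), actual_mwh / (capacity_mw * hours)

# Example: a 500 MW unit with a $25/MWh marginal cost that keeps running
# even in low-price hours.
prices = [20, 22, 30, 35, 18, 27]           # $/MWh
generation = [500, 500, 500, 500, 500, 0]   # MWh per hour
print(capacity_factors(prices, generation, marginal_cost=25, capacity_mw=500))
# -> expected 0.5, actual ~0.83: the unit "overgenerates" relative to prices
```

A unit whose actual capacity factor sits well above its price-implied expectation is the pattern this analysis flags.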

For the most part, the merchant power plants operated at or below the expected value. Power plants that were owned by monopolies, however, typically operated more than would be expected.

The expected vs actual value of several large utilities. You would expect the dots to fall along the diagonal line, but monopoly-owned plants tend to overgenerate.

The stark difference begged for additional investigation, so I used hourly data to conduct a detailed analysis of each power plant to discover if power plants were operating at times when cheaper energy was available.

Billion-dollar bailout

Some utilities appear to have found a way to undermine the competitive market structure that would have lower-cost resources operate more and higher-cost resources operate less. Expensive coal plants—which are objectively not competitive—are being operated in a way that costs consumers money, reduces flexibility, and exacerbates existing pollution problems.

Based on my analysis, monopoly utilities appear to be running more expensive plants while depriving their customers of access to cheaper (and likely cleaner) sources of energy.

This new analysis builds on earlier work of mine that investigated this issue in the Southwest Power Pool (SPP), the market that covers several Great Plains states. My original analysis calculated that ratepayers were incurring a burden of $150 million a year from just a few power plants. The new analysis (which includes all coal units in SPP) indicates that the number is closer to $300 million a year, just for SPP.

I looked at four large electricity markets: SPP, ERCOT, MISO, and PJM. Together, these four markets span from New Jersey to North Dakota and from Texas to Virginia.

The latest results suggest that, across the four coal-heavy energy markets, coal-fired power plants incurred $4.6 billion in market losses over the past three years, or roughly $1.5 billion in market losses each year. Most of these “losses” were incurred by power plants owned by monopoly utilities and are not absorbed by the investors or owners. Rather, those costs were likely covered by customers. Consequently, I estimate this practice places at least a $1 billion burden on utility ratepayers each year.

New spin on old news

The fact that so much coal-fired power is uneconomic is not new news. The financial woes of coal have been well documented by UCS, the Rhodium Group and Columbia University, Bloomberg New Energy Finance, Moody’s, Bank of America Merrill Lynch, MJ Bradley, Rocky Mountain Institute, UBS, Synapse Energy Economics, IEEFA, and others.

My new research differs in two ways. First, it quantifies the financial impact on consumers when utilities opt for dirty and expensive fuel over cleaner and cheaper alternatives. Second, this work focuses on a very specific aspect of how these coal plants operate and draws a somewhat unintuitive conclusion:

Some coal-fired power plants might make more (or lose less) money by operating less. 

My analysis further suggests that, for at least some of the owners of these power plants, the current economic woes are self-inflicted.

Throwing good money after bad

If it costs $25 to produce a unit of energy and the market price is $30 per unit then it makes sense to operate that power plant and take that $5 margin and use it to pay down debt or other fixed costs.

If market prices stay at $30, the power plant keeps operating as much as it can, pays down the fixed costs, and eventually the revenues go toward building up a profit. If the market price drops to $20 and the unit keeps operating, the owner’s profits begin to erode. The longer the owner does this, the more the profits erode.
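
A toy example with made-up numbers makes the erosion concrete:

```python
# Toy illustration of margin erosion (all numbers are hypothetical).
PRODUCTION_COST = 25.0  # $ per unit of energy

def cumulative_margin(prices):
    """Total margin from running the plant in every priced period."""
    return sum(p - PRODUCTION_COST for p in prices)

print(cumulative_margin([30.0] * 10))                # 50.0: a $5/unit cushion builds up
print(cumulative_margin([30.0] * 10 + [20.0] * 10))  # 0.0: running at $20 gives it all back
```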

“Stop throwing away money,” is not something you’d expect to have to tell a corporation.

Some plants generate at a loss so often that it becomes impossible for them to make money.

If utilities allowed the market price to determine when to run, this wouldn’t happen. However, in the competitive markets I analyzed, power plant owners can choose to ignore price signals and “self-commit” or “self-schedule” their plants, effectively bypassing the independent system operator’s market-based dispatch.

If a merchant-owned power plant does this, it does so at its own risk. But when monopoly utilities do this, customers bear the burden. Below is the list of the 15 power plants that each imposed at least a $100 million burden on ratepayers over the 3-year study period.

Power plants that imposed at least $100 million on ratepayers. Note that many of the merchant-owned plants (highlighted in purple) burn waste coal and/or are “cogen” facilities.

The common excuses I hear

I’ve talked about this issue with advocates, economists, lawyers, engineers, market monitors, utility operators, and reporters. I’d like to take a moment to share some of the things I’ve heard in response to my analysis, and my responses to them.

The most common: Aren’t these plants needed for reliability? 

Markets are designed to provide low-cost, reliable power. The idea that a power plant needs to bypass the market’s decision-making process and self-select (as opposed to market-select) presumes that the market is incapable of doing its job. Arguably, if the clearing price in a market is consistently below a power plant’s production costs, then other resources were available to reliably provide lower-cost power. In some cases, a plant might be needed in some months but not others, like the municipal coal plant owner in Texas that realized it only made sense to operate in the summer months and decided to sit idle eight months of the year. The municipality still provides electricity to its customers year-round; it just decided it didn’t make sense to burn $25/MWh coal in a $20/MWh market.

At the end of the day, this research was not designed to indicate or evaluate reliability and makes no judgment about the “need” for any of these plants for reliability purposes.

The most insulting: You just don’t understand how this works.

After the SPP report came out, SPP and utility officials challenged my conclusions (oddly, they did not contest my results). According to E&E News, that pushback included the specious argument that “[w]holesale electric rates do not directly correlate with retail electricity rates…” and that “[w]holesale electric rates also do not reflect other costs, like the price of ensuring the grid’s reliability, or utilities’ long-term fuel supply contracts.”

Really? Retail and wholesale are different? Please, tell me more.

Of course retail rates are different; they include additional costs (for example, distribution system costs). Yes, regulated monopolies are allowed to recover prudent capital costs from the past, but we are only talking about operating costs.

None of these arguments changes the fact that when wholesale prices are low, there is an opportunity for utilities to buy lower-cost energy off the market and pass along the savings to customers.

The silliest: I have the right to “self-supply.”  

When monopoly utilities joined competitive markets, they did so voluntarily. In fact, some had to jump through hoops to do it. These utilities often have a “least cost” obligation, meaning they are supposed to provide electricity to retail customers at the lowest reasonable cost. Now that there are resources available that are lower cost, utilities are desperate to retain their monopoly rights to supply their customers with resources they own. That’s just silly from any point of view other than the plant owner.

The most technical: Fuel contracts turn fuel costs into fixed costs.

Being the most technical excuse, this is also the most complex, and not easily handled in a few-hundred-word response. Maybe I’ll come back to it in a future blog, but in the meantime…

Many coal-fired power plants enter into contracts for fuel which have “take-or-pay” provisions. Utilities claim this means there is effectively no cost to burning the fuel. First, most contracts can be re-negotiated. Fuel contracts I have reviewed are akin to a rental agreement: Yes, technically, you are locked in for a number of years, but typically there are ways to negotiate your way out. Second, the accounting logic they use to justify discounting the coal costs can produce sub-optimal results when companies fail to appropriately account for opportunity costs.

What’s next?

This research raises interesting questions, including:

  • Does this impact how we value energy efficiency and renewable energy?
  • Are power plants owned by monopoly utilities receiving de-facto ‘out of market’ subsidies?
  • When do fuel contracts and fuel cost accounting become imprudent?
  • Are inflexible coal units crowding out renewables on the transmission system?
  • Is coal partially to blame for negative energy market prices?

UCS wants allies, utilities, and decision makers to look at this question of operating uncompetitive coal plants without regard for the availability of lower-priced energy. Utilities have the ability to stop engaging in this practice; if they don’t, regulators have the agency to create (dis)incentives to help end it. If neither of those groups acts, consumer and environmental groups would seem well aligned to work together to stop it.


In a Warming World, Carolina CAFOs Are a Disaster for Farmers, Animals, and Public Health

North Carolina hog CAFO in Hurricane Florence floodwaters, September 18, 2018. Photo: Larry Baldwin, Crystal Coast Waterkeeper/Waterkeeper Alliance

In the aftermath of Hurricane Florence, I’ve joined millions who’ve watched with horror as the Carolinas have been inundated with floodwaters and worried about the various hazards those waters can contain. We’ve seen heavy metal-laden coal ash spills, a nuclear plant go on alert (thankfully without incident), and sewage treatment plants get swamped. But the biggest and most widely reported hazard associated with Florence appears to be the hog waste that is spilling from many of the state’s thousands of CAFOs (confined animal feeding operations), and which threatens lasting havoc on public health and the local economy.

And while the state’s pork industry was already under fire for its day-to-day impacts on the health and quality of life of nearby residents, Florence has laid bare the lie that millions of animals and their copious waste can be safely concentrated in flood-prone coastal areas like southeastern North Carolina.

CAFO “lagoons” are releasing a toxic soup

The state is home to 9.7 million pigs that produce 10 billion gallons of manure annually. As rivers crested on Wednesday, state officials believed that at least 110 hog manure lagoons—open, earthen pools where pig waste is liquefied and broken down by anaerobic bacteria (causing their bubblegum-pink color) before being sprayed on fields—had been breached or inundated by flood waters across the state:

The tally by the North Carolina Department of Environmental Quality is rising rapidly (it was just 34 on Monday). Perhaps not surprisingly, the state’s pork industry lobby group is reporting much smaller numbers: by Wednesday afternoon, the North Carolina Pork Council’s website listed only 43 lagoons affected by the storm and flood.

In any case, the true extent of the spills may not be known for many days, as extensive road closures in the state continue to make travel and assessment difficult or impossible.

The scale of North Carolina’s CAFO industry is shocking

In 2016, the Waterkeeper Alliance and the Environmental Working Group used federal and state geographical data and analyzed high-resolution aerial photography to create a series of interactive maps showing the locations and concentration of CAFOs in the state. The map below shows the location of hog CAFOs (pink dots), poultry CAFOs (yellow dots), and cattle feedlots (purple dots) throughout the state.

Waterkeeper Alliance and the Environmental Working Group used public data to create maps of CAFO locations in North Carolina in 2016. For more information and interactive maps, visit https://www.ewg.org/interactive-maps/2016_north_carolina_animal_feeding_operations.php#.W6KBLPZReUk.

Note the two counties in the southeastern part of the state, Duplin and Sampson, where the most hog CAFOs are concentrated—nearly as pink as a hog lagoon, these counties are Ground Zero for the state’s pork industry. In Duplin County alone, where hogs outnumber humans 40-to-1, the Waterkeeper/EWG data show there were, as of 2016, more than 2.3 million head of swine producing 2 billion gallons of liquid waste per year, stored in 865 waste lagoons. (Duplin County was also home to 1,049 poultry houses containing some 16 million birds that year.)

The state’s CAFOs harm communities of color most

“Lagoon” is a curious euphemism for a cesspool. Even without hurricanes, these gruesome ponds pose a hazard to nearby communities. In addition to the obvious problem of odor, they emit a variety of gases—ammonia and methane, both of which can irritate eyes and respiratory systems, and hydrogen sulfide, which is an irritant at very low exposure levels but can be extremely toxic at higher exposures.

These everyday health hazards hurt North Carolinians of color most of all. To pick on Duplin County again, US Census figures show that one-quarter of its residents are black and 22 percent are Hispanic or Latino. And a 2014 study from the University of North Carolina at Chapel Hill found that, compared to white people, black people are 54 percent more likely to reside near these hog operations, Hispanics are 39 percent more likely, and Native Americans are more than twice as likely.

What does all that mean for health and environmental justice? Residents near the state’s hog CAFOs have complained for years of sickening odors, headaches, respiratory distress, and other illnesses, and have filed (and begun winning) a series of class-action lawsuits against the companies responsible for them.

Just this month, researchers at Duke University published new findings on health outcomes in communities close to hog CAFOs in the state. They found that, compared with a control group, such residents have higher rates of infant death, death from anemia, and death from all causes, along with higher rates of kidney disease, tuberculosis, septicemia, emergency room visits and hospital admissions for low-birthweight infants. (Read the full study or this review.)

CAFO damage from Florence was predictable…and will get worse

Releases of bacteria-laden manure sludge from CAFO lagoons in flooding like we’re seeing this week compound the day-to-day problem, and they’re inevitable in a hurricane- and flood-prone state like North Carolina. Between 1851 and 2017, 372 hurricanes affected the state, with 83 making direct landfall. Hurricane Floyd in 1999 and Hurricane Matthew in 2016 wreaked havoc similar to what we’re seeing this week.

As you can see on the map below, Florence dumped between 18 and more than 30 inches of rain on every part of Duplin County.

http://www.nc-climate.ncsu.edu/climateblog?id=266

It’s not surprising that flooding from such an event would be severe. And while the North Carolina Pork Council called Florence “a once-in-lifetime storm,” anyone who’s paying attention knows it’s just a matter of time before the next one.

Millions of animals are likely drowned, starved, or asphyxiated

In addition to the effects on communities near North Carolina’s CAFOs, it’s clear that Hurricane Florence has caused tremendous suffering and death to animals housed in those facilities. Earlier this week, poultry company Sanderson Farms reported at least 1.7 million chickens dead, drowned by floodwaters that swamped their warehouse-like “houses.” Some 6 million more of the company’s chickens cannot yet be accounted for. Overall, the state Department of Agriculture and Consumer Services on Tuesday put the death toll at 3.4 million chickens and turkeys and 5,500 hogs, but those numbers may very well rise.

A major reason we don’t yet know the full extent of animal deaths in North Carolina’s CAFOs is that road closures due to flooding have cut off many of the facilities, preventing feed deliveries and inspections. Many animals likely also died in areas that experienced power failures due to the storm. According to this poultry industry document, a power outage that interrupts the ventilation system in a totally enclosed poultry CAFO can kill large numbers of birds by asphyxiation “within minutes.”

North Carolina farmers face staggering financial losses and likely bankruptcies

And what about the farmers? Many of the nation’s hog and poultry producers are already in a predicament. Corporate concentration has squeezed out many independent farmers, meaning more operate as contractors to food industry giants like Smithfield and Tyson. In the US pork industry, contract growers accounted for 44 percent of all hogs and pigs sold in 2012. The farmers have little power in those contracts, and an early action of the Trump administration’s USDA served to remove newly gained protections against exploitation by those companies. The administration’s trade war isn’t helping either.

As one expert in North Carolina put it as Hurricane Florence approached:

A farmer (who operates a CAFO) has very little flexibility. They take out very large loans, north of a million dollars, on a facility that is specifically designed by the industry, as well as how the facility will be managed. Remember that 97% of chickens and more than 50% of hogs are owned by the industry. These farmers never even own the animals. But if the animal dies, and how to handle the waste, that’s on the farmer. That’s their responsibility.

I know many individual farmers who do the best they can, who work as hard as they can, who treat their animals with respect. But there’s only so much they control. They can’t control the weather. They can’t control the hurricane. These farmers are part of an industry that says, for the sake of efficiency, you have to put as many animals as possible into these facilities.

Post-Florence, these contract farmers are likely to receive inadequate compensation for the losses of animals in their care. A series of tweets this week by journalist Maryn McKenna, who has studied the poultry industry, illuminates the issues:

So, as the waters recede, many hog and poultry farmers are about to find themselves responsible for a ghastly cleanup job. Imagine returning home to find thousands of bloated animal corpses rotting in the September sun. They were your livelihood, and now they’re not only lost but an actual liability you must pay to have hauled away.

Public policies should encourage sustainable livestock production, not CAFOs

And so it goes for farmers in today’s vertically integrated, corporate-dominated CAFO model. But it doesn’t have to be this way. Public policies can give more power to livestock farmers in the marketplace, protect animals and nearby communities from hazards associated with CAFOs, and facilitate a shift to more environmentally and economically sustainable livestock production practices.

If Hurricane Florence teaches us anything, it’s that flood-prone coastal states like North Carolina are no place for CAFOs. At a minimum, the state must tighten regulations on these facilities to protect public health and safety. A 2016 Waterkeeper Alliance analysis found that just a dozen of North Carolina’s 2,246 hog CAFOs had been required to obtain permits under the Clean Water Act, with the rest operating under lax state regulation. The state and federal government should also more aggressively seek to close down hog lagoons and help farmers transition to more sustainable livestock practices or even switch from hogs to crops. A buyout program already exists but needs much more funding.

In the meantime, the federal farm bill now being negotiated by Congress also has a role to play. At least one farm bill program, the Environmental Quality Incentives Program, or EQIP, has been used in ways that underwrite CAFOs. In a 2017 analysis of FY16 EQIP spending, the National Sustainable Agriculture Coalition noted that 11 percent ($113 million) of EQIP funds were allocated toward CAFO operations, funding improvements to waste storage facilities and subsidizing manure transfer costs. And the House version of the 2018 farm bill could potentially increase support for CAFOs by eliminating the Conservation Stewardship Program—which incentivizes more sustainable livestock practices and offers a 4-to-1 return on taxpayer investment overall—and shifting much of its funding to EQIP.

The post-Florence mess in North Carolina illustrates precisely why that’s a bad idea. Particularly in a warmer and wetter world, public policies and taxpayer investments should seek to reduce reliance on CAFOs, not prop them up.

Utilities Look Toward a Clean Energy Future, Yet the Administration Keeps Looking Back

Coal—and the president’s ill-conceived plan to bail out the industry—is taking up a lot of bandwidth in discussions around national energy policy. As the president and the coal industry continue to rely on dubious arguments to justify the idea of keeping economically struggling coal plants afloat, we began to wonder: what are electric utilities doing on the ground? How are they approaching the question of future investments in technology?

Today I’m handing my blog over to our 2018 UCS Schneider Fellow Eli Kahan, who has studied how utilities are planning for future electricity needs, what they are doing in terms of new investments in low-carbon generation sources, and how that compares to their stated goals.

Market pressures

In the past decade, thanks to advances in horizontal drilling technology and the fracking boom, domestic natural gas prices have plummeted, incentivizing many power plant owners to shift from coal to natural gas. Meanwhile, renewable energy sources such as wind and solar have continued to grow at record rates as they have become economically competitive with traditional forms of electricity generation. A look at Lazard’s 2017 Unsubsidized Levelized Cost of Energy report shows onshore wind as the cheapest source of new generation, with utility-scale solar not far behind.
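For context, levelized cost of energy (LCOE) is simply a plant’s discounted lifetime costs divided by its discounted lifetime generation. Here is a minimal sketch of the generic textbook formula with invented inputs; this is not Lazard’s exact methodology or data.

    # Levelized cost of energy: discounted lifetime costs divided by
    # discounted lifetime generation. All inputs are hypothetical.
    def lcoe(capex, annual_opex, annual_mwh, years, rate):
        costs = capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))
        energy = sum(annual_mwh / (1 + rate) ** t for t in range(1, years + 1))
        return costs / energy  # $/MWh

    # Illustrative wind farm: $120M to build, $3M/yr to operate,
    # 350,000 MWh/yr of output, 25-year life, 7% discount rate.
    print(round(lcoe(120e6, 3e6, 350_000, years=25, rate=0.07), 1))  # ~38.0 $/MWh

At roughly $38 per MWh, the made-up plant above lands in the same general range Lazard reports for onshore wind, which is the point of the metric: it lets very different technologies be compared on a single number.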

In short, the rapid decline in coal generation has been driven primarily by market forces. While the administration’s proposed bailout might buoy a few coal plants destined for retirement in the next few years, this crutch is unlikely to keep coal afloat amid ever-falling costs of renewable energy, to say nothing of the urgent need to act on climate change.

Another key reason this bailout is unlikely to stick is a recent surge in announcements of utility goals to reduce CO2 emissions. Who is setting these targets? What is driving them? And how are utilities planning to meet these goals?

Utilities around the country have announced significant emissions reduction goals.

The targets

To begin answering these questions, I gathered data on more than 100 of the nation’s largest electric utilities and parent energy companies—more than 30 of which have set quantitative targets for reducing CO2 emissions, achieving higher percentages of renewable energy, and/or completely moving off of coal. Together these companies, including 7 of the 10 largest electric utilities in the country by market value, account for nearly 40% of 2016 US electricity sales. In setting their sights on a lower-carbon future, these utilities are showing awareness of the science, the costs, and their customers’ concerns.
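The underlying tally is simple in principle. Below is a minimal sketch with invented utility entries; the real dataset covered more than 100 companies, and the US sales total is an approximate EIA figure.

    # Share of US electricity sales covered by target-setting utilities.
    # The utility entries are invented; the US total is approximate (EIA, 2016).
    US_RETAIL_SALES_TWH_2016 = 3_762

    utilities = [
        {"name": "Utility A", "sales_twh": 90.0, "has_target": True},
        {"name": "Utility B", "sales_twh": 62.5, "has_target": False},
        {"name": "Utility C", "sales_twh": 48.0, "has_target": True},
    ]

    covered = sum(u["sales_twh"] for u in utilities if u["has_target"])
    print(f"{covered / US_RETAIL_SALES_TWH_2016:.1%} of 2016 US retail sales")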

While these goals come in many shapes and sizes, they all have one thing in common: additionality—each voluntarily exceeds state goals and mandates. Moreover, the majority of these targets have been set in the past couple of years—a time when the Clean Power Plan has been on the chopping block and during a fossil-fuel-friendly presidency.

Furthermore, as the map below shows, this is a national trend—these targets are not restricted to just a few leading states but rather can be found all around the country, even in places that rely heavily on coal.

Electric utility companies with targets deemed to provide additionality. Not shown: Avangrid, Engie, Great River Energy, MidAmerican Energy, Minnesota Power, NextEra Energy, NRG, Tennessee Valley Authority. Since the analysis for this project was completed, California’s SB100 has nullified the additionality of PG&E’s and LADWP’s goals.

What else is driving this trend?

There’s little doubt that falling prices for natural gas and renewables have aligned cutting emissions with cutting costs, and the success of state renewable portfolio standards (RPSs) and greenhouse gas (GHG) emissions reductions targets has played a role in breaking up the inertia on decarbonization. But according to the utilities themselves, there exists a whole host of additional motivating factors.

For AEP’s Appalachian Power President, Chris Beam, the motivation is customer preferences:

“At the end of the day, West Virginia may not require us to be clean, but our customers are”  –Chris Beam

For Berkshire Hathaway Energy’s vice president for government relations, Jonathan Weisgall, it’s customer demand, particularly the commitments of large corporate buyers:

“We don’t have a single customer saying, ‘Will you build us a 100 percent coal plant?’…Google, Microsoft, Kaiser Permanente — all want 100 percent renewable energy. We’re really transitioning from a push mandate on renewable energy, to more of a customer pull.” –Jonathan Weisgall

For DTE, it is a science-based response to the issue of climate change:

“Through our carbon reduction plan, DTE Energy is committed to being a part of the solution to the global climate crisis. There is broad scientific consensus that achieving 80 percent carbon reduction by 2050 will be necessary to begin to limit the global temperature increase below two degrees Celsius over preindustrial levels” –DTE 2018 Environmental, Social, Governance, and Sustainability Report, p.3

Even coal-heavy American Electric Power (AEP) is moving toward a clean energy future, recognizing the value of investing in renewable energy. AEP serves customers in 11 states—including Ohio, Indiana, Kentucky, Virginia, and West Virginia. Nick Akins, AEP’s CEO, is skeptical of the administration’s plans to bail out certain coal and nuclear plants. Although his position is a bit more nuanced—he maintains that coal remains important to “the reliability and resiliency of the grid”—Akins is clear-eyed about the future, charting AEP’s path toward a 60 percent reduction in carbon emissions by 2030, and 80 percent by 2050:

“Our customers want us to partner with them to provide cleaner energy and new technologies, while continuing to provide reliable, affordable energy.” –Nick Akins

How are utilities meeting these goals?

For the past decade, utilities have been cutting GHG emissions and costs by shedding coal and transitioning to natural gas. Natural gas is sometimes seen as a “transitional fuel” on the path to a clean energy future because burning it produces 50 to 60 percent less CO2 than burning coal. Southern Company, shown below, provides a particularly illustrative example of this coal-to-natural-gas switching.
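That 50-to-60-percent figure is easy to sanity-check with a back-of-envelope calculation. The sketch below uses approximate combustion emission factors and typical plant efficiencies; every number is a rounded assumption for illustration, not an official figure.

    # Rough CO2 intensity of delivered electricity, per MWh.
    # Emission factors (kg CO2 per MMBtu of fuel burned) are approximate.
    COAL_KG_PER_MMBTU = 95.0   # typical bituminous coal
    GAS_KG_PER_MMBTU = 53.1    # natural gas

    def kg_co2_per_mwh(kg_per_mmbtu, plant_efficiency):
        # 1 MWh of electricity = 3.412 MMBtu; divide by efficiency for fuel input
        return kg_per_mmbtu * 3.412 / plant_efficiency

    coal = kg_co2_per_mwh(COAL_KG_PER_MMBTU, 0.34)  # coal steam unit
    gas = kg_co2_per_mwh(GAS_KG_PER_MMBTU, 0.50)    # combined-cycle gas unit
    print(f"coal: {coal:.0f} kg/MWh, gas: {gas:.0f} kg/MWh, cut: {1 - gas/coal:.0%}")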


However, natural gas is not particularly clean, especially compared with available renewable energy alternatives. And with the recent dramatic cost declines in wind and solar, companies like MidAmerican Energy, shown above, have realized they can skip the “transition fuel” and get a head start on investing in a longer-term energy solution.

Over and over, it’s the same story:

“Retiring older, coal-fueled units. Building advanced technology natural gas units. Investing in cost-effective, zero-carbon, renewable generation” – WEC Energy Group

“The company expects to achieve the reductions through a variety of actions. These include replacing Kentucky coal-fired generation over time with a mix of renewables and natural gas” – PPL

“Traditionally, our generation portfolio has depended on coal, but we are transitioning our energy supply away from coal to rely more on renewable energy and natural gas generation as backup. From 2005 through 2026, we will retire more than 40 percent of the coal-fueled capacity we own under approved plans, and if regulators approve our proposed Colorado Energy Plan in 2018, we will retire even more.” – Xcel

In short, utilities around the country are retiring coal and investing in natural gas and renewables. But it’s important to emphasize that there is a serious risk of overreliance on natural gas: it is subject to price volatility (putting consumers at risk of higher prices) and is inconsistent with long-term deep decarbonization targets. Utilities that instead prioritize clean, cost-effective renewable energy are protecting their customers from those gas-related risks.

Connecting the dots

We’ve seen a recent flurry of utility announcements of decarbonization goals and renewables targets. In the last decade, natural gas has steadily eaten away at coal’s market share of the nation’s electricity production. And the costs of renewables continue to fall dramatically, making them both clean and cost-effective. This trend shows no sign of changing. Some utilities such as Consumers Energy have even explicitly pledged to drop coal altogether.

In that context, how does the administration’s proposed bailout, which would cost billions to keep a few of the nation’s oldest, dirtiest, most expensive coal plants online for another couple of years, make sense? Grid operators and federal regulators have consistently said that near-term retirements pose no threat to grid reliability. Reserve margins—that is, how much extra electricity generating capacity is available beyond expected peak demand levels—are sufficient in most regions of the country. And studies show renewables are diversifying the electricity mix, making the electricity grid more reliable and resilient.
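For reference, reserve margin is simple arithmetic: spare capacity as a fraction of expected peak demand. A minimal illustration with made-up numbers follows; the ~15 percent reference level in the comment is a common planning rule of thumb, not a universal requirement.

    # Reserve margin = (available capacity - expected peak demand) / expected peak.
    # All numbers below are invented for illustration.
    capacity_mw = 120_000
    expected_peak_mw = 100_000

    reserve_margin = (capacity_mw - expected_peak_mw) / expected_peak_mw
    print(f"{reserve_margin:.0%}")  # 20%, above a typical ~15% planning target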

We see that some utilities are making decisions prudent for their long-term planning: smart investments, achievable decarbonization targets, and a mix of energy sources in their portfolios. But a bailout of this nature will send utilities—along with their investors—scrambling to square established goals with the administration’s backward steps.

Even with the temporary crutch, coal has no long-term future, as it continues to be displaced by cheaper and cleaner forms of energy. Instead of propping up a dying industry, our collective interests would be better served by ensuring that the miners and coal plant workers—whose livelihoods will be threatened by this transition—have new opportunities to find good jobs with family-supporting wages. We can and should look to our nation’s utilities and their clean energy targets for a clearer vision forward.

Hurricane Florence: One Week Later Here’s What We Know and Here’s What’s Next

Homes and businesses are surrounded by water flowing out of the Cape Fear River in the eastern part of North Carolina Sept. 17, 2018, in the aftermath of Hurricane Florence. (U.S. Army Photo by Staff Sgt. Mary Junell)

On the morning of September 14, Hurricane Florence made landfall near Wrightsville Beach, North Carolina, bringing with it record storm surge and torrential, historic amounts of rain. A week later, communities across the Carolinas are struggling with the aftermath. At least 42 people have lost their lives thus far. Heavy, lingering rainfall caused rivers to rise for days after the storm, leading to catastrophic flooding, including in inland areas. Here’s what we know so far and what we can expect in the weeks and months to come.

Six things we know about Hurricane Florence’s impact so far:

1) Hurricane Florence is still a dangerous, unfolding disaster. It’s important that people in the Carolinas continue to pay close attention to warnings from state emergency management agencies and other local authorities, and not return to flooded areas until they are deemed safe. Rivers are still rising, and new areas can still flood. The National Weather Service Newport/Morehead City warned that the Neuse River will likely not crest until this evening at the earliest. Water contamination is a real health risk, so please follow the CDC’s advice and heed local water advisories if you live in one of the affected areas.

Source: NWS WPC

2) The advance projections for Hurricane Florence proved remarkably accurate and were very helpful in informing emergency preparedness and evacuation efforts. The National Hurricane Center predicted for days that landfall would likely be along the North Carolina coast, and likely along the southern part of the coastline. The forecast also made clear that there would be record storm surge, and this was borne out, with tide gauges in Beaufort and Wilmington recording their highest-ever levels.

Most importantly, it was clear that one of the biggest dangers from Florence was that it would bring heavy rainfall and stall over the area for days, significantly raising flooding risks, especially inland—a sad reality that many communities are now living through. Record rainfall was experienced in a number of places, including Elizabethtown, NC, which saw 35.93 inches of rain, and Marion, SC, which saw 34 inches. Rivers are just cresting in many places, including Kinston, Lumberton, and Fayetteville. Several rivers, including the Cape Fear, Pee Dee, and Trent, broke high-water records. (Watch this dramatic and sobering visualization from the USGS of the flooding as Florence moved through North Carolina.)

Soldiers of 252nd Armored Regiment and 230th Brigade Support Battalion shuttle people, children, and their pets across high water on Highway 17 between Wilmington and Bolivia N.C. on September 18, 2018. (Photo by Sgt. Odaliska Almonte, North Carolina National Guard Public Affairs)

3) Despite clear warnings about the risks of inland flooding, many people were still caught unawares or did not have the resources to evacuate, and in some cases that contributed to loss of life. Too many people thought the worst of the storm was over by the day after it hit—when that turned out to be just the beginning of the flooding in many inland areas. Drivers ventured onto roads that looked dry but were quickly overwhelmed by flash flooding. Evacuation is costly, and not everyone can afford it. Unfortunately, the data also show that far too few homeowners in the path of Florence—especially in inland areas—were carrying flood insurance, which could leave them struggling financially as they try to rebuild their lives.

4) Hurricane Florence is a very costly storm, likely to rank among the ten most costly in the United States. Early estimates of Florence’s property damage from Moody’s are in the $17 billion to $22 billion range, although that could rise. AIR Worldwide’s initial estimate of insured losses from wind and storm surge alone, without accounting for the heavy precipitation, ranges from $1.6 billion to $4.7 billion. These costs do not include damage to infrastructure, including major highways (see this stunning drone footage of I-40 turned into a river, for example) or dams, or payouts from the National Flood Insurance Program.

In Robeson County, sections of I-95 at the Lumber River remain under water in the wake of Hurricane Florence. Credit: NC DOT.

5) Early fears about the risks of coal ash ponds leaking toxic waste and hog lagoons flooding and contaminating waterways have become realities. Initial reports show that some of Duke Energy’s coal ash ponds near the H.F. Lee coal plant have been breached, potentially contaminating the Neuse River near Goldsboro. A coal ash pond at the Sutton power plant near Wilmington has also flooded, potentially contaminating Sutton Lake, a public lake. Meanwhile, data from the NC Department of Environmental Quality show that over a hundred hog lagoons are either already discharging waste into waterways or are in danger of doing so. The Waterkeeper Alliance is tracking these hog lagoon and coal ash spills.

6) Low-income communities, communities of color, and rural communities have been particularly hard-hit by the flooding from Florence. News reports detail Florence’s impacts on public housing; on livelihoods of hourly wage workers; and on the rural poor. Many communities that were affected by flooding from Hurricane Matthew two years ago have found themselves again in the midst of devastating flooding. In comments earlier this week, Governor Roy Cooper of North Carolina said, “One thing that this storm puts a spotlight on is the issue of affordable housing, which is there even without a storm…” He went on to say “We’re going to approach this rebuilding effort with an emphasis on affordable housing.” Families around the state will be counting on this promise to be fulfilled.

Looking ahead to recovery and rebuilding

Ahead of Hurricane Florence’s landfall, President Trump issued disaster declarations for North Carolina and South Carolina and an emergency declaration for Virginia. This has authorized federal disaster assistance, including coordination from FEMA, to help supplement state, local and tribal emergency response and recovery efforts.

FEMA teams are on the ground in the affected states, and a list of resources is available here. The National Guard, the US Coast Guard, the US Army, FEMA teams, NC emergency management personnel, nonprofits, and thousands of volunteers are working to help safely evacuate and shelter people. Individual assistance programs for families and public assistance for state, local, and tribal entities are available.

A Coast Guard Air Station Clearwater MH-60 Jayhawk aircrew searches for survivors of Hurricane Florence in Elizabeth City, North Carolina, Sept. 18, 2018. Credit: U. S. Coast Guard photograph by Auxiliarist Trey Clifton.

In the weeks to come, Congress will also need to step in with supplemental disaster aid, as has happened with previous major disasters. It will be critical for this aid to flow quickly to the communities that are hardest hit and have the fewest resources to cope. A major concern is that the public housing stock in many locations has taken a hard hit. Affordable housing is already scarce, and this hurricane will make it even more difficult for families looking for a safe place to live. Unfortunately, in past storms a lack of affordable housing has forced some residents to leave their communities and move far away, or to incur hardships because they had to move to places farther from jobs and schools. The Department of Housing and Urban Development’s (HUD’s) Community Development Block Grant-Disaster Recovery (CDBG-DR) funds are critical to rebuilding resilient, affordable housing where people need it most. Congress must allocate adequate funds to this program.

FEMA funding is also vital for communities’ rebuilding efforts. It’s important to ensure that rebuilding is done in a resilient way that will help protect homeowners and communities from future storms. This is also a good time to make funding for voluntary home buyout programs available so homeowners who live in areas at high risk of flooding can choose that option and move to safer ground. Recovery will take a long time and we can’t lose sight of that reality even after the storm drops out of the headlines. Yesterday marked one year since Hurricane Maria hit Puerto Rico—and it’s clear there is still so much to do to help communities get back on their feet. Some families in North Carolina were still waiting for federal assistance in recovering from Hurricane Matthew when Florence hit.

A resilient future must take account of climate change and equity considerations

It’s unmistakable that climate change is contributing to the risk of more intense hurricanes and worsening flooding. Higher sea levels and increased heavy rainfall exacerbate the risks of catastrophic flooding. The human and economic toll of these extreme events is high. And even in the absence of storms, sea level rise is worsening tidal flooding and is a grave risk to coastal communities.

Meanwhile, as we’ve seen repeatedly with recent hurricanes—Katrina, Harvey, Maria, Irma and now Florence—low income communities and communities of color bear the brunt of the harmful impacts in the wake of disasters.

As communities recover and rebuild from these terrible disasters, we must keep these facts front of mind and ensure that we’re building for a more climate-resilient future for all.

If you would like to support local recovery efforts for Hurricane Florence, please consider these resources assembled by frontline communities on the ground: A Just Florence Recovery. Thank you.


Sea Level Rise: New Interactive Map Shows What’s at Stake in Coastal Congressional Districts

A new interactive map tool from the Union of Concerned Scientists lets you explore the risk sea level rise poses to homes in your congressional district and provides district-specific fact sheets about those risks. Explore the interactive map.

No matter where you live along the coast, chances are that rising seas will begin to reshape your community to one degree or another in the coming decades. Communities that want to be prepared for the changes to come will need representatives in Congress who will advocate for the research, funding, and policies we need to address sea level rise and coastal flooding head-on. As we head into the midterm elections this fall, this tool provides a resource for both visualizing your community’s future as sea level rises and engaging with congressional candidates around the issue of climate change.

In this post you’ll learn how to explore this tool, how to get facts about sea level rise specifically for your congressional district, and how to take action within your community in light of the upcoming elections.

Explore how homes in your congressional district will be affected by sea level rise

The mapping tool is fairly simple. Clicking on any coastal congressional district in the contiguous United States will bring up information on the number of homes at risk of chronic inundation–or flooding, on average, every other week–as sea level rises. You’ll also get information about how much those at-risk homes are collectively worth, an estimate of the number of people living in those homes, and their current contribution to the property tax base.
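In code terms, the chronic-inundation threshold the map uses amounts to a simple cutoff. Here is a minimal sketch with invented addresses and flood counts; the actual UCS analysis derives flood frequencies from sea level rise projections and property data, not a lookup table like this.

    # The threshold described above: a home counts as "chronically inundated"
    # if it floods, on average, at least every other week (26+ times a year).
    # Addresses and expected flood counts are invented for illustration.
    CHRONIC_FLOODS_PER_YEAR = 26

    expected_floods = {"123 Shore Rd": 31, "9 Bay St": 12, "77 Marsh Ln": 26}
    chronic = [addr for addr, n in expected_floods.items()
               if n >= CHRONIC_FLOODS_PER_YEAR]
    print(chronic)  # ['123 Shore Rd', '77 Marsh Ln']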

Each district also has an accompanying district-specific fact sheet with statistics and information about chronic inundation risks. You can explore both near-term and long-term projections for a scenario with relatively rapid sea level rise and one with more moderate sea level rise.

While some districts have more homes at risk than others, every coastal district faces some degree of risk. New Jersey’s 2nd District–encompassing roughly the southern quarter of the state, including Atlantic City, Ocean City, and Cape May–is among the most exposed districts in the country, with more than 45,000 homes at risk of chronic inundation within the next 30 years.

Florida’s 26th District, which covers the southernmost tip of Florida and the Florida Keys, is also highly exposed with more than 12,000 homes at risk of chronic inundation by 2045.

More than 45,000 of the existing homes in New Jersey’s 2nd District are at risk of chronic inundation in the next 30 years.


In a state with some of the highest overall exposure to sea level rise, Florida’s 26th District stands out as being acutely at risk of chronic inundation.

Fact sheets available for every coastal Congressional district in the lower 48

With more than 10,000 homes at risk of chronic inundation in South Carolina’s 1st District, there’s a lot at stake. Our district-specific fact sheets can help you to assess just how much is at risk in your district as sea level rises.

When you click on any district in the map, the accompanying pop-up window includes a “Learn more” link, which brings you to a two-page fact sheet for that district. Included in each fact sheet is information about the number and value of homes at risk over the near term (by 2045) and the long term (by 2100). There are also statistics about the percentage of homes that could avoid chronic inundation if future warming is held below 2 degrees Celsius and loss from land-based ice stays limited.

The second page of the fact sheet highlights the implications of chronic flooding more broadly, and includes recommended policies for local, state, and federal policymakers.

Candidates running for Congress in coastal districts need to know the risks of rising seas

This tool enables people and policymakers along the coast to better understand when and to what extent sea level rise and coastal flooding will impact their communities. But what we do with that understanding is critical, particularly when it comes to ensuring that coastal congressional candidates fully recognize and acknowledge the risk, and have a plan for addressing it.

Here are four ways you can take this information to the candidates in your district—and ask them what they’re going to do about it:

  1. Reach out to candidates on social media. If you’re on Twitter, tweet at the candidates in your district. Include a key fact or two on rising seas in your district, link to the map or fact sheet, and ask them what their plans are to address the issue. Make sure you include candidates’ Twitter handles in your tweet so that candidates or their staff see it—you can find information about candidates on your ballot, including their Twitter handles, here. (Note that Twitter is including a special election label on each candidate’s official account to help you verify the correct Twitter handle to include.)
  2. If you’re on Facebook, follow candidates’ Facebook pages and comment on posts that can be connected to the risks of rising seas (property development, community protection, etc.), or create your own Facebook post highlighting the risks to your congressional district and share it.
  3. Attend a candidate forum or event. Ask candidates about their plans to address sea level rise and climate change. Cite the facts about homes in your district that are at risk of chronic flooding and ask candidates how they will support your community in efforts to build resilience to flooding. Because this problem will not be limited to your community, ask about candidates’ plans to advocate for reductions in global warming emissions at the federal level, knowing that nationwide more than 80 percent of the homes at risk of chronic inundation this century could potentially avoid such a fate if we were to rapidly reduce emissions and limit future warming. You can also email candidates directly: the official websites of most candidates include “Contact Us” information, which typically provides an email form, address, or another way to write directly to the candidate.
  4. Write a letter to the editor for your local paper. Candidates monitor local news sources, so writing a letter to the editor (LTE) can be a great way to let them know about the issues that are important to you and your community, including rising seas. And including specific statistics about homes at risk of chronic inundation can help your LTE pack an extra punch and make it more likely to be published. Papers don’t publish every LTE they receive, but your chances are better if you’re writing in response to an article the paper recently published. You do have to be quick–you don’t want days to go by between the original article and your LTE. But the good news is that LTEs are usually limited to 200 words or fewer, or one to two paragraphs, so the writing usually goes quickly. Check with your paper about its specific requirements for submitting LTEs.

We hope that you find this new tool useful and look forward to hearing how you’re using it!

California Ready to Take Action on Clean Transportation after Climate Summit

With last week’s Global Climate Action Summit in San Francisco all wrapped up, it’s time to get down to the business of turning words into actions.  And next week, California is poised to do just that.  The California Air Resources Board agenda for next Thursday and Friday is chock-full of transformative policies that, if adopted, will accelerate deployment of electric cars and transit buses, increase electric charging and hydrogen refueling infrastructure, bring more low carbon alternatives to diesel and gasoline to the state, and ensure consumers in California and the 12 other states that follow California’s standards continue to have cleaner, more efficient vehicle choices.

Transportation emissions – the pollution from cars, trucks, buses, planes, ships and trains – are proving to be stubborn. They have been increasing and becoming a larger portion of economy-wide emissions, and they are now over 40 percent of California’s climate pollution. They are stubborn in part because vehicles stay on the road for a long time. So even though standards that bring more efficient gasoline vehicles and EVs to market are very effective, they apply only to new vehicles. And new cars aren’t purchased like cell phones: a car can last 15 years or more, which means replacing all the cars on the road today with new ones takes time. Looking beyond passenger vehicles is also essential. About 70% of transportation emissions in California are from passenger cars and trucks (combined with transportation’s 40-plus percent share of the statewide total, that means passenger vehicles alone account for roughly 28 percent of California’s climate emissions). The rest come from other types of vehicles and the fuels they burn.

California transportation emissions are more than 40% of the state’s total and are on the rise

Source: California’s Emissions Trends Report 2000-2016

There is no silver-bullet policy for transportation, so a combination of coordinated and complementary policies is our best bet. The items before the California Air Resources Board this month demonstrate this multi-pronged approach in action. Here are three of them:

  1. Extension of the Low Carbon Fuel Standard to 2030

The Low Carbon Fuel Standard requires gasoline and diesel fuel providers to reduce the carbon content of the fuel they sell in California. The current standard requires reducing carbon intensity by 10 percent by 2020. The board is set to vote on September 27 to strengthen the standard to require a 20 percent reduction in carbon intensity by 2030. What’s the big deal? This policy isn’t just about blending lower-carbon biofuels like ethanol or renewable diesel into petroleum-based fuels. It’s also about expanding cleaner fuel choices like electricity and hydrogen that are needed to power zero-emission vehicles.
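Mechanically, the standard works as a credit market keyed to carbon intensity, measured in grams of CO2-equivalent per megajoule of fuel energy: fuels below the annual benchmark earn credits, and fuels above it incur deficits. The sketch below is a simplified illustration of that bookkeeping; it omits regulatory details such as energy-economy ratios for electric drive, and the numbers are illustrative, not CARB’s official values.

    # Simplified low-carbon-fuel-standard bookkeeping: fuels below the
    # benchmark carbon intensity (gCO2e/MJ) earn credits; above it, deficits.
    def credits_tonnes(ci_benchmark, ci_fuel, energy_mj):
        return (ci_benchmark - ci_fuel) * energy_mj / 1e6  # grams -> metric tons

    # Illustrative: low-carbon electricity for EV charging vs. a gasoline-like
    # benchmark of ~93 gCO2e/MJ, for one terajoule of delivered fuel energy.
    print(credits_tonnes(ci_benchmark=93.0, ci_fuel=30.0, energy_mj=1e9))  # 63000.0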

The board isn’t just raising the bar on this policy; it is also considering some important changes designed to accelerate deployment of electric vehicle solutions, including:

Establish a statewide rebate program for electric vehicles funded by the clean fuel credits earned through vehicle charging. This comes at a critical time when some companies like Tesla and GM are starting to hit the cap on the federal EV tax credit.

Support electric vehicle charging and hydrogen fueling station deployment by providing financial incentives to station developers. This will accelerate investments and help get California on the path to reach Gov. Brown’s goal of 250,000 vehicle chargers and 200 hydrogen stations by 2025.

My colleague Jeremy Martin explains all of this in his recent blog post about how the Low Carbon Fuel Standard is clearing the roadblocks to electric vehicles. But the bottom line is that the Low Carbon Fuel Standard ensures that the fuels powering our transportation system become cleaner over time and, in the process, provides direct incentives for the clean vehicles and fueling infrastructure we need to make it happen.

  2. Requiring electric transit buses

Ever ridden a battery electric transit bus? If you’ve ridden a bus in China, the answer is likely “yes”: China has deployed more than 400,000 electric buses over the last few years. Modern battery electric and fuel cell buses are starting to gain traction in the US, and several transit agencies are making moves to deploy the technology. The Innovative Clean Transit regulation being heard by CARB on September 28 is aimed at accelerating that transition and making every bus in California either hydrogen or electricity powered by 2040. That seems like a long way off, but it means transit agencies need to start buying electric buses now; before 2030, 100% of their new bus purchases will need to be zero-tailpipe-emission buses. This regulation will ensure that transit agencies in California move forward together and that transit riders around the state get the benefits of a quieter, cleaner ride. The communities where these buses operate get the benefit of zero tailpipe emissions. It will also help advance electric drive in the heavy-duty vehicle sector, paving the way for more electric trucks.

My colleague Jimmy O’Dea covers the finer details in his recent blog post and UCS’s recent Got Science Podcast on electric buses.

  3. Defending California clean car standards from Trump administration attacks

California has its own vehicle standards for cars and trucks, which 12 other states and the District of Columbia follow. California has had vehicle emission standards for decades, bringing huge benefits to the state as well as to the other states that follow the same rules. The rest of the country has also benefited as clean car technology driven by California’s leadership has spread nationwide (the catalytic converter comes to mind). The federal clean car standards are currently very similar to California’s and, as a result, California has accepted automaker compliance with the federal standards as compliance with its own.

The board is proposing a change to California’s vehicle standards to further clarify that California will accept compliance only with the federal standards as they are currently written. This is not a change in policy. California never signed up to throw its authority to regulate vehicle emissions out the window by accepting compliance with federal standards, whatever they may be. And now that the Trump administration has made clear its intention to freeze the standards at 2021 levels, California is simply clarifying that its standards will indeed be enforced.

Ideally, federal and California standards would remain aligned and continue to push forward on making new cars and trucks cleaner, more efficient, and more affordable to drive. But barring an unforeseen change in the Trump administration’s anti-science agenda, that seems unlikely. This clarification of the regulatory language makes it crystal clear that California intends to exercise its right to protect its residents from car and truck pollution, as it always has.

The way forward

As with any change, there is resistance. Oil companies have long attacked the low carbon fuel standard, and automakers have resisted vehicle standards for decades. Many transit agencies are cautious about making the shift to electric buses. But make no mistake: these changes are feasible, and they are necessary if we are to succeed in preventing the worst consequences of climate change. The proposals before the Air Resources Board are based on extensive analysis, have been thoughtfully developed and deliberated, and should be advanced.

There are over 25 million cars on the road in California, the vast majority of which run on gasoline or diesel. Transitioning to a clean, modern, low-emissions transportation system isn’t going to be easy; there’s no “one and done” strategy. Each of the items before the board next week is substantial on its own, and taken together they are a big step forward in reshaping California’s transportation system to deliver the clean air and stable climate California needs, while setting an example the rest of the country and the world can benefit from and follow.


Here’s What Agriculture of the Future Looks Like: The Multiple Benefits of Regenerative Agriculture Quantified

Crops and livestock integrated in a regenerative agricultural system. Photo: Farmland LP

At the Union of Concerned Scientists, we have long advocated agricultural systems that are productive and better for the environment, the economy, farmers, farmworkers and eaters than the dominant industrial system. We refer to such a system as our Healthy Farm vision. Based on comprehensive science, we have specified that healthy farm systems must be multifunctional, biodiverse, interconnected and regenerative.

The scientific case for agricultural systems that renew rather than diminish resources is comprehensive, and research demonstrates the productivity and agronomic feasibility of such systems. Yet, economically viable real-world examples are necessary to spur acceptance and adoption of such schemes. Further, we need to overcome the limitations of economic thinking and measures that were developed in the 19th century—when it seemed that the Earth’s resources and its capacity to absorb waste were inexhaustible—and improve them to create more modern assessments, appropriate for the 21st century and beyond. A new report from our colleagues at Farmland LP, Delta Institute and Earth Economics will make a major contribution toward this end.

Healthy Farmland Vision – Click the graphic for an interactive web feature.

Economists view agriculture as a primary sector of the economy, meaning that without the activity of that sector, the remainder of the economy (such as manufacturing and service) could not be developed. Together with other primary economic enterprises such as mining and forestry, agriculture has generally been practiced and acknowledged as an extractive industry. Whereas mining is visibly extractive, agriculture is less so, because degradative processes such as soil erosion, fertility loss, and water and air pollution are not as obvious as mountaintop removal and strip mining. Yet, as practiced industrially, agriculture is both extractive and more extensive than mining.


Source: Our World in Data.

Extractive agricultural practices are abetted by strategies such as importing nutrients to compensate for loss of native soil fertility, and by the fact that we value the gains from the extraction but don’t discount the losses. For example, we measure crop and animal yield and translate that to sales and profit, but don’t subtract from the ledger the soil, nutrients, and air and water quality lost to produce crops and livestock. One superficial reason for this is that we don’t know the “cost” of those resources, but that is simply a polite way to say that historically we haven’t valued them. This is a perfect example of the maxim that we measure what we care about and care about what we measure.

Yet agriculture need not be inherently extractive. Through practices that build soil, recycle nutrients, and store water, it can become a regenerative system while still providing abundant food and other agricultural products. A key to shifting from an extractive to a regenerative mode is building a more complete picture of the total benefits and costs associated with agricultural management. For nearly a decade, the investment firm Farmland LP has been managing thousands of acres with regenerative techniques, thereby providing an opportunity for scientists and economists to assess the value of these practices to soil, water, climate, energy and social sectors. The Delta Institute and Earth Economics, with grant support from the Department of Agriculture’s Natural Resources Conservation Service, worked with Farmland LP on just such a project.

Based on a comprehensive review of scientific literature examining the value of various ecosystem services, the researchers applied the rigorous methodologies of Ecosystem Services Valuation and Greenhouse Gas Accounting to assess the effects of farm management on items such as soil formation and quality, water capture and quality, pollination and seed dispersal, climate stability, disaster risk reduction, air quality, and biological control. Using Colorado State University’s COMET-Farm model and the USDA’s Revised Universal Soil Loss Equation, the researchers evaluated the effect of regenerative techniques on farmed and non-farmed land under Farmland LP’s management. They compared these model outputs with those from conventionally managed land to construct a comprehensive impact balance sheet.
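For readers unfamiliar with it, the Revised Universal Soil Loss Equation in its standard form estimates long-term average annual soil loss as a product of site factors:

    A = R \cdot K \cdot LS \cdot C \cdot P

where A is predicted soil loss per unit area per year, R is the rainfall-runoff erosivity factor, K is soil erodibility, LS combines slope length and steepness, C is the cover-management factor, and P is the support-practice factor. Regenerative practices such as cover crops, stream buffers, and grassed waterways reduce erosion largely by lowering C and P.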

The sums cited in this report are astounding, running into the millions of dollars of added ecological value from regenerative practices—against millions of dollars of ecological losses due to standard industrial practices. The practices Farmland LP implements are well known, backed by science and practice, and accessible to all farmers and farm managers with an interest in managing whole systems to increase returns to management. Examples include integrated crop and livestock production, crop rotation, biodiverse annual and perennial mixes, stream buffers, grassed waterways, organic fertilizers, biological pest control, and uncultivated land that provides ecological services (erosion control, water capture, and habitat and refugia for beneficial organisms). The combination of these regenerative methods generated net value while industrial methods destroyed value—all while performing comparably on the dominant indicator, agricultural yield.

Ecological Service Value of farmed and non-farmed areas by impact metric – Delta Institute (see report for methods, context, and further data).

This assessment affirms the concrete value and effectiveness of multifunctional regenerative approaches. Since many of these ecosystem services are not currently quantified—much less traded—on markets that would remunerate farmers, the benefits are primarily experienced by way of a cleaner environment, lower costs of production, and the added value of agricultural land. This is because land managed with regenerative practices will produce bountifully, at lower cost, for an indeterminate period of time, whereas the value of industrially managed land depends on false and brittle economies, such as access to government subsidies and the availability of cheap industrial fertilizer.

In fact, the main business of Farmland LP, a real estate investment trust, is to add long-term value to agricultural land for landowners and investors. A remarkable aspect of this strategy and business model, in addition to more faithfully reflecting actual ecological economics, is how quickly Farmland LP management has been able to produce results. In addition to demonstrating the effectiveness of regenerative methods, these findings indicate the kinds of practices that should be more broadly adopted across all of agriculture to assure our livelihood at present and far into the future.

The skilled agronomists and farm managers at Farmland LP, together with the rigorous scientists and economists who developed and applied the ecosystem valuation techniques, are demonstrating that regenerative agriculture is not an aspirational figment. It is real, it is possible, it is productive, it is profitable, and it is environmentally beneficial. These qualities can coexist, and a successful business model can be predicated on them. As long as reliable scientific information influences decisions and behavior, this report provides a beacon toward more viable, ethical, and realistic agricultural practice for the long term.

