Combined UCS Blogs

Xcel Energy’s Plan to Eliminate Coal and Boost Solar in Minnesota

UCS Blog - The Equation (text only) -

Photo: Zbynek Burival/Unsplash

Today, Xcel Energy released a preliminary plan to phase out its remaining coal-fired power plants in Minnesota and replace them primarily with wind, solar, and energy efficiency—moving the company forward toward its goal of 100% carbon-free electricity by 2050.

Part of the plan involves a consensus proposal joined by the Union of Concerned Scientists, other clean energy organizations, and the Laborers International Union of North America.

Below are some of the noteworthy items included in the consensus proposal and Xcel’s plan—and how they relate to Minnesota’s clean energy future.

1. Coal plant retirements

Xcel Energy will propose a 2028 retirement date for the Allen S. King coal-fired power plant and a 2030 or earlier retirement date for the Sherco 3 coal unit. Significantly, these plants are the company’s last coal-burning power generators in Minnesota for which Xcel has not yet announced retirement dates.

In the meantime, Xcel will also reduce coal usage at the Sherco 2 unit by committing to seasonal operation of the plant, a concept that my colleague Joe Daniel has written about here. Seasonal operation of coal plants has helped other utilities' customers save money and promotes the grid flexibility Xcel will need to integrate more renewable energy in the future.

Reducing and phasing out coal burning delivers major reductions in carbon pollution, as well as public health benefits from lower soot and smog emissions (see our Soot to Solar report from last fall on Illinois coal plants).

2. Massive growth in solar power

Minnesota currently has close to 1,100 megawatts of installed solar power statewide. As part of the agreement with clean energy organizations, Xcel will propose adding 3,000 megawatts of solar power to its system by 2030. This is enough to provide 20 percent of Xcel’s energy, powering the equivalent of more than 417,000 homes and further reducing carbon pollution from the electric sector. It comes on top of the 1,850 megawatts of wind power Xcel plans to add by 2022.
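As a rough sanity check on the equivalences above, here is a back-of-the-envelope sketch; the capacity factor and per-home consumption are illustrative assumptions on my part, not figures from Xcel's proposal:

```python
# Back-of-the-envelope check of the solar equivalences above.
# Assumed values (not from the proposal): roughly 18% capacity factor
# for Minnesota solar, roughly 11.3 MWh/year of electricity per home.
CAPACITY_MW = 3000
CAPACITY_FACTOR = 0.18          # assumption
HOURS_PER_YEAR = 8760
MWH_PER_HOME_PER_YEAR = 11.3    # assumption

annual_generation_mwh = CAPACITY_MW * HOURS_PER_YEAR * CAPACITY_FACTOR
homes_powered = annual_generation_mwh / MWH_PER_HOME_PER_YEAR

print(f"Annual generation: {annual_generation_mwh / 1e6:.2f} million MWh")
print(f"Homes powered (equivalent): {homes_powered:,.0f}")
```

Under these assumptions the 3,000 megawatts would generate about 4.7 million megawatt-hours a year, consistent with the "more than 417,000 homes" figure.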

3. Big commitment to energy efficiency

According to the Center for Energy and Environment, in 2018 Xcel achieved a record amount of energy efficiency: more than 680 gigawatt-hours of electricity savings, or about 2.35% of sales (well exceeding the state’s 1.5% energy savings target). In the consensus proposal, Xcel commits to plan for electricity savings above that 2018 level in every year of the coming decade (2020-2029).
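Those two figures imply the approximate size of Xcel's retail sales base; here is a quick check of the arithmetic, assuming the savings and percentage figures above are exact:

```python
# Inferring Xcel's approximate retail sales from the figures above.
savings_gwh = 680          # 2018 electricity savings (from the text)
savings_fraction = 0.0235  # 2.35% of sales (from the text)
state_target_fraction = 0.015  # Minnesota's 1.5% savings target

implied_sales_gwh = savings_gwh / savings_fraction
state_target_gwh = implied_sales_gwh * state_target_fraction

print(f"Implied retail sales: ~{implied_sales_gwh:,.0f} GWh")
print(f"1.5% state target: ~{state_target_gwh:,.0f} GWh of savings")
```

That works out to retail sales of roughly 29,000 gigawatt-hours, so the 680 gigawatt-hours saved does indeed comfortably exceed the roughly 430 gigawatt-hours the 1.5% target would require.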

This ambitious goal is based on the Minnesota Department of Commerce’s statewide energy efficiency potential study and would allow Minnesota to potentially join other states that are achieving 2-3% per year in efficiency savings.

Investing in energy efficiency helps utilities avoid more expensive investments and helps reduce customers’ energy bills, which promotes energy affordability and reduces energy burdens.

4. Support from labor

The Laborers District Council of Minnesota and North Dakota (“LIUNA Minnesota”) joined the consensus proposal alongside UCS and other clean energy organizations. As part of the agreement, Xcel will commit to a request-for-proposals process for solar projects that maximizes local job creation and participation in apprenticeship programs.

5. The role of gas and nuclear

Gas-fired power plants are not clean resources and investing in them is a risky proposition for electric utilities. The Union of Concerned Scientists is part of a group challenging approval of Minnesota Power’s plan to build a new gas plant in Superior, Wisconsin.

However, we and other signatories to the consensus proposal are supporting Xcel’s acquisition of the Mankato Energy Center, an existing gas plant currently owned by Southern Company. Why?

Xcel already buys power from the Mankato plant, and the acquisition is being pursued in combination with the above aspects of an overall plan to decarbonize the company’s generation portfolio. Our analysis is that acquisition of the Mankato plant will not have significant impacts on greenhouse gas emissions and will help Xcel phase out its Minnesota coal plants by:

  • Reducing system costs associated with early coal retirements and incentivizing the decarbonization of sectors outside the electricity sector;
  • Displacing the need for large additions of gas combustion turbine generation in the 2030s and 2040s; and
  • Putting a large carbon emitter (the Mankato plant) under the oversight of the Minnesota Public Utilities Commission, an important step in ensuring beneficial resource planning for a carbon-free future.

Utilities are unfortunately rushing to build new gas infrastructure even though enough gas capacity is already online to meet demand. Still, Xcel Energy is not backing off its commitment to be carbon-free by 2050, and if the acquisition is approved and Xcel owns the plant, the Mankato plant's emissions will fall under that cap.

With respect to nuclear, while it is not part of our consensus proposal, Xcel’s preliminary plan also includes an expectation of relicensing its Monticello nuclear plant and operating it at least until 2040. (To date, no nuclear reactor in the United States has received approval from the Nuclear Regulatory Commission to extend its operating license beyond 60 years, though three applications are currently pending.) Stakeholders and regulators will need to examine closely whether this is the most cost-effective path toward a 100% carbon-free electricity future and whether the plant can continue to operate safely beyond 60 years.

What are the next steps?

The consensus proposal will be reviewed by the Minnesota Department of Commerce and other stakeholders in a proceeding currently pending before the Minnesota Public Utilities Commission.

Stakeholders also can weigh in on Xcel’s preliminary plan before the company’s integrated resource plan filing slated for July 1, 2019.

Finally, the measures outlined by Xcel Energy show that a low-carbon electricity system is achievable in Minnesota and should further support the legislature’s consideration of clean energy measures that I blogged about earlier this month, including establishing a goal for 100% carbon-free electricity by 2050.


California’s Infrastructure Earns a C-. We Need More Equitable and Climate-Safe Infrastructure Now


Last week, California’s infrastructure got its report card. Engineers from Region 9 of the American Society of Civil Engineers (ASCE) evaluated the state of our roads, dams, electric grid, schools, and other critical infrastructure, as they do every six years. This time around, the Golden State earned a grade of ‘C-,’ or “mediocre and requires attention.”

As the mother of an infant, this assessment struck a special chord with me. I count on the quality and reliability of our roads, water and wastewater systems, and electric grid to help me keep my daughter safe from harm and provide an environment where she can thrive. Many other parents do, too.

These expectations seem reasonable. They will, however, become even harder to meet in the face of continued underinvestment and disinvestment in communities and more frequent and severe climate-related extreme events here in California and beyond. These issues must be key considerations in infrastructure decisions and solutions moving forward.

Making the grade

California’s ‘C-’ grade is better than the nation’s grade of ‘D+’ but worse than the state’s previous 2012 evaluation of ‘C’ – despite billions of dollars in investments since then at the state and local levels. What’s more interesting and sobering are the individual sector analyses. Our roads, energy systems, stormwater systems, and levees all received poor grades (‘D,’ ‘D-,’ ‘D+,’ and ‘D,’ respectively). Among the many stated reasons:

  • Electricity outages affected nearly 4 million Californians per year on average between 2008 and 2013, and roughly 3 million people in 2017. One study found that California had the most reported outages of any state in 2015, 2016, and 2017.
  • A significant portion of our stormwater drainage infrastructure pre-dates the 1940s and requires repair or replacement for continued use and protection of communities.

ASCE estimates it would take investments of hundreds of billions of dollars over the next couple of decades to upgrade these and other critical sectors to a ‘good’ condition, or ‘B.’ Recent state and local bonds and voter-approved propositions, including SB 1 for transportation and Prop 1 and Prop 68 for water, provide a sizable down payment toward this goal.

Moving beyond averages and historical trends

California is a massive state and home to nearly 40 million people. While a single grade can serve as a helpful benchmark, it also masks the varying quality of infrastructure throughout the state, which contributes to disparities in health, economic opportunities, and quality of life. As a result of decades of underinvestment and disinvestment in low-income communities and communities of color, families are left relying on infrastructure (or the lack thereof) that fails to meet even basic needs. One example is the low-income unincorporated communities in the San Joaquin Valley that lack access to clean, safe, and affordable drinking water due to pollution, groundwater depletion, and insufficient wastewater treatment and disposal systems. Examples exist for transportation, schools, and other sectors as well. Infrastructure solutions should address these inequities and prioritize investments in the communities that need them most.

In addition, we need our infrastructure to function during extreme weather events, from floods to droughts and wildfires to heat waves. The best available science reminds us that no sector or region will be left untouched as these events become more severe and frequent due to climate change. Efforts to improve and expand our critical infrastructure must plan for this new reality, rather than continuing to assume that the past is a good predictor of the future. The good news is that the work of the AB 2800 Climate-Safe Infrastructure Working Group provides a useful framework for the necessary fundamental shifts in design, planning, investments, operations and maintenance.

Remembering why

Budget conversations are continuing in Sacramento. Discussions on a federal infrastructure package are moving forward with the introduction of the LIFT America Act, thanks to the leadership of House Energy and Commerce Committee Chairman Frank Pallone, Jr. Throughout this process, it’s important for policymakers to keep in mind the people the state’s and nation’s infrastructure is meant to serve. UCS will be watching closely to see what and who they prioritize.

We look forward to working with Governor Newsom’s administration and the US House of Representatives and US Senate on equitable, clean, and climate-safe solutions. Our state and nation must invest in such infrastructure as if our safety, quality of life, and livelihoods depend on it – because they do.

Hot Arctic and a Chill in the Northeast: What’s Behind the Gloomy Spring Weather?


When temperatures hit the 80s Fahrenheit in May above latitude 40, sun-seekers hit the parks, lakes, and beaches, and thoughts turn to summer. By contrast, when temperatures lurk in the drizzly 40s and 50s well into flower season, northerners get impatient for summer. But when those 80-degree temperatures visit latitude 64 in Russia, as they just did, and when sleet disrupts Mother’s Day weekend in May in Massachusetts, as it just did, thoughts turn to: what is going on here?

Hot arctic

Before we jump into the science, let’s take a quick look at the unusual spring weather. This past weekend, Russia was the scene of record-high temperatures. A city above the Arctic circle—Arkhangelsk—recorded a high of 84 degrees Fahrenheit on May 11 at the Talagi Airport weather station. The average high temperature for Arkhangelsk this time of year is around 54 degrees Fahrenheit.

Gloomy weather

Meanwhile in the Northeast US, try having a conversation that doesn’t loop back to the endlessly gloomy, chilly, unseasonable weather. When gloomy weather becomes such a dominant topic of conversation in a region, a form of citizen science is occurring, and it tells you something: it is unusual, it is anomalous, it is downright wacky.

Many locations are seeing far less sun than is normal for this time of year, and the data confirm what memory suggests. The Long Island town of Islip, New York, recorded its longest streak of rainy days on record from April 20 to May 7. In Boston, it rained on 21 days this April.

It’s not just the Northeast: repeated rain events left much of the contiguous US ranked in the 99th percentile for soil moisture on May 14, including many of the Plains states (South Dakota, Nebraska, Kansas, Oklahoma, and Texas) and most states eastward. This continues a pattern of high soil moisture ranking percentiles (see Jan-April 2019 in Figure 1). Soil moisture ranking percentiles are relative to the 1948-2000 climatology.

As of this writing, headlines with exasperated tones wonder when winter will truly depart. In one such article, Jason Samenow describes the abnormal late May forecast for snow, hail, tornadoes, flooding, and excessive heat across different parts of the contiguous US over the coming days.


Figure 1. Continental US Monthly Soil Moisture ranking percentile for Jan-April 2019. Repeated rain events resulted in a large portion of the contiguous US being ranked in the 99th percentile for soil moisture on May 14. Source: CPC NCEP NOAA


Unfortunately, the consequences of these gloomy, chilly, and rainy or snowy conditions are very real in terms of damages, both personal and in the larger economy. People are taking time away from work—lost labor hours—to deal with them. People are pumping water out of basements and throwing away cherished items lost to water damage.

Some of the flooding is from intense storms like the two rare interior US bomb cyclones that caused flooding and prompted governors to spring into action, calling on the National Guard. There is a current backlog of unmet disaster relief requests. Some of the flooding is from water tables rising since relentless repeated rain events don’t allow the soil enough time to dry out.

The natural and human-driven aspects of flooding are critical to tease apart so we can better prepare our communities for the flood risk of today and the changing flood risks of the decades ahead. This is especially important when investing dollars in infrastructure that are anywhere near surface water or groundwater (also known as the water table).

No words. It’s May 14. #snow #isurrender #uncle #youcanleavenow #vt

— Anson Tebbetts (@anson_ag) May 14, 2019

Eurasian October snow cover extent indicator

It may seem counterintuitive, but the story of the strange weather unfolding this spring in the US is related in part to snow last October in Eurasia. This indicator—the Eurasian October snow cover extent indicator—is proving worthy of additional attention by US weather geeks. The good news is that the scientists who were watching Eurasian snow extent in October, along with a host of other indicators, gave advance warning of the emerging US winter and spring weather pattern for 2018/2019. Winter sports enthusiasts rejoiced and sought the snow-capped slopes of Colorado and Utah.

The bad news is that the ride has felt like a seesaw: record-breaking cold and record flooding over these past months, with only temporary periods of relief. But the lasting memory of the major pattern is what becomes the talk of the region: terrific winter snowpack, tragic flooding, and a gloomy Northeast.

You may wonder about the Eurasian snow extent indicator and the broader connections. I encourage those who want to know to spend some time clicking on the links here or in earlier blogs that point to even more information (see here, here, here, and here). These describe how Arctic sea ice decline, particularly in the Barents-Kara seas north of Scandinavia and Russia, contributes to ocean and atmosphere behavior, which in turn contributes to Eurasian snow cover extent, and ultimately to a wavy jet stream with episodic cold outbreaks over winter and spring in the Northern Hemisphere, including the US.

Here is an example of the science, as Judah Cohen explains: “There is a growing consensus that it is Barents-Kara sea ice in the late fall and early winter that has the greatest impact across Eurasia. Therefore, low Barents-Kara sea ice in November for example, favors a strengthened Siberian high, increased poleward heat flux, a weak stratospheric Polar Vortex and finally a negative Arctic Oscillation. An important point regarding the Siberian high is that it strengthens or expands northwest of the climatological center. For low snow cover and/or high sea ice the opposite occurs.” Translation: a weakened polar vortex means more cold outbreaks deep into US territory, like this past winter and spring.

We know that burning coal, oil, and gas, and the resulting global warming, has caused dramatic declines in Arctic summer sea ice extent (the minimum occurs in September). It takes longer to cool the warmer-than-normal Arctic Ocean enough to grow new sea ice or thicken remnant ice in the following October and November. Over each successive decade, we are more likely to experience low Barents-Kara sea ice extent in more years, which is why weather geeks keep monitoring a jargon-heavy set of indicators (sea ice extent, Eurasian snow cover extent, the stratospheric polar vortex, the El Niño Southern Oscillation, the North Atlantic Oscillation, the Arctic Oscillation, and more) to improve US seasonal outlooks.

This is little consolation to those throwing out their flood-soaked cherished items from Kansas to Maine this spring season.


A Stroller Debacle at CPSC Politicizes Child Safety and I Have No Chill


I’m a self-proclaimed transparency nut. But now that I’m a mom, my need for information has grown exponentially. I want a label on baby food that tells me how much added sugar is in it. I want to know whether my daughter’s car seat or mattress contains organohalogen flame retardants. And I certainly want to know whether the stroller I’m using to cross busy DC streets is safe. But apparently that last bit is none of my business and that’s okay with some federal regulators who care more about acquiescing to industry wishes than keeping kids safe.

President Trump’s CPSC turns child safety into a partisan issue

The Washington Post recently reported that despite the evidence and staff scientists’ opinions that the Consumer Product Safety Commission (CPSC) should recall a jogging stroller shown to cause injuries to children (and their parents), the commission worked with the company, Britax, to avoid the measure. From 2012 to 2018, more than 200 injuries were documented through CPSC’s reporting mechanism, leading agency staff scientists to pursue an investigation that lasted nearly a year. The agency’s health sciences division found that children could suffer “potentially life-threatening injuries” from the common issue of front-wheel detachment. CPSC staff ran engineering tests, put together injury reports, and pored over epidemiological data, eventually starting the recall process by issuing a preliminary determination that the front wheel of the stroller was a “substantial product hazard.”

But right as this was happening, the agency was transitioning from a Democratic to a Republican chair and majority on the 5-member commission. President Trump named Ann Marie Buerkle acting chair of the CPSC, and she awaits Senate confirmation for the position. Buerkle was appointed to the CPSC by President Obama in 2013 and has a history of siding with companies peddling unsafe products. According to sources within the agency, she kept information on the ongoing investigation from the Democratic commissioners long enough that key decisions about the potential recall would happen as more Republican commissioners were appointed, including Dana Baiocco and Peter Feldman.

When it came time to vote on the settlement with Britax, the two minority commissioners wrote a dissent that called it “aggressively misleading” to consumers. The company got off the hook by promising to run a public-safety campaign and offer replacement parts to customers. But the cherry on top of this story is that the replacement parts Britax sent to customers to fix the strollers in question were also defective. Not only did the company escape the hassle of a recall, but it has since maintained that there was no defect in the product and accused parents who reported injuries of using the product wrong. I mean, come on! This is exactly why the strollers should have gone through the full recall process to begin with.

Further, the value of a child’s life should not be decided based on political affiliation. Republicans and Democrats alike should be able to band together to hold companies accountable to keep our kids safe, not align with the companies who seem to care more about playing the blame game than engineering safe products.

The value of consumer product regulation

There is nothing in this world that I want protected more than my daughter’s life. That’s why I value the mission of the CPSC and the work that has been done to improve the safety of consumer products since its inception. Last year, UCS wished the Consumer Product Safety Improvement Act a happy 10-year anniversary. That law addressed a long list of issues with product safety and transparency and gave the agency the power it needed to enforce provisions that keep us safe. We’ve come so far in getting rid of lead paint in children’s toys, requiring a set of standards for manufacturers of cribs and other children’s furniture, and making it easier for consumers to share their experiences with the agency directly. It’s a relief to know that there’s a government agency holding companies accountable for the safety of the products they put on the market and that we buy for ourselves and our children. It’s one less thing for parents to worry about.

That’s part of why it’s so infuriating to see how CPSC commissioners with agendas have thwarted the very mission of the agency. Marietta Robinson, a CPSC commissioner from 2013 to 2018, wrote in a letter to the editor responding to the Washington Post report, “The agency was formed more than 45 years ago for the very purpose of protecting consumers from unreasonably dangerous products such as the Britax stroller.” Without an official recall, people who buy these strollers used or from third-party sellers on Craigslist or Facebook Marketplace are rolling the dice. I say this as someone who is currently browsing those sites for used strollers and finding listing upon listing for these strollers without any disclaimer about their safety issues.

With no posting on the CPSC’s website, consumers have to rely on a Washington Post investigation to make a purchasing decision. This is unacceptable. It’s a clear demonstration of the importance of regulators looking out for public health and safety, not the bottom lines of the regulated industry. CPSC commissioners need to listen to their staff recommendations and stop politicizing consumer safety measures, and Senators need to take a long, hard look at Buerkle’s history and this case in particular when her confirmation vote comes up.

Photo: John and Christina/CC BY-NC-SA 2.0 (Flickr)

5 Reasons Why HB 6, Ohio’s Nuclear Plant Subsidy Proposal, Should Be Rejected


Photo: Nuclear Regulatory Commission

Last November, UCS released Nuclear Power Dilemma, which found that more than one-third of existing nuclear plants, representing 22 percent of total US nuclear capacity, are uneconomic or slated to close over the next decade. This included the Davis-Besse and Perry plants in Ohio that are owned by Akron-based FirstEnergy Solutions. Replacing these plants with natural gas would cause emissions to rise at a time when we need to achieve deep cuts in emissions to limit the worst impacts of climate change.

When we released our report, my colleague Jeff Deyette described how a proposal backed by FirstEnergy to subsidize its unprofitable nuclear plants in Ohio was deeply flawed and did not meet the conditions recommended in our report. Ironically, the latest proposal to create a “Clean Air Program” in Ohio (House Bill 6) provides a blatant handout to the nuclear and fossil fuel industries at the expense of renewable energy and energy efficiency, making it bad for consumers, the economy, and the environment.

Here are five reasons why this proposal is flawed and should be rejected:

1. HB 6 doesn’t protect consumers

HB 6 would provide incentives to maintain or build carbon-free or reduced emission resources that meet certain criteria. The state’s Legislative Budget office estimates the new program would cost $306 million per year, collected through a dedicated monthly charge on consumer electricity bills. Monthly costs range from $2.50 for a typical residential customer to $2,500 for large commercial and industrial customers.

HB 6 doesn’t require FirstEnergy Solutions to demonstrate need or limit the amount and duration of the subsidies to protect consumers and avoid windfall profits, as recommended in our report. It simply sets the starting price at $9.25/MWh and increases that value annually for inflation. In 2018, Davis-Besse and Perry generated 18.3 million megawatt-hours of electricity, according to the U.S. Energy Information Administration. This means that FirstEnergy Solutions’ nuclear plants would receive approximately $170 million per year in subsidies, or 55% of the total. As explained below, the rest of the money would likely go to upgrading Ohio’s existing coal and natural gas plants.
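The subsidy estimate follows directly from the figures above; here is a quick check of the arithmetic, using the first-year price before any inflation adjustment:

```python
# Checking the HB 6 subsidy arithmetic from the figures above.
price_per_mwh = 9.25            # HB 6 starting price, $/MWh
generation_mwh = 18.3e6         # 2018 Davis-Besse + Perry output (EIA), MWh
program_cost_per_year = 306e6   # Legislative Budget Office estimate, $/year

nuclear_subsidy = price_per_mwh * generation_mwh
share_of_program = nuclear_subsidy / program_cost_per_year

print(f"Nuclear subsidy: ~${nuclear_subsidy / 1e6:.0f} million/year")
print(f"Share of program cost: {share_of_program:.0%}")
```

That works out to roughly $169 million per year, or about 55% of the $306 million program, matching the figures in the text.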

2. HB 6 is a bait and switch tactic to gut Ohio’s clean energy laws

But here’s the rub. HB 6 would effectively gut the state’s renewable energy and energy efficiency standards to pay for the subsidies for Ohio’s existing nuclear, coal, and natural gas plants. It would make the standards voluntary by exempting customers from the charges that fund these affordable and successful programs unless they choose to opt in. This could result in a net increase in emissions and a net loss of jobs in Ohio over time.

This political hit job is outrageous, but not at all surprising. It is just another attempt in a long series of efforts by clean energy opponents to roll back Ohio’s renewable and efficiency standards over the past five years. When combined with the stringent set-back requirements for wind projects adopted in 2014, these actions have had a chilling effect on renewable energy development and explain why renewables provided a paltry 2.7% of Ohio’s electricity generation in 2018 (see figure below). In contrast, renewables provided 18% of U.S. electricity generation in 2018, and wind power alone provided more than 15% of electricity generation in 11 states.

The sponsors of HB 6 go one step further and make the false claim that their proposal will save consumers money. While the charges appearing on consumer bills might be less, this ignores the much greater energy bill savings consumers have been realizing through investments in energy efficiency. In addition, the cost of wind and solar has fallen by more than 70 percent over the past decade, making them more affordable for consumers and competitive with natural gas power plants in many parts of the country. It also ignores the energy diversity benefits of renewables and efficiency in providing a hedge against natural gas price volatility. Many Ohio legislators continue to put their heads in the sand and refuse to embrace the new reality that renewables and efficiency are cost-effective for consumers.

Energy efficiency programs are especially important for low-income households. By lowering their energy bills, they have more money to spend on food, health care and other necessities. It also reduces the need for assistance in paying heating bills. Unfortunately, legislators like Energy and Natural Resources Committee Chair Nino Vitale are proposing to provide handouts to large corporations at the expense of easing the energy burden for low-income households, which are also disproportionately affected by harmful pollution from coal and natural gas power plants.

3. HB6 creates a false sense of competition

While renewable energy technologies are technically eligible to compete for funding under HB 6, several criteria would effectively exclude them:

  • It excludes any projects that have received tax incentives like the federal production tax credit or investment tax credit, which applies to nearly every renewable energy project.
  • Eligible facilities must be larger than 50 MW, which excludes most solar projects, and wind projects have to be between 5 MW and 50 MW, which is smaller than most existing utility scale wind projects in the state.
  • Eligible projects must receive compensation through organized wholesale energy markets, which excludes smaller customer-owned projects like rooftop solar photovoltaic systems.

When combined with the rollback of the renewable standard, these absurdly stringent criteria would create too much uncertainty for renewable developers to obtain financing to build new projects in Ohio.

4. HB 6 will increase Ohio’s reliance on natural gas

While HB 6 could temporarily prevent the replacement of Ohio’s nuclear plants with natural gas, gutting the renewables and efficiency standards would undermine the state’s pathway to achieving a truly low-carbon future by locking in more gas generation as coal plants retire.  Over the past decade, natural gas generation has grown from 1.6% of Ohio’s electricity generation to more than 34% in 2018 (see figure). A whopping 40,000 MW of new natural gas capacity was added during this time, mostly to replace retiring coal plants. In contrast, the share of nuclear and renewable generation has only slightly increased by 2-3% each.

Figure: Ohio’s Increasing Reliance on Natural Gas for Electricity


While natural gas has lower smokestack emissions than coal, the production and distribution of natural gas releases methane emissions—a much more potent greenhouse gas (GHG) than carbon dioxide. To achieve the deep cuts in emissions that will be needed to limit the worst impacts of climate change, Ohio will need to reduce its reliance on natural gas. Gutting the state’s renewables and efficiency standards would take away the most cost-effective solutions for achieving this outcome.

5. HB 6 includes no safety criteria or transition plans

HB 6 does not require FirstEnergy’s nuclear plants to meet strong safety standards as a condition for receiving subsidies, as recommended in our report. While Davis-Besse and Perry are currently meeting the Nuclear Regulatory Commission’s (NRC) safety standards–as measured by their reactor oversight process (ROP) action matrix quarterly rating system–both plants have had problems with critical back-up systems during the past two years that put them out of compliance.

The nuclear industry has been trying to weaken the ROP for years. For example, the industry has been advocating for combining the first two columns of the action matrix, which would essentially put all nuclear reactors in the top safety category. My colleague Ed Lyman, acting director of the UCS Nuclear Safety Project, is working to stop the NRC from changing the ROP to make it a less meaningful and transparent indicator of plant safety. Our report recommends that policymakers monitor the situation and adjust subsidy policies if the NRC weakens its standards.

HB 6 also does not include any transition plans for affected workers and communities to prepare for the eventual retirement of the nuclear plants. These plans are needed to attract new investment, replace lost jobs and rebuild the tax base.

A better approach

On May 2, House Democrats announced an alternative “Clean Energy Jobs Plan” that would address many of the problems with HB 6. The plan would modify the state’s Alternative Energy Standard (AES) by increasing the contribution from renewable energy from 12.5% by 2027 to 50% by 2050 and fix the onerous setback requirements that have been a major impediment to large-scale wind development. It would expand the AES to maintain a 15% baseline for nuclear power. In addition, it would improve the state’s energy efficiency standards, expand weatherization programs for low-income households, and create new clean energy job training programs.

This proposal is similar to the laws recently passed in Illinois, New York and New Jersey that provided financial support for distressed nuclear plants while simultaneously strengthening renewable energy and energy efficiency standards. While our report shows that the subsidies for some of these nuclear plants may have been too generous, these policies have prevented plants from closing and resulted in a wave of new investment in wind, solar, and efficiency projects.

With more than 112,000 clean energy jobs in 2018, Ohio ranks third in the Midwest and eighth in the country. Ohio added nearly 5,000 new clean energy jobs in 2018.  While most of the clean energy jobs are in the energy efficiency industry, Ohio is also a leading manufacturer of components for the wind and solar industries.

To capitalize on these rapidly growing global industries, lawmakers in Ohio should reject HB 6 and move forward with a real clean air program that ramps up investments in renewables and efficiency and achieves the deep cuts in emissions that are needed to limit the worst impacts of climate change.

Three Ways Federal Infrastructure Policy Can Speed Up Our Clean Energy Transition

UCS Blog - The Equation (text only) -

Photo: John Rogers

May 13 marked the beginning of Infrastructure Week, and, as you might have heard, there may be at least one thing Republicans and Democrats agree on: the need to invest in our nation’s aging infrastructure to remain competitive and build a more resilient, equitable system. This includes the electricity sector, where we must decarbonize our electricity supply, address growing threats to system resilience from climate change, and invest in the research and development of technologies that will power our growing clean energy economy. Here are three ways a federal infrastructure policy package could help make this happen.

Unlock investments in our electric transmission system

Transmission lines are the backbone of our electricity supply. As we transition to clean energy, we also need to invest in a more efficient and resilient transmission system.

Transmission lines are critical to delivering electricity from where it’s generated to where it’s consumed, and as the nation transitions from centralized fossil-fueled power to more dispersed renewable energy resources, we need to invest in our transmission system to efficiently carry renewable energy to our light switches and build resilience against challenges such as extreme weather events and cyberattacks.

Research shows that these investments provide benefits to consumers that outweigh the costs. But a number of hurdles remain, including complex and often dysfunctional planning and approval processes, and a lack of focused leadership at the top, namely from Congress and the Federal Energy Regulatory Commission (FERC).

To address these issues, Congress should declare it a national priority to upgrade our nation’s electricity transmission system and direct FERC, which oversees our bulk electric supply, to prioritize transmission planning in furtherance of a zero-carbon, more resilient electricity supply.

Congress should also authorize and fund the Department of Energy (DOE) to provide technical assistance to state and local authorities that evaluate and approve transmission projects and to develop a national transmission plan that includes recommendations on how to take advantage of existing rights of way like railroad corridors and interstate highways.

Accelerate battery storage deployment

Battery storage can make the electricity system more reliable, affordable, secure, and resilient to extreme events – all while smoothing the way for high levels of renewable energy. This is why experts agree that energy storage should be a top federal priority – both to speed up deployment of current technologies and develop the next generation of this resource.

Current storage technologies are ready for targeted cost-effective deployment to enable renewable energy integration, offset transmission system investments, and replace fossil-fuel-powered plants – particularly those located in urban environments and having significant public health impacts on surrounding communities. To achieve all that battery storage can offer for a clean, resilient electricity supply, Congress should fund tax incentives for battery storage investments to incentivize the private sector while also providing grant programs for deployment in underserved communities where battery storage can displace fossil fuels and reduce local pollution.

Congress also has a role in funding a diverse body of research on the next generation of storage technologies that would put the United States back in a global leadership position, attract private investments, create jobs, and provide significant value to the electricity sector.

Support the infrastructure build out that will fuel the offshore wind boom

The U.S. offshore wind industry is about to take off, but federal investments in our infrastructure are necessary to make sure we’re ready.

The U.S. offshore wind industry is experiencing significant growth. Robust winds, relatively shallow waters, and lots of energy demand near the coast combine to make the Central and Northern Atlantic prime for offshore wind development. Several east-coast states – led by New York, New Jersey, Massachusetts, and Maryland – are moving to procure offshore wind, pushing U.S. demand to more than 17,000 megawatts (MW). Recent estimates put the value of the U.S. offshore wind supply chain at nearly $70 billion with the potential to create hundreds of thousands of jobs.

But building out the offshore wind industry requires coordination among federal, state, local, and tribal authorities, and a multitude of interests, including commercial and recreational fishing, the Department of Defense, seagoing navigation, and compliance with protections for migratory birds and marine mammals, just to name a few. At the same time, U.S. waters present a new set of technical challenges compared to those faced by the European offshore wind industry, which has matured over the past several years. And at this early point in the U.S. offshore wind industry’s growth, we don’t have the ports, ships, and crews necessary to support the industry.

All of this calls for a proactive and robust federal role in the build out of our offshore wind industry. Ongoing coordination of stakeholders to identify prime offshore wind sites and open them for development while maintaining environmental safeguards is necessary. Research and development of the next generation of offshore wind turbines and the transmission grid to carry that clean energy to load centers must be funded. And federal funding to states and local communities is critical to not only build the ships, ports and other equipment necessary for offshore wind development, but to do it in a way that improves the efficiency and lowers the environmental impacts on local communities.

Infrastructure touches nearly every aspect of our lives – including our electricity supply and the potential to transition to a clean, equitable, and more reliable and resilient system. A federal infrastructure package presents an opportunity to pass ambitious climate solutions at the federal level. These should be national priorities, and any federal infrastructure package should reflect this urgency.


ExxonMobil, Chevron, and ConocoPhillips Climate Risk Reports Miss the Mark

UCS Blog - The Equation (text only) -

Photo: nickton/CC BY-NC 2.0 (Flickr)

In the next three weeks, the CEOs of major fossil fuel companies around the world are going to stand before their shareholders and tell them everything is fine when it comes to climate change.

To back up this preposterous claim (everything is not fine), the CEOs will point to their in-house climate risk analyses, which all ignore the need for fossil fuel companies to drastically and rapidly reduce their greenhouse gas emissions in order to keep the global average temperature increase to 1.5°C and avoid the worst impacts of climate change. As we’ve shown in our 2018 Climate Accountability Scorecard, UCS statements, and blog posts, major fossil fuel companies continue to make insufficient progress on climate.

A set of shareholders at ExxonMobil is so tired of the company’s failure to act that they have issued a call for shareholders to vote against all board members – a move reserved for extreme cases, such as the 2017 proxy vote recommendation against Wells Fargo’s board after employees had been pushed to open fraudulent accounts.

To prepare ourselves, we’ve done a deep dive into three major climate risk reports: ExxonMobil’s 2019 Energy & Carbon Summary (our expectations were low); Chevron’s Update to Climate Change Resilience; and ConocoPhillips’s Managing Climate-Related Risks, and created a detailed table comparing the reports on their climate statements and actions.

Overall, the oil and gas companies miss the mark in these reports, downplay the urgency of climate change and the depth of emissions reductions that are needed, and generally assume that they’ll continue to come out on top. We’ve summed up a few of the highlights below.

What’s a climate risk report?

In the last few years, shareholders have successfully pressured companies to report on their climate change risks: for example, how ExxonMobil might adapt if policymakers enact regulatory policies like a carbon tax or cap-and-trade system; if solar and wind become so cheap that fossil fuel demand declines; or if sea level rise and heat waves affect refineries and other company facilities.

Companies usually only publish these reports after extensive shareholder engagement, or if such a report is requested through a shareholder proposal at the annual meeting (where shareholders vote on directors and the CEO’s pay package) and receives a majority of shareholder votes.

In 2017, 62% of ExxonMobil’s shareholders called for the company to issue a report outlining its strategy for operating in a 2-degree-Celsius-constrained economy. Since then, Chevron, Anadarko, ConocoPhillips, and a few other oil and gas majors have issued similar reports because of shareholder resolutions or engagement.

Climate risk reports light on details

Each of these companies is telling shareholders that it is the best equipped and best prepared to handle any sort of climate risk — be it regulations, a change in demand, or hurricanes/flooding/drought/fires/rising seas/insert your favorite climate impact here.

While companies are disclosing more climate risks than they have previously, they still haven’t listed any specific, measurable metrics that would allow shareholders to verify the companies are doing enough.

Most significantly, none of these three companies has laid out an emissions reductions plan that encompasses the full life cycle of its oil and gas products, from extraction, production, and refining to transport and use of its products.

As landmark climate science reports have stressed, not all fossil fuel assets are burnable if the world wants to avoid the worst effects of climate change. Perhaps that explains why ExxonMobil quietly downgraded its confidence in having “90 percent” of its assets produced to “the substantial majority,” which is both extremely vague and concerning for investors.

Unambitious emissions reductions goals

All three of these companies have put out some sort of quantitative emissions reductions goal. ConocoPhillips was one of the first carbon majors to come out with a firm target, even if it is underwhelming.  Chevron, after years of refusal, has put out a startlingly unambitious methane goal and linked it to high-level bonuses, and ExxonMobil has merely “announced greenhouse gas reduction measures that are expected to result” in a 15 percent decrease in methane emissions by 2020.

ConocoPhillips and Chevron have only put forward intensity targets, which means they can hit their targets by decreasing emissions per barrel even if their total emissions increase.  None of these three companies has included the emissions from the end use of its products – when they are ultimately burned – in its targets, even though these emissions make up around 80 percent of each company’s total emissions.
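The difference between an intensity target and an absolute target can be made concrete with a small worked example (the numbers below are hypothetical, chosen for illustration, not drawn from any company’s disclosures):

```python
# Hypothetical numbers showing how an intensity target can be "met" even as
# total emissions grow: emissions per barrel fall 10%, but production grows
# 25%, so absolute emissions still rise.
baseline_barrels = 1_000_000      # barrels produced in the baseline year
baseline_intensity = 0.05         # tonnes CO2e per barrel (hypothetical)
baseline_total = baseline_barrels * baseline_intensity   # 50,000 tonnes

target_barrels = 1_250_000        # production grows 25%
target_intensity = 0.045          # intensity target: 10% cut per barrel
target_total = target_barrels * target_intensity         # 56,250 tonnes

assert target_intensity < baseline_intensity   # intensity target achieved
assert target_total > baseline_total           # yet total emissions rose
print(f"Total emissions rose from {baseline_total:,.0f} "
      f"to {target_total:,.0f} tonnes")
```

An absolute (total) target would rule out this outcome, which is why the distinction matters to investors assessing these goals.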

Undervaluation of renewables

The ExxonMobil, Chevron, and ConocoPhillips reports undervalue the role of renewables, claim that oil will be a big part of the energy mix no matter what, and are full of undeserved self-congratulations.  Most importantly, none of these three companies takes responsibility for the emissions that come from the burning of its products or acknowledges the need to urgently and drastically reduce emissions.

At this point, it’s an established fact that solar and wind are becoming the lowest-cost options for energy. Just look at New Mexico, where a renewable energy company put forward the most cost-effective plan for supplying electricity to the state. ExxonMobil nevertheless undervalues the expected penetration of renewables and announces that it is doubling down on technological improvements to keep us below 2°C, even as it concedes that those technologies are not currently working. This seems like a questionable strategy for a company that spent only $9 billion on low-carbon investments over the last 19 years, but $30 billion on oil and gas exploration in 2019 alone.

Chevron dedicated over half its “update” report to “actions and investments”, which include a fair number of renewable energy venture investments, but provides no details on the dollar amounts invested, the time frame for expected implementation, or the emissions reductions anticipated. This section of Chevron’s report also includes the dubious claim that it is contributing to the “Zero Hunger” Sustainable Development Goal because its natural gas operations produce nitrogen, which is used in fertilizer, as a byproduct.

ConocoPhillips’s report avoids the whole “how will you reduce emissions” bit almost entirely, which seems like an odd choice for a company whose products are among the top 10 contributors of greenhouse gas emissions since the start of the Industrial Revolution.

Still squirreling out of responsibility for reducing emissions

ExxonMobil quietly admitted that its products are part of the problem of climate change, and there is only so much it can do without making major changes to its business model (a little late to the party on that one).  Chevron, meanwhile, subtly claimed that it isn’t the worst fossil fuel company out there and therefore everyone should stop asking it to align its business model with the Paris agreement, which Chevron simultaneously claimed to support.

Shareholder showdown at 2019 annual meetings

UCS will be attending the annual meetings for all three companies, including the virtual meeting for ConocoPhillips this morning.  Overall, ConocoPhillips has continued to engage shareholders this year and has no climate-related shareholder proposals on the ballot.  Chevron has successfully engaged with a number of investors and had several shareholder resolutions withdrawn, with a company commitment to address the issue, and managed to have a shareholder resolution calling for Paris-aligned climate targets excluded by the SEC.  ExxonMobil is facing down what could be a shareholder revolt, with prominent institutional shareholders calling for votes against all board members, support for a shareholder proposal to separate the role of CEO and board chairman, and support for a climate-related board committee.  We’ll be reporting back on our in-person attendance at the Chevron and ExxonMobil annual meetings later this month.


Improving Transparency and Disclosure of Conflicts of Interest for Science Advisory Committees

UCS Blog - The Equation (text only) -

Members of the USDA Advisory Committee on Beginning Farmers and Ranchers, in a December 2010 photo. USDA photo

On Wednesday this week, the Senate Committee on Homeland Security and Governmental Affairs will hold a mark-up hearing on the Federal Advisory Committee Act Amendments of 2019, introduced by Sen. Portman (R-OH). And before you stop reading: yes, this is a science issue. The proposed amendments are intended to improve the transparency of the federal advisory committee process, including science advisory committees of scientists from outside government, and to disclose and reduce the impacts of conflicts of interest on those committees.

My colleagues and I have written extensively about recent problems with science advisory committees: Many aren’t meeting, some are rife with conflicts of interest, and some have simply lost the capacity to provide independent advice for agencies across the government. That’s a serious problem for science-based policymaking. Science advice plays a crucial role in helping ensure our government makes science-based decisions on everything from air pollution standards to new drug approvals to worker safety protections.

What’s new

The proposed amendments require agencies to open nominations for committee positions, select members from those nominations and publicize the selections, and clearly distinguish independent scientists from those representing a particular interest group. They also require disclosure of conflicts of interest to the agency and the public, and greater transparency of the meetings themselves. In addition, political party affiliation cannot be used as a criterion for selection to a committee. This would be a useful requirement, since such political litmus tests have been used to distort and stack advisory committees under previous administrations.

When it comes to addressing conflicts of interest in government science advice, disclosure and transparency are a good thing but require effort. Under the bill, scientists serving on committees would need to file conflict of interest statements and meet the government ethics rules. I have had to do so for numerous committees, and I can’t say it is enjoyable to do the paperwork. But, if you believe the committee’s work is important, you sigh and fill it out. And then realize whining about it was more work than just doing it.

And yes, agencies would have to put in the effort to make more information public, but this kind of transparency would bring agencies more in line with the spirit of the Federal Advisory Committee Act. And in the overall work of an agency, this is not that big a burden.

But there is opposition, particularly, it seems, from the biomedical community. The NIH is already exempt for the purposes of grant proposal reviews, but for other advisory committees, disclosing conflicts of interest is viewed as an overwhelming hurdle that will discourage participation on panels. I just don’t see it.

Starting to fix what’s broken

The Federal Advisory Committee Act Amendments are an attempt to fix an advisory process that is, in this Administration, too often captured by regulated industry. Conflicts of interest are the core of that problem, and transparency is one way to push back. Are the amendments perfect? No. There are still issues of diversifying panels, clarifying roles of committee members with conflicts of interest, adequately recognizing participation, and better institutional support and encouragement for panelists, as well as being transparent in the least burdensome way. You can see our ideas on improving advisory committees here.

But these amendments go in the right direction. We, as scientists, need to realize the need to continue to build and maintain public trust in our work, and ultimately decisions based on science. Spending a little time on disclosure will not go amiss.


The National Academies Illustrates the More Nuanced Value of Transparency in Science

UCS Blog - The Equation (text only) -

Photo: Another Believer/Wikimedia Commons

Ever think about reproducibility in science? Turns out you’re not alone! The National Academies of Science (NAS) just spent a year and a half studying the status quo and have released some important findings. An NAS committee released a report this week that EPA Administrator Andrew Wheeler, Department of Interior Secretary David Bernhardt and OMB Acting Director Russell Vought should really read, titled Reproducibility and Replicability in Science. The group of experts was charged with answering questions about reproducibility and replicability, mandated by the 2017 American Innovation and Competitiveness Act. There are two key takeaways that are incredibly important for federal agency heads to understand as they are issuing sweeping policies that include language about these scientific concepts under the guise of transparency.

Reproducibility and replicability are important but not the be-all end-all of good science

The NAS committee was charged with defining reproducibility and replicability across scientific fields. Reproducibility is obtaining consistent results using the same input data, computational steps, methods, code, and conditions of analysis. Replicability is obtaining consistent results across studies aimed at answering the same scientific question, each of which has obtained its own data. While the report acknowledges that reproducible and replicable studies help to generate reliable knowledge, it is also clear throughout that these standards can be features of a scientifically rigorous study, but are not necessarily essential. The committee writes, “A predominant focus on the replicability of individual studies is an inefficient way to assure the reliability of scientific knowledge. Rather, reviews of cumulative evidence on a subject, to assess both the overall effect size and generalizability, is often a more useful way to gain confidence in the state of scientific knowledge.” There are many reasons why a study might not be able to be reproduced or replicated, not least the need to protect the privacy, trade secrets, intellectual property, and other confidentiality concerns associated with the underlying data. Challenges also arise when studying environmental hazards. We must use observational data for studies of air and water pollution, and it is often not possible or ethical to recreate the conditions under which people were exposed to a contaminant.

As my colleague, Andrew Rosenberg, explained in a recent blog:

“Maybe we all learned that doing an experiment in a lab many times over can give you confidence in the results and that is the “scientific method.” Made sense in grade school. But lots and lots of critical scientific information and even analyses are not “reproducible” in this sense. Take, for example, the impact of a toxic pollutant on a local community. Should we release it again to see if it is really harmful? Or the study of a natural disaster? Should we wait for it to happen again to reproduce the results? The Environmental Data and Governance Initiative illustrated the many real-world examples of scientific studies that are neither feasible nor ethical to reproduce.”

In the EPA’s proposed restricted science rule issued last April, EPA argues that part of the reason for the policy is to allow regulators to better determine that key findings are “valid and credible.” It claims that the benchmark upon which validation and credibility are measured is reproducibility and replication of studies. But as EPA fails to understand and the NAS committee rightfully points out, “reproducibility and replicability are not, in and of themselves, the end goals of science, nor are they the only way in which scientists gain confidence in new discoveries.” The report explains that policy decisions should be based on the body of evidence, rather than any one study (replicable or not), and likewise, that one study should not be used to refute evidence backed by a large body of research. Further, systematic reviews and meta-analyses, whereby large bodies of evidence are evaluated, are an important method of increasing confidence of scientific results. The EPA and other agencies should have the flexibility to use their own criteria to judge the rigor and validity of the science informing rules as applicable, and should not rely on reproducibility and replicability as the principal criteria of scientific credibility.

Challenges of transparency and reproducibility in science are best handled within the research community, not the White House or EPA

Improvements in transparency can be and are being made by researchers, journals, funders, and academic institutions, and the report gives many neat examples of ongoing efforts. It certainly is not one agency’s job to solve issues around science transparency. Indeed, they couldn’t do this even if they tried. The recommendations of the report are aimed at scientific institutions to work on educating researchers and ensure best practices in recordkeeping and transparency that may lead to more reproducible and replicable studies. Nowhere does it suggest that federal agencies that are users of such science should be involved in deciding how transparent authors must be. The scientific community needs to drive the bus. End users of scientific information are not in a position to address challenges in the scientific community at large, especially considering the lack of infrastructure and resources needed to ensure privacy protections for sensitive data within agency rulemaking. Instead of making sweeping transparency requirements that would limit the government’s ability to use the best science, the report recommends that funding agencies invest in the research and development of open-source tools and related trainings for researchers so that transparency is fostered at the beginning of the scientific process instead of being used as an opportunity to exclude crucial public health studies that have already been conducted.

No crisis of reproducibility, no time for complacency

During the report release webinar, the study authors summarized their findings by saying that there wasn’t a crisis of reproducibility, nor was it a time to be complacent about issues related to transparency in science. This is a fair assessment of the situation and one that should be reexamined by the EPA as it reviews the 590,000 public comments it received on its restricted science proposal. There are absolutely ways we can use technology to improve recordkeeping and transparency throughout the scientific process so that researchers can better build on one another’s findings and advance knowledge. Smart minds at NAS and elsewhere are already working on this. The committee report highlights some of the ways this is happening thanks to the leadership of academic institutions, funders, and journals. Government has a role to play in helping to fund the infrastructure that will foster more open and accessible science and arm researchers with the tools to abide by best practices. The EPA, DOI, and OMB should listen to the scientific community and learn how best to accomplish that task. There is absolutely no role for the White House or federal agencies like the EPA to issue sweeping, prescriptive rules that limit the way that science is used to inform regulations.


Top Clean Cars for 2019 and 2020 

UCS Blog - The Equation (text only) -


Looking to clean up your commute? Choosing a less polluting vehicle is one of the biggest things you can do to combat climate change, and fortunately for you, I just got back from the DC and NY Auto Shows, where automakers displayed the latest and greatest clean vehicles coming to a showroom near you.

Electric vehicles were prominently displayed at this year’s auto shows, and for good reason. EVs are cheaper and cleaner to drive than their gasoline-powered counterparts and are beginning to appear as SUVs and pickups, the most popular vehicle types in the U.S. Want to find out how clean an EV is in your area? Check out this handy emissions calculator.

2019 Hyundai Kona EV

This crossover utility EV is already a fan favorite, having generated strong reviews from auto reporters and consumer advocates since it was introduced to the U.S. in January 2019. It not only has good looks, but also good performance. The Kona EV gets 258 miles on a full charge from its 64 kWh battery pack, which can be charged to 80 percent in just 75 minutes from a 50 kW Level 3 charger, or to 100 percent when plugged into a Level 2 (240 V) charger overnight. The Kia Niro EV, the Kona’s sister car, has similar specs.
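As a rough check on those charging figures (assuming the published 64 kWh pack and a constant 50 kW charging rate, which real chargers don’t sustain at higher states of charge), the ideal 0-80% charge time works out to about an hour:

```python
# Back-of-envelope estimate of the Kona EV's 0-80% charge time. Real-world
# charging tapers as the battery fills, which is why the quoted 75 minutes
# exceeds this idealized figure.
pack_kwh = 64.0          # published battery capacity
charge_fraction = 0.80   # charging to 80 percent
charger_kw = 50.0        # Level 3 charger power

energy_needed_kwh = pack_kwh * charge_fraction       # 51.2 kWh
ideal_minutes = energy_needed_kwh / charger_kw * 60  # ~61 minutes
print(f"Ideal 0-80% charge time: {ideal_minutes:.0f} minutes")
```

The gap between the ~61-minute ideal and the quoted 75 minutes is a reasonable allowance for charge tapering and conversion losses.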

The only bad news here is that the Kona EV is exclusively available on the West Coast and in Northeast states (specifically, Connecticut, Delaware, Maine, Maryland, Massachusetts, New Jersey, New Mexico, New York, Oregon, Pennsylvania, Rhode Island, Vermont, Washington state, and Washington, D.C.). Should sales of this newcomer prove strong, Hyundai may be pressed to expand its availability, but until then, you need to travel to a state where it is sold to take possession of this new EV offering from Hyundai.

2019 Volkswagen e-Golf

Volkswagen is slowly making amends for its transgressions and is beginning to offer electric options across its vehicle classes. One of the reasons I’m excited about the 2019 VW e-Golf is its price. This all-electric hatchback starts at $32,790 and is still eligible for the $7,500 federal tax credit – bringing the base MSRP down to $25,290. Considering that the average new vehicle cost $37,577 at the end of 2018, getting a nice VW for around $25k is a great deal. Though the e-Golf offers slightly less range than its competitors (an estimated 125 miles on a full charge), it’s a good size – easily fitting 4 adults with bags in the trunk – and has plenty of electric range for most daily driving. Its price and features earned the e-Golf “best electric vehicle in the compact class” honors from Car & Driver, and an overall 10Best award for 2019. Similar to the Kona EV, the availability of the e-Golf is limited to the “ZEV states” for now, but VW plans to bring more EVs to all 50 states as soon as 2022.

2019 Chrysler Pacifica Plug-In Hybrid

Minivan alert! Do you need to shuttle gremlins to soccer practice or the mall but also want to cut your carbon footprint? Then this 2019 offering from Chrysler may be for you, as it is currently the only plug-in minivan for sale in the U.S. With the ability to travel 32 miles on a full charge, the Pacifica Hybrid can avoid filling up with gas for weeks or even months depending on your daily driving needs. It is also eligible for the $7,500 federal tax credit, which brings its price more in line with other traditional minivans.

When the battery is depleted, the Pacifica Hybrid operates like a traditional gasoline-electric hybrid, and achieves considerably better fuel economy than its gas-only minivan competitors. EPA rates the Pacifica Hybrid at 32 miles per gallon combined in traditional hybrid mode, which is 10 mpg more than the Toyota Sienna, Honda Odyssey, and standard Pacifica. With its 16.5-gallon fuel tank, the Pacifica Hybrid also offers an outstanding 520 miles of total driving range, plenty for weekend warrior’ing or long road trips.
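That total-range figure is easy to sanity-check from the numbers in the paragraph (a back-of-envelope estimate only; EPA’s official range rating also reflects the electric range and test-cycle adjustments):

```python
# Sanity check of the Pacifica Hybrid's quoted 520-mile total range, using
# the tank size and combined fuel economy given in the text.
tank_gallons = 16.5
mpg_hybrid = 32
gas_range_miles = tank_gallons * mpg_hybrid   # 528 miles on gasoline alone
print(f"Gasoline-only range estimate: {gas_range_miles:.0f} miles")
```

The simple multiplication lands within a few percent of the quoted 520 miles, which is as close as this kind of estimate gets.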

2020 Toyota Corolla Traditional Hybrid

For the car shoppers who can’t use an EV because they don’t have a place to plug it in every night, this traditional gasoline-electric hybrid might be a better choice. The 2020 Toyota Corolla Hybrid comes in at an MSRP of just $23,880 and offers an estimated 52 MPG combined with the reliability consumers have come to expect from Toyota. Though the Prius has been the king or queen of traditional hybrids, the 2020 Corolla is a great alternative with a more innocuous styling package.

2020 Rivian R1T

Based in Plymouth, Michigan, start-up automaker Rivian recently raised funds to launch production of an all-electric pickup truck (the R1T) and an all-electric SUV it unveiled at the LA Auto Show this past November. Pickups and SUVs are the most popular vehicle classes in the U.S., so if Rivian cracks the code on producing affordable electric versions of these vehicles, it may be onto something huge. The Rivian R1T pickup is expected to deliver 400-plus miles of range, an 11,000-pound tow rating, a cargo capacity of 1,760 pounds, a 0-60 time of 4.9 seconds, and off-road capability. But these impressive specs will come at a price. The R1T is expected to start at about $69,000 before any tax credits, but if you need a pickup truck and are tired of burning too much oil as you carry your cargo around, check out the Rivian R1T.

Photo: Hyundai Photo: Volkswagen Photo: Chrysler Photo: Toyota Photo: Rivian

How Big is Gridlock in our Electric Grid?

UCS Blog - The Equation (text only) -

Photo: AWEA

Progress in electric power, particularly the growth of renewable energy and consumer choice, is looking like gridlock. Look closer and we can see three fundamental issues: state policy vs. federal policy; changing perspectives on reliability; and how electric grid planning should accommodate the ongoing transition to renewable energy. We even have gridlock in the appointment and continuity of the Federal Energy Regulatory Commission (FERC), which oversees much of the decision-making in these spaces.

Transitions need transmission

From the beginning of Nikola Tesla’s rivalry with Thomas Edison, the choice of energy supplies has depended on the availability of transmission to move electricity from one place to another. Any new energy supply needs some kind of conductor or transport from the supply to the demand. The larger the cumulative supply, the more pronounced this need. Adding a lot of offshore wind energy, for example, requires a commensurate plan for safely getting that energy into the existing grid. State policies in the Northeast have brought this innovation (first studied at the University of Massachusetts but first adopted commercially in Northern Europe) to the cusp of commercial-scale deployment off the New England and Mid-Atlantic coasts. So while we now have commitments for significant offshore wind development, the details of how we’ll effectively move that energy to the onshore grid and ultimately to customer demand remain unresolved.

Transmission lines are the infrastructure of renewable energy. Planning ahead for these lines enables the addition of clean energy. We saw this in Texas, and we need to see it again for offshore wind. Renewable energy is growing rapidly, replacing fossil fuels and reducing carbon and other air pollution every day. An infrastructure strategy for carbon reduction and the transition to renewable energy should include electric transmission investment.

More about that gridlock: State policy vs. federally regulated markets

Sitting at FERC is a request by PJM regarding how federally supported markets will treat power plants that are supported by state policies, like the long tradition of state-sanctioned monopoly utilities or the decisions of state legislators to promote innovation or the continued operation of zero-carbon power plants. These policies ultimately pave the way for power plants to receive revenues outside of the FERC-regulated markets – either through checks from captive ratepayers or through alternative revenue streams like renewable energy credits (RECs). This decision regarding how to align these policies with the wholesale markets in PJM has been stuck in a regulatory deadlock at FERC since mid-2018, when one of the Trump-appointed commissioners left, leaving a 2-2 split in opinions at FERC.

At stake in that decision are these three fundamental issues:

  1. State policy vs. federally supervised market platform: PJM asked FERC for permission to discriminate among state-supported power plants. Old-style, ratepayer-subsidized plants (usually fossil-fueled) owned by state-regulated monopolies would be exempt from PJM’s proposed definition of “improperly subsidized,” while new, financially impactful rules would apply to renewable power plants that earn revenues from state Renewable Portfolio Standards and to nuclear plants that have been granted additional payments through legislative action. PJM seeks to raise the bids offered by renewables and nuclear plants to counter state supports but allow state-supported old fossil plants to bid low so as to keep them in business. While all of these plants are essentially subsidized by state policy, PJM is proposing to penalize one category of power plant while allowing another to operate in the market at artificially low costs that will ultimately be made up by utility customers.
  2. Reliability in a changing supply mix: PJM’s management of the capacity market, which provides utilities with enough resources to meet the peak demand in summer, is struggling through repeated and continuous reforms that limit or reduce new types of resources. The capacity market was also changed to incentivize coal- and gas-burning power plants to be more reliable in cold weather. Demand response and renewable energy have been devalued in the various changes, and are targeted by the proposed reform awaiting a FERC decision. These types of reform ultimately create gridlock as older, less efficient resources don’t exit the market because they’re being subsidized by ratepayers, while new, more cost-effective resources can’t enter the market because they’re not being properly valued for the services they can provide.
  3. This decision affects the planning and growth of transmission, literally the resolution of physical gridlock limiting renewable energy growth. PJM’s rules for transmission planning use the capacity market results to determine where and how much transmission is built for generation. So if renewable energy resources can’t participate in the capacity market, PJM doesn’t build the transmission necessary to transmit renewable energy to customer demand.

The interaction of these issues can be seen in all the U.S. grids to some degree. The assumptions based on fossil-fuel plants, and the owners of those plants, work against a transition to renewables, demand response, or energy efficiency. Differences among the grid operators – the NY Independent System Operator serving New York, ERCOT serving Texas, and the Midcontinent Independent System Operator serving 15 states plus Manitoba – demonstrate some diversity in attitudes about these issues.

Planning for energy around the whole year, not just a peak demand period, is one positive change MISO is exploring.  NYISO approval of transmission that will “help unbottle clean energy” is a model for our policy goals and the role of infrastructure in achieving these goals. In short, the opportunity is at hand to use infrastructure investments – whether roads, bridges, or electric transmission – to unlock opportunities to achieve a cleaner, more resilient future.

Photo: AWEA

US Solar: 2 Million Systems Strong. And Definitely Growing

UCS Blog - The Equation (text only) -

The latest good news from the forefront of clean energy makes me think of the old Flintstones™ vitamins commercials about the number of kids they were reaching: “[X] million strong, and growing,” went the catchy jingle. This good news is about the count of solar photovoltaic (PV) systems in the US, and should be just as catchy: We have just sped past the two-million mark. Two million PV systems, on homes and businesses, over parking lots, beside highways, and in fields and deserts across America.

That tally is courtesy of energy market analysis firm Wood Mackenzie (Wood Mac) and the Solar Energy Industries Association (SEIA). While large-scale solar accounts for more of the megawatts, rooftops account for the vast majority of the system count; residential systems alone are 96% of the total.

This momentous occasion is one of the clean energy milestones I’ve been watching for. And it’s just one more sign of a key technology that keeps hitting new heights.

Heights upon heights

That heights-hitting is on vivid display, for example, in the latest year-in-review report from the same team of Wood Mac and SEIA. While the 2018 data shows that annual solar installations were down, with US solar companies installing 2% less than they had in 2017, so many of the other data points are good news. Here’s a taste:

  • Solar megawatts are climbing. Even with the annual total down (a smidge), we still added 10,600 megawatts (MW) of PV to our national power mix. That put the total installed solar tally at more than 64,000 MW—enough to generate the equivalent of 12 million typical US households’ use.
  • Solar’s contributions are growing. Solar’s new heights are particularly visible in the technology’s increasing role in our electricity supply. While only 2.4% of US electricity in 2018, solar generation climbed 24% between 2017 and 2018. And its continued climb over the last decade combined with wind energy’s progress is a marvel to behold.
  • Residential solar is up. Installations of large-scale and “non-residential” (commercial) solar were both lower in 2018 than in the previous year, but gains in residential solar made up for almost all of those drops. Residential solar’s growth, says the report, “exceeded expectations for 2018,” with 7% more megawatts going in during the year than in 2017.
  • Solar isn’t just climbing; it’s spreading. The report authors emphasize how the results suggest major states “have moved past early adopters”: “[G]rowth in low-penetration emerging markets, such as Texas and Florida, continues to add to the geographic diversity of the residential market outside of California and the Northeast.” The Lone Star State, long the undisputed leader in wind power, is finally becoming a factor in solar too, capturing the #2 slot for solar megawatts installed in 2018. And the Sunshine State has finally gotten serious about solar, taking the #3 spot in 2017 and #4 in 2018.
  • Solar costs are dropping (even more). One area where less is more is in the continuation of the amazing downward trend for the cost of solar. Costs for the different market segments dropped another 4-15% in 2018. The report authors credit reductions in hardware costs, including the costs of PV modules—with the Trump solar taxes on imports being offset by Chinese policy changes that led to global oversupply.

And more heights

And the heights keep coming. Solar in California, for example, couldn’t even wait for spring to set a new record for instantaneous solar generation, and large-scale solar plus rooftop solar briefly supplied close to two-thirds of the state’s electricity demand. And California solar set another megawatt record last month.

The two million systems now in, given growth already in 2019, add up to a cool 70,000 MW. Wood Mac and SEIA are projecting that solar installations this year will be 14% higher than in 2018, with the residential sector continuing to push forward and large-scale solar bouncing back. That progress looks likely to lead to another record-breaker in annual installations by the year after next.
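
For a rough sense of scale (my own arithmetic using only the figures quoted in this post, not the analysts’ published forecast), a 14% increase over 2018’s roughly 10,600 MW implies:

```python
# Implied 2019 US solar installations, from the figures in this post:
# about 10,600 MW installed in 2018 and a projected 14% increase.
mw_2018 = 10_600
growth_rate = 0.14

mw_2019_implied = mw_2018 * (1 + growth_rate)
print(f"Implied 2019 installations: {mw_2019_implied:,.0f} MW")
```

That works out to roughly 12,000 MW of new solar in 2019.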

Meanwhile, the system count will continue to climb, along with the pace of installation. The news on this latest milestone quotes Michelle Davis, a Wood Mac senior solar analyst (and former colleague), as saying:

“According to our latest forecasts, by 2024, there will be, on average, one solar installation per minute. That’s up from one installation every 10 minutes in 2010.”

This two-million mark comes just three years after we hit one million PV systems. And Wood Mac/SEIA project that we’ll hit three million in 2021 and four million by 2023.
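
Those per-minute rates translate into striking annual totals. Here is a quick sketch (my own round-the-clock arithmetic, not Wood Mac’s published system counts):

```python
# Annual installation counts implied by the quoted paces:
# one install every 10 minutes in 2010, one per minute by 2024.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

installs_2010 = MINUTES_PER_YEAR / 10  # one every 10 minutes
installs_2024 = MINUTES_PER_YEAR / 1   # one every minute

print(f"2010 pace: ~{installs_2010:,.0f} installations per year")
print(f"2024 pace: ~{installs_2024:,.0f} installations per year")
```

That is a tenfold jump, from about 52,560 to about 525,600 installations a year.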

Readers of the right vintage will recall that the Flintstones™ vitamin commercials of yesteryear talked about “10 million strong”. Solar isn’t there yet, but at the rate it’s making progress, we’ll be there before we know it.

Photo: GRID Alternatives Wood Mackenzie and SEIA, US Solar Market Insight, 2018 Year in Review

Trump Administration Outdoes Itself on Climate Change Denial, Insists Arctic Warming is Good

UCS Blog - The Equation (text only) -

Among intergovernmental bodies, it is hard to find a more congenial, consensus-driven body than the Arctic Council. This organization, composed of the foreign ministers of the eight Arctic states and leaders of six Arctic indigenous organizations, has managed to find common ground on issues of sustainability and the environment even during politically tense standoffs among its members, such as when Russia invaded Crimea.

Leave it to the Trump administration to detonate this paragon of congeniality. In 2017 they thoroughly pissed off the diplomats from the other seven countries, and embarrassed their own team, by demanding last-minute deletions of climate change language from the Ministerial Document – the Arctic Council’s biennial affirmation of ongoing work and priorities. This nearly derailed the entire diplomatic effort, but the other countries, bent on preserving the integrity of the Arctic Council’s famously united front, conceded, and all eight foreign ministers signed the document.

This week, however, they drew the line when Secretary of State Mike Pompeo, a former Congressman from the Kansas district that houses the Koch brothers’ climate-denying headquarters, demanded that no language on climate change or the Paris Climate Agreement be included in the Ministerial Document. That was it; for the first time ever in the history of the Arctic Council, no Ministerial Document was signed at the biennial meeting of the Ministers.

Adding insult to injury

Setting aside the possible effects on the integrity of the Arctic Council, this level of climate denial was a profound embarrassment to American diplomats and an insult to Arctic residents who are facing ongoing and serious threats due to climate change in a region that is warming two to three times faster than the rest of the planet.

To add insult to injury, Secretary Pompeo spoke glowingly of Arctic warming, insisting that the loss of sea ice would yield “new opportunities for trade” and otherwise be a boon for Arctic nations.

Let’s have a look at these “opportunities.”

Rapid sea-ice loss is an indicator of warming, and recent studies have shown that Arctic sea ice is at an all-time low. In addition to creating unsafe and deadly conditions for travelers and hunters who rely on thick sea ice in the winter, this also means that coastal villages are exposed to violent winter storms and waves. As the permafrost thaws beneath these coastal villages, the loss of protective sea ice accelerates rapid erosion as villages begin planning for expensive relocations.

What this rate of sea-ice loss implies for the melting rate of Arctic glaciers is also ominous. Research has shown that, regardless of emissions reductions in the coming years, Arctic warming will accelerate until at least mid-century, potentially bringing tipping points for irreversible melting of the Greenland ice sheet. This would have dire implications for coastal cities around the world. As the president of the Marshall Islands once said to the premier of Greenland – “If your island melts, mine sinks.” Every coastal city in the world, from Miami to Hong Kong, will pay a huge price if these tipping points are crossed.

As the ice melts at an accelerating rate, permafrost thaw is also accelerating, and will continue to do so until at least mid-century. Scientists tell us that we are likely to lose nearly half of the Arctic’s permafrost in the coming decades, threatening 70% of Arctic infrastructure. Roads, pipelines, airports, buildings, homes, and, in the case of Russia, entire industrial cities will require billions of dollars to preserve or rebuild. To make matters worse, thawing permafrost has the potential to amplify global warming by releasing substantial amounts of carbon dioxide, methane, and nitrous oxide from the formerly frozen soil. Together with the loss of sea ice, which increases the heat storage capacity of the Arctic and accelerates warming, these phenomena add an ominous signature to the uncertainty surrounding global warming.

What happens in the Arctic affects us all

In the lower 48 states we experience the influence of a warming Arctic year-round. In winter, shocking heat waves in the Arctic reduce the temperature differential between the Arctic and the mid-latitudes. This slows the jet stream into a wavy meander that can usher polar air and crippling freezes into the mid-latitudes in a phenomenon that’s sometimes called a polar vortex (though technically it’s a disruption of the polar vortex). Such frigid incursions have become unusually frequent—and expensive—in the mid-latitudes. In 2018, the costs were borne by families in New England, but in any given year these events can affect nearly any mid-latitude region.

In other words, warming in the Arctic is amplifying the climate crisis worldwide, at great economic cost. Seven out of the eight Arctic nations are deeply concerned and committed to addressing the problem. The Trump administration and Secretary Pompeo, however, beholden to the fossil fuel industry and embarrassingly ignorant of the economic realities of the climate crisis, do not care. They delude themselves with fever-dreams of economic opportunity that have no basis in reality.

While it is tempting to write this off as just another absurd international Trump administration stunt to please their fossil fuel patrons, Americans cannot ignore the health, safety, and economic threats that this blithe ignorance will cause for people abroad and here at home. While the bar for American leadership has dropped shockingly low, it’s time to demand that our public servants remember who they are meant to serve.

The Science Denial is Crystal Clear: The EPA Ignores Scientists on Asbestos

UCS Blog - The Equation (text only) -

We ignore scientists at our peril. Why doesn’t the Environmental Protection Agency (EPA) get this?

The New York Times (subscription required) just reported that EPA officials actively ignored the advice of the agency’s own toxicologists last year on asbestos (yes, asbestos of all things!) by issuing a rule (subscription required) that placed some restrictions on asbestos but did not fully ban it. More than a dozen EPA scientists and lawyers wrote to officials urging them to ban asbestos outright.

This is not the first time the EPA under the Trump administration has been criticized for the way it deals with asbestos. By not banning it, the EPA leaves the door open for the continued import of asbestos and gives industry a regulated but available pathway to use asbestos products in the US.

We have to listen to federal scientists

EPA scientists were disturbed by both the agency’s review process and the rule itself. Reading through the two internal memos, it is exceedingly obvious that the EPA scientists were providing a thorough and scientifically valid evaluation of the rule and why it needed to better reflect the current science.

Specifically, the EPA experts criticized the agency’s use of a scientific methodology that was more than 30 years old. The EPA rule only looked at six asbestos fiber types as harmful to health, but the agency has known there are more than six deadly fiber types since the 1990s. This is a severe breach of methodology and runs counter to the EPA’s mission statement, which states that the agency will use the “best available scientific information” in environmental policy decisions. Thirty-year-old science just won’t cut it when lives are literally on the line.

Additionally, the scientists criticized the fact that only two health problems were considered during the evaluation: lung cancer and mesothelioma. This does not reflect all the ways that asbestos can harm a person. The scientists listed many more health maladies (“asbestosis and other respiratory ailments, ovarian cancer, colorectal cancer, and cancers of the stomach, esophagus, larynx and pharynx”) which should be considered in the regulation process for asbestos. When this type of health data is thrown out the window, we cannot see the full effects of asbestos exposure, and therefore cannot enact policy decisions that are based on all the evidence.

Why haven’t we banned asbestos yet?

Asbestos was once considered a miracle mineral because of its heat resistance and strength and was widely used in homes built in the 1950s to the 1970s and in thousands of products. When the science began to show just how deadly asbestos exposure can be, some of the industries that have used asbestos in their products decided that risks to their profit margins were more important than the risks to public health. The manufacturer Georgia-Pacific, a conglomerate now owned by Koch Industries and previously the maker of an asbestos-containing construction material, began a campaign to flood scientific journals with counterfeit science, articles that attempted to cast doubt on the health risks associated with asbestos exposure. Faking science to benefit industry profits is unfortunately a well-known disinformation playbook tactic that has been used repeatedly for decades.

Even now, the vice president of regulatory and technical affairs of the American Chemistry Council, an industry trade group well known for its attempts to dismantle science-based regulations on chemical exposures, told the New York Times, “We ought not to be imposing regulation simply on the basis of hazard.” This is absolutely absurd – all public health regulations are enacted for the purpose of reducing hazards to people!

What is particularly tragic is that the US does not need to import asbestos – we know that business can cope with the change. In 1989, the EPA, under the George H.W. Bush administration, attempted to ban most asbestos-containing products and phase out most other uses. While this policy was overturned by the courts in 1991, we know that banning asbestos is more than possible in the US. And given that over 60 nations have banned all uses of asbestos, the ability of industry to cope without it is simply undeniable.

Asbestos is really, really bad

Since the 1920s, we’ve known that asbestos exposure is deadly. But here is the worst part – the latest science is showing that asbestos is far, far more deadly than we previously thought. For years we used to say that asbestos is responsible for the deaths of 12,000 to 15,000 Americans every year. According to the best available science, that number is now about 40,000 deaths a year. These numbers exceed deaths in America caused by gun violence or vehicle crashes. Former Assistant Surgeon General Richard Lemen, now the science advisory board co-chair for the Asbestos Disease Awareness Organization, said that the “mortality rate of asbestos exposures is indeed of epidemic proportions.”

There is no safe level of asbestos exposure – even the smallest forms of exposure can result in devastating illness decades later. Take the case of Heather von St James, who was diagnosed with a life-threatening form of mesothelioma because as a child she would wear her dad’s jacket, a jacket he used when putting up asbestos-containing drywall. Von St James survived the mesothelioma and is now an advocate for banning asbestos, but her story is not the norm. Mesothelioma is estimated to be fatal in 90 percent of patients over a five-year period.

Ban it, just ban it!

Here’s how the scientists ended their well-crafted arguments: “Rather than allow for (even with restrictions) any new uses for asbestos, EPA should seek to ban all new uses of asbestos because the extreme harm from this chemical substance outweighs any benefit — and because there are adequate alternatives to asbestos.”

Congress is taking a look at the issue, specifically the House Committee on Energy and Commerce Subcommittee on Environment and Climate Change, but the administration needs to do better on this. The American people deserve to live their lives without the threat of asbestos exposure; we need to ban this dangerous substance right away.

Healthy Soil, Coming to a Theater Near You: 5 Lessons from “The Biggest Little Farm”

UCS Blog - The Equation (text only) -

Photo courtesy of Apricot Lane Farms

An email in my inbox last month caught my attention. It was from author, environmental advocate, and Academy Award-winning film producer Laurie David (“An Inconvenient Truth”), and it offered a preview of “The Biggest Little Farm,” a new documentary film David had coming out soon. “I promise you that any person that goes to see this film will leave inspired and caring a whole lot more for the planet,” her note said. “I promise you it will help your organization achieve your goals!”

I clicked on the link, watched the trailer, was intrigued. The movie looked gorgeous. But would it hold up to scrutiny from skeptical agricultural scientists?

A few days later, in a conference room with several members of the UCS food and agriculture team, I dimmed the lights and let the film roll. “The Biggest Little Farm” (in theaters this month) chronicles the adventures of filmmaker John Chester and his wife Molly as they leave their lives in Los Angeles behind to start a diversified farm on an exhausted piece of land north of the city, where they intend to live and grow food “in perfect harmony with nature.”

At first, the storytelling seems to veer toward the precious. John documents the promise they made to their rescue pup Todd about how much he’d love being a farm dog. The narration, over cute animation, extols the idyllic life John and Molly imagine for themselves. But I soon realized he was setting up viewers for the same jolt he and Molly would soon get—repeatedly—about the harsh realities of farming, especially when you’re trying something new and complex.

Because it turns out this kind of farming isn’t all rainbows and puppies and adorable baby goats. It’s also exhausting and sometimes heartbreaking. Before long, the story got real—very real—and I was hooked. After the credits rolled, my colleagues’ reviews came in:

A really beautiful, honest, and engaging film. It shows the many tough challenges of farming with nature rather than against it, but leads with the opportunities and a hopeful optimism.

I don’t think I’ve ever seen such a stunning illustration of the ecology of diversified farming – the challenges, the potential, and all the interconnectedness of a complex farm ecosystem.  

More dead chickens! Why did you make me watch this??

Indeed, midway through the film, the casualties start to pile up. John, Molly and their team face a seemingly never-ending string of predator attacks, pest and disease outbreaks, and other deadly natural phenomena as they struggle to make Apricot Lane Farms a sustainable enterprise. Although the relentless mishaps challenge their core belief in working with nature rather than against it, they persist, learning something from each experience and finding creative ways to adapt.

Their story, while unique in many ways, contains some key lessons for US agriculture:

  1. Soil is paramount. When the Chesters first arrived at Apricot Lane Farms, their newly acquired soil was so compacted and devoid of organic matter, they could hardly break it with a shovel. “The soil is dead,” John says flatly. “And we have no idea how to bring it back to life.” But with the help of consultant and soil guru Alan York, they set about enriching it. “Plants build soil,” Alan said as they seeded cover crops. They also installed a state-of-the-art compost tea system and added animals (so many animals!) for their manure. And indeed, by the end of the film—which spans a seven-year period of historic California drought followed by an unusually wet year—the Chesters’ spongier soil seemed to have paid off, as it held water better during dry periods and soaked up more of it when the rains fell. At a time when climate change is driving more weather extremes in every part of the country, building healthy soil will be critical to ensuring that farmers can be successful.
  2. Increasing a farm’s biodiversity is critical (and hard). Someone recently said to me that farmers are the only manufacturers who work outside, completely exposed to the elements. There’s truth in that, for sure, but the choice of the word “manufacturers” is revealing. Factories typically make one thing, over and over, day in and day out. And farming in the United States has become a lot like that—an overwhelmingly industrial process, divorced from nature and, in fact, often fighting it tooth and nail. In the film, we see Alan explaining how the Chesters must emulate how natural ecosystems work (we call this agroecology). His mantra: “Diversify, diversify, diversify.” John and Molly take this to the extreme, eventually farming 200+ crops and animals across pastures, orchards, and a large vegetable garden. A plethora of wildlife also returns, including new pests that require more creativity and further diversification to combat. Alan promises all this diversity will become simplicity, but as John notes, “a simple way of farming is just not easy.”
  3. Few farmers can go to the lengths the Chesters have. But most don’t need to. The 76 varieties of stone fruit trees John and Molly now tend is…probably a bit much for most farmers. And without access to investors like they recruited, few farm startups can afford fancy composting systems, miles of new irrigation line, and the costs associated with repeated trial and error. It is never clear, in the film, how much up-front and continued investment was necessary to do what they did at Apricot Lane Farms (though we can assume it was a lot). Nor do we know at what point in the saga that investment was fully recouped, if it has been. But recent research has shown that even more limited and lower-cost efforts at diversification on farms—for example, expanding from two crops to three or four, or planting prairie strips around the edges of crop fields—can have substantial benefits. And federal farm programs provide help (though not nearly enough) for farmers to do such things.
  4. One way or another, the ecological debts of our industrial farming system must be paid. Apricot Lane Farms required substantial upfront investment not only because the Chesters had ambitious plans, but also because they needed to pay down an enormous ecological debt racked up on that piece of land over the years. Industrial agriculture has been called an “extraction industry” because it takes nutrients from the land without replacing them, allows precious soil to wash or blow away, and sends rainwater running off the surface rather than percolating down to refill underground aquifers for later use. Due to decades of short-sighted management, this is the situation on farmland all across this country. And while John, Molly, and their investors had the means to take on Apricot Lane’s ecological debt, it’s not fair or realistic to expect farmers to make up for the damage caused by industrial practices and the public policies that have incentivized them. Rather, “The Biggest Little Farm” shows once again why shifting agricultural policies to help farmers diversify the landscape and rebuild their soil is a smart investment in the future.
  1. Nature is breathtakingly beautiful. The film’s message is in line with what the science tells us about farmland diversification and healthy soil, and it comes at a time when legislators in many states and in Congress are looking to expand policy supports and public investments to help more farmers advance soil health. Even though Apricot Lane is just one farm, and a unique one at that, my hope is that this film adds to the conversation. But you don’t have to be an advocate for healthy soil policy to appreciate the movie, which above all is visually stunning and brimming with optimism. You’ll marvel at the ways John Chester’s cinematography captures the beauty and devastation of nature and life on a diversified, ecologically-based farm—from aerial footage of painstakingly designed orchards to images of playful lambs and terrifying wildfires, infrared footage of nocturnal predators, and super-slow-motion shots of the hummingbirds and beneficial insects who return as part of the farm’s renewal. If you like that iPhone commercial, you’ll find this film equally appealing.

“The Biggest Little Farm” opens this Friday, May 10, in Los Angeles and New York, and nationwide May 17.

Putting Communities First in Deploying Energy Storage

UCS Blog - The Equation (text only) -

A wide range of stakeholders from across the country met in December 2018 to develop a set of principles to ensure equitable deployment of energy storage technologies. (Photo: Megan Rising/UCS)

UCS convened a select group of stakeholders in December 2018 to discuss policies to spur deployment of energy storage. But this meeting was not your typical policy development session—we focused on how to design policies that put communities first. UCS focused not only on deploying more energy storage, an important part of the clean energy transition, but also on doing so in a way that involves community members and drives equitable outcomes. The stakeholders present at December’s convening developed a set of consensus principles based on the discussions there and conversations since.

Fully 26 participating organizations have endorsed the principles on equitable deployment of energy storage.

The opportunity

When combined with investments in clean energy, storage has the potential to hasten retirements of coal and even natural gas plants across the country. This is critical not only for our climate and decarbonization goals, but also to improve air quality in frontline communities. Utility-scale storage is already being procured to replace three natural gas plants in California. Experts predict energy storage will be a $3.8 billion industry by 2023.

Energy storage has a wide range of potential applications, and UCS recognizes and emphasizes the potential for storage to benefit disadvantaged communities. Because of those potential community benefits, UCS focused on a few key use cases for storage: replacing peaking power plants and other fossil-fired plants; keeping the lights on and bouncing back more quickly from power outages; and accelerating the development and integration of renewable energy on the grid. Our focus on energy storage is not meant to preclude other carbon reduction policies or the need for renewable energy policies, but rather to lift up energy storage deployment policy as a key complementary policy. We also recognize that much work remains to be done to fully capture the value that storage can provide to the market and customers.

Involving stakeholders

In December 2018 in Chicago, Illinois, UCS convened a diverse group of stakeholders, including environmental justice and grassroots organizations, policy experts, the solar and storage industries, labor, consumer advocates, faith groups, and renewable energy advocates, for a discussion focused on the equitable deployment of energy storage. The participants developed a set of consensus principles for storage deployment that elevate the critical importance of community-led clean energy solutions. Together these principles can help state policymakers focus on solutions that ensure that the growth of energy storage benefits all communities.

As far as we can tell, this event was the first of its kind. Typically, policy wonks gather in a room to think up ideas about how to drive the outcomes they think are important. And while those expert opinions are obviously important, we wanted to know what affected communities thought about the desired outcomes and how to get there. We see this process as an important contribution to our collective work to drive a transition to a clean energy economy.


The purpose of the convening was to develop policy recommendations, strategic relationships, and political momentum to accelerate the equitable, safe, and low-carbon deployment of energy storage in the US at the state level.

Our goals for the convening were to:

  • Create a core set of policy design elements on equitable, safe, and low-carbon energy storage policy deployment that can influence state legislation in 2019 and beyond.
  • Build momentum in a set of target states with a broader coalition for equitable, safe, and low-carbon storage deployment policies.
  • Produce both short-term and longer-term materials for broad distribution that advance these goals.

This convening on state-level deployment of energy storage built on an earlier convening that UCS held in March 2018 in Washington, DC. That earlier event brought together leading researchers to identify the most important breakthroughs needed to scale up electricity storage as well as ways the federal government can support innovation in this strategically important industry. It was sponsored by the bipartisan House Advanced Energy Storage Caucus and resulted in a policy brief that synthesizes the discussions, including recommendations for federal policymakers on how to best support electricity storage RD&D that drives innovation, lowers electricity prices, and increases the reliability of the US electric grid.

The principles

Prior to December’s convening, UCS set the stage with some initial thoughts and ideas about what equity might look like in the context of energy storage deployment. The stakeholders then expanded and shaped the concepts and ultimately outlined six principles of equitable policy design for energy storage. They grappled with the following questions:

  • How can storage be deployed to reduce emissions and improve air quality?
  • How can storage make communities and residents more resilient to disasters and power outages?
  • How can storage promote local economic development and job growth?
  • How can storage help accelerate greater levels of renewable energy on the grid?
  • How can storage help reduce electricity bills?
  • How can policymakers ensure that communities have a seat at the table?

Read the full text of the principles with the list of supporting organizations here.

Outcomes and Next Steps

For this discussion, we focused on three states—Minnesota, Illinois, and Maryland—where we saw opportunities for advancing storage legislation in the near term. Participants represented these three states, joined by other stakeholders who shared perspectives from leading states and from the national level.

We know that our convening brought together people who would not otherwise have met, and we saw that dynamic play out in hallway conversations throughout our two-day event. We also know that some of those relationships have continued beyond the convening.

While UCS and many of the convening attendees are focused on advancing equitable energy storage policy in these three states, our hope is that these principles can be used more broadly to inform policy and to shape the way legislators and storage advocates are conceiving of the opportunities afforded by energy storage.

China’s Counterproductive Response on New START

UCS Blog - All Things Nuclear (text only) -

May 6th, 2019: The Chinese Foreign Ministry dismisses the possibility of entering into strategic arms limitation talks with the United States and Russia

Last month Secretary of State Mike Pompeo told the Senate Foreign Relations Committee the Trump administration wanted China to participate in discussions on extending the New START Treaty, which places limits on the size of the nuclear arsenals of the countries that sign it. The current treaty, which expires in February 2021, is a bilateral agreement between the United States and Russia. Pompeo said the administration wants to broaden participation in the treaty to include China.

When asked at a recent press conference, a spokesperson for the Chinese Foreign Ministry said his government “will not participate in any negotiation for a trilateral nuclear disarmament agreement.” That’s unfortunate. It’s also counterproductive. China lost an opportunity to educate Americans, and the rest of the world, about its comparatively reserved nuclear weapons policies. It lost an opportunity to be an international leader on nuclear disarmament and to achieve numerical parity with the United States and Russia. And, if the ministry’s own assumptions about the disingenuous motives of Trump administration officials are correct, it may have helped Trump pin the blame for failed negotiations between Russia and the United States on China.

Silence is Acquiescence

Most Americans don’t think about nuclear weapons very much. They don’t know how many weapons each nation has or understand the policies governing their use. So when Trump administration officials claim the United States lags far behind China in modernizing its nuclear arsenal, most Americans are not inclined to doubt it. Without public inquiry or objection, those claims will form the basis of Congressional decisions to approve spending trillions to modernize the US nuclear arsenal so America can catch up with China.

In fact, China’s nuclear arsenal is smaller than the US nuclear arsenal was in 1950. None of its nuclear weapons are kept on high alert. Its nuclear-armed submarines never go on armed patrol. China is working diligently to improve the quality and increase the quantity of its nuclear forces, but at the present pace of those efforts, even if the United States does nothing, China’s nuclear forces will continue to lag far behind those of the United States for many decades.

Geng Shuang, the Chinese spokesperson who made the announcement, mentioned that China’s arsenal is “kept at the minimum level required by national security” and is “an order of magnitude” smaller than the US arsenal. But saying it once at a press conference in response to a question is a veritable whisper in the cacophony of information coming at US voters in this age of social media. Deciding to engage the United States in New START discussions would be surprising, make headlines, generate endless commentary and flood the United States with better information on China’s nuclear forces.

The Chinese government constantly complains about American attempts to hype “the China threat” to the United States. Yet presented with a golden opportunity to dispel at least some of the hype, all the ministry could muster was a few diffident lines at a press conference. All that Congress and the American public will hear is that “China said no.”

Chairman Xi Fails to Lead

That China said “no” to nuclear disarmament negotiations is most likely all the rest of the world will hear and remember as well.

China may be worried that accepting Pompeo’s offer could trap China in a difficult situation. That’s understandable, especially given the relatively strict verification requirements in the existing treaty. But Pompeo may have presented the current Chinese leader with a diplomatic no-lose scenario if he said yes. Had Chairman Xi agreed to engage the United States on the possibility of Chinese participation in New START it would have put the onus on President Trump to respond. Xi could have welcomed the opportunity to have the United States reduce its nuclear forces to a level where China could be an equal party to the treaty. If Trump demurred he would take the blame for China’s absence from negotiations. If Trump said yes the United States would have to reduce the size of its nuclear forces by an order of magnitude. Either way Xi wins. The only way he could lose was to say no, yet that’s what he did.

Geng Shuang told the press, “China stands consistently for the comprehensive prohibition and complete elimination of nuclear weapons.” Claims like this from all the nuclear weapons states ring hollow to the vastly greater number of non-nuclear weapons states, which have been waiting for 49 years for China, the United States, Great Britain, France and Russia to honor their obligations under the Nuclear Non-Proliferation Treaty (NPT) to “pursue negotiations in good faith … on a treaty on general and complete disarmament under strict and effective international control.” Pompeo’s offer to engage the United States on deep nuclear reductions was a test, intended or not, of China’s commitment to the NPT. Xi failed that test.

The non-nuclear weapons states have good reason to doubt China’s sincerity. Like the United States, China signed but did not ratify the Comprehensive Nuclear Test Ban Treaty (CTBT). The US record is inexcusable: the Senate rejected ratification in 1999, a few years after the US signed the treaty in 1996, and a majority of senators remain opposed. But China didn’t have that problem then, and it’s even less of a problem under Xi’s leadership. He could direct China’s National People’s Congress to ratify it tomorrow. He could also be more proactive in the United Nations Conference on Disarmament (UNCD). China has a highly capable community of arms control experts, many with considerable scientific and technical expertise, who can support a more proactive Chinese stance on nuclear disarmament. Yet Xi seems happy to let things lie. His failure to respond positively to the opening Pompeo presented is a sign that China is content with a status quo where the world remains divided into nuclear haves and have-nots.

This was also a test of Xi’s common sense. Any imaginable negotiating scenario that resulted in a significant reduction of US nuclear forces is in China’s national security interest. A smaller US nuclear force makes it less likely the United States might try to launch a disarming nuclear first strike against China’s small nuclear force – the very scenario China’s nuclear modernization efforts are supposedly intended to address. Engaging in nuclear arms control negotiations with an adversary you believe may not be acting in earnest is a risk, but it’s one a China committed to “the comprehensive prohibition and complete elimination of nuclear weapons” should be willing to take.

Sucker Punched

Finally, Geng Shuang said Pompeo’s offer was an “attempt to make an issue out of China on arms control.” What he meant was that the offer to negotiate wasn’t sincere. He may be right. There’s no evidence the United States approached China about New START either before or after Pompeo’s testimony, just like there’s no evidence the US spoke with China about joining the INF treaty. China’s assumption, and the assumption of most of my colleagues in the Chinese and US arms control communities, is that the Trump administration is setting China up to take the blame for the eventual collapse of the New START agreement, just like it did with the INF Treaty.

Arms control proponents in the United States are urging the Trump administration to go forward without including China because “to include limits on China would be complicated and take many years.” They also say that getting China to agree to New START’s strict verification measures “will require long periods of talks and confidence building measures.” But like the Foreign Ministry’s complaint about making China the issue, these statements only encourage average Americans and their elected representatives to conclude that China is a problem, that China isn’t ready or willing to disarm, so why should the United States be. If Pompeo’s offer really was made in bad faith, the Chinese Foreign Ministry, and the US arms control advocates urging Trump to forget about China, couldn’t have found better ways to help him get away with it.

NNSA’s FY20 Budget Request: Full Speed Ahead on Weapons Development and Production

UCS Blog - All Things Nuclear (text only) -

In March the Department of Energy released its FY20 budget request for the National Nuclear Security Administration (NNSA), which is responsible for developing, producing and maintaining US nuclear warheads and bombs.

(Source: Flickr)

The request outlines NNSA’s planned activities through FY24, including for weapons that have been part of the plan since the Obama administration such as:

  • The life extension program (LEP) for the W80-4 warhead, which will be used with the new air-launched cruise missile—the Long-Range Standoff Weapon (LRSO)
  • Producing W87-1 warheads to replace the W78 warheads on US land-based missiles

The costs of these programs are growing as they move forward with development and, as expected, have led to a substantial increase in the FY20 budget request for Weapons Activities. This category jumped almost 12%, to $12.4 billion, from $11.1 billion in FY19.
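The percentage figures cited in this post are simple year-over-year comparisons of the rounded budget numbers. A minimal sketch of that arithmetic (the helper function is illustrative, not part of any budget document):

```python
def pct_increase(old: float, new: float) -> float:
    """Percent increase from an old budget figure to a new one."""
    return (new - old) / old * 100

# Weapons Activities: $11.1 billion (FY19) -> $12.4 billion (FY20 request)
print(round(pct_increase(11.1, 12.4), 1))  # 11.7, i.e. "almost 12%"

# W80-4 LEP (discussed below): $655 million (FY19) -> ~$900 million (FY20 request)
print(round(pct_increase(655, 900), 1))    # 37.4, i.e. "more than 35 percent"
```

The same calculation applies to the other program-level changes discussed below, such as the W87-1 request more than doubling from $53 million to $112 million.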

Although the Trump administration plans to add two new warheads to the arsenal—the W76-2 “low-yield” Trident warhead and an as-yet unnamed warhead for a planned new sea-launched cruise missile (SLCM)—their impact on the FY20 budget is very small. This is because the W76-2, which is nearly complete, requires only a relatively minor modification to existing W76 warheads, and the SLCM is still in the earliest stage of development, receiving only minimal funding for studies.

Big increase: W80-4 warhead for new air-launched cruise missile

The largest funding increase is for the life extension program for the W80-4 warhead that will be used with the Long-Range Standoff Weapon (LRSO), which is planned to replace the current air-launched cruise missile (ALCM). The NNSA has requested almost $900 million for the program in FY20—more than $240 million over last year’s funding of $655 million, a jump of more than 35 percent. This is on top of another large increase last year to speed up the program so that it keeps pace with the schedule for the LRSO.

As we have previously detailed, the LRSO is an unnecessary and destabilizing weapon. It is expected to be significantly more capable than the existing ALCM, featuring enhanced accuracy, longer range, and greater speed; it will also be harder to detect. Like the W76-2 low-yield warhead discussed below, the supposed advantages of the LRSO lean more toward nuclear warfighting than deterrence.

Last year’s budget predicted that the NNSA would need only $714 million for the program in FY20. The program completed a required Weapon Design and Cost Report in December 2018 that reportedly provided data that led to the increase. In March, the Nuclear Weapons Council (a joint Department of Defense and Department of Energy group that oversees plans for nuclear weapons programs) approved the program to move into its next stage, Development Engineering. Costs for this program have increased substantially over time, and as it moves further along its development trajectory the budget will only continue to increase.

Another big increase: the W87-1 warhead for land-based missiles

Another recipient of substantially increased funds in the FY20 budget request is the replacement for the W78 warhead deployed on Minuteman III intercontinental-range ballistic missiles (ICBMs). The planned replacement will be a modified version of the W87 warhead already deployed on ICBMs—dubbed the W87-1. Its FY20 budget comes in at $112 million—more than double last year’s funding of $53 million. This warhead is slated to be fielded on the “Ground-based Strategic Deterrent” missiles which are to replace the current Minuteman missiles beginning in 2029. The warhead was previously planned as the first of three interoperable warheads (IWs) to be used on both land- and submarine-based missiles.

The IW-1 warhead was intended to replace both the W78 and half of the Navy’s W88 warheads. But the program was not supported by the Navy and experts (including my colleague Lisbeth Gronlund) raised serious questions about its high cost, increased risk, and limited benefits, leading to its eventual death. The NNSA’s FY2019 Stockpile Stewardship and Management Plan, released last fall, finally dropped the IW designations altogether in favor of “W78 replacement warhead” (now the W87-1), “ballistic missile warhead Y” (BM-Y, formerly IW2), and “ballistic missile warhead Z” (BM-Z, formerly IW3).

Coming in under cap: Weapons Dismantlement and Disposition

Funding for Weapons Dismantlement and Disposition is capped at $56 million per year until FY21 due to provisions in the FY17 and FY18 National Defense Authorization Acts. This was a result of opposition by the Republican-led Congress to the Obama administration’s announcement at the 2015 Nuclear Nonproliferation Treaty review conference that it planned to accelerate weapons dismantlement, in part to help demonstrate US commitment to nonproliferation and disarmament. In its first year, the cap resulted in a cut of almost 20% to President Obama’s request, effectively undercutting the plan.

For FY20, the NNSA has requested $47.5 million for Weapons Dismantlement and Disposition, $8.5 million less than the $56 million the category received in FY18 and FY19. Last year’s budget request indicated that the NNSA planned to continue requesting the full amount through FY23, the last future year included. The NNSA says that the decrease this year is because of a “reduction in legacy component disposition and CSA activities.”

The FY20 budget request, like the FY19 budget request, also does not reiterate the goal of dismantling all weapons retired before FY09 by FY22, which was set out in previous budget documents. Dismantlement competes for space and personnel with weapons production programs, so it seems likely that adding on to and speeding up the production schedule will keep dismantlement as an also-ran for the upcoming crunch period for NNSA.

Delayed retirement: the B83 bomb

Also getting an increase over estimates in previous budgets is the B83 bomb which, with a yield of 1.2 megatons, is by far the most powerful weapon in the US arsenal. The B83 was previously scheduled to be retired after the new B61-12 entered service in the early 2020s, but the Trump administration has decided to keep it around for an undetermined period, confusingly characterized in some places in the 2018 Nuclear Posture Review (NPR) as “at least until there is sufficient confidence in the B61-12 gravity bomb that will be available in 2020” and in other places as “until a suitable replacement is identified.” This means that it requires more funding for upkeep and increased surveillance activity to ensure it will remain usable.

According to Charles Verdon, NNSA’s Deputy Administrator for Defense Programs, these measures will be sufficient to keep the B83 in the active stockpile for the next 5-7 years, but after that the bomb would need a full life extension program, and the NNSA does not yet have an estimate for how much that might cost. Chances are that the NNSA does not want to undertake such a program, given how many other life extension programs it already has on its plate.

It would also be costly and unnecessary. Recognition that the high yield of the B83 is not needed was part of the NNSA’s argument in favor of a major life extension program for the B61 bomb, which will produce a new variant, the B61-12. In response to questions from Congress about the need for the new bomb, given that the B83 can already carry out some of the missions the B61-12 is designed for, Air Force General Robert Kehler, at the time commander of US Strategic Command, characterized the B83’s “very high yield” as one of its “shortcomings.” The B61-12 will include a new guided tail kit that will significantly increase its accuracy, and more recent reports show that it will also have earth penetrating capability. These improvements will allow it to be used against a broader set of targets, even those which may previously have required higher yield.

New: Next ICBM Warhead, Sea-launched cruise missile

New in this year’s budget are the introduction of a line item for the “Next Strategic Missile Warhead Program” and funding for a study of President Trump’s proposed sea-launched cruise missile (SLCM).

The warhead labeled BM-Y in the most recent Stockpile Stewardship and Management Plan has apparently now been updated to the Next Strategic Missile Warhead (or possibly the Next Navy Warhead, as it is given different labels on different pages of the budget request). The Next Strategic Missile Warhead would eventually replace the W87 warhead on the Air Force’s planned Ground Based Strategic Deterrent missiles. Previous plans called for this to be the second of three planned interoperable warheads, to be designated IW2. In those plans, IW2 would also have replaced the remaining half of the Navy’s W88 warheads. The NNSA has not requested any funding for the Next Strategic Missile Warhead program in FY20, but indicates that it will begin to request funding in FY23 to conduct feasibility studies.

The Trump administration’s 2018 Nuclear Posture Review called for the development of a new sea-launched cruise missile, after President Obama retired a previous version in 2010. The FY20 funding is for a study called an “analysis of alternatives,” one of the earliest steps in developing a new weapons program. It is unclear how much is being requested, but according to the Arms Control Association the number is “as much as $12 million.”

Complete: The Life Extension Program for the W76 warhead

The NNSA’s budget request for the W76 LEP drops to zero in FY20, indicating that the program is now complete. The NNSA announced in January that it had completed the upgrade of all W76s to W76-1s in December of last year. Production of the life-extended warheads, which adds 20 years to their service life, began in 2008.

Nearly complete: The W76-2 warhead

The FY20 request includes $10 million for the W76-2, a variant of the 100-kiloton W76 warhead modified to have a lower yield of about 6.5 kilotons. The warhead, the most rapidly developing outcome of the Trump administration’s 2018 NPR, is intended to replace some of the existing W76 warheads and be carried on Trident missiles launched by US submarines.

The Trump administration claims that the United States needs this weapon to respond to what it believes are increased threats from Russia and others. But as we have detailed in our fact sheet on this program, this argument does not make sense. Despite what the Trump administration says, there is no “gap” in US deterrence capabilities: existing US nuclear weapons already cover a range of yields from 0.3 kiloton to 1.2 megatons. Moreover, the W76-2, like the LRSO, will add to US capabilities for nuclear warfighting, blurring the line between conventional and nuclear conflicts and increasing the chance that a nuclear weapon could be used.

The W76-2 program received $65 million in NNSA funding in FY19, plus another $23 million from the Department of Defense. The decrease in the FY20 request is because production of the modified warheads is scheduled to be completed by the end of FY19, leaving only minor close-out activities for FY20. The NNSA announced in late February that it had completed the first production unit of the W76-2 and was on track to complete the rest of the warheads and deliver them to the Navy by the end of FY19 in September. Thus, if Congress does not step in quickly, the United States will begin deploying a destabilizing new nuclear weapon in the very near future. Fortunately, Rep. Adam Smith, the chair of the House Armed Services Committee, has already announced he is “unalterably opposed” to the W76-2 and he will likely seek to end funding for the program and bar deployment of it. His effort has the support of more than 40 former senior officials who last year recommended that Congress cancel the program.

Moving Ahead on Minnesota Clean Energy Legislation

UCS Blog - The Equation (text only) -

Minnesota State Capitol Building Photo: Minnesota Department of Administration

The Minnesota legislature is considering important new legislation to move forward on clean energy and build on the progress the state has already made to reduce emissions and modernize its electricity system.

Let’s dig into the status of the bills and some key highlights.

What’s the status of clean energy legislation in the Minnesota legislature?

Last week the Minnesota House of Representatives passed an omnibus jobs and energy bill (HF 2208), and Monday the Minnesota Senate passed its version of similar legislation (SF 2611).

Next, conference committees from the House and Senate will work on reconciling the two bills over a two-week period in the first part of May.

What’s important about the legislation?

Many Minnesotans—including Governor Tim Walz—are keen on setting a goal for 100 percent carbon-free electricity for the state by 2050. This is a target that Xcel Energy has also adopted for its own electricity system.

Unfortunately, the Senate didn’t include a 100 percent clean energy provision in its version of the legislation.

While that is disappointing, there are many important aspects of the pending legislation to highlight. Below are five of the most significant.

Solar on schools

Championed by Rep. Jean Wagenius and other legislative leaders, the House legislation would create the Solar For Schools Program and appropriate $16 million from the state’s renewable development funds to install solar at schools (the Senate version includes funding for this program, although at a much lower level).

The program will reduce emissions and energy costs and provide learning opportunities to students about converting sunlight to electricity. It has been described as a win-win-win for schools, youth, and the environment, and the conference committees should work to fund this program at the higher level proposed by the House.

Beneficial electrification

The House version of the bill includes an important provision that sets a goal for the state to promote the use of electricity from clean energy sources to reduce greenhouse gas emissions and authorizes electric utilities to submit plans to promote electric energy uses in their service territories.

Switching parts of the economy from fossil fuels to electricity is considered beneficial if, according to the Regulatory Assistance Project, it saves consumers money over the long run, enables better grid management, and reduces negative environmental impacts.

Transmission enhancements and reliability planning

The House legislation will require utility companies to participate in a study to identify transmission network enhancements necessary for system reliability as the state’s coal-fired power plants are phased out.

Thinking ahead on how to replace coal plants with increased amounts of renewable energy will help Minnesota plan for the transmission network improvements needed to make this important transition happen more quickly and cost effectively.

Energy storage

While both versions of the legislation will authorize funds for an independent study into the benefits and costs of energy storage systems such as batteries, the higher levels of funding included in the Senate version are necessary to conduct an adequate study.

The bills would also authorize utilities to seek approval for implementing energy storage system pilot projects and require them to include an assessment of how energy storage systems contribute to generation and capacity needs in their long-term resource plans.

Highlighted in the legislation are the many benefits of energy storage, including controlling frequency and voltage, mitigating transmission congestion, providing emergency power supplies during outages, reducing curtailment of existing renewable energy generators, and reducing peak power costs.

Greenhouse gas emission reduction strategies and benchmarks

The House bill would require the Minnesota Department of Commerce to develop a set of strategies and benchmarks aimed at significantly reducing greenhouse gas emissions by 2030. The strategies include building efficiency, consumer tools and financial incentives, switching from fossil fuels to electricity, energy storage, grid modernization, and more.

One key improvement for the legislation would be targeting the strategies toward also achieving 100 percent carbon-free electricity by 2050 in line with proposals by Governor Tim Walz and the Minnesota 100% Campaign. My colleague Steve Clemmer provided testimony to the legislature in support of a 100 percent goal, along with interim targets for renewable energy, to help maintain Minnesota’s national leadership on deployment of renewables and energy efficiency.

The path ahead

The clean energy provisions highlighted above are among many positive aspects of the pending legislation. Also included are transportation measures such as electric vehicle rebates, charging station improvements, and support for electric transit and school buses.

Nevertheless, opposition to progress remains. For instance, 50 Republicans in the House voted no on a straightforward legislative finding “that greenhouse gas emissions resulting from human activities are a key cause of climate change.”

But fortunately for Minnesota, the forward momentum on clean energy legislation is pointing the state toward a bright future. If the new law is enacted, rapid deployment of solar and wind power, as well as energy storage, will be pursued in a smart and cost-effective manner, supported by data and evidence.

And the studies, planning processes, and pilot programs included in the bills can ensure Minnesota continues its leading role among Midwest states on renewable energy and reducing dangerous climate change pollution.

Through merging bipartisan measures to ramp up progress and put the state on track for an equitable low-carbon energy future, the Minnesota House and Senate conference committees are poised to complete an important step forward for the state and region.

USDA Provides Blueprint for Dismantling a Government Research Agency

UCS Blog - The Equation (text only) -

Photo: USDA (Flickr)

For scientists, it’s a significant accomplishment to get your work published in a peer-reviewed journal. The process of submitting a paper, fielding reviewer comments, and revising the work can take months (or years!), and final publication in a respected journal lends credibility to any researcher’s work.

So it was odd when the Washington Post reported last week that USDA was now requiring its researchers to label outside peer-reviewed scientific publications with the word “preliminary”. But this wasn’t actually news to me. And it isn’t the only way the Trump administration is undermining the USDA’s important research role.

Peer-reviewed ≠ preliminary

Last fall I learned about this new and unusual disclaimer as I finalized a research article for publication in Public Health Nutrition, which I had co-written with colleagues at Tufts University and USDA’s Economic Research Service (ERS). The disclaimer read as follows:

“The findings and conclusions in this preliminary publication have not been formally disseminated by the U.S. Department of Agriculture and should not be construed to represent any agency determination or policy.”

This new disclaimer was very troubling to me and my colleagues because it seemed to delegitimize our scientific work even though it had undergone rigorous peer review. Still, when we learned about the disclaimer, we felt our only option was to include it if we wanted our work published.

Research disclaimers for government employees aren’t new; I included one in another article, co-written with the same USDA colleague, that was published last summer. ERS has used its older disclaimer language for some time. That disclaimer essentially made clear that any opinions expressed by ERS employees in their own work were not the opinions of ERS or USDA, protecting both the researcher and the agency.

Nevertheless, each administration has its own policies, and as a researcher I felt I had no sway. We published the paper with the new disclaimer, trusting that publication in a well-regarded, peer-reviewed journal would speak to the work’s finality.

Since then, a great deal has been unearthed about how USDA is sidelining science with this new “preliminary” disclaimer and by other means. The same Washington Post article reported that the “preliminary” disclaimer violates the USDA’s own scientific integrity policy. Even before this disclaimer was issued, UCS’s own analyses found that during the current administration some USDA scientists have had concerns about communicating their research. Just as troubling was a 2016 Washington Post report that a USDA entomologist had his research on pesticide use and its impacts suppressed.

Why the new “preliminary” disclaimer is a major red flag

What’s strange to me about this new potential violation is that the USDA scientific integrity policy applies to “all USDA mission areas, agencies, and offices,” but the new “preliminary” disclaimer only applies to the USDA REE mission area (which includes ERS, the Agricultural Research Service, and the National Institute of Food and Agriculture). What is more, a brief scan of peer-reviewed research articles by USDA researchers suggests that the new disclaimer policy is being inconsistently implemented.

So why have REE and ERS researchers been singled out by this new disclaimer policy? My hypothesis (I’m a scientist, it’s my job to have hypotheses) is that the Trump administration is nervous.

The Trump administration can’t always control what comes out of an objective, independent research agency such as ERS. So it’s doing everything it can to get ERS (and other REE) researchers aligned with the administration’s political line.

One way to do that is to make it harder for scientists to publish their research or to delegitimize it with new policies such as the “preliminary” disclaimer to which ERS and other USDA REE employees are currently subject.

But there are many ways to muzzle government scientists

Another way is to simply cut the funding for research that is out of line with the administration’s policy goals. The Trump administration has done just that, proposing steep funding cuts to USDA research, especially ERS, in both fiscal years 2019 and 2020.

These cuts would eliminate huge swathes of research ERS conducts. The most recent budget proposal zeroes out research on food consumption and nutrition; invasive species; markets for environmental services; bioenergy and renewable energy; agricultural research investments; international food security; food and nutrition assistance; drought resilience; rural economics; beginning farmers and ranchers; and local/regional food markets.

What remains is bare bones. The only research the administration wants to continue, according to its official budget document, concerns farm business, household income and wealth, agricultural cost of production, and farm practice adoption.

In August 2018, the Trump administration continued its attempt to gut USDA’s research capacity when Secretary Perdue unveiled a plan to reorganize and relocate ERS and NIFA. Under this plan, ERS would be resituated under the USDA Office of the Chief Economist (OCE), and both ERS and NIFA would be relocated outside of the national capital region.

Moving ERS back under the purview of OCE would reverse a 1994 decision that placed ERS and its important research at arm’s length from the political influence of whichever administration is in power.

As currently organized, ERS operates independently, freer from interference or the whims of high-level political appointees. ERS researchers have produced mountains of rigorous, high quality, independent research to inform the development of agriculture and food policy over the last few decades. Congress routinely calls on ERS to conduct studies of food and agriculture issues to inform policymaking or program implementation. Folks, this is how the process of government should work.

Moving ERS back under OCE could give Secretary Perdue (or any future USDA Secretary, for that matter) more control over what research does and does not get done by ERS or how it’s messaged to the public.

If they are out of sight, they’re out of mind

The physical relocation of ERS (and NIFA) outside the capital region can be viewed as another way for Perdue to better control the messages coming from ERS research, so that it’s more aligned with the political priorities of the administration, or simply to reduce the amount of research coming from the agency.

Through a game-show-style bidding “process,” Perdue’s administration has sought to relocate some, but not all, ERS and NIFA staff. Politico reported that the secretary anticipated “cost savings” and claimed the move would allow the agencies to “provide better customer service” and “better attract and retain staff.”

Yet it’s unclear how moving some ERS employees out of DC while keeping others here is better for “customer service.” According to a list published by USDA, 76 positions will remain in the capital region and the other 253 will be moved elsewhere, which likely means the relocation will splinter ERS divisions and teams. It’s anybody’s guess how this could benefit USDA customer service or agricultural research.

Not only will ERS staff be split up, but they may not have the necessary data to do their work. ERS staff are currently just a stone’s throw from the Census Bureau headquarters in Suitland, Maryland, home to one of the Federal Statistical Research Data Centers (FSRDCs). These data centers contain several datasets ERS researchers likely need to do their research, including data from the Economic Census, the Current Population Survey, and the American Community Survey. According to our own analyses, many of the relocation sites on the USDA “middle list” are nowhere near one of these data centers.

While it’s entirely possible the relocation site will be in a city near an FSRDC, why is USDA still considering places that don’t have one? Are they going to build a new data center? If so, is that really going to save USDA money (which was one of the reasons Secretary Perdue suggested the relocation in the first place)?

ERS staff may also lose critical access to data available only on-site at Bureau of Labor Statistics’ (BLS) national office in Washington DC, which they likely use for their research, too. The BLS website notes that the Consumer Expenditure Survey, Consumer Price Index, Current Employment Statistics for the US and states, the Producer Price Index, and other data sets related to food purchasing and agricultural markets can only be accessed onsite at BLS in Washington. How will ERS employees be able to access these data sets at the relocation site?

Another possibility is that USDA does away with the research that requires the use of data housed at these facilities. Time will tell how data access issues are addressed as the relocation process moves forward.

Congress and economists strongly oppose ERS restructuring, but the Trump administration is rushing ahead

It’s telling that the primary professional association for ERS researchers, the Agricultural and Applied Economics Association (AAEA)—of which I’m a member—strongly opposes the restructuring plan. AAEA members are all over the country, including in places where ERS might be relocated. Our members do top-notch research on a variety of agricultural and food issues and understand just how vital ERS’s work is to ensure that our food and agricultural systems are providing economic benefits for farmers and healthy, affordable food for consumers. So, it’s no wonder the association is opposed.

Members of Congress, both Democrats and Republicans in both the House and the Senate, have taken actions to stop the ERS/NIFA restructuring, including requesting an analysis from the USDA Office of the Inspector General (OIG) to determine if Secretary Perdue’s proposal follows established procedures for relocations. The OIG report could slow or even stop the reorganization proposal but won’t be complete until at least June 2019. In the meantime, Secretary Perdue has clearly signaled that he plans to ignore Congress entirely. Is this how we want a presidential administration to behave?

ERS and NIFA employees have taken matters into their own hands to push back against the relocation. They recently began the process of forming a union, as reported by Politico last week. A representative for the American Federation of Government Employees noted in the article that “morale has been destroyed at the agency.” We at UCS know a thing or two about what can happen when concerned scientists band together to take action (we’ve been here 50 years and are still going strong).

What can be done to stop this runaway train?

If carried through, this proposal would diminish the American food and farm system by harming public investment in science-based solutions at a time when farmers and ranchers need it most. It would also further deteriorate the morale of the talented staff throughout USDA. Moreover, the proposal would set a dangerous precedent for the types of actions presidential administrations can take to silence government scientists whose research happens to conflict with political agendas.

Thankfully, there’s still time to stop it. And there are concrete ways to do so.

In March 2019, 108 organizations wrote to Congress asking it to stop the restructuring through the 2020 appropriations process. If Congress includes language in the 2020 Agriculture Appropriations bill prohibiting the use of any funds for the reorganization, the proposal would be stopped dead in its tracks. While many members of Congress have already expressed support for this approach, they need to hear from their constituents on this issue to compel them to act.


