UCS Blog - The Equation (text only)

Winds and Wildfires in California: 4 Factors to Watch that Increase Danger

Photo: Bob Dass/Flickr

Before we dive into the science behind the four factors specific to the California Santa Ana winds, let’s review the current situation in California and wildfire disaster risks in general.

California wildfires November 2018

Scenes of fiery devastation are heartbreaking to see unfolding in the news and social media. As of this writing on November 15, 2018, there are a dozen active wildfires in California. Communities have been badly burned, some to the very foundations with scarcely a structure left standing. Thousands have evacuated, and more than 60 people have lost their lives, some on foot, some trapped in fleeing cars; hundreds of people remain unaccounted for, with families worried sick about them. The Camp Fire is now the deadliest in California history.

Thousands of firefighters are battling to contain wildfires in the state. At this writing, the deadly Camp Fire, near Chico in northern California, has burned 141,000 acres and is 40% contained, with 9,700 residences and 290 commercial structures destroyed.  In southern California, the Woolsey Fire in Los Angeles and Ventura Counties has burned 98,362 acres and is 62% contained, with an estimated 504 structures destroyed and 96 damaged.

Wildfires disaster risks

Let us briefly review wildfire disaster risks in general. First, wildfires can be ignited by natural causes such as lightning strikes or by various human activities at the wildland urban interface (WUI), such as when a lit cigarette is tossed or when electricity infrastructure fails.  Power lines may be implicated in the tragic November 2018 Camp Fire. According to California data from 2007 to 2016, around 5% of wildfire ignitions were caused by power lines, and those fires accounted for around 11% of the acres burned.

Camp Fire smoke across portions of Northern California. NOAA-20 satellite image November 8, 2018 at 8:40 p.m. Pacific Time. Source: NESDIS/NOAA

Fire suppression, which has cost the California Department of Forestry and Fire Protection (CAL FIRE) an estimated average of $554 million per year for the past five years, leaves more vegetation intact. In many places, more vegetation, also referred to as ‘fuel,’ is available to burn for the next fire.

In the western US, vegetation is more flammable than it was in the 1970s, with around half of the vegetation drying attributed to human-caused climate change.  Hence, when fires strike in the western US today, they are more likely to burn parched vegetation that serves as a tinder box, fueling more severe wildfires than in the 1970s.

Tragically, when wildfires encounter homes, schools, and businesses, the danger extends beyond the structures at the site of the fire.  Smoke from wildfires can be lethal locally and a public health hazard when transported far downwind.  Wildfire smoke disproportionately increases health risks for children and for people with heart or lung disease.  Among the top 20 deadliest California wildfires from 1933 through November 2018, nearly a third occurred in 2017 and 2018.  We can and must do better to protect lives from the risks posed by dangerous wildfires.

Santa Ana winds

Santa Ana winds in California, sometimes referred to as Diablo winds in the San Francisco Bay area downwind of Mount Diablo, can promote ignition and rapid spread of wildfires by drying vegetation and fanning the flames of fires once they are started. The Santa Ana winds dry out soils, trees and other vegetation much like a clothes dryer does a pair of jeans. Like an efficient dryer, Santa Ana winds increase both airflow and temperature to speed up evaporation of water. But instead of leaving behind freshly fluffed jeans, these winds suck out moisture and prime ecosystems to burn.

Santa Ana influenced fires, which occur between October and April, are different from warm- and dry-season fires, which typically occur between June and September. Scientists have identified the main reasons why Santa Ana influenced fires account for the vast majority of cumulative economic losses in California compared with the wildfires that typically occur in summer.  From 1990 to 2009, Santa Ana influenced fires spread three times faster, occurred closer to urban areas, and burned into areas with greater housing values. Over the same years, other fires often occurred in higher-elevation forests, were more sensitive to how old the vegetation was, lasted for extended periods, and accounted for 70% of total suppression costs.  In other words, other fires burned in remote forests, often with plenty of mature vegetation or ‘fuel’ for long-lasting wildfires, while Santa Ana influenced fires scorched with greater speed through areas that were typically closer to more people.

Factors to watch for to protect communities from fires exacerbated by Santa Ana winds

Santa Ana winds have a name because they are naturally occurring seasonal winds that typically peak during the autumn.  The season bridges the end of California’s typically hot, dry summer and the start of its typically rainy winter.  California oscillates between wet years and dry years, with dry years outpacing wet years. Drought years increase the risk of desiccated soils and parched vegetation that form ready fuel for wildfires. When these conditions coincide with Santa Ana winds, they can influence the severity of wildfires.  Four factors influence the severity of the winds during the autumn-to-winter season in California; they are worth monitoring so that timely and effective warnings can be provided to the public at risk.

Factor 1: Pressure difference between high-elevation basins of western North America and the Pacific coast

During the autumn season, a weather pattern can set up that is favorable for the occurrence of Santa Ana winds.  A high-pressure pattern predominates over western North America while a low-pressure pattern predominates over the Pacific Ocean.  As a result, the predominant flow of air is from the interior basins, places like Nevada, toward the Pacific Ocean.  When this weather pattern sets up, pay close attention to local meteorological reports and fire warnings.  This weather pattern is similar to turning on the clothes dryer.

Weather pattern favorable to Santa Ana Winds

700-hPa height anomalies for 25 October 2003, the ignition date for the Cedar Fire near San Diego, CA. Source: Westerling et al., 2004 EOS

Factor 2: Temperature

The air starts out cool, reflecting autumn conditions in the western US.  It encounters California’s mountain ranges, flows through high-elevation mountain passes, and heads downslope. The air compresses, both because it is constricted as it flows through narrow canyons and because atmospheric pressure is higher at lower elevations (see figure).  The molecules of the air parcel are now closer together, bump into each other more often, and have higher kinetic energy. Therefore, as a parcel of air gets compressed, its temperature increases.  This is similar to setting the clothes dryer temperature to warm or hot.
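
To put rough numbers on this compression warming, here is a minimal back-of-the-envelope sketch in Python. It assumes the descending air warms dry adiabatically (no condensation) at roughly 9.8 degrees Celsius per kilometer of descent; the starting temperature and the elevation drop are illustrative values, not measurements from any particular event.

```python
# Back-of-the-envelope estimate of how much a descending air parcel warms.
# Assumption: dry adiabatic descent, warming ~9.8 degC per km of descent.

DRY_ADIABATIC_LAPSE_RATE_C_PER_KM = 9.8  # approximately g / c_p for dry air


def warmed_temperature(start_temp_c: float, descent_km: float) -> float:
    """Temperature after descending `descent_km` kilometers dry adiabatically."""
    return start_temp_c + DRY_ADIABATIC_LAPSE_RATE_C_PER_KM * descent_km


if __name__ == "__main__":
    # Illustrative numbers: cool 10 degC basin air descending about 1.5 km
    # toward the coastal plain.
    print(f"{warmed_temperature(10.0, 1.5):.1f} degC after descent")  # ~24.7 degC
```

Under these illustrative assumptions, compression alone can turn cool interior air into a warm wind by the time it reaches the coast.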

Factor 3: How dry the air is

Atmospheric pressure

Red columns indicate atmospheric pressure. Higher atmospheric pressure at sea level than on top of a mountain. Source: NASA GISS

Santa Ana winds become drier during their journey downslope. Why does this happen? An air parcel that starts out cool can hold only a small amount of water vapor.  If that same air parcel warms, as described above, it can hold more water vapor, and so its relative humidity drops. Put another way, for the same amount of water vapor, the relative humidity is higher in cool air than in warm air.  Hence, the Santa Ana winds become drier and can increase water loss from vegetation that is often already quite dry during drought years or at the end of a summer season.  This is like shifting the auto-dry level from less dry to more dry on a clothes dryer.
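
The relative humidity drop can also be sketched in a few lines of code. The example below uses a Magnus-type approximation for saturation vapor pressure and assumes the parcel’s actual water vapor content stays fixed while the air warms; the temperatures and starting humidity are illustrative, not observations.

```python
import math


def saturation_vapor_pressure_hpa(temp_c: float) -> float:
    """Magnus-type approximation of saturation vapor pressure (hPa)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))


def rh_after_warming(initial_temp_c: float, initial_rh_pct: float, final_temp_c: float) -> float:
    """Relative humidity after warming, holding the water vapor content fixed."""
    vapor_pressure = saturation_vapor_pressure_hpa(initial_temp_c) * initial_rh_pct / 100.0
    return 100.0 * vapor_pressure / saturation_vapor_pressure_hpa(final_temp_c)


if __name__ == "__main__":
    # Illustrative numbers: air at 10 degC and 60% RH warmed to 25 degC by descent.
    print(f"{rh_after_warming(10.0, 60.0, 25.0):.0f}% RH after warming")  # ~23%
```

The same amount of moisture in warmer air yields a much lower relative humidity, which is why the arriving wind is so effective at pulling water out of vegetation.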

Factor 4: Speed of the wind

Another consequence occurs when the air flowing through California’s coastal ranges is constricted through narrow mountain passes and flows down through the canyons.  This constriction increases the wind speed above its initial value before passing through the narrow mountain topography, similar to shifting the cycle from “delicate” to “heavy duty” on a clothes dryer.  Santa Ana winds reach speeds of 40-60 kilometers/hour (25-37 miles/hour), or in extreme cases over 100 km/hr (62 mi/hr).  It can be extremely difficult to outrun such winds or to drive away along canyon roads shared with neighbors who may be fleeing at the same time. Such winds can help fires grow rapidly, spread quickly, and become deadly.
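
A very rough way to see why a constriction speeds up the flow is mass conservation: if roughly the same volume of air per second must squeeze through a smaller cross-section, it has to move faster. The sketch below is a cartoon under that assumption; it ignores friction, compressibility, and real canyon geometry, and the numbers are purely illustrative.

```python
# Cartoon of flow acceleration through a constriction using mass conservation:
# A1 * v1 = A2 * v2 for (roughly) incompressible flow, so a smaller
# cross-sectional area means a proportionally higher wind speed.


def constricted_speed(upstream_speed_kmh: float, upstream_area: float, canyon_area: float) -> float:
    """Wind speed after the flow squeezes into a smaller cross-section (areas in the same units)."""
    return upstream_speed_kmh * upstream_area / canyon_area


if __name__ == "__main__":
    # Illustrative numbers: a 20 km/h flow forced through a pass one-third the area.
    print(f"{constricted_speed(20.0, 3.0, 1.0):.0f} km/h through the pass")  # 60 km/h
```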

Santa Ana winds are a natural seasonal occurrence.  Scientists are studying the consequences of climate change and how warmer background conditions interact with the four factors described above.   The findings have the potential to better inform advance warnings for populations and first responders confronting the risks of Santa Ana wind influenced wildfires.

Seven Things People Got Wrong with UCS’ ‘Nuclear Power Dilemma’ Report

On November 8, UCS released The Nuclear Power Dilemma: Declining Profits, Plant Closures, and the Threat of Rising Carbon Emissions, which found that more than one-third of existing nuclear plants, representing 22 percent of total US nuclear capacity, are uneconomic or slated to close over the next decade. We found that, without new policies, if these and other marginally economic nuclear plants are closed before their operating licenses expire, the electricity would be replaced primarily with natural gas. If this occurs, cumulative carbon emissions from the US power sector could rise by as much as 6 percent at a time when we need to achieve deep cuts in emissions to limit the worst impacts of climate change.

Unfortunately, some of the media coverage and statements by the nuclear industry and other groups have mischaracterized our report and our past work. Here are seven points to correct the record:

1. The report does not promote new nuclear power plant construction.

Our analysis is focused on the economic viability of existing nuclear power plants in the United States through 2035. The cost of keeping existing plants operating is considerably less than building new ones. While new nuclear plants could be built under a national carbon price or low-carbon electricity standard, our modeling shows they are too expensive compared to new wind and solar projects, energy efficiency programs, and natural gas plants with carbon capture and storage.

The only new nuclear reactors included in our analysis are the two currently under construction at the Vogtle plant in Georgia. Their cost has ballooned to more than $27 billion, which is double the estimate approved by regulators in 2008, and the project is more than five years behind schedule. This 2012 UCS analysis shows that building the two new Vogtle reactors would be more expensive than other alternatives. And the Vogtle reactors’ cost has escalated significantly over the past six years, while the cost for wind and solar has fallen dramatically.

This isn’t the first time UCS has shined a spotlight on the high costs of building new nuclear reactors. This 2016 UCS power sector deep decarbonization study found that nearly all nuclear and coal plants in the United States would be replaced by low-carbon technologies by 2050 under every scenario, except our “optimistic nuclear case.”  A blog I wrote in 2013 explains why calls by some climate scientists to build new nuclear plants are misguided.

2. The report does not advocate for subsidies for any specific nuclear plants.

The report emphasizes that a price on carbon or a low-carbon electricity standard (LCES) would be the best options for internalizing the costs of climate change in the price of burning fossil fuels and providing a level playing field for all low-carbon technologies. As explained by UCS President Ken Kimmell in his recent blog, “the report does not argue for subsidies to any specific plants. That case will have to be made in state-specific forums. Should states decide to support nuclear power plant subsidies, our report calls for them to be temporary and subject to periodic reassessment. Companies seeking subsidies must open their books and allow the public and regulators to make sure that the subsidies are needed and cost-effective, and that the same level of carbon free power cannot be provided during the relevant time period with less costly options.” Any subsidies also must be part of a broader strategy to reduce carbon emissions that increases investments in renewables and efficiency.

Finally, our report makes clear that UCS would never support financial assistance that is also tied to subsidizing fossil-based energy sources, such as Trump administration proposals to bail out coal and nuclear plants based on spurious grid-reliability and national-security grounds.

3. Existing nuclear plants must also meet strong safety standards to be eligible for support.

Since the 1970s, UCS has been a leading nuclear safety watchdog. The new UCS report recommends that nuclear reactors must meet or exceed the highest safety standards under the Nuclear Regulatory Commission’s (NRC) Reactor Oversight Process to be eligible for any policy or financial support. If the NRC weakens these standards, as proposed by the nuclear industry, UCS could no longer support this recommendation. At the same time, UCS will continue to push for better enforcement of existing regulations, the expedited transfer of nuclear waste from overcrowded cooling pools to safer dry cask storage, strengthened reactor security requirements, and higher safety standards for new plants. We also consider the NRC safety standards to be a floor, not a ceiling. States could encourage plant owners to make other safety improvements that go beyond current NRC standards.

4. Not every currently operating nuclear plant should stay open.

The report highlights examples where it might make sense to shut down existing nuclear plants that are saddled with major, recurring safety issues, such as the Pilgrim plant in Massachusetts that Entergy is closing next year and the Davis-Besse plant in Ohio that FirstEnergy is threatening to close in 2020 if it doesn’t receive subsidies. Other examples include Indian Point, due to its proximity to New York City, and Diablo Canyon, which is located near earthquake fault lines in California.

It also might make sense to shut down plants with high operating costs or ones that need to make major new capital investments to continue operating safely. Examples cited in the report include Crystal River in Florida and San Onofre in California, which were retired in 2013 following failed steam generator replacements. Fort Calhoun in Nebraska shut down in 2016 primarily for economic reasons following several years of extended outages and flood damage. Chris Crane, CEO of Exelon, agrees that some high-cost plants should probably close: “I will be the first one to tell you that some of the nuclear plants are small, uneconomic and they won’t make it and they probably should not make it,” he said. “Let’s not save every one.”

5. Not every nuclear plant that retires early will be replaced with fossil fuels.

The report acknowledges that with sufficient planning and strong climate and clean energy policies, some existing nuclear plants can be replaced with renewables, energy efficiency, or other low-carbon technologies. For example, California passed legislation in September that commits the state to replace Diablo Canyon with zero-carbon energy sources by 2025. And states experiencing rapid wind and solar power deployment such as Iowa, Nebraska, Kansas, and Texas could potentially replace their nuclear plants with low-carbon energy sources over a reasonable period of time. However, a significant portion of the electricity in most of those states is still generated by coal and natural gas. Replacing those fuels with renewables and efficiency would result in much greater emissions reductions than replacing nuclear plants, another low-carbon source of electricity.

6. UCS has long recognized the role of existing nuclear plants in reducing carbon emissions.

UCS has long supported keeping existing nuclear reactors that meet high safety standards operating to combat climate change. In 2004, the director of our energy program at the time, Alan Nogee, stated: “We cannot phase out current nuclear generation quickly, especially without [a] significant increase in carbon emissions.” Five years later, we released our “Climate 2030 Blueprint,” which assumed the fleet of more than 100 US reactors would continue to operate through 2030 and beyond. You will find in the report’s executive summary: “Hydropower and nuclear power continue to play important roles, generating slightly more carbon-free electricity in 2030 than they do today.”

US Electricity Generation under the UCS Climate 2030 Blueprint

Two years ago we posted a “Nuclear Power and Global Warming” page on our website, highlighting the need for all low-carbon technologies, including nuclear power, to limit the worst consequences of climate change. The web page also warns that replacing existing nuclear power plants with natural gas plants would increase carbon emissions.

In 2016, UCS was involved in negotiations in Illinois to keep two uneconomic nuclear plants running, while strengthening the state’s renewable energy and energy efficiency standards. We posted the following blogs on the topic: “A Huge Success in Illinois: Future Energy Jobs Bill Signed Into Law,” “The Future Energy Jobs Bill: Promise, Pitfalls, and Opportunities for Clean Energy in Illinois,” and “New Analysis Shows Fixing Illinois Clean Energy Policies Is Essential to Any ‘Next Generation Energy Plan.’”

7. UCS has long supported a low carbon electricity standard (LCES), but not at the expense of renewable electricity standards (RES).

Since at least 2011, UCS has engaged in constructive dialogues and provided support for LCES proposals. See here, here, here, and here. More recently, UCS advocated for the 100 percent zero-carbon electricity standard in California that was signed into law in September.

While an LCES could be effective at preserving existing nuclear generation and increasing the deployment of renewable energy and other low-carbon technologies, our position has remained consistent (including in our new report): we do not recommend replacing state RESs with broader LCESs. Renewable standards have been effective at reducing emissions, driving down the cost of wind and solar, and creating jobs and other economic benefits for states and rural communities. They have also been affordable for consumers. Including existing nuclear power plants in state renewable standards could significantly undermine the development of new renewables and all the benefits that go along with them.

We recommend including existing nuclear in a separate tier of an LCES, as New York state has done, to limit costs to ratepayers and avoid market-power issues due to limited competition among a small number of large plants and owners. New York also has combined an LCES with a zero-energy credit program to provide financial support only to existing nuclear plants that need it, adjusting support as market conditions change. New technologies would be eligible to compete in the existing tier to help ensure that the most cost-effective, low-carbon energy sources replace any retiring nuclear plants. Illinois and New Jersey also strengthened their renewable standards while providing separate financial support for distressed nuclear plants.

And finally, despite reporting to the contrary, UCS has not changed its position on nuclear power. Has UCS advocated vigorously for policies to increase the deployment of renewable energy to address climate change? Absolutely. Have we been a longstanding watchdog for nuclear power safety? You bet. Do we now believe the Nuclear Regulatory Commission (NRC) is an effective watchdog or that nuclear power safety concerns are overblown? Emphatically no.

But UCS has long recognized that the current nuclear fleet is a significant source of low-carbon power and that nuclear plants should not retire precipitously without carbon-free replacements. As cited above, my former colleague Alan Nogee tweeted a slide from 2004 showing that UCS grappled with just this point more than a decade ago.

As Congress Revives its Oversight Responsibilities, Science Should Be on the Agenda

The midterms brought checks and balances to Washington, complete with new opportunities for accountability and oversight, and some members of Congress have already signaled that science will be on the agenda. Today, a diverse set of environmental, public health, and good government organizations released a report outlining what Congress can do to address recent actions that sideline science from policymaking. Contributing and endorsing organizations are listed below.

Accountability for political interference in science can come through congressional oversight. It’s now up to Congress to choose what topics are most ripe.

We know that oversight works. Political appointees during the George W. Bush presidency rewrote scientific reports, compromised science advisory committees, and threatened scientists across a wide variety of issues. As a result of oversight, appointees resigned and scientific analysis was made right. To support this process, it was tremendously useful for civil society to come together to identify patterns and build public awareness about the public harm caused by attacks on science.

The report “describes new and ongoing threats to the communication of science and its use in public health and environmental decisions,” and recommends steps Congress can take in response, from exposing abuses of scientific integrity to holding appointees accountable to passing good government laws. Issues addressed include:

  • Politicization of science within agencies
  • Threats to scientific advisory committees and science advice
  • Unqualified and conflicted government leaders
  • Constraints on the communication of science
  • Whistleblowing and scientific integrity
  • Low-information approaches to enforcement of existing public health and environmental laws

I’m thrilled to see so many respected organizations coming together around common themes: scientific advice is essential to public health and wellbeing; attacks on science and scientists decrease faith in the institutions that are designed to keep us safe; and the sidelining of scientists and science advice deserves to be scrutinized and reversed.

It’s so impressive that all of these organizations with disparate interests have come together because they recognize the harm Trump administration actions have had on topics as diverse as workplace injuries, reproductive health, the Census, chemical contamination, tipped workers, endangered species, climate change, and air pollution.

The report recognizes that political interference in science is a constant temptation for policymakers—and that recently, that interference has become more sustained and pervasive. This highlights the need for better systems and protections that strengthen the role of science in policymaking.

The recommendations are intentionally broad so that any public interest organization can use them as a guide when they talk with congressional offices about more specific oversight recommendations that are relevant to their areas of expertise. I hope others who see value in the role of science in policymaking will use this report to inform their work to protect public health and the environment.

Jurisdiction over the federal scientific enterprise falls to several House committees, including the House Energy and Commerce Committee, the House Natural Resources Committee, and the Committee on Science, Space, and Technology. The ball is now in their court to conduct fair oversight of the federal scientific enterprise and slow down the most egregious attempts to make evidence-free public health and environmental policy.

The findings and recommendations in this report have been endorsed by the following organizations. Contributors to the report are identified with an asterisk.

  • Climate Science Legal Defense Fund*
  • Defenders of Wildlife
  • Democracy Forward*
  • Environmental Integrity Project*
  • Environmental Protection Network*
  • Government Accountability Project*
  • Greenpeace*
  • Jacobs Institute of Women’s Health*
  • National Center for Health Research
  • National Federation of Federal Employees*
  • National LGBTQ Task Force
  • National Partnership for Women & Families*
  • National Women’s Health Network
  • Power to Decide*
  • Project on Government Oversight*
  • Union of Concerned Scientists*

7 Questions the Senate Should Ask Trump’s New USDA Chief Scientist Nominee

The Senate Agriculture, Nutrition and Forestry Committee hears testimony at the confirmation hearing of Agriculture Secretary-nominee Sonny Perdue, March 23, 2017. USDA Photo by Preston Keres.

Back in early August (or roughly two Trump years ago), I wrote about the president’s nomination of Scott Hutchins to head up science at the US Department of Agriculture. In that post, I argued that Hutchins, an entomologist with a 30-year career at pesticide-maker Dow, is the wrong choice for the job.

On November 28, the Senate agriculture committee will hold a confirmation hearing for Hutchins, their chance to interview him for the position of USDA under secretary for research, education, and economics. Following are seven questions I think they should ask.

1. As chief scientist, would you push back on efforts to cut, marginalize, and politicize USDA research? The position Hutchins is seeking has, until now, overseen four agencies that make up the USDA’s Research, Education, and Economics (REE) mission area, which collectively carry out or facilitate nearly $3 billion worth of research on food and agriculture topics every year. But in August, Secretary Perdue dropped a bombshell with an abrupt reorganization proposal that would pluck the Economic Research Service (ERS) figuratively from within REE and place it in the Secretary’s office. Perdue’s announcement also included a plan to literally move ERS, along with the National Institute of Food and Agriculture (NIFA), to as-yet-undetermined locations outside the DC area. More than 1,100 of Hutchins’ fellow scientists recently signed a letter opposing the move, which threatens to marginalize and politicize the agencies, and would cost millions of dollars that could otherwise be spent on their important science work. And even before the reorganization proposal, the Trump administration had been gunning for ERS in particular—its FY2019 budget request, unveiled in February, would have cut the agency’s budget in half. What would Hutchins do to push back on these anti-science moves?

2. Would you champion independent economic analysis at the USDA that helps policymakers and the public understand the economic impact of taxpayer investments and federal policies…even when it doesn’t support the administration’s political agenda? ERS—the agency that Perdue and the White House seem most determined to muzzle—plays an important role in building from data collected by the USDA to illuminate the socio-economics of food and agriculture. US farms and farmers are impacted not only by market forces of supply and demand, but also by the sometimes-unexpected consequences of public policies. ERS has a history of publishing reports that examine policy implications and overarching trends, bringing life to complex but critical data; examples include recent reports on consolidation of agriculture, public research funding, and food availability and dietary trends. The Trump White House may not always appreciate ERS findings, but maintaining unvarnished, independent analysis on a wide range of topics is particularly important in this era of low crop prices and escalating trade tensions. As chief scientist, would Hutchins go to bat for such research?

3. As a scientist, are you concerned about the administration’s science record? Why or why not? And if you’re confirmed, how will you ensure scientific integrity at the USDA? The Trump administration’s record on respecting science in federal decision-making is abysmal. And while the USDA hasn’t seen the same level of attacks on science as, say, Ryan Zinke’s Interior Department, our 2018 survey of USDA scientists shows there is cause for concern. What would Hutchins do to swim against this administration’s tide and maintain a high standard of scientific integrity at the USDA? Will he commit to uphold the department’s scientific integrity policy, and to resist politically-motivated moves that would undercut the ability of the thousands of USDA scientists under his purview to do their vitally important jobs?

4. How would your pesticide industry ties affect USDA efforts to help farmers reduce their dependence on expensive and dangerous chemical inputs? Do you think that’s an important goal? Why or why not? These are increasingly important questions for the Senate to ask, as problems brought on by decades of over-reliance on pesticides come home to roost. In August, for example, a court ordered the EPA to ban all remaining agricultural uses of Dow’s brain-damaging insecticide chlorpyrifos (which former EPA Administrator Scott Pruitt had refused to do the year before). Days later, a San Francisco jury handed down a $289 million judgment against Monsanto in the case of a former school groundskeeper who developed terminal cancer after years of spraying the company’s popular Roundup herbicide. Then there’s the ongoing dicamba debacle: Another widely-used weedkiller, dicamba has many farmers in a bind because of its propensity to drift from fields of soybeans and cotton specifically engineered to resist it, and damage neighboring farmers’ non-resistant crops. By the middle of this year’s growing season, weed scientists had estimated that well over a million acres of soybeans—at least 1.2 percent of the entire US crop for the year—had already suffered drift damage. Hutchins’ longtime employer DowDuPont is one of several companies that sells dicamba and dicamba-resistant seed and has been named in an ongoing lawsuit over drift damage. Clearly, farmers (and eaters) need safer, more sustainable solutions, but the Trump administration is moving in the opposite direction—the White House budget request earlier this year would have eliminated the Organic Agriculture Research & Extension Initiative and cut the Sustainable Agriculture Research & Education Program by nearly 30 percent. As a pesticide industry insider, would Hutchins be able to rise above the interests of his long-term employers and colleagues, and to consider instead the larger public interest to be served through investment in these valuable research and education programs?

5. Do you accept the science of climate change, and would you increase support for the evidence-based tools farmers need to build resilience to a warmer, more volatile climate? Many Trump administration officials—including the president himself and Secretary Perdue—have scoffed at the science of climate change. But when it comes to farmers and our food supply, inaction on climate change is just not an option. What does Hutchins think about the contribution of soil health to climate resilience and productivity? As chief scientist, would he stand behind USDA investments in research, education, and extension to help farmers and ranchers better cope with our changing climate?

6. What scientific and economic research would help policymakers better understand and improve the Supplemental Nutrition Assistance Program (SNAP)? Formerly known as the food stamps program, SNAP has been at the center of controversy in the farm bill, and it’s under attack by the Trump administration. But in 2014, this program lifted an estimated 4.7 million people out of poverty, including 2.1 million children, and abundant data show that SNAP is a smart investment in the nation’s health and well-being and a boon to local economies. As chief scientist, what research would Hutchins prioritize to improve understanding of the program’s benefits and how it could be improved to better serve public health and Americans still struggling economically?

7. How would you help ensure that the next update of federal dietary guidelines is based on the best science? The USDA and the Department of Health and Human Services recently embarked on a two-year process to update the Dietary Guidelines for Americans. My colleague Sarah Reinhardt recently wrote about what we expect from that process—including our concerns that it will be particularly vulnerable, under the Trump administration, to influence by food industry interests. Hutchins would not be directly responsible for the process—it will be run out of the USDA’s Food and Nutrition Service, which Secretary Perdue reorganized last year, and which is led by a Trump appointee with zero nutrition background. But as chief scientist, Hutchins could play an important role. By prioritizing USDA nutrition research over the next two years to inform the process, he could help ensure that the 2020 DGAs are based on sound science in the interest of public health. Will he?

A Stealth Move to Undermine Science at the US Department of Agriculture

A team of scientists gathers data for a NIFA research project. Photo: USDA/CC BY 2.0 (Flickr)

In its latest scheme to undermine science, the Trump administration is brazenly trying to—pun intended—farm out to the hinterlands the most important research arms of the Department of Agriculture.

When Secretary Sonny Perdue recently boasted that 136 entities in 35 states are vying for the relocation of the Economic Research Service (ERS) and the National Institute of Food and Agriculture (NIFA), his press release claimed that the move would place scientists closer to many “stakeholders” who live and work far from Washington, DC, would give “significant savings on employment costs,” and would “improve USDA’s ability to attract and retain highly qualified staff with training and interests in agriculture.”

It sounds benign enough, but the rhetoric of moving these divisions closer to farming “stakeholders” purposely masks the likely damage to the far bigger world of stakeholders—the American people. The truth is, the move by Perdue and the Trump administration will further disconnect the perspective and expertise of USDA scientists from direct contact with policymaking on Capitol Hill.

Key data and research agencies

The 57-year-old ERS is no household acronym, but it is the principal agency that scours data on the impact of agricultural practices on the environment. It studies nutrition, food safety and food access for the poor, employment in rural economies, and the pros and cons of international agricultural trade proposals and regulations.

NIFA, created in the 2008 Farm Bill, funds research and programs that guide policymakers on improving nutrition and food safety, promoting sustainable agriculture, and keeping American agriculture competitive at a global level.

The data collected and questions explored by these agencies cross-pollinate in Washington, DC with research from the 12 other federal statistical agencies to help Americans understand economic trends and realities in our nation’s urban, suburban, and rural populations. Here’s the rub: these agencies’ collaborative and impartial search for facts is often at odds with the skewed and sometimes false narratives of lobbyists and politicians, such as in pleas for farm subsidies and stereotypes about how low-income mothers abuse food assistance benefits.

Widespread opposition

The Trump administration wants to break up this science-based collaboration, which runs parallel to its more highly publicized efforts to defang science in the Environmental Protection Agency and the Interior Department. Not only that, but the move by Perdue follows an effort earlier this year to cut the ERS budget in half—a request Congress rightfully dismissed. And this latest attempt to hamstring both ERS and NIFA has similarly drawn the ire of leading scientists around the nation. More than 1,100 of them signed a letter, coordinated by the Union of Concerned Scientists, urging key Senate and House agriculture committee members to block the move of the ERS and NIFA.

“The world class research carried out through NIFA and ERS comprises part of the science-based bedrock of our food and farm system,” that letter explains. “It empowers producers, businesses, and decisionmakers across the country with the accurate, unbiased data they rely on every day.”

In a stunning display of how seriously this professional community takes the proposed move, 56 former senior administration officials and heads of statistical agencies wrote a similar letter to congressional leaders. The signatories include Susan Offutt, who ran ERS from 1996 to 2006, under both the Clinton and George W. Bush administrations; her successor Katherine Smith Evans; and former leaders from the Census Bureau, the Office of Management and Budget, the Internal Revenue Service, the Energy Information Administration, the Bureau of Justice Statistics, the Bureau of Labor Statistics, and the National Center for Health Statistics. They say the move “jeopardizes” the independence of federal data gathering by increasing “the potential for interference in the direction, design, analysis and release of studies and reports.”

Fears of political interference

Interference should be inconceivable when public health is at stake, as with food safety. For instance, ERS studies whether salmonella testing programs on poultry are effective. A report last year concluded, based on the evidence, that tougher and clearer federal regulations reduced salmonella contamination.

Top critics of the relocation proposal see it as a form of interference. Smith Evans, who ran ERS from 2007 to 2011, under both the George W. Bush and Obama administrations, told me that her former agency “will be decimated. It will not be able to hire the best and the brightest and compete for skilled people if it is relocated in an isolated area.”

Those fears are gaining political traction on Capitol Hill as the USDA Inspector General is now reviewing the proposed move at the request of Representatives Eleanor Holmes Norton (D-District of Columbia) and Steny Hoyer (D-Maryland). Equally concerning is the fact that Perdue has also proposed to reorganize ERS out from under the Office of Research, Education and Economics and into the Office of the USDA’s Chief Economist. Since the chief economist reports directly to Perdue, many worry the shift will further compromise the agency’s independence.

Steve Gliessman, an emeritus professor and founding director of agroecology at the University of California Santa Cruz, is one of many scientists concerned about the proposal. From 2004 to 2008, Gliessman used a NIFA grant to improve the technique for growing strawberries organically. His research demonstrated that strawberries could be grown without pesticides by rotating cover crops that did not play host to diseases that bedevil the berries. Such techniques helped increase acreage for organic strawberries from 134 acres two decades ago to 4,000 today.

Gliessman says he fears that moving NIFA to a more rural state gives big producers more opportunity to influence the direction of research while their lobbyists remain in DC to work over the politicians. Between the two, he worries that advocates for more sustainable, diversified and safer food production will be drowned out.

“I see this resulting as a return to a focus on production and profit rather than a deep understanding of the ecological and social impacts of that mode,” he said. “We’ve seen more than enough of the unsustainable nature of the industrial food model. We really should be moving toward an ecological model.”

A wealth of research and data at stake

The fact is, ERS data and NIFA research have shown why the nation should be moving toward an ecological model of agriculture and a more nutritious food system. ERS has shown that conservation compliance programs, which tie more sustainable agricultural practices to eligibility for federal price supports and relief, work to reduce erosion. It has shown that programs that pay farmers a rental fee to take millions of acres a year out of production work to reduce erosion and pollution, restore wildlife, and diversify rural economies through recreation.

Analysts at the Union of Concerned Scientists built upon ERS research to show that measures to reduce fertilizer pollution in the Corn Belt could save taxpayers, farmers and businesses $850 million a year, instead of costing the nation $157 billion in lost tourism, fishing, health care costs and water treatment. UCS has also found that sophisticated three-crop and four-crop rotations that preserve soil can lead to higher yields than two-crop rotations.

Among its other contributions, ERS developed the definition for US food deserts and a national atlas of low food access areas, giving the federal government and states specific geographic areas to target with programs, data that helped inform Obama-era healthy food initiatives and First Lady Michelle Obama’s Let’s Move program. An ERS report last year found that the percentage of low-income census tracts with large grocery stores or supercenters nearly doubled, mirroring the growth for moderate- and high-income census tracts. Another ERS report showed how the Supplemental Nutritional Assistance Program helps low-income individuals and families alleviate food insecurity while pointing out a myriad of social and educational challenges to securing the best nutrition.

Those successes are a reminder that you cannot have progress unless you have data as a reference point. “People should understand that it takes decades to assemble the data on issues like this,” Offutt says. “You just don’t go out and instantly collect data on grocery stores or how food is cooked and processed. Home waste is different than waste in restaurants. Food choices change. There’s no single university or institute that can put together the intellectual firepower necessary to get the entire picture as can the federal government.”

As Smith Evans put it, “ERS never says USDA must do this or that. We say: ‘Let’s lay out the facts, and the facts will inform without prescribing.’ That’s so hard for other institutions to do. It would seem we need that more than ever.”

Offutt added, “My philosophy at ERS and government research in general is that you want to look over the horizon and ask questions two, five, six, 10 years down the road. That’s an important function for a public agency. With this proposed move, I’m really concerned that longer-term view will be lost. If your biggest public agency isn’t doing this work, who will?”

Put another way, if your biggest public agency isn’t doing this long-term work, who is there to protect your food, your health and the environment? If Perdue is allowed to move ERS and NIFA out of the mainstream of federal data gathering, the answer from the USDA will likely be: no one.

Which Parks and Rec Character is FirstEnergy?

Source: Florian64190/Wikimedia Commons

FirstEnergy is at it again, begging this administration for a handout.

FirstEnergy is a large, investor-owned electric utility that operates in 13 different states. It operated a competitive generation subsidiary, FirstEnergy Solutions (FES), which is currently bankrupt. Recently it announced its intention to retire two coal-fired power plants; observers believe this was just an attempt to garner support for a bailout. The local grid operator (PJM) concluded the lights will stay on without these two plants.

Despite being a billion-dollar corporation, FirstEnergy acts a lot like an entitled teenager. Not satisfied with its allowance, it moved out, only to find out that the real world is tough.

Now, after racking up huge amounts of debt, the spoiled brat wants to move back in and live rent-free.

And get its allowance back.

And wants all of us to pay off its debt.

And the debt of all its friends.

A problematic history

Other analysts and watchdog groups have chronicled the dire economic straits FirstEnergy is facing and the dubious efforts it has engaged in to try to get out of its current predicament, which include:

(this list may not be comprehensive) 

In part, a self-inflicted problem

I’ve taken a comprehensive look at how coal-fired power plants operate in competitive energy markets, including PJM, where FES does business. Two of FES’s coal-fired power plants (Kyger Creek and W.H. Sammis) operated for long periods of time when it would have been cheaper to simply turn them off.

Over the past three years, FES ran Kyger Creek and W.H. Sammis in such a way that likely imposed nearly $90 million in unnecessary costs onto FES’s financial ledger. $90 million is a drop in the bucket compared to the billions of dollars FES is seeking to be bailed out, but it goes to show you how poorly run the company is.

What does all this have to do with Parks and Rec?

In many ways, FirstEnergy reminds me of the Parks and Rec character Mona-Lisa Saperstein (portrayed by the brilliant and amazingly talented, Jenny Slate).

FirstEnergy demands to be bailed out.

Ohio says, “No.”

FirstEnergy tells everyone that without a bailout coal will retire, and it will impact reliability or even national security.

And then lobbies again for a bailout.

Meanwhile, Buckeyes, consumer advocates, environmental advocates, grid experts and plenty of other folks think that FirstEnergy is…

Forensics, Justice, and the Case for Science-Based Decision Making

Photo: Lonpicman/Wikimedia Commons

Forensic science—and the language forensic scientists use to talk about their findings—has real-world impacts, sometimes life-or-death impacts, for real people. If the criminal justice system is going to really serve the cause of justice, it needs to be informed by the best available science. Unfortunately, the United States Department of Justice (DOJ) is ignoring scientific best practices, reversing progress toward improving forensic science in the US.

At the end of July 2018, the DOJ announced the release of eight new Uniform Language for Testimony and Reporting documents (ULTRs) at the annual meeting of the International Association for Identification. A ULTR is a document meant to ensure that all forensic practitioners from the same discipline in DOJ forensic science laboratories use the same language in reporting the results of their analyses to police, lawyers, judges, and juries. While a ULTR is only binding on DOJ laboratories, state and local laboratories often follow DOJ’s lead.

The Deputy Attorney General said at the meeting that these documents “meet the highest scientific and ethical standards.” But do they?

All nine of the ULTRs use what is sometimes described as a “categorical” reporting framework. This framework sorts all reports into a small number of categories. For example, the categorical framework for firearms evidence is:

  1. Source identification (i.e., identified)
  2. Source exclusion (i.e., excluded)
  3. Inconclusive

Categorical reporting has long been widely criticized because the artificial boundaries between the categories render the system prone to perverse cliff effects. A better way would be what might be called “continuous” reporting, in which the weight of the evidence is reported as it is, rather than by reference to its place in a relatively crude three-category framework.
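
To make the contrast concrete, here is a minimal sketch of what continuous, weight-of-evidence reporting can look like, using a likelihood ratio and Bayes’ rule. This is a generic probabilistic illustration with made-up numbers, not the DOJ’s framework or any laboratory’s validated model.

```python
# Continuous reporting illustrated with a likelihood ratio (LR): the analyst
# reports how much more probable the observations are if the two items share
# a source than if they do not, rather than collapsing the result into
# "identification" / "exclusion" / "inconclusive".
# All numbers below are illustrative only.


def posterior_probability(prior_prob: float, likelihood_ratio: float) -> float:
    """Update a prior probability of 'same source' with a likelihood ratio."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)


if __name__ == "__main__":
    # The same LR of 1,000 leads to very different conclusions depending on the
    # other evidence in the case -- information a categorical label discards.
    for prior in (0.01, 0.5):
        print(f"prior {prior:.2f} -> posterior {posterior_probability(prior, 1000.0):.3f}")
    # prior 0.01 -> posterior 0.910 ; prior 0.50 -> posterior 0.999
```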

Another criticism of categorical reporting is that it implies certainty, as in the firearms example above, in which the analyst would tell the jury “that two toolmarks originated from the same source.” Science doesn’t deal in certainties, and these ULTRs violate basic probabilistic reasoning. They are neither logical nor scientific. That very point was made in the public comments on the draft ULTRs by several commentators and in a recent report on latent print analysis by the American Association for the Advancement of Science (AAAS).

A discouraging omen

In April 2017, Attorney General Jeff Sessions shut down the National Commission on Forensic Science, a roughly 30-member advisory panel of scientists, forensic and non-forensic, and legal and law enforcement professionals. The Commission had been launched in 2013 after a 2009 report by the National Research Council, the official science advisor to the US Congress, found “serious deficiencies in the nation’s forensic science system” and called “for major reforms.” With the closing of the Commission, the DOJ turned its forensic reform effort over to the Forensic Science Working Group, the current publisher of the ULTRs.

Given that the ULTRs are the first official documents produced by the Forensic Science Working Group as part of its “plans to advance forensic science,” these documents are a discouraging sign for a future in which forensic reform is driven by the DOJ. Since the ULTRs were supposed to “serve as a model for demonstrating” the DOJ’s “commitment to strengthening forensic science, now and in the future,” their flaws don’t bode well.

Not making sense

After stating that forensic experts should report that they know the source of a forensic trace, the ULTRs go on to make a number of statements that sound more uncertain. It might seem like the ULTRs are trying to tone down their claims of certainty, but the result is that they try to support reports of certainty with statements of uncertainty. That doesn’t make any sense.

It also seems like the ULTRs are suggesting that small probabilities can be rounded down to zero for the “consumer” of the evidence. But it is unclear why that would be a scientific, or a just, thing to do.

It is helpful that the ULTRs contain lists of statements that should not be said, such as “zero error rate” and “100% certain.” These statements were made for years, including by DOJ forensic analysts, and they have now been largely discredited. However, a lot of the “banned” statements are what I call “false concessions.” It appears that the DOJ is conceding something important, but in fact they are conceding little or nothing because analysts are still permitted to make statements that are logically equivalent to the banned statements.

Scientists, not just forensic scientists, can weigh in to protect the role of evidence

In recent years, some progress has been made toward recognizing the inherently probabilistic nature of all scientific evidence and seeking ways of communicating those probabilities to lay audiences. The ULTRs signal that the DOJ is not yet ready to join that effort. This is unfortunate, given the DOJ’s power and influence.

Scientists don’t need to know anything about forensic science to understand that categorical statements of certainty are not plausible. Any scientist can help by letting the DOJ know that their statements are not scientifically credible and that the opinions of individual scientists and scientific institutions should be taken seriously by the nation’s most important purveyor of justice.

Overstating the certainty of forensic evidence has been implicated in many miscarriages of justice. And it is scientifically wrong. The people who are the ultimate consumers of forensic evidence deserve better.

 

Simon A. Cole is a Professor at University of California, Irvine’s Department of Criminology, Law and Society. 

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Senate Should Reject Trump’s Coal-Friendly Energy Commission Nominee

Photo courtesy of Sen. Martin Heinrich

 

The steady parade of unqualified, ideologically driven appointees for key Trump administration positions has resumed now that things in Washington have settled down after the midterm elections. Last week, Trump tapped Matthew G. Whitaker to replace Attorney General Jeff Sessions. This Thursday, the Senate will hold a confirmation hearing for attorney Bernard McNamee, nominated to fill a vacancy at the five-member, presidentially appointed Federal Energy Regulatory Commission (FERC), a relatively obscure—but critically important—independent agency that oversees interstate power lines and pipelines.

Trump presumably picked McNamee to put the administration’s pro-fossil-fuel spin on a number of key decisions FERC will make in the coming months, especially one that would bail out uneconomic coal plants. If that happens, Americans will be saddled with higher electric bills, more toxic air pollution, and more heat-trapping emissions that cause climate change. The commission also will be considering rules that would encourage energy storage, rooftop solar installations, and remotely located renewable sources.

McNamee would replace Robert Powelson, a former utility executive and Pennsylvania utility regulator who left the commission in August after less than a year. One of the three Republicans on the commission, Powelson maintains that FERC should be insulated from political pressure. “I don’t make any decision based on the fact that I’m a lifelong Republican,” he told Energywire. “I have a mean independent streak in me.”

McNamee, who has no utility sector experience, is all about partisan politics. He worked for Republican attorneys general in Virginia and Texas and advised Republican Sens. George Allen and Ted Cruz before joining the Department of Energy (DOE) in May 2017 as deputy general counsel for energy policy.

Last February, he left DOE to work for the Texas Public Policy Foundation, a libertarian think tank funded by a rogues gallery of polluters, including Chevron, Devon Energy, ExxonMobil, Koch Industries and Luminant, the largest electric utility in Texas. It’s the same outfit that produced Trump’s unqualified—and rejected—nominee to head the White House Council on Environmental Quality, Kathleen Hartnett White.

While at TPPF, McNamee penned a paean to his favorite energy source for The Hill, a political trade publication, titled “This Earth Day, let’s accept the critical role that fossil fuel plays in energy needs.” “We have been told that fossil fuels are wrecking the environment and our health,” his April 17 column read. “The facts are that life expectancy, population and economic growth all began to increase dramatically when fossil fuels were harnessed….” Renewable energy sources, he added, cannot replace fossil fuels, but not to worry, “America is blessed with an abundant supply of affordable natural gas, oil and coal.”

McNamee rejoined DOE in June as the executive director of the agency’s policy office. Before and after his brief stint at TPPF, he promoted Energy Secretary Rick Perry’s proposal to require regional transmission operators to buy electricity from power plants that can store a 90-day fuel supply on site, ostensibly to strengthen electricity-grid resiliency. The plan, which would prop up coal and nuclear plants that have been struggling to compete on the open market with cheaper natural gas and renewables, would cost ratepayers an estimated $17 billion to $35 billion annually.

At Trump’s behest, Perry asked FERC in September 2017 to issue grid resiliency rules to protect failing coal and nuclear plants. FERC rejected the request, concluding that DOE did not provide any evidence that coal and nuclear plant retirements would undermine grid reliability. An analysis by Mid-Atlantic grid operator PJM of the impact of closing at-risk plants in its region also found no threat to the grid.

Besides trying to reverse FERC’s coal- and nuclear-power bailout decision, McNamee could do lasting damage in other ways. For example, the commission is currently not required to consider the impact of climate change when making electricity policy decisions, but the two Democratic commissioners think the “social cost of carbon”—the financial damage caused by carbon pollution—should be incorporated in environmental reviews for gas pipelines and other fossil fuel infrastructure. Likewise, the commission will be deliberating over whether it should eliminate barriers to electric energy storage, make it easier for solar panel owners to sell their excess power back to electric utilities, and recommend federal incentives for more transmission-line construction, which would enable remotely sited wind and solar projects to compete with natural gas. Given McNamee’s biases, it is unlikely he would support any of those initiatives.

This week’s confirmation hearing, hosted by the Senate Committee on Energy and Natural Resources, will be chaired by Sen. Lisa Murkowski, who is no stranger to the FERC confirmation drill and quite knowledgeable about the commission’s mandate. In her opening statement during a FERC commissioner confirmation hearing in 2013, Murkowski made a case for rejecting an Obama nominee that could be easily applied to McNamee.

“FERC is independent by law and by design. It is clearly distinct from executive agencies that carry out policy directives from the White House…,” she explained. “It is critically important for us to enable the agency—and its professional nonpartisan employees who report to the chairman as their CEO—to maintain its strong culture as an expert agency free of undue political influence.”

Murkowski should hew to that line on Thursday—and the Senate should reject the McNamee nomination.

Automakers propose loopholes, not rollbacks of cleaner car standards—both are terrible

Since word first leaked that the Administration was planning to freeze fuel economy and global warming emissions standards for passenger cars and trucks, automakers and their trade associations have been adamant about “not wanting a rollback.”  Now that the public comment period on the agencies’ proposed freeze has closed, we can see exactly what the manufacturers want instead of a rollback—and the answer is, in some cases, actually even worse:

  • Honda proposes keeping the curves the same but asks for a number of changes that would erode the benefits of the standards we have today. Lost Emissions Benefits: ~20-40%
  • The Association of Global Automakers not only asks for all those same flexibilities but also requests further revisions downward “to account for today’s market realities.” Lost Emissions Benefits: ~50-70+%
  • General Motors has proposed scrapping the greenhouse gas emissions program entirely, replacing it with a weak National Zero Emissions Vehicle (NZEV) program that would not drive electric vehicle (EV) adoption beyond the status quo, and its proposal does little to drive down emissions from the 95 percent of the vehicle market that will still be powered by gasoline. Lost Emissions Benefits: ~75-90%
  • The Alliance of Automobile Manufacturers asks for every loophole under the sun and then some—so much so that even if the year-over-year improvements remained unchanged from the rules we have today, progress on emissions could actually be even worse than the proposed rollback. Lost Emissions Benefits: ~70-130+%

Even the most aggressive positions by major automakers would represent a step backwards from the standards we have today. Our analysis shows that GM’s so-called “visionary” proposal is anything but, representing only a marginal improvement on the rollback and locking that lack of progress in through 2030. And proposals from its trade group, the Auto Alliance, are actually WORSE than a rollback due to the countless flexibilities requested. In the accompanying chart, the hashed boxes indicate uncertainty around the year-over-year improvement requested by each organization, the ranges reflect uncertainty about technology adoption, and arrows indicate additional, unquantified changes that would further erode the benefits of each proposal.

“Flexibilities” are at the heart of all automaker comments

While there are rhetorical flourishes from automakers about “meaningful year-over-year improvements” and insistence on being “committed to reducing greenhouse gas emissions,” every single automaker indicated that it believes changes are needed to the standards on the books, standards which have successfully driven investment in fuel efficiency across all vehicle classes, saving consumers over $70 billion at the pump.

The standards on the books today have roughly comparable year-over-year requirements for every type of vehicle but are adjusted so that bigger vehicles and light trucks have lower targets.  There are two ways to dampen the progress from these standards.  The first is to adjust the year-over-year requirements of the standards, the “curves” underpinning the rules—this is what the administration has done by freezing the standards at 2020 levels.  The second is much sneakier: ask for “extra credit” for specific applications of technology that give more credit for emissions and fuel reductions than will actually result in the real world—this is the approach favored by automakers (though some have deployed a combination of both strategies).

The agencies have already included some incentives in the current standards, which the industry refers to as “flexibilities” and others may refer to as “loopholes” (for example, EV emissions are currently credited without acknowledging emissions from upstream electricity production).  However, many of these incentives were designed to be temporary to drive early adoption and are now phasing out.  Manufacturers are now requesting that these incentives be extended, in some cases indefinitely, and additionally that these flexibilities be broadened well beyond the original intent of the incentive—while promoting sustainable technologies like EVs in the near-term is important, it shouldn’t be done at the expense of encouraging a less efficient fleet overall.  This can have a profound impact on the overall benefits of the rule—by crediting manufacturers with more reductions than would actually appear in the real world, those benefits are simply “lost.”

Tallying up the impact of automaker proposals

The impacts of many of these requested flexibilities are uncertain because they depend upon exactly how many vehicles are sold with a given technology.  Furthermore, not all requests have been explicitly quantified, and in the case of requests for credits for safety technologies, the data are uncertain not just about how many vehicles would adopt the technology but about whether there is any emissions benefit at all.

However, I’ve assessed the four most clearly defined proposals below, measuring their impact relative to the standards we currently have on the books:

Honda:  Honda has specifically proposed a stringency of “approximately 5 percent per year annual improvement,” making it essentially the same as the rules we have on the books right now.  The catch, however, is that they’ve requested added incentives, asking for EV incentives to be extended and for hybrid incentives to be available for all light trucks, including the hybrid Honda CR-V going on sale in some parts of the globe in 2019.

Global Automakers:  The Association of Global Automakers represents the major Asian manufacturers as well as a handful of small luxury car companies.  Unlike Honda, they have only hinted at what level of stringency they believe would be appropriate, citing a study by Novation Analytics claiming that gasoline-powered cars and trucks could only achieve standards of 49 mpg and 35 mpg in 2025, respectively (compared to 55 mpg and 40 mpg under the current standards).  Additionally, they asked for even more flexibilities than Honda, including credits for hybrid cars like the Prius, which has been on sale for two decades.  They have also requested credits for safety technology like adaptive cruise control, despite little evidence suggesting it will result in net emissions reductions—we have not considered the impact of these additional “off-cycle” credit requests.

GM:  In lieu of the program now on the books, General Motors proposed a completely different scheme—gasoline-powered vehicles would be required to improve by about 1 percent per year, and a National Zero Emission Vehicle (NZEV) program would be put in place to encourage the sale of electric vehicles.  The problem, as my colleague has already written, is that the NZEV program GM proposes is quite weak, leading to just 8 percent EV sales by 2030.  On top of this, the proposal for conventional vehicles is flimsy and includes credit giveaways, yet it would govern the vast majority of the fleet, because conventional vehicles will make up 95 percent of vehicles sold from 2020 to 2030, even under GM’s proposal.

Auto Alliance:  The Alliance of Automobile Manufacturers ramps nearly every requested loophole to 11.  Not only does it request permanently excluding the impact of the electricity powering EVs, it also asks that the multipliers be more than doubled, from 2 to 4.5 for battery-electric vehicles and from 1.6 to 4.8 for plug-in hybrid vehicles—yes, the Alliance is actually requesting more credit for vehicles with worse emissions.  It is also seeking to change the definition of a truck so that all utility vehicles fall under significantly weaker standards, even while acknowledging that consumers are cross-shopping sedans and crossovers.  Importantly, the Alliance does not propose a specific change to the year-over-year stringency of the program, only a general call for “adjustment”—our analysis of flexibilities thus assumes that the standard curves remain in place, clearly a very, very conservative assumption given the rest of the Alliance proposal.

A rollback by another name

The future impacts of these proposals are uncertain—the adverse effects on emissions from giving extra credit for hybrid or plug-in electric vehicles depend on the number of those vehicles sold.  Our modeling spans a number of technology-penetration scenarios, ranging from the agencies’ 2016 analysis and compliance with state ZEV standards to the agencies’ 2018 analysis and its ludicrously high assessment of the technology needed to comply with regulations.  No matter how you cut it, it is clear from this analysis just how severely these automaker asks would erode the standards.

The asks from the Alliance in particular are so egregious one wonders whether they were accompanied by maniacal laughter and moustache twirling.  Without even reducing on paper the requirements of the standards on the books today, the Alliance asks are equivalent to a rollback under even the most moderate assumptions, and at the level of technology adoption that they and their members claim is necessary, the giveaways would actually be worse than the administration’s proposal.

An incredibly myopic “vision”

The GM NZEV plan has been heavily covered in the media, with some mistakenly calling it a vision for the future.  But the numbers speak for themselves—the GM proposal forgoes significant improvements in the vast majority of vehicles through 2030 and provides little better than status-quo adoption of EVs in return.  Additionally, GM calls for increased credit for hybrid light trucks and for reclassifying more of the fleet as light trucks, which fall under weaker standards.

The result is predictable and amounts to an average improvement of about 1.4 percent per year, well short of the nearly 5 percent improvement on the books right now.  It also serves to undermine state and EPA authority under the Clean Air Act, escalates giveaways for unproven technologies that (coincidentally) GM is planning on selling anyway, and doesn’t even provide a guarantee for the benefits under its piddly NZEV because it has an escape clause which would nullify the proposal and any meager attempt at progress if things get too hard—not unlike the eject button they’re trying to push as a part of this mid-term review.
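
To see how wide that gap grows over a decade, here is a minimal back-of-the-envelope sketch. It assumes, purely for illustration, that the cited 1.4 percent and roughly 5 percent rates compound uniformly against per-mile emissions each model year from 2021 through 2030; the actual compliance math is more involved.

```python
# Back-of-the-envelope comparison of annual improvement rates, compounded
# over model years 2021-2030. Illustrative only: it assumes the cited rates
# apply uniformly each year to per-mile emissions, a simplification of the
# actual standards.

def cumulative_reduction(annual_rate: float, years: int) -> float:
    """Total fractional cut in per-mile emissions after compounding."""
    return 1 - (1 - annual_rate) ** years

YEARS = 10
gm_proposal = cumulative_reduction(0.014, YEARS)    # ~1.4% per year (GM plan)
current_rules = cumulative_reduction(0.05, YEARS)   # ~5% per year (current standards)

print(f"GM proposal:       ~{gm_proposal:.0%} lower emissions rate by 2030")
print(f"Current standards: ~{current_rules:.0%} lower emissions rate by 2030")
# Roughly 13% vs. 40% under these simplified assumptions -- a wide gap.
```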

Is anyone not calling for a rollback?

Maybe the clearest outcome of the mid-term review has been to show the viability of the current standards—as time has gone on, more opportunities to reduce fuel use and emissions have been put to market, and even some of the most obvious, low-cost solutions are still only gradually making their way across the fleet.  We and many others continued to press this point to the agencies in the public comment period, pushing back on the administration’s rollback.

Unfortunately, apart from Tesla (which called for even stronger standards), the closest any automaker got to calling for standards equivalent to what we already have is Honda.  While Honda distanced itself from some flexibilities requested by its trade group, like the Prius loophole, it still mirrored a number of the same requests.  That means that while on paper the rules would remain as stringent as they are right now, Honda’s proposal would still cut 20-40 percent of the benefits of the rules on the books today, leading to an increase of 175 to 350 million metric tons of global warming emissions over the lifetime of vehicles sold through 2025.

While compared to the rest of the industry that may be about as good as it gets, even Honda’s proposal is a significant step backwards, slowing down near-term progress with a wink and a nod that the industry is committed to a sustainable future.  That, of course, is a tactic we’ve seen before.

Promises today, pollution tomorrow

The history of the auto industry is rife with examples of automakers undermining progress not out of technological infeasibility but out of profit-seeking and disregard for public outcomes.  When it came to tailpipe pollution, the Alliance spent years undermining the science. When California pushed for action, the companies pushed back, claiming that voluntary action, which would prove woefully inadequate to the problem, was the right path. After California’s successful regulatory push to move tailpipe control devices to market led to federal regulations, automakers stalled again, winning a reprieve on the claim that what was really needed was fleet turnover—a claim which, of course, proved false and led to untold adverse health consequences.

There are positive statements in the positions of Global Automakers and Honda that recognize the need for continued progress, and while their proposals represent a short-term setback, it is possible that this is merely strategic positioning as the companies look to negotiate a truly sustainable path forward.  But when you look at General Motors’ proposal to codify the status quo and the harmful, cartoonish nonsense out of the Alliance that would actually leave the country worse off than the administration’s proposal, it’s hard not to see these proposals together as just another example of an industry doing what it can to avoid responsibility for its products, consequences be damned.

The Voters have Spoken: Time for Checks and Balances to Make a Comeback

Photo: PeopleImages/iStockphoto

The election is all but over, and the result is a divided Congress.

Take a deep breath, scientists, and remember that divided government in these United States is what our Constitution was designed for.  A guiding principle was one of checks and balances – a check on dominance of one point of view and balance in the resulting policies for the people.  Something that, in my view, has been sorely missing for the last two years because adherence to party has superseded service to constituents and country.

So now what?  In Washington-speak, there will be an increased appetite for serious “oversight” of the Executive branch by the House of Representatives.  That means that Congress is likely to focus on how the Trump Administration is implementing the laws and mandates put in place to serve the public’s interest.  This is literally one of the “checks and balances” the framers of the Constitution created.

How does that happen?  Congress can hold hearings to question agency officials as well as solicit views from the affected public, experts, and other stakeholders about the impacts of agency actions.  Also, as appropriators of federal dollars, Congress determines funding levels for each agency and can set the terms of use for those funds.  And Congress can demand information in writing, investigate problems through the Government Accountability Office (GAO) or the Inspector General (IG) offices in each federal agency, and hold agency officials accountable in the court of public opinion, referring cases to the courts as needed.  These are powerful tools that have been semi-dormant for a couple of years.  Time for a change.

I like to think of Congressional efforts toward checks and balances coming from three sources:

  1. Pursuing specific constituent concerns
  2. Ensuring the intent of Congress is carried out
  3. Highlighting controversial issues

Constituent services

Every member of Congress is elected to serve both their constituents and the Nation as a whole.  And every member is attentive to issues raised by their constituents, whose welfare (and votes) matter to them.  When a member of Congress hears similar concerns from multiple constituents, he/she can and should see what can be done to address the issue writ large from DC.  Your calls, your letters, your visits to local state or district offices matter.  Every scientist is also a constituent; communicating with your elected representatives can often be more important and effective than the voice of a famous expert speaking broadly from elsewhere about a policy.  So, scientist/constituents can be the impetus for congressional oversight.

Let’s consider a few ways this could happen right away.  Without notifying the public, the Environmental Protection Agency (EPA) this past year made a legal interpretation that the rules for industrial facilities that emit hazardous air pollutants will change — with the potential to dramatically increase emissions of these toxic and sometimes cancer-causing substances.  Suppose one of those facilities is in your neighborhood (and we have mapped them all by congressional district)?  You and your neighbors could ask your member of Congress to demand more information from the EPA or even to call for reconsideration of that policy change.  Tell your elected representative you expect them to hold the EPA accountable for public health impacts in your community.

Or perhaps you live near a military base, and your water supply has been contaminated by toxic per- and polyfluoroalkyl substances (PFAS), endangering the health of your family and your neighbors.  We mapped many of these sites too.  The EPA has taken little to no action to clean up these hazardous pollutants despite overwhelming scientific evidence, and the Department of Defense (DoD) is moving slowly.  Your elected officials need to know that this isn’t acceptable.  It’s up to you to tell your member of Congress that you want them to hold the EPA and DoD to account for cleaning up the pollution.  That’s their job – to serve the public interest, not the interest of companies like Dow, DuPont, or 3M that made these compounds and are pushing back on improving the safety standards.  Your members of Congress can insist on better information, a timeline for cleanup, funds to make the water safe, and clear commitments to action by the agencies and the Administration, if they think it matters to you.

Intent of Congress

Another important part of oversight is to monitor and constantly question whether agency actions are meeting congressional intent.  In other words, ensuring that the agencies are doing their jobs on behalf of the public. Every law passed and perhaps periodically reauthorized and updated by Congress has specific goals in mind.  The Clean Water Act aims to make the nation’s waters fishable and swimmable.  The Clean Air Act seeks to ensure that the existing and future sources of air pollution are curtailed to protect public health and welfare.  The Endangered Species Act is designed to prevent the extinction of species.

Executive branch agencies implement those laws through policies and regulations specifically designed to meet the intent of Congress as written in the statute and interpreted by the courts.

Again, consider some examples.  Congress intended the Clean Air Act to clean up the air and to use the best available science to determine threats to public health and safety and then enact safeguards to protect the public against them.  Recently, the EPA has taken actions that fly in the face of this statutory mandate.  It intends to restrict the science that EPA can consider in implementing public health and safety regulations; it has dismissed the expert panels that advise on the scientific evidence for major air pollutants; and it has reshaped the agency’s science advisory boards to give industry and states a greater role than independent academic scientists.  Is this what Congress intended when it told the agency to use the best available science?  We should ask our elected officials to question these actions and demand justification from the agency. Scientist/constituents can call on Congress to withhold funds so that they cannot be used to implement agency policies that sideline science.  And, of course, we can advocate for stronger laws, ones the agency can’t easily wriggle out of, that ensure the use of science.

Controversial issues

There has seldom been a lack of controversy in how our government decides to deal with particular issues, but lately concerns about climate change, for example, have reached a fever pitch.  These will continue, as different stakeholders have different priorities, preferences, and even values.  But our policies will not get better under any circumstances by ignoring the scientific evidence.  At the Department of the Interior there have been across-the-board actions to remove consideration of climate change from agency planning and actions.  That includes virtually hiding reports that describe global warming impacts.

In addition, there are controversies related to conflicts of interest among political appointees, a culture of corruption in agencies and on advisory committees, and clear indications of political interference in agency science.

This is not just politics as usual; there are serious challenges we face as a nation.  Questions Congress could and should address in hearings, investigations, and demands for information include:  Are our government and our governmental agencies putting the public’s interest first and foremost when they act?  How should we be using public resources?  When are we going to get serious about addressing climate change — one of the greatest challenges we face globally and as a nation?

Our role as constituent scientists

Many issues of concern combine the sidelining of scientific evidence with direct impacts on people in our communities.  So, let’s speak about the science, but also about local impacts, when contacting our representatives and asking them to pursue a strong oversight agenda.  Let’s bring the facts forward, demand information, and look for solutions.  These are not esoteric or theoretical problems.  We need to speak as both scientists and constituents.

The checks and balances of Congressional oversight that I am talking about are often motivated by constituent concern, when it is voiced directly, clearly and productively.  As scientists, we are constituents with a particular knowledge set and training in how we approach problems, which is especially valuable in shaping the oversight discussion.  As community members, we have a strong role to play in ensuring these health and safety issues get the attention they deserve, and responsible action from our federal government.

Voting in the midterms was incredibly important.  Now we need to follow up on the opportunity created by a new Congress by speaking truth to power, calling on our representatives to do the crucial job we gave them of checking and balancing the Trump Administration.  Let us know if you want to join us, and we’ll be in touch!

 

 


Ørsted, Deepwater Wind: Are Offshore Wind Mergers Good for Us?

Kim Hansen/Flickr (https://www.flickr.com/photos/slaunger/5483311060/)

Last week saw offshore wind giant Ørsted complete its acquisition of local star Deepwater Wind. Is that a good thing?

The players and the scorecard

The $510-million deal brings together two important players in the offshore wind space. Rhode Island-based Deepwater holds the distinction of being the developer of the first offshore wind project in the Americas, the Block Island Wind Farm in Rhode Island waters. Since no successor projects have gotten that far, it also holds the distinction of being the owner of the only offshore wind project in the Americas.

Ørsted, formerly the Danish Oil and Natural Gas company, developed the very first offshore wind farm, in Denmark in 1991, and is the largest developer of offshore wind in the world, including the new world record holder for largest offshore wind farm. In this country, it holds an offshore wind lease in federal waters off the New Jersey coast, and a half share of another off Massachusetts. It’s also involved in a pilot project under development off Virginia.

For Deepwater’s investors, the acquisition by Ørsted, originally announced last month, likely represents a successful exit on the bet they took with the company, established in 2007.

For Ørsted, acquiring Deepwater gives it an even more solid footing in the US market. Along with the Block Island project, Deepwater holds two of the four federal offshore wind leases off Rhode Island and Massachusetts, and half of another off Maryland. It won bids early this year to supply Rhode Island with 400 MW of offshore wind and Connecticut with 200 MW. And its proposed 90 MW wind farm east of Long Island looks like a good bet to be one of the next places for steel in the water.

Credit: Derrick Z. Jackson

What about us?

The Ørsted press release announcing the Deepwater acquisition said that they expected it “to deliver a healthy value creation spread on top of our cost of capital, with additional significant strategic upside.” It’s not entirely obvious what that business-speak means, but it’s clear they think it’s a good idea for them.

But what does this merger mean for us—consumers, policy makers, or just interested observers?

On the one hand, competition is good, and a merger like this arguably reduces competition—in the case of bid opportunities like the ones from Rhode Island and Connecticut (and Massachusetts), for example. Some might also regret seeing an American company acquired by a firm from abroad.

On the other hand, it’s easy to view this as a strong vote of confidence by a company that knows more than a thing or two about the offshore wind space. If Ørsted is willing to put a half a billion dollars into increasing its presence in these parts—not to mention the investment that its new portfolio of projects will require—that’s a pretty strong sign that we (the public, the states, the federal government) must be doing something right in working to create an attractive climate for investment in offshore wind.

There’s also clearly a lot of value in achieving economies of scale in this industry. European offshore wind projects keep getting larger and cheaper, and now we’ve seen dramatic drops in the price of power from offshore wind on this side of the Atlantic, in Massachusetts’s recent long-term contracting.

And, while Deepwater was no shrinking violet, financially (its owner was a hedge fund with tens of billions of dollars under management), Ørsted brings plenty of capital to bear, plus its 27 years of experience in the offshore wind space.

Given the incredible challenge of climate change, our need to do offshore wind power not just quickly but correctly, and the tremendous potential off our shores/near our cities, most anything that accelerates the ramp-up of offshore wind in this country is probably a good thing for us as consumers, and for us as citizens of a world in need of decarbonization.

Because ultimately, that’s where our focus needs to be: faster, cheaper, right-er. We’ll be watching the industry to make sure that’s where their focus stays too.

Photo: Kim Hansen/Wikimedia Commons

Forget the Trump Bailout—Here’s a Real Solution for Nuclear and the Climate

The Trump Administration’s proposal to bail out uneconomic coal and nuclear power plants is a bad idea predicated on a made-up problem. The real crisis we face is the climate crisis, as the recent IPCC report highlighted in stark terms last month. We must steeply reduce CO2 emissions over the next decade and beyond or we will lock in warming that will have disastrous consequences for people around the world.

We’ve frittered away our most precious commodity in the climate fight… time. Now there are no easy options; no easy pathways. We are in a world of trade-offs. We must reconcile the science and the clock with the reality of where we are in our transition to a clean energy economy.

For the electricity sector, that means building a lot (a lot a lot) more renewables and increasing energy efficiency. It means modernizing our grid, ramping up energy storage, and phasing out coal and natural gas without carbon capture and storage (CCS). And it also means scratching and clawing for every metric ton of CO2 we can avoid, including guarding against the risk of existing nuclear power plants retiring abruptly and being replaced by natural gas.

UCS’ new report, “The Nuclear Power Dilemma: Declining Profits, Plant Closures, and the Threat of Rising Carbon Emissions” analyzes the economics of the existing nuclear fleet and concludes that a well-designed carbon price or a low-carbon electricity standard will help keep existing nuclear plants that meet high safety standards online.

The Trump coal and nuclear bailout is not a real solution

Earlier this year, the administration issued a notice of proposed rulemaking to the Federal Energy Regulatory Commission (FERC), which would use executive authority to force consumers to buy more expensive electricity produced from coal and nuclear plants.

This is a bailout. Not only would it cost rate-payers (or taxpayers, depending on how the bailout is paid for), but the additional use of coal would hurt public health and increase the heat-trapping emissions that drive climate change.

The administration said they needed to take this unprecedented action because the prospect of coal and nuclear plant closures would jeopardize electricity reliability—keeping the lights on—and make the grid less resilient. This justification has been widely disproved by grid experts and was unanimously rejected by FERC. The administration’s actions appear to be based more on politics than on substance.

Even if this administration abandoned the current architecture of the proposal, jettisoning the coal bailouts and focusing only on nuclear, it would still be a poor approach. Dumping a bunch of rate-payer or taxpayer money into the coffers of private interests without big public benefits, transparency, and accountability is wrong.

Likewise, temporary bailouts for nuclear don’t address the systemic market failure which is a significant part of why nuclear plants are losing money in the first place: zero-carbon benefits are not rewarded in the marketplace in most states. Nuclear is competing with natural gas on an uneven playing field, and it’s losing. A temporary nuclear bailout would do nothing to address the underlying issue; applying a Band-Aid on a deep, gaping wound is not a real solution. Throwing good money after bad is not a responsible use of the public trust; these plants would be right back in the red the minute that money runs out.

What nuclear and other low-carbon technologies need is durable policy support that corrects this systemic market failure.

Real policy solutions that help existing nuclear and the climate

Our new report found that even a very modest carbon price ($25 per ton in 2020, increasing 5 percent per year) would solidify the economic position of the existing nuclear fleet, helping to avoid an over-reliance on natural gas and significant emissions increases. It would also incentivize the development and deployment of renewables, as well as other low- or zero-carbon energy technologies.
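
As a point of reference, here is a minimal sketch of how that price schedule compounds. It simply applies the $25-per-ton starting value and 5 percent annual escalation cited above; it is not the report’s underlying modeling.

```python
# Sketch of the carbon-price trajectory described above:
# $25 per ton in 2020, escalating 5 percent per year.
# Illustrative arithmetic only -- not the report's model.

START_YEAR = 2020
START_PRICE = 25.0   # dollars per ton of CO2
ESCALATION = 0.05    # 5 percent per year

def carbon_price(year: int) -> float:
    """Carbon price in a given year under the assumed schedule."""
    return START_PRICE * (1 + ESCALATION) ** (year - START_YEAR)

for year in (2020, 2025, 2030, 2035):
    print(f"{year}: ${carbon_price(year):.2f} per ton")
# Roughly $25, $32, $41, and $52 per ton, respectively.
```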

One policy option that hasn’t received as much attention but can deliver similar benefits as a carbon price is a national Low-Carbon Electricity Standard (LCES), or “Clean” Energy Standard.  UCS has supported this approach in the past, but as I will explore in a subsequent blog, the policy design matters.  For example, the last federal iteration of this policy was the Bingaman Clean Energy Standard Act of 2012, which gave partial credit to natural gas generation without CCS, something we would not support today, given the country’s growing over-reliance on natural gas and the significant associated carbon emissions.

UCS modeled two policy scenarios: a modest carbon price case ($25 per ton) and a modest low-carbon electricity standard (60% by 2030/ 80% by 2050). The figure below compares the modeling results for our nation’s electricity generation mix under the policy scenarios to the 2017 generation mix, a reference case in 2035 (which includes the 5 nuclear plants slated to retire by 2025) and to three ‘early nuclear retirement scenarios’ that assume an additional 13-26 percent of the current nuclear fleet retires by 2026 because of economic reasons (before their current 60-year operating licenses expire). The early nuclear retirement scenarios are based on our analysis of the profitability of the existing fleet.

Both the carbon price and the LCES help maintain existing nuclear generation at reference case levels through 2035. In the case of the LCES, we see additional reductions in natural gas and additional development of wind and solar. How much the generation mix shifts to low-carbon resources is a function of the stringency of the policy; a higher carbon price or a more ambitious LCES target would show even more renewables.

The figure below shows the emissions trajectory of the different scenarios, including a carbon price and an LCES. Note that our early nuclear retirement scenarios show a 6 percent increase in emissions at a time when we need to be on track to achieve a 90 percent reduction by 2040 (shown here as the National Research Council Carbon Budget) to meet our climate goals. The figure also shows that a 60 percent by 2030 LCES provides similar emissions reductions as the $25 per ton electricity sector carbon price, but note that those policies only get us a little more than halfway to our emissions reduction target by 2035. More stringent policies or additional complementary policies are required.

A national LCES is good for red states

UCS has been a leading advocate of renewable electricity standards (RES) around the country for many years, and supported the last federal iteration back in 2015, the Udall 30 by 2030 bill. We continue to believe that Congress should pass a strong national RES to help incentivize more renewables development, reduce our nation’s growing over-reliance on natural gas, and aggressively bring down carbon emissions. But, a properly designed national LCES can provide similar benefits, while also solidifying the economic position of existing nuclear plants that meet strict safety standards (preventing abrupt closures). And while we did not analyze this in our modeling, an LCES could also provide an incentive for developing new low and zero carbon energy technologies, including potentially new nuclear reactors and carbon capture and sequestration technologies (CCS), giving us more tools for the climate fight.

A national LCES can broaden the tent of support for low-carbon electricity in parts of the country that are not as far along in their transition to a clean energy economy. This policy helps mitigate some of the imbalances to states with less renewable development relative to a national RES. And it gives many red state congressional delegations a clean energy policy that may be a better fit for their state, freeing up badly needed support from conservatives.

For example, a strong national LCES would provide a lot of benefit to states like South Carolina and Tennessee, where nuclear power makes the biggest contribution to the electricity mix and very little comes from renewables.  These states could be in a position to benefit economically from this policy, while an LCES would also incentivize additional renewable and/or low-carbon development in those states as they prepare to eventually replace those nuclear plants at the end of their useful lives.

A strong national LCES would also benefit states like Iowa and Kansas, which have enormous wind power as well as nuclear, but also have a lot of coal in their electricity mix. A national LCES would help that existing nuclear stay online and retire some of that expensive and harmful coal generation, while also building on the amazing 36-37% wind energy in their mix. Iowa and Kansas could easily comply with an LCES and would benefit economically.  All of the states below would realize the significant public health benefits that come with trading coal for renewable energy development.

Electricity Generation Share by Sources, 2017 (source: The Nuclear Power Dilemma)

STATE   Nuclear   Coal   Nat. Gas   Hydro   Wind   Solar   Biomass   Other
SC      58%       19%    17%        3%      0%     0%      3%        0%
TN      40%       35%    13%        10%     0%     0%      1%        0%
IA      9%        45%    6%         2%      37%    0%      0%        1%
KS      21%       38%    5%         0%      36%    0%      0%        0%

We need to create incentives for states to reduce investments in coal and natural gas, maintain the low-carbon generation they already have, and substantially increase investments in new low or zero carbon technologies. Complementary policies to boost energy efficiency will also be needed. With a national LCES, several years from now the table above could show a significant reduction in generation from coal (and natural gas), while holding nuclear generation steady, and substantially increasing the contribution from renewables.

Absent a national LCES or some other policy that incentivizes and protects low carbon generation, the electricity mix in states like South Carolina and Tennessee is likely to go in the wrong direction for the climate.

We need real solutions, not bailouts

Our new analysis of the economics of the existing nuclear fleet clearly shows there’s a risk of abrupt retirements, and that the lost generation would be replaced primarily by fossil fuels. That’s a climate problem, but it’s also a public health problem, a jobs concern, a tax revenue issue for host communities, and much more. States like Illinois, New York and New Jersey avoided abrupt nuclear retirements by working with stakeholders to reach agreements that spawned real policy solutions. Pricing carbon and creating national standards for low-emissions electricity are real policy solutions that would protect existing nuclear plants and can be implemented at the state or federal level.

These policies don’t cost taxpayer money, and the modeling we’ve done on the electricity price impacts has shown no significant increases.

Juxtapose these real policy solutions with the coal and nuclear bailout proposed by the Trump administration, which would cost substantial ratepayer or taxpayer money, would NOT protect nuclear in the long term, and would assuredly exacerbate the climate crisis while increasing threats to public health.

The choice is clear. We need real policy solutions, not bailouts for political supporters.

Fossil Fuel Giants Are Pumping Out Greenwashing—Their Tricks Won’t Work

Exxon refinery in Baytown, Texas.

In recent months, we’ve seen fossil fuel giant ExxonMobil leave the American Legislative Exchange Council (ALEC), pledge $1 million to support a carbon tax, announce measures to reduce methane emissions, and join the Oil and Gas Climate Initiative (OGCI). Is the company finally getting serious about addressing climate change? Um, no. ExxonMobil is finally responding to mounting pressure from shareholders, law enforcement, and the public to clean up its climate act. So are other major fossil fuel companies. That was one of the key findings of the 2018 update to UCS’s Climate Accountability Scorecard, an in-depth analysis of eight major oil, gas, and coal companies’ climate-related positions and actions. But unfortunately, we also found that these companies still appear to be trying to trick us with greenwashing. (Read my colleague Brenda Ekwurzel’s blog breaking down the ways that many of these companies continue to distort climate science here.)

Here are six tricks by ExxonMobil and some of its key competitors that we’re countering with our public exposure and organizing.

Trick 1: Look over there—squirrel!

Each of ExxonMobil’s announcements followed news about the severity of the climate crisis and the outsize role of major fossil fuel producers in creating it: Baltimore and Rhode Island suing to recover the costs of climate damages and preparedness, the special report of the Intergovernmental Panel on Climate Change (IPCC) on the dangerous consequences of global temperature increase of 1.5°C above pre-industrial levels, the New York state attorney general filing a lawsuit against ExxonMobil for defrauding its shareholders by downplaying expected climate risks to its business. In this context, it’s hard not to see the company’s moves as public relations distractions.

Trick 2: Putting their money where their mouths aren’t

This election season, major fossil fuel companies fueled the opposition to I-1631—the ballot initiative in Washington state for a carbon fee that went down to defeat despite heroic organizing by a broad grassroots coalition. The Western States Petroleum Association (WSPA), which counts BP, Chevron, and ExxonMobil among its leaders and ConocoPhillips and Royal Dutch Shell among its members, sponsored the “No on I-1631” campaign leading the charge. BP spent a staggering $13 million to fight the carbon fee, directly contradicting its claim that “carbon pricing provides the right incentives for everyone—energy producers and consumers alike—to play their part in reducing emissions.”

Trick 3: Putting their mouths where their money isn’t

ExxonMobil’s pledge of $1 million to lobby for a carbon tax is dwarfed by the $36 million the company has donated over the past 20 years to groups that spread climate disinformation.

Similarly, the $100 million pledges by ExxonMobil and Chevron to the OGCI Climate Investments fund might sound like a lot… that is, until you compare them with the companies’ planned spending on oil and gas exploration and infrastructure in 2018: $28 billion for ExxonMobil, $15.8 billion for Chevron. And while “climate investments” might evoke renewable energy resources like wind and solar, the OGCI fund focuses instead on reducing methane leakage, promoting energy efficiency, and developing carbon capture and storage (CCS) technology to sequester global warming emissions from fossil fuels by storing them underground. The pot of funding pledged to date by the 13 OGCI members seems little more than a token—particularly when you consider that Shell’s scenario for limiting global temperature increase to well below 2°C above pre-industrial levels relies on a 200-fold increase in deployment of CCS by 2070. Meanwhile, when it comes to renewables, ExxonMobil seems content with supplying lubricants for wind turbines.

Trick 4: Empty promises

Like BP, ExxonMobil and Shell have long professed to support a price on carbon. And like BP, these companies have yet to back up their stated support with consistent policy advocacy.

BP, Chevron, ExxonMobil, and Shell have also publicly committed to uphold five “guiding principles” to reduce methane emissions—including to support “sound” and “effective” methane policies and regulations. Yet as the Trump administration proposes to weaken and even eliminate methane regulations, these companies have failed to step up to defend them. In fact, all four of these companies maintain leadership roles in the American Petroleum Institute (API), which is pushing for the rollback of methane rules.

Trick 5: Hiding behind front groups

Major fossil energy companies have a long history of funding campaigns to sow doubt about climate change. Much of this disinformation has been disseminated by third-party groups, including trade associations, think tanks, and other nonprofits.

This trick remains in the fossil fuel industry’s playbook. A recent investigative piece by ProPublica found that Big Oil and other industries are getting around Facebook’s new ad transparency rules—in some cases with “a digital form of what is known as ‘astroturfing,’ or hiding behind the mirage of a spontaneous grassroots movement.”

UCS’s 2018 scorecard found that all eight companies in our sample maintain membership in trade associations and other industry-affiliated groups that spread disinformation about climate science and seek to block climate action. Each company holds at least one leadership position in groups such as ALEC, API, and WSPA.

The influence of industry groups is a major obstacle to achieving the “rapid and far-reaching” transitions across major sectors of the global economy that the IPCC special report says are now needed to limit global warming to 1.5°C. While the report acknowledges that industry group lobbying was a factor in reducing political space for some major emitting nations to maneuver, the IPCC has faced criticism for ignoring academic research into fossil fuel-funded climate science denial campaigns.

Trick 6: Mum’s the word

More than a dozen coastal and inland communities in the US have now filed lawsuits to hold fossil fuel companies accountable for climate damages and the ongoing costs of mitigation and preparedness. Last week, New York City appealed to overturn the dismissal of its lawsuit by a federal district court. Although such lawsuits generate substantial media visibility, require significant legal efforts, and expose companies to the possibility of enormous payouts, BP, Chevron, and ExxonMobil failed to disclose their potential climate litigation liability to shareholders in their securities filings.

The Climate Risk Disclosure Act aims to end such incomplete and uneven disclosures. Introduced by Senator Elizabeth Warren and seven co-sponsors and supported by UCS and dozens of other organizations, the bill would require public companies to disclose critical information about their exposure to climate-related risks. In the face of stricter transparency rules, silence will no longer be golden for fossil fuel companies.

We’ve got the major fossil fuel companies right where we want them—starting to say some of the right words. They have only come this far thanks to public, investor, and legal pressure. Now we need to ramp up the pressure to turn those words into meaningful actions. The IPCC 1.5°C report is a stark reminder that there’s no (more) time to lose.

On Veterans Day, Why Aren’t Congress and the USDA Looking Out for Those Who Served?

Navy veteran Lenny Evans Miles, Jr., operates Bluestem Farms LLC in Chestertown, MD. USDA Photo by Preston Keres

This Veterans Day is particularly significant, marking the 100th anniversary of the end of World War I. Though US veterans from that long-ago war are gone, some 20 million of their brethren are with us today. Our culture honors them at sporting events and other public venues, but we also have an ugly history of mistreating those who served—from returning Vietnam vets being spat upon to mismanaged healthcare programs and corruption at the Department of Veterans Affairs.

And right now, misguided decisions by the Secretary of Agriculture and members of Congress threaten to reverse progress for service members and veterans who want to work the land and feed their neighbors.

In 2014, Congress recognized the ways that military veterans are particularly suited to growing food, and how farming can help former soldiers cope with the effects of war. That year’s farm bill called out veterans as a distinct group eligible for support under the US Department of Agriculture’s beginning farmers programs, opening access to grants and low-interest-rate loans to get started and to innovate.

(For more on how vets-turned-farmers are continuing to serve their communities and reduce hunger, see this 2016 post by former UCS Kendall Science Fellow Andrea Basche, now an assistant professor at the University of Nebraska.)

Fast forward to 2018, and both the Trump administration and its allies in the House of Representatives are pursuing farm bill changes that would hurt those same veterans, along with active-duty military personnel.

The two principal actors—Representative Mike Conaway (R-TX) and Secretary of Agriculture Sonny Perdue—should know better. Conaway, who chairs the House agriculture committee, is an Army veteran and senior member of the House Armed Services Committee; his biography page is emblazoned with an image of him with service members in fatigues. Over at the USDA, Perdue is a former captain in the Air Force, and just last week he professed his gratitude to the nation’s veterans.

But as usual, actions speak louder than words.

The Perdue/Conaway attack on SNAP hurts military personnel and veterans

Take the positions Conaway and Perdue have pushed on the Supplemental Nutrition Assistance Program (SNAP, formerly known as food stamps). We’ve written extensively about the punitive SNAP program changes the Trump administration and Rep. Conaway have pursued this year. The farm bill Conaway wrote and passed through the House in June would add unnecessary and burdensome new work requirements to the program. And that would effectively reduce or eliminate benefits for millions of people.

Now, Conaway and his caucus would have you believe that SNAP is plagued with participants who would rather collect benefits than work, but in fact, most SNAP beneficiaries who can work, do. Another fact? The SNAP rolls include many active-duty military personnel and veterans. A 2016 report from the Government Accountability Office found that about 23,000 active-duty troops used SNAP in 2013, then the most recent year for which data were available.

Moreover, analysis of Census Bureau data by the independent Center on Budget and Policy Priorities (CBPP) found that nearly 1.4 million veterans live in households that participate in SNAP, including 97,000 vets in Conaway’s own state of Texas. CBPP analysts have detailed the ways these veterans would be particularly vulnerable to the ill-conceived new work requirements Conaway and Perdue (and President Trump himself) have aggressively pushed.

A needless farm bill fight has left veteran-farmers without resources

As a result of their intransigence, other programs that benefit veterans (and the rest of us) have been left in the lurch. The congressional standoff on SNAP, which persisted all summer and into the fall, led to the expiration of the existing farm bill, without a replacement, on September 30. My colleagues have written about the effect of the lapsed legislation on agricultural research and local food programs. But the 39 programs stranded without funding when the farm bill expired also included the USDA’s Beginning Farmer and Rancher Development Program, which provides education, mentoring, and technical assistance grants to new farmers—and which mandates that at least 5 percent of funds support programs and services that address the needs of veteran farmers and ranchers.

Now, Rep. Conaway has reportedly scheduled a Veterans Day meeting with his counterpart on the House ag committee, at which they will presumably discuss the fate of the farm bill. Perhaps the timing will keep veterans top of mind as he decides whether to move toward a farm bill that will help them—or continue to promote policy changes that will hurt them.

No, Natural Gas Power Plants Are Not Clean

You may have heard that natural gas is “clean.” Compared to coal, natural gas produces less global warming emissions and air pollution. But coal is just about the dirtiest way to produce electricity, so almost anything will seem cleaner in comparison. The fact of the matter is that natural gas power plants still produce a significant amount of air pollution, and that’s a problem.

NOx is not your friend

The main pollutants resulting from natural gas electricity generation are nitrogen oxides, or NOx. Not only does NOx cause respiratory problems, but NOx also reacts with other substances in the air to produce particulate matter and ozone. Particulate matter and ozone cause the extensive list of adverse health outcomes you hear at the end of a prescription drug commercial – shortness of breath, heart attacks, premature death; the list goes on. In short, NOx is bad news for human health.

Natural gas power plants have an impact on air quality

At this point you might be wondering, “So how bad is it? How much NOx is coming from natural gas power plants?” That is where things get complicated. According to projections from the California Air Resources Board, stationary sources account for roughly 21% of NOx emissions, while mobile sources account for a whopping 74% of NOx emissions in the state. However, emissions from natural gas power plants are only a fraction of the emissions from stationary sources, so NOx emissions from natural gas power plants end up being roughly 1% of total NOx emissions in California.

Chart: “grown and controlled” projected oxides of nitrogen emissions for 2019, excluding emissions from ocean-going vessels more than three nautical miles from the coast. Data from the California Air Resources Board Emissions Projection Analysis.

Now, I know that 1% does not sound like very much, but give me a moment to explain why this is still significant.

First, natural gas power plants do not move – they just sit there and emit NOx when they are operating. Those NOx emissions may linger in nearby communities, leading to serious health problems for the people living near plants. And since half of California’s natural gas power plants are concentrated in some of the most socioeconomically and environmentally disadvantaged communities in the state, these emissions harm communities that are already overburdened with pollution.

Second, just because the electric sector is cleaner than the transportation sector does not mean the electric sector is not dirty.  Some of the highest-polluting natural gas power plants emit over 100 tons of NOx per year, which is roughly equivalent to the NOx emissions from traveling 11 million miles (assuming an emissions rate of 8.18 grams of NOx per mile) in a diesel school bus, one of the most polluting types of vehicles. Furthermore, in an analysis of a proposed natural gas power plant, the California Energy Commission found that local one-hour concentrations of NO2 (one form of NOx) would nearly double from background levels.  These emissions really can affect local air quality, and that is why this is a problem.
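
The mileage equivalence above is straightforward unit conversion; here is a minimal sketch of that arithmetic, assuming US short tons (2,000 pounds) and the 8.18 g/mile school-bus rate cited above.

```python
# Back-of-the-envelope check: how many diesel school-bus miles equal
# 100 tons of NOx per year from a high-emitting gas plant?
# Assumes US short tons (2,000 lb) and the cited 8.18 g NOx per mile.

GRAMS_PER_SHORT_TON = 907_184.74
BUS_NOX_G_PER_MILE = 8.18          # diesel school bus NOx emission rate

plant_nox_tons = 100               # annual NOx from a high-emitting plant
plant_nox_grams = plant_nox_tons * GRAMS_PER_SHORT_TON

equivalent_miles = plant_nox_grams / BUS_NOX_G_PER_MILE
print(f"Equivalent bus travel: {equivalent_miles / 1e6:.1f} million miles per year")
# ~11.1 million miles, consistent with the figure cited above.
```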

The air pollution problem may get worse

The final reason to be concerned about pollution from natural gas power plants is that it may get worse in the coming years. A recent study by the Union of Concerned Scientists found that natural gas power plants in California will start and stop much more frequently in the future, and this increase in start-ups may increase NOx emissions. Natural gas power plants emit more NOx when they are starting up; on average, they emit anywhere between three and seven times as much NOx during start-up as during one hour of full-load operation. As paradoxical as it may sound, California may continue to achieve its global warming emissions reduction goals while increasing air pollution from natural gas power plants at the same time.
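
To illustrate how start-ups can add up, here is a rough sketch. The three-to-seven-times multipliers come from the finding above, but the hourly full-load emission rate and the number of extra start-ups are hypothetical placeholders, not values from the UCS study.

```python
# Illustrative estimate of added NOx from more frequent plant start-ups.
# The 3x-7x start-up multipliers come from the text above; the hourly
# full-load rate and the number of extra starts are hypothetical.

FULL_LOAD_NOX_LB_PER_HR = 20.0     # hypothetical full-load NOx rate (lb/hr)
EXTRA_STARTS_PER_YEAR = 100        # hypothetical increase in annual start-ups

for multiplier in (3, 7):          # NOx per start = 3x-7x one full-load hour
    added_nox_lb = EXTRA_STARTS_PER_YEAR * multiplier * FULL_LOAD_NOX_LB_PER_HR
    print(f"{multiplier}x multiplier: ~{added_nox_lb / 2000:.1f} extra tons NOx/yr")
# With these placeholder numbers: roughly 3-7 additional tons of NOx per year
# from start-ups alone, on top of normal operating emissions.
```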

Let’s make sure that does not happen. Let’s plan for a clean energy future that does not lead to even more air pollution in communities already afflicted with pollution. Let’s make sure we bring everyone along in the transition to clean electricity. UCS recently co-sponsored a bill in the California legislature that was designed to shed light on pollution from natural gas power plants and require better planning for pollution reductions from plants. Though UCS’s legislative effort did not succeed this year, UCS is committed to finding solutions that allow us to transition away from natural gas in a way that is not only economical, but also equitable.

public domain

The Dinner Table is the Latest Battleground for Trump’s Attacks on Immigrant Families

Photo: USDA

From an ill-conceived campaign promise to build a border wall to the recent deployment of thousands of US troops to confront a non-existent “invasion,” radical immigration policy has been a hallmark of the Trump presidency. The administration has introduced a baseless Muslim travel ban; ordered a separation of families at the southern border that landed more than 2,600 children in government shelters; and suggested that children born in the US to noncitizen parents should not be granted citizenship.

Now, the administration is working to target immigrant families closer to home—at the dinner table.

The Department of Homeland Security recently requested public comments on a proposal to change longstanding immigration policy by dramatically expanding the types of public benefits that—if immigrants use them, or even if they’re deemed likely to use them in the future—would weigh against their visa or green card applications. Among them are benefits from the Supplemental Nutrition Assistance Program (SNAP, formerly food stamps), which acts as the first line of defense against hunger and financial instability for millions of families in the United States. The end result? Many immigrant families—including those who work and pay taxes (which is most) and those with children born in the US—will be forced to choose between maintaining a path to citizenship and putting food on the table during hard times.

Like many of the attacks that preceded it, the proposed policy is fundamentally at odds with the values we stand for as a nation: we do not discriminate based on religion or national origin, nor do we turn our backs on those in need. Furthermore, it threatens to dramatically worsen hunger and health disparities among some of our most vulnerable populations—including children who are themselves citizens.

UCS joins thousands of organizations in strongly opposing the Trump administration’s so-called “public charge” rule. Below is the letter we submitted to the Department of Homeland Security, outlining the potential damage that could be wrought by the policy.

The deadline for public comments is December 10. You can submit your own comment here, or visit the UCS website to add your name to our petition opposing the rule.

 

 

UCS Submits Public Comment to DHS on Proposed Public Charge Rule, “Inadmissibility on Public Charge Grounds; Notice of Proposed Rulemaking”

November 9, 2018

The Union of Concerned Scientists (UCS) is a science-based nonprofit seeking solutions to our planet’s most pressing problems—from combating global warming and developing sustainable ways to feed, power, and transport ourselves, to fighting misinformation, advancing racial equity, and reducing the threat of nuclear war. Immigration has always been and remains a critical source of America’s unparalleled scientific leadership; the diversity it brings is central to creating effective and meaningful solutions to our nation’s problems.  It also enriches our lives in innumerable ways. We therefore submit this comment to express strong opposition to proposed sweeping changes by the Department of Homeland Security (DHS) to US immigration law and the definition of a “public charge.” This proposed rule defies evidence and would prove devastating to many immigrant families—including those whose children are citizens of the United States—who could be forced in hard times to choose between meeting their daily needs and maintaining a path to citizenship.

Our opposition to the aforementioned policy and programmatic changes is grounded in the following:

  • Data refute the notion that immigrant families rely disproportionately on all forms of public assistance. In 2017, the National Academies of Sciences, Engineering, and Medicine examined the economic implications of immigration. Among other findings, the resulting report revealed that just 4.2 percent of immigrant households with children utilize housing assistance—which would be newly considered in determining public charge under the proposed rule—compared with 5.3 percent of US-born households.[1],[2] Data based on individual, rather than household, participation show that US-born populations use programs like SNAP and Medicare at higher rates than either naturalized citizens or noncitizen immigrants after adjusting for poverty and age.[3],[4] The proposed rule would unjustifiably bring harm to working families who are eligible for these programs—with potential lasting consequences for the long-term health and economic vitality of their communities.
  • The proposed rule would deter participation in programs such as Medicaid, which returns proven benefits for the long-term health, achievement, and economic success of children. The future of our country depends in part on the wellbeing and economic success of its children—about one in four of whom lives with at least one immigrant parent.[5] Research shows that participation in Medicaid not only helps children become healthy adults, but also leads to greater academic achievement and later economic success. Children with access to Medicaid have lower rates of high blood pressure, hospitalizations and emergency room visits as adults; are less likely to drop out of high school; and have higher incomes later in life—contributing a strong return on investment in the Medicaid program.[6] One study reviewing Medicaid expansion during the 1980s and 1990s estimated that, based on children’s future earnings and tax contributions alone, the government would recoup 56 cents of each dollar spent on childhood Medicaid by the time the children turned 60.[7]
  • The proposed rule penalizes working families whose most accessible employment opportunities are often low-wage and lack benefits such as health insurance. Research shows that the majority of children of immigrants live in households in which both parents are working yet are employed in lower-paying jobs without employer-sponsored health insurance.[8],[9] The food industry is among those that rely heavily on immigrant labor to fill low-wage jobs, from agricultural production to food distribution and service. Food workers make up about 14 percent of the nation’s workforce, and approximately one-fifth are foreign born.[10] The proposed rule would compromise workers’ ability to feed and care for their own families—even while many work in roles that uphold our food system as we know it.
  • The proposed rule risks worsening hunger and health disparities among vulnerable populations—including children—by deterring participation in effective nutrition programs. Already, social service providers have noted decreases in immigrant participation in major safety net programs stemming from fears of risking green cards or eventual citizenship. Representatives from WIC (Special Supplemental Nutrition Program for Women, Infants, and Children) agencies in states across the country reported reduced program participation following the first release of the draft rule.[11] Though WIC has since been removed from the proposed rule, SNAP remains. Lingering fears are likely to deter immigrant families’ participation in both of these critical programs that prevent hunger and maintain health while families work toward regaining financial stability. Children of immigrant parents, already more likely to experience food insecurity than children of US-born parents, would face greater risk of hunger and poor health without assistance from these programs.[12] Young children’s participation in SNAP is linked to lower rates of obesity and metabolic syndrome in adulthood, as well as higher rates of high school completion.[13]
  • The proposed rule would undermine the core function of the social support programs that comprise the federal safety net, which protects us all from the unexpected. The safety net is designed to protect children and adults from the devastating consequences of food insecurity, lack of healthcare, and financial instability in the face of unpredictable events such as job loss, family illness, or other crisis. These are circumstances that can befall any family unexpectedly. The proposed consequential changes to long-standing immigration policy based on a subjective evaluation of factors such as age, health, financial status, and education would have the negative side effect of preventing immigrants’ use of major safety net programs altogether. Such changes run counter to the purpose of the safety net and would undermine its effectiveness at safeguarding individual families, entire communities and the nation as a whole. When people in our country are poorer and sicker, we all lose.
  • The apparent rationale of the proposed rule flies in the face of core American values. Effectively requiring immigrants to demonstrate they have the resources to meet any current or even future need for assistance as a precondition to legal immigration and citizenship is contrary to America’s founding identity as a refuge, as well as to our nation’s ideals of equality, justice, and self-determination. Furthermore, in institutionalizing policies with consequences that will be overwhelmingly borne by people of color, the proposed rule threatens to reinforce racist and anti-immigrant sentiments that degrade our country and cause immeasurable harm to citizens and non-citizens alike.

UCS appreciates the opportunity to comment on this proposed rule. In expressing our strong opposition to the proposal, we join the thousands of organizations across the country who have voiced similar objections. The sweeping changes to immigration policy proposed in this rule would exacerbate hunger and health disparities, particularly among children of immigrants; cause harm to all our communities; deny our country the benefits that immigrants bring; and signal to the rest of the world that our society has abandoned our core American values of decency, hard work, and opportunity for all.

Thank you for your consideration.

 

References

[1] National Academies of Sciences, Engineering, and Medicine. 2017. The Economic and Fiscal Consequences of Immigration. Washington, DC: The National Academies Press.

[2] Immigrant households are based on the head of household’s immigrant status (where the head of household is considered immigrant if they are not a citizen or are a naturalized citizen).

[3] Nowrasteh, A. and R. Orr. 2018. Immigration and the welfare state: Immigrant and native use rates and benefit levels for means-tested welfare and entitlement programs. Washington, DC: Cato Institute.

[4] Supplemental Nutrition Assistance Program

[5] The Annie E. Casey Foundation. 2018. Children in immigrant families. Baltimore, MD. Online at https://datacenter.kidscount.org/data/tables/115-children-in-immigrant-families?loc=1&loct=1#detailed/1/any/fal, accessed October 19, 2018.

[6] Chester, A. and J. Alker. 2015. Medicaid at 50: A look at the long-term benefits of childhood Medicaid. Washington, DC: Center for Children and Families. Online at https://ccf.georgetown.edu/2015/07/27/medicaid-50-look-long-term-benefits-childhood-medicaid/, accessed October 19, 2018.

[7] Brown, D.W., A.E. Kowalski, I.Z. Lurie. 2015. Medicaid as an investment in children: What is the long-term impact on tax receipts? NBER Working Paper Series. Cambridge, MA: National Bureau of Economic Research.

[8] The Annie E. Casey Foundation. 2018. Children with all available parents in the labor force by family nativity. Baltimore, MD. Online at https://datacenter.kidscount.org/data/tables/5060-children-with-all-available-parents-in-the-la-bor-force-by-family-nativity?loc=1&loct=1#detailed/1/any/false/870,573,869,36,868,867,133,38,35/78,79/11478,11479, accessed October 19, 2018.

[9] Earle, A., P. Joshi, K. Geronimo, et al. 2014. Job Characteristics Among Working Parents: Differences by Race, Ethnicity, and Nativity. Monthly Labor Review. Washington, DC: Bureau of Labor Statistics.

[10] Food Chain Workers Alliance and Solidarity Research Cooperative (FCWA/SRC). 2016. No piece of the pie: US food workers in 2016. Los Angeles, CA: Food Chain Workers Alliance.

[11] Baumgaertner, E. 2018. Spooked by Trump Proposals, Immigrants Abandon Public Nutrition Services. The New York Times, March 6.

[12] Chilton, M. et al. 2009. Food insecurity and risk of poor health among US-born children of immigrants. American Journal of Public Health 99(3): 556-562.

[13] Council of Economic Advisers (CEA). 2015. Long-term benefits of the Supplemental Nutrition Assistance Program. Washington, DC: Executive Office of the President of the United States.

Photo: USDA

Can the EPA Protect Us from Ozone and Particulate Pollution Without Its Experts? What to Watch

This week, the EPA announced that its Clean Air Scientific Advisory Committee (CASAC) alone would conduct the upcoming reviews of the ozone and particulate matter standards. On October 10, the EPA nixed its ozone and particulate matter review panels—breaking with the agency’s practice, dating to the 1970s, of relying on expert science advisers for ambient air quality decisions, and consistent with this administration’s trend of abandoning science advice. That same day, the EPA replaced the independent scientists on CASAC, leaving a committee of mostly state and local regulators. On December 12, the EPA will bring together the new CASAC for the first time in person to discuss the state of the science on particulate pollution. Will the EPA be able to assess the science and make science-based decisions to protect public health? Here’s what to watch for.

A history of independent science advice

Using science to set ambient air pollution standards has worked remarkably well in the US. Under both Democratic and Republican administrations, our nation has been able to follow a science-based process to set air pollution standards that protect public health. Not to say there has never been political interference (see examples under both George W. Bush and Barack Obama), but the process by which EPA gets science advice on pollution standards has remained intact, even under tremendous pressure from industries and political actors to compromise the process.

Here’s how it works (at least up until now):

For major ambient air pollutants, the EPA assesses the state of the science on a pollutant and its health effects every five years or so, gathering all relevant peer-reviewed science into what’s called the Integrated Science Assessment (ISA). The exhaustively comprehensive ISA looks at all the relevant scientific literature that sheds light on the relationship between a pollutant and human health and welfare. (Fun Fact: The current particulate matter ISA includes extensive discussion of my own academic research on air pollution measurement and exposure error.)

The scientific teeth of the ISA are found in its causality findings—these summarize the weight of the evidence for linkages between the pollutant and different health effects. They range from “not causal” to “inadequate evidence” to “likely causal” to “causal.” It is important to note there is tremendous scientific backing behind each of these statements. A finding that the association between particulate matter exposure and mortality is causal, for example, is backed by science from multiple lines of evidence—epidemiologic studies, toxicology studies, controlled human exposure studies, and biological plausibility knowledge. The robust causal determination framework used by EPA has been vetted and endorsed broadly by experts in the scientific community. These causal findings inform EPA decisionmakers on how to best protect people from harmful pollutants.

To ensure that EPA scientists get the science right, they get help from the independent scientists on CASAC. In addition, since the 1970s the agency has relied on pollutant review panels to get input from experts on specific pollutants. CASAC, too, is of course composed of air pollution experts, but it has only seven members. It is not possible for such a small committee to capture the breadth and depth of the ISA and properly assess all aspects of the science. For example, to assess particulate pollution’s health impact, you’d want experts in epidemiology, toxicology, exposure assessment, instrumentation, modeling, and a host of other specialties. As a result, the EPA has always relied on larger groups of experts like the particulate matter review panel to peer-review its ISA and ensure it gets the science right.

An ill-equipped EPA

But now, EPA is going through the PM and ozone review processes with far less scientific expertise. The Trump administration dismissed the particulate matter review panel entirely, failed to constitute an ozone panel, and removed the independent scientists serving on CASAC. Now the agency is left with a seven-member committee of mostly air pollution regulators. This leaves very little subject matter expertise on air pollutant science and health.

In one striking example, our scientific understanding of particulate matter’s health effects is based in no small part on epidemiologic studies. And yet, not a single epidemiologist will be at the table when EPA assesses the ISA. (The EPA even admits this glaring omission in its recent announcement.) To say that the EPA is ill-equipped to have a scientific discussion on particulate matter in December is an understatement.

This lack of preparedness is exacerbated by the remarkable speed at which EPA is moving. The agency plans to set new ozone and PM standards by 2020—markedly faster than reviews have typically happened, given the steps required to gather scientific information, incorporate reviews from CASAC and the pollutant review panels, solicit public input, and analyze policy implications before making a policy decision. To meet this arbitrary deadline, EPA intends to streamline the process, combining analyses that used to be separate documents and likely cutting down on the number of meetings and draft documents. Such measures are almost certain to mean less public input and less scientific assessment feeding into the process.

How should we protect people from particulate matter?

So how should the administration protect people from the harms of particulate matter? The science suggests the EPA should be doing more. The draft ISA finds causal links between PM2.5 (that is, particulate matter less than 2.5 micrometers) and premature death and cardiovascular disease, and likely causal relationships between particulate matter and respiratory and nervous system effects and cancer. The scientific assessment also finds a likely causal link between ultrafine particles (PM less than 0.1 micrometers) and nervous system effects. This is the draft—prior to scientific review and public input—so the linkages are subject to change. But if these scientific findings hold, we should expect EPA to take action, in order to protect public health with an adequate margin of safety—as the Clean Air Act requires. Historically, when a pollutant is linked to a serious health impact, a standard is set to curb pollution. These linkages to health impacts suggest that EPA could consider tightening the PM2.5 standard in order to protect public health and that the agency could potentially propose a new standard for ultrafine particles. Historically, these are the kinds of considerations that CASAC and the PM review panel would vigorously debate at public meetings and calls, with opportunities for public input. But it is difficult now to see how the agency could do the same this time.

A need for science advice

Will this EPA take the further actions required to protect people from these health impacts? One thing is for sure, they are likely to get less science-based input on the decision. With a weakened CASAC and no pollutant review panels, EPA won’t get the direct and robust feedback it needs from the scientific community. Without that scientific input, it is easier for the administration to make a decision that’s politically convenient rather than scientifically backed.

To compensate for the lack of science advice formally being provided to the EPA, it will be especially important that the EPA hear from scientific experts at the December meeting on PM and on the November 29 CASAC call to discuss the ozone review process. It is also crucial for the EPA to hear from the public at these meetings, because the compressed timeline will mean fewer meetings and thus fewer opportunities for the public to provide comment. Both air pollution experts and members of the public can (and should!) comment on the ozone call (November 29) and on PM, either in writing (by December 11) or in person (sign up by December 5) at the meeting in Washington, DC, December 12-13. Join me there. I’ll be asking the EPA to listen to the scientists, and you can too.

After Pittsburgh, Thousand Oaks, Will New Congress Push for Gun Safety Research?

Photo: M&R Glasgow/Flickr

The night after the mid-term elections, our nation suffered another gruesome tragedy at the hands of an armed gunman, and I’m still ready for Congress to demand a science-based conversation on gun violence. Last night in Thousand Oaks, California, 12 people, including the gunman and an officer, were left dead and at least 10 others injured at a popular college bar. It is believed that several survivors of last year’s mass shooting at a Las Vegas music festival were present.

Between the antisemitic attack on the Pittsburgh synagogue on October 27, where 11 people were killed, and last night’s shooting in Thousand Oaks that left 12 people dead, there have been 11 other mass shooting incidents, resulting in 10 deaths and 46 injuries. That is in less than two weeks’ time.

My colleagues and I have written extensively in the past on gun violence and the need to remove barriers to federal research (find those posts here). We have seen some progress, with Congress clarifying this past spring that the Centers for Disease Control and Prevention (CDC) may pursue research on gun violence prevention. Previously, legislative language in spending bills (known as the Dickey Amendment) had effectively banned the CDC from researching gun violence since 1996. Gun violence is a public health issue, and as with all public health issues, it requires scientific evidence to build the most effective policies to protect people. But is that research actually happening now? We need to ensure that it is.

Just yesterday afternoon, the National Rifle Association (NRA) railed against the medical community for its peer-reviewed firearms studies. Shockingly, the NRA questioned whether doctors should weigh in on gun violence prevention, focusing their ire on a position paper written by the American College of Physicians (ACP) that was recently published in the Annals of Internal Medicine.

Someone should tell self-important  anti-gun doctors to stay in their lane. Half of the articles in Annals of Internal Medicine are pushing for gun control. Most upsetting, however, the medical community seems to have consulted NO ONE but themselves. https://t.co/oCR3uiLtS7

— NRA (@NRA) November 7, 2018

Ironically, the NRA itself steps out of its lane, weighing in on the details of a scientific paper by medical professionals.

Congress can change this. Legislators should give researchers specific funding and explicit instructions to study gun violence. Perhaps then our nation can rely on even more conclusive evidence on the causes of gun violence and develop solutions to prevent it—instead of relying on a powerful gun lobby to sway decision-making with its dollars and nonsense.

We have a new Congress. The new leadership in the House must prioritize oversight of gun violence research at the CDC and take this opportunity to appropriate more dollars to help solve this crisis.

Thousands of people in America lose their lives to gun violence every year. Just this year, there have been 12,477 firearm casualties. It is unfair to the people who have lost their lives to gun violence that we pay attention only after a mass shooting, and only while the story remains in the news media’s short issue-attention cycle. It is unfair and unacceptable that, despite the tireless work of advocates and activists, the nation has made little progress on gun violence reform.

Photo: M&R Glasgow

How to Make Professional Conferences More Accessible for Disabled People: Guidance from Actual Disabled Scientists

Photo by Yomex Owo/Unsplash.

Attending professional conferences is a key part of life as a scientist. It’s where we present our research, network, and reconnect with colleagues. But for disabled scientists like me, conferences can be inaccessible and frustrating. I talked to several other scientists with a wide range of disabilities about how conferences could be better, and put their advice together in this short summary (also available in a video, if you prefer that).

You should think of this as an introduction to conference accessibility, at best. There are many disabled consultants you can hire to improve your conference, and I highly recommend you do that! They will be able to help you on a much deeper, professional level, and deserve compensation for their time. I’ve also included links to more in-depth resources at the bottom of this page.

Thank you to my friends who helped with this piece: Dr. Alexandra Schober, Dr. Arielle Silverman, Dr. Caroline Solomon, Dawn Fallik, Susanna Harris, and several anonymous contributors. This is part of my Science + Disability series, which is supported by Two Photon Art.

Etiquette

If there’s one thing to remember about etiquette, it’s this:

“Talk to disabled scientists about their research. We want to be seen as scientists, not just as disabled people. Avoid saying things like ‘That’s such a cool sign for molecule,’ or, ‘It must be so hard to find signs for all those science words.’”

— Dr. Caroline Solomon

Remember that being an ally can start by just striking up a conversation. You don’t have to go up to a stranger and say, “Wow, you’re the only scientist I’ve ever seen with a medical alert dog. That must be really tough. Can he go in the lab with you? Wow!” Instead, you can just be friendly, the same way you would with anyone else at the conference. I know that I’m far more likely to ask someone for support if I already have a personal relationship with them.

If you think a disabled person needs help, you should always ask them before stepping in. There’s a good chance that they can manage just fine on their own. You can simply ask, “Would you like some help?” and allow them to indicate their preference.

Tell fellow attendees and organizers about these accessibility suggestions! It is exhausting to always have to speak up and ask for accommodations or point out where the problems are. You can lessen that burden by giving your peers gentle reminders to do a better job with accessibility.

Physical Space

What can I say, we like seating options!

Several people had great recommendations around seating. A few of the ideas:

  • Arrange the chairs with plenty of aisles, so that people can easily exit the row to reduce anxiety and/or panic or to allow people with mobility aids enough room to get by.
  • Make chairs available for all speakers, preferably without arms to better accommodate people of all sizes.

  • Reserve seats at the front of the room for deaf and hard of hearing audience members, or leave some spaces open for wheelchair users.

In a large conference center, provide clear signage and places to rest when trying to get from one room to another. Some conference centers span multiple city blocks and can be exhausting for anyone to navigate, but this becomes even more of a challenge for physically disabled or chronically ill people.

Provide a quiet room where people can go to relax or have some privacy. There should be guidelines that specify that the quiet room is not a place for phone conversations. Quiet rooms are helpful for everyone who needs a break from the busy conference environment, but are also an important space for neurodivergent attendees.

Ask speakers and participants to be scent-free, meaning that (at a minimum) they don’t wear perfume or cologne or use any other scented products. Strong smells can be migraine triggers or distractions for neurodivergent people.

Communication

Prepare both digital and printed versions of conference material. Digital materials are more accessible to blind and low vision attendees (for use with screen readers), and printed materials can help people with ADHD or learning disabilities follow along more easily.

Indicate (preferably before the conference starts) whether food and drinks will be present. This helps people with health conditions and/or food allergies plan how much food to bring. Also, make sure you clearly label food with relevant allergens. Food allergies can be life threatening and can be a barrier to full participation if attendees aren’t sure what is in the food.

Presentations and Panels

Always use a microphone, even if you think you don’t need it, you have a loud voice, or it’s a small crowd. You don’t want to put someone who is hard of hearing or deaf in the position of having to publicly request that you use a microphone. Instead, it’s your responsibility to make your program accessible. If you are organizing a conference, make sure to provide the necessary AV equipment in each room and tell all speakers that they must use microphones.

Poster Sessions

Make sure there are volunteer or staff guides available to give directions and/or read posters to blind and low vision attendees. Poster halls are often giant and difficult to navigate, so this is a great example of a disability-related accommodation that would benefit everyone.

Receptions

Buffets are inaccessible to many people, including those who are blind or have low vision, use mobility aids, have arm weakness, or can’t stand for long periods. That said, they’re often the fastest and easiest way to feed large groups of people. If a buffet is your only option, have volunteers and/or staff members available to provide assistance.

Create opportunities to socialize without alcohol. People with mental illness may avoid alcohol because of medication interactions, history of trauma, addiction, or a simple preference.

If the reception is being held outside of the convention center/main conference location, tell attendees how far away the event is and whether the venue is accessible. It’s always better to be honest than to hope there’s no problem. For example, if there’s one step to get into the restaurant, just make sure the attendees know that beforehand. Of course, it would be better to choose a fully accessible space for the reception!

Other Resources

There are so many resources available to help you make your events and physical spaces more accessible to disabled people. Here are just a few of my favorites:

Please add a comment below with other resources I should add!

 

Gabi Serrato Marks is a fourth-year PhD candidate in marine geology in the MIT-WHOI Joint Program. Her primary research focuses on ancient climate records. You can find her on Twitter and Instagram as @gserratomarks.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Photo by Yomex Owo/Unsplash. Photo by Matthias Wagner/Unsplash

Even in a Carbon-Constrained World, FirstEnergy’s Nuclear Bailout Proposal in Ohio Must Be Rejected

The 908 MW Davis-Besse nuclear power plant, owned by FirstEnergy and located 21 miles east of Toledo, Ohio on Lake Erie. Photo: Nuclear Regulatory Commission.

A new report, The Nuclear Power Dilemma, released today by my UCS colleagues, finds that more than one-third of the nation’s nuclear power fleet – plants that provide more than 20 percent of the country’s nuclear power – is uneconomic or slated to retire over the next decade, primarily for economic, safety, and performance reasons. Two of the uneconomic plants—Davis-Besse and Perry—are in Ohio and owned by Akron-based FirstEnergy Corp. Like the analysis’s other unprofitable nuclear plants, Davis-Besse and Perry can’t compete in today’s power markets with the cheap natural gas and renewable energy that are transforming our nation’s electricity sector. That’s why FirstEnergy is now seeking a bailout from the Ohio legislature to keep these facilities open.

In a world where the threat of climate change is increasingly dire and the need to dramatically cut carbon emissions is even more urgent, every source of zero-carbon energy is important. But make no mistake: FirstEnergy’s bailout proposals for its struggling nuclear plants are poorly conceived and must be rejected. Here’s why.

FirstEnergy isn’t interested in advancing clean energy or reducing carbon emissions

After doubling down on coal and nuclear despite the rise of cheap natural gas and renewables, FirstEnergy has spent years trying to win support for a bailout of all of its uneconomic power plants, including its fleet of old, inefficient, and dirty coal-fired plants. First, it appealed to the Ohio utility commission for a bailout of its coal plants; then it went to President Trump and the Federal Energy Regulatory Commission with a case predicated on debunked claims that coal plant retirements would compromise the reliability of the electricity sector. Both plans have fortunately failed thus far. But that hasn’t stopped FirstEnergy, and if they get their way, ratepayers will be subsidizing uneconomic, carbon-intensive coal plants to the detriment of our public health, environment, and climate.

Furthermore, FirstEnergy is actively trying to stall Ohio’s clean energy momentum. It has spent years at the Ohio legislature trying to gut the state’s energy efficiency and renewable energy standards, which have helped spur Ohio’s nascent clean energy industries. Seeking subsidies for uneconomic coal with one hand while trying to kill clean energy progress with the other leaves no room for good-faith negotiation over support for its nuclear facilities.

Of the 30 states with nuclear power plants, 17 states–including Ohio–have nuclear capacity that is unprofitable or scheduled to close.
Source: UCS

FirstEnergy’s newest proposal fails our conditions for support on all accounts

FirstEnergy’s latest attempt to bail out its Ohio nuclear plants is a “zero-emissions nuclear” (ZEN) proposal (HB 381 in the Ohio legislature) that would generate ZEN credits for every megawatt-hour (MWh) of power produced by its nuclear plants and then require Ohio’s electric utilities to buy the credits for $17 each (adjusted annually for inflation) through 2030. The legislature’s fiscal analysis reveals the proposal would cost Ohio ratepayers $180 million or more per year.
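To make the mechanics of the credit concrete, here is a minimal sketch; the $17-per-MWh price, the annual inflation adjustment, and the 2030 sunset come from the proposal as described above, while the combined generation figure and the 2 percent inflation rate are placeholder assumptions used only to show the order of magnitude:

```python
# Minimal sketch of the ZEN credit mechanics in HB 381 as described above.
# The $17/MWh credit price and 2030 end date come from the proposal; the
# combined generation figure and 2% inflation rate are placeholders.

CREDIT_PRICE_2019 = 17.0              # dollars per MWh of nuclear generation
ANNUAL_GENERATION_MWH = 11_000_000    # hypothetical combined output of both plants
INFLATION = 0.02                      # assumed annual inflation adjustment

def annual_ratepayer_cost(year, start_year=2019):
    """Cost to Ohio utilities (and ultimately ratepayers) of buying the credits."""
    price = CREDIT_PRICE_2019 * (1 + INFLATION) ** (year - start_year)
    return ANNUAL_GENERATION_MWH * price

for year in (2019, 2025, 2030):
    print(year, f"~${annual_ratepayer_cost(year) / 1e6:,.0f} million per year")
```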

UCS’ new report argues that we must consider the impacts of potential abrupt nuclear plant retirements in achieving the carbon reductions necessary to avoid the worst impacts of climate change. While the potential retirement of Davis-Besse and Perry poses no threat to the reliability of the region’s power supply, the analysis does show that in the absence of strong policies such as a price on carbon or robust low-carbon electricity standards, coal and natural gas would largely replace their lost generation, thereby raising near-term carbon emissions at exactly the time when those emissions need to be going down. As a result, exploring some means to ensure that these and other unprofitable plants continue operating warrants discussion.

Importantly, the UCS report lays out five conditions that must be met before any consideration should be given by policymakers to providing economic support exclusively to struggling nuclear plants. FirstEnergy’s nuclear bailout proposal fails all of them:

  • Safety: any plant qualifying for economic support must meet or exceed the Nuclear Regulatory Commission’s highest safety standards. The Davis-Besse plant fails this test with one of the worst safety records in the nation’s nuclear fleet.
  • Transparency: nuclear plant owners should open their financial books for regulators and the public to protect ratepayers by demonstrating the need for economic support. FirstEnergy’s proposal far exceeds our estimate of what these plants would need to survive, and they’ve offered no proof that this level of support is necessary.
  • Flexibility: To further protect consumers, financial support should be temporary and adjustable to account for changing economic or policy conditions. FirstEnergy’s proposal appears to lock in significant ratepayer expense through 2030 with no meaningful review or provisions for adjustment.
  • Strengthened renewable energy and energy efficiency standards: FirstEnergy’s proposal does nothing to stimulate the rapid growth in clean energy resources needed to meet our deep carbon reduction goals. As discussed above, FirstEnergy has spent considerable effort trying to stop Ohio’s momentum in advancing renewables and efficiency. In contrast, Illinois, New York, and New Jersey significantly strengthened their renewable electricity and energy efficiency standards as part of legislation that provided financial support for distressed nuclear plants.
  • A commitment to impacted communities: Transition plans for affected workers and communities – to attract new investment, replace lost jobs, and rebuild the tax base once nuclear plants eventually do retire – must be included in any economic support proposal. FirstEnergy’s proposed legislation does not put forth anything meaningful in this respect.

So, there you have it: zero out of five conditions met. Because FirstEnergy has shown no commitment to seriously addressing the threat of climate change and because its proposal for bailing out its unprofitable nuclear power plants meets none of the above criteria, it’s clear that Ohio legislators should say no to FirstEnergy’s bailout proposal and instead move forward with a clean energy plan that builds on Ohio’s abundant potential for renewable energy and energy efficiency.

Let’s be clear: UCS does not prefer a piecemeal approach to achieving necessary carbon reductions in our electricity sector. We strongly recommend state and federal policies such as a price on carbon emissions or a low-carbon electricity standard that provides a level playing field for all low-carbon technologies. Our analysis shows these policies would cost-effectively achieve much greater carbon reductions. Unfortunately, there’s currently a leadership void in Washington, DC. Given the urgency of climate change, we must therefore explore alternatives. But we must also ensure that the alternatives stand up to scrutiny and ultimately move us toward a truly clean economy fueled primarily by renewable energy resources. FirstEnergy’s current proposal simply doesn’t pass muster.

NOTE: UCS Sr. Analyst Sam Gomberg contributed to the drafting of this blog post.
