UCS Blog - The Equation

Scott Pruitt’s Regulatory Rollback Recipe  

Vehicle pollution is a major issue for human health and the environment.

EPA Administrator Scott Pruitt continues to stack the deck in favor of industry interests. At least two members appointed by Pruitt to the EPA Science Advisory Board received funding to conduct misleading research that EPA used to justify reexamining vehicle fuel efficiency standards – a regulation forecast to save consumers over $1 trillion, cut global warming emissions by billions of metric tons, and advance 21st century vehicle technology.

This shameless attempt to overturn a regulation grounded in sound science and widespread public support – using shoddy research funded by the oil industry and promoted by automaker trade groups – is a perfect example of how Pruitt intends to roll back regulations at the behest of his industry-tied former donors.

Pruitt’s plan is a simple (though perhaps illegal) five-step recipe. Here’s exactly how he has been cooking up a regulatory repeal (or re-peel) soup of equal parts corruption, paranoia, and apathy.

Step 1: Separate independent science from the record, then discard

Make it exceedingly difficult for academic scientists to join the advisory committees that help your agency set pollution thresholds, compliance deadlines, and cost estimates.  These committees are supposed to represent the viewpoints of both independent scientific experts and industry stakeholders, but you can argue that the composition of these committees is solely at your discretion. So go ahead and kick those academic nerds off the advisory committees and replace them with industry-funded friends.

Step 2: Liberally add industry-funded junk science to your liking

Promote the “studies” of your new industry-funded advisory committee friends. Bonus points if they use junk science to show that health benefits from reducing smog “may not occur,” that rising carbon dioxide levels are beneficial to humanity, or that people don’t want more fuel-efficient cars and trucks. At the same time, give your employees new talking points on climate change to ensure any public-facing communications either cast doubt on the science your agency has previously relied on or don’t mention it at all. Ruthlessly reassign or fire any employee who fails to comply.

Step 3: Bake junk science into the record

This step is important. Copy the text from industry-funded studies into your official justification to reevaluate, suspend, or roll back rules that science has already shown to be effective. The fastest and easiest way to do this is to just copy the text verbatim. Don’t worry that the administrative record supporting the original enactment of these regulations is chock-full of academic, peer-reviewed studies and thousands of public comments that demonstrate why these regulations are reasonable, achievable, and necessary. Also ignore trepidation from agency career staff who think you are opening the agency to legal challenges or failing to use sound science to justify your agenda.

Step 4: Set the legality dial to “uncertain,” and wait for the lawsuits to settle

Use the vast legal resources at your disposal to make any legal challenges to your efforts take as long as possible, which, in the federal court system, can be a very long time indeed. While the courts wrestle with whether you have overstepped your authority, your rollback will remain in place – effectively shielding industry from the regulation for potentially years.

Step 5: Clean your workspace to eliminate traces of corruption and outrageously bad ethics

Make sure you have the support of your boss as you engage in some light to medium graft and corruption. You will probably need a soundproof “privacy booth” that costs taxpayers close to $43,000, a security detail that costs $3 million and protects against non-existent death threats, and a cheap condo rented from the wife of a corporate lobbyist for the fossil fuel and auto industries. Keep public leaks of your missteps to a minimum and refrain from using social media to say anything of value.

Overall, this recipe is a disaster for both independent science and public health. Help UCS push back against Pruitt’s effort to cook this regulatory rollback soup by checking out our new nationwide mobilization effort called Science Rising. This effort isn’t a one-day march – it is a series of local activities, events, and actions organized by many different groups. Our shared goal is to ensure that science is front-and-center in the decision-making processes that affect us all, and to fight back against efforts that sideline science from its crucial role in our democracy.

Will you join us to keep #ScienceRising?

 

The White House Clearly Does Not Like the EPA’s “Secret Science” Plan

The EPA’s plan to limit the types of science the agency can use to make decisions may run into an unusual roadblock: the White House itself. In a Senate hearing yesterday, New Hampshire Senator Maggie Hassan questioned White House official Neomi Rao about the EPA plan (watch here, beginning at 59:02), and the answers suggest that the EPA and the White House are not on the same page.

Ms. Rao heads the Office of Information and Regulatory Affairs (OIRA) in the White House’s Office of Management and Budget. The office is responsible for overseeing the administration’s regulatory agenda. Agencies submit rules for OIRA review before they can be finalized.

The White House tends to enthusiastically support federal agency initiatives. But in a hearing Thursday, the administration’s representative, Neomi Rao, was notably lukewarm about the EPA’s proposal to limit the use of science at the agency. Archival photo via C-SPAN.

Some speculate that OIRA is not keen on the EPA’s proposal because it could make it more difficult for the EPA to weaken clean air and clean water protections. A court can strike down agency actions that are not grounded in evidence – both decisions that improve public protections and decisions that erode them. So, in a perverse way, the EPA’s desire to go back to 1950s regulatory standards could be hampered by its own proposed science restrictions.

Senator Hassan began by questioning Administrator Rao about a proposal to give restaurant owners more control over service workers’ tip money. The Department of Labor purposely hid analysis showing the proposal would take billions of dollars out of the pockets of food servers, baristas, and many other hardworking people. OIRA allowed the department to move forward with the proposal, even though it lacked sufficient data to do so (Senators Heitkamp and Harris asked great follow-up questions later in the hearing).

Then Senator Hassan moved on to the EPA (my emphasis added):

Senator Hassan: EPA Administrator Scott Pruitt is reportedly considering a proposal that would prevent the EPA from using a scientific study unless it is perfectly replicable and all the underlying raw data is released to the public. That is problematic for a whole host of reasons. For example, it could require the release of confidential medical information, which in turn may reduce participation in studies, but it would also prevent the EPA from considering some of the best evidence we have available to us when making regulatory and deregulatory decisions. Have you and your office provided any input to Administrator Pruitt on this proposal?

Administrator Rao: The questions about information quality are very important to us, and that is something that my staff has been working with the EPA on to develop best practices in that area.

Senator Hassan: Do you think such a proposal as the one I just described, the one that is from the EPA that would limit the information agencies can use by preventing them from considering best available evidence makes sense?

Administrator Rao: Well I think we want to make sure that we do have the best available evidence. I think it’s also important for the public to have notice and information about the types of studies which are being used by agencies for decision making, so I think that there is a balance to be struck there, and I think that’s something that the EPA is working towards.

That’s not exactly a ringing endorsement, and it’s evidence that the friction between the White House and the EPA extends beyond numerous ethical scandals to the agency’s style of policymaking as well.

“Scientific evaluation and data and analysis is an ongoing process,” continued Senator Hassan. “As you know, we’ve talked about one of my priorities is the response to the opioid crisis in my state and across this country. If we wait for so-called perfect science, we’re not going to have evidence-based practices out there that are saving lives. And so I think it is critically important that we continue to honor scientific process and make sure that we are using best available data when we make policy.”

At one point, Senator Hassan posed this direct question: “Would you generally support agencies changing their procedures in ways that prevent them from using the best available evidence when making these decisions?”

“No, I would not,” replied Administrator Rao.

On that, they could agree.

Brace Yourself for Unhealthy Air: The Trump Administration Weakens Clean Air Protections

Yesterday the Trump administration started chipping away at one of the strongest science-based public health protections we have in the country. In a laundry list of industry wishes, President Trump has ordered the EPA to make several sweeping changes to how it implements ambient air pollution standards.

I’m saddened at the potential for this to weaken the clean air protections we enjoy every day because of our nation’s long history of strong science-based policies. Other countries have air pollution laws, but not all of them come with teeth. In the US, we are lucky to have air pollution laws that work. They work because they require that decisions be made based on what’s protective of public health, not on what’s convenient for regulated industries. And importantly, the Clean Air Act includes consequences for failure to meet air pollution standards, ensuring strong incentives for states and industries to comply.

I’ve been proud to live in a country where these protections save thousands of lives and prevent thousands more respiratory illnesses, cardiac illnesses, and missed work and school days every year. But now it’s less clear if my family and yours will enjoy the same.

Here are four ways the new executive order will undermine our science-based air pollution protections.

1. Requiring science advisors to consider non-scientific information in their advice to EPA

This is one of the most concerning changes in the executive order. The EPA relies on the Clean Air Scientific Advisory Committee (CASAC) for independent scientific advice on where the agency should set air pollution standards in order to protect public health. Comprised of air pollution and health experts from universities and other entities outside of the EPA, the committee dives deep on exactly what the science says about the relationship between air pollutants and the health of Americans. This system has worked remarkably well to ensure the EPA is making decisions consistent with the current science and to hold the agency accountable when it doesn’t. (See more on the important role of CASAC and the independent science that feeds into the EPA process here and here.)

In a striking reversal of precedent, the president’s order asks the committee to also consider “adverse public health or other effects that may result from implementation of revised air quality standards.” This is scientifically problematic and likely illegal.

The Clean Air Act mandates that ambient air pollution standards be determined by what is protective of public health with an adequate margin of safety—and that’s it. Economic impacts, costs to industry, etc. cannot be considered. The Supreme Court affirmed this in 2001 in its Whitman v. American Trucking Associations decision. Ordering the EPA’s science advisers to consider information outside of the health impacts of pollutants shoves a wrench into a functional science-based process for protecting the nation’s health.

This move builds on other administration efforts to weaken the technical chops of CASAC and other government science advisory committees, by allowing them to sit idle and replacing qualified independent scientists with conflicted or unqualified individuals.

The order’s section on science advisers also signals that the agency will explore ways to “ensure transparency in … scientific evidence” considered by the committee, in what is likely a nod to the Trump administration’s expected move on addressing “secret science.” (Learn more on the many reasons this proposal is flawed.)

2. Restricting the science that can be used to protect public health

Many more people could now be living in areas that evade air pollution protections, thanks to a provision that limits what scientific information the agency can use to determine who is breathing bad air.

The Trump administration just moved to weaken our nation’s strong ambient air pollution protections, paving the road for increased pollution across the country.

The order includes an innocuous-seeming provision declaring that the agency should “rely on data from EPA-approved air quality monitors” to decide which areas need to improve their air. The EPA, of course, already relies heavily on monitoring data to make decisions about where air pollution standards are being met. But it can’t do this everywhere. Not every county or jurisdiction will have a monitor (accurate long-term monitoring isn’t cheap), so in areas without monitors for specific pollutants the EPA uses modeling or satellite information to determine air quality.

This can be a cost-effective way to determine where air is unhealthy, and for some pollutants it can be impressively accurate. For example, for pollutants like ozone that form in the atmosphere, scientists know that concentrations are very consistent over large distances: if a monitor tells me ozone levels are high, I’m confident that ozone levels are also high five miles down the road.

For other pollutants, modeling can be crucial for ensuring that people are protected from industrial emissions. Sulfur dioxide, for example, is emitted from coal-fired power plants, and its concentrations can vary a lot over space: a place directly downwind of a power plant could get hit hard with sulfur dioxide pollution, while an area five miles away could have clean air. In these cases, modeling air pollution concentrations allows the EPA to protect people from pollution that might otherwise be hard to characterize with a few monitors. (If you want to know more on this point, I know a good dissertation.)
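As a toy illustration of how estimates can fill the gaps between monitors, here is a minimal inverse-distance-weighting sketch in Python. This is a deliberately simplified stand-in of my own, not the EPA’s methodology; real regulatory assessments rely on full chemical-transport models and satellite retrievals.

```python
# Toy gap-filling example (NOT the EPA's actual method): estimate a
# pollutant level at an unmonitored site by weighting nearby monitor
# readings inversely by distance.

def idw_estimate(site, monitors, power=2):
    """Estimate a pollutant level at `site` from nearby monitors.

    site:     (x, y) coordinates of the unmonitored location
    monitors: list of ((x, y), reading) tuples
    """
    num = 0.0
    den = 0.0
    for (mx, my), reading in monitors:
        d2 = (site[0] - mx) ** 2 + (site[1] - my) ** 2
        if d2 == 0:
            return reading  # the site sits exactly on a monitor
        w = 1.0 / d2 ** (power / 2)  # closer monitors count for more
        num += w * reading
        den += w
    return num / den

# Three hypothetical ozone monitors (readings in ppb)
monitors = [((0, 0), 70.0), ((10, 0), 72.0), ((0, 10), 68.0)]
print(round(idw_estimate((2, 2), monitors), 1))  # → 70.0
```

The specific interpolation method isn’t the point: with even a handful of monitors, a defensible estimate of air quality in unmonitored areas is possible, and taking modeling tools away leaves blind spots.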

Preventing the EPA from fully using available tools for scientific assessments means many areas, especially suburban or rural areas, could have unhealthy air that goes unnoticed and won’t be cleaned up.

3. Increasing demands without increasing resources for the EPA and states

Several provisions of the executive order focus on expediting permitting and implementation processes. In theory, this is a good idea. We would all benefit from more time-efficient government processes. However, it cannot be done in a vacuum. The EPA has been asked to do more with less over the years. Expecting the agency to expedite processes without providing additional resources could mean cutting corners or less rigorous analysis. This wouldn’t help the agency meet its mission of protecting people from air pollution; it would make it easier for lapses in implementation to happen.

This is especially true when we look at permitting processes, which are largely handled by the states. States won’t have additional resources to conduct air pollution modeling, analyze measurements, and evaluate permit applications for new industrial sources. Asking them to expedite this process could make it easier for industrial sources to be built in already polluted areas.

4. Allowing for more pollution in already hard-hit areas

In several ways, the order stands to increase pollution in areas that already face disproportionate impacts from air pollution. In addition to the expedited permitting discussed above, the rule also allows for interstate trading of pollutant emissions. Such schemes have worked well in the past (e.g. we’ve been remarkably successful at reducing acid rain), but in this case, it will be important to watch closely how this is implemented. For the ambient air pollutants that fall under this order and some of their precursors, there are acute health effects. Thus, in a trading scheme, someone will get the short end of the stick in terms of breathing bad air.

In other words, trading emissions might allow one state to breathe cleaner air, and emissions could be reduced overall, but that means another area will see an increase in emissions. When the pollutant in question has adverse health impacts, that’s a big problem for anyone living downwind of a plant that bought those emissions credits. Similarly, states that depend on interstate cooperation to reduce pollution within their borders are likely to get a raw deal here, as Senator Tom Carper of Delaware rightfully pointed out yesterday in a statement.

Already, the Clean Air Act doesn’t do a great job of improving air quality in hotspots where pollutant levels may be uncharacteristically high compared to the surrounding areas. This executive order could make that problem worse. As Alex Kauffman discussed on the Huffington Post yesterday, the people most affected by this are likely to be communities of color, which are already burdened with disproportionately high levels of air pollution across the country. This of course adds to many other steps the administration has taken that worsen inequities in pollution exposure.

Brace yourself for bad air

The bottom line is that the president’s order is bad news for anyone who breathes air in this country. It represents a chipping away at the strong air pollution protections we’ve enjoyed for decades. It doesn’t serve the public interest, and it certainly doesn’t advance the EPA’s mission of protecting public health. It serves only those who wish to pollute, exposing more Americans to unhealthy air. And this will come with consequences for our health. The fate of this new order will likely play out in the courts, but in the meantime, I wish we could all just hold our breath.

 

 

Stories, Improv, and What Science Can Learn From Comedy

Can you name a scientist? If your answer was no, you are not alone. Eighty-one percent of Americans cannot name a living scientist, according to a 2017 poll conducted by Research America. As scientists, it is our responsibility to reach out to the public and talk to people about what we do, why it is important, and how it connects to their lives. We are not trained to make those connections and do public outreach, but luckily there are more and more opportunities to learn.

We are graduate students and members of Science in Action, a science communication and policy advocacy group at Colorado State University. Our goal is to encourage other scientists on campus to learn about and practice sharing their science. With financial support from the Union of Concerned Scientists, we were able to take advantage of unique opportunities to do just that.

Acting for science: using improv techniques to communicate

Scientists are trained to methodically approach problems and rigorously analyze solutions, but not taught how to communicate our findings. We may be doing vitally important work that benefits humanity, but what if we cannot communicate its importance to the public?

Actors, on the other hand, are expert storytellers. They use specific techniques to connect with their audience—techniques that scientists can and should learn to use.

Members practicing “acting tools” with Sarah Zwick-Tapley.

To help aspiring scientists learn these tricks of the trade, we partnered with the Union of Concerned Scientists to host a science communication workshop. Sarah Zwick-Tapley, a local theater director and science communication consultant, introduced us to the “actor’s toolkit,” a set of physical and vocal techniques for audience engagement.

These tips were simple enough (make eye contact; vary the tone, volume, and speed of your voice), but incorporating them all together while also describing the importance of your science? That is a challenge.

Another critical piece of the storytelling approach is using the “And, But, Therefore” sequence. We practiced this technique with an outlandish example. First, you start with what we know (“we know cancer is a deadly disease AND that it has many causes”). Next, you build suspense with what we have yet to discover (“BUT, we don’t know whether eating old books causes cancer”). Then, you finish with your contribution (“THEREFORE, I am eating Shakespeare’s entire body of work to see if I develop cancer”). Using this technique turns a simple list of facts into a powerful story.

The next step: put our new acting skills into action.

Why science matters for Colorado

Colorado is home to multiple national laboratories and major research universities.

Standing in front of the Colorado State Capitol after sharing our science with legislators and staffers.

Researchers at these organizations do important science and bring the best and brightest minds to the state. To help share these discoveries with our state legislators, we joined Project Bridge, from the University of Colorado Denver Anschutz Medical Campus, for a poster day at the capitol. Speaking with non-scientists can be a challenge, but we used our new acting tools to tell a story, both in our poster design and our presentation.

We also took this opportunity to meet one-on-one with our state representatives. Because they represent a college town, they recognize the value of research for our city, state, and country. We were encouraged to hear that they regularly rely on experts at CSU for advice on pending legislation. This is science policy in action.

Communicating for the future

As a scientist, you may recognize that communicating science is important, but be unsure how to learn these skills. Luckily, there are numerous organizations across the country that are dedicated to training scientists to communicate clearly and effectively. Many scientific organizations (the American Association for the Advancement of Science, the American Geophysical Union, and the American Society for Cell Biology, among others) hold science communication and science policy trainings and provide small grants for local groups. COMPASS is an international organization that hosts trainings and provides one-on-one coaching for aspiring science communicators. Many universities have also started in-house communication trainings and programs (Stony Brook University is home to the Alan Alda Center for Communicating Science).

These resources illustrate the fact that there are people and organizations dedicated to providing scientists with the tools they need to share their science with everyone.

 

Rod Lammers and Michael Somers are graduate students at Colorado State University. They are both officers in Science in Action, a science communication and policy group. Science in Action is a student-led organization at Colorado State University started in 2016 to engage campus scientists and provide opportunities for outreach to the public and policymakers. More information can be found on the organization’s website and Facebook page.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Trump Onboard for Offshore Wind?

Workers dwarfed by offshore wind blade tips and towers at Siemens deployment dock in Hull. Offshore wind is part of the revival of many port cities in Europe. Photo: Derrick Z. Jackson

The Trump administration’s quiet embrace of offshore wind became a shout heard around the industry last week. At an offshore wind conference, Interior Secretary Ryan Zinke said, “We think there’s an enormous opportunity for wind because of our God-given resources off the coast. We’re pretty good at innovating. I’m pretty confident that the wind industry is going to have that kind of enthusiasm.”

Sec. Zinke stoked that enthusiasm by announcing the opening of the bidding process for the final 390,000 acres of federal waters far south of Martha’s Vineyard. Those parcels went unclaimed in a 2015 auction held in the despairing wake of the collapse of Cape Wind in Nantucket Sound, leaving many advocates wondering if offshore wind, which has become an important source of energy in northern Europe, would ever take off in the United States. The prospects came even more into question with the 2016 election of President Trump, who routinely claimed that offshore wind was too expensive.

But the dramatically dropping costs of offshore wind, which is now cheaper than nuclear power and closing in on parity with fossil fuels in Europe, have sparked an explosion of renewed interest in the US. The bidding for those once-orphaned waters off Massachusetts likely will be fierce as they have already received unsolicited bids from the German wind company PNE and the Norwegian energy giant Equinor, the former Statoil.

The Interior Department made two other significant announcements last week to further brighten offshore wind’s prospects. It announced that it was soliciting industry interest and public input on the possibility of establishing offshore wind farms in 1.7 million acres of the New York Bight, the waters that curl up from New Jersey to Long Island. It also is soliciting input on an assessment of all Atlantic offshore waters for wind farm development. Zinke’s energy policy counselor, Vincent DeVito, said in a press release, “We are taking the next step to ensure a domestic offshore wind industry.”

This is the surest sign yet that an administration that has pulled out of global climate change agreements and is rolling back environmental protections at the behest of the fossil fuel industry, nonetheless does not want to miss out on the economic potential of a renewable energy industry that has revived many ailing port cities in northern Europe.

Worker gives scale to offshore wind blades nearly a football field long at a Siemens facility in Denmark. Photo: Derrick Z. Jackson

It surely must help politically that onshore wind is now a bedrock of American energy, with rock-solid bipartisan support in an oft-divided America. Rural turbines have dramatically changed the energy landscape in the Republican-dominated states of the Midwest and Great Plains: Texas, Iowa, Oklahoma, and Kansas are the top wind-electricity-generating states, and the nation’s fastest-growing occupation – wind turbine service technician – pays more than $50,000 a year.

A similar bipartisan picture is rapidly developing for offshore wind along the Eastern Seaboard. Democratic and Republican governors alike are staking claims in the offshore industry, from Massachusetts’s game-changing 1,600 megawatt mandate to Clemson University in South Carolina being chosen to test the world’s most powerful turbine to date, a 9.5 megawatt machine from Mitsubishi/Vestas.

Both states happen to have Republican governors who have joined their Democratic counterparts in opposing Zinke’s proposal to also exploit the Atlantic continental shelf for oil and gas. In the same Princeton speech in which he praised the possibilities of offshore wind, Zinke acknowledged that offshore fossil-fuel drilling was opposed by governors in every East Coast and West Coast state except Maine and Georgia. “If the state doesn’t want it, the state has a lot of leverage,” he said.

In contrast, offshore wind’s leverage has become almost undeniable. The last two offshore wind lease auctions, in New York and North Carolina, added a respective $42 million and $9 million to federal coffers. With Massachusetts, New York, and New Jersey leading the way, there are now more than 8,000 megawatts of legislative mandates and pledges by current governors. That could meet the needs of between 4.5 million and 5 million homes, based on the proposals made for 800 MW farms in Massachusetts.
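As a back-of-envelope check on that scaling (a sketch only, using the assumption implied above that one 800 MW Massachusetts-scale project serves roughly 450,000 to 500,000 homes):

```python
# Back-of-envelope scaling (illustrative, not an official estimate):
# if one 800 MW project serves roughly 450,000-500,000 homes, the
# 8,000 MW of current mandates and pledges is ten such projects.

farm_mw = 800
homes_per_farm_low, homes_per_farm_high = 450_000, 500_000

pipeline_mw = 8_000               # mandates and pledges to date, in MW
scale = pipeline_mw / farm_mw     # = 10.0 farms' worth of capacity

low = int(homes_per_farm_low * scale)
high = int(homes_per_farm_high * scale)
print(f"{low:,} to {high:,} homes")  # → 4,500,000 to 5,000,000 homes
```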

An 8,000 megawatt, or 8 gigawatt (GW) market could alone create between 16,700 and 36,300 jobs by 2030, depending on how much of the industry, currently centered in Europe, is enticed to come here, according to a joint report by the clean energy agencies of Massachusetts, New York, and Rhode Island. But the potential is much greater.

A 2016 report from the US Departments of Energy and Interior estimated that there was enough technical potential in US offshore wind to power the nation twice over. The US, despite being two and a half decades behind Europe in constructing its first offshore wind farm, a five-turbine project off Block Island, Rhode Island, is still in a position to ultimately catch up to Europe, where there are currently nearly 16 gigawatts installed, supporting 75,000 jobs. The 2016 DOE/DOI report said that a robust offshore wind industry that hits 86 GW by 2050 could generate 160,000 jobs.

One can hope that it is this picture that Sec. Zinke and the Trump administration are looking at in their support of offshore wind. Sec. Zinke continues to say that offshore wind is part of the White House’s “all-of-the-above” strategy for “American energy dominance.” Given how little interest there is for any new oil and gas drilling off the coasts of America, offshore wind is becoming the new source of energy that stands above all.

Service boat cruising in the Anholt offshore wind farm in Denmark. Photo: Derrick Z. Jackson


SNAP already has work requirements. Adding more won’t solve poverty.

Photo: US Air Force

On Tuesday, President Trump signed an executive order calling for a review of the nation’s federal safety net, with the stated aim of “moving people into the workforce and out of poverty.” This is almost certainly thinly veiled code language for additional work requirements in programs that serve millions of low-income individuals and families, including Medicaid and the Supplemental Nutrition Assistance Program (SNAP).

There are a number of inaccuracies and logic flaws contained in the text, but chief among them are these:

Falsehood 1: The federal safety net is causing poverty.

The order states, “Many of the programs designed to help families have instead delayed economic independence, perpetuated poverty, and weakened family bonds.”

It is a grim truth that poverty has a strong grip on too many communities in this country. But poverty is not created by social support programs, nor is it perpetuated by the people who use them. Persistent poverty is far more likely a product of the complex structural inequities embedded in our everyday lives—income inequality, for example, and institutional racism and discrimination. And until we address and remedy these underlying factors, it is essential that we have a strong federal safety net to fall back on.

Falsehood 2: Working-age adults have become dependent on programs like SNAP.

The US Department of Agriculture (USDA) counters this notion with its own data. The populations who might depend on the program for longer periods of time include children, the elderly, and those with disabilities; together, these groups make up about two thirds of all SNAP participants. The population of SNAP participants who are classified as able-bodied adults without dependents (ABAWDs) and are required to work make up just a small fraction—only two percent—of all those who stay on SNAP for a period of eight years or longer.

Yet there is every indication that ABAWDs will be the target of more stringent work requirements in the months to come. A recent USDA federal register notice asked for public input on “innovative ideas to promote work and self-sufficiency” among the ABAWD population. Here’s what we offered.

 

April 9, 2018

The Union of Concerned Scientists

Re: Document No. FNS-2018-03752: Supplemental Nutrition Assistance Program: Requirements and Services for Able-Bodied Adults Without Dependents; Advance Notice of Proposed Rulemaking

We submit this comment to the US Department of Agriculture (USDA) to express broad opposition to policy and programmatic changes that would further limit SNAP eligibility for able-bodied adults without dependents (ABAWDs). While we appreciate USDA efforts to address food insecurity and provide adequate opportunities for employment and training among low-income populations, any proposals which would remove participants from the program—either through more stringent work requirements, further restrictions on eligibility, or other means—would fail to accomplish either, and may in fact contribute to worsening economic hardship among low-income individuals while imposing undue administrative burden and cost on state and federal agencies.

Our opposition to the aforementioned policy and programmatic changes is grounded in the following:

The work requirements in place for ABAWDs are already extensive.

In addition to meeting the general work requirements for SNAP participation, ABAWDs are subject to a second set of time-limited work requirements. These dictate that an ABAWD must work or participate in a work program for at least 80 hours per month or face termination of benefits after three months, with eligibility restored only once every three years. Data from the Bureau of Labor Statistics show that for many, securing a job within three months is an unattainable goal: last year, nearly 40 percent of unemployed job seekers in the general population were unable to find work within 15 weeks, and nearly 25 percent were unable to find work within 27 weeks.[1]

The population of unemployed ABAWDs is a small fraction of SNAP participants.

The vast majority of SNAP recipients are children, the elderly, caregivers, or persons with disabilities. The ABAWD population makes up a small fraction of all SNAP recipients, and many are already working or looking for work. Fewer than 8.8 percent of all SNAP participants are classified as ABAWDs, and the number of unemployed ABAWDs at any given time constitutes only 6.5 percent of all program participants.[2] It should be noted that the population of unemployed ABAWDs is not static, but shifts depending on need: research shows that among SNAP households with at least one non-disabled, working-age adult, eight in 10 participants were employed in the year before or after receiving benefits, meaning SNAP provides effective temporary assistance during periods of economic difficulty.[3] Many of the policy changes addressed in the federal register notice, including new review processes, certification processes, and reporting requirements, would incur administrative burdens and costs with little demonstrable benefit for low-income populations, and may in fact detract from the efficacy of the program.

Bolstering employment and training programs will do little to counter the root causes of poverty and food insecurity—particularly when other public assistance programs are at risk.

Employment and training (E&T) programs can provide a path to self-sufficiency if evidence-based and adequately funded. Currently, there is wide variation among state E&T programs, with varying efficacy, and limited full federal funding available to states.[4] Until there is consistent implementation of effective and scalable models for job training across states—accompanied by a strong government commitment to invest in such models—we cannot rely on E&T programs alone to keep low-income populations employed and out of poverty. This is particularly important at a time when numerous other public assistance programs serving low-income populations are at risk.

We appreciate the opportunity to provide comments on the manner in which the USDA intends to pursue its stated goals of addressing food insecurity and providing adequate opportunities for employment and training among low-income populations. However, the questions posed by the agency suggest that forthcoming policy proposals will do more harm than good. Any policy changes to SNAP resulting in removal of individuals from the program—including more stringent work requirements or restricted eligibility among the ABAWD population—present serious risks to the health, well-being, and economic vitality of the individuals and communities served by this program.

Thank you for your consideration.

 

[1] Bureau of Labor Statistics. 2018. Table A-12: Unemployed persons by duration of unemployment. Washington, DC: US Department of Labor. Online at www.bls.gov/news.release/empsit.t12.htm, accessed March 2, 2018.

[2] Food and Nutrition Services (FNS). 2016. Characteristics of able-bodied adults without dependents. Washington, DC: US Department of Agriculture. Online at https://fns-prod.azureedge.net/sites/default/files/snap/nondisabled-adults.pdf, accessed March 2, 2018.

[3] Council of Economic Advisers (CEA). 2015. Long-term benefits of the Supplemental Nutrition Assistance Program. Washington, DC: Executive Office of the President of the United States.

[4] Food and Nutrition Services (FNS). 2016. Supplemental Nutrition Assistance Program (SNAP) Employment and Training (E&T) Best Practices Study: Final Report. Washington, DC: US Department of Agriculture. Online at https://fns-prod.azureedge.net/sites/default/files/ops/SNAPEandTBestPractices.pdf, accessed March 8, 2018.

 

Want to learn more about SNAP? Listen to Sarah Reinhardt on our Got Science? Podcast!

USDA Focus on Nutrition Program “Integrity” is a Smokescreen

Photo: US Air Force

The US Department of Agriculture has announced it will hire a new “chief integrity officer” to oversee federal nutrition programs such as the National School Lunch and Breakfast Programs, Special Supplemental Nutrition Program for Women, Infants and Children (WIC), and the Supplemental Nutrition Assistance Program (SNAP, formerly known as food stamps). The integrity of SNAP in particular has been a popular topic among those in the Trump administration, including USDA Secretary Sonny Perdue, who argue that SNAP enables a “lifestyle of dependency” and seek major program reforms in the upcoming Farm Bill. But these arguments have been conjured from very little science and a whole lot of smoke—and have the effect of distracting the public from more pressing issues at hand.

SNAP is among the nation’s most effective and efficient programs

Monitoring any government program is necessary to ensure that taxpayer dollars are being spent effectively, and the USDA does so through its quality control process and periodic reports on fraud and abuse. In fiscal year 2011, as part of the Obama administration’s Campaign to Cut Waste, the USDA Office of Inspector General (OIG) conducted an extensive review of more than 15,000 stores for compliance with SNAP program rules. The results of these assessments, combined with USDA participation data, tell us the program is working as intended, and with remarkably few problems.

SNAP fraud1—broadly defined as exchanging benefits for cash or falsifying participant or retailer applications to illegally obtain or accept benefits—happens relatively infrequently, though it’s difficult to measure. The USDA’s most recent report on the topic estimated that it affected about 1.5 percent of all SNAP benefits received between 2012 and 2014. This represents a slight increase since the early 2000s (1.0 to 1.3 percent between 2002 and 2011), but remains substantially lower than in the 1990s, when reported rates were as high as 3.8 percent.

Compare this to some of the other federal programs contained in the Farm Bill—like crop insurance. Back in 2013, former USDA secretary Tom Vilsack voiced concerns about the integrity of crop insurance programs due to error and fraud rates that exceeded those of SNAP. As with SNAP, illegal activity is largely uncovered by way of criminal investigations. According to the Department of Justice, recent convictions and sentences connected with the federal crop insurance program have included the indictment of a Kentucky agricultural producer for insurance fraud, wire fraud, and money laundering; an Iowa farmer who received more than $450,000 in crop insurance proceeds illegally; a Louisiana farmer who created shell farms to receive more than $5.4 million in subsidy payments; and a Kentucky crop insurance agent whose clients received nearly $170,000 in indemnity payments for false claims. And those are just the cases from the last six months. Unlike SNAP, no chief integrity officer has been assigned to monitor fraud and abuse in the program.

So yes, we should be striving for continuous improvement in the operation of all federal programs. But explaining why SNAP in particular has come under such intense scrutiny requires some historical and political context—and a good understanding of who might benefit from maintaining old narratives.

 

Learn more about SNAP, listen to Sarah Reinhardt on the Got Science? podcast

The rise, fall, and flatline of SNAP fraud

Though President Lyndon B. Johnson signed the Food Stamp Act into law in 1964, the program began to gain popularity in the 1970s, when participation doubled from 5 million to 10 million over the course of the decade. As participation increased, the USDA began to discover incidents of abuse, eventually revealing a widespread pattern of illegal activity that would plague the program for the next twenty years. Reported rates of fraud at this time were between 10 and 20 percent.

The widespread abuse fueled growing public disapproval of the food stamp program and, accompanied by racist and classist rhetoric around so-called “welfare queens,” spurred substantial program reform by the Reagan administration in the early 1980s.

But what likely ushered in drastic improvements in rates of fraud and abuse was the introduction of new electronic benefit transfer (EBT) systems, which fully replaced paper food stamps in 2004. In addition to requiring a four-digit PIN, EBT cards create a record of each purchase, increasing the ease with which agencies can identify and document illegal use.

That brings us, more or less, to the efficient and effective program that we’ve seen for the last decade.

Despite how well SNAP works, politicians have continued to frame it as a program that suffers from rampant misuse and illegal activity. For those seeking drastic budget cuts and reforms, that negative narrative grants permission to discredit both the program and those who use it—and one needs to look no further than President Trump’s welfare reform proposals or Speaker Paul Ryan’s comment about the tailspin of “inner-city culture” to know that the legacy of the welfare queen is alive and well.

Speaking of integrity…

Photo: USDA

To get a sense of where this is all headed, keep your eyes on USDA Secretary Sonny Perdue—if you can. He’s been making some quick pivots lately.

At a May 2017 House Appropriations Subcommittee on Agriculture hearing, Perdue stated that “SNAP has been a very important, effective program,” and that the agency was considering no changes. “You don’t try to fix something that isn’t broken.”

Less than a year later, the secretary has voiced his support for a number of proposed program changes, including stricter work requirements for adults without dependents. Not unlike the recent announcement aimed at nutrition program integrity, these proposals are often grounded more in political ideology than fact. USDA’s own data shows that most SNAP participants who can work do work—albeit in unstable jobs—and counters the notion that participants stay on the program for long periods of time.

When the House releases its draft of the Farm Bill, which may happen as early as this week, Perdue’s response (or lack thereof) could provide insight on the policy proposals he’s prepared to support. More than likely, he’ll continue to endorse the positions of his party—but then again, we’ve been surprised before.

 

  1. Fraud and abuse should be distinguished from error rates, or improper payment rates, which capture how often SNAP participants receive underpayment or overpayment of benefits. (The overwhelming majority of SNAP errors are attributed to unintentional error by recipients or administrative staff.) Program error rates have also experienced substantial declines in recent years: they reached an all-time low of 2 percent in 2013, down from 6.6 percent in 2003. And though a 2015 USDA OIG review of the quality control process found understated error rates in some states, the resulting corrective actions target state agencies—not individual SNAP participants.

 

Andy Wheeler: Trump’s Pick for EPA Deputy is a Threat to Our Climate and Health

Washington’s latest parlor game involves predictions about the number of days left in Scott Pruitt’s tenure at the EPA. There’s even a website where you can place bets on it, along with some very funny memes and GIFs circulating online. Amid the controversies over discounted condos, high-priced furniture, self-important sirens, and questionable personnel practices, the outrage over Pruitt’s policies is getting lost in the noise. If his ethical lapses result in his ouster, what’s to stop his replacement from continuing the destruction of nearly half a century of environmental progress?

Not much. The nominated second in command, Andy Wheeler, is awaiting confirmation in the US Senate to become the Deputy Administrator. Wheeler is well known as a lobbyist for the coal industry and a former staffer for the Senate’s leading climate denier, Senator Jim Inhofe, having served on the Environment and Public Works Committee (EPW) staff for 14 years. If Pruitt gets the boot, Wheeler will most likely be the acting Administrator. Unlike Pruitt, Wheeler worked for the EPA early in his career and has played key roles in congressional oversight of the agency and its budget, making him a formidable opponent with intimate knowledge of the agency’s programs and regulations.

Senate climate deniers at the EPA’s helm

Wheeler will join Inhofe alumni who already occupy the chief of staff and deputy chief of staff positions at the EPA. Pruitt’s senior advisers on air, climate, and legal issues are also former Inhofe staff, as are the top domestic and international energy and environmental advisers to President Trump. A Senate Democratic aide speaking off the record warned, “These are folks who are very capable. They know the agency and its programs. They’re smart and hard-working, and they certainly could dismantle the programs if they were asked to do that. But the question is how they will react if they’re asked to do that.” Another former Capitol Hill staffer said, “I think Andrew is very similar to Scott Pruitt’s approach in understanding under EPA’s regulatory scheme that states have the priority over federal overreach.” Given Wheeler’s tenure with the Senate EPW committee and his coal company client list, it is safe to assume that he will continue the repeal of climate regulation and the assault on the Clean Air Act.

Crooked math on air pollution

Unfortunately, Wheeler is likely to move forward on changes to the way the EPA assesses the costs and benefits of regulation, changes that were buried in its proposed regulation gutting the Clean Power Plan (CPP). The CPP was an Obama-era regulation aimed at reducing emissions of carbon dioxide to reduce the risks of climate change. UCS economist Rachel Cleetus commented that “[t]oday’s proposal to repeal the Clean Power Plan uses crooked math to artificially lower the benefits of the pollution reductions that standard would have brought. The EPA fails to account for the fact that actions to cut carbon emissions also pay large dividends by reducing other forms of harmful pollution like soot and smog.”

The “proposed repeal outlines a flawed approach to evaluating the risks of pollution — specifically particulate matter, which is a mix of very tiny particles emitted into the air. When inhaled, this pollution can cause asthma attacks, lung cancer and even early death,” according to the American Lung Association. Harold P. Wimmer, the national president and CEO of the ALA, and Stephen C. Crane, PhD, MPH, the executive director of the American Thoracic Society, argue that “[t]he [Trump] EPA has cherry-picked data to conceal the true health costs of air pollution. Its revised calculations diminish and devalue the harm that comes from breathing particulate matter, suggesting that below certain levels, it is not harmful to human health. This is wrong. The fact is: There is no known safe threshold for particulate matter. According to scores of medical experts and organizations like the World Health Organization, particle pollution harms health even at very low concentrations. Attempting to undercut such clear evidence shows the lengths the EPA, and by extension the Trump administration, will go to reject science-based policy that protects Americans’ health.”

What are the health dangers caused by air pollution for children and adults? Credit: American Lung Association.

What Mr. Pruitt, Mr. Wheeler and the Trump Administration don’t want you to know is that actions taken to reduce carbon also reduce the air pollution that causes illness and death. A forthcoming analysis of the proposed change to the way the EPA assesses health benefits, from Kimberly Castle and Ricky Revesz of the Institute for Policy Integrity at the NYU School of Law, finds that:

The benefits from particulate matter reductions are substantial for climate change rules, accounting for almost one half of the quantified benefits of the Obama Administration’s Clean Power Plan. These benefits are also significant for regulations of other air pollutants, making this issue one of far-reaching importance for the future of environmental protection.

Opponents of environmental regulation, including the Trump Administration, have recently embraced an aggressive line of attack on particulate matter benefits. They argue alternatively that these benefits are not real; are being “double counted” in other regulations; or should not be considered when they are the co-benefits, rather than the direct benefits, of specific regulations….An examination of the scientific literature, longstanding agency practices under administrations of both major political parties, and judicial precedent reveals that particulate matter benefits deserve a meaningful role in regulatory cost-benefit analysis.

Pruitt’s EPA has also indicated plans to adopt a policy similar to legislation that House Science Committee Chairman Lamar Smith (R-Texas) has unsuccessfully pushed for years, over the objection of the country’s leading scientific societies. The policy builds on a strategy hatched in the 1990s by lobbyists for the tobacco industry, who invented the phrase “secret science” to undermine robust peer-reviewed research on the harmful impacts of second-hand smoke. The goal back then was to create procedural hurdles so that public health agencies couldn’t finalize science-based safeguards.

Climate and health

The US Global Change Research Program found significant health impacts from climate change, and documented several linkages between climate and air quality. “Changes in the climate affect the air we breathe, both indoors and outdoors. The changing climate has modified weather patterns, which in turn have influenced the levels and location of outdoor air pollutants such as ground-level ozone (O3) and fine particulate matter.”  It also found that climate change will make it harder for any given regulatory approach to reduce ground-level ozone pollution in the future as meteorological conditions become increasingly conducive to forming ozone over most of the United States. Unless offset by additional emissions reductions, these climate-driven increases in ozone will cause premature deaths, hospital visits, lost school days, and acute respiratory symptoms.

The air quality response to climate change can vary substantially by region across scenarios. Two downscaled global climate model projections using two greenhouse gas concentration pathways estimate increases in average daily maximum temperatures of 1.8°F to 7.2°F (1°C to 4°C) and increases of 1 to 5 parts per billion (ppb) in daily 8-hour maximum ozone in the year 2030 relative to the year 2000 throughout the continental United States. Unless reductions in ozone precursor emissions offset the influence of climate change, this “climate penalty” of increased ozone concentrations due to climate change would result in tens to thousands of additional ozone-related premature deaths per year, shown here as incidences per year by county (see Ch. 3: Air Quality Impacts). Credit USGCRP, 2016: The Impacts of Climate Change on Human Health in the United States: A Scientific Assessment. Crimmins, A., J. Balbus, J.L. Gamble, C.B. Beard, J.E. Bell, D. Dodgen, R.J. Eisen, N. Fann, M.D. Hawkins, S.C. Herring, L. Jantarasami, D.M. Mills, S. Saha, M.C. Sarofim, J. Trtanj, and L. Ziska, Eds. U.S. Global Change Research Program, Washington, DC, 312 pp. http://dx.doi.org/10.7930/J0R49NQX

Temperature-driven changes in power plant emissions are likely to occur due to increased use of building air conditioning. A recent study in Environmental Research Letters compared an ambient temperature baseline for the Eastern US to a model-calculated mid-century scenario with summer-average temperature increases ranging from 1°C to 5°C. Researchers found a 7% increase in summer electricity demand and a 32% increase in non-coincident peak demand. Power sector modeling, assuming only limited changes to current generation resources, calculated a 16% increase in emissions of NOx and an 18% increase in emissions of SO2.

Wheeler and Clear Skies

While at EPW, Andy Wheeler was the Bush Administration’s point person on Clear Skies – an ironically named effort, proposed in 2003, to essentially gut the Clean Air Act. The bill would have significantly delayed implementation of soot and smog standards and delivered smaller reductions in NOx and SO2 emissions than strict implementation of the existing Clean Air Act. Wheeler not only negotiated the bill to near passage (a tied committee vote killed it in 2005), he also carried out Inhofe’s intimidation effort against an association of state air quality officers, asking the group to turn over six years of IRS filings and all records of grants they received from the EPA.

President Trump claimed to want the EPA to focus on clean air and clean water. But his defense of Pruitt on Twitter and his nomination of Wheeler as Deputy Administrator make clear that he has no idea what it takes to deliver clean air to the American people. The Trump Administration’s priority is to reduce regulation on industry at the expense of the health and well-being of Americans.

Department of Energy Releases Bogus Study to Prop Up Coal Plants

A few months ago, the Department of Energy (DOE) made a request to one of its national labs, the National Energy Technology Laboratory (NETL), to study the impacts on the electricity grid of a severe cold snap called the bomb cyclone that hit the Northeast in early January 2018. NETL conducts important R&D on fossil energy technologies. The report released last week uses deeply flawed assumptions to inaccurately paint coal (and to a lesser extent, fuel oil) as the savior that prevented large-scale blackouts during the extreme cold, while greatly understating the contribution from renewable energy sources. It also estimates a bogus value for coal providing these so-called “resiliency” services. One has to wonder whether this deeply flawed and misleading study is part of the administration’s continued attempts to prop up the coal industry at all costs, especially after FERC rejected the DOE’s fact-free proposal to bail out coal and nuclear plants late last year. The utility FirstEnergy, which owns and operates a fleet of coal and nuclear generators, immediately seized upon NETL’s report and is petitioning DOE for an emergency bailout.

Separating the Facts from the Fiction

The report emphasizes the fact that fossil and nuclear power played a critical role in meeting peak demand during the cold snap. Across six regions, according to the report, coal provided 55 percent of daily incremental generation, and the study concludes that at least for PJM Interconnection (which manages the electricity grid across 12 Midwest and Mid-Atlantic states as well as DC), “coal provided the most resilient form of generation, due to available reserve capacity and on-site fuel availability, far exceeding all other sources” without which the region “would have experienced shortfalls leading to interconnect-wide blackouts.” The report then goes on to incorrectly estimate the value of these “resiliency” services at $3.5 billion for PJM.

The nugget of truth here is that we do need reserve capacity to be available in times of peak demand, especially during extreme weather events that lead to greatly increased need for heating or cooling. And this is especially important during the winter, when the demand for natural gas for home heating spikes in some parts of the country, leading to higher prices and less natural gas available for electricity generation (since home heating takes priority over electricity generation in terms of natural gas pipeline delivery contracts). In the Northeast, which uses a lot of natural gas for heating, this shortfall in natural gas led to an increase in electricity generation from [dirty] fuel oil, as the report points out.

However, regional transmission organizations (RTOs) and independent system operators (ISOs) were prepared for the cold snap, and the markets performed as expected. PJM in particular put systems in place to prepare for extreme cold weather following the 2014 Polar Vortex, and electricity markets in the Eastern U.S. are organized to provide payments to power plants for providing either energy (electrons to the grid) or capacity (the ability to switch on and provide a certain level of output if called upon). As fossil generators retire because they are uneconomic, plenty of other resources are under construction or in advanced planning stages and will be ready at the time they’re needed. This is why planning for future electricity needs is critical, and this is the responsibility of regional grid operators—one they take quite seriously.

To that point, grid operators and reliability experts see no threat to grid reliability from planned retirements of coal and nuclear power plants. The North American Electric Reliability Corporation (NERC), whose mission is to ensure the reliability of the bulk power system for the continent, finds in its 2017 Long-Term Reliability Assessment that (contrary to NETL raising potential reliability issues from future coal and nuclear retirements) most regions of the country have sufficient reserve margins through 2022, as new additions more than offset expected retirements. PJM, in its strongly worded response to FirstEnergy’s petition to DOE for an emergency bailout (see below), stated “without reservation there is no immediate threat to system reliability.”

Beyond this, the report and its pseudo-analytic underpinnings really go off the rails. Let’s take a few of its misleading points in turn.

How to Quantify Resiliency

NETL decided to consider the incremental generation from each fuel source—that is, how much more electricity was produced by each fuel during the bomb cyclone—as a metric for which fuel provides the grid with resilient services. As they put it:

“…we examine resilience afforded by each source of power generation by assessing the incremental daily average gigawatt hours during the BC event above those of a typical winter day.”

This is a bogus metric not only because it simply reflects the amount of unused or idle generation in the system, but also because the reference time period (the first 26 days of December) is a period when there wasn’t much generation from coal and oil. Turns out, there is a lot of coal-fired capacity sitting around because it is more expensive to run compared to natural gas. The only time it makes economic sense to call on these more expensive resources is when demand pushes electricity prices high enough, as it did during the bomb cyclone.

What NETL is basically saying is that the most expensive resources are the most resilient. The report then argues that the high cost of those expensive resources represents the value of “resiliency”—and that these expensive generators should be compensated for providing that value. It’s circular reasoning, and it’s the same argument that we heard all last fall as part of the fact-free DOE FERC proposal, which boils down to this: our assets can’t compete in the marketplace because they’re too expensive, so you (meaning, the ratepayer) should pay us more money to stay online.
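To see how arbitrary the incremental-generation metric is, here is a minimal sketch. All generation numbers below are hypothetical, invented purely for illustration; nothing here comes from NETL’s actual data. The point is that the metric rewards whichever fuel happened to be sitting idle during the chosen baseline window.

```python
# Hypothetical daily average generation (GWh) during a cold-snap event.
event_avg_gwh = {"coal": 900, "gas": 1100, "wind": 150}

# Two equally plausible baseline windows for "a typical winter day":
baseline_low_coal = {"coal": 500, "gas": 1050, "wind": 120}   # coal mostly idle
baseline_high_coal = {"coal": 850, "gas": 800, "wind": 120}   # coal running hard

def incremental(event, baseline):
    """NETL-style metric: event-period average minus baseline average."""
    return {fuel: event[fuel] - baseline[fuel] for fuel in event}

print(incremental(event_avg_gwh, baseline_low_coal))
# {'coal': 400, 'gas': 50, 'wind': 30} -> coal looks "most resilient"
print(incremental(event_avg_gwh, baseline_high_coal))
# {'coal': 50, 'gas': 300, 'wind': 30} -> now gas looks "most resilient"
```

Same event, same plants: the ranking flips entirely with the baseline window, which is why incremental generation measures idle capacity, not resilience.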

The NETL report is essentially trying to invent a metric to define resiliency, and it’s wrong. There are certainly qualitative ideas about what resiliency means:

“Infrastructure Resilience is the ability to reduce the magnitude and/or duration of disruptive events. The effectiveness of a resilient infrastructure or enterprise depends upon its ability to anticipate, absorb, adapt to, and/or rapidly recover from a potentially disruptive event.” –NERC, 2012

But there is no agreed-upon quantitative definition for resiliency, which is one reason FERC has opened a docket to study the issue.

Enter Capacity Markets

The NETL report misses another crucial point. These resources are, in many cases, already being paid to be available when needed. In general, there are several ways that a given generating facility of any kind can make money: by providing energy; by offering capacity on demand; and by providing what are called ancillary services (things like voltage and frequency regulation, which ensure the stability of the grid). Without going into a detailed explanation of how these different markets work, it’s sufficient to understand that these markets exist—and are working as intended.

Instead of doing a detailed analysis of how fossil generators were compensated during the cold snap, or which plants may have been cheaper to run, NETL offers a deeply misleading back-of-the-envelope calculation: it multiplies the increase in the daily cost of electricity above an arbitrary baseline (see next section) by the number of days in the cold snap. This calculation fails to acknowledge that some of these generators are already receiving payments for those services by bidding into a market and agreeing to provide the service of additional capacity when needed.
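The arithmetic behind that back-of-the-envelope calculation can be made explicit. The figures below are hypothetical, chosen only to show the shape of the calculation; they are not the report’s actual inputs.

```python
# Hypothetical illustration of the report's back-of-the-envelope logic:
# (daily cost of electricity above a chosen baseline) x (days in the event).
baseline_daily_cost_musd = 100   # "typical" daily cost, $ millions (invented)
event_daily_cost_musd = 450      # daily cost during the cold snap, $ millions (invented)
event_days = 10                  # length of the event (invented)

resiliency_value_musd = (event_daily_cost_musd - baseline_daily_cost_musd) * event_days
print(resiliency_value_musd)  # 3500, i.e. "$3.5 billion"
```

Notice what the formula never asks: whether those generators were already being paid, through capacity and energy markets, to be available on exactly such days.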

Cherry-Picking Baselines to Attack Renewables

NETL’s flawed analysis also takes aim at renewables, suggesting that because of “below average” renewable generation, resources like coal and fuel oil had to come online to pick up the slack.

What NETL did here is classic cherry-picking. They compared the generation from renewables during the bomb cyclone to what they called a “typical winter day.” Except that it wasn’t: NETL used a 26-day period in December to compute baseline generation. Wind generation during the bomb cyclone was actually higher than grid operators in the Northeast and Mid-Atlantic expected. In PJM, for example, wind output from January 3-7 was 55 percent higher than the 2017 average output, and consistently 3 to 5 times greater than what PJM expected from January 3-5.

Actual Failure Rates

Instead of relying on NETL’s flawed analysis, looking at the actual failure rates of different generation resources during the extreme weather event provides a more accurate picture of the reliability and resiliency impacts. It turns out PJM did exactly this. As shown in the chart below, which compares forced outages during the polar vortex and the bomb cyclone, PJM’s analysis finds that coal plants experienced failure rates similar to those of natural gas power plants during both the 2014 and 2018 cold snaps. For example, on January 7, 2018, a peak winter demand day, PJM reported 8,096 MW of natural gas plant outages, 6,935 MW of coal outages, 5,913 MW of natural gas supply outages, and 2,807 MW of “other” outages (which includes wind, solar, hydro, and methane units).

The NETL study completely ignores the fact that baseload resources like coal and nuclear also pose challenges to reliability—because of limited flexibility, vulnerability to extreme weather events (like the polar vortex and bomb cyclone), extreme heat and drought affecting cooling water, and storm surge. During extreme cold, pipes and even piles of coal can freeze, meaning that coal plants can’t fire up.

FirstEnergy Begs for a Handout

Only a day after NETL’s report was released, the utility FirstEnergy submitted a request to DOE for emergency financial assistance to rescue its uneconomic coal and nuclear plants and heavily cited the NETL report. The basis of the request is section 202(c) of the Federal Power Act, a rarely used portion of the statute that allows DOE to keep power plants online in times of emergency or war. But as NERC, PJM, and others have pointed out, there is no immediate reliability crisis. The request is a Hail Mary pass to save the company from bankruptcy, and is not likely to hold up in court.

Garbage In, Garbage Out

NETL has produced a document that isn’t worth the few megabytes of disk space it is taking up on my computer. As we often say when evaluating a computer model or analysis—garbage in, garbage out. The study appears to be politically motivated, and it reveals a deep misunderstanding of how the electricity grid works, using simplistic and misleading calculations to justify its conclusions. It is shrouded in insidious, analytic-sounding language that makes it seem as if it were a legitimate study. It should be rejected out of hand by any serious person taking an objective look at these issues—as should FirstEnergy’s request for a bailout.

The World’s Population Hasn’t Grown Exponentially for at Least Half a Century

Recently I was looking at some data about world food production on the excellent Our World in Data site, and I discovered something very simple, but very surprising about the world’s population. We often hear (and I used to teach) about the threat of an exponentially growing population and the pressure it is supposed to be putting on our food supply and the natural resources that sustain it (land, water, nutrients, etc). But I found that the global population isn’t growing exponentially, and hasn’t been for at least half a century.

It has actually been growing in a simpler way than exponentially—in a straight line.

What exponential growth is

Exponential growth (sometimes also called geometric or compound-interest growth) can be described by an equation in which time appears in the exponent—hence the name. But it also can be described in simpler terms: the growth rate of the population, as a fraction of the population’s size, is a constant. Thus, if a population has a growth rate of 2%, and it remains 2% as the population gets bigger, it’s growing exponentially. And there’s nothing magic about the 2; it’s growing exponentially whether that growth rate is 2% or 10% or 0.5% or 0.01%.

Another way to put it is that the doubling time of the population—the number of years it takes to grow to twice its initial size—is also a constant. So, if the population will double in the next 36 years, and double again in the following 36 years, and so on, then it’s growing exponentially. There’s even a simple rule-of-thumb relationship between doubling time and the percentage growth rate: Doubling Time ≈ 72/(Percentage Growth Rate). So a population with a 36-year doubling time is growing at a rate of roughly 2% per year.
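The rule of 72 is only an approximation of the exact compound-growth formula, doubling time = ln(2)/ln(1 + r). Here’s a minimal Python sketch comparing the two (the function names are mine, purely illustrative):

```python
import math

def doubling_time_exact(rate_percent):
    """Exact years to double at a constant percentage growth rate: ln(2)/ln(1+r)."""
    r = rate_percent / 100.0
    return math.log(2) / math.log(1 + r)

def doubling_time_rule_of_72(rate_percent):
    """The rule-of-thumb approximation quoted in the text."""
    return 72.0 / rate_percent

# Compare the approximation to the exact value at several growth rates.
for rate in (0.5, 1.0, 2.0, 10.0):
    exact = doubling_time_exact(rate)
    approx = doubling_time_rule_of_72(rate)
    print(f"{rate}%: exact {exact:.1f} years, rule of 72 says {approx:.1f}")
```

At 2% per year the exact doubling time is about 35 years, so the 36-year figure in the text is close but slightly rounded—which is exactly what a rule of thumb should be.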

But probably the simplest way to describe exponential growth is with a graph, so here’s how it looks:

Figure 1. Exponential growth versus linear (straight-line) growth.

This graphic not only shows the classic upward-curving shape of the exponential growth curve, but also how it contrasts with growth that is linear, i.e. in a straight line. Additionally, it demonstrates a simple mathematical result: if one quantity is growing exponentially and a second quantity is growing linearly, the first quantity will eventually become larger than the second, no matter what their specific starting points or rates of growth.
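That overtaking property is easy to verify numerically. The sketch below steps two quantities forward year by year—one exponential, one linear—and reports when the exponential one pulls ahead; all the starting values and rates are made-up illustrative numbers:

```python
def years_until_overtake(exp_start, exp_rate, lin_start, lin_slope, max_years=10_000):
    """First year t at which exp_start*(1+exp_rate)**t exceeds lin_start + lin_slope*t."""
    exp_val, lin_val = exp_start, lin_start
    for t in range(max_years):
        if exp_val > lin_val:
            return t
        exp_val *= 1 + exp_rate   # exponential: grows by a constant fraction
        lin_val += lin_slope      # linear: grows by a constant amount
    return None  # not overtaken within the horizon

# Even a slow 2% exponential starting at 1 eventually overtakes a line that
# starts at 100 and climbs by 5 per year -- it just takes a few centuries.
print(years_until_overtake(1.0, 0.02, 100.0, 5.0))
```

Raising the line’s starting point or slope only delays the crossover; it never prevents it, which is the mathematical heart of Malthus’ argument.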

This isn’t just abstract math; it also illustrates the most famous use of exponential growth in political debate, put forward by the English parson Robert Malthus over two centuries ago. He argued that the human population grows exponentially while food production can only grow linearly. Thus, it follows inevitably that the population will eventually outgrow the food supply, resulting in mass starvation. This is the case even if the food supply is initially abundant and growing rapidly (but linearly). The upward-bending curve of an exponentially-growing population will always overtake it sooner or later, resulting in catastrophe.

Looking at real data

Critics ever since Malthus’ time have pointed out that his assumption that food production grows in a straight line is just that—an assumption, with little basis in theory. So I wasn’t surprised to see that the OWID data showed faster-than-linear (upward-curving) growth in global food production over the past half-century. What did surprise me was that the growth of the world’s population over that time period has actually been very close to a straight line.

Here’s that graph:

Figure 2. World population growth from 1961 to 2016, from the official U.N. figures available at ourworldindata.org. The data are expressed as an index, with the 1961 population = 100. To convert the index to actual numbers of people, just multiply the index value by 30,830,000, since the world population in 1961 was 3.083 billion.
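The index-to-population conversion in the caption is simple enough to write down directly; a tiny sketch (function name and defaults are mine, matching the caption’s 1961 base of 3.083 billion):

```python
def index_to_population(index_value, base_pop=3.083e9, base_index=100.0):
    """Population implied by an index value, given 1961 = index 100 = 3.083 billion."""
    return index_value * (base_pop / base_index)

print(index_to_population(100))  # -> 3083000000.0 (the 1961 population)
print(index_to_population(200))  # an index of 200 means the population has doubled
```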

The graph looks very much like a straight line rather than the upward-curving exponential, but is that really the case? We can test this by calculating the value of what statisticians call the R2 (or “coefficient of determination”) for this curve. The closer it is to a straight line, the higher R2 will be, and if the data fits a straight line perfectly then R2 will be exactly 1.0.

So, what’s the actual value for this data? It’s 0.9992. That is, the fit to a straight line isn’t quite perfect, but it’s very, very close.
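For readers who want to reproduce this kind of check, here’s a minimal sketch of the procedure: fit a straight line by least squares and compute R². The series below are made-up illustrative values, not the actual U.N. data:

```python
def linear_r_squared(years, values):
    """R^2 of the best-fit straight line through (years, values)."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    # Least-squares slope and intercept.
    sxx = sum((x - mean_x) ** 2 for x in years)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    # R^2 = 1 - (residual sum of squares) / (total sum of squares).
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(years, values))
    ss_tot = sum((y - mean_y) ** 2 for y in values)
    return 1 - ss_res / ss_tot

# A perfectly linear series gives R^2 of exactly 1.0 ...
print(linear_r_squared([0, 1, 2, 3], [100, 110, 120, 130]))  # -> 1.0
# ... while a 2% exponential series fit by a straight line falls short of 1.0.
exp_series = [100 * 1.02 ** t for t in range(55)]
print(linear_r_squared(list(range(55)), exp_series))
```

Feeding in the OWID population series instead of the toy data would reproduce the 0.9992 figure quoted above.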

Is this some sort of artifact?

I was actually quite surprised at how well the data fit a straight line—so much so that I wondered if this was just an artifact of the method I used, rather than a real result. So I applied the same method—plot the data, fit a straight line to it, and calculate the value of R2—to the data for some of the world’s largest countries and regions, rather than the world as a whole.

For several of these, the lines looked very straight and the value of R2 was almost as high as in the graph for the world as a whole, or even slightly higher, e.g.:

R2 for the linear equation of Population vs. Time, by country or region:

  • Brazil: 0.9977
  • India: 0.9954
  • Indonesia: 0.9995
  • Latin America and the Caribbean: 0.9994
  • North America: 0.9966
  • Pacific island small states: 0.9991

But for others it was considerably lower (e.g. .9777 for China, .9668 for the European Union) and two graphs proved clearly that the excellent fit to a straight line is a real result, not an artifact. These were the ones for Sub-Saharan Africa and Russia:

Figure 3. Population growth of Sub-Saharan Africa from 1961 to 2016, from the official U.N. figures available at ourworldindata.org. The data are expressed as an index, with the 1961 population = 100. Thin dotted line shows the best-fit straight line; thick dots show the actual data.

Figure 4. Population growth of Russia from 1961 to 2016, from the official U.N. figures available at ourworldindata.org. The data are expressed as an index, with the 1961 population = 100. Thin dotted line shows the best-fit straight line; thick dots show the actual data.

The point about the Sub-Saharan African graph is not simply that it has a lower value of R2 (0.964), but that its data deviates from the straight line in the way that an exponential curve should: higher than the straight line at the lowest and the highest time values, and lower than the straight line at the intermediate ones. It does fit an exponential curve quite well, thus showing that the method can pick out an exponential curve if the data do follow one. But this is the only region or large country for which that’s actually true.

The Russia graph doesn’t fit an exponential curve well at all—it actually curves downward overall, rather than upward as it should if it were an exponential—but it does show that the value of R2 can be much lower than 1.0 for real data. For Russia it’s 0.632. So as with the Sub-Saharan Africa case, it proves that the high value of R2 for the world as a whole is not an artifact caused by the method. It reflects the reality of the past 55 years.

Finally, since I and many of my readers are from the United States, here’s that graph:

Figure 5. Population growth of the United States from 1961 to 2016, from the official U.N. figures available at ourworldindata.org. The data are expressed as an index, with the 1961 population = 100. Thin dotted line shows the best-fit straight line; thick dots show the actual data.

For the US as for the whole world, population growth over the past half-century has been quite close to a straight line; the R2 is 0.9956.

A direct test of whether growth is exponential

These graphs and R2 values seem to indicate that linear growth is the best model for the world population over the past 55 years, but there’s another way to show that it’s not exponential. As I said above, exponential growth occurs when the percentage growth rate remains constant as the population gets bigger. A simple test, then, is to graph the percentage growth rate over time and see whether it’s constant—i.e., a horizontal line. Here’s that graph:

Figure 6. Percentage growth rate of the world population from 1961 to 2016, calculated from the official U.N. figures available at ourworldindata.org. The trend line goes downward over time, rather than being horizontal as it would if the percentage were constant.

This result, like the others, is quite clear. The percentage growth rate is not a constant, as it should be if the population were growing exponentially. Rather, it has been dropping steadily over the past half-century, from over 2.0% in the early sixties to below 1.2% now.

What exponential growth is Not

So, we should stop saying that the world’s population is growing exponentially. That hasn’t been the case for at least 50 years. Exponential growth clearly doesn’t describe the global reality of the twenty-first century.

But there’s actually a second reason to stop saying that the global population is growing exponentially, and that’s because the term is so commonly misused and misunderstood. Note the next few times that you hear someone use the word, and I think you’ll find that it’s not being used in the sense of “constant percentage-growth-rate” or “constant doubling-time” or even just “an upward-bending curve.” Rather, it’s being used—often with an emphatic stress on the “-nen-” syllable and an implicit exclamation mark at the end of the phrase—to mean “rapidly” or “quickly” or “fast” or “big.”

That way of speaking is common, but it’s also just plain wrong. Remember the example that I started with: the exponential growth rate can be high (e.g. 10%) or low (e.g. 0.01%) or intermediate (e.g. 2%). In every case it’s exponential growth, but it’s very fast exponential growth if the growth rate is 10% and very slow exponential growth if it’s 0.01%.

I’m not that sanguine about getting people to go back to using “exponential” in its correct sense, but I think it’s at least worth a try. After all, we already have several other good words for that other, incorrect meaning—e.g. “fast” or “big.”

Implications

The results don’t just imply that we should talk about population growth differently, but also that we need to re-think how it relates to food production. There is good news in these data, because they show that hunger and environmental catastrophe are not at all inevitable. Malthus’ argument just doesn’t fit reality.

While linear growth has its challenges, it’s far easier to deal with than exponential growth. The distinction between growing exponentially and growing in a straight line does matter. On that point, at least, Malthus got it right.

FEMA and HUD Budgets are Vital for Disaster and Climate Preparedness

Members of FEMA's Urban Search and Rescue Nebraska Task Force One comb a neighborhood for survivors impacted by flooding from Hurricane Harvey. FEMA

Last year’s record-breaking disasters—including hurricanes, wildfires and floods—were a reminder of how climate change and faulty development policies are colliding to create dangerous and costly outcomes for the American public. While much attention is focused on post-disaster recovery, we need to invest much more in preparing for disasters before they happen. The good news is that the omnibus budget deal recently passed by Congress appropriated significant funding for the Federal Emergency Management Agency (FEMA) and Department of Housing and Urban Development (HUD) to help foster community resilience, in many cases undoing steep cuts that had been proposed by the Trump administration.

FEMA and HUD’s role in building disaster resilience

The omnibus budget deal recently passed by Congress was clearly influenced by the unprecedented series of disasters in 2017. There seems to be a dawning sense of new realities regarding extreme weather (even if some prefer to disavow climate science). We saw this reflected in the budgets of FEMA and HUD.

FEMA administers several programs that help states, territories, and tribal governments build back after disasters as well as invest in preparedness measures to reduce the risks and costs of future disasters. Done right, with future climate and other conditions in mind, these grants can be a powerful catalyst for building community resilience.

Key FEMA programs include:

  • The Hazard Mitigation Grant Program, which helps communities implement measures to reduce long-term risks to people and property from hazards after a presidential major disaster declaration. The HMGP provides funding for a range of activities including voluntary home buyouts, home elevation and infrastructure retrofits and is generally 15 percent of the total amount of Federal assistance provided to a State, Territory, or federally-recognized tribe following a major disaster declaration. To mark 30 years of this program, FEMA has created an online data visualization resource that summarizes data for HMGP projects by county, state, FEMA region or by Congressional District.
  • The Flood Mitigation Assistance Grant Program, which helps state and local governments fund projects and plans to reduce the long-term risk of flood damages for properties insured by the National Flood Insurance Program. In the recently passed omnibus budget, this program’s budget was $175 million.
  • The Pre-disaster Mitigation (PDM) Grant Program, authorized by the Stafford Act to help states, local governments, and communities implement long-term measures to reduce the risks and losses from disasters. Typically, FEMA pays for 75 percent of project costs and states match the remaining 25 percent. In the omnibus, this program’s budget was $249.2 million. This was a striking increase from recent years; as one news story put it, that is three times the average annual amount over the past 15 years!
  • FEMA’s budget for flood risk mapping is also vital to ensuring that communities, planners, and policymakers are aware of these risks and can take protective measures to limit them. The omnibus budget provided $262.5 million for flood mapping.

HUD’s Community Development Block Grant (CDBG) program, especially the CDBG-Disaster Recovery grants, is instrumental in helping low- and moderate-income communities—often the hardest hit by disasters—prepare, recover, and build resilience. Our nation has long under-invested in safe, affordable housing–a challenge that is further exacerbated when disasters strike. Despite the Trump administration’s efforts to decimate HUD’s budget with an $8.8 billion proposed cut, Congress passed an omnibus budget deal that increased funding for HUD across the board–including $1.36 billion for the HOME Program and $3.3 billion for the Community Development Block Grant (CDBG) Program.

Despite repeated attempts by the Trump administration to cut agency budgets, including FEMA and HUD’s, Congress has recognized the importance of their work for the well-being of the American public, and has maintained or increased funding levels. Unfortunately, funding still remains much below what is needed by communities, especially as the impacts of climate change worsen.

Another continued area of concern that Congress must stand up against is this administration’s attempts to sideline science in policymaking. A recent egregious example of this: FEMA scrubbed all references to “climate change” from its four-year strategic plan, released last month.

An ounce of prevention is worth a pound of cure

Investing in resilience ahead of disasters—so-called hazard mitigation—is incredibly cost-effective and can save lives. That’s the clear message from an authoritative report from the National Institute of Building Sciences, Natural Hazard Mitigation Saves: 2017 Interim Report. Based on nearly a quarter-century of data, the report found that hazard mitigation projects funded by FEMA, HUD and the U.S. Economic Development Administration (EDA) can save the nation, on average, $6 in future disaster costs for every $1 invested (That ratio is even higher, 7:1, for measures to protect against riverine flooding).

The report also found that investing in measures that exceed requirements of the 2015 International Codes, the model building codes developed by the International Code Council, can save the nation $4 for every $1 spent. See the figure below for benefit-cost ratios for these two categories of protective measures to address different types of hazards.

In the aftermath of disasters, communities clearly need stepped-up aid, but the reality is we spend a lopsided amount of money post-disaster and shortchange pre-disaster investments that help limit costs and harms. A 2015 Government Accountability Office (GAO) report found that from fiscal years 2011-2014, FEMA obligated more than $3.2 billion for post-disaster hazard mitigation through the Hazard Mitigation Grant Program (HMGP), while the Pre-Disaster Mitigation Grant Program obligated approximately $222 million.

A recent paper from Kousky and Shabnam underscores the challenges, highlighting that:

“For FEMA, almost 90% of flood risk reduction funding comes after a big flood and the HUD CDBG-DR funding is only after a major disaster. Across agencies, absent a severe flood, very few dollars for risk reduction are available.”

We also need more (bipartisan) action to foster preparedness 

It’s critical to support and bolster existing federal agency budgets and programs that are helping communities become more resilient, alongside funding to help them cope with and recover from disasters. It’s simply a commonsense way to help protect people and property—and it’s a smart use of taxpayer dollars.

What’s more, budgets for disaster preparedness and protective standards are a bipartisan priority, despite political polarization about some of the underlying climate-related risk factors.

For example, South Carolina Republican Representative Mark Sanford recently called for a flood-ready infrastructure standard, saying:

“The process of flooding and rebuilding has become increasingly costly, as taxpayer dollars are being spent to rebuild or repair public infrastructure – sometimes multiple times. It makes no sense to go through this cyclical and costly process when the simple step of strengthening the federal flood standard can save taxpayer money and protect our communities.”

This standard is sorely needed since the Trump administration rolled back the Federal Flood Risk Management Standard just before Hurricane Harvey hit.

Florida Republican Representative Carlos Curbelo has co-sponsored the National Mitigation Investment Act, which provides incentives for states to invest more in protective building standards.

Federal, state, and local policymakers will also need to do a lot more to align existing and new policies and incentives with worsening risks in a warming world. One important near-term opportunity is reforming the National Flood Insurance Program, which the omnibus bill sets up for reauthorization by July 31 this year.

State and local governments leading the way

Massachusetts Governor Charlie Baker, a Republican, recently filed legislation for a $1.4 billion climate adaptation bond to help the state prepare for the impacts of climate change. Coming off a brutal series of winter storms, accompanied by damaging coastal flooding, the Governor and the legislature now have an opportunity to pass legislation to address the near and long term threats of climate change.

At the local level we need to see more progress along the lines of the encouraging news last week that the Houston City Council has just adopted more protective building standards in the city’s flood-prone areas. Houston Mayor Sylvester Turner said it best:

“We’re going to be futuristic. We are not going to build looking back. We’re going to build looking forward.”

That’s a goal our nation must aspire toward, especially as climate projections show an increasing risk of many types of disasters.

FEMA News Photo

With Pruitt Under Fire, Likely Successor Andrew Wheeler’s Coal Ties Deserve Scrutiny

Photo: Senate EPW

As ethics storm clouds build over Scott Pruitt, environmentalists eager for a new administrator of the Environmental Protection Agency should beware.

That is because the odds-on next leader of the EPA is Andrew Wheeler. He has been an unabashed inside man for major polluters on Capitol Hill. He lobbied for coal giant Murray Energy, serving as a captain in that company’s bitter war against President Obama’s efforts to cut greenhouse gas emissions and enact more stringent clean air and clean water rules.

Wheeler assisted the efforts of refrigerant companies to resist stricter ozone rules and represented Energy Fuels Resources, a uranium mining company that successfully pushed for Interior Secretary Ryan Zinke to shrink Bears Ears National Monument in Utah by 85 percent, despite all its riches in Native American archaeology and art.

Confirmation now up for a vote

Wheeler was nominated by President Trump last October to be Pruitt’s deputy administrator, but his confirmation has been in limbo. Now Senate Majority Leader Mitch McConnell has filed cloture, fast-tracking Wheeler for a vote that could come next week.

The evidence is abundant that Wheeler stands squarely with the agenda of President Trump and Administrator Pruitt to render the EPA as ineffective as possible. When Pruitt sued the EPA 14 times as Oklahoma attorney general between 2011 and 2017 on behalf of polluting industries, coal giant Murray Energy was a petitioner or co-petitioner in half those cases. Wheeler was its lobbyist from 2009 to last year. Even with pro-coal President Trump well into his second year, CEO Robert Murray is still complaining in his current message on the company’s website:

“Our industry is embattled from excessive federal government regulations from the Obama Administration and by the increased use of natural gas for the generation of electricity. In my sixty-one years of coal mining experience, I have never before seen the destruction of an industry that we saw during the Obama presidency.”

An action plan for rollbacks

Wheeler accompanied Murray to the now-notorious meeting a year ago with Energy Secretary Rick Perry, the one in which Murray handed Perry a 16-point action plan “which will help in getting America’s coal miners back to work.” That plan ultimately became the framework of a proposal by Perry to bail out struggling coal and nuclear power plants (Wheeler was also a nuclear industry lobbyist).

That particular proposal was shot down by federal regulators, but Trump and Pruitt have made good or are making good on most of those 16 points, including the US pullout from the Paris climate accords, the rejection of Obama’s Clean Power Plan, and slashing the staff of the EPA down to a level not seen since the 1980s attacks on the agency by President Reagan.

In suggesting that EPA employees be cut by at least half, Murray’s action plan claimed that the verbiage of Obama-era EPA rules was “thirty-eight (38) times the words in our Holy Bible.”

Wheeler has denied helping Murray draw up that document, but he certainly shares its sentiments, telling a coal conference in 2016, “We’ve never seen one industry under siege by so many different regulations from so many different federal agencies at one time. This is unprecedented. Nobody has ever faced this in the history of the regulatory agenda.”

Longtime Inhofe aide

Wheeler’s vigorous lobbying career came after serving as a longtime aide to the Senate’s most vocal climate change denier, Oklahoma’s James Inhofe. When the Trump administration announced Wheeler’s nomination, Inhofe hailed Wheeler as a “close friend.” That closeness was evident last May when Wheeler held a fundraiser for Inhofe, as well as for Senator John Barrasso of Wyoming, chair of the Senate Environment and Public Works committee that advanced his nomination by a party-line 11-10 vote. The Intercept online news service reported that Wheeler held the fundraisers after it was reported that he was under consideration to be Pruitt’s second in command.

Up until now, Wheeler has escaped the harsh scrutiny that has forced the withdrawal of some Trump appointees who were seen as embarrassingly close to industry, such as Michael Dourson’s failed bid to oversee chemical safety at EPA. Part of that was his good luck in being paired in his committee hearing last November with Kathleen Hartnett White, who spectacularly flamed out with her blatant skepticism about the sources of climate change, once calling carbon dioxide, a key greenhouse gas, the “gas of life.”

By contrast, Wheeler slickly held to dry, brief statements that climate change is real, while agreeing with Trump’s pullout of global climate change accords. He even tried to play the good Boy Scout. After Tom Carper of Delaware recited Scouting’s commitment to conservation, Wheeler said, “I agree with you that we have a responsibility in the stewardship of the planet to leave it in better shape than we found it for our children, grandchildren, and nephews.”

His long track record of lobbying suggests the opposite.

Pruitt Needs to Go—But So Do Others in Pruitt’s Conflicted and Corrupt EPA

Photo: Gage Skidmore/CC BY-SA 2.0 (Flickr)

Environmental Protection Agency Administrator Scott Pruitt seems to have a penchant for scandalous behavior, from misuse of public funds to special deals with corporate lobbyists. It was hard to keep up with the Pruitt press this week. Sometimes it is hard to remember that each of these inappropriate actions by the Administrator is connected to an action that undermines public health and safety protections, as described by my colleague Josh Goldman.

And there is really no question that it is time for Pruitt to leave the agency that he leads. He has done more than enough damage to the work of the EPA, sidelining science at the expense of Americans’ health and safety. I certainly hope that the White House hears from Congress and the public that we have all had enough of Mr. Pruitt.

Unfortunately, it will take more than just change at the top for the EPA to once again serve the critical mission it is charged with by Congress—and that all of us in the public need. Mr. Pruitt has filled key positions in the agency with lobbyists for regulated industries, cronies from Oklahoma, and others with deeply held positions in opposition to the agency’s mission.

The year of hiring dangerously

Several months ago I wrote that too many of the Trump Administration’s appointees either have deep conflicts of interest, are opposed to the mission of the agencies they are appointed to, or are fundamentally unqualified. At the EPA, all of those problems are on prominent display, and they don’t end when and if Pruitt is shown the door.

One of the scandals revealed this week is that Mr. Pruitt used a provision of the Safe Drinking Water Act to appoint Dr. Nancy Beck outside of civil service rules and the ethics requirements of the Trump Administration. He did this because, at the behest of the chemical industry, he wanted former lobbyist Beck to re-write (read: weaken) chemical safety rules. Dr. Beck couldn’t meet President Trump’s own ethics requirements because she previously lobbied for the American Chemistry Council (ACC) on those very rules and therefore has a deep conflict of interest. The result: the implementation of the Chemical Safety Act has been weakened and—shockingly—the rules now fully reflect the ACC’s stated desires, ignoring input from all other interested parties—like public health experts and affected communities.

Or take this week, when Mr. Pruitt withdrew common sense automotive fuel efficiency standards that clean our air and save drivers money at the pump. These are standards the auto industry had negotiated and applauded when taxpayers were footing the bill for a huge industry bailout in 2008. Nonetheless, Mr. Pruitt, working with the automakers’ trade group, withdrew that standard without any supporting analysis. Integral to that rollback was EPA Senior Clean Air Advisor William Wehrum, a lawyer for the oil, gas, coal, and chemical industries. During his career he sued the EPA more than 30 times to roll back public health protections. Not only does he have conflicts of interest because of his recent clients, but this record shows he is largely opposed to the EPA’s mission. Recently he was the architect of a new EPA legal interpretation that has the potential to dramatically increase emissions of hazardous, cancer-causing pollutants from industrial facilities all around the country.

Conflicted and corrupted

Mr. Pruitt has also brought Dr. Richard Yamada on board in EPA’s Office of Research and Development. Dr. Yamada previously worked with Rep. Lamar Smith (R–TX) to push forward legislative efforts to give regulated industries more seats on EPA’s Science Advisory Boards, as well as to exclude certain peer-reviewed science from what the agency can consider when implementing health and safety protections. Neither of those efforts was successful in Congress. Undaunted, Mr. Pruitt and Dr. Yamada are pushing their implementation by administrative directives, circumventing the will of Congress. They are busy excluding independent scientists from serving as advisors while packing the Boards with industry-based scientists who have been employed to cast doubt on the need for public health protections.

For example, one of their recent advisory board appointees has argued that “modern air is a little too clean for optimum health” and needs to be dirtier to protect the public. At the same time, Dr. Yamada is crafting rules to exclude many public health studies from consideration unless all the underlying raw data is released to the public. But because they are studies of public health, they rely on the private medical information of real people, which can’t be made public. In other words, the EPA shouldn’t use public health science to protect public health. That’s what I mean when I say some appointees seem fundamentally opposed to the mission of the agency.

The collection of conflicted aides stretches into the dozens.

Another on the list: Liz Bowman, Associate Administrator for Public Affairs and Pruitt’s lead spokesperson (and former chemical industry exec) has sought to mislead the American public about Mr. Pruitt’s long list of scandals. Elizabeth “Tate” Bennett, who previously lobbied with the National Rural Electric Cooperative Association, also faced pushback from Senators over the significant conflicts of interest she would face in her job with EPA’s Office of Congressional and Intergovernmental Relations. Erik Baptist, a former lobbyist with the American Petroleum Institute, joined Pruitt’s EPA as a top lawyer who was approved to advise Pruitt on the renewable fuel law.

And finally, there are the close aides Mr. Pruitt brought in with him to make the unprecedented assault on our children’s and families’ health and safety. One notable name is Albert “Kell” Kelly—disgraced banker (banned from banking for life by the FDIC) and friend of Pruitt from Oklahoma who has no environmental background, but was nonetheless hired to run the cleanup of Superfund sites. Twenty-five million Americans live within 10 miles of these highly toxic industrial waste sites—relics of the days before polluting industries were regulated by the EPA. It should not escape anyone’s notice that these are the good ol’ days that Mr. Pruitt and his inner circle would like us to return to.

So, yes, Mr. Pruitt, we’re ready to say bye bye. But when you go, please take your corrupt and conflicted colleagues with you (more than I could name in a single post). The EPA needs to get back to doing what we need it to do—protect public health and safety. We don’t need the most extreme positions of some industry groups that oppose any and all regulation at the expense of our children and families. We need EPA and its many highly skilled and committed civil servants, scientists, policy experts, administrative professionals, lawyers and enforcement officers to do the jobs that they do so well. On behalf of all of us—the public.

Shell Knew About Climate Risks Since the 1980s, Will it Act Now?

Shell sign at a gas station. Photo: David Nagy, CC-BY-SA-2.0 (Flickr)

The year is 1988. The Wonder Years debuts on TV, George Michael’s “Faith” tops the Billboard charts, gas costs $1.67 at the pump, the U.S. Surgeon General states that the addictive properties of nicotine are similar to those of heroin and cocaine, and Royal Dutch Shell writes a confidential report on climate science and its own role in global warming. This report is one of dozens of internal documents unearthed by journalist Jelmer Mommers of De Correspondent and posted this week on Climate Files that shed more light on what Shell knew decades ago about the risks of burning fossil fuels.

These documents are enormously significant in efforts to hold the company accountable to its stated support for the Paris Climate Agreement. Their release comes a week after Shell rolled out its Sky scenario illustrating a possible pathway for the world to achieve the goal of keeping global temperature increase well below 2 degrees Celsius—and sets up a showdown leading into the company’s annual meeting in The Hague next month, with Shell facing mounting pressure from climate litigation and its own shareholders.

“…it could be too late…”

“The Greenhouse Effect,” a 1988 document marked “CONFIDENTIAL,” details Shell’s extensive knowledge of climate change impacts and implications. It includes this sobering observation:

“However, by the time the global warming becomes detectable it could be too late to take effective countermeasures to reduce the effects or even to stabilize the situation.”

Let that sink in. Shell knew in 1988—30 years ago—the enormous risks of fossil fuels. Not just that greenhouse gas emissions would warm the planet, and that the contribution of its products to heat-trapping emissions could be calculated (as scientists have since done in peer-reviewed research), but that we needed to take action before impacts became detectable in order to avoid them. Those detectable impacts—rising seas, wildfires, flooding, extreme heat—are now affecting every corner of the globe. A 1991 Shell video revealed last year by The Guardian warned of climate change “at a rate faster than at any time since the end of the ice age—change too fast perhaps for life to adapt, without severe dislocation.”

Yet what did Shell do? Along with other major fossil fuel companies, it deceived the public about the risks of its products and kept us on a path of unabated fossil fuel extraction. By 1994, Shell was emphasizing uncertainties in climate science in an updated internal document titled “The Enhanced Greenhouse Effect.” As my colleague Peter Frumhoff told Climate Liability News, “The company goes from saying ‘if we wait until all the scientific questions are answered it may be too late’ to saying ‘we have to wait until all of these scientific questions are answered.’”

Legal Challenges

The newly-released documents are likely to be relevant in climate-related litigation targeting Shell and other major fossil fuel companies. This week, Friends of the Earth Netherlands / Milieudefensie demanded that Shell align its business model with the Paris Climate Agreement or face legal consequences. The organization says it will sue the company if it fails to take action within eight weeks.

The Dutch lawsuit would compound the legal trouble Shell faces from lawsuits by New York City and several California communities over its contributions to climate change. Yet unlike those suits, which seek to recover costs to address climate damages and prepare for future impacts, this case would be the first to call for action by a company to prevent further climate change. It would open a new frontier in climate liability, and in efforts to use the courts to hold corporations accountable to human rights obligations and broad societal norms.

Shareholder concerns

Shell’s deadline to respond to Milieudefensie comes just a week after the company’s annual general meeting, scheduled for May 22 in The Hague. In its latest filing with the U.S. Securities and Exchange Commission, Shell acknowledged financial risks associated with “lawsuits seeking to hold fossil fuel companies liable for costs associated with climate change.” Shareholders should grill company decision-makers about the implications of the newly-released internal documents for current and potential litigation against Shell.

Some shareholders are already calling for changes to Shell’s business model similar to those outlined in the Milieudefensie liability letter. The Dutch organization Follow This has again filed a shareholder resolution requesting that Shell “set and publish targets that are aligned with the goal of the Paris Climate Agreement to limit global warming to well below 2°C.”

Shell launched its Sky scenario last week with great fanfare, and won praise for its pledge last November to double spending on clean power and halve the carbon footprint of the energy it sells by 2050. Shell has gone farther than many of its peers in analyzing what a 2°C or lower scenario would mean for its business, addressing mainstream investor expectations such as the recommendations of the Task Force on Climate-Related Financial Disclosures (TCFD). Yet serious questions remain, including about the company’s reliance on yet-to-be-developed technologies to achieve negative emissions—read more here and here and here.

Just over a year ago, Shell CEO Ben van Beurden remarked that, “Trust has been eroded to the point where it is an issue for our long-term future.” With this week’s revelations, erosion has turned into a mudslide.

Thanks to my colleagues Jean Sideris and Ja-Rei Wang for their contributions to this blog.


Grasping for “Hopeful Signs,” Washington Post Downplays the Dangers of Trump Administration Attacks on Science and Public Health

The headline of a Washington Post editorial board piece caught me off guard last week. It read, “Trump’s record on science so far is a mixed bag.” I read on to try and understand the points made but found myself disappointed and confused by the message conveyed.

Somehow, despite the Trump administration appointing several climate deniers to key public health positions, working more closely with the oil and gas industry on policy than with public health organizations, and failing to name a science advisor, the authors argue that there are “hopeful signs” in the public health arena despite the failures to protect the environment and reasonably approach the risks of climate change.

But to the editorial board of the Washington Post I say: you cannot separate public health from the environment and climate change. Just because the Department of Health and Human Services (HHS) has health in its name doesn’t mean it’s the only federal agency responsible for protecting public health. The fact is that our health is affected by a host of public policy decisions outside the purview of the Centers for Disease Control and Prevention (CDC) and the U.S. Food and Drug Administration (FDA). What about environmental health at the EPA? Occupational health at the Department of Labor? Nutrition and food safety at the Department of Agriculture? I have a hard time thinking of a single federal agency whose decisions don’t affect public health in some way. Every arm of the federal government has a responsibility to protect all Americans from a variety of potential harms, from malnutrition to the devastating health impacts of poverty.

You might not think of the U.S. Department of Housing and Urban Development, for example, as having public health impacts. But, researchers have found that individuals receiving housing assistance from HUD are more likely to have access to health care than those on waitlists to receive housing assistance.

What about the Department of Commerce? The U.S. Census Bureau provides accurate and up-to-date statistics on a range of issues related to health in the United States, ranging from fertility to disability to health insurance coverage. This information helps other government agencies, state health departments, organizations, and the rest of us identify what health interventions might be needed for certain communities.

To narrow the focus of positive scientific developments to just one federal department, and to use what I would consider a low bar for a good record on science, fails to consider the ramifications that the Trump administration’s actions and general disdain for science are having and will continue to have on public health. Yes, it’s true that the National Institutes of Health director and the CDC director are scientifically qualified to hold those positions. It’s true that Alex Azar, the new head of HHS, has publicly stated that the CDC should be able to conduct research on the impacts of gun violence on public health. But we are also almost a year and a half into the Trump administration, and we’ve seen FDA’s Scott Gottlieb delay science-based added sugar labels at industry’s request, the previous CDC director resign over conflicts of interest so serious that she was unable to contribute to discussions ranging from tobacco to Zika vaccines, and HHS remove LGBT health resources from its Office on Women’s Health website last fall and defund over $200 million in teen pregnancy prevention program grants. The list goes on.

While there is promise of science-based policy decisions coming out of HHS, there are also grave concerns about the public health ramifications of this administration’s already very long list of attacks on science. At the EPA alone, the 22 deregulatory actions that Administrator Pruitt has bragged will save taxpayer dollars will actually mean forgone health benefits for communities facing risks from environmental contaminants throughout the country.

For example, Pruitt has issued a proposed rulemaking to begin rolling back the 2015 Coal Ash Rule, which would have required utility companies to monitor ponds storing coal ash waste (a byproduct of coal combustion), report leaks and spills regularly, and line those ponds. If this federal requirement is no longer in place, we will see continued leakage of heavy metals into groundwater and surface water, with known health consequences, including cancer risk, for nearby community members. Administrator Pruitt is doubling down on his misunderstanding of how science should inform policy by hinting at a directive that would enact language from the HONEST Act. That would be an asinine move: it would hamstring EPA’s ability to use a variety of scientific studies to support its policies, drastically weakening its ability to enforce its own standards protecting public health from threats like air pollution and lead contamination.

There have been several attempts at other agencies, like the Department of Interior (DOI), to stop monitoring and collecting data on environmental health impacts altogether. In August, DOI halted a National Academies study it had begun to fund that would have looked at the impacts of coal mining operations on residents of Appalachian states. Then, in December, DOI ordered NAS to cease yet another planned study that would review the department’s ability to inspect offshore oil and gas operations, originally commissioned to help prevent public health risks the likes of which were seen after the 2010 Deepwater Horizon explosion. These research endeavors would have contributed to the body of evidence to help the government better protect public health from oil spills and downstream impacts of coal mining, but the administration’s actions display apathy toward both evidence and protecting public health.

And while there has been some progress in defense of science priorities (the fiscal year 2018 budget that Congress passed, and the President signed, protected strong funding for the science agencies), the fact that the President’s budget initially called for a 30% cut to EPA funding, including its Office of Research and Development and enforcement programs, shows that using science to protect public health is not a priority for this administration. Acknowledging this, we understand how important our role is: to continue defending the scientific work of each agency so that we can preserve as much of our public health safety net as possible, despite the best efforts of political appointees and industry lobbyists to undermine it.

This week marks National Public Health Week as designated by the American Public Health Association, and while we should think about how to better protect the health of the next generation this week, we should remember that there are children at risk every second. All arms of government and all of us have a role to play in improving their lives not just through better access to healthcare, but through public safeguards that minimize or eliminate risk from all exposures. From air quality controls to pesticide tolerance levels to gun safety reforms, it’s our duty to build, not to demolish, the public health infrastructure that will allow the next generation to accomplish all of the feats they aim to take on.

Follow along with the American Public Health Association’s National Public Health Week’s conversations on twitter @NPHW and with the hashtag #NPHW.

Empowering Early Career Scientists to Engage in Science Advocacy, Policy and Communication

Photo credit: Alina Chan, Future of Research

As a member of and an advocate for the early career scientist community, I strongly believe that we are the future of science. We need to engage in activities that allow us to use our voice for the greater good, and we must do this through multiple avenues. Adapting to the changing landscape of the scientific enterprise requires integrating professional development activities into the training of early career scientists, in order to create “whole scientists.” This culture shift will enable us to utilize valuable skills acquired during our training to benefit society.

Two important aspects of this training are developing the ability to explain science to various audiences, and to effectively advocate for the importance of science within our own institutions, to policy makers, and to the general public. In a sense, I believe it is the responsibility of our generation to be the change we want to see, and to lead by example in engaging others to participate in this change with us.

It is encouraging to see that many early career scientists today seek to engage in science advocacy. But in order to achieve our advocacy goals, it is imperative to receive proper training in this area. In 2016, three organizations (Future of Research, Academics for the Future of Science, and the MIT Graduate Student Council) organized a joint “Advocating for Science” symposium and workshop in Boston, MA, with the goal of sharing tools and skills necessary to train early career scientists in advocacy. The event highlighted the eagerness of participants to advocate for a particular cause, with the overall goal of improving specific aspects of the scientific enterprise. Overall, this event catalyzed the power of early career scientists to participate in culture change around science advocacy by preparing them for future engagement opportunities.

Preparing for a career that connects science and society

Like most early career scientists today, I seek a non-academic career that fulfills a greater purpose. At the same time, I am part of a generation of early career scientists that is well aware that our academic training is not preparing us for our desired (non-academic) careers. This is a particularly important consideration given that academic careers are now becoming the minority, and more early career scientists are transitioning into occupations where their scientific skills can be applied toward broader societal impacts. In particular, as science advocacy, policy, and communication careers become more popular with early career scientists, the manner in which they are trained for these career paths must drastically change.

At the core of Future of Research is our mission to “champion, engage and empower early career scientists with evidence-based resources to improve the scientific research endeavor.” To this end, we propose changes in the scientific training environment to enable more effective engagement in activities that complement our scientific training at the bench. The ability to communicate our science to various audiences will only enrich this training and enable us to advocate for our cause. However, this shift requires a culture change around science communication and other skills, both within academia and beyond. While many barriers to enacting this change still exist, early career scientists have in many cases developed their own programs within universities and engaged others in these types of activities.

Developing initiatives at the university level to enhance advocacy, policy, and science communication skills for early career scientists, in which they learn to describe their science to various audiences, is necessary and valuable. Some general examples of these types of efforts are storytelling strategies, podcasts, and university groups. Additionally, while I was a postdoc, I developed a career seminar series to expose graduate students and postdocs to different career options. I also organized symposia to create a sense of community among scientists at all levels in the Midwest within my area of research, to give early career scientists a voice in the event, and to connect them with other junior and senior scientists in the area working on similar research topics.

These efforts demonstrate the willingness of early career scientists themselves to change the culture around particular issues within their local communities. Nationally, many scientific societies and organizations seek to engage early career scientists in advocating for a cause of interest, providing a natural platform in which to advocate for particular issues in various settings and to various audiences. Taking advantage of these opportunities is vital to both our professional development as scientists and to maintaining the relevance of science in society.

Personally, my goal is to advocate for junior scientists. To this end, I have served on local and national committees benefiting graduate students and postdocs (UofL Postdoctoral Studies Committee, ASCB COMPASS, National Postdoctoral Association), and I advocate for this population more broadly through my role on the Future of Research Board of Directors. These leadership roles have allowed me to learn about the needs of early career scientists and devise ways to best engage them in changing the academic culture. These experiences have also enabled me to create and be part of a network of professionals who share these goals, individuals who were also instrumental in guiding my own career path toward researching and advocating for improved policies affecting early career scientists.

Find science advocacy, policy, and communication opportunities through Science Rising

There are many ways for early career scientists to demonstrate interest in these activities and to engage others in our cause. Joining organizations such as Future of Research and the Union of Concerned Scientists is a positive way to demonstrate commitment to the advocacy causes we feel passionate about. Participating in local policy meet-ups with groups such as Engaging Scientists and Engineers in Policy can also be a way to show interest in particular policy issues affecting scientists or the general practice of doing science, as well as broader issues related to the relationship between science and society. You can find out about more opportunities and resources related to advocacy, policy and science communication through Science Rising, a new effort designed to celebrate the connections between science and society, and showcase opportunities for science supporters around the country to get more involved in advocating for science within their community as well as nationally.

Future of Research recognizes the importance of engaging early career scientists in shaping the scientific enterprise in an evidence-based manner. At the same time, this population seeks to engage with various stakeholders in advancing and advocating for the importance of science in society. We are proud to support the Science Rising movement and encourage the involvement of early career scientists in such national efforts.

Early career scientists still face many barriers to moving ahead towards effecting change. For this reason, we need everyone to get involved. Whether it’s designing career development programs on your campus or exploring ways to engage in science advocacy as a constituent, there are many ways to make a broader societal impact with your science.

 

Adriana Bankston is a bench scientist turned science policy researcher. She is a member of the Board of Directors at Future of Research, a nonprofit organization with a mission to champion, engage and empower early career scientists with evidence-based resources to improve the scientific research endeavor. Her goals are to promote science policy and advocacy for junior scientists, and to gather and present data on various issues in the current scientific system. Previously, she was a postdoctoral research associate at the University of Louisville. Adriana obtained a B.S. degree in Biological Sciences from Clemson University and a Ph.D. degree in Biochemistry, Cell and Developmental Biology from Emory University. Find her on Twitter at @AdrianaBankston

Science is Rising. Will You Rise With Us?

At last year’s March for Science, many wondered what would come next. Would the march be a blip, or did it represent a new era in science activism? We find that the enthusiasm for defending the role of science in public life has only deepened. Scientists and their allies went right from the streets into their communities and legislators’ offices, planning for the long haul.

At the same time, many scientists and scientific groups want to build on the successes and learn from the mistakes of others. That’s why today, UCS is partnering with a variety of science organizations to launch Science Rising, an initiative to help others make connections and put science to work for justice and the public interest.

Scientists are self-organizing, and are doing so around science issues in areas where they have not been as vocal in the past.

Scientists in Missoula, MT met with Republican and Democratic state legislators in March to better understand how to effectively communicate with elected officials.

Over the last year, some efforts were directly supported by UCS. The Penn State Science Policy Society convened a listening session to build relationships between scientists and community leaders. In Iowa, scientists organized a major advocacy day to urge the state legislature to restore funding for a sustainable agricultural research center. At the University of Washington, graduate students developed a workshop to better understand how climate policy works in Washington state.

Many efforts were not. The Data Refuge Project, made famous for safeguarding government data, is building a storybank to document how data connects people, places, and non-human species. The city of Chicago, which hosted climate change information on its website that the EPA took down, held an event in February to discuss how the city can support expanded access to climate and environmental data on its website and support research related to environmental policy decisions.

Scientific societies and universities have jumped into the fray. From Cambridge, Ohio to Corinth, Texas, the Thriving Earth Exchange of the American Geophysical Union is helping scientists and community leaders tackle climate change and natural resource challenges. Then there’s the Concerned Scientists at Indiana, a recently-convened group independent from UCS that has held multiple events and trainings for scientists in Bloomington. 500 Women Scientists, which formed after the 2016 election, held science salons—public talks—around the world during Women’s History Month to raise funds for CienciaPR, which promotes science education and research in Puerto Rico.

Make no mistake: this is just the beginning. There’s a thirst among scientists to create the infrastructure that is necessary to share ideas, amplify efforts, and keep the momentum going. And it’s clear that scientists are ready to stimulate conversation around science policy and take actions that restore it to its rightful place in policy decisions. As we head into the midterm elections, scientists and their institutions will be increasingly active in getting congressional candidates to articulate where they stand on science issues and whether and how they plan to hold the Trump administration accountable.

But to really catalyze the energy that is out there, we have to pool resources. On the Science Rising website you can explore how to organize events in your community that help you stand up for science. How to connect with legislators and inform media coverage. How to organize events, get active on social media, connect with local community organizations. And ultimately, how to inspire others to follow your lead.

Zachary Knecht, who leads the Brandeis Science Policy Initiative, is excited about making connections through this project. “Even in our interconnected world, academic life can be very isolating,” he said. “It’s absolutely critical for those of us engaged in the science policy sphere to be able to coordinate our efforts, and Science Rising provides a centralized platform to do that.”

Find out what’s happening in your area. Check out what other scientists and groups of scientists are doing that you can emulate. Figure out how to take your own interest in activism to the next level. Other groups and individuals will share what they are doing and learning. Make your contribution at sciencerising.org.

Newsflash: Better Fuel Efficiency is Good For Jobs

Factory worker in a car assembly line.

For all the rhetoric coming from the administration around proposed rollbacks to the EPA’s vehicle emission standards, one would think that existing standards are somehow inflicting damage on our economy.  EPA administrator Scott Pruitt even gave a shout out to the “Jobs” signs at the event where he announced the EPA will be rolling back the standards.  But he’s got it all wrong. Keeping the standards strong is the best way to help grow jobs and support our economy.  Investing in technology advancement in the auto industry and saving consumers money on fuel – both outcomes of clean car standards – help to create jobs and make our economy stronger.

“I love these signs, particularly the ones that say ‘Jobs’.” EPA Administrator Pruitt, April 3, 2018 in announcing rollbacks to federal vehicle emission standards. Analysis by Synapse Energy Economics shows that keeping emissions and efficiency standards strong will create jobs.

A new analysis by Synapse Energy Economics examined the state and federal clean car standards currently on the books through 2025 to estimate their impact on US jobs and the US economy.  It found that clean car standards will:

  • Add more than 100,000 jobs in 2025 with that number increasing to more than 250,000 in 2035.
  • Increase US gross domestic product by more than $13 billion in 2025 and more than $16 billion in 2035.
  • Save consumers nearly $40 billion in annual fuel costs by 2025 and $90 billion by 2035.

For details, see our fact sheet: Cleaner Cars Are Good for Jobs.

Why the good news?

So wait a minute.  Doesn’t it cost money to make cars more efficient and less polluting?  Yes.  But just like that more efficient refrigerator might cost a little extra upfront, the lower operating costs more than make up for it, leaving more money in your pocket to spend how you like.
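To make the refrigerator analogy concrete, here is a minimal back-of-the-envelope sketch. All figures (annual mileage, fuel economy, gas price, and the added technology cost) are hypothetical placeholders for illustration only, not numbers from the Synapse analysis or the standards themselves:

```python
# Hypothetical illustration: upfront technology cost vs. lifetime fuel savings.
# None of these figures come from the Synapse analysis; they are placeholders.

def lifetime_fuel_cost(miles_per_year, mpg, gas_price, years):
    """Total fuel spending over the ownership period."""
    return miles_per_year / mpg * gas_price * years

baseline = lifetime_fuel_cost(miles_per_year=12_000, mpg=25, gas_price=2.75, years=10)
efficient = lifetime_fuel_cost(miles_per_year=12_000, mpg=35, gas_price=2.75, years=10)

extra_upfront_cost = 1_500  # assumed added cost of efficiency technology
fuel_savings = baseline - efficient
net_savings = fuel_savings - extra_upfront_cost

print(f"Fuel savings over 10 years: ${fuel_savings:,.0f}")
print(f"Net savings after extra upfront cost: ${net_savings:,.0f}")
```

Even with rough placeholder inputs like these, fuel savings over a typical ownership period can far exceed the assumed upfront cost, which is the basic mechanism behind the consumer-savings estimates.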

It turns out, savings from improved fuel efficiency add up to billions of dollars every year. Americans have already saved more than $57 billion at the pump since 2010 because of clean car standards. And spending those savings on things other than gasoline is a whole lot better for our economy. (This is old news – I wrote about this in 2011).

In addition, the standards drive the auto industry to innovate. That means more R&D, manufacturing and engineering, creating jobs throughout the supply chain.

What did Indiana University get wrong?

In its final determination notice to revise the standards, the administration cites a study by Indiana University, paid for by the Alliance of Automobile Manufacturers, which concluded that clean car standards would cause near-term job losses but be positive in the long run.  Synapse’s analysis, however, found both short-term and long-term economic benefits.  Why the difference?

The Indiana University study’s macroeconomic modeling assumes all consumers use cash to purchase their vehicles—in fact, only 30 percent do so—and assumes consumers do not factor fuel economy into their vehicle purchasing decisions, even though evidence shows consumers value fuel economy as well as price when purchasing a vehicle. These erroneous assumptions led to erroneous results that just don’t hold up.
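A quick sketch shows why the all-cash assumption matters. When the added technology cost is financed, the relevant comparison is the monthly loan payment increase against the monthly fuel savings. All figures here (loan terms, mileage, fuel economy, prices) are hypothetical, not drawn from either study:

```python
# Hypothetical sketch: financed technology cost vs. monthly fuel savings.
# Figures are illustrative only, not from the Indiana University or Synapse studies.

def monthly_payment(principal, annual_rate, months):
    """Standard amortized loan payment formula."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)

extra_cost = 1_500  # assumed added technology cost, rolled into a 5-year loan
payment_increase = monthly_payment(extra_cost, annual_rate=0.05, months=60)

miles_per_month = 1_000
gas_price = 2.75
# Monthly fuel cost at 25 mpg minus monthly fuel cost at 35 mpg
fuel_savings = miles_per_month / 25 * gas_price - miles_per_month / 35 * gas_price

print(f"Monthly payment increase: ${payment_increase:.2f}")
print(f"Monthly fuel savings:     ${fuel_savings:.2f}")
# When fuel savings exceed the payment increase, the buyer comes out
# ahead from month one -- an effect a cash-only model cannot capture.
```

Under these placeholder assumptions the financed buyer is cash-flow positive immediately, which is exactly the dynamic lost when a model treats every purchase as all-cash.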

Bottom line: Federal and state clean car standards drive the deployment of more fuel-efficient vehicles. Developing and building these vehicles creates thousands of new jobs, while the money consumers save on fuel can be spent on other goods and services, boosting the economy overall.

The administration’s actions to weaken standards will hurt US jobs and the US auto industry, despite what their signs say and how much Administrator Pruitt loves them (starting at minute 2:33).

5 Things the EPA Gets Wrong as it Re-Evaluates the Fuel Efficiency Standards (and One Thing it Ignores)

Industry representatives and Administrator Pruitt looking quite pleased at the press conference where they rolled out their rollback of the fuel efficiency standards. Left to right: Peter Welch, NADA; Administrator Pruitt; Mitch Bainwol, Alliance; John Bozzella, Global. Screenshot from C-SPAN

On Monday, April 2, the EPA released a “redetermination” of the incredibly popular and successful car and light truck global warming emissions standards – spoiler alert – EPA said that the standards are not appropriate and need to be weakened.  As a reminder, the Obama administration previously completed the mid-term evaluation of the standards and issued a Final Determination that the standards are appropriate through 2025.  Within a month of taking office, Administrator Pruitt promised that he would redo the Final Determination and voilà – here it is.

Reading the EPA’s redetermination is mind-boggling – it is basically a regurgitation of industry talking points put forward by the Alliance of Automobile Manufacturers (Alliance) and Global Automakers (Global) in the public record.

Some comments that opposed the auto industry talking points are alluded to in the document, but none receives a substantive evaluation. Nothing approaching a robust technical debate is presented in this report; it is simply declarative, substituting the Administrator’s political will to side with industry for the hard analysis and scientific rigor found in the 2017 Final Determination.

Although the redetermination is full of questionable assumptions and strange conclusions, we picked five falsehoods that are core to their reasoning and explain why they’re wrong.

Falsehood 1

What they say: Vehicle costs were underestimated in the EPA’s original record that was foundational to the first Final Determination.

Why they’re wrong: When it comes to technology costs, EPA ignores the large number of peer-reviewed publications from its own technical staff showing how manufacturers can meet the 2025 standards, even without significant penetration of plug-in electric vehicles or strong hybrids.  It takes at face value automaker claims about the level of technology needed to achieve the standards, without actually examining the studies the automakers cite in making those erroneous claims – studies which in fact contradict the automakers’ assertion that significant penetration of advanced technology is necessary.  It also ignores the latest evidence on the vehicle costs needed to meet the rules.

Falsehood 2

What they say: Gas prices have changed since the rule was finalized in 2012.

Why they’re wrong: Gas price projections did change between 2012 and 2018.  However, when the agency updated its analysis for the mid-term evaluation and issued the Final Determination in January 2017, it took that into account.  The projected gas prices used in the previous administration’s Proposed and Final Determinations are nearly identical to current gas price projections.  Why the current EPA decided to focus on this and call it a reason to re-evaluate the Final Determination is beyond me.

In one place, the redetermination exclaims that “lifetime fuel savings to consumers can change by almost 200 percent per vehicles based on the assumption on gas prices according to the 2016 Proposed Determination (Table IV.12).”  This is true.  A quick look at the table (below) clearly shows that fuel savings can go from good to great depending on the gas prices expected in 2025, ranging from $1,439 to $4,209 over the lifetime of the average vehicle, which is all good news for consumers.
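That “almost 200 percent” figure checks out with quick back-of-envelope arithmetic on the two dollar amounts from the table (a sketch only; the savings figures are the ones quoted above from the 2016 Proposed Determination):

```python
# Back-of-envelope check of the "almost 200 percent" swing in lifetime
# fuel savings, using the low and high gas-price cases quoted above.
low_savings = 1439   # lifetime fuel savings ($) under the low gas-price assumption
high_savings = 4209  # lifetime fuel savings ($) under the high gas-price assumption

# Percent change from the low-price case to the high-price case
pct_change = (high_savings - low_savings) / low_savings * 100
print(f"{pct_change:.0f}%")  # prints "192%" -- i.e. "almost 200 percent"
```

Either way, the consumer comes out ahead; the swing is between saving a lot and saving a great deal.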

Falsehood 3

What they say: “Consumers’ preferences are not necessarily aligned to meet emission standards and there is uncertainty on this issue that merits further consideration.”

Why they’re wrong: They went out of their way to say that consumers don’t want fuel-efficient vehicles, which is not what the data show.

They cite an automaker point that only 5% of 2017 sales of normal gasoline-powered vehicles would meet 2025 standards. I don’t know why they would expect today’s vehicles to meet standards 8 years out.  The whole point of the standards is to make sure that vehicles get more efficient over time.

Auto manufacturers redesign vehicles every five years or so – it is in these product redesigns that they make major changes to the body style and to the efficiency of the engine and other components.  In eight years, every vehicle will go through at least one redesign, which is plenty of opportunity to make vehicles more efficient so they meet the standards.

It’s worth noting that models of popular vehicles like the Ford F-150 and Toyota Camry already meet targets well into the future—there is lots of opportunity to improve the efficiency of these vehicles and ample technology to do so, as reams and reams of research ignored by the agency can attest.

In addition, because the standards are based on a fleet average, not every vehicle needs to be exactly in compliance every year.  There are flexibilities built into the program that allow manufacturers to bank and borrow credits over time, because it is understood that vehicles will be more efficient right after a redesign and may fall short of the standards as they approach their next redesign.

They also show misleading data on consumer uptake of electric vehicles.  Plug-in electric vehicle sales are increasing every year, and as more models are introduced in varying sizes, more consumers will be able to consider them as an option for their lifestyle. Moreover, hybrid sales also grew from 2016 to 2017; conveniently, EPA excluded 2017 because its chart was lifted from Alliance comments rather than built on any independent analysis.

Lastly, multiple polls have shown that consumers strongly value fuel economy. An NRDC poll from 2016 showed that 95% of Americans agree that “Automakers should continue to improve fuel economy for all vehicle types” and 79% believe that “The U.S. government should continue to increase fuel efficiency standards and enforce them”. Consumers Union has also published multiple polls showing that nearly 9 in 10 Americans think automakers should continue to raise vehicle fuel economy.  And a poll released by the American Lung Association last week showed that after people hear balanced arguments from each side, their support for the standards increases slightly. It’s almost like I’m not alone in wanting to spend less money at the gas station.

Falsehood 4

What they say:  Consumers will be priced out of the market by these standards.

Why they’re wrong: Consumers are the greatest beneficiaries of these standards.  As noted above, consumers stand to save thousands of dollars in fuel costs over the lifetime of their vehicles. In fact, consumers who finance their vehicles save money as soon as they drive their new cars off the lot, because the marginal cost the fuel-saving technology adds to their monthly payment is far exceeded by the money they save on fuel every month.

They also claim that average new-car transaction prices have increased as a result of the standards, a point which has been debunked repeatedly.  For example, Consumers Union showed that new car prices have remained relatively flat over the past 20 years after adjusting for inflation, and used car prices have fallen.  Similarly, auto analysts Alan Baum and Dan Luria showed that transaction prices are rising as a direct result of automakers upselling luxury packages to increasingly wealthy consumers.  All of this ignores consumers who are saving money right now by paying less at the pump – savings which recent research shows disproportionately benefit low-income individuals, a study Administrator Pruitt acknowledged and then ignored.

Falsehood 5

What they say: The growing preference for larger vehicles over cars makes it harder to comply with the standards.

Why they’re wrong: The popularity of SUVs and light trucks doesn’t undermine the standards—it reinforces the need to maintain their strength.  Rather than setting a single greenhouse gas emission target for the average vehicle sold by a manufacturer, which is what the original vehicle standards did in the 1970s, the new vehicle standards consider the size and type of the vehicles sold to determine each manufacturer’s target. This ensures that all vehicles improve their efficiency, including trucks and SUVs, while giving automakers flexibility in hitting their targets based on the vehicles they sell. This system means that no particular vehicle model needs to be “in compliance”; some vehicles can achieve greater fuel economy and others less in a given year, and the manufacturer’s fleet can still comply with the standards.
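To make the averaging point concrete, here is a toy sketch of fleet-average compliance. The model names, sales volumes, and emissions numbers are all hypothetical, and this is a simplified illustration rather than the EPA’s actual footprint-based formula:

```python
# Toy illustration of fleet-average compliance (hypothetical numbers,
# not the EPA's actual footprint-based compliance calculation).
# Each entry: (model, units sold, actual g CO2/mile, target g CO2/mile)
fleet = [
    ("compact sedan", 50_000, 180, 200),  # beats its target
    ("midsize SUV",   30_000, 260, 250),  # misses its target
    ("pickup truck",  20_000, 300, 310),  # beats its target
]

total_units = sum(units for _, units, _, _ in fleet)

# Sales-weighted averages of actual emissions and of the targets
avg_actual = sum(units * actual for _, units, actual, _ in fleet) / total_units
avg_target = sum(units * target for _, units, _, target in fleet) / total_units

# The fleet complies when its sales-weighted average emissions meet the
# sales-weighted average target, even though the midsize SUV individually
# exceeds its own target.
print(avg_actual <= avg_target)  # prints "True" for these numbers
```

Because each vehicle’s target scales with its size and type, a shift toward trucks and SUVs shifts the fleet’s target too; it does not break the program.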

What’s missing from the redetermination?

What they don’t say: Weakening the global warming emission standards endangers public health and welfare by contributing to global warming

Missing from the Revised Final Determination is any mention of climate change or its impacts, which endanger Americans now and into the future and are the reason that EPA sets these standards. Scientists warn that we must significantly reduce emissions of global warming pollutants to avoid the worst effects of climate change, including sea level rise, wildfires, and infectious diseases.  As it stands now, no other federal policy is delivering greater global warming emissions reductions than these vehicle standards. If the EPA completely rolls back the regulations, as some have signaled, that will mean an additional half billion tons of global warming emissions just from the vehicles sold between 2022 and 2025.  Doing so would make hitting our obligations under the Paris Climate Accord a virtual impossibility, significantly damaging our ability to hold global warming to 2 degrees Celsius.

We knew that this day was coming, but the extent to which this redetermination relies solely on industry arguments and ignores the robust analytics that underlie the original Final Determination is confounding.  It makes me think about the story that came out around Administrator Pruitt’s confirmation, when we learned that he took a letter written by a Devon Energy lobbyist, put it on his Oklahoma Attorney General letterhead, and submitted it to the Department of the Interior.

This redetermination feels like that – like he just read the Alliance and Global comments and used their quotes to rewrite the determination.  It’s a slap in the face to everyone who cares about data, analytics, scientific integrity, and our climate.  We know he’s going to propose rolling back the standards in the proposed rule that we expect to see this summer.  The question is by how much.  We will keep a close eye on this and let you know what he proposes and ask for your help in keeping the standards strong.

 

Five (More) Pruitt Scandals That You Should Know About, But Probably Don’t

EPA Administrator Scott Pruitt has been hit with a string of brewing corruption scandals that go beyond merely ordering a soundproof “privacy booth” that cost taxpayers up to $25,000, or spending over $105,000 on first-class flights in his first year on the job.

Stories broke this week that Pruitt allegedly: (1) rented a condo linked to lobbyists who represented an oil pipeline project the EPA approved last year and who donated to Pruitt’s campaign to become Oklahoma Attorney General, (2) looked into renting a private jet that would cost U.S. taxpayers $100,000…a month, (3) gave two of his staffers raises under a provision of the Safe Drinking Water Act (SDWA), but not to actually work on water safety, and (4) lied to Congress about his use of private email to conduct government business.

This misuse of taxpayer dollars and pay-to-play corruption is infuriating – though unsurprising given what else is going on in this Administration – and indicates the type of person we have in charge of the federal agency charged with protecting our air, water, and health.

Pruitt doesn’t give two quarks about the U.S. taxpayer. Instead, these potential ethics violations highlight Pruitt’s penchant for using the office of the EPA Administrator to further industry interests, give handouts to his inner circle, and hide the evidence all along the way.

Although these corruption allegations are making headlines today, equal or greater attention must be paid to Pruitt’s somewhat quieter crusade to roll back regulations designed to protect our health and environment. These regulatory “reforms” haven’t garnered as much press coverage as Pruitt’s lavish spending, lobbyist connections, or shady dealings, but they are based on the same corrupt moral code Pruitt brings to the EPA and will cost more than just taxpayer dollars.

So, here are 5 regulatory rollback efforts currently underway at EPA that will impact our health and safety, and should be making headlines too:

(1) Rollback of the “Glider Truck Rule”

Glider trucks are brand-new truck bodies into which manufacturers cram old, polluting engines, so that the truck looks new from the outside but has an ancient polluting relic of an engine on the inside. The particulate matter emissions from these vehicles are estimated to cause 1,600 premature deaths each year. Pruitt’s EPA reopened a loophole that was closed under the Obama Administration so that these trucks can once again be made, primarily by a single company that has funded bogus science and anti-EPA politicians.

(2) Delay of Chemical Safety Standards

EPA had previously designed a suite of updated rules to protect fence-line communities and first responders from chemical accidents that happen like clockwork across the country. Over 2,000 incidents were reported between 2004 and 2013: more than 17,000 people were injured, 59 people were killed, and over 400,000 people were evacuated or ordered to shelter in place because of a chemical-related accident. Upon entering office, Administrator Pruitt put this rule on hold, pushing its effective date back by almost two years. In just the last year, 33 more accidents occurred at facilities covered by these rules, and the consequences of those accidents may have been lessened or avoided had EPA not initiated this delay.

(3) Rollback of the light-duty vehicle fuel efficiency standards

The light-duty vehicle fuel efficiency standards are a joint effort by EPA and DOT to improve the average fuel efficiency of passenger vehicles. These standards have been shown to not only reduce oil use and pollution, but also create jobs and give consumers more fuel-efficient vehicles across all vehicle classes, and they have been widely supported. But the EPA now doesn’t care about any of that: it has signaled that it will reconsider or roll back the fuel efficiency rules covering vehicle model years through 2025. This decision overturns thousands of pages of hard evidence, good science, and sound data, and undercuts one of the most important climate policies still on the books.

(4) Repeal of the Clean Power Plan

Last fall, the EPA issued a proposal to repeal the Clean Power Plan, a policy designed to reduce emissions from electric power generation approximately 30 percent below 2005 levels by 2030. By the EPA’s own estimates, the Clean Power Plan would prevent 90,000 pediatric asthma attacks and save 4,500 lives each year. The agency is still taking comment on that proposal, which, in addition to making a mockery of the value of human health and the environment, attempts to reinterpret the Clean Air Act as well as how the power system works in order to avoid the need for meaningful regulatory action. UCS, along with 250,000 others, submitted comments to EPA in support of maintaining this policy, though EPA is forecast to rule in favor of industry, rather than individuals.

(5) Failure to require mining companies to clean up their waste

Pruitt’s EPA decided not to finalize a proposal that would have required mining companies to prove they have the financial means to clean up pollution at mining sites, despite an industry legacy of abandoned mines that have fouled waterways across the country. The estimated 500,000 abandoned mine lands in the U.S. can pollute waterways for more than 100 years and pose significant risks to surface and ground water. Pruitt claimed that requiring financial assurances from mining companies with a long history of pollution was unnecessary, and that enforcing a regulation that makes mining companies clean up their mess would impose an undue burden.

Guess who now has to foot these bills? The U.S. taxpayer. EPA spent $1.1 billion on mining cleanups between 2010 and 2014 and EPA’s own documents report at least 52 mines and their processing sites have had spills and pollution releases since 1980. Hard-rock mining companies would have faced a combined $7.1 billion financial obligation under the dropped rule, costing them up to $171 million annually to set aside sufficient funds to pay for future cleanups, according to an EPA analysis. These costs will likely now be borne by the taxpayer instead of the responsible parties.

 
