UCS Blog - The Equation (text only)

A Graduate Researcher’s (Brief) Guide to: Creating a Student Science Policy Group

Panel of speakers at the Opioid Epidemic Forum.

Research, telescopes, and computer models may consume the thoughts of many STEM graduate students, but do you ever find yourself distracted by current events? Are you ever caught up in conversations about how to fix problems in society? Have you ever “geeked out” about research that influences laws or policy? If you’re a graduate student and this sounds familiar, you have options: 1) ignore your burning desire to do something or 2) start a science policy group.

Assuming you’re considering option 2, the first and most common question you will have to tackle, both for yourself and for others, is “What is science policy?”

Defining science policy

In short, it refers to the rules and regulations that govern the scientific workforce, or the use of science to inform rules and regulations. After starting a science policy group in graduate school, other graduate student members and I began to realize how nebulous this definition is. After meeting with many policy professionals, we realized that saying “I want to be involved in science policy” is about as specific as saying “I want to be involved in science”.

You should determine which approaches and which topics you would like to focus on for your science policy engagement. There is advocacy (addressing legislators), diplomacy (international policy efforts), education (science communication and awareness initiatives), and of course policy itself (informing or crafting rules and regulations). Using these approaches, there are many challenges you could address (e.g. scientific workforce issues, specific problems such as climate change or infectious diseases, STEM education). The fact that federal agencies, scientific societies, and not-for-profit organizations commonly devote significant portions of their resources to science policy signals the scale of these issues, and it shows that it may take more than one motivated person to make a significant impact (even within your community).

Gathering a team

SPADE team

Creating a science policy group with driven members will allow you to help more people, as well as share the credit (and workload) for grand initiatives. Seek out like-minded graduate students with an interest in creating change, but also appreciate that promising members can and should be found across a variety of academic fields. This gives your group the expertise and awareness to explore a wide range of issues. For my group, holding introductory meetings and sending recruitment emails through our graduate student government and graduate program coordinators proved an effective strategy. You can also form collaborations with other groups on or off campus to expand your reach for members.

When you have a core group of students, create an executive board with titles (e.g. President, Treasurer, Commander Pikachu). Not only do titles sound “fancy”, they also establish an expectation of duties, which saves time when planning initiatives. Another important task is to find a faculty advisor who has experience or an academic focus in science policy. This satisfies club rules on some campuses (which could give your group access to funds), and it lets you tap into your advisor’s experience and network (particularly helpful when searching for a guest speaker for an event).

Now what do you do?

So you’ve got your group and an advisor, what do you all do now?

As many topics related to science policy are national matters, it can be difficult to figure out how your rag-tag group of students will fit into the science policy landscape. Fortunately, there are many ways to engage with science policy topics, and your group may find some original ones. Based on my experiences, these are some common approaches student groups use to address issues:

Guest speaker events—Inviting a policy expert or professional to an event your group is hosting, or to a panel being held on campus, is a good way to get your group’s feet wet and establish yourselves as “active”. If science policy does not yet have a big presence on your campus, your initial speaking events may be more effective (and better attended) if they are geared towards a general “Careers in Science Policy” discussion.

Forums—Similar to guest speaker events, forums allow your group to invite several policy experts to a single event to explain to the public or to other experts the research, concerns, and proactive actions surrounding an issue. For example, the opioid epidemic is a pervasive problem within our local community. To address this, our group planned an Opioid Epidemic Forum. We hosted a physician, a policy expert, a police officer, and two New York state senators to inform and empower the Long Island community.

Consider offering additional initiatives at your event to enhance your public service. For example, at the forum we also offered a Narcan training session for participants and an excess-opioid drop-off box (overseen by the Suffolk County Health Department and the Suffolk County Police Department, respectively).

Advocacy—Your group could also travel to Washington, DC, or attend local in-district meetings to discuss with legislators how an issue is affecting your community and/or how it may impact the scientific workforce. Contact your university’s government relations office and ask about opportunities to talk with local or federal legislators; they are a useful resource, as they often have a direct line of contact to legislators. Additionally, your group could fundraise to subsidize fees for members of your community to participate in, or travel to, local initiatives or marches related to science policy.

Science outreach events—Astound and inform your local community by hosting science events for the public, or join events to discuss (in accessible ways) the latest research you, fellow students, and professors have been working on, how it may impact the public, and why it’s important to know. You could also work with local groups to create campaigns around important but overlooked issues within your community that the public could help address.

Moving forward

If you are still driven to do more after hosting a few events and being active within your community, there are several steps you can take. You can use these initiatives as a template during your journey into academia, helping you start programs that improve the lives of others alongside your research.

If you are driven to make this a career, there are fellowships that can help with (and in some ways are integral to) your transition into the federal government or elsewhere as a science policy expert. Some fellowships, such as the AAAS Science & Technology Policy Fellowship and the Presidential Management Fellowship, are for recent (or soon-to-be) graduates. Others, including the Christine Mirzayan Fellowship, are open to students (domestic and international) who are currently in graduate school and provide them with unique experiences in the world of policy. There are many more besides these—here is a full list of those offered.

Although this was only a brief summary, I hope this was helpful in informing your journey into the world of science policy.

 

Lyl Tomlinson is a Brooklyn, New York native who recently obtained his Ph.D. in Neuroscience at Stony Brook University. As a post-doctoral researcher, he investigates the effects of aerobic exercise on important support cells in the brain (oligodendrocytes). He is also a science communication professional who often asks: “Would my grandma understand this?” Using this question as a guiding principle, he competed against roughly 100 scientists and won the 2014 National NASA FameLab science communication competition, which asks researchers to explain science topics accessibly in 3 minutes. He is also a longtime associate of the Alan Alda Center for Communicating Science and has been recognized as an “Alda All-Star”. While in graduate school, he was a co-creator and acting president of a graduate student-led science policy group, Scientists for Policy, Advocacy, Diplomacy and Education (SPADE). His work through this group gave rise to an action-oriented local Opioid Epidemic Forum, an official graduate-level Introduction to Science Policy course, and several other initiatives. Lyl also meets with government representatives to advocate for science issues and regularly develops programs at Stony Brook to tackle problems related to scientific workforce matters. Find Lyl on Twitter at @LylT88

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

For a Moment, 50+ Percent of CA’s Energy Came From Solar

Photo: Recurrent Energy

On March 4, 2018, solar in California broke a record. Then, on the 5th, it broke another one.

In fact, virtually every day brings new headlines about solar energy’s progress in California and around the world, and solar records are being broken with laudable frequency. Record installation levels, record levels of electricity generation, record investment in solar….

For solar, a whole lot of signs are pointing up.

California: Another solar record (and another, and…)

Spring is a great time for solar, and a great time for solar records. The sun is high and temperatures aren’t yet at summer levels (and solar panels actually like the cooler temps). That makes for excellent solar electricity generation.

The milder temperatures also mean that spring (and fall) are when electricity demand is generally lowest. A stronger numerator (solar production) and lower denominator (energy demand) can make a record-breaking combination.

And, yup, those factors combined to knock it out of the park in California yet again in March, when at one point in time large-scale solar alone met 49.95% of the bulk electricity needs in the territory of the California Independent System Operator (CAISO, which covers 80% of California). Add in what rooftop solar took care of, and that 49.95% figure grows to well over half of electricity needs being met by solar.

An impressive tally—without even taking rooftop solar into account

But wait, there’s more: The very next day, all that solar added up to the largest solar peak to date in the CAISO territory, reaching 10,411 megawatts (more than 10 million kilowatts) for large-scale solar (and a few thousand more megawatts for rooftop).

If that were all on rooftops (it’s not) and composed of typical rooftop residential systems, we’d be talking the equivalent of almost 2 million solar home systems worth of solar capacity.
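
As a rough check of that equivalence (assuming a typical residential rooftop system of roughly 5 to 6 kilowatts, a figure not given in the post): 10,411 megawatts is about 10.4 million kilowatts, and 10,400,000 kW ÷ 5.5 kW per system ≈ 1.9 million systems, right around the “almost 2 million” cited.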

Both of those records were for just a moment, but both are also a testament to the incredible growth solar has experienced in California, the number one state for installed solar capacity. (They also underscore the wisdom of strengthening the electric connections between California and its neighbors so that all of that solar can be put to good use.)

US: Record solar growth in official 2017 results

At the level of the country as a whole, the newly released official 2017 government stats on electricity from the US Energy Information Administration said a lot about solar’s progress, and the records it keeps shattering:

  • Electricity generation from solar, large and small, hit a new record, leaping 41% from 2016’s tally.
  • Solar accounted for a record 1.9% of electricity generation last year—almost double the percentage in 2015.
  • Solar in 2017 generated the equivalent of the electricity use of more than 8.5 million typical US homes.

Another way to look at it, with the longtime fossil fuel king in mind: With coal’s declining piece of the US electricity mix (from almost half in 2008 to less than 30% in 2017), the ratio of coal to solar has fallen from more than 2000:1 to less than 16:1 (Yup, another record). Not there yet, but that sure looks like progress.
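
A quick check of the 2017 end of that ratio, using the shares cited above: just under 30 percent for coal divided by 1.9 percent for solar works out to roughly 16 to 1. Working the 2008 end backwards, a ratio of more than 2000:1 with coal at almost half of generation implies solar supplied only a few hundredths of a percent of US electricity that year.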

Credit: Dennis Schroeder/NREL

World: Solar grows to record levels—and grows more than coal, gas, and nuclear… combined

And then there’s the global picture, and records broken in terms of new solar capacity and new investment. A new report from the United Nations Environment Programme and Bloomberg New Energy Finance looks at renewable energy investments across the world, and finds a whole lot of good happening in solar.

The opening lines of the whole document capture the solar-centric excitement that the numbers provoke (emphasis added):

Solar power rose to record prominence in 2017, as the world installed 98 gigawatts of new solar power projects, more than the net additions of coal, gas and nuclear plants put together. The solar build-out represented 38% of all the net new generating capacity added (renewable, fossil fuel and nuclear) last year.

Think about that for a moment: Once you take retirements into account, as you should, we got more new solar last year than new coal + new gas + new nuclear.

All told, non-hydro renewables accounted for 61 percent of net power generation capacity added in 2017 worldwide, a record, and a consistently growing (and record-breaking) portion of both global power capacity and global power generation.

Source: Global Trends in Renewable Energy Investment Report 2018

For solar, UNEP/BNEF found, China was a big piece of 2017’s progress, “…with some 53GW installed (more than the whole world market as recently as 2014), and solar investment of $86.5 billion, up 58%…” Solar investment, though, grew in both developed (17%) and developing (41%) countries.

And more…

Recent tidings have also brought news of record numbers for solar purchases by US corporations, a new US solar panel manufacturing facility in the works in Florida, and a proposal for a record-breaking battery as part of a hefty proposed solar farm in California.

There are caveats with each of those tidbits, or uncertainties, or, for the US, ways that the current administration or state policies could mess things up.

But taken together these stories paint a picture of a sector continuing to do what we need it to. Solar is on the move, across the country and across the world.

It’s clearly a technology that feels that records were meant to be broken.

Will Congress Give Farmers the Farm Bill They Want?

U.S. Marine Corps veteran Calvin Riggleman holds an oregano seedling and soil on Bigg Riggs farm in Hampshire County, WV. Photo courtesy Flickr/Lance Cheung, USDA

Last week, the chairman of the House Agriculture Committee made headlines by unveiling a truly terrible farm bill proposal, one that dramatically undercuts the nation’s most successful nutrition assistance program and threatens to throw the entire farm bill process into chaos. His committee is set to vote out the measure this morning, though Democrats have rejected it out of hand.

Beyond this highly partisan bill’s cynical slap at millions of low-income people and their communities, there’s also very little for farmers to like. Deep cuts to incentive programs that help them protect water quality, conserve soil, and build resilience to floods and droughts are among the bill’s many disappointing aspects, along with a failure to invest in connecting farmers with new local customers. In stark contrast, a poll released today shows that farmers across the political spectrum are eager for precisely the kinds of tools and incentives House Republicans have firmly turned their backs on. And soon they may be looking for political candidates who will deliver them.

Survey says: Farmers want more support for local, sustainable agriculture

The new poll was conducted in March by Iowa-based RABA Research on behalf of UCS. Using telephone interviews supplemented by an online questionnaire, the researchers queried more than 2,800 farmers in seven states—Iowa, Illinois, Kansas, Michigan, Ohio, Pennsylvania, and Wisconsin—to better understand how they are thinking about farm policy and sustainable agriculture.

You might expect that farmers would regard the farm bill as an important piece of legislation, and the poll shows that they do. Fully three-quarters of them said the farm bill is “somewhat” or “very important” to their personal livelihoods. In an era of deep cynicism about the ability of Congress to helpfully affect the lives of everyday Americans, it’s a striking number, and it particularly contradicts recent news reports suggesting that rural America “doesn’t have time” for the farm bill.

Digging a little deeper, the researchers uncovered even more surprising results:

  • Three-quarters of farmers surveyed said it’s important to the future of farming for farm policies to offer incentives for farmers to take steps to reduce runoff and soil loss, improve water quality, and increase resilience to floods and droughts. That number was even higher in some states—76 percent in Ohio, 78 percent in Kansas, and a whopping 84 percent in Iowa. The finding indicates that farmers are keenly aware of the negative impacts of agriculture on our water and soil resources—and, with extreme weather becoming more common, they are concerned about their ability to cope. Farmers urgently want tools to minimize these impacts.
  • Furthermore, 74 percent said that strengthening the hand of farmers in dealings with companies that control the production chain is important. This view directly contradicts the recent action by the Trump administration and Secretary Sonny Perdue to end the USDA’s Farmer Fair Practices Rules, which would have leveled the playing field for poultry and livestock farmers in contracts with the giant corporations that control meat production and processing and made it easier for those farmers to sue the companies for unfair treatment.
  • Similarly, 74 percent of farmers said farm policies should support research on ways to increase farm profitability by decreasing the need for costly chemical inputs. They might not call it agroecology, but that’s what it is and what it does, and farmers want farm policy to fund more of it.
  • And 69 percent of farmers said policies should help connect farmers with new buyers through marketing arrangements like food hubs and farm-to-school programs. These are the kinds of arrangements UCS and other groups have advocated for in the bipartisan Local FARMS Act.
  • Most astonishingly, these results hold across the partisan divide. Poll respondents spanned the political spectrum but leaned heavily Republican. Across the seven states, 55 percent of respondents were Republican, 20 percent Democratic, and 25 percent other.

 

Graph showing results from farmer survey.

QUESTION: I’m going to read a list of ways that US farm policy can shape agriculture in the years ahead. Answer yes or no to indicate which you think are important to the future of farming.

It’s election season, and farmer-voters are looking for change

Perhaps the most striking finding in the poll is this: Farmers are looking to back political candidates who will deliver innovation and sustainability for agriculture.

A surprising 72 percent of farmers across the seven states said they would be more likely to support a candidate for public office who seemed to favor farm success through sustainable agriculture priorities instead of business as usual. That number was even higher in some states—74 percent in both Michigan and Pennsylvania. (Swing states, anyone?)

And remarkably, that high level of support wasn’t dependent upon party affiliation, but was held by 76 percent of Democrats, 73 percent of Republicans, and 67 percent of those who identified politically as “something else” across the seven states.

QUESTION: If a candidate for public office seemed to favor farm success through sustainable ag priorities instead of business as usual, would you be more or less likely to support that candidate? 

This finding shows that an overwhelming majority of farmers are seeking change in the federal government’s priorities for supporting US agriculture. And why should we be surprised? The poll was conducted just before tensions over trade with China threatened to erupt into a full-scale trade war—in which farmers would be early casualties. But those trade tensions are merely compounding the trouble farmers have faced in recent years as prices of leading US farm commodities have plunged. Farm income is projected to hit a 12-year low this year, leaving many farmers uneasy about the status quo and looking for new solutions.

House farm bill offers less—not more—of what farmers want (and farm groups call BS)

The bill the House agriculture committee will vote on today neglects or actively undercuts precisely the programs that our poll shows farmers want. Existing working land conservation programs provide incentives and technical support for farmers to adopt science-based practices—like planting cover crops and more diverse crop rotations—that reduce erosion and water-polluting runoff, lessen the need for expensive chemical inputs, and build healthy soil to buffer farmers from the impact of floods and droughts. The bill on the table today cuts nearly $5 billion from these programs over 10 years, and completely eliminates the Conservation Stewardship Program—a program so popular with farmers and already so underfunded that in recent years it has had to turn away as many as 75 percent of qualified applicants.

At the same time, the House farm bill as written also largely fails to take up provisions of the Local FARMS Act, a bipartisan proposal meant to expand the customer base for small and midsize farmers while improving access to healthy food (which is inadequate for 15.6 million US households). By overlooking the Local FARMS Act—which includes provisions to strengthen farm to school programs, promote farmers markets, and otherwise build connections between farmers and local consumers, especially low-income individuals and families—the authors of today’s bill are bypassing an opportunity to create jobs and establish reliable revenue streams for struggling farmers while also increasing access to healthy and affordable food for more of our neighbors.

Congress can do better—and we need to tell them

All this has led farm and conservation groups to join health and anti-hunger groups in panning the House farm bill proposal. The National Farmers Union—while attempting to be positive—similarly expressed frustration with its failure to give farmers what they need:

“[C]ongressional leadership has severely hamstrung the committee’s ability to address the six-year, 50 percent decline in the farm economy. While they’ve shown little regard for spending and deficits this Congress, they’ve failed to provide adequate resources for food and agriculture at a time of grave financial strain on family farmers and ranchers. This is irresponsible and harmful.”

The draft bill that the House Agriculture Committee will vote on today is so fundamentally flawed, we don’t expect many opportunities to strengthen it through the amendment process. But here are two areas—the Local FARMS Act and the SNAP program—where we may be able to make the bill more responsive to the needs of farmers and the public interest.

Tell Congress to fight for farmers and healthy food in the farm bill today.

Photo: Lance Cheung, USDA

Colorado Communities Sue ExxonMobil and Suncor for Climate Damages

Suncor Energy owns the only oil refinery in Colorado. Photo: Max and Dee Bernt. CC-BY-2.0 (Flickr)

Two Colorado counties and the city of Boulder are suing ExxonMobil and Suncor Energy, Canada’s largest oil company, to hold them responsible for climate change-related damage to their communities.

The lawsuit, filed Tuesday in a state district court by Boulder, Boulder County and San Miguel County, is seeking compensation for damage and adaptation costs resulting from extreme weather events.

New York City and eight coastal California cities and counties, including San Francisco and Oakland, have filed similar lawsuits against ExxonMobil and other oil and gas companies, charging that they have injured their communities under common law. The Colorado suit is the first by an inland county or municipality.

“Climate change is not just about sea level rise. It affects all of us in the middle of the country as well,” said Boulder County Commissioner Elise Jones. “In fact, Colorado is one of the fastest warming states in the nation.”

Oil Industry Knew About Threat 50 Years Ago

The 1,300-square-mile San Miguel County sits in the southwest corner of the state on the Utah border. About a third of the county’s 8,000 residents live in Telluride, a well-known ski resort town. Boulder, 25 miles northwest of Denver, is the county seat of the 740-square-mile Boulder County and home to nearly a third of the county’s 319,000 residents. The three communities have been ravaged by costly climate-related extreme weather events, including wildfires and flash floods, according to the 100-page complaint. Likewise, each community has launched initiatives to curb carbon emissions and adapt to a changing climate.

The Colorado communities contend that ExxonMobil and Suncor were aware that their products caused global warming as early as 1968, when a report commissioned by the American Petroleum Institute (API), the US oil and gas industry’s premier trade association, warned of the threat burning fossil fuels posed to the climate. Subsequent reports and memos prepared for API and its member companies came to similar conclusions. Regardless, ExxonMobil and Suncor not only continued to produce and market fossil fuel products without disclosing their risks, the complaint charges, they also engaged in a decades-long disinformation campaign to manufacture public doubt and confusion about the reality and seriousness of climate change.

The plaintiffs want the two oil giants to “pay their share of the damage” caused by their “intentional, reckless and negligent conduct.” That share could amount to tens of millions, if not billions, of dollars to help cover the cost of more heat waves, wildfires, droughts, intense precipitation, and floods.

“Our communities and our taxpayers should not shoulder the cost of climate change adaptation alone,” said Boulder Mayor Suzanne Jones. “These oil companies need to pay their fair share.”

Higher Temperatures Hurt Ski Industry, Agriculture

Over the last four decades, wildfires in the Rockies have been happening with greater frequency. According to a 2014 study by the Union of Concerned Scientists (UCS) and the Rocky Mountain Climate Organization (RMCO), the region experienced nearly four times as many wildfires larger than 1,000 acres between 1987 and 2003 as between 1970 and 1986.

Rocky Mountain trees also are being ravaged by bark beetles. Over the last 25 years, the UCS-RMCO report found, beetles have killed trees on regional forest land nearly equal in acreage to the size of Colorado itself. Heat and drought are taking a toll, too, exacerbating tree mortality. If global warming continues unabated, the region likely will become even hotter and drier, and the consequences for its forests will be even more severe.

The average temperatures in Colorado have increased more than 2 degrees F since 1983, according to a 2014 University of Colorado Boulder study, and are projected to jump another 2.5 to 5 degrees F by mid-century. That would have a devastating effect on the Colorado economy, which relies heavily on snow, water and cool weather. A 2017 study by the Natural Resources Defense Council and Protect Our Winters found that low-snow winters and shorter seasons are already having a negative impact on the state’s $5-billion ski industry, the largest in the country. Rising temperatures and drought, meanwhile, threaten the state’s $41 billion agricultural sector.

ExxonMobil and Suncor are Major Carbon Emitters

Both ExxonMobil and Suncor have substantial operations in Colorado. Since 1999, ExxonMobil has produced more than 1 million barrels of oil and 656 million cubic feet of natural gas from Colorado deposits, according to the complaint, and ExxonMobil subsidiary XTO Energy currently produces 130 million cubic feet of natural gas per day from more than 864 square miles across three Colorado counties. There are also at least 20 Exxon and Mobil gas stations in the state. All told, the company’s production and transportation activities in Colorado were responsible for more than 420,000 metric tons of global warming emissions between 2011 and 2015, according to the complaint.

Suncor gas stations, which sell Shell, Exxon and Mobil brand products, supply about 35 percent of Colorado’s gasoline and diesel demand. Suncor, whose U.S. headquarters is located in Denver, also owns the only oil refinery in the state, which produces 100,000 barrels of refined oil per day. According to the complaint, Suncor’s Colorado operations were responsible for 900,000 metric tons of carbon emissions in 2016 alone.

Besides their Colorado facilities, the two companies are partners in Syncrude Canada, the largest tar sands oil developer in Canada. Tar sands oil—a combination of clay, sand, water and bitumen—produces roughly 20 percent more carbon dioxide emissions per barrel than regular crude oil.

ExxonMobil and Suncor are among the 90 fossil fuel producers responsible for approximately 75 percent of the world’s global warming emissions from fossil fuels and cement between 1988 and 2015, according to the Climate Accountability Institute. Over that time frame, the two companies’ operations and products emitted 20.8 gigatons of carbon dioxide and methane.

“Based on the latest scientific studies, the plaintiffs in Colorado, as well as in California and New York City, can now show the direct connection between carbon emissions and climate-related damages,” said Kathryn Mulvey, climate accountability campaign director at UCS. “Given these companies’ significant contribution to climate change—and their decades of deception about climate science—it is long past time that they should be held accountable for the damage they have caused.”

Clean Energy is Happening, With or Without the Trump Administration

Photo: First Solar

Folks waiting for leadership at the federal level to drive our ongoing clean energy transition better get comfortable. It might be a while. Fortunately, one only has to turn one’s eyes outward from D.C.—in just about any direction—to find utilities, corporations, cities, and states taking the reins of our transition to clean energy.

The rationale for clean energy takes many forms

Whether it’s a sense of moral obligation to tackle the threat of climate change, the benefits of being perceived as “green”, or just economic good sense, there’s a strong argument to be made that clean energy is the right way to go. Wind and solar, increasingly being paired with battery storage, continue to impress with low costs and reliable performance. Energy efficiency continues to be our cheapest and most readily available resource out there.

And as utilities and large corporate purchasers of energy navigate our ongoing and historic transition away from coal, these clean energy resources are proving to be a cost-effective, low risk, and clean option for meeting energy needs.

Utility commitments show a desire to diversify away from fossil fuels

There’s a powerful recent wave of utility announcements to cut carbon emissions and invest in clean energy (see here or here, for example). In the Midwest alone, we’ve seen significant commitments from utility powerhouses such as Xcel, Ameren, Consumers Energy, and MidAmerican.

The great news is that the new wave isn’t limited to one region of the country. It’s not about red states or blue states. It isn’t driven by political ideology or, in most cases, regulatory or policy requirements. It’s driven by economics and a growing awareness—by utilities and their customers—of the urgent need to reduce carbon emissions and avoid the most damaging effects of climate change.

Of course, these commitments aren’t necessarily binding, and there are legitimate concerns that utilities are promising clean energy later to bolster arguments for natural gas investments now (as we’ve noted in DTE Energy’s pursuit of a large natural gas plant in Michigan). But the overall trend is unmistakable and promising for our ongoing clean energy transition. More work lies ahead to hold utilities accountable to these commitments and ensure clean energy investments are prioritized in the near term.

Corporate purchasers choosing renewables to power our economic growth

Another area where key players in our energy future are taking clean energy matters into their own hands is in the world of corporate purchasers. These large energy users are increasingly looking beyond their local utility to procure low-cost renewable energy to meet sustainability goals and ensure long-term price stability.

Unlike fuel-fired energy sources, renewable energy prices can be locked in for 20 years or more because the cost of that energy is not dependent on sometimes volatile prices for fuels like natural gas and coal.

According to Bloomberg New Energy Finance’s 2018 Sustainable Energy Report, corporate purchases of renewable energy took off in 2014 and have been a significant driver of renewable energy development ever since. Forty US-based companies have signed on to the RE100 Initiative and pledged to source 100 percent of their energy consumption from renewable energy.

Source: Bloomberg New Energy Finance 2018 Sustainable Energy in America Factbook

While current corporate commitments still make up a small share of overall energy use in the United States, the trend, once again, is unmistakable and a strong signal that our clean energy transition continues to advance despite what happens in Washington DC.

Cities and states continue to take climate change seriously—and demand clean energy solutions

I started this blog by saying that you only had to look outward from D.C. to find progress on clean energy, but yes—even the District is onboard with a clean energy future. In 2016, it strengthened its renewable portfolio standard to require 50 percent renewable energy by 2032. The new law also included provisions to ensure that everyone would have access to clean energy, funding efforts to increase access for lower-income households.

The District’s progress on clean energy is just one example of state and local government forging their own clean energy future. At the end of 2016 we saw Illinois and Michigan strengthen their clean energy standards. California is now considering joining the ranks of Hawaii in pursuit of 100 percent clean energy. And just last week, New Jersey strengthened its renewable energy standard to 50 percent by 2030.

We’ve also witnessed direct action in response to President Trump’s stated intention to withdraw the US from the Paris Climate Agreement: states and cities across the US have pledged that they’re “still in” on meeting the goals of this landmark international agreement.

State members of the US Climate Alliance and city members of the Climate Mayors

Source: Bloomberg New Energy Finance 2018 Sustainable Energy in America Factbook. Sixteen states have committed to reducing carbon emissions, covering more than 40 percent of the US population.

As utilities, corporations, states, and cities step in to fill the leadership void left by President Trump and his administration, clean energy’s future remains bright. Economics and the moral imperative to address climate change continue to be driving forces behind our ongoing transition.

As our collective commitment to clean energy continues to grow, the federal government will have a harder and harder time turning a blind eye.

SNAP Work Requirements Provoke Broad Opposition to House Farm Bill

House Agriculture Committee chair Mike Conaway speaks at a hearing. House Committee on Agriculture Chair Rep. K. Michael Conaway (R-TX) opens the hearing with U.S. Department of Agriculture (USDA) Secretary Sonny Perdue in Washington, D.C., May 17, 2017. (Photo: USDA/public domain)

The nutrition title of the draft farm bill released by the House last Thursday is an affront to millions of individuals and families across the country—many of whom are part of the electorate that put our current political leaders in office. Despite an outcry of opposition from advocacy groups, the public, and Democrats on the House Agriculture Committee, it appears that Committee Chairman Mike Conaway (R-TX) is prepared to push through a bill that would be devastating to rural and urban communities alike.

What is it, exactly, that makes this proposal so devastating?

Under the guise of new work requirements for the Supplemental Nutrition Assistance Program (SNAP), the bill would cut billions of dollars currently protecting people nationwide from the consequences of food insecurity and economic instability. The draft language expands the population subject to work requirements to include caretakers of children over six and people between the ages of 50 and 59, establishes tighter time frames for participants to find work or job training programs, and imposes more severe penalties for those who are unable to do so. The proposed policies would allow participants only a month to secure work or job training for at least twenty hours per week; the first “violation” of these requirements would result in removal from the program for one year, and subsequent violations would result in removal for a period of three years.

Under current legislation, those who are subject to work requirements include only childless adults without disabilities between the ages of 18 and 49 (often called able-bodied adults without dependents, or ABAWDs); the current penalty for failing to secure work or job training placement for at least 80 hours per month is removal from the program for a period of three years. Though the total number of hours required per month remains unchanged, the move from a monthly to a weekly minimum means that participants must also find work that offers steady and consistent hours. This can create additional barriers to program participation, particularly among those facing primarily low-wage employment options.

A Trojan Horse with dire consequences

Family shopping for vegetables at grocery store.

Research shows that SNAP works, alleviating food insecurity and improving the health of families. However, the reauthorization of the farm bill could threaten the program’s effectiveness.

At best, these additional requirements are empty solutions to problems that don’t exist. At worst, they create new ones. Per House Minority Leader Nancy Pelosi, “The GOP’s ‘workforce requirements’ are nothing but a cynical Trojan Horse to take away SNAP from millions of hungry families.”

As we wrote last week, data from the US Department of Agriculture (USDA) counters the notion that working-age adults have become dependent upon SNAP. The populations who might depend on the program for longer periods of time include children, the elderly, and those with disabilities; together, these groups make up about two thirds of all SNAP participants. The population of ABAWDs makes up just a small fraction—only two percent—of all those who stay on SNAP for a period of eight years or longer.

Furthermore, more stringent work requirements won’t do anything to address poverty—on the contrary, they may well exacerbate conditions of food insecurity and economic instability among communities already challenged by a persistent lack of access to resources and opportunities. Data from the Bureau of Labor Statistics show that even for those in the general population, securing a job within three months is often unattainable: last year, nearly 40 percent of those able to work and looking for jobs were unable to find work within 15 weeks, while nearly 25 percent were unable to find work within 27 weeks. To expect that adults who have recently enrolled in SNAP—for reasons ranging from unexpected unemployment to family crisis to natural disaster—should accomplish this task within one month is to set them up for failure.

Of course, House majority leaders have touted employment and training (E&T) programs as the answer to unemployment and underemployment among SNAP beneficiaries. According to Chairman Conaway, SNAP participants will have “guaranteed access” to E&T programs by way of government investment in training and case management. But effective programs come with a price tag, and the $1 billion pledged in the draft bill won’t come close to cutting it. According to the Center on Budget and Policy Priorities (CBPP), that investment amounts to only $28 per person per month for a caseload of 3 million SNAP participants—far less than the typical cost of effective employment programs, and less even than the cost of existing employment services provided by the Temporary Assistance for Needy Families (TANF) program. It’s also worth noting that the bill counters the USDA’s own findings on best practices in E&T programs. A 2016 review of over 160 studies on SNAP E&T and workforce development programs found that the most effective programs serve those who volunteer to participate, rather than following a mandate as a condition of eligibility.
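
The arithmetic behind that CBPP figure is straightforward (assuming the $1 billion is spread over a single year, which is how the estimate appears to be framed): $1,000,000,000 ÷ 3,000,000 participants ≈ $333 per person per year, or roughly $28 per person per month.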

Entire communities will feel the fallout from SNAP cuts

Mechanic working on tractor in garage.

The Committee anticipates that the new work requirements would impact between 5 and 7 million recipients, and that the proposed bill would cause about 1 million people to leave SNAP over the course of a decade. Meanwhile, the CBPP estimates that changes would cause either a reduction or total loss of benefits for more than 1 million low-income households, impacting about 2 million people.

But the economic implications of such severe cuts would extend far beyond program participants, due to the economic multiplier associated with SNAP benefits. A USDA model has estimated that each dollar in SNAP benefits generates about $1.80 in economic activity, particularly in times of economic downturn. This means that the $64.7 billion in benefits administered in FY 2017 could have generated $114 billion in economic activity, with the potential to create and support an estimated 567,000 to 624,000 jobs—including 48,700 to 59,800 in the agricultural sector.
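
As a rough check of that multiplier math, using the rounded figures in this paragraph: $64.7 billion in benefits × $1.80 of economic activity per benefit dollar ≈ $116 billion, in the same ballpark as the $114 billion estimate cited (the small gap presumably reflects rounding of the multiplier). Either way, the scale is well over $100 billion in economic activity.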

And these benefits aren’t limited to food production, distribution, and retail sectors. With each additional SNAP dollar received, program participants can not only spend more on food, but can also afford to spend more on other necessities, such as utilities, car payments, or medical expenses, as some of the income once allocated to their food budget is displaced by SNAP dollars. This means that a wide range of industries end up getting a boost from SNAP benefits—and that many would be adversely affected by dramatic cuts.

An uncertain future for a hyper-partisan House bill

Of course, none of the policy changes contained in the House bill are set in stone—not by a long shot. This week, the Committee will mark up its draft farm bill, the next step in developing a draft that would go to the House floor for further consideration and, eventually, a final vote.

Regardless of the immediate outcome, the draft text that was presented to the public last week must be recognized for what it is: a bald-faced attempt to undermine a program that effectively and efficiently serves some of our most vulnerable populations, and a blatant disregard for how these populations will actually fare. And though it was put forth with seemingly little concern for political fallout or blowback, SNAP is a program that reaches communities—and voters—in every corner of our country, and this attempt won’t be easily forgotten.

Photo: USDA Photo: Plush Studios/Blend Photo: Flavio/CC BY 2.0 (Flickr)

Peter Wright’s Nomination Means Superfund Conflicts of Interest in Almost All 50 States

Last month, the Trump administration nominated Peter Wright from DowDuPont to serve as Assistant Administrator at the EPA’s Office of Land and Emergency Management (OLEM). This office plays a critical role in protecting public health by enforcing the Superfund program (the government’s effort to clean up sites that are contaminated with hazardous waste and pose a health risk). It also implements programs to help protect communities from chemical disasters, and deals broadly with regulations on hazardous waste disposal to best protect public health and hold polluters accountable.

EPA administrator Scott Pruitt has identified the Superfund program as a priority area, which so far has meant taking sites off of the National Priorities List, but not necessarily making sure those sites are adequately cleaned up.

According to his LinkedIn page, Wright has spent 19 years as Managing Counsel at the Dow Chemical Company, which since merging with DuPont in 2017 has been known as DowDuPont. According to the White House, while there, he has provided counsel to the company’s leaders, led the company’s “legal strategies regarding Superfund sites and other federal and state-led remediation matters,” and provided counsel on mergers, acquisitions, and significant real estate transactions for the company.

One of the main functions of OLEM is to hold companies accountable for polluting people’s water, land, and air, which is why it is a huge conflict of interest for someone who has long represented a company that is a major historic polluter with ongoing liabilities to be at its helm. Dow Chemical Company, DuPont, or their subsidiaries are listed as the parent company for over 200 Superfund sites, 50 Risk Management Plan (RMP) facilities, and over 50 Resource Conservation and Recovery Act (RCRA) facilities with corrective actions.

As the potential future head of OLEM, can we really trust that Wright would make objective decisions regarding DowDuPont Superfund sites, ensure its facilities with RCRA permits are cleaning up current problems with mismanagement of hazardous waste, and oversee implementation of the RMP Rule at DowDuPont sites? According to Dow, Wright was “directly involved” with at least 14 cleanup agreements with the EPA, but which sites those are, and whether he advised on or was less formally involved in other cases, is still unclear.

Wright’s Superfund conflicts from coast to coast

A 2013 EPA document lists 167 Superfund sites (both NPL and non-NPL) where Dow Chemical Company, E.I. du Pont de Nemours and Company, or both were listed as a responsible party. An additional 71 sites had Dow Chemical Company subsidiaries listed as a responsible party. An identifiably DuPont-linked site listed in 2014, the US Smelter and Lead Refinery site in East Chicago, has been added to this map as well (Source: EPA).

To get a better sense of exactly how many Superfund conflicts would arise if Wright were to be confirmed as head of OLEM, we mapped the locations of Superfund sites (both on the National Priorities List and not on the list) at which DuPont, Dow, or a subsidiary company were named as a responsible party according to a 2013 EPA list. While parent companies aren’t always financially responsible for cleaning up a subsidiary’s Superfund sites, these locations still pose a potential conflict of interest, as Wright would not be wholly independent of them.

In total, using the 2013 list but eliminating any sites that have since been deleted by EPA, we found 180 still active, proposed, or partial National Priorities List (NPL) sites and 57 non-NPL sites, together tied to DowDuPont in 44 different states. There is one additional Superfund site, the US Smelter and Lead Refinery site in East Chicago, that was listed as a Superfund site after 2013—so was not included in our count—but that is publicly linked to DuPont.

As Dow’s and DuPont’s 2014 through 2017 10-K forms (here, here, here, and here) indicate that more Superfund sites were added than deleted for the companies every year, there are likely additional DowDuPont-linked Superfund sites that are not included on this map. And it’s important to remember that there is a strong financial incentive for companies to evade responsibility and avoid paying for expensive remediation efforts, so they employ several different strategies to do so, including name changes, mergers, and bankruptcies.

DowDuPont has over $200 million in payment obligations for Superfund sites alone, according to its own financial statements. Were Wright to be confirmed, decisions about the timely cleanup of DowDuPont sites across the country would be made by a man who spent two decades on the payroll of that same company. And judging by the way the EPA’s ethics office has so far allowed conflicted appointees to participate in policy issues related to their former employers, it seems likely that Wright will be weighing in on issues in which he has a vested interest from day one.

When conflicts of interest hit close to home

There are over 30 DowDuPont or subsidiary Superfund sites in New Jersey alone. One of the country’s most polluted waterways, Berry’s Creek, is located in the Hackensack Meadowlands and is represented on the map by the large orange dot (Source: EPA).

A little-known fact about my home state of New Jersey is that it is home to the highest number of Superfund sites in the country, with 114 sites on the National Priorities List alone.

Who knew such a small state could pack in so much hazardous waste! No wonder the one superhero hailing from our state is named the Toxic Avenger.

A decision on how to remediate one DowDuPont site in New Jersey, Berry’s Creek, is expected in the next few months. Because the cleanup comes with a large price tag ($80 million, according to DowDuPont), its fate could be decided by Administrator Pruitt himself, who we know has already been swayed by Dow’s influence at least once before, with the counsel of Wright.

For me, Berry’s Creek is not a faraway place. I’ve been there. I grew up nearby. My parents now live a mile downstream. During college, I spent my summers as a chemistry intern, running water and sediment samples from the Hackensack River and other locations in the surrounding Meadowlands, extracting and measuring contaminants ranging from heavy metals to PCBs to pesticides.

These chemicals have contaminated the water and land surrounding the Meadowlands for decades, since the area has been home to landfills, power plants, and other industrial sites that had been free to pollute without penalty until the Clean Water Act was signed in 1972. On one of many water sampling boat trips on the river, we approached a tributary of the Hackensack River, Berry’s Creek, and my environmental scientist colleague explained a little bit about its history and why we might not want to fall off the boat if we valued our health.

Hackensack Meadowlands, New Jersey. (Photo: Flickr/samenstelling)

A mercury processing plant was operated by Ventron/Velsicol adjacent to the creek between 1927 and 1974, and the site was named a national priority site for the EPA’s Superfund program in 1984. The site was acquired by a long list of companies, but its liability has fallen to Rohm & Haas, a wholly owned subsidiary of DowDuPont. According to NOAA, which is working with EPA to evaluate remediation efforts at the site, the mercury levels in Berry’s Creek are among the highest found in any freshwater ecosystem in the United States. The creek is hydrologically connected to the wetlands that surround it, and because it’s a tidal estuary, contamination has flowed both upstream and downstream of the original site. In 2005, EPA and the US Army Corps of Engineers found dissolved mercury concentrations as high as 4,100 µg/L, roughly 2,000 times the New Jersey groundwater quality standard of 2 µg/L! Mercury in the sediment at Berry’s Creek has been recorded as deep as six feet below the sediment surface.

The area is also contaminated with cadmium, chromium, copper, lead, other heavy metals, and PCBs at levels exceeding New Jersey standards, and it will remain so until action is taken by DowDuPont to finally clean it up. It’s only a matter of time before contamination issues are worsened by flood risks resulting from climate change, since the site is low-lying and the NJ towns surrounding the Meadowlands are in a high-risk area.

Wright is a great fit for Pruitt’s EPA (and that’s not a compliment)

My colleague, Andy Rosenberg, detailed why many of Pruitt’s team members and leadership at the EPA are incapable of protecting public health and safety. Wright fits the bill as well.

Wright’s confirmation hearing has not yet been scheduled by the Senate, but I look forward to seeing members of the US Senate Environment and Public Works Committee ask him how exactly he expects to protect us from harmful chemicals and hold companies accountable for pollution prevention when he has direct financial conflicts of interest from his previous employer haunting him in nearly every state. I’d like them to get some clarity on which DowDuPont Superfund sites, RMP facilities, or RCRA facilities he has directly worked on or been involved with in any capacity, and whether he will recuse himself from all decisions made about those sites.

I agree with the Trenton Times editorial board that Wright’s nomination is “horrible, horrible news” not just for the state of New Jersey’s long list of contaminated sites and my friends and family living downstream, but for the rest of the country and for the integrity of the agency and its ability to protect public health over industry profits. I’d love to be proven wrong by Pruitt and Wright and to see Berry’s Creek and other NJ Superfund sites become the cleanup success stories we’ve been waiting for, for as long as we’ve been the NY metropolitan area’s dumping grounds.

But New Jersey and other states with Superfund sites (including Dow Chemical Company’s home state of Michigan) have been burned too many times by responsible parties and developers. We can’t afford to have another industry apologist making the calls about whether sites and communities are cleaned up and that responsible parties are held accountable.

I would like to acknowledge and thank my colleagues Emily Berman, Juan Declet-Barreto, and Yogin Kothari for their invaluable input. 

Flickr: Steven Reynolds/samenstelling

DOI Caught Lying About a Staff Purge. Congress Has Questions

Department of the Interior. Photo: Matthew G. Bisanz. CC-BY-2.0 Wikimedia.

Last week, the Interior Department Inspector General’s office released its report on Secretary Ryan Zinke’s controversial mass reassignment of senior executives last summer, requested by alarmed Senators shortly after the reassignments took place. 

Secretary Zinke does not like what they found.  

The report painted a picture of incompetence, discrimination, and political retaliation. It described how the board that made the reassignment decisions was politicized, how they covered their tracks by keeping no records, and how they failed to “remember” anything about their instructions or motivations. The report described a sham of a process that was clearly intended as a purge. 

As if intent on demonstrating just how Zinke would lead the agency, his team checked every box for poor workplace management. The report was so damning that House Natural Resources Committee Ranking Member Raul Grijalva immediately smelled a rat and has requested that Chairman Rob Bishop hold a hearing immediately “on the disturbing findings.” This story is not yet over.

As one of the reassigned executives, I had a front row seat to this debacle last summer. Now, to be clear, every new administration moves a few senior executives around for various reasons when they take over, but no agency from any administration has come in and reassigned dozens of career senior executives at one time, and certainly not with such apparent intent to dislodge us from the civil service entirely.  

To do this, they moved people into jobs unrelated to their areas of expertise. Many were moved across the country, all were reassigned without any prior consultation, many were of retirement age, most had families, and a very disproportionate number were American Indians.

So for starters, this was just horrible workplace management.  

But then Secretary Ryan Zinke, the only Senate-confirmed employee at DOI at the time, testified to Congress the following week that he would use such reassignments, along with attrition and other means, to trim the DOI workforce by 4,000 people.  

As any thoughtful individual would surmise, reassignments only trim the workforce if they cause employees to quit, and while senior executives can certainly be moved, even involuntarily, it’s unlawful to use reassignments to get employees to quit. Zinke admitted his unlawful strategy directly to Congress that day. 

We work for the American people. We are not there to play politics for any president or cabinet member. Yet Secretary Zinke last year demanded loyalty to President Trump and effectively pledged to get rid of employees who wouldn’t play along.   

Knowing all that, I was still stunned by what the IG found. 

The Executive Resources Board (ERB), the body making the reassignment decisions, is meant to consist of an equal number of political appointees and civil servants. The IG found that the Zinke ERB consisted only of recent political appointees. The board did not document any sort of plan or reasons for selecting executives to reassign; it did not review executive qualifications or gather other information necessary to make such decisions; and it did not communicate with either the executives or their managers before making the reassignments.

There was a complete absence of a paper trail for the reassignment actions – a remarkable and reckless approach to governing that can only suggest that they did not want their reasons known.  

The IG even caught them in a lie. ERB members claimed that they had three criteria for moving executives – moving people that had been in their jobs for a long time, moving people out of Washington DC, and moving people to new functional areas. The IG found no evidence at all to show that they evaluated the reassignments against those stated criteria, and the ERB members were unable to recall the criteria that they ultimately used instead. 

While the IG certainly found the ERB members to be incompetent and unable to remember even the broadest details of their efforts, it would be naïve to think that this was simply a matter of incompetence. If they had legitimate reasons for moving us around, you can bet they would have recorded them. Instead, these actions can only be interpreted as malicious, retaliatory, and discriminatory. 

But when the IG asked the executives themselves, the criteria became quite clear. Seventeen of us indicated that the reassignment was likely political retaliation or punishment, and 12 of us felt that it was probably related to former work on issues such as climate change, energy, and conservation. This ERB was not even subtle about its objectives – they moved me, the climate policy advisor, to the office that collects and disburses oil and gas royalty income.

Behind this Keystone Kops display lurks a dogged determination to reward supporters and purge the agency of senior executives who might not salute the Secretary’s flag. To accomplish this, they assembled an ERB hit squad of six political appointees who could be relied upon to sign off on whatever the political leadership decided. It’s hard to imagine a scenario that would more clearly demonstrate a politicization of the civil service workforce.

This is a long-established no-no; there are important reasons to keep the civil service partitioned from the political winds and whims of each new administration. The mission of the agency depends on operational consistency in administering programs and services. While every incoming administration would love to bend the career ranks to their every wish, they generally know better than to try, and there are laws and regulations to prevent it. 

The Trump Administration just doesn’t know better, and the consequences are serious.  

In addition to muzzling science and stifling important climate change efforts on behalf of Americans, this purge adds up to political retaliation, discrimination, wasted taxpayer dollars, and a callous disregard for the career staff at the agency. Thankfully, some good folks in Congress have taken notice and I hope to see a deeper examination of these issues in the near future. Each of the ERB members should be forced to testify, on the record, that they just don’t remember how or why they reassigned us. This precedent can’t stand. 

I’ve left federal service for now, but my thoughts go out to all of the career folks who still have to endure this type of work environment, keeping their heads down and wondering who’s next. I hope our institutions can stand up to these abuses of power so they can get back to work serving the American people. 

Scott Pruitt’s Regulatory Rollback Recipe  

Vehicle pollution is a major issue for human health and the environment.

EPA Administrator Scott Pruitt continues to stack the deck in favor of industry interests. At least two members appointed by Pruitt to the EPA Science Advisory Board received funding to conduct misleading research that EPA used to justify reexamining vehicle fuel efficiency standards – a regulation forecast to save consumers over $1 trillion, cut global warming emissions by billions of metric tons, and advance 21st century vehicle technology.

This shameless attempt to use shoddy research that was funded by the oil industry and used by automaker trade groups to overturn a regulation that is based on sound science and widespread public support is a perfect example of how Pruitt intends to roll back regulations at the behest of his industry-tied former donors.

Pruitt’s plan is a simple (though perhaps illegal) five-step recipe. Here’s exactly how he has been cooking up a regulatory repeal (or re-peel) soup of equal parts corruption, paranoia, and apathy.

Step 1: Separate independent science from the record, then discard

Make it exceedingly difficult for academic scientists to join the advisory committees that help your agency set pollution thresholds, compliance deadlines, and cost estimates.  These committees are supposed to represent the viewpoints of both independent scientific experts and industry stakeholders, but you can argue that the composition of these committees is solely at your discretion. So go ahead and kick those academic nerds off the advisory committees and replace them with industry-funded friends.

Step 2: Liberally add industry-funded junk science to your liking

Promote the “studies” of your new industry-funded advisory committee friends. Bonus points if they use junk science to show that health benefits from reducing smog “may not occur,” that rising carbon dioxide levels are beneficial to humanity, or that people don’t want more fuel-efficient cars and trucks. At the same time, give your employees new talking points on climate change to ensure any public-facing communications either cast doubt on the science your agency has previously relied on or don’t mention it at all. Ruthlessly reassign or fire any employee who fails to comply.

Step 3: Bake junk science into the record

This step is important. Copy the text from industry-funded studies into your official justification to reevaluate, suspend, or roll back rules that science has already shown to be effective. The fastest and easiest way to do this is to just copy the text verbatim. Don’t worry that the administrative record supporting the original enactment of these regulations is chock-full of academic, peer-reviewed studies and thousands of public comments that demonstrate why these regulations are reasonable, achievable, and necessary. Also ignore trepidation from agency career staff who think you are opening the agency to legal challenges or failing to use sound science to justify your agenda.

Step 4: Set legality to “uncertain,” and wait until the lawsuits have settled

Use the vast legal resources at your disposal to make any legal challenges to your efforts take as long as possible, which, in the federal court system, can be a very long time indeed. While the courts struggle with whether you have overstepped your authority, your rollback will remain in place – effectively stymying the impact of the regulation on industry for potentially years.

Step 5: Clean your workspace to eliminate traces of corruption and outrageously bad ethics

Make sure you have the support of your boss as you engage in some light to medium graft and corruption. You will probably need a soundproof “privacy booth” that costs taxpayers close to $43,000, a security detail that costs $3 million and protects against non-existent death threats, and a cheap condo rented from the wife of a corporate lobbyist for the fossil fuel and auto industries. Keep public leaks of your missteps to a minimum and refrain from using social media to say anything of value.

Overall, this recipe is a disaster for both independent science and public health. Help UCS push back against Pruitt’s effort to cook this regulatory rollback soup by checking out our new nationwide mobilization effort called Science Rising. This effort isn’t a one-day march—it is a series of local activities, events, and actions organized by many different groups. Our shared goal is to ensure that science is front-and-center in the decision-making processes that affect us all—and to fight back against efforts that sideline science from its crucial role in our democracy.

Will you join us to keep #ScienceRising?

 

The White House Clearly Does Not Like the EPA’s “Secret Science” Plan

The EPA’s plan to limit the types of science that the agency can use to make decisions may run into an unusual roadblock: the White House itself. In a Senate hearing yesterday, New Hampshire Senator Maggie Hassan questioned White House official Neomi Rao about the EPA plan (watch here, beginning at 59:02), and the answers suggest that the EPA and the White House are not on the same page.

Ms. Rao heads the Office of Information and Regulatory Affairs (OIRA) in the White House’s Office of Management and Budget. The office is responsible for overseeing the administration’s regulatory agenda. Agencies submit rules for OIRA review before they can be finalized.

The White House tends to enthusiastically support federal agency initiatives. But in a hearing Thursday, the administration’s representative, Neomi Rao, was pretty lukewarm about the EPA’s proposal to limit the use of science at the agency. Archival photo via C-SPAN.

Some speculate that OIRA is not keen on the EPA’s proposal because it could make it more difficult for the EPA to weaken clean air and clean water protections. A court can strike down agency actions that are not grounded in evidence—both decisions that improve public protections and decisions that erode them. So in a perverse way, their desire to go back to 1950s regulatory standards could be hampered by the EPA’s proposed science restrictions.

Senator Hassan began by questioning Administrator Rao about a proposal to give restaurant owners more control over service workers’ tip money. The Department of Labor purposely hid analysis showing the proposal would take billions of dollars out of the pockets of food servers, baristas, and many other hardworking people. OIRA allowed the department to move forward with the proposal, even though it lacked sufficient data to do so (Senators Heitkamp and Harris asked great follow-up questions later in the hearing).

Then Senator Hassan moved on to the EPA (my emphasis added):

Senator Hassan: EPA Administrator Scott Pruitt is reportedly considering a proposal that would prevent the EPA from using a scientific study unless it is perfectly replicable and all the underlying raw data is released to the public. That is problematic for a whole host of reasons. For example, it could require the release of confidential medical information, which in turn may reduce participation in studies, but it would also prevent the EPA from considering some of the best evidence we have available to us when making regulatory and deregulatory decisions. Have you and your office provided any input to Administrator Pruitt on this proposal?

Administrator Rao: The questions about information quality are very important to us, and that is something that my staff has been working with the EPA on to develop best practices in that area.

Senator Hassan: Do you think such a proposal as the one I just described, the one that is from the EPA that would limit the information agencies can use by preventing them from considering best available evidence makes sense?

Administrator Rao: Well I think we want to make sure that we do have the best available evidence. I think it’s also important for the public to have notice and information about the types of studies which are being used by agencies for decision making, so I think that there is a balance to be struck there, and I think that’s something that the EPA is working towards.

That’s not exactly a ringing endorsement, and it suggests that the friction between the White House and EPA extends beyond numerous ethical scandals to the agency’s style of policymaking as well.

“Scientific evaluation and data and analysis is an ongoing process,” continued Senator Hassan. “As you know, we’ve talked about one of my priorities is the response to the opioid crisis in my state and across this country. If we wait for so-called perfect science, we’re not going to have evidence-based practices out there that are saving lives. And so I think it is critically important that we continue to honor scientific process and make sure that we are using best available data when we make policy.”

At one point, Senator Hassan posed this direct question: “Would you generally support agencies changing their procedures in ways that prevent them from using the best available evidence when making these decisions?”

“No, I would not,” replied Administrator Rao.

On that, they could agree.

Brace Yourself for Unhealthy Air: The Trump Administration Weakens Clean Air Protections

Yesterday the Trump administration started chipping away at one of the strongest science-based public health protections we have in the country. In a laundry list of industry wishes, President Trump has ordered the EPA to make several sweeping changes to how it implements ambient air pollution standards.

I’m saddened at the potential for this to weaken the clean air protections we enjoy every day because of our nation’s long history of strong science-based policies. Other countries have strict air pollution laws, but not all of them come with teeth. In the US, we are lucky to have air pollution laws that work. They work because they require that decisions be made based on what’s protective of public health, not on what’s convenient for regulated industries. And importantly, the Clean Air Act includes consequences for failure to meet air pollution standards, ensuring strong incentives for states and industries to comply.

I’ve been proud to live in a country where these protections save thousands of lives and prevent thousands more respiratory illnesses, cardiac illnesses, and missed work and school days every year. But now it’s less clear if my family and yours will enjoy the same.

Here are four ways the new executive order will undermine our science-based air pollution protections.

1. Requiring science advisors to consider non-scientific information in their advice to EPA

This is one of the most concerning changes in the executive order. The EPA relies on the Clean Air Scientific Advisory Committee (CASAC) for independent scientific advice on where the agency should set air pollution standards in order to protect public health. Composed of air pollution and health experts from universities and other entities outside of the EPA, the committee dives deep on exactly what the science says about the relationship between air pollutants and the health of Americans. This system has worked remarkably well to ensure the EPA is making decisions consistent with the current science and holding the agency accountable when it doesn’t. (See more on the important role of CASAC and the independent science that feeds into the EPA process here and here.)

In a striking reversal of precedent, the president’s order asks the committee to also consider “adverse public health or other effects that may result from implementation of revised air quality standards.” This is scientifically problematic and likely illegal.

The Clean Air Act mandates that ambient air pollution standards be determined by what is protective of public health with an adequate margin of safety—and that’s it. Economic impacts, costs to industry, etc. cannot be considered. The Supreme Court affirmed this in 2001 in its Whitman v. American Trucking Association decision. Ordering the EPA’s science advisers to consider information outside of the health impacts of pollutants shoves a wrench into a functional science-based process for protecting the nation’s health.

This move builds on other administration efforts to weaken the technical chops of CASAC and other government science advisory committees, by allowing them to sit idle and replacing qualified independent scientists with conflicted or unqualified individuals.

The order’s section on science advisers also signals that the agency will explore ways to “ensure transparency in … scientific evidence” considered by the committee, in what is likely a nod to the Trump administration’s expected move on addressing “secret science.” (Learn more on the many reasons this proposal is flawed.)

2. Restricting the science that can be used to protect public health

Many more people could now be living in areas that evade air pollution protections, thanks to a provision that limits what scientific information the agency can use to determine who is breathing bad air.

The Trump administration just moved to weaken our nation’s strong ambient air pollution protections, paving the road for increased pollution across the country.

The order includes an innocuous-seeming provision declaring that the agency should “rely on data from EPA-approved air quality monitors” to decide which areas need to improve their air. The EPA, of course, already relies heavily on monitoring data to make decisions about where air pollution standards are being met. But it can’t do this everywhere. Not every county or jurisdiction will have a monitor (accurate long-term monitoring isn’t cheap), so in areas without monitors for specific pollutants the EPA uses modeling or satellite information to determine air quality.

This can be a cost-effective way to determine where air is unhealthy and for some pollutants it can be impressively accurate. For example, for pollutants like ozone that form in the atmosphere, scientists know that ozone levels are very consistent over large distances, i.e. if a monitor tells me ozone levels are high, I’m confident that ozone levels are also high five miles down the road.

For other pollutants, modeling can be crucial for ensuring that people are protected from industrial emissions. Sulfur dioxide, for example, is emitted from coal-fired power plants and its concentrations can vary a lot over space, i.e. a place directly downwind of a power plant could get hit hard with sulfur dioxide pollution, while an area five miles away could have clean air. In these cases, modeling air pollution concentrations can allow the EPA to protect people from pollution that might otherwise be harder to characterize through a few monitors. (If you want to know more on this point, I know a good dissertation.)
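To make that spatial-variability point concrete, here is a minimal back-of-envelope sketch in Python. It is not the EPA’s regulatory modeling system (tools like AERMOD are far more sophisticated); it simply evaluates the textbook Gaussian plume formula for a single elevated SO2 source, using a hypothetical emission rate, wind speed, and stack height, plus commonly cited rural dispersion approximations. The point is only that modeled ground-level concentrations can change severalfold within a few kilometers of a source, exactly the kind of gradient a sparse monitor network can miss.

# Minimal sketch (hypothetical source, not the EPA's actual models): textbook
# Gaussian plume estimate of ground-level SO2 concentration downwind of a stack.
import math

Q = 500.0    # hypothetical SO2 emission rate, grams per second
u = 5.0      # hypothetical wind speed at stack height, meters per second
H = 150.0    # hypothetical effective stack height, meters

def sigma_y(x):
    """Approximate horizontal dispersion (Briggs rural, neutral stability), meters."""
    return 0.08 * x / math.sqrt(1 + 0.0001 * x)

def sigma_z(x):
    """Approximate vertical dispersion (Briggs rural, neutral stability), meters."""
    return 0.06 * x / math.sqrt(1 + 0.0015 * x)

def ground_level_conc(x):
    """Centerline ground-level concentration (micrograms per cubic meter) at downwind distance x (meters)."""
    sy, sz = sigma_y(x), sigma_z(x)
    c = (Q / (math.pi * u * sy * sz)) * math.exp(-H**2 / (2 * sz**2))
    return c * 1e6  # convert g/m^3 to micrograms/m^3

for km in (1, 2, 5, 10, 20):
    print(f"{km:>2} km downwind: ~{ground_level_conc(km * 1000.0):7.1f} ug/m^3")

In this hypothetical case the plume touches down and peaks a few kilometers downwind, then tapers off; a monitor sited even a few miles from that peak would tell a very different story about the air people are actually breathing under the plume.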

Preventing the EPA from fully using available tools for scientific assessments means many areas, especially suburban or rural areas, could have unhealthy air that goes unnoticed and won’t be cleaned up.

3. Increasing demands without increasing resources for the EPA and states

Several provisions of the executive order focus on expediting permitting and implementation processes. In theory, this is a good idea. We would all benefit from more time-efficient government processes. However, it cannot be done in a vacuum. The EPA has been asked to do more with less over the years. Expecting the agency to expedite processes without providing additional resources could mean cutting corners or less rigorous analysis. This wouldn’t help the agency meet its mission of protecting people from air pollution; it would make it easier for lapses in implementation to happen.

This is especially true when we look at permitting processes, which are largely handled by the states. States won’t have additional resources to conduct air pollution modeling, analyze measurements, and evaluate permit applications for new industrial sources. Asking them to expedite this process could make it easier for industrial sources to be built in already polluted areas.

4. Allowing for more pollution in already hard-hit areas

In several ways, the order stands to increase pollution in areas that already face disproportionate impacts from air pollution. In addition to the expedited permitting discussed above, the rule also allows for interstate trading of pollutant emissions. Such schemes have worked well in the past (e.g. we’ve been remarkably successful at reducing acid rain), but in this case, it will be important to watch closely how this is implemented. For the ambient air pollutants that fall under this order and some of their precursors, there are acute health effects. Thus, in a trading scheme, someone will get the short end of the stick in terms of breathing bad air.

In other words, trading emissions might allow one state to breathe cleaner air, and emissions could be reduced overall, but that means another area will see an increase in emissions. When the pollutant in question has adverse health impacts, that’s a big problem for anyone living downwind of a plant that bought those emissions credits. Similarly, states that depend on interstate cooperation to reduce pollution within their borders are also likely to get a raw deal here, as Senator Tom Carper of Delaware rightfully pointed out yesterday in a statement.
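For readers who want to see the mechanics, here is a toy sketch with entirely hypothetical numbers. It is not a model of the executive order’s actual trading provisions; it only illustrates the general cap-and-trade arithmetic described above: the cap lowers total emissions, but the plant that buys allowances emits more than its own allocation, so its downwind neighbors bear more of the burden.

# Toy two-plant allowance trade (hypothetical numbers, not the actual program):
# total emissions fall under the cap, but they stay high locally at the plant
# that buys allowances instead of cutting its own emissions.
baseline = {"Plant A (State 1)": 120, "Plant B (State 2)": 120}    # tons/year with no cap
allocation = {"Plant A (State 1)": 100, "Plant B (State 2)": 100}  # tons/year allowed under the cap

# Plant A can cut emissions cheaply, so it over-complies and sells its
# surplus allowances to Plant B, whose reductions would cost more.
traded_allowances = 20
emissions = {
    "Plant A (State 1)": allocation["Plant A (State 1)"] - traded_allowances,  # 80 tons
    "Plant B (State 2)": allocation["Plant B (State 2)"] + traded_allowances,  # 120 tons
}

print("Total without cap:", sum(baseline.values()), "tons")   # 240
print("Total under cap:  ", sum(emissions.values()), "tons")  # 200 -- the cap works overall
print("Plant B's neighbors breathe emissions from", emissions["Plant B (State 2)"],
      "tons instead of the", allocation["Plant B (State 2)"], "tons allocated to it")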

Already, the Clean Air Act doesn’t do a great job of improving air quality in hotspots where pollutant levels may be uncharacteristically high compared to the surrounding areas. This executive order could make that problem worse. As Alex Kauffman discussed on the Huffington Post yesterday, the people most affected by this are likely to be communities of color, which are already burdened with disproportionately high levels of air pollution across the country. This of course adds to many other steps the administration has taken that worsen inequities in pollution exposure.

Brace yourself for bad air

The bottom line is that the president’s order is bad news for anyone who breathes air in this country. It represents a chipping away at the strong air pollution protections we’ve enjoyed for decades. It doesn’t serve the public interest, and it certainly doesn’t advance the EPA’s mission of protecting public health. It serves only those who wish to pollute, exposing more Americans to unhealthy air. And this will come with consequences for our health. The fate of this new order will likely play out in the courts, but in the meantime, I wish we could all just hold our breath.

 

 

Stories, Improv, and What Science Can Learn From Comedy

Can you name a scientist? If your response was no, you are not alone. Eighty-one percent of Americans cannot name a living scientist, according to a 2017 poll conducted by Research America. As scientists, it is our responsibility to reach out to the public and talk to people about what we do, why it is important, and how it connects to their lives. We are not trained to make those connections and do public outreach, but luckily there are more and more opportunities to learn.

We are graduate students and members of Science in Action, a science communication and policy advocacy group at Colorado State University. Our goal is to encourage other scientists on campus to learn about and practice sharing their science. With financial support from the Union of Concerned Scientists, we were able to take advantage of unique opportunities to do just that.

Acting for science: using improv techniques to communicate

Scientists are trained to methodically approach problems and rigorously analyze solutions, but not taught how to communicate the findings. We may be doing vitally important work that benefits humanity, but what if we cannot communicate its importance to the public?

Actors, on the other hand, are expert storytellers. They use specific techniques to connect with their audience—techniques that scientists can and should learn to use.

Members practicing “acting tools” with Sarah Zwick-Tapley.

To help aspiring scientists learn these tricks of the trade, we partnered with the Union of Concerned Scientists to host a science communication workshop. Sarah Zwick-Tapley, a local theater director and science communication consultant, introduced us to the “actor’s toolkit,” a set of physical and vocal techniques for audience engagement.

These tips were simple enough (make eye contact; vary the tone, volume, and speed of your voice), but incorporating them all at once while also describing the importance of your science? That is a challenge.

Another critical piece of the storytelling approach is using the “And, But, Therefore” sequence. We practiced this technique with an outlandish example. First, you start with what we know (“we know cancer is a deadly disease AND that it has many causes”). Next, you build suspense with what we have yet to discover (“BUT, we don’t know whether eating old books causes cancer”). Then, you finish with your contribution (“THEREFORE, I am eating Shakespeare’s entire body of work to see if I develop cancer”). Using this technique turns a simple list of facts into a powerful story.

The next step: put our new acting skills into action.

Why science matters for Colorado

Colorado is home to multiple national laboratories and major research universities.

Standing in front of the Colorado State Capitol after sharing our science with legislators and staffers.

Researchers at these organizations do important science and bring the best and brightest minds to the state. To help share these discoveries with our state legislators, we joined Project Bridge, from the University of Colorado Denver Anschutz Medical Campus, for a poster day at the capitol. Speaking with non-scientists can be a challenge, but we used our new acting tools to tell a story, both in our poster design and our presentation.

We also took this opportunity to meet one-on-one with our state representatives. Because they represent a college town, they recognize the value of research for our city, state, and country. We were encouraged to hear that they regularly rely on experts at CSU for advice on pending legislation. This is science policy in action.

Communicating for the future

As a scientist, you may recognize that communicating science is important but be unsure how to learn these skills. Luckily, there are numerous organizations across the country dedicated to training scientists to communicate clearly and effectively. Many scientific organizations (the American Association for the Advancement of Science, the American Geophysical Union, and the American Society for Cell Biology, among others) hold science communication and science policy trainings and provide small grants for local groups. COMPASS is an international organization that hosts trainings and provides one-on-one coaching for aspiring science communicators. Many universities have also started in-house communication trainings and programs (Stony Brook University is home to the Alan Alda Center for Communicating Science).

These resources illustrate the fact that there are people and organizations dedicated to providing scientists with the tools they need to share their science with everyone.

 

Rod Lammers and Michael Somers are graduate students at Colorado State University. They are both officers in Science in Action, a science communication and policy group. Science in Action is a student-led organization at Colorado State University started in 2016 to engage campus scientists and provide opportunities for outreach to the public and policymakers. More information can be found on the organization’s website and Facebook page.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Trump Onboard for Offshore Wind?

Workers dwarfed by offshore wind blade tips and towers at Siemens deployment dock in Hull. Offshore wind is part of the revival of many port cities in Europe. Photo: Derrick Z. Jackson

The Trump administration’s quiet embrace of offshore wind became a shout heard around the industry last week. At an offshore wind conference, Interior Secretary Ryan Zinke said, “We think there’s an enormous opportunity for wind because of our God-given resources off the coast. We’re pretty good at innovating. I’m pretty confident that the wind industry is going to have that kind of enthusiasm.”

Sec. Zinke stoked that enthusiasm by announcing the opening of the bidding process for the final 390,000 acres of federal waters far south of Martha’s Vineyard. Those parcels went unclaimed in a 2015 auction held in the despairing wake of the collapse of Cape Wind in Nantucket Sound, leaving many advocates wondering if offshore wind, which has become an important source of energy in northern Europe, would ever take off in the United States. The prospects came even more into question with the 2016 election of President Trump, who routinely claimed that offshore wind was too expensive.

But the dramatically dropping costs of offshore wind, which is now cheaper than nuclear power and closing in on parity with fossil fuels in Europe, have sparked an explosion of renewed interest in the US. The bidding for those once-orphaned waters off Massachusetts likely will be fierce as they have already received unsolicited bids from the German wind company PNE and the Norwegian energy giant Equinor, the former Statoil.

The Interior Department made two other significant announcements last week to further brighten offshore wind’s prospects. It announced that it was soliciting industry interest and public input on the possibility of establishing offshore wind farms in 1.7 million acres of the New York Bight, the stretch of waters that curls from New Jersey up to Long Island. It also is soliciting input on an assessment of all Atlantic offshore waters for wind farm development. Zinke’s energy policy counselor, Vincent DeVito, said in a press release, “We are taking the next step to ensure a domestic offshore wind industry.”

This is the surest sign yet that an administration that has pulled out of global climate change agreements and is rolling back environmental protections at the behest of the fossil fuel industry, nonetheless does not want to miss out on the economic potential of a renewable energy industry that has revived many ailing port cities in northern Europe.

Worker gives scale to offshore wind blades nearly a football field long at a Siemens facility in Denmark. Photo: Derrick Z. Jackson

It surely must help politically that onshore wind is now a bedrock of American energy, with rock-solid bipartisan support in an oft-divided America. Rural turbines have dramatically changed the energy landscape in the Republican-dominated states of the Midwest and Great Plains: Texas, Iowa, Oklahoma, and Kansas are the top wind electricity generating states, and the nation’s fastest-growing occupation, wind turbine service technician, pays more than $50,000 a year.

A similar bipartisan picture is rapidly developing for offshore wind along the Eastern Seaboard. Democratic and Republican governors alike are staking claims in the offshore industry, from Massachusetts’s game-changing 1,600 megawatt mandate to Clemson University in South Carolina being chosen to test the world’s most powerful turbine to date, a 9.5 megawatt machine from Mitsubishi/Vestas.

Both states happen to have Republican governors who have joined their Democratic counterparts in opposing Zinke’s proposal to also exploit the Atlantic continental shelf for oil and gas. In the same Princeton speech in which he praised the possibilities of offshore wind, Zinke acknowledged that offshore fossil-fuel drilling was opposed by governors in every East Coast and West Coast state except Maine and Georgia. “If the state doesn’t want it, the state has a lot of leverage,” he said.

In contrast, offshore wind’s leverage has become almost undeniable. The last two offshore wind lease auctions in New York and North Carolina added a respective $42 million and $9 million to federal coffers. With Massachusetts, New York and New Jersey leading the way, there are now more than 8,000 megawatts of legislative mandates and pledges by current governors. That could meet the needs of between 4.5 million and 5 million homes, based on the proposals made for 800 MW farms in Massachusetts.
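As a quick sanity check on that household figure, here is a minimal arithmetic sketch. The homes-served-per-project number is an assumption inferred from the range quoted above (the 800 MW Massachusetts proposals were pitched as serving roughly 450,000 to 500,000 homes each); actual output would depend on capacity factors and regional household electricity use.

# Back-of-envelope scaling of the homes-served estimate cited above.
# homes_per_project is an assumed figure inferred from the article's range,
# not a number taken from any proposal document.
pledged_mw = 8_000          # state mandates and pledges along the East Coast
project_mw = 800            # size of the Massachusetts proposals used as a yardstick
homes_per_project = (450_000, 500_000)

scale = pledged_mw / project_mw                  # ten projects' worth of capacity
low, high = (scale * h for h in homes_per_project)
print(f"{pledged_mw:,} MW of pledges ~ {low:,.0f} to {high:,.0f} homes served")
# -> 8,000 MW of pledges ~ 4,500,000 to 5,000,000 homes served,
#    matching the 4.5 to 5 million range in the text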

An 8,000 megawatt, or 8 gigawatt (GW), market could alone create between 16,700 and 36,300 jobs by 2030, depending on how much of the industry, currently centered in Europe, is enticed to come here, according to a joint report by the clean energy agencies of Massachusetts, New York, and Rhode Island. But the potential is much greater.

A 2016 report from the US Departments of Energy and Interior estimated that there was enough technical potential in US offshore wind to power the nation twice over. The US, despite being two and a half decades behind Europe in constructing its first offshore wind farm, a five-turbine project off Block Island, Rhode Island, is still in a position to ultimately catch up to Europe, where there are currently nearly 16 gigawatts installed, supporting 75,000 jobs. The 2016 DOE/DOI report said that a robust offshore wind industry that hits 86 GW by 2050 could generate 160,000 jobs.

One can hope that it is this picture that Sec. Zinke and the Trump administration are looking at in their support of offshore wind. Sec. Zinke continues to say that offshore wind is part of the White House’s “all-of-the-above” strategy for “American energy dominance.” Given how little interest there is for any new oil and gas drilling off the coasts of America, offshore wind is becoming the new source of energy that stands above all.

Service boat cruising in the Anholt offshore wind farm in Denmark. Photo: Derrick Z. Jackson


SNAP already has work requirements. Adding more won’t solve poverty.

Photo: US Air Force

On Tuesday, President Trump signed an executive order calling for a review of the nation’s federal safety net, with the stated aim of “moving people into the workforce and out of poverty.” This is almost certainly thinly veiled code language for additional work requirements in programs that serve millions of low-income individuals and families, including Medicaid and the Supplemental Nutrition Assistance Program (SNAP).

There are a number of inaccuracies and logic flaws contained in the text, but chief among them are these:

Falsehood 1: The federal safety net is causing poverty.

The order states, “Many of the programs designed to help families have instead delayed economic independence, perpetuated poverty, and weakened family bonds.”

It is a grim truth that poverty has a strong grip on too many communities in this country. But poverty is not created by social support programs, nor is it perpetuated by the people who use them. Persistent poverty is far more likely a product of the complex structural inequities embedded in our everyday lives—income inequality, for example, and institutional racism and discrimination. And until we address and remedy these underlying factors, it is essential that we have a strong federal safety net to fall back on.

Falsehood 2: Working-age adults have become dependent on programs like SNAP.

The US Department of Agriculture (USDA) counters this notion with its own data. The populations who might depend on the program for longer periods of time include children, the elderly, and those with disabilities; together, these groups make up about two-thirds of all SNAP participants. The population of SNAP participants who are classified as able-bodied adults without dependents (ABAWDs) and are required to work makes up just a small fraction—only two percent—of all those who stay on SNAP for a period of eight years or longer.

Yet there is every indication that ABAWDs will be the target of more stringent work requirements in the months to come. A recent USDA federal register notice asked for public input on “innovative ideas to promote work and self-sufficiency” among the ABAWD population. Here’s what we offered.

 

April 9, 2018

The Union of Concerned Scientists

Re: Document No. FNS-2018-03752: Supplemental Nutrition Assistance Program: Requirements and Services for Able-Bodied Adults Without Dependents; Advance Notice of Proposed Rulemaking

We submit this comment to the US Department of Agriculture (USDA) to express broad opposition to policy and programmatic changes that would further limit SNAP eligibility for able-bodied adults without dependents (ABAWDs). While we appreciate USDA efforts to address food insecurity and provide adequate opportunities for employment and training among low-income populations, any proposals which would remove participants from the program—either through more stringent work requirements, further restrictions on eligibility, or other means—would fail to accomplish either, and may in fact contribute to worsening economic hardship among low-income individuals while imposing undue administrative burden and cost on state and federal agencies.

Our opposition to the aforementioned policy and programmatic changes is grounded in the following:

The work requirements in place for ABAWDs are already extensive.

In addition to meeting general work requirements for SNAP participation, ABAWDs are subject to a second set of time-limited work requirements. These dictate that an ABAWD must work or participate in a work program for at least 80 hours per month or face benefit termination after a period of three months; this time limit renews only after three years. Data from the Bureau of Labor Statistics show that for many, securing a job within three months is an unattainable goal: last year, nearly 40 percent of those able to work and looking for jobs in the general population were unable to find work within 15 weeks, while nearly 25 percent were unable to find work within 27 weeks.[1]

The population of unemployed ABAWDs is a small fraction of SNAP participants.

The vast majority of SNAP recipients are children, the elderly, caregivers, or persons with disabilities. The ABAWD population makes up a small fraction of all SNAP recipients, and many are already working or looking for work. Less than 8.8 percent of all SNAP participants are classified as ABAWDs, and the number of unemployed ABAWDs at any given time constitutes only 6.5 percent of all program participants.[2] It should be noted that the population of unemployed ABAWDs is not stagnant, but shifts depending on need: research shows that among SNAP households with at least one non-disabled, working-age adult, eight in 10 participants were employed in the year before or after receiving benefits, meaning SNAP is providing effective temporary assistance during periods of economic difficulty.[3] Many of the policy changes addressed in the Federal Register notice, including new review processes, certification processes, and reporting requirements, would incur administrative burdens and costs with little demonstrable benefit for low-income populations, and may in fact detract from the efficacy of the program.

Bolstering employment and training programs will do little to counter the root causes of poverty and food insecurity—particularly when other public assistance programs are at risk.

Employment and training (E&T) programs can provide a path to self-sufficiency if evidence-based and adequately funded. Currently, there is wide variation among state E&T programs, with varying efficacy, and limited full federal funding available to states.[4] Until there is consistent implementation of effective and scalable models for job training across states—accompanied by a strong government commitment to invest in such models—we cannot rely on E&T programs alone to keep low-income populations employed and out of poverty. This is particularly important at a time when numerous other public assistance programs serving low-income populations are at risk.

We appreciate the opportunity to provide comments on the manner in which the USDA intends to pursue its stated goals of addressing food insecurity and providing adequate opportunities for employment and training among low-income populations. However, the questions posed by the agency suggest that forthcoming policy proposals will do more harm than good. Any policy changes to SNAP resulting in removal of individuals from the program—including more stringent work requirements or restricted eligibility among the ABAWD population—present serious risks to the health, well-being, and economic vitality of the individuals and communities served by this program.

Thank you for your consideration.

 

[1] Bureau of Labor Statistics. 2018. Table A-12: Unemployed persons by duration of unemployment. Washington, DC: US Department of Labor. Online at www.bls.gov/news.release/empsit.t12.htm, accessed March 2, 2018.

[2] Food and Nutrition Services (FNS). 2016. Characteristics of able-bodied adults without dependents. Washington, DC: US Department of Agriculture. Online at https://fns-prod.azureedge.net/sites/default/files/snap/nondisabled-adults.pdf, accessed March 2, 2018.

[3] Council of Economic Advisers (CEA). 2015. Long-term benefits of the Supplemental Nutrition Assistance Program. Washington, DC: Executive Office of the President of the United States.

[4] Food and Nutrition Services (FNS). 2016. Supplemental Nutrition Assistance Program (SNAP) Employment and Training (E&T) Best Practices Study: Final Report. Washington, DC: US Department of Agriculture. Online at https://fns-prod.azureedge.net/sites/default/files/ops/SNAPEandTBestPractices.pdf, accessed March 8, 2018.

 

Want to learn more about SNAP? Listen to Sarah Reinhardt on our Got Science? Podcast!

USDA Focus on Nutrition Program “Integrity” is a Smokescreen

Photo: US Air Force

The US Department of Agriculture has announced it will hire a new “chief integrity officer” to oversee federal nutrition programs such as the National School Lunch and Breakfast Programs, Special Supplemental Nutrition Program for Women, Infants and Children (WIC), and the Supplemental Nutrition Assistance Program (SNAP, formerly known as food stamps). The integrity of SNAP in particular has been a popular topic among those in the Trump administration, including USDA Secretary Sonny Perdue, who argue that SNAP enables a “lifestyle of dependency” and seek major program reforms in the upcoming Farm Bill. But these arguments have been conjured from very little science and a whole lot of smoke—and have the effect of distracting the public from more pressing issues at hand.

SNAP is among the nation’s most effective and efficient programs

Monitoring any government program is necessary to ensure that taxpayer dollars are being spent effectively, and the USDA does so through its quality control process and periodic reports on fraud and abuse. In fiscal year 2011, as part of the Obama administration’s Campaign to Cut Waste, the USDA Office of Inspector General (OIG) conducted an extensive review of more than 15,000 stores for compliance with SNAP program rules. The results of these assessments, combined with USDA participation data, tell us the program is working as intended, and with remarkably few problems.

SNAP fraud[1]—broadly defined as exchanging benefits for cash or falsifying participant or retailer applications to illegally obtain or accept benefits—happens relatively infrequently, though it’s difficult to measure. The USDA’s most recent report on the topic estimated that it affected about 1.5 percent of all SNAP benefits received between 2012 and 2014. This represents a slight increase since the early 2000s (1.0 to 1.3 percent between 2002 and 2011), but remains substantially lower than in the 1990s, when reported rates were as high as 3.8 percent.

Compare this to some of the other federal programs contained in the Farm Bill—like crop insurance. Back in 2013, former USDA secretary Tom Vilsack voiced concerns about the integrity of crop insurance programs due to error and fraud rates that exceeded those of SNAP. As with SNAP, illegal activity is largely uncovered by way of criminal investigations. According to the Department of Justice, recent convictions and sentences connected with the federal crop insurance program have included the indictment of a Kentucky agricultural producer for insurance fraud, wire fraud, and money laundering; an Iowa farmer who received more than $450,000 in crop insurance proceeds illegally; a Louisiana farmer who created shell farms to receive more than $5.4 million in subsidy payments; and a Kentucky crop insurance agent whose clients received nearly $170,000 in indemnity payments for false claims. And those are just the cases from the last six months. Unlike SNAP, no chief integrity officer has been assigned to monitor fraud and abuse in the program.

So yes, we should be striving for continuous improvement in the operation of all federal programs. But explaining why SNAP in particular has come under such intense scrutiny requires some historical and political context—and a good understanding of who might benefit from maintaining old narratives.

 

Learn more about SNAP: listen to Sarah Reinhardt on the Got Science? podcast.

The rise, fall, and flatline of SNAP fraud

Though President Lyndon B. Johnson signed the Food Stamp Act into law in 1964, the program began to gain popularity in the 1970s, when participation doubled from 5 million to 10 million over the course of the decade. As participation increased, the USDA began to discover incidents of abuse, eventually revealing a widespread pattern of illegal activity that would plague the program for the next twenty years. Reported rates of fraud at this time were between 10 and 20 percent.

The widespread abuse contributed to growing public disapproval of the food stamp program and, accompanied by racist and classist rhetoric around so-called “welfare queens,” fueled substantial program reform by the Reagan administration in the early 1980s.

But what likely ushered in drastic improvements in rates of fraud and abuse was the introduction of new electronic benefit transfer (EBT) systems, which fully replaced paper food stamps in 2004. In addition to requiring a 4-digit PIN, EBT cards create a record of each purchase, increasing the ease with which agencies can identify and document illegal use.

That brings us, more or less, to the efficient and effective program that we’ve seen for the last decade.

Despite how well SNAP works, politicians have continued to frame it as a program that suffers from rampant misuse and illegal activity. For those seeking drastic budget cuts and reforms, that negative narrative grants permission to discredit both the program and those who use it—and one needs to look no further than President Trump’s welfare reform proposals or Speaker Paul Ryan’s comment about the tailspin of “inner-city culture” to know that the legacy of the welfare queen is alive and well.

Speaking of integrity…

Photo: USDA

To get a sense of where this is all headed, keep your eyes on USDA Secretary Sonny Perdue—if you can. He’s been making some quick pivots lately.

At a May 2017 House Appropriations Subcommittee on Agriculture hearing, Perdue stated that “SNAP has been a very important, effective program,” and that the agency was considering no changes. “You don’t try to fix something that isn’t broken.”

Less than a year later, the secretary has voiced his support for a number of proposed program changes, including stricter work requirements for adults without dependents. Not unlike the recent announcement aimed at nutrition program integrity, these proposals are often grounded more in political ideology than fact. USDA’s own data shows that most SNAP participants who can work do work—albeit in unstable jobs—and counters the notion that participants stay on the program for long periods of time.

When the House releases its draft of the Farm Bill, which may happen as early as this week, Perdue’s response (or lack thereof) could provide insight on the policy proposals he’s prepared to support. More than likely, he’ll continue to endorse the positions of his party—but then again, we’ve been surprised before.

 

[1] Fraud and abuse should be distinguished from error rates, or improper payment rates, which capture how often SNAP participants receive underpayment or overpayment of benefits. (The overwhelming majority of SNAP errors are attributed to unintentional error by recipients or administrative staff.) Program error rates have also experienced substantial declines in recent years: they reached an all-time low of 2 percent in 2013, down from 6.6 percent in 2003. And though a 2015 USDA OIG review of the quality control process found understated error rates in some states, the resulting corrective actions target state agencies—not individual SNAP participants.

 

Andy Wheeler: Trump’s Pick for EPA Deputy is a Threat to Our Climate and Health

Washington’s latest parlor game involves predictions about the number of days left in Scott Pruitt’s tenure at the EPA. There’s even a website where you can place bets on it, and some very funny memes and gifs are circulating on the internet. Amid the controversies over discounted condos, high-priced furniture, self-important sirens, and questionable personnel practices, the outrage over Pruitt’s policies is getting lost in the noise. If his ethical lapses result in his ouster, what’s to stop his replacement from continuing the destruction of nearly half a century of environmental progress?

Not much. The nominated second in command, Andy Wheeler, is awaiting confirmation in the US Senate to become the Deputy Administrator. Wheeler is well known as a lobbyist for the coal industry and a former staffer for the Senate’s leading climate denier, Senator Jim Inhofe, having served on the Environment and Public Works Committee (EPW) staff for 14 years. If Pruitt gets the boot, Wheeler will most likely be the acting Administrator. Unlike Pruitt, Wheeler worked for the EPA early in his career and has played key roles in Congressional oversight of the agency and its budget, making him a formidable opponent with intimate knowledge of the agency’s programs and regulations.

Senate climate deniers at the EPA’s helm

Wheeler will join Inhofe alumni who already occupy the chief of staff and deputy chief of staff positions at the EPA. Pruitt’s senior advisers on air, climate, and legal issues are also former Inhofe staff, as are the top domestic and international energy and environmental advisers to President Trump. A Senate Democratic aide speaking off the record warned, “These are folks who are very capable. They know the agency and its programs. They’re smart and hard-working, and they certainly could dismantle the programs if they were asked to do that. But the question is how they will react if they’re asked to do that.” Another former Capitol Hill staffer said, “I think Andrew is very similar to Scott Pruitt’s approach in understanding under EPA’s regulatory scheme that states have the priority over federal overreach.” Given Wheeler’s tenure with the Senate EPW committee and his coal company client list, it is safe to assume that he will continue the repeal of climate regulation and the assault on the Clean Air Act.

Crooked math on air pollution

Unfortunately, Wheeler is likely to move forward on changes to the way the EPA assesses the costs and benefits of regulation, changes that were buried in the agency’s proposed regulation gutting the Clean Power Plan (CPP). The CPP was an Obama-era regulation aimed at reducing emissions of carbon dioxide to reduce the risks of climate change. UCS economist Rachel Cleetus commented that “[t]oday’s proposal to repeal of the Clean Power Plan uses crooked math to artificially lower the benefits of the pollution reductions that standard would have brought. The EPA fails to account for the fact that actions to cut carbon emissions also pay large dividends by reducing other forms of harmful pollution like soot and smog.”

The “proposed repeal outlines a flawed approach to evaluating the risks of pollution — specifically particulate matter, which is a mix of very tiny particles emitted into the air. When inhaled, this pollution can cause asthma attacks, lung cancer and even early death,” according to the American Lung Association. Harold P. Wimmer, the national president and CEO of ALA, and Stephen C. Crane, Ph.D., MPH, the executive director of the American Thoracic Society, argue that “[t]he [Trump] EPA has cherry-picked data to conceal the true health costs of air pollution. Its revised calculations diminish and devalue the harm that comes from breathing particulate matter, suggesting that below certain levels, it is not harmful to human health. This is wrong. The fact is: There is no known safe threshold for particulate matter. According to scores of medical experts and organizations like the World Health Organization, particle pollution harms health even at very low concentrations. Attempting to undercut such clear evidence shows the lengths the EPA, and by extension the Trump administration, will go to reject science-based policy that protects Americans’ health.”

What are the health dangers caused by air pollution for children and adults? Credit: American Lung Association.

What Mr. Pruitt, Mr. Wheeler, and the Trump Administration don’t want you to know is that actions taken to reduce carbon also reduce the air pollution that causes illness and death. A forthcoming analysis of the proposed change to the way the EPA assesses health benefits, by Kimberly Castle and Ricky Revesz of the Institute for Policy Integrity at the NYU School of Law, finds that:

The benefits from particulate matter reductions are substantial for climate change rules, accounting for almost one half of the quantified benefits of the Obama Administration’s Clean Power Plan. These benefits are also significant for regulations of other air pollutants, making this issue one of far-reaching importance for the future of environmental protection.

Opponents of environmental regulation, including the Trump Administration, have recently embraced an aggressive line of attack on particulate matter benefits. They argue alternatively that these benefits are not real; are being “double counted” in other regulations; or should not be considered when they are the co-benefits, rather than the direct benefits, of specific regulations….An examination of the scientific literature, longstanding agency practices under administrations of both major political parties, and judicial precedent reveals that particulate matter benefits deserve a meaningful role in regulatory cost-benefit analysis.

Pruitt’s EPA has also indicated plans to adopt a policy similar to legislation that House Science Committee Chairman Lamar Smith (R-Texas) has unsuccessfully pushed for years, over the objection of the country’s leading scientific societies. The policy builds on a strategy hatched in the 1990s by lobbyists for the tobacco industry, who invented the phrase “secret science” to undermine robust peer-reviewed research on the harmful impacts of second-hand smoke. The goal back then was to create procedural hurdles so that public health agencies couldn’t finalize science-based safeguards.

Climate and health

The US Global Change Research Program found significant health impacts from climate change, and documented several linkages between climate and air quality. “Changes in the climate affect the air we breathe, both indoors and outdoors. The changing climate has modified weather patterns, which in turn have influenced the levels and location of outdoor air pollutants such as ground-level ozone (O3) and fine particulate matter.”  It also found that climate change will make it harder for any given regulatory approach to reduce ground-level ozone pollution in the future as meteorological conditions become increasingly conducive to forming ozone over most of the United States. Unless offset by additional emissions reductions, these climate-driven increases in ozone will cause premature deaths, hospital visits, lost school days, and acute respiratory symptoms.

The air quality response to climate change can vary substantially by region across scenarios. Two downscaled global climate model projections using two greenhouse gas concentration pathways estimate increases in average daily maximum temperatures of 1.8°F to 7.2°F (1°C to 4°C) and increases of 1 to 5 parts per billion (ppb) in daily 8-hour maximum ozone in the year 2030 relative to the year 2000 throughout the continental United States. Unless reductions in ozone precursor emissions offset the influence of climate change, this “climate penalty” of increased ozone concentrations due to climate change would result in tens to thousands of additional ozone-related premature deaths per year, shown here as incidences per year by county (see Ch. 3: Air Quality Impacts). Credit USGCRP, 2016: The Impacts of Climate Change on Human Health in the United States: A Scientific Assessment. Crimmins, A., J. Balbus, J.L. Gamble, C.B. Beard, J.E. Bell, D. Dodgen, R.J. Eisen, N. Fann, M.D. Hawkins, S.C. Herring, L. Jantarasami, D.M. Mills, S. Saha, M.C. Sarofim, J. Trtanj, and L. Ziska, Eds. U.S. Global Change Research Program, Washington, DC, 312 pp. http://dx.doi.org/10.7930/J0R49NQX

Temperature-driven changes in power plant emissions are likely to occur due to increased use of building air conditioning. A recent study in Environmental Research Letters compared an ambient-temperature baseline for the Eastern US to a model-calculated mid-century scenario with summer-average temperature increases ranging from 1°C to 5°C. Researchers found a 7% increase in summer electricity demand and a 32% increase in non-coincident peak demand. Power-sector modeling, assuming only limited changes to current generation resources, calculated a 16% increase in emissions of NOx and an 18% increase in emissions of SO2.

Wheeler and Clear Skies

While at EPW, Andy Wheeler was the Bush Administration’s point person on Clear Skies, an ironically named 2003 proposal that would have essentially gutted the Clean Air Act. The bill would have significantly delayed the implementation of soot and smog standards and delivered smaller reductions in NOx and SO2 emissions than strict implementation of the existing Clean Air Act. Wheeler not only negotiated the bill to near passage (a tied committee vote killed the bill in 2005), he carried out Inhofe’s intimidation effort against an association of state air quality officers, asking the group to turn over six years of IRS filings and all records of grants they received from the EPA.

President Trump claimed to want the EPA to focus on clean air and clean water. But his defense of Pruitt on Twitter and his nomination of Wheeler as Deputy Administrator make clear that he has no idea what it takes to deliver clean air to the American people. The Trump Administration’s priority is to reduce regulation on industry at the expense of the health and well-being of Americans.

Department of Energy Releases Bogus Study to Prop Up Coal Plants

A few months ago, the Department of Energy (DOE) made a request to one of its national labs, the National Energy Technology Laboratory (NETL), to study the impacts on the electricity grid of a severe cold snap called the bomb cyclone that hit the Northeast in early January 2018. NETL conducts important R&D on fossil energy technologies. The report released last week uses deeply flawed assumptions to inaccurately paint coal (and to a lesser extent, fuel oil) as the savior that prevented large-scale blackouts during the extreme cold, while greatly understating the contribution from renewable energy sources. It also estimates a bogus value for coal providing these so-called “resiliency” services. One has to wonder whether this deeply flawed and misleading study is part of the administration’s continued attempts to prop up the coal industry at all costs, especially after FERC rejected the DOE’s fact-free proposal to bail out coal and nuclear plants late last year. The utility FirstEnergy, which owns and operates a fleet of coal and nuclear generators, immediately seized upon NETL’s report and is petitioning DOE for an emergency bailout.

Separating the Facts from the Fiction

The report emphasizes the fact that fossil and nuclear power played a critical role in meeting peak demand during the cold snap. Across six regions, according to the report, coal provided 55 percent of daily incremental generation, and the study concludes that at least for PJM Interconnection (which manages the electricity grid across 12 Midwest and Mid-Atlantic states as well as DC), “coal provided the most resilient form of generation, due to available reserve capacity and on-site fuel availability, far exceeding all other sources” without which the region “would have experienced shortfalls leading to interconnect-wide blackouts.” The report then goes on to incorrectly estimate the value of these “resiliency” services at $3.5 billion for PJM.

The nugget of truth here is that we do need reserve capacity to be available in times of peak demand, especially during extreme weather events that lead to greatly increased need for heating or cooling. And this is especially important during the winter, when the demand for natural gas for home heating spikes in some parts of the country, leading to higher prices and less natural gas available for electricity generation (since home heating takes priority over electricity generation in terms of natural gas pipeline delivery contracts). In the Northeast, which uses a lot of natural gas for heating, this shortfall in natural gas led to an increase in electricity generation from [dirty] fuel oil, as the report points out.

However, regional transmission organizations (RTOs) and independent system operators (ISOs) were prepared for the cold snap, and the markets performed as expected. PJM in particular put systems in place to prepare for extreme cold weather following the 2014 Polar Vortex, and electricity markets in the Eastern U.S. are organized to provide payments to power plants for providing either energy (electrons to the grid) or capacity (the ability to switch on and provide a certain level of output if called upon). As fossil generators retire because they are uneconomic, plenty of other resources are under construction or in advanced planning stages and will be ready at the time they’re needed. This is why planning for future electricity needs is critical, and this is the responsibility of regional grid operators—one they take quite seriously.

To that point, grid operators and reliability experts see no threat to grid reliability from planned retirements of coal and nuclear power plants. The North American Electric Reliability Corporation (NERC), whose mission is to ensure the reliability of the bulk power system for the continent, finds in its 2017 Long-Term Reliability Assessment that, contrary to NETL raising potential reliability issues from future coal and nuclear retirements, most regions of the country have sufficient reserve margins through 2022, as new additions more than offset expected retirements. PJM, in its strongly worded response to FirstEnergy’s petition to DOE for an emergency bailout (see below), stated “without reservation there is no immediate threat to system reliability.”

Beyond this, the report and its pseudo-analytic underpinnings really go off the rails. Let’s take a few of its misleading points in turn.

How to Quantify Resiliency

NETL decided to consider the incremental generation from each fuel source—that is, how much more electricity was produced by each fuel during the bomb cyclone—as a metric for which fuel provides the grid with resilient services. As they put it:

“…we examine resilience afforded by each source of power generation by assessing the incremental daily average gigawatt hours during the BC event above those of a typical winter day.”

This is a bogus metric not only because it simply reflects the amount of unused or idle generation in the system, but also because the reference time period (the first 26 days of December) is a period when there wasn’t much generation from coal and oil. It turns out there is a lot of coal-fired capacity sitting around because it is more expensive to run than natural gas. The only time it makes economic sense to call on these more expensive resources is when demand pushes electricity prices high enough, as it did during the bomb cyclone.
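To make the problem concrete, here is a minimal sketch (in Python, with made-up numbers; nothing below comes from the NETL report) of the kind of incremental-generation calculation described above. The fuel names and daily totals are purely illustrative; the point is that whichever fuel happened to have the most idle capacity during the baseline period automatically "wins" this metric.

```python
# Minimal sketch (illustrative, made-up numbers) of the "incremental generation"
# metric described above: average daily GWh during the cold-snap event minus
# average daily GWh during a chosen baseline period, by fuel.

baseline_daily_gwh = {"coal": 400.0, "gas": 900.0, "wind": 150.0}  # hypothetical baseline (late-December) averages
event_daily_gwh = {"coal": 650.0, "gas": 950.0, "wind": 170.0}     # hypothetical averages during the event

incremental = {fuel: event_daily_gwh[fuel] - baseline_daily_gwh[fuel]
               for fuel in baseline_daily_gwh}
total_increase = sum(incremental.values())

for fuel, gwh in incremental.items():
    share = 100 * gwh / total_increase
    print(f"{fuel}: +{gwh:.0f} GWh/day ({share:.0f}% of the incremental generation)")

# Coal dominates this metric simply because it had the most idle capacity during
# the low-demand baseline period, not because it is inherently more resilient;
# shifting the baseline window changes the answer.
```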

What NETL is basically saying is that the most expensive resources are the most resilient. The report then argues that the high cost of those expensive resources represents the value of “resiliency”—and that these expensive generators should be compensated for providing that value. It’s circular reasoning, and it’s the same argument that we heard all last fall as part of the fact-free DOE FERC proposal, which boils down to this: our assets can’t compete in the marketplace because they’re too expensive, so you (meaning, the ratepayer) should pay us more money to stay online.

The NETL report is essentially trying to invent a metric to define resiliency, and it’s wrong. There are certainly qualitative ideas about what resiliency means:

“Infrastructure Resilience is the ability to reduce the magnitude and/or duration of disruptive events. The effectiveness of a resilient infrastructure or enterprise depends upon its ability to anticipate, absorb, adapt to, and/or rapidly recover from a potentially disruptive event.”  –NERC, 2012

But there is no agreed-upon quantitative definition for resiliency, which is one reason FERC has opened a docket to study the issue.

Enter Capacity Markets

The NETL report misses another crucial point. These resources are, in many cases, already being paid to be available when needed. In general, there are several ways that a given generating facility of any kind can make money: by providing energy; by offering capacity on demand; and by providing what are called ancillary services (things like voltage and frequency regulation, which ensure the stability of the grid). Without going into a detailed explanation of how these different markets work, it’s sufficient to understand that these markets exist—and are working as intended.

Instead of doing a detailed analysis of how fossil generators were compensated during the cold snap, or which plants may have been cheaper to run, NETL offers a deeply misleading back-of-the-envelope calculation: it multiplies the increase in the daily cost of electricity above an arbitrary baseline (see next section) by the number of days in the cold snap. This calculation fails to acknowledge that some of these generators are already receiving payments for those services by bidding into a market and agreeing to provide the service of additional capacity when needed.
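For illustration only, here is a rough sketch of that kind of back-of-the-envelope arithmetic, with invented numbers rather than anything from the report. It shows why multiplying a price spike by a number of days says nothing about whether the generators were already being paid, through capacity markets, to be available.

```python
# Rough sketch (invented numbers) of the back-of-the-envelope calculation
# described above: (increase in daily electricity cost over a baseline) x
# (number of days in the cold snap), presented as the "value of resiliency".

baseline_daily_cost_musd = 100.0   # hypothetical baseline system cost, $ million per day
event_daily_cost_musd = 450.0      # hypothetical cost during the cold snap, $ million per day
event_days = 10                    # hypothetical length of the event

claimed_value_musd = (event_daily_cost_musd - baseline_daily_cost_musd) * event_days
print(f"Claimed 'resiliency value': ${claimed_value_musd:,.0f} million")

# Missing from this arithmetic: capacity payments these generators already
# receive for agreeing to be available at exactly such times, so treating the
# whole price spike as an uncompensated benefit double-counts value the market
# already pays for.
```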

Cherry-Picking Baselines to Attack Renewables

NETL’s flawed analysis also takes aim at renewables, suggesting that because of “below average” renewable generation, resources like coal and fuel oil had to come online to pick up the slack.

What NETL did here is classic cherry-picking. They compared the generation from renewables during the bomb cyclone to what they called a “typical winter day.” Except that it wasn’t. NETL used a 26-day period in December to compute baseline generation. Wind generation during the bomb cyclone event was actually higher than expected by grid operators in the Northeast and Mid-Atlantic. For example, in PJM, wind output from January 3-7 was 55 percent higher than the 2017 average output, and consistently 3 to 5 times greater than what PJM expected from January 3-5.

Actual Failure Rates

Instead of using NETL’s flawed analysis, looking at the actual failure rates of different generation resources during the extreme weather event provides a more accurate picture of the reliability and resiliency impacts. PJM did this, it turns out. As shown in the chart below, which compares forced outages during the polar vortex and the bomb cyclone, PJM’s analysis finds that coal plants experienced failure rates similar to those of natural gas power plants during both the 2014 and 2018 cold snaps. For example, on January 7, 2018, a peak winter demand day, PJM reported 8,096 MW of natural gas plant outages, 6,935 MW of coal outages, 5,913 MW of natural gas supply outages, and 2,807 MW of “other” outages (which includes wind, solar, hydro, and methane units).

The NETL study completely ignores the fact that baseload resources like coal and nuclear also pose challenges to reliability—because of limited flexibility, vulnerability to extreme weather events (like the polar vortex and bomb cyclone), extreme heat and drought affecting cooling water, and storm surge. During extreme cold, pipes and even piles of coal can freeze, meaning that coal plants can’t fire up.

FirstEnergy Begs for a Handout

Only a day after NETL’s report was released, the utility FirstEnergy submitted a request to DOE for emergency financial assistance to rescue its uneconomic coal and nuclear plants and heavily cited the NETL report. The basis of the request is section 202(c) of the Federal Power Act, a rarely used portion of the statute that allows DOE to keep power plants online in times of emergency or war. But as NERC, PJM, and others have pointed out, there is no immediate reliability crisis. The request is a Hail Mary pass to save the company from bankruptcy, and is not likely to hold up in court.

Garbage In, Garbage Out

NETL has produced a document that isn’t worth the few megabytes of disk space it is taking up on my computer. As we often say when evaluating a computer model or analysis—garbage in, garbage out. The study appears to be politically motivated, and it reveals a deep misunderstanding of how the electricity grid works, using simplistic and misleading calculations to justify its conclusions. It is shrouded in insidious, analytic-sounding language that makes it seem as if it were a legitimate study. It should be rejected out of hand by any serious person taking an objective look at these issues—as should FirstEnergy’s request for a bailout.

The World’s Population Hasn’t Grown Exponentially for at Least Half a Century

Recently I was looking at some data about world food production on the excellent Our World in Data site, and I discovered something very simple, but very surprising about the world’s population. We often hear (and I used to teach) about the threat of an exponentially growing population and the pressure it is supposed to be putting on our food supply and the natural resources that sustain it (land, water, nutrients, etc). But I found that the global population isn’t growing exponentially, and hasn’t been for at least half a century.

It has actually been growing in a simpler way than exponentially—in a straight line.

What exponential growth is

Exponential growth (sometimes also called geometric or compound-interest growth) can be described by an equation in which a fixed growth factor is raised to the power of time, so that time appears as an exponent, hence the name (for a constant growth rate r, the population after t years is P(t) = P0 × (1 + r)^t). But it also can be described in simpler terms: the growth rate of the population, as a fraction of the population’s size, is a constant. Thus, if a population has a growth rate of 2%, and it remains 2% as the population gets bigger, it’s growing exponentially. And there’s nothing magic about the 2; it’s growing exponentially whether that growth rate is 2% or 10% or 0.5% or 0.01%.

Another way to put it is that the doubling time of the population—the number of years it takes to grow to twice its initial size—is also a constant. So, if the population will double in the next 36 years, and double again in the following 36 years, and so on, then it’s growing exponentially. There’s even a simple rule-of-thumb relationship between doubling time and the percentage growth rate: Doubling Time ≈ 72/(Percentage Growth Rate). So a population with a 36-year doubling time is growing at a rate of about 2% per year.
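As a quick check on that rule of thumb, here is a short Python snippet (not part of the original post) comparing the exact doubling time, ln(2)/ln(1 + r), with the 72-divided-by-the-rate approximation for a few growth rates.

```python
# Quick check of the rule of thumb above: doubling time ~ 72 / (growth rate in %).
# The exact doubling time for a constant annual growth rate r is ln(2) / ln(1 + r).
import math

for pct in (0.5, 1.0, 2.0, 5.0, 10.0):
    r = pct / 100.0
    exact = math.log(2) / math.log(1 + r)
    approx = 72.0 / pct
    print(f"{pct:>4.1f}%/yr: exact doubling time {exact:6.1f} yr, rule of 72 gives {approx:6.1f} yr")

# At 2% per year the exact doubling time is about 35 years, close to the
# 36 years the rule of 72 predicts.
```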

But probably the simplest way to describe exponential growth is with a graph, so here’s how it looks:

Figure 1. Exponential growth versus linear (straight-line) growth.

This graphic not only shows the classic upward-curving shape of the exponential growth curve, but also how it contrasts with growth that is linear, i.e. in a straight line. Additionally, it demonstrates a simple mathematical result: if one quantity is growing exponentially and a second quantity is growing linearly, the first quantity will eventually become larger than the second, no matter what their specific starting points or rates of growth.
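That crossover claim is easy to verify numerically. The short sketch below (illustrative numbers only, not from the post) starts a slowly compounding series far below a fast-growing linear one and counts the periods until the exponential overtakes it.

```python
# Numeric illustration of the claim above: even a slowly compounding series
# eventually overtakes a fast-growing straight line. Starting values and rates
# are arbitrary.

exponential = 1.0    # starts tiny, grows 1% per period
linear = 100.0       # starts 100x larger, adds 10 per period
period = 0

while exponential <= linear:
    period += 1
    exponential *= 1.01   # 1% compound growth
    linear += 10.0        # constant additive growth

print(f"The exponential series overtakes the linear one after {period} periods")
```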

This isn’t just abstract math; it also illustrates the most famous use of exponential growth in political debate. It was put forward by the English parson Robert Malthus over two centuries ago. He argued that the human population grows exponentially while food production can only grow linearly. Thus, it follows inevitably that the population will eventually outgrow the food supply, resulting in mass starvation. This is the case even if the food supply is initially abundant and growing rapidly (but linearly). The upward-bending-curve of an exponentially-growing population will always overtake it sooner or later, resulting in catastrophe.

Looking at real data

Critics ever since Malthus’ time have pointed out that his assumption that food production grows in a straight line is just that—an assumption, with little basis in theory. So I wasn’t surprised to see that the OWID data showed faster-than-linear (upward-curving) growth in global food production over the past half-century. What did surprise me was that the growth of the world’s population over that time period has actually been very close to a straight line.

Here’s that graph:

Figure 2. World population growth from 1961 to 2016, from the official U.N. figures available at ourworldindata.org. The data are expressed as an index, with the 1961 population = 100. To convert the index to actual numbers of people, just multiply the index value by 30,830,000, since the world population in 1961 was 3.083 billion.

The graph looks very much like a straight line rather than the upward-curving exponential, but is that really the case? We can test this by calculating the value of what statisticians call the R2 (or “coefficient of determination”) for this curve. The closer it is to a straight line, the higher R2 will be, and if the data fits a straight line perfectly then R2 will be exactly 1.0.

So, what’s the actual value for this data? It’s 0.9992. That is, the fit to a straight line isn’t quite perfect, but it’s very, very close.
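For readers who want to reproduce this kind of check, here is a minimal sketch of the procedure: fit a straight line to a population index and compute R2. The series below is a noisy placeholder; substituting the actual UN world-population index from ourworldindata.org is what yields the 0.9992 reported above.

```python
# Sketch of the fit described above: regress a population index on year and
# report R-squared. The data here are a placeholder, not the real UN series.
import numpy as np

years = np.arange(1961, 2017)                                   # 1961..2016
rng = np.random.default_rng(0)
pop_index = 100 + 2.2 * (years - 1961) + rng.normal(0, 1.0, years.size)  # placeholder index

slope, intercept = np.polyfit(years, pop_index, 1)              # best-fit straight line
predicted = slope * years + intercept
ss_res = np.sum((pop_index - predicted) ** 2)
ss_tot = np.sum((pop_index - pop_index.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"slope = {slope:.2f} index points per year, R2 = {r_squared:.4f}")
```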

Is this some sort of artifact?

I was actually quite surprised at how well the data fit a straight line—so much so that I wondered if this was just an artifact of the method I used, rather than a real result. So I applied the same method—plot the data, fit a straight line to it, and calculate the value of R2—to the data for some of the world’s largest countries and regions, rather than the world as a whole.

For several of these, the lines looked very straight and the value of R2 was almost as high as in the graph for the world as a whole, or even slightly higher, e.g.:

R2 for the linear equation of Population vs. Time, by country or region:

Brazil: .9977
India: .9954
Indonesia: .9995
Latin America and the Caribbean: .9994
North America: .9966
Pacific island small states: .9991

But for others it was considerably lower (e.g. .9777 for China, .9668 for the European Union) and two graphs proved clearly that the excellent fit to a straight line is a real result, not an artifact. These were the ones for Sub-Saharan Africa and Russia:

Figure 3. Population growth of Sub-Saharan Africa from 1961 to 2016, from the official U.N. figures available at ourworldindata.org. The data are expressed as an index, with the 1961 population = 100. Thin dotted line shows the best-fit straight line; thick dots show the actual data.

Figure 4. Population growth of Russia from 1961 to 2016, from the official U.N. figures available at ourworldindata.org. The data are expressed as an index, with the 1961 population = 100. Thin dotted line shows the best-fit straight line; thick dots show the actual data.

The point about the Sub-Saharan African graph is not simply that it has a lower value of R2 (0.964), but that its data deviates from the straight line in the way that an exponential curve should: higher than the straight line at the lowest and the highest time values, and lower than the straight line at the intermediate ones. It does fit an exponential curve quite well, thus showing that the method can pick out an exponential curve if the data do follow one. But this is the only region or large country for which that’s actually true.

The Russia graph doesn’t fit an exponential curve well at all—it actually curves downward overall, rather than upward as it should if it were an exponential—but it does show that the value of R2 can be much lower than 1.0 for real data. For Russia it’s 0.632. So as with the Sub-Saharan Africa case, it proves that the high value of R2 for the world as a whole is not an artifact caused by the method. It reflects the reality of the past 55 years.

Finally, since I and many of my readers are from the United States, here’s that graph:

Figure 5. Population growth of the United States from 1961 to 2016, from the official U.N. figures available at ourworldindata.org. The data are expressed as an index, with the 1961 population = 100. Thin dotted line shows the best-fit straight line; thick dots show the actual data.

For the US as for the whole world, population growth over the past half-century has been quite close to a straight line; the R2 is 0.9956.

A direct test of whether growth is exponential

These graphs and R2 values seem to indicate that linear growth is the best model for the world population over the past 55 years, but there’s another way to show that it’s not exponential. As I said above, exponential growth occurs when the percentage growth rate remains constant as the population gets bigger. So a simple test is to graph the percentage growth rate over time, and see whether it’s a constant—i.e., a horizontal line. So here’s that graph:

Figure 6. Percentage growth rate of the world population from 1961 to 2016, calculated from the official U.N. figures available at ourworldindata.org. The trend line goes downward over time, rather than being horizontal as it would be if the percentage were a constant.

This result, like the others, is quite clear. The percentage growth rate is not a constant, as it should be if the population were growing exponentially. Rather, it has been dropping steadily over the past half-century, from over 2.0% in the early sixties to below 1.2% now.
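The same test is easy to run yourself. The sketch below uses approximate round-number world-population figures (close to, but not copied from, the UN series on ourworldindata.org) and computes the average annual growth rate over each interval; the steadily declining rates are the signature of non-exponential growth.

```python
# Sketch of the direct test described above: compute average annual growth
# rates over successive intervals and see whether they are roughly constant
# (exponential) or falling. Figures are approximate round numbers.

population = {1960: 3.03e9, 1970: 3.70e9, 1980: 4.46e9,
              1990: 5.33e9, 2000: 6.14e9, 2010: 6.96e9, 2016: 7.46e9}

years = sorted(population)
for y0, y1 in zip(years, years[1:]):
    n_years = y1 - y0
    # average annual growth rate over the interval, as a percentage
    rate = ((population[y1] / population[y0]) ** (1 / n_years) - 1) * 100
    print(f"{y0}-{y1}: {rate:.2f}% per year")

# The rate falls steadily, from roughly 2% per year in the 1960s to a bit over
# 1% now, the opposite of the constant rate that exponential growth requires.
```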

What exponential growth is not

So, we should stop saying that the world’s population is growing exponentially. That hasn’t been the case for at least 50 years. Exponential growth clearly doesn’t describe the global reality of the twenty-first century.

But there’s actually a second reason to stop saying that the global population is growing exponentially, and that’s because the term is so commonly misused and misunderstood. Note the next few times that you hear someone use the word, and I think you’ll find that it’s not being used in the sense of “constant percentage-growth-rate” or “constant doubling-time” or even just “an upward-bending curve.” Rather, it’s being used—often with an emphatic stress on the “-nen-” syllable and an implicit exclamation mark at the end of the phrase—to mean “rapidly” or “quickly” or “fast” or “big.”

That way of speaking is common, but it’s also just plain wrong. Remember the example that I started with: the exponential growth rate can be high (e.g. 10%) or low (e.g. 0.01%) or intermediate (e.g. 2%). In every case it’s exponential growth, but it’s very fast exponential growth if the growth rate is 10% and very slow exponential growth if it’s 0.01%.

I’m not that sanguine about getting people to go back to using “exponential” in its correct sense, but I think it’s at least worth a try. After all, we already have several other good words for that other, incorrect meaning—e.g. “fast” or “big.”

Implications

The results don’t just imply that we should talk about population growth differently, but also that we need to re-think how it relates to food production. There is good news in these data, because they show that hunger and environmental catastrophe are not at all inevitable. Malthus’ argument just doesn’t fit reality.

While linear growth has its challenges, it’s far easier to deal with than exponential growth. The distinction between growing exponentially and growing in a straight line does matter. On that point, at least, Malthus got it right.

FEMA and HUD Budgets are Vital for Disaster and Climate Preparedness

Members of FEMA's Urban Search and Rescue Nebraska Task Force One comb a neighborhood for survivors impacted by flooding from Hurricane Harvey. FEMA

Last year’s record-breaking disasters—including hurricanes, wildfires and floods—were a reminder of how climate change and faulty development policies are colliding to create dangerous and costly outcomes for the American public. While much attention is focused on post-disaster recovery, we need to invest much more in preparing for disasters before they happen. The good news is that the omnibus budget deal recently passed by Congress appropriated significant funding for the Federal Emergency Management Agency (FEMA) and Department of Housing and Urban Development (HUD) to help foster community resilience, in many cases undoing steep cuts that had been proposed by the Trump administration.

FEMA and HUD’s role in building disaster resilience

The omnibus budget deal recently passed by Congress was clearly influenced by the unprecedented series of disasters in 2017. There seems to be a dawning sense of new realities regarding extreme weather (even if some prefer to disavow climate science). We saw this reflected in the budgets of FEMA and HUD.

FEMA administers several programs that help states, territories, and tribal governments build back after disasters as well as invest in preparedness measures to reduce the risks and costs of future disasters. Done right, with future climate and other conditions in mind, these grants can be a powerful catalyst for building community resilience.

Key FEMA programs include:

  • The Hazard Mitigation Grant Program, which helps communities implement measures to reduce long-term risks to people and property from hazards after a presidential major disaster declaration. The HMGP provides funding for a range of activities including voluntary home buyouts, home elevation and infrastructure retrofits and is generally 15 percent of the total amount of Federal assistance provided to a State, Territory, or federally-recognized tribe following a major disaster declaration. To mark 30 years of this program, FEMA has created an online data visualization resource that summarizes data for HMGP projects by county, state, FEMA region or by Congressional District.
  • The Flood Mitigation Assistance Grant Program, which helps state and local governments fund projects and plans to reduce the long-term risk of flood damages for properties insured by the National Flood Insurance Program. In the recently passed omnibus budget, this program’s budget was $175 million.
  • The Pre-disaster Mitigation (PDM) Grant Program, authorized by the Stafford Act to help states, local governments, and communities implement long-term measures to reduce the risks and losses from disasters. Typically, FEMA pays for 75 percent of project costs and states match the remaining 25 percent. In the omnibus, this program’s budget was $249.2 million. This was a striking increase from recent years; as one news story noted, that is three times the average annual amount over the past 15 years.
  • FEMA’s budget for flood risk mapping is also vital to ensuring that communities, planners, and policymakers are aware of these risks and can take protective measures to limit them. The omnibus budget provided $262.5 million for flood mapping.

HUD’s Community Development Block Grant (CDBG) program, especially the CDBG-Disaster Recovery grants, is instrumental in helping low- and moderate-income communities—often the hardest hit by disasters—prepare, recover and build resilience. Our nation has long under-invested in safe, affordable housing, a challenge which is further exacerbated when disasters strike. Despite the Trump administration’s efforts to decimate HUD’s budget with an $8.8 billion proposed cut, Congress passed an omnibus budget deal that increased funding for HUD across the board, including $1.36 billion for the HOME Program and $3.3 billion for the Community Development Block Grant (CDBG) Program.

Despite repeated attempts by the Trump administration to cut agency budgets, including FEMA and HUD’s, Congress has recognized the importance of their work for the well-being of the American public, and has maintained or increased funding levels. Unfortunately, funding still remains much below what is needed by communities, especially as the impacts of climate change worsen.

Another continuing area of concern, which Congress must push back against, is this administration’s attempts to sideline science in policymaking. A recent egregious example: FEMA scrubbed all references to “climate change” from its four-year strategic plan, released last month.

An ounce of prevention is worth a pound of cure

Investing in resilience ahead of disasters—so-called hazard mitigation—is incredibly cost-effective and can save lives. That’s the clear message from an authoritative report from the National Institute of Building Sciences, Natural Hazard Mitigation Saves: 2017 Interim Report. Based on nearly a quarter-century of data, the report found that hazard mitigation projects funded by FEMA, HUD and the U.S. Economic Development Administration (EDA) can save the nation, on average, $6 in future disaster costs for every $1 invested (That ratio is even higher, 7:1, for measures to protect against riverine flooding).

The report also found that investing in measures that exceed requirements of the 2015 International Codes, the model building codes developed by the International Code Council, can save the nation $4 for every $1 spent. See the figure below for benefit-cost ratios for these two categories of protective measures to address different types of hazards.

In the aftermath of disasters, communities clearly need stepped-up aid, but the reality is we spend a lopsided amount of money post-disaster and shortchange pre-disaster investments that help limit costs and harms. A 2015 Government Accountability Office (GAO) report found that from fiscal years 2011-2014, FEMA obligated more than $3.2 billion for post-disaster hazard mitigation through the HMGP, while the Pre-Disaster Mitigation Grant Program obligated approximately $222 million.

A recent paper from Kousky and Shabman underscores the challenges, highlighting that:

“For FEMA, almost 90% of flood risk reduction funding comes after a big flood and the HUD CDBG-DR funding is only after a major disaster. Across agencies, absent a severe flood, very few dollars for risk reduction are available.”

We also need more (bipartisan) action to foster preparedness 

It’s critical to support and bolster existing federal agency budgets and programs that are helping communities become more resilient, alongside funding to help them cope with and recover from disasters. It’s simply a commonsense way to help protect people and property—and it’s a smart use of taxpayer dollars.

What’s more, budgets for disaster preparedness and protective standards are a bipartisan priority, despite political polarization about some of the underlying climate-related risk factors.

For example, South Carolina Republican Representative Mark Sanford recently called for a flood-ready infrastructure standard, saying:

“The process of flooding and rebuilding has become increasingly costly, as taxpayer dollars are being spent to rebuild or repair public infrastructure – sometimes multiple times. It makes no sense to go through this cyclical and costly process when the simple step of strengthening the federal flood standard can save taxpayer money and protect our communities.”

This standard is sorely needed since the Trump administration rolled back the Federal Flood Risk Management Standard just before Hurricane Harvey hit.

Florida Republican Representative Carlos Curbelo has co-sponsored the National Mitigation Investment Act, which provides incentives for states to invest more in protective building standards.

Federal, state, and local policymakers will also need to do a lot more to align existing and new policies and incentives with worsening risks in a warming world. One important near-term opportunity is reforming the National Flood Insurance Program, which the omnibus bill sets up for reauthorization by July 31 this year.

State and local governments leading the way

Massachusetts Governor Charlie Baker, a Republican, recently filed legislation for a $1.4 billion climate adaptation bond to help the state prepare for the impacts of climate change. Coming off a brutal series of winter storms, accompanied by damaging coastal flooding, the Governor and the legislature now have an opportunity to pass legislation to address the near and long term threats of climate change.

At the local level we need to see more progress along the lines of the encouraging news last week that the Houston City Council has just adopted more protective building standards in the city’s flood-prone areas. Houston Mayor Sylvester Turner said it best:

“We’re going to be futuristic. We are not going to build looking back. We’re going to build looking forward.”

That’s a goal our nation must aspire toward, especially as climate projections show an increasing risk of many types of disasters.


With Pruitt Under Fire, Likely Successor Andrew Wheeler’s Coal Ties Deserve Scrutiny

Photo: Senate EPW

As ethics storm clouds build over Scott Pruitt, environmentalists eager for a new administrator of the Environmental Protection Agency should beware.

That is because the odds-on next leader of the EPA is Andrew Wheeler. He has been an unabashed inside man for major polluters on Capitol Hill. He lobbied for coal giant Murray Energy, serving as a captain in that company’s bitter war against President Obama’s efforts to cut greenhouse gas emissions and enact more stringent clean air and clean water rules.

Wheeler assisted the efforts of refrigerant companies to resist stricter ozone rules and represented Energy Fuels Resources, a uranium mining company that successfully pushed for Interior Secretary Ryan Zinke to shrink the size of Bears Ears National Monument in Utah by 85 percent, despite its riches in Native American archaeology and art.

Confirmation now up for a vote

Wheeler was nominated last October by President Trump to be Pruitt’s deputy administrator, but his confirmation has been in limbo. Now, by filing cloture, Senate Majority Leader Mitch McConnell has fast-tracked Wheeler for a vote that could come next week.

The evidence is abundant that Wheeler stands squarely with the agenda of President Trump and Administrator Pruitt to render the EPA as ineffective as possible. When Pruitt sued the EPA 14 times as Oklahoma attorney general between 2011 and 2017 on behalf of polluting industries, coal giant Murray Energy was a top petitioner or co-petitioner in half those cases. Wheeler was its lobbyist from 2009 until last year. Even with pro-coal President Trump well into his second year, CEO Robert Murray is still complaining in his current message on the company’s website:

“Our industry is embattled from excessive federal government regulations from the Obama Administration and by the increased use of natural gas for the generation of electricity. In my sixty-one years of coal mining experience, I have never before seen the destruction of an industry that we saw during the Obama presidency.”

An action plan for rollbacks

Wheeler accompanied Murray to the now-notorious meeting a year ago with Energy Secretary Rick Perry, the one in which Murray handed Perry a 16-point action plan “which will help in getting America’s coal miners back to work.” That plan ultimately became the framework of a proposal by Perry to bail out struggling coal and nuclear power plants (Wheeler was also a nuclear industry lobbyist).

That particular proposal was shot down by federal regulators, but Trump and Pruitt have made good or are making good on most of those 16 points, including the US pullout from the Paris climate accords, the rejection of Obama’s Clean Power Plan, and slashing the staff of the EPA down to a level not seen since the 1980s attacks on the agency by President Reagan.

In suggesting that EPA employees be cut by at least half, Murray’s action plan claimed that the verbiage of Obama-era EPA rules was “thirty-eight (38) times the words in our Holy Bible.”

Wheeler has denied helping Murray draw up that document, but he certainly shares its sentiments, telling a coal conference in 2016, “We’ve never seen one industry under siege by so many different regulations from so many different federal agencies at one time. This is unprecedented. Nobody has ever faced this in the history of the regulatory agenda.”

Longtime Inhofe aide

Wheeler’s vigorous lobbying career came after serving as a longtime aide to the Senate’s most vocal climate change denier, Oklahoma’s James Inhofe. When the Trump administration announced Wheeler’s nomination, Inhofe hailed Wheeler as a “close friend.” That closeness was evident last May when Wheeler held a fundraiser for Inhofe, as well as for Senator John Barrasso of Wyoming, chair of the Senate Environment and Public Works committee that advanced his nomination by a party-line 11-10 vote. The Intercept online news service reported that Wheeler held the fundraisers after it was reported that he was under consideration to be Pruitt’s second in command.

Up until now, Wheeler has escaped the harsh scrutiny that has forced the withdrawal of some Trump appointees seen as embarrassingly close to industry, such as Michael Dourson, whose bid to oversee chemical safety at EPA failed. Part of that was his good luck in being paired in his committee hearing last November with Kathleen Hartnett White, who spectacularly flamed out with her blatant skepticism about the sources of climate change, once calling carbon dioxide, a key greenhouse gas, the “gas of life.”

By contrast, Wheeler slickly held to dry, brief statements that climate change is real, while agreeing with Trump’s pullout of global climate change accords. He even tried to play the good Boy Scout. After Tom Carper of Delaware recited Scouting’s commitment to conservation, Wheeler said, “I agree with you that we have a responsibility in the stewardship of the planet to leave it in better shape than we found it for our children, grandchildren, and nephews.”

His long track record of lobbying suggests the opposite.
